November 9, 2005

The Impersistence of Vision

OK, I had an idea the other day for software and an apparatus to help restore 'sight' to the blind. It is analogous to a device that I believe John Varley described in the short story collection "The Persistence of Vision," but with some modifications. The general idea is you hook a camera up to your head and have an n x n matrix of piezoelectric-driven rods strapped to your back or tummy that can poke you to convey information from the image. (Kind of like those toys you see in offices where there are two square pieces of black plastic with hundreds of tiny steel rods through them and a clear piece of plexiglass, and you make a 3D relief map of your face or hand with the rods.)
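For what it's worth, the software side of that mapping is simple. Here's a rough sketch (the function name and parameters are mine, purely illustrative) of downsampling a grayscale camera frame into an n x n grid of rod extensions:

```python
import numpy as np

def frame_to_rod_heights(frame, n=16, max_extension=1.0):
    """Downsample a grayscale camera frame (2-D array, values 0-255)
    to an n x n grid of rod extensions in [0, max_extension]."""
    h, w = frame.shape
    # Crop to the largest centered square so the cells stay square.
    side = min(h, w)
    top, left = (h - side) // 2, (w - side) // 2
    square = frame[top:top + side, left:left + side]
    # Average each cell's pixels to get one intensity per rod.
    cell = side // n
    trimmed = square[:cell * n, :cell * n]
    cells = trimmed.reshape(n, cell, n, cell).mean(axis=(1, 3))
    # Brighter pixels push the rod out further.
    return cells / 255.0 * max_extension

# Example: a frame that is dark on the left, bright on the right.
frame = np.tile(np.linspace(0, 255, 64), (64, 1))
heights = frame_to_rod_heights(frame, n=16)
```

All the interesting work — edge enhancement, contrast stretching, whatever "useful preprocessing" turns out to mean — would happen before this step; the grid itself is just an averaging pass.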

I've thought up some cool twists on this basic idea that I think could make it very useful. The idea itself doesn't require any science fiction anymore; it is completely possible with current technology. I've done a bit of research into what's currently available for the blind, and everything I found was paltry and crappy; I can't find links to current research being done. The idea seems pretty obvious (and apparently Varley thought of it long before it was practical), but is it being done?

I know that some of the students I had lunch with at WWDC this year were working in this general field, and I suspect some blog-readers may know a lot more about this than I.

My question for you is, do you know what state-of-the-art is for helping the completely blind form an image of the world around them? Obviously, I may have hit on an idea that others have already explored, and either discarded or already marketed. Also, it may be I'm trying to solve a problem that doesn't need a solution; maybe there isn't a lot of interest in conveying general environment images to the blind, and specialized devices (screen-readers, sonar obstacle finders) are more practical. I think I would want such a device if I lost my vision; I could easily imagine going out for walks with such a thing.

Also, does anyone know of a matrix of poking rods that's already been developed, regardless of whether it's been used for the blind? That is, something I could experiment with.



Blogger Greg said...

In Australia last night, Beyond Tomorrow, a science and tech show, reported that a group at the University of Southern California has developed a system with a camera mounted on a pair of glasses and an implant connected to the back of the eye that sends the electrical signals to the brain. Currently it only has 16 pixels of resolution, but they hope to reach 1,000 pixels within a couple of years.

November 09, 2005 6:48 PM

Blogger Wil Shipley said...

That's a cool idea, but it'll be a while until they get the resolution up, it's always going to require expensive surgery, AND they're going to need a bunch of image preprocessing done before they send the signal to the eye anyway. So I feel like we should work out what useful preprocessing is NOW, and when the implant technology comes online we'll have the first part of the pipe ready.

November 09, 2005 6:51 PM

Blogger Greg said...

Found the link:

November 09, 2005 6:52 PM

Anonymous patr1ck said...

I think the difficulty in having an image "projected," as it were, onto someone's back is that the back doesn't have nearly as many nerve endings as, say, the fingers, which are used to read braille. Thus, even if you could do that, it would be hard to establish and recall what certain 'scenes' feel like.

I am not a med student.

November 09, 2005 6:59 PM

Anonymous Anonymous said...

Patrick hit it exactly.

The skin isn't sensitive enough to portray something in enough detail to create an image. There was research done into something similar, including heating and cooling the rods to convey "colour," but it just wasn't serviceable enough for this colour-based world. I would look for a link, but it was done a decade ago and I doubt it is on the web.

The best way of doing it, in my opinion, is to hard-wire electrodes into a suitable region of the brain and train it to recognise patterns again. That will take a lot of development, though.

November 09, 2005 7:29 PM

Anonymous Justy said...

See also John Shirley's long short story "Freezone" in the Bruce Sterling-edited collection "Mirrorshades." In that story, some people go around with video screens strapped to their foreheads all the time, so they have devices on their backs that take in an image of what's in front of them and give them a tactile version of it.

Cool idea.

November 09, 2005 7:29 PM

Anonymous julian said...

Cognitive Science major here. patr1ck and others are right: the resolution of haptic receptors in most areas of the body is not high enough to give you a much more meaningful image than the apparatus greg describes.

A better bet would be generating some sort of topographic image which is then felt with the fingers like braille. "Active touch" (being able to move fingers over the object in various directions as required) gets you much more information than passive touch, especially when it is intentional (that is, moving braille under someone's fingers for them is less useful than the person moving their own fingers over braille).

But again, even with that it's probably not getting you very much, and your hands would have to move rather quickly to figure things out at a speed that's actually useful for anything. The blind are probably better off directly poking at the world with their walking sticks, or listening, than spending time moving their hands over a generated image and figuring out what it means.

November 09, 2005 7:45 PM

Anonymous julian said...

I know that at least one piece of more "modern" work on ways to help the blind navigate the world is the UCSB Personal Guidance System.

The researchers involved are doing a lot of basic research to back up their applied research on this particular application, such as how well people (both sighted and blind) can navigate from speech directions alone, from directional auditory cues, etc.

Note that one of the professors involved specializes in haptic research (she did some great work on the active touch described above), so if she felt touch were a good way to communicate this sort of thing, she'd probably have gone for it.

November 09, 2005 7:58 PM

Anonymous mike said...

Wired had a story about similar research a while back, except in this case the sensor was placed on the nerve-rich tongue. It was also covered by the BBC.

November 09, 2005 8:01 PM

Anonymous Anonymous said...

Other Uses:

Applied Minds built a demo map table using moving rods pushing a flexible membrane. An array of projectors drew maps onto the contoured surface.

It was really spiffy.


Carl Coryell-Martin

November 09, 2005 8:02 PM

Anonymous julian said...

mike: oo, yeah, the tongue one. That was cool, forgot about that.

November 09, 2005 8:04 PM

Anonymous Anonymous said...

Yeah, I'm a CECS major here at USC and they're doing a lot of that. I believe the resolution is up to 4x4 now? And it's actually connected to the eye, so basically you see a black-and-white dot picture. I believe I heard in a lecture that another group is doing something similar but connecting it to the back of the brain.

November 09, 2005 10:27 PM

Anonymous Anonymous said...

A scientist named Bach-y-Rita has been working on this (I believe this was the story on the BBC). There was a pretty mind-blowing series of demonstrations with people who had lost their sense of balance, using this to stand up straight with their eyes closed. There was also a trial with a blind guy who by the end of the demo video was catching balls thrown at him. Unfortunately, the URL I had bookmarked is no longer working. I vaguely remember they had server problems, but this was about a year ago, and my memory isn't great. I gave the problem the good old Google but couldn't find any video, although the device is certainly called the BrainPort. However, the principle was very similar to the one you outlined. (I believe they used piezoelectronics, and I believe the array was around a hundred pixels or so.)

November 09, 2005 10:53 PM

Blogger Wil Shipley said...

Thanks for the pointers... I have written Dr. Bach-y-Rita, we'll see what he says. Maybe it'll be "screw off kid, ya bother me."

November 09, 2005 10:57 PM

Anonymous Tim said...

One approach to making a practical impact on the lives of people who can't see is to improve the tools they already use.

So here's a low tech idea: In China, there are grooves that run the length of the sidewalks, and bumps in the sidewalk at corners. Many blind people guide themselves by pushing their canes along the grooves; when they get to the bumps they know they've gotten to the corner.

The low-tech idea suggests some high-tech ones: a cane that provides notification of more distant obstacles (camera + computer + gyroscope?), or landmarks, like corners, that announce themselves using non-visual cues (camera + computer + headphone + 3-dimensional audio cues?).

I'm sure you'll cook up something way more interesting and compelling than those two toss offs... but while you're cooking, keep in mind that blind people have four senses intact, and that low resolution proxies of vision may not be the best way to improve the lives of people who can't see!

November 09, 2005 11:44 PM

Anonymous Tim said...

Sorry for the bad link. Here's one that works.

Chinese Sidewalks for the Blind

November 09, 2005 11:49 PM

Anonymous Nick Matsakis said...

It's definitely been done. I remember reading about some experiments on almost exactly the apparatus you describe done in the 80s in Daniel Dennett's book, Consciousness Explained.

The thing about machine vision, or in this case machine-aided vision, is that it is frightfully difficult in the general case. When we take a step forward, we don't really feel like the image we're seeing changes much, but in reality every "pixel" has changed in a very complex way according to the 3D arrangement and light that made up the image.

The amazing thing, though, is that the brain can make sense of such an avalanche of information even when it is provided in an alternate way. Dennett describes a blind patient who leapt back when the zoom setting on the camera he was connected to was adjusted; the adjustment made him feel like he was flying forward at great speed.

I don't know what ever happened to that line of research.

November 10, 2005 1:09 AM

Blogger Tom said...

Some people look to be trying it with sound:

November 10, 2005 12:28 PM

Blogger Erik said...

I'm a student at WSU in MN. The chair of our department has done a bit of research into this area, and the department has a good reputation for trying to expand the accessibility of computers to blind users. This is a little OT, but I'm getting to my point.

My friend and roommate, whom I met here at WSU, is blind. Meeting him and getting to know him led me to do some research into this area, and I've actually read some pretty interesting material on it. One good article about this was in an issue of Analog (sorry, I don't remember the date of the issue).

There is a lot of experimentation going on, some devices require surgery, but one of the devices I've read about sends electrical signals through the skin to a part of the brain (not sure how this works, CS student, not Pre-Med). The resolution of this device was about 4x4, and that was about a year ago that I read the article.

The real point is that while there isn't much out there right now, there is more awareness in the CS community and the medical community and I feel that a good and viable solution to this is just around the corner.

November 10, 2005 2:51 PM

Anonymous Anonymous said...

I was thinking of something along the lines of advanced 3-D stereoscopic sound enhancement: send a pulse above typical human hearing that reflects off the environment and creates a soft echo, which is relayed to the ears via something like a special hearing aid.

When someone moves in a room, for example, the state of the echo changes, and the brain already has wiring for determining the volume of an area or the distance to a particular object. All that would be needed would be to enhance it.

This is basically, IIRC, what echolocating animals use for navigation, to a great degree of accuracy.

November 10, 2005 4:03 PM

Anonymous Anonymous said...

I think you'd do much better mapping the information to a higher-resolution sense, not touch. I read in psych years ago about a system to do this sort of thing where the visual information was converted into sound. Pitch, tone, texture, volume, source position, vibrato, etc., can all be used to convey information about what's in front of you (or behind, if you're looking over your shoulder). Might be somewhat similar to a sonar arrangement, but could include information on colour etc. The problem, of course, is listening to other things at the same time.
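Just to make that mapping concrete, here is a minimal sketch (sample rates, pitch range, and names are all made up, not from any real system) that sweeps an image left to right, turning each column into a moment in time and each row into a pitch, with brightness as loudness:

```python
import numpy as np

def image_to_audio(img, sample_rate=8000, col_duration=0.05,
                   f_lo=200.0, f_hi=2000.0):
    """Sweep an image left to right: each column becomes a short audio
    slice; each row drives a sine oscillator whose pitch rises toward
    the top of the image and whose amplitude is pixel brightness (0-1)."""
    rows, cols = img.shape
    freqs = np.linspace(f_hi, f_lo, rows)  # top row = highest pitch
    t = np.arange(int(sample_rate * col_duration)) / sample_rate
    tones = np.sin(2 * np.pi * np.outer(freqs, t))  # (rows, samples)
    # Each column is a brightness-weighted mix of the row oscillators.
    return np.concatenate([img[:, c] @ tones / rows for c in range(cols)])

# A single bright dot at top-right: a high-pitched beep late in the sweep.
img = np.zeros((8, 4))
img[0, 3] = 1.0
audio = image_to_audio(img)
```

A dot at the top-right of the image comes out as a high tone near the end of each sweep; learning the mapping is then up to the listener's brain, which is exactly the hard part.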

November 10, 2005 5:09 PM

Blogger Dethe Elza said...

I was going to post about the same system as Tom. They are having great success mapping vision to sound, and the brain is able to adapt this to see reasonably well. Besides the main site there is an article at the BBC and you can listen to the radio show Quirks and Quarks where they interview the researchers, subjects, and hear the sound produced by their vision-to-sound system. Maybe not as sci-fi as the vision-to-touch, but it does appear to be working.

November 10, 2005 6:41 PM

Anonymous Anonymous said...

If microsurgery is sufficiently advanced, perhaps sensitive tissue (such as from fingertips) could be removed from a donor site, engineered to make it a better match for this kind of technology, then grafted somewhere else on the body.

For instance, perhaps the nerve endings could be redistributed to precisely align with the artificial stimulators. It might not be a 1:1 match, perhaps clusters of 10 nerve endings would be formed into a circular area which would be stimulated by one precisely positioned rod or electrode.

Adjacent clusters of nerve endings could be spaced apart, which might improve the sensitivity and reduce interference between nerve endings that might blur the input.

If this is too difficult with today's tech, perhaps an easier approach would be to simply numb or kill the nerve endings which are not under the stimulators. You'd get higher 'resolution', however, if you kept all the nerve endings and just moved them around.

November 11, 2005 9:39 AM

Blogger Mathieu said...

Not sure if anyone has already posted this, but I also saw on an Australian tech show some time back a system like the one Wil describes, but with the poking done electrically on the tongue, which is more sensitive and has a higher 'resolution' of nerve endings. The demonstration showed a blind person training with basic shapes and then being able to 'see' a candle flame.

November 11, 2005 3:47 PM

Blogger Avi said...

Hi Wil. I just came across your site and this somewhat old post, but you should check out the work of Dr. Ken Goldberg (robotics/telepresence) at UC Berkeley. In particular, a project/robotic art installation called the Data Dentata, or Data Mitt, in which the sensation of touch (from a live person) was encoded by a device at one end of a phone line (with modem), and the transmitted signal was decoded to create a corresponding touch (to a person) with a similar device at the other end. I have heard it said that blind people have a very heightened sense of touch through which they spatially perceive the world. Perhaps, using the same technology that underlies the Data Dentata, a camera/computer could translate the visual world into a haptic (touch-perception) system to aid the blind. Ken Goldberg is a really nice guy (a friend of mine from college) and would possibly be open to exploring the ideas you present here toward aiding the blind to "see."

February 26, 2008 12:35 AM
