The Next 100 Things You Need To Know About People: #108 — Our 5 Senses Are Swappable

Picture of BrainPort device

Over 285 million people in the world are visually impaired. What if they could see using the taste buds on their tongue rather than their eyes?

A woman who is blind puts on a pair of glasses that contain a camera. The image from the camera is sent to a small device, about the size of a postage stamp, that sits on her tongue. She feels a sensation like soda bubbles on her tongue—these are the camera's signals reaching electrodes on her tongue. The information then goes either to the visual cortex or to the part of the brain that processes taste signals from the tongue; the scientists who developed the technology say they aren't sure which part of the brain is actually receiving the information in this situation.

The taste buds are seeing — When the woman's brain receives the signals from her tongue, she sees shapes. The vision is not the same as normal sight, but she can see well enough to better navigate her environment. People who are totally blind can find doorways and elevator buttons when they use the device, called a BrainPort. They can read letters and numbers and pick up everyday objects, for example, a fork at the dinner table.

The brain is learning — When someone first uses the BrainPort, they don't see anything. It takes about fifteen minutes before they start to interpret the signals as visual information. Interestingly, they don't have to consciously "learn" anything—they aren't aware of practicing. The brain is unconsciously learning to interpret the information as vision.

Design to augment — According to the World Health Organization, over 39 million people are blind, 246 million have moderate to severe visual impairment, and over 360 million have disabling hearing loss. Until now, designing devices for people with visual, auditory, or other physical impairments has been an area that only a small number of designers have worked in. The rest have been told to make their designs "accessible" so that assistive devices (such as screen readers) are compatible with mainstream technologies. Keeping accessibility in mind is always important, but now more designers will be directly designing devices specifically created to augment an impaired sense.

What do you think? Will these devices become more common? If you are a designer, would you like to work in this field?


If you liked this article, and want more info like it, check out my newest book: 100 MORE Things Every Designer Needs To Know About People. 



A Podcast on Affordances and Adaptive Interfaces with Justin Davis

Photo of Justin Davis
Justin Davis of Madera Labs

Justin Davis of Madera Labs is a great speaker and a lot of fun to talk with. I met Justin in 2010 in Lisbon, Portugal, where we were both speaking at the UXLX conference. I invited him to speak on a panel with me at the HCI conference in 2011. I think we talked non-stop for 5 hours one day at the conference. Most of that was just because we can't stop talking about user experience and designing interfaces, but for a half hour we turned on the microphones and recorded an interview together. It's a deep dive into affordances and adaptive interfaces.

You can listen to the podcast by clicking on this link (30 minute podcast)

In this podcast we talk about:

What is an affordance?

Are affordances important even on something as mundane as a form? (The answer is yes.)

Have affordances been disappearing over time in interfaces?

Why it is a problem when affordances are missing

Is there a clash between visual design styling and human cognition?

The 4 types of affordances — (we refer to the chart below in the podcast):

Chart about affordances

How thinking about affordances helps you pay attention to the small things that are important but easily overlooked.

In addition to affordances, we talked about adaptive interactions – where the website/app changes based on the user's actions – including:

Content-based adaptation – changing the content on the page based on your past behavioral data

Content-based filtering – changing your interaction choices based on your past behavior

Collaborative filtering – changing content and interaction based on what others who seem to share your tastes have done

Interaction adaptation – the interface changes based not on content consumption, but on how you move through the interface.

What do you think? Are affordances important? What is the future of adaptive interfaces?

FYI — Justin’s Twitter handle is @jwd2a