Jason is 20 years old and he’s deaf. He puts on a special vest that’s wired so that when it receives data, it sends pulses to his back.
The vest is connected to a tablet. When I say the word “book” into a microphone that feeds into the tablet, the tablet turns the word into a signal that is sent to the vest. Jason now feels a pattern on his back through his sense of touch. At first, he can’t tell me what the word is. I keep saying words and he keeps feeling the patterns. Eventually, he’s able to tell me the words he’s hearing. His brain has learned to take each pattern and translate it into words.
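To make the pipeline concrete, here is a minimal sketch of how a tablet might turn sound into a vibration pattern: split the sound’s frequency content into bands and map each band’s energy to the intensity of one motor on the vest. The function name, the motor layout, and the 0–255 intensity scale are all my assumptions for illustration, not Eagleman’s actual software.

```python
# Hypothetical sketch: map frequency-band energies from a sound onto
# the intensities of vibration motors on a vest. Illustrative only.

def bands_to_motor_pattern(band_energies, n_motors=32, max_intensity=255):
    """Scale each band's energy to a motor intensity in 0..max_intensity."""
    peak = max(band_energies) or 1.0  # avoid division by zero on silence
    pattern = []
    for i in range(n_motors):
        # Reuse bands cyclically if there are fewer bands than motors.
        energy = band_energies[i % len(band_energies)]
        pattern.append(round(energy / peak * max_intensity))
    return pattern

# Example: a toy spectrum with most energy in the low-frequency bands.
spectrum = [0.9, 0.6, 0.3, 0.1]
pattern = bands_to_motor_pattern(spectrum, n_motors=8)
```

Each distinct sound produces a distinct spatial pattern of intensities, which is all the brain needs: a consistent mapping from input to touch.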
The interesting thing is that this happens unconsciously. He doesn’t have to consciously work at learning the patterns.
This describes an actual project by David Eagleman, a neuroscientist at Baylor College of Medicine.
Sensory Substitution — Eagleman calls it sensory substitution. Information comes into your body and brain from your eyes, ears, touch, and so on. But did you know that the brain is quite flexible and plastic in this regard? When data from the environment comes in, from any of the senses, the brain figures out the best way to analyze and interpret it. Sometimes you’re consciously aware of the data and its meaning, but most of the time your brain analyzes the data and uses it to make decisions without your ever realizing it.
Sensory Addition — Eagleman takes the idea of sensory substitution a step further, to sensory addition. He has people (without hearing impairments) put on the vest. He takes stock market data and uses the same program on the tablet to turn the stock market data into patterns, and sends those patterns to the vest. The people wearing the vest don’t know what the patterns are about. They don’t even know it has anything to do with the stock market. He then hands them another tablet where a screen periodically appears with a big red button and a big green button.
Eagleman tells them to press a button whenever the buttons appear. At first they have no idea why they should press one button versus the other. They’re told to press a button anyway, and when they do, they get feedback about whether they were right or wrong, even though they have no idea what they were right or wrong about. The buttons are actually buy and sell decisions (red is buy, green is sell) tied to the data they’re receiving, but they don’t know that.
Eventually, however, their button presses go from random to being right all the time, even though they still don’t know anything consciously about the patterns. Eagleman is essentially sending big data to people’s bodies, and their brains interpret the data and make decisions from it—all unconsciously.
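The learning loop described above can be illustrated with a toy model: a learner sees a pattern, guesses “buy” or “sell,” and is told only right or wrong. A simple perceptron-style weight update is enough to show how accuracy can climb from chance to perfect on a separable toy dataset, without the learner ever “knowing” what the patterns mean. This is purely illustrative and not Eagleman’s experimental setup; the data, update rule, and parameters are assumptions.

```python
# Toy sketch of learning from right/wrong feedback alone.
import random

def train(patterns, labels, epochs=20, lr=0.1):
    """Perceptron-style learner: 1 = buy, -1 = sell."""
    weights = [0.0] * len(patterns[0])
    for _ in range(epochs):
        for pattern, label in zip(patterns, labels):
            score = sum(w * x for w, x in zip(weights, pattern))
            guess = 1 if score >= 0 else -1
            if guess != label:  # the only feedback: "wrong"
                weights = [w + lr * label * x
                           for w, x in zip(weights, pattern)]
    return weights

# Toy data: the first feature secretly determines buy vs. sell.
random.seed(0)
patterns = [[random.choice([-1, 1]) for _ in range(4)] for _ in range(40)]
labels = [p[0] for p in patterns]

weights = train(patterns, labels)
accuracy = sum(
    (1 if sum(w * x for w, x in zip(weights, p)) >= 0 else -1) == y
    for p, y in zip(patterns, labels)
) / len(patterns)
```

The learner ends up responding correctly to every pattern, yet nothing in its weights “explains” the stock market, which mirrors how the participants’ guesses improve without conscious understanding.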
Engaging the unconscious for big data — Big data refers to large data sets that are combed for predictive analytics. The idea is that if you can collect massive amounts of data, even disparate data, and analyze it for patterns, you can learn important information and make decisions based on that information. Data sets of Internet searches, Twitter messages, meteorology, and more are being collected and analyzed. But how do you convey the information in a way that makes sense? How can you get the human mind to see patterns in what at first seems like meaningless data? The conscious thought process is not very good at this task. The conscious mind can handle only a small subset of data at one time, but the unconscious is great at taking in large amounts of data and finding patterns. If you want to see the patterns in big data, you have to engage the unconscious.
A Sensory Room — Other scientists are also working on the idea. Jonathan Freeman, a professor of psychology at Goldsmiths, University of London, and Paul Verschure, a professor at the Universitat Pompeu Fabra in Barcelona, have created the eXperience Induction Machine (XIM). The XIM is a room with speakers, projectors, projection screens, pressure-sensitive floor tiles, infrared cameras, and a microphone. A person stands in the room and big data visualizations appear on the screen. Freeman and Verschure monitor the response of the person in the room through a headset. They can tell when the person is getting overloaded or tired, and then they can make the visuals simpler.
Go direct — When you work with big data, consider bypassing complex visual analysis and elaborate analytical representations altogether. It may be more effective to feed the data directly to the sense organs and let the brain do the analytics.
For more information — Watch David Eagleman’s TED Talk, “Can we create new senses for humans?”
If you liked this article, check out my new book, 100 MORE Things Every Designer Needs To Know About People.