In this episode of Human Tech we talk about the what, why, and how of journey mapping, and even veer off a little to tackle the philosophy of creating UX documents.
This is definitely a “nerdy” episode of Human Tech. We tackle a not very well known design topic: “objects and views”. It’s conceptually the hardest thing I consult on and teach about in user experience design. Learn about this powerful and slightly obscure factor that is the secret key to a usable product.
Gordon Moore was the CEO of Intel for many years. In 1965 he wrote a paper observing that the number of transistors in a dense integrated circuit was doubling (because of advances in design and production) about every two years. This observation was dubbed “Moore’s law,” and it is one of the reasons that the capability of technology keeps growing and expanding while the size of the technology often shrinks.
The pace of advance predicted by Moore’s law held until about 2012, when it began to slow down. But now some technologists are seeing it start to accelerate again.
In this episode of Human Tech we take a look at Moore’s Law and what the return of the original pace might mean for the technology that we use.
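The doubling rule is easy to play with in code. Here is a minimal sketch, assuming the commonly cited two-year doubling period and using the Intel 4004’s roughly 2,300 transistors in 1971 as an illustrative starting point (that figure is not from this post):

```python
# Idealized Moore's-law curve: transistor count doubles every two years.
# base_year/base_count are illustrative assumptions (Intel 4004, 1971).

def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Estimated transistor count under an idealized doubling curve."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Each decade is five doublings, i.e. a 32x increase.
for y in (1971, 1981, 1991, 2001, 2011):
    print(y, round(transistors(y)))
```

The point of the sketch is simply how fast exponential doubling compounds: five doublings per decade is a 32x jump every ten years.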
Research shows that people tend to make big life decisions at the first of the year, which gives us New Year’s resolutions. This is the right time for changes both large and small. (FYI, if you are in a “9” year, i.e., 19, 29, 39, and so on, research shows you are even more likely to make a big life decision.)
Instead of following some of the usual folksy advice about how to make and keep New Year’s resolutions, you could, instead, use brain and behavioral science to craft New Year’s resolutions that will actually work.
Here are some ideas of how to do that, and the science behind them.
1. Pick small, concrete actions. “Get more exercise” is not small. “Eat healthier” is not small. Oversized, vague goals are one reason New Year’s resolutions don’t work.
A lot of New Year’s resolutions are about habits — eating healthier, exercising more, drinking less, quitting smoking, texting less, spending more time “unplugged” or any number of other “automatic” behaviors. Habits are automatic, “conditioned” responses. Contrary to popular opinion, it’s not hard to change habits IF you do so based on science.
If it’s a habit and you want a new one, it MUST be something really small and specific. For example, instead of “Get more exercise” choose “Walk for at least 20 minutes at least 4 times a week” or “Have a smoothie every morning with kale or spinach in it.”
2. Use visual and/or auditory cues. Want to go for that walk every day? Set up a place in your home for your walking shoes. Don’t put them away in a closet. Put them in a place where you will see them when you get home from work or first thing in the morning. The shoes will act as a visual cue. And/or set an alarm on your phone called “Go for a walk” and have the alarm go off every morning at 7:30 am. People become conditioned to auditory and visual cues, and that makes it easier for an action to become a habit.
3. Decide what you want, not what you DON’T want. Instead of setting a resolution of “I’m not going to check my email 10 times a day,” set it for what you ARE going to do: “I’m going to use ‘batching’ and check my email only twice a day.” Instead of “I’m going to drink less soda,” set the resolution as “I’m going to replace drinking a soda with drinking water.” Although this may not seem that different, it’s important. It’s easier for your brain networks to work on an intention stated in the “affirmative” than on one stated in the “negative.”
4. Write a new self-story. The best (and some would say the only) way to get large and long-term behavior change is by changing your self-story.
Everyone has stories about themselves that drive their behavior. You have an idea of who you are and what’s important to you. Essentially you have a “story” operating about yourself at all times. These self-stories have a powerful influence on decisions and actions.
Whether you realize it or not, you make decisions based on staying true to your self-stories. Most of this decision-making based on self-stories happens unconsciously. You strive to be consistent. You want to make decisions that match your idea of who you are. When you make a decision or act in a way that fits your self-story, the decision or action will feel right. When you make a decision or act in a way that doesn’t fit your self-story you feel uncomfortable.
If you want to change your behavior and make the change stick, then you need to first change the underlying self-story that is operating. Do you want to be more optimistic? Then you’d better have an operating self-story that says you are an optimistic person. Want to join your local community band? Then you’ll need a self-story where you are outgoing and musical.
In his book Redirect, Timothy Wilson describes a large body of impressive research on how stories can change behavior long-term. One technique he has researched is “story-editing”:
Write out your existing story. Pay special attention to anything about the story that goes AGAINST the new resolution you want to adopt. So if your goal is to learn how to unplug and be less stressed, then write out a story that is realistic, that shows that it’s hard for you to de-stress, for example, that you tend to get overly involved in dramas at home or at work.
Now re-write the story — create a new self-story. Tell the story of the new way of being. Tell the story of the person who appreciates life and takes time to take care of themselves.
The technique of story-editing is so simple that it doesn’t seem possible that it can result in such deep and profound change. But the research shows that one re-written self-story can make all the difference.
Give it a try. What have you got to lose? This year use science to create and stick to your New Year’s resolutions.
What do you think? What has worked for you in keeping your resolutions?
For more info:
Timothy Wilson’s book, Redirect
Charles Duhigg’s book, The Power of Habit
My book, How To Get People To Do Stuff
B.J. Fogg’s website: tinyhabits.com
In this episode of Human Tech we talk about user research:
What it is
Why you can’t skip it
What kind of user research to do
Making the ROI case for user research
By the way, if you want to learn more about user research, we have an online video course on the topic.
Should we let go of personas? Should we stop using them?
One of the reasons that personas may be looked down on these days in some design circles is that people make mistakes in how they create or use them. Below I’ve outlined some of the common mistakes I see around personas when I’m called in to consult on a client project. And after discussing the mistakes, I offer a suggestion for an alternate tool in your Target Audience Toolbox. First, the mistakes:
Which leads us to the tool you need to have BEFORE you create personas, and might be the tool you use INSTEAD of personas…
When I teach user research, there is a step that comes before personas: identifying user groups. A persona is a fictional representation of a user group.
Let’s say that you are designing a banking app. Who is the target audience? You decide that you have three different target audiences. One is people who are already customers of your bank and are used to banking online. Another is people who are already customers of your bank, but are not used to banking online, and the third group is people who are not current customers, but are used to banking online, perhaps with one of your competitors.
The next question to ask is how these groups differ. What are the important criteria that distinguish one group from another? Is it whether or not they are current customers? Is it their familiarity with online banking? Is it something else? Where they live? What their native language is? How old they are? Based on the research you have (hopefully) done, you determine which variables are the important ones that distinguish one group from another.
Let’s say that when you look at your information you decide that whether or not someone is a current customer won’t make any difference. Two of the user groups vary only on that one criterion, and your research tells you that criterion is not that big a deal in terms of using your new app. In that case you can combine those two user groups into one.
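If you keep your user groups in a simple table, that combining step becomes mechanical: drop the variable that doesn’t matter and merge any groups that are now identical. A hypothetical Python sketch of the banking-app example (the group names and variables are illustrative, not from a real project):

```python
# Hypothetical user group table for the banking-app example.
groups = [
    {"name": "Customer, banks online",     "current_customer": True,  "banks_online": True},
    {"name": "Customer, new to online",    "current_customer": True,  "banks_online": False},
    {"name": "Non-customer, banks online", "current_customer": False, "banks_online": True},
]

# Research says current_customer doesn't affect app use, so keep only
# the variables that matter and merge groups that become identical.
relevant = ["banks_online"]

merged = {}
for g in groups:
    key = tuple(g[v] for v in relevant)        # group signature on relevant vars
    merged.setdefault(key, []).append(g["name"])

for key, names in merged.items():
    print(dict(zip(relevant, key)), "<-", names)
```

Running this collapses the three original groups into two, because the two groups that differed only on customer status fold together once that variable is dropped.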
A persona, then, is just a representative fictional person that summarizes one user group. And the persona summarizes that group only on the variables that you think are important (i.e., no cats in this case).
But this also means that maybe you don’t need a persona. Here’s a secret — after creating a lot of personas throughout my career I’m going to confess that I don’t use them when I’m designing. I’ve created the user groups first and that’s what I work off of. I only create personas if a) the client asks me for them or b) we need to share this information out to others, such as stakeholders, developers, and so on. In my experience personas are more approachable than “User Group Tables” to people who are not used to them.
In summary, go ahead and use personas, but try to avoid making these mistakes. And if the personas are just for you, consider using the prequel — the User Group Table — instead.
The Team W is compiling a list of some of the best User Experience, Human/Tech, Design, and Behavioral Science conferences coming up in 2019. If you have a favorite conference (or if you put on a conference) that you would like to be considered for the list please send:
What makes the conference special/the best/a not-to-miss event
Website if available, otherwise a contact person
Send to firstname.lastname@example.org
We’ll compile the list and post it.
I hope you’ve heard of System 1 and System 2 thinking. It’s an idea popularized by Daniel Kahneman. System 1 is our normal state of brain activity: watching TV, driving, looking at a picture of a sad face. It’s simple, effortless, and our favorite mode to be in. System 2 is heavy thinking, such as solving a tough math problem, or taking the bar exam to become a lawyer (which this author did and passed, so there). It’s hard, uncomfortable, and actually uses up more calories. It’s literally more work.
The idea that there are two different processing systems in the brain is not new. And it’s probably a much better analogy for how the brain works than the traditional “the brain is a computer” metaphor, which isn’t accurate.
Much like System 1 and System 2, there is another way of thinking about these networks, proposed by Kirkpatrick and Epstein in their 1992 paper “Cognitive-experiential self-theory and subjective probability: Further evidence for two conceptual systems.”
They propose that there are two modes of processing information: an experiential conceptual system and a rational conceptual system. Let me try to simplify this.
The first mode is the experiential conceptual system. Note, this is not experimental; it’s experiential, which means observed or perceived. Our experiential system encodes information as “concrete representations” (thanks BEGUIDE 2016). Take this mind journey with me:
Think of a door alone in a long hallway. A single closed door in an empty space.
Through the magic of the brain, you have conjured up an image of a door. You can see its color, how it opens, the space around it. It’s a physical object.
In your mind journey keep thinking about the door, but walk closer. Get so close to the door you can almost smell it. Lean up close to it right before you touch it, and blow softly on it.
I’ll bet your brain made a solid door. Your breath didn’t go through. It’s a real object in your mind.
In the cognitive-experiential self-theory you’ve used your experiential conceptual system to create something observable; it’s an object.
Now instead let’s put you in front of a tricky math problem you have to solve by hand. Say (47*16)/19.
I want you to visualize the answer. What is it? Well, unless you’re an autistic savant, you can’t visualize the answer right away. You can’t “see” the answer the way you can see the door, because you’re using a different system. You have to use the rational conceptual system. You have to remember math and the strategies to multiply and do long division. It’s a different system. It feels different.
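For a computer, of course, this “tricky” problem is trivial rational processing. A throwaway Python check (the numbers are just the example from above):

```python
# The "tricky" problem from above: (47 * 16) / 19.
# Trivial for a machine; effortful, non-visualizable System 2 work for a human.
result = (47 * 16) / 19
print(round(result, 2))  # roughly 39.58, not something most of us can "see" at a glance
```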
Kirkpatrick and Epstein wanted to see if any weird human brain stuff went on when humans had to switch between the two systems. So here’s the experiment they set up (for you purists, I’m skipping to Experiment 3 in their study):
There were two bowls with red and white jelly beans. One was the Big Bowl that had 100 jelly beans, and one was the Small Bowl with only 10 jelly beans.
They set up a game where if you randomly pick a jelly bean and it’s red, you win some money (like $4); but if it’s white you win nothing.
They then put their subjects into one of four conditions. Condition 1 had (and told subjects) there was a 10% win rate. So that means 10 red jelly beans and 90 white jelly beans in the Big Bowl, and 1 red jelly bean and 9 white jelly beans in the Small Bowl.
The odds are the same; either 10/90 or 1/9.
Condition 2 had (and told subjects) there was a 90% win rate. With 9/1 jelly beans in the Small Bowl, and again 90/10 jelly beans in the Big Bowl.
Again, the odds are the same; either 90/10 or 9/1.
Conditions 3 and 4 were the same as Conditions 1 and 2, except the odds were framed as losing. Condition 3 had a 10% lose rate (so the odds and bowls were the same as Condition 2, 9/1 and 90/10), and Condition 4 had a 90% lose rate (so the odds and bowls were the same as Condition 1, 1/9 and 10/90).
Subjects were then put in front of the Big Bowl and the Small Bowl and could decide which bowl they wanted to bet on. Here’s the important thing to remember: THE ODDS IN THE BOWLS ARE EXACTLY THE SAME. In every condition the odds for the Big Bowl and the Small Bowl are identical. It’s just that the Big Bowl has 10x the number of jelly beans.
Statistically it makes NO DIFFERENCE which bowl you bet on. If you gave this problem to a computer (and perhaps this is a great question for my Turing Test, to see if you’re AI or a human), it would bet randomly, or 50/50 on the Big or Small bowls. The odds are the same. You make no more or less money betting on one over the other.
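You can check the “identical odds” claim directly. A small Python sketch of the win conditions, using the $4 payoff from the example above:

```python
# Two bowls, same 10% win probability, different absolute counts.
from fractions import Fraction

payoff = 4  # dollars for drawing a red bean, per the example above

big_bowl   = {"red": 10, "white": 90}   # 100 beans
small_bowl = {"red": 1,  "white": 9}    # 10 beans

def expected_value(bowl):
    """Expected dollars per draw: P(red) * payoff, kept as an exact fraction."""
    total = bowl["red"] + bowl["white"]
    return Fraction(bowl["red"], total) * payoff

print(expected_value(big_bowl))    # 2/5 of a dollar per draw
print(expected_value(small_bowl))  # 2/5 of a dollar per draw, identical
```

Both bowls pay an expected 40 cents per draw, so a purely rational bettor has no reason to prefer either one.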
So that’s what people did, right? Of course not!
When presented with low odds of winning (the 10% win, or 90% lose conditions), about 75% of people chose to bet in the Big Bowl (73.1% for 90% lose and 76.9% for 10% win).
Conversely, when presented with high odds of winning (the 90% win, or 10% lose conditions), only about a third chose to bet in the Big Bowl (30.8% for the 10% lose condition, and 36.5% for the 90% win condition).
When presented with low odds of winning, most people wanted to gamble on a Big Bowl with lots of jelly beans, but when presented with high odds of winning, most people wanted to gamble on a Small Bowl with very few jelly beans.
This provides very strong support for the theory that there are two different systems. Rationally we know the odds are the same, but then our experiential system kicks in. I quote from the BEGUIDE 2016: “our experiential system – unlike the rational system – encodes information as concrete representations, and absolute numbers are more concrete than ratios or percentages.”
When we’re faced with a simple ratio-based math problem we use our rational system. But when we are standing in front of bowls of jelly beans it’s not 90%; it’s 9 out of 10. That kicks us into experiential mode.
9 out of 10 is almost a sure win; it’s really concrete. Our brains tell us that we want the small bowl because there are “fewer” chances to lose because there are fewer jelly beans. There’s only one loser jelly bean! We only have to avoid one bad bean, but in the Big Bowl we have to avoid 10! Your brain says, “oh, 1 is smaller than 10, that feels better, bet on that”. And this happens even while the rational system tells you they’re the same.
We walk around in non-rational, experiential mode, so people bet the small bowl.
Conversely, when it is only a 1 out of 10 chance of winning, oh man, there’s only one winner jelly bean in the whole Small Bowl. I’d rather have 10 chances of winning, and the big bowl has 10 winner jelly beans, so 10 is more than 1, so let’s bet in the Big Bowl.
Even while the rational system says they’re the same.
People go with their feelings.
Takeaways then. Welp. It’s another nail in the coffin of human rational decision making. If you want people to feel better about making a choice that has small odds of success, they’ll feel better if there are lots of possible winners, even if there are also proportionally just as many chances to lose.
Conversely, if you want people to feel better about making a choice that has high odds of success, minimize the number of losing tickets, even if that means reducing the number of winning tickets. People feel much better when they see numerically only one losing ticket.
Kirkpatrick, L. A., & Epstein, S. (1992). Cognitive-experiential self-theory and subjective probability: Further evidence for two conceptual systems. Journal of Personality and Social Psychology, 63(4), 534-544. doi:10.1037/0022-3514.63.4.534