Mind games: Why everything you thought you knew about yourself is wrong

The decisions we make and even the memories we hold are based on delusions, according to a new book

Gillian Orr
Monday 29 October 2012 14:30 EDT

So you remember your wedding day like it was yesterday. You can spot when something is of high quality. You keep yourself well-informed about current affairs but are open to debate and discussion. You love your phone because it's the best, right? Are you sure? David McRaney from Hattiesburg, Mississippi, is here to tell you that you don't know yourself as well as you think. The journalist and self-described psychology nerd's new book, You Are Not So Smart, consists of 48 short chapters on the assorted ways that we mislead ourselves every day. "The central theme is that you are the unreliable narrator in the story of your life. And this is because you're unaware of how unaware you are," says McRaney. "It's fun to go through legitimate scientific research and pull out all of the examples that show how everyone, no matter how smart or educated or experienced, is radically self-deluded in predictable and quantifiable ways." Based on the blog of the same name, You Are Not So Smart is not so much a self-help book as a self-hurt book. Here McRaney gives some key examples.

Expectation

The Misconception: Wine is a complicated elixir, full of subtle flavours only an expert can truly distinguish, and experienced tasters are impervious to deception.

The Truth: Wine experts and consumers can be fooled by altering their expectations.

An experiment in 2001 at the University of Bordeaux had wine experts taste a red and a white wine to determine which was the best. They dutifully explained what they liked about each wine, but what they didn't realise was that the scientists had simply dyed the same white wine red and told them it was a red wine. The tasters described the sorts of berries and tannins they could detect in the "red" wine as if it really were red. Another test had them judge a cheap bottle of wine and an expensive one. They rated the expensive wine much more highly than the cheap one, with far more flattering descriptions. It was actually the same wine. This isn't to say wine-tasting is pointless; it shows that expectation can radically change experience. Yes, these people were experts, but that doesn't mean they can't be influenced by the same things as the rest of us, whether it be presentation or advertising or price. This drives home the idea that reality is a construction of the brain. You don't passively receive the outside world; you actively construct your experience moment by moment.

The Texas Sharpshooter Fallacy

The Misconception: We take randomness into account when determining cause and effect.

The Truth: We tend to ignore random chance when the results seem meaningful or when we want a random event to have a meaningful cause.

Imagine a cowboy shooting at the side of a barn over and over again. The side of the barn fills up with holes. If you walk over and paint a bullseye around a cluster of holes, it will look as though you have made quite a lot of accurate shots. It's a metaphor for the way the human mind naturally works when trying to make sense out of chaos. The brain is very invested in taking chaos and turning it into order. For example, in America it's very popular to discuss how similar the Lincoln and Kennedy assassinations were. Elected 100 years apart, Lincoln was killed in Ford's Theatre; Kennedy was killed in a Lincoln automobile made by Ford. They were both killed on a Friday, sitting next to their wives, by men with three names. And so on and so on. It's not spooky. People take hold of the hits but ignore the misses. They are drawn to the things that line up, that are similar or coincidental, and they ignore everything else that doesn't fit. The similarities are merely bullseyes drawn around the many random facts.

Confirmation Bias

The Misconception: Your opinions are the result of years of rational, objective analysis.

The Truth: Your opinions are the result of years of paying attention to information that confirmed what you believed, while ignoring information that challenged your preconceived notions.

Any cognitive bias is a tendency to think in one way and not another whenever your mind is on auto-pilot, whenever you're going with the flow. Confirmation bias is a tendency to pay attention to evidence that confirms pre-existing beliefs, notions and conclusions about life, and to completely ignore other information. This happens so automatically that we don't even notice. Say you have a flatmate and you are arguing over who does most of the housework, and both of you believe that you do most of the work. What is really happening is that each person notices when they do the work and fails to notice when they don't. The way it plays into most of our lives is through the media we choose to put into our brains: the television, news, magazines and books. We tend to pick out only things that line up with our pre-existing beliefs and rarely choose anything that challenges them. It relates to the backfire effect, a cognitive bias whereby, when we're presented with contradictory evidence, we tend to reject it and support our initial belief even more firmly. When people watch a news programme or pundit, they aren't looking for information so much as confirmation of what they already believe is going on.

Brand Loyalty

The Misconception: We prefer the things we own over the things we don't because we made rational choices when we bought them.

The Truth: We prefer the things we own because we rationalise our past choices to protect our sense of self.

Why do people argue over Apple vs Android? Or one car company versus another? After all, these are just corporations. Why would you defend a brand as if you were its PR representative? We believe that we prefer the things we own because we made deep, rational evaluations of them before we bought them, but most of the rationalisation takes place after you own the thing. It's the choosing of one thing over another that leads to narratives about why you did it, which usually tie in to your self-image.

There are at least a dozen psychological effects that play into brand loyalty, the most potent of which is the endowment effect: you feel like the things you own are superior to the things you don't. When you buy a product you tend to connect the product to your self-image, then once it's connected to your self-image you will defend it as if you're defending your own ego or belief structure.

The Misinformation Effect

The Misconception: Memories are played back like recordings.

The Truth: Memories are constructed anew each time from whatever information is currently available, which makes them highly permeable to influences from the present.

You might think your memory is a little fuzzy but not that it's completely inaccurate. People believe that memory is like a video or files stored in some sort of computer. But it's not like that at all. Memories are actually constructed anew each time that you remember something.

Each time, you take an old activation sequence in your brain and reconstruct it, like building a toy airplane out of Lego, then smashing it, putting it back in the box and building it again. Each time you build it, it will be a little different, based on the context and experience you have had since the last time you created it.

Oddly enough, the least remembered memory is the most accurate. Each time you bring it back into your life, you edit it a little more. In 1974 Elizabeth Loftus had people watch a film of two cars colliding and divided them into groups. She asked each group essentially the same question but with a slightly different verb: how fast were the cars going when they contacted, hit, bumped, collided or smashed into each other? The more violent the wording, the higher the speed they estimated. The way the questions were worded altered the memories the subjects reported.

They weren't looking back to the memory of the film they watched; they were building a new experience based on current information. Memory is actually very malleable, and it's dangerous to think of it as a perfect recording of a past event.

'You Are Not So Smart: Why Your Memory is Mostly Fiction, Why You Have Too Many Friends on Facebook and 46 Other Ways You're Deluding Yourself' by David McRaney (Oneworld, £8.99)
