How Our Intuitions Deceive Us - The Invisible Gorilla

By Christopher Chabris and Daniel Simons


Readers' Reviews

★ ★ ★ ★ ☆
tarar
A very insightful book. You walk away with an understanding of how illusions of reality are formed in our minds. The writer has thoroughly documented his presentation. However, I found the detailed case descriptions to be a bit more elaborate than I was interested in, so I had to skim through them to get to the next eye-opening concept.
★ ★ ★ ★ ★
caroline pattison
I stumbled upon this book by chance: there was an article written about it in the Wisconsin Pharmacist's Association magazine, and it seemed like an interesting book. Once I started reading it, I couldn't put it down. It is very thought-provoking and makes you think differently about the world. I think I am probably a better person for having read the book.
★ ★ ★ ★ ★
bits
A really interesting book concerning all the ways in which humans deceive themselves. Written with humour, but also with great examples that show just how detrimental these deceptive illusions can be. This book is for anyone interested in learning more about the intriguing things and behaviours that aren't really what they seem.
★ ★ ★ ★ ★
constance lapsati
A great work with key contributions to understanding how humans can be deceived.
The book should be read by managers and other professionals who are required to make decisions daily and risk relying on intuition.
★ ★ ★ ★ ★
dennis dallaglio
Has a white cover and many pages. Would buy again!

Seriously though, I do recommend this book quite highly for both scientists and layfolk. Fair Warning: reading this book will change how you understand your own ability to perceive the world.
★ ★ ★ ★ ☆
batsheva knopfler
Sale price sticker on cover was the only thing that would make this a 4 star review instead of a 5 star. Otherwise, the book is in perfect condition and delivery time was great, on the early side of the ETA.
★ ★ ★ ★ ☆
jennifer li
Very well written... new ideas... smart but not complicated... most of the examples are well selected and illustrative.

The one example that failed to impress me was the Hillary Clinton sniper story. It just doesn't make sense to me that Clinton was not actually lying... the story is too detailed and the political value too obvious.
The other part that failed to impress was what sounded to me like financial advice on investing in the stock market... the writers themselves fell victim to the curse of knowledge. Their advice is worthless if you consider the return on the average long-term stock portfolio held through the past decade... still a great book.
★ ☆ ☆ ☆ ☆
sheri becker
Full of biased statistics to argue certain points, tedious, and full of filler to make their points longer. There were, however, some interesting points in the book. Overall, very biased in a lot of the statistics, with ambiguity used to cover points like the "power of small numbers" in many of their studies (see Thinking, Fast and Slow). Honestly, the book could be SparkNoted; don't waste your money.
★ ★ ☆ ☆ ☆
dennis chin
Other than a few interesting experiments, this book lacked any real substance. I was really hoping for more. The writing style was bland, and the authors' obvious political views and biases were unnecessarily present throughout.
The first thing that struck me was that, for a book about psychology and studies, the authors describe their gorilla experiment by saying "around half" of the people didn't see the gorilla. "Around half" is even vaguer than "nearly" or "almost". Just my opinion, but I thought that was pretty lame given the nature of this sort of book; I expected actual numbers and hard facts.
★ ★ ★ ★ ☆
yulianus xu
The authors of this book are a pair of psychologists who gained fame for their famous “gorillas in our midst” experiment, which illustrated that human visual perception and attention are not as flawless as we would believe. In this experiment, subjects watching a recorded basketball game and instructed to count passes failed to notice a man in a gorilla suit crossing their field of view. The most surprising thing about this experiment to psychologists was that it violated the long-held belief that humans will attend to things that are novel in their environment. This would make evolutionary sense, as attending to novelty would allow organisms to become aware of threats to their survival, and yet here was proof that the opposite was the case. As other experiments confirmed, people are quicker to notice familiar things in their visual field than unusual ones.

The authors look at a wide range of mistaken perceptions, beliefs, and illusions. They look into a great many popular beliefs that seem to be supported by science, like the supposed link between vaccines and autism, how listening to Mozart can make your baby smarter, and the idea that you can train your brain via computerized exercises. They also look into generally accepted scientific truths that turn out to lack actual support.

To explain these phenomena, the authors cite several principles, such as the brain’s propensity to find patterns. That’s a necessary part of perception, as it allows us to classify our perceptions and accomplish tasks like separating foreground and background in our visual field. At the same time, it also allows us to see the man in the moon and Jesus in a piece of burnt toast. Another principle is the illusion of competence: people typically overestimate their own knowledge or skill in a task, and select experiential evidence that supports this belief.

All of this is presented in clear and well-supported exposition, and it makes for very interesting reading. I would, however, disagree with some of their interpretations based on my own background in perception and neurocognition. It’s a well-accepted principle in cognitive psychology that our perception of the world is not a direct reflection of our physical sensations. In other words, what we think we see and hear and feel is not just what our eyes, ears, and sense of touch register, but rather an artificial picture of the world created by our minds based on a combination of sensation and memory.

Consider that the actual sharp field of view of your eyes is equivalent to the diameter of a quarter held at arm’s length, and yet we perceive a panoramic image of the world before us. What we actually perceive is a world created by our minds, updated by our eyes when they happen to encounter something that differs from the saved image. If we don’t actively scan and attend to an object in our visual field that differs from the constructed image, we won’t see it, even if it’s a gorilla at a basketball game.

Or consider language. When we listen to an unfamiliar foreign language, we hear sounds that don’t appear to have any pattern, and we can’t discern word boundaries. One word simply blends into the next. Yet when we listen to a language we’re fluent in, we hear words that are clearly separate from each other, evenwhenwerunthemalltogether.

Another area where I’d disagree with the authors is in their discussion of an experiment in which subjects were instructed to allocate money between stock and bond funds, and were given financial data on the performance of the funds at different simulated intervals ranging from a month to five years. The best performance was achieved by subjects who received the least number of updates, and did the least amount of shifting of their allocations. They tended to leave the bulk of their money in the stock fund, which, while more volatile in the short run than the bond fund, had significantly higher long term appreciation.

The authors ascribe this to the “illusion of knowledge” in the group receiving more frequent updates. I think there’s a simpler explanation, one that doesn’t require any psychology, but one that might be obvious to an engineer: Aliasing error resulting from using the wrong sampling rate. We’ve all seen how wheels, fans, and other rotating objects appear to change speed and direction when videotaped or filmed. If the frame rate of the camera is slightly faster than the rotation of the object, it will appear to reverse. If the moving object doesn’t have a constant rate of change, the resultant image will appear to jump all over the place, randomly changing direction.

The same phenomenon occurs in measuring economic data, which is why most economic data includes time-averaged samples that filter out sampling errors as well as short-term fluctuations that don’t reflect underlying dynamics. The subjects receiving the five-year updates got a truer picture of the relative performance of the two funds, one with the noise and aliasing errors filtered out.

Despite my differences with these and other interpretations, I think this is, overall, an excellent book, and one that should prove both interesting and informative to anyone interested in factors that influence and can distort human perception and belief.
★ ★ ★ ★ ☆
sober
Lately, there has been a plethora of books trying to popularize the more interesting and counterintuitive results from fields like behavioral psychology. All of those books, as far as I'm aware, mention a particularly famous study in which participants view a video of basketball players and are asked to count the number of passes. As odd as it sounds, about half of the participants fail to notice the "invisible gorilla": a man dressed like a gorilla strolling from one side of the court to the other.

These two authors are the inventors of that and subsequent experiments. In other words, these authors are very knowledgeable about their field because, in a sense, they invented one of its primary experiments.

What is their focus in this book? Well, it is not so much that people didn't notice the "invisible gorilla" that surprised them, but the adamance with which participants denied that they could have missed something so obvious. Many disbelieved that there was actually a gorilla in the tape they were shown, accusing the researchers of playing a trick on them. So, the authors' mission in this book is to explore the human tendency toward overconfidence in our own abilities.

Each chapter focuses on a different "illusion" that comes from the human tendency to (very subconsciously) overestimate our ability. They are as follows:

Chapter 1 - Illusion of Attention, or, the belief that we are attentive to much more than we actually are at any given moment.
Chapter 2 - Illusion of Memory, or, the illusion that our memories are much more exact than they are.
Chapter 3 - Illusion of Confidence, or, the illusion that confidence (in others) is a good sign of competence.
Chapter 4 - Illusion of Knowledge, or, the illusion that we have detailed knowledge about many things that, in fact, we only have vague knowledge of.
Chapter 5 - Illusion of Cause, or, the illusion that two things happening sequentially necessarily signifies a cause/effect relationship.
Chapter 6 - Illusion of Potential, or, the illusion that in every human, there is a vast array of untapped potential waiting to come out (if only we learn to use more of our brains, listen to Mozart, "train our brains" etc.)

The thing is that while this book is very interesting and well written for casual reading, each of these illusions has potentially serious consequences. While the authors present studies and anecdotes in each chapter to illustrate each phenomenon, the message is very serious: if we are not careful to be somewhat aware of our tendency to overestimate our abilities, we could send the wrong person to prison (if we are a witness), spend too much time and money on the wrong things for our child's cognitive development (if we are a parent), or even cause an accident (if we are a texter-while-driving).

For instance, the authors spend a great deal of time in chapter 1 debunking the myth of multitasking. In reality, study after study shows that we can only multitask when (a) all but one of the things we are doing is completely routine, or (b) we alternate our attention rapidly, but often ineffectively, between all the things we are doing. It is literally impossible to do two non-routine things well at the same time. And this leads to people thinking that they can text or talk on their cell phone while driving, when studies show that this produces the exact same type of delayed reactions exhibited by drunk drivers. Once we text or talk, we can only drive well when nothing unexpected happens. Should a car dart in front of us, our reaction time will be about the same as a drunk driver's.

Another example? Chapter 5 spends much time examining the disjunct between how scientific studies establish causal connections and how the human brain does it. The latter often falls victim to seeing causal relationships in events that are merely sequential or correlational. In particular, the media often report a causal link between x and y when the scientific study only said that factors x and y were correlated (and the cause may be z or something more complex). We also tend, in our personal lives, to give more credence to anecdotes than to statistics. Put these together, and it leads to a lot of wasted money and time chasing false leads (like trying to undo autism by not getting children vaccinated, or buying Baby Mozart CDs based on very flawed reports).

All in all, this book is not only interesting and entertaining to read, but has some very serious lessons to teach. One would think a book telling us that we are often not all that we think we are might imbue its readers with pessimism. This book really does the opposite: it shows us that by knowing where we are most likely to make mistakes in estimating our abilities, we actually INCREASE our competence (or am I just succumbing to the illusion of confidence?).
★ ★ ★ ☆ ☆
mim holmes
This book presents some fascinating findings and provides the reader with intriguing insights into the workings of the mind. However, it is poorly organized and irritatingly repetitive in many parts. The writing style is quite uneven, as is the tone. Some passages are light and entertaining, some are clear and persuasive, and some are unconvincingly dogmatic. Some of the examples are very helpful while others (e.g., the one about Thomas J. Wise near the end of the book) seem to be off the mark because they do not illustrate the point that the authors are trying to make. All in all, it was interesting but felt like a first draft that needs to be gone over by a good editor.
★ ★ ★ ★ ★
tizire
The authors use short stories to illustrate their points. They use stories to explain the illusions of attention, memory, knowledge, confidence, potential, and cause. In particular, they spend a lot of time on an experiment they performed in which participants kept track of the number of passes one of the basketball teams made during a game. Then a person in a gorilla suit walks through the game for a minute. Roughly 50% of the people who watch the game don't see the gorilla. If participants are asked to keep track of additional information during the game, 70% of viewers don't see the gorilla. The authors don't explain why many of the viewers do see the gorilla. For the ones who didn't, it may be because there is only so much attention available, and when it is concentrated on one aspect, other obvious but unexpected facets, like gorillas on the court, are missed. Additional research failed to connect seeing unexpected objects such as the gorilla to any particular traits in the people who did see it.

This lack of attention to everything around us is used, via short stories, to explain intriguing anecdotes such as police not seeing crimes happening in front of them, commuters not stopping to listen to a world-famous violinist, pilots and auto drivers not seeing vehicles in their path, etc. They explain that, while driving, having a conversation with someone on a telephone is much more demanding of attention, and more dangerous, than having a conversation with someone in your car.

The authors use personal and fairly well-known events to examine how well we remember happenings. Famous events such as 9/11 and the Challenger disaster produce what are called flashbulb memories. They find that people believe they know the exact details of how they learned of the event and what they did. Experiments done with individuals and groups show that we rarely get all the details correct, and typically memory becomes less accurate over longer periods of time. Most people have a more realistic sense that they will not get the details correct for everyday happenings. The moral seems to be: if you want an accurate memory, make sure it is recorded on video or in writing.

Asking people about their intelligence, humor, driving skills, or chess-playing ability, then measuring these abilities and comparing their opinions with their actual level of ability, gives some interesting results. Most people tend to be somewhat more confident in their skill level than their test scores justify. Most interesting is that the least skilled are the most confident that they have high ability. When these less skilled people are given training, happily, their ability begins to approach their own opinion of their expertise.

People who are very confident frequently wind up in leadership positions; unfortunately, their actual ability is often much lower than their confidence. Surprisingly, the authors found that group decisions were no better than single-person answers on trivia questions. Patients have less confidence in doctors who pause to look up information during an exam. This is strange, since the doctor is probably aware of additional possibilities, and checking those possibilities makes for a more thorough diagnosis.

For witnesses in trials, confidence has great sway with jurors. And it is true that very confident witnesses are correct more often than less confident witnesses. But this confidence frequently obscures a lack of evidence against the defendant and results in innocent people being found guilty. Over 30% of these very confident witnesses are proven wrong over time by hard evidence.

Examples of experts such as geneticists guessing the number of genes before the human genome was sequenced are used to show that experts frequently differ from each other and from the correct answer. The geneticists, for instance, varied in their answers from about 25,000 to 140,000, with the best present estimate being 19,000-25,000. The trouble with this is that the term "gene" has more than one definition. I remember being at a genetics conference before the human genome was sequenced and speaking with the soon-to-be Nobel prize winner Sydney Brenner, Eric Lander of MIT, Craig Venter of Celera, and others. After I asked how many genes are in the human genome, Lander made the interesting comment that it depended on which of six different definitions they used in his laboratory. I have the feeling that if we look closely at the interviews and stories, the experts aren't as clearly wrong as they appear, but they clearly are not correct either. The same goes for projecting the time and cost of projects, or the results of actions, on almost any scale.

Failure in interpreting an assignment or problem is frequently due to having only superficial knowledge of the area and of the consequences of unknowns. We tend not to look at the details of a problem or project closely enough. A partial solution is studying its different aspects and asking questions until we thoroughly understand the project.

Weather predictions are known for their lack of precision. But compared to many other areas, the constant gathering of meteorological data, the refining of computer weather models, and the correlating of these with future weather patterns all make weather predicting better than it gets credit for. In many other areas, such as investing, predictions can be very valuable when they increase your wealth. In most markets, though, there are many factors that are almost impossible to predict over long periods of time. So there is a tendency to wait a long time to see whether predictions are accurate and then to invest just before the system is ready to change again; since the market is then likely near a high, this often results in losses.

Other areas of concern are the tricks we play on ourselves to see cause and effect where it doesn't exist. For example, the physician Andrew Wakefield noted that 8 out of 12 children developed autism after receiving the measles, mumps, and rubella (MMR) vaccine. That was the total number of patients studied; no controls and no placebos were used. Subsequent statistical analysis of hundreds of thousands of cases shows there is no cause and effect between the MMR vaccine and developing autism. Despite this, many parents are refusing to have their children vaccinated, putting everyone else who is not vaccinated at greater risk. The emotional outcry of the parents trumps the dry, fact-based studies. This is an example of the tricks we play on ourselves to see cause: perceiving patterns in randomness, treating happenings that occur together as linked when they are not, and interpreting events that occur before others as their cause. In all cases we make connections between events when the connection does not exist, or at least is unproven.

A lighter example has to do with listening to classical music, especially Mozart, and having improved mental ability. This turns out to be a case where one group observed the effect with what appear to be good scientific methods, but no other research group has been able to duplicate it. Recent results indicate that anything that puts us into a positive mood and a higher level of arousal results in small improvements in performance.

A common misconception is that on average people use only 10-15% of their brain. The brain has been thoroughly mapped, and all of it is used. A more correct statement is that we use a specific 10-15% of our brain for a specific task, and a somewhat different 10-15% for each different task. It may be true that we rarely use 100% of our brain at one time.

The idea of selling more goods by using subliminal suggestion started in New Jersey in 1957, when it supposedly sold more popcorn and Coke in a movie theater. Actually, this was just a made-up study by an advertising company to improve its business. Subsequent studies have shown subliminal suggestion has no effect.

Doing crossword puzzles, Sudoku, or computer games does show improved performance in the actual skills practiced, but very little if any carryover to other skills or to intelligence itself. Damn, easy would have been nice. What really appears to work is aerobic exercise. Neuroscientist Arthur Kramer of the University of Illinois conducted a multifaceted experiment on exercise and its effect on the cognitive ability of seniors. Cognitive testing before and after six months of three hours a week of walking, or of stretching and toning exercises, showed large improvements on cognitive tests, in addition to better heart health, for the aerobic group but not the stretching group. Meta-studies of all clinical trials, and actual measurements of subjects’ brains before and after exercise, showed a sizeable benefit for cognition in general. The studies didn’t appear to be double blind, so they are still open to question.

Somewhat surprising is research showing improved cognition from playing action computer games. Some caveats are that it takes a lot of practice, and the carryover to everyday application outside the laboratory is in question. Research has also been done on the ability of chess grandmasters to play under pressure, such as when blindfolded or when having to make moves very rapidly, comparing the number of errors they make to a regular match. The results indicate that they made more mistakes under pressure. Don’t we all? This raises some question as to whether they are actually using anything more than very fast intuitive pattern recognition: if they were relying only on intuitive pattern recognition, time shouldn’t be a factor.

What do we learn from all of this? If you have an important decision to make, check the facts you are basing the decision on to be sure they are as you remember them. Practical application seems limited, but it is interesting to understand that many of these popular beliefs don’t hold up under well-designed laboratory experiments. Another point is that you can speed up a decision by making the first choice that fulfills your general needs; in many situations it can take a great deal of time to look at all possibilities.
★ ★ ★ ★ ☆
ming
Do you ever feel like your mind is playing tricks on you?

You're not crazy - your instincts are deceiving you (those bastards).

My latest rental from the library, The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us by Christopher Chabris and Daniel Simons, provides a jaw-droppingly fascinating perspective on the mental illusions that influence our every word, action, and thought.

Chabris & Simons, both established cognitive psychologists, are best known for their "Gorillas in Our Midst" study (click the link to try it for yourself!). Their gorilla study (and namesake for the book) is used to illustrate the first of 6 everyday illusions: the illusion of attention. In subsequent chapters, the other 5 illusions are explained in detail: memory, confidence, knowledge, cause, & potential.

What we intuitively accept and believe is derived from what we collectively assume and understand, and intuition influences our decisions automatically and without reflection. Intuition tells us that we pay attention to more than we do, that our memories are more detailed and robust than they are, that confident people are competent people, that we know more than we really do, that coincidences and correlations demonstrate causation, and that our brains have vast reserves of power that are easy to unlock. But in all these cases, our intuitions are wrong, and they can cost us our fortunes, our health, and even our lives if we follow them blindly. -- Page 231

Admittedly, the book may sound like a bit of a downer, but I found it to be extremely intriguing. Along with the extensive explanation of each illusion, Chabris & Simons provide straightforward info on how to break these illusions for yourself. As you might expect, the key is to stop & think before you speak or act, ensuring you are acting from a perspective free of misconceptions.

When you think about the world with an awareness of everyday illusions, you won't be as sure of yourself as you used to be, but you will have new insights into how your mind works, and new ways of understanding why people act the way they do. Often, it's not because of stupidity, arrogance, ignorance, or lack of focus. It's because of the everyday illusions that affect us all. -- Page 242

If you have a curiosity for psychology & the instinctual functions of the human mind, I'm quite sure you will enjoy The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us as much as I did.
★ ★ ★ ★ ☆
ibrahim z
Wonderfully easy and fun read.

Just the right amount of technical talk mixed with real-world stories. Some good fun, some very unfortunate, and one heartbreaking.

Unfortunate: Witness memories in criminal cases aren't always on point. Most of us knew that. But I found the frequency shocking. There is no finger wagging from the authors, just simple explanations of experiments followed by real-life results and consequences. The story that stuck out for me: a man convicted due to the testimony of a rape victim. The victim was lauded for her bravery and convincing testimony, yet was proven wrong after 10 years by DNA. A chance meeting between the wrongly convicted man and the real rapist prompted the DNA testing. Conclusion: there is zero correlation between the confidence of a witness testifying "that's the guy" and the facts.

Shocking: Our long-term memories are not so hot. Even "flashbulb memories", meaning memories of what we were doing, who we were with, etc., when a major event occurred such as the JFK assassination or 9/11.

Concentration and accurate interpretation of the environment: one thing I didn't want to learn is that talking on a cell phone with a hands-free device does nothing to improve safety vs. talking whilst holding a cell phone. Oddly, talking to a passenger degrades safety only a little.
★ ★ ★ ★ ☆
leah lax
Summary:
Although somewhat supplanted by the more extended analysis of Kahneman's "Thinking, Fast and Slow," "The Invisible Gorilla" covers a range of mental processing errors in detail, shares a few data sets and anecdotes all its own, and is generally an excellent introduction to behavioral psychology for those intimidated by the size of Kahneman's text.

Thoughts and Observations:
-This book offers a reflective, self-analytical presentation of human mental processing errors with the goal of alleviating public overconfidence in their own mental abilities
-The authors place an important emphasis on the limits of behavior studies, and the often risky business of publishing studies that are misinterpreted by the media
-Great 'human interest' choice of topics: cell phones in cars, the spurning of a concert violinist
-A little too much gloating over the ubiquity of their own experiment's success (the gorilla experiment)
-Head-nods to Levitt and Tversky (research partner of Kahneman); Kahneman himself does not seem to be mentioned...
-Some overlap with Hallinan, "Why We Make Mistakes," in their analysis of film continuity errors and script supervisors
-Long analysis of memory: its focus and limits
-Analysis of confidence is probably the best and most important part of the book, and even has some funny stabs at self-help literature
-Analysis of causation errors is the portion of the book now most thoroughly outdated by Kahneman
-Unfair stabs at Gladwell throughout the book: they frequently caricature his theses, which are generally quite ambivalent and self-aware; they wrongly make Gladwell look smug/simplistic
-Analysis of 'mental transfer' and video games was for me (a former gamer) *fascinating* -- it's what I'll likely remember from this book a year from now, if nothing else
-I'm also a walker/hiker so I'm glad to hear that studies confirm this is a mental boon
-Excellent summary scenario that ties together all errors covered
★ ★ ★ ★ ★
kennan
Our brain constructs the world of reality for us, or does it? The book illustrates the many ways in which it fools us. Well, 'fooling' is a misleading word there, because the authors explain why it does not make sense for the brain to 'record' everything for us. The resources required for our day-to-day survival are limited, and storing and processing all of what we sense would probably be detrimental to us.

The chapters on the Illusion of Attention and the Illusion of Confidence were the best ones for me. I'll be honest: I totally missed the gorilla in the video, and so did my colleagues and friends. They were curious to know more about the book, which I'm sure is a very good sign for the book. I always make it a point to sound confident in my words, and I have always perceived others as better when they spoke confidently. The book reveals astounding experiences where confidence has trapped people into a false mindset.

The Illusion of Cause basically tells us that correlation does not always mean causation. I was able to relate this to a quote from the Bhagavadgita:

2:47 karmany evadhikaras te ma phaleshu kadachana
ma karma-phala-hetur bhur ma te sango ’stv akarmani

"You have a right to perform your prescribed duty, but you are not entitled to the fruits of action. Never consider yourself the cause of the results of your activities, and never be attached to not doing your duty." The part that says 'never consider yourself the cause of the results of your activities' guides us to not fall prey to this illusion.

I'm truly impressed with the amount of research the authors have done and studied in the making of this book.
★ ★ ★ ★ ★
devin bruce
This is a fascinating book about pattern recognition - how we perceive what we expect to perceive but also misperceive or may even be blind to that which we do not expect to see. We tend to believe that what we see is what is out there in the world to be seen. This belief is often untrue.

The book explores what has become one of the best known illusions. If you are not familiar with this experiment, click here to experience it. The instructions are to count how many times the players wearing white pass the basketball. Don't read further before doing this!

About half the people viewing this video do not see the gorilla. It is as yet unclear what the differences are between those who do and do not see this glaring intrusion.

What made the gorilla invisible? This error of perception results from a lack of attention to an unexpected object, so it goes by the scientific name "inattentional blindness." This name distinguishes it from forms of blindness resulting from a damaged visual system; here, people can't see the gorilla, but not because of a problem with their eyes. When people devote their attention to a particular area or aspect of their visual world, they tend not to notice unexpected objects, even when those unexpected objects are salient, potentially important, and appear right where they are looking. In other words, the subjects were concentrating so hard on counting the passes that they were "blind" to the gorilla right in front of their eyes. (p. 6-7)

What is in some ways even more important than a failure to notice changes is the mistaken belief that we should notice them. Daniel Levin cheekily named this misbelief change blindness blindness, because people are blind to the extent of their own change blindness... Most people firmly believe that they will notice unexpected changes, when in fact almost nobody does. (p. 55)

Such experiences of selective perception and selective interpretations of our perceptions are very frequent in our lives. They are usually totally outside of our conscious awareness.

I was myself introduced to issues of this sort in medical school, in a lecture on making the correct diagnosis. In the middle of the lecture, a man with a cast on his leg wandered in through the front door of the lecture hall. He paused and glanced up at the audience of 92 students, appearing disoriented. He then walked over to the lecturer and asked whether this was the discussion on fracture aftercare. The lecturer politely indicated that the room this man wanted was further down the hall. The man apologized for intruding and walked out the door on the other side of the room, in the direction indicated.

The lecturer shrugged, paused and shuffled his notes as he picked up the mental threads of his presentation, and continued with his lecture. About ten minutes later, he set aside his lecture notes and invited our class to describe the man who had walked across the room in front of us. We were astounded to find that we disagreed on which leg had the cast; the color of the man's hair, clothes and shoes; and numerous other details of what we had each seen!

Our misbeliefs about the accuracy of our perceptions have broad implications in our lives. They may get us into serious trouble. Take, for instance, our Western ways of dealing with health issues:

Patients trust doctors, perhaps more than they should, and that trust reinforces the confidence that doctors already have. As Keating puts it, "When people go to the doctor, they often believe that the doctor has an ability to make the right decisions for them. That goes beyond the scientific reality. They trust your decision-making more than their own. That's a problem because it encourages doctors to not be honest about what they know and what they don't know. It builds your ego to have people think that you know."

In medicine, the confidence cycle is self-perpetuating. Doctors learn to speak with confidence as part of their training process (of course, there may also be a tendency for inherently confident people to become doctors). Then patients, mistaking confidence for competence, treat doctors more as priests with divine insight than as people who might not know as much as they profess to. This adulation in turn reinforces the behavior of doctors, leading them to be more confident. The danger comes when confidence gets too far ahead of knowledge and ability. [Jim Keating, MD, runs a diagnostic center for clarifying difficult medical problems at the St. Louis Children's Hospital.] As Keating notes, "Equanimity is something we should aspire to, but we ought to get there by building skills, and it should always have a 'not sure' component to it so you can continue to learn. There's still a lot of room for humility in our profession." Doctors have to be able to listen to the evidence, admit when they don't know, and learn from their patients. Not all of them are able to overcome their overconfidence. (p. 104-5)

Pattern recognition is an important but limited aspect in the full spectrum of intuition, which also includes psychic awarenesses and participation in the collective consciousness. See more on this in an editorial in IJHC on Intuition (Benor, 2002).

This engaging and informative book is highly recommended for anyone interested in how we perceive and interact with our world.

Reference: Benor, Daniel J. Editorial Musings: Intuition. International Journal of Healing and Caring, 2002, 2(2), 1-17.
★ ★ ★ ★ ★
carolina cordero
A very interesting book about six everyday illusions that have potentially dangerous consequences in our lives:

a) The illusion of Attention - We frequently miss what we aren't looking for (See the fascinating video on attention - search for "Test Your Awareness" on Youtube). We talk on cell phones while driving even though it could be life threatening because we believe we can pay attention to both activities.

b) The illusion of Memory - We frequently overestimate our ability to remember. We don't understand how we store and retrieve memory - we "fill in" a lot of detail and don't really capture a flawless video of events.

c) The illusion of Confidence - We believe confident people and associate confidence with being right. We believe con men (the term actually stands for "confidence men"). We tend to believe people who appear confident as opposed to people who talk in uncertainties and probabilities.

d) The illusion of Knowledge - We often think we know more than we actually do. We tend to equate familiarity with knowledge - just because we are familiar with using a product doesn't mean we "know" how it really works. "Experts" are far more error prone in their estimates/predictions than we think.

e) The illusion of Cause - We tend to see patterns when randomness explains it much better. We have a need to explain everything even though association/correlation doesn't mean causation.

f) The illusion of Potential - We believe in quick fixes; that we use only 10% of our brain; and that listening to music or something else will "unlock" huge hidden human potential. We believe in faulty studies that "demonstrate" it even after they have been discredited.

I found the book well written, well researched, and engaging, even if it was a little simplistic in places. I didn't agree with some of the conclusions the authors make and had questions about some experiments they report. However, I found the book to add a great deal of value.
★ ★ ★ ★ ★
emily gill
This is a densely packed scientific read that lays bare all our illusions. Here is the table of contents; below it I paraphrase what each chapter covers:

Table of Contents:
Introduction: Everyday Illusions
1. I Think I Would Have Seen That (the illusion of attention)
2. The Coach Who Choked (the illusion of memory)
3. What Smart Chess Players and Stupid Criminals Have in Common (the illusion of confidence)
4. Should You Be More Like a Weather Forecaster or a Hedge Fund Manager? (the illusion of knowledge)
5. Jumping to Conclusions (the illusion of cause)
6. Get Smart Quick (the illusion of the quick fix)
Conclusion: The Myth of Intuition
Acknowledgements
Notes
Index

This book covers all the above with studies and stories to back up each illusion. Similar to "Being Wrong" by Kathryn Schulz, it covers many of the ways humans interact with their world and are incorrect in their decisions. Unlike Being Wrong, this is a bit more formidable, nearly textbook delivery of the subject. I can see where it would be well placed in a college psychology course, but it is not necessarily a book to read while relaxing on the beach. Given that my field of work is Human Factors, this was my kind of book, and if this is an interest of yours you might dive into its pages, chock-full of studies and data, like I did.

I did find it particularly interesting that so many studies were cited. The authors also covered the way the studies were set up and how future ones could be done a bit better, as well as how hard it would be to fully set up some studies.

For the average reader, not into the details of a Sensation and Perception course, the items covered affect many details of everyday life. Please take my sentence on each chapter as a very rough indication of what is covered.

Chapter 1. The illusion of attention was particularly appropriate given today's concern about driving while talking on the phone, and how divided attention is essentially attention well below the level needed to drive.

Chapter 2. This covered people's ability to recall events, especially stressful ones, from recollections of what they were doing on 9/11 to cases of personal assault where people had to recall an attacker's identity. Our retelling of a story in effect changes our memories.

Chapter 3. Confidence in our abilities is what Chess Players and Stupid Criminals share. Many of the most confident people don't really have the smarts they think they do.

Chapter 4. Often we think we know more about complex things than we do. Since we make decisions based on incomplete knowledge, it can have unfortunate results.

Chapter 5. Jumping to conclusions is a culmination of many of the other issues: we may think we have enough data or know enough to draw a conclusion, but unfortunately we often don't.

Chapter 6. The quick fix is a fallacy. We don't get smarter by listening to Mozart, and new skills usually come only from practicing the desired skill, not from some puzzle to increase brain power.

The Myth of Intuition: Intuition is often seen as a magical skill some people have, and it is often based on nothing. Sometimes people are just lucky, and other times they are just dead wrong.

In summary, a dense read chock full of studies, anecdotes and data. Amazing how we delude ourselves.
★ ★ ★ ★ ☆
alysa
Solid intro to many of the frequently occurring human biases and psychological errors that plague us. The audiobook is ably read by Dan Woren. One could not help but be enlightened after reading this book, especially businesspeople who are vastly overconfident in their decision-making processes (and I see it every day at the highest levels).
★ ★ ★ ★ ★
liberte louison
Most of us have heard of the authors' famous experiment. Subjects were asked to watch a basketball game and count passes. Meanwhile a gorilla enters the room. Most watchers don't even notice. They saw the gorilla but didn't notice it. There's a difference.

The implications of this experiment are everywhere. Memories are fragile but so are perceptions. It's hard to understand how we can not see something that literally crosses our path, but these authors provide convincing evidence.

The implications are especially profound in the world of law enforcement. The book is worth buying, owning and promoting just for one sentence on page 115:

"The common law of criminal procedure was established over centuries in England and the United States, and its assumptions are based precisely on mistaken intuitions like these."

Juries often have trouble understanding what to believe and in the twenty-first century, we need to question whether we should make decisions about people's lives based on beliefs that scientific psychologists recognize as mistaken. Amazingly, people are sent to prison based on identifications and memories that are extremely fallible.

The book's last chapter develops implications for readers to use in everyday life. I wish this chapter had been a little longer. We may know we are missing gorillas, but it's not clear how we can become more aware. We're told that quick decisions make sense in some circumstances (e.g., tasting jam) but not others (e.g., complex decisions where we have data to analyze). Again, I'd like to learn more.

Definitely recommended if you're ready for greater awareness about how minds work.
★ ★ ★ ★ ★
pam brunt
Once, years ago, someone passed out at church. Sure that an elderly person had collapsed, I looked over the pew and "saw" the white-haired head of an old man. I reported this to the paramedics who answered my cell phone call for help. Much to my surprise, the "senior" turned out to be a dark-haired 17-year-old boy. I was the victim of an illusion built not by my eyes, but by what my mind expected to see.

"The Invisible Gorilla" reports on a now-famous experiment (see it for yourself on the web) by cognitive psychologists Christopher Chabris and Daniel Simons that revealing gaping holes in human perception. At first, I thought that the authors would spin out an entire, dull book about the cosmic implications of their one big experiment. But I was pleased to see the book bloom into a really interesting look at the unexpectedly close-in boundaries of human judgment and perception. The authors phrase these limits as "illusions" - such as the Illusion of Attention, the Illusion of Knowledge or the Illusion of Confidence. In case after case, they illustrate how human beings mistake confident people for competent ones. Or mistake sequence for causality. Or assume that memories are unchanging snapshots of reality. By the end of the book, I was beginning to feel that rationality was a veneer as thin on the brain as frost on a windowpane.

The results have wide-ranging implications for human behavior. If car drivers run into motorcycles partly because they don't expect to see them, or if juries give more weight than they ought to confident witnesses, or if a doctor who looks up an answer in a reference book is wrongly judged less than competent, then lives and freedom are at risk. The section on the way our "memories" of traumatic events like 9/11 shift shows that even highly emotional memories change over time. This says much about the problematic way conspiracy theorists rely on memories long after the event.

Unwavering faith in scientific method pervades the book, along with a few swipes at Malcolm Gladwell of "Blink" fame. Chabris and Simons use scientific questioning to examine hot topics like reporting on the war in Iraq, the supposed correlation between vaccines and autism, and the relative value of intuition and deductive reasoning. Their down-to-earth explanation of the different value of correlational studies (and the breathless headlines they spawn) versus randomized tests was worth the price of the book alone. Fun, informative, and the opposite of sensationalistic.
★ ★ ★ ★ ★
artha nugraha jonar
Vivid, persistent memories often lead us to believe that those memories are true. But in "The Invisible Gorilla," authors and psychologists Christopher Chabris and Daniel Simons share the results of their well-known experiment (where people watching a video are asked to count the number of times the basketball is passed and, in the process, miss a woman in a gorilla suit standing in the middle of the frame, beating her chest before running off camera) and other experiments that demonstrate how the more vivid a memory is, the LESS accurate it can sometimes be, and how perception and memory can fool us with false memories and observations.

The authors cite a number of stories about people such as Hillary Clinton misremembering landing in a third-world country under heavy protection because of violence when, in fact, it was a perfectly calm landing with only the usual protection because there was no outbreak of violence. It damaged her reputation during an election year, but as the authors point out, she wasn't lying, just confusing two completely different events. They also share other instances where people have "adopted" the memories of others into their own lives, imagining that they were there and sharing the story with vivid recollection when, in fact, they were never there (in one instance the person related a story about sitting near actor Patrick Stewart in a Maine restaurant in front of the person it actually happened to, not realizing it was the other person's story).

The authors go into further depth discussing a wide variety of observations and memories and how they can misinform us. They point out that what we perceive around us can often be wrong (including examples of continuity errors in films and how they crop up even with the diligence of film people), and how multitasking reduces our observational abilities further, resulting in more errors, etc.

What makes "The Invisible Gorilla" better than the average book about memory and perception are the facts that the authors have done a number of experiments themselves and can cite the results as well as those of their colleagues to support their thesis AND that both authors are terrific writers reducing complex experiments to easily understandable bits of information and then gleaning what it means to us as the average reaaders.

I'd highly recommend "The Invisible Gorilla" and promise that after you read it you'll be less likely to trust your senses and memory, or at the very least (hopefully) be more observant.