The Science Behind the Reality-Bending Mandela Effect

If you’ve ever questioned what’s real and what’s not, you’re not alone. 

If you remember when Monopoly’s Rich Uncle Pennybags wore a monocle, or when Curious George had a tail, or when Looney Toons graced your childhood TV set, you’re wrong. 

Pennybags has perfect vision, Curious George is as tail-less as a guinea pig, and the show is correctly spelled Looney Tunes. But before you question your reality, know that you’re not alone. People all over the world share these same mistaken recollections. It’s called the Mandela Effect, and the term can be traced back to one person’s inaccurate memory.

A self-described paranormal researcher named Fiona Broome claimed that she remembered hearing about the death of Nelson Mandela, the prominent South African anti-apartheid activist, while he was in prison in the 1980s. However, Mandela emerged from prison in 1990 and went on to become president of South Africa in 1994.

Broome wasn’t the only person to “remember” Mandela’s death. After hearing from other people who had similar recollections of Mandela’s passing, Broome created a website to recount her false memory and dubbed the occurrence the Mandela Effect.

Memories are unreliable for many reasons, but a contributing factor might be the complex arrangement of memory storage in our brains. We don’t have a central memory storage unit.

“Different aspects of experiences are stored in different parts of the brain, and they are linked together by a brain structure known as the hippocampus,” says Daniel Schacter, Ph.D., a Harvard professor, psychologist, and author of The Seven Sins of Memory. The hippocampus resides in the temporal lobe. It’s the metaphorical intersection of our brains’ complex memory highway, and it stores our long-term memories.

To retrieve a memory, we have to use different parts of our brains and “different elements of an experience,” says Schacter. Memories are psychological combinations of visual perceptions, auditory perceptions, and emotional responses. They’re not like photos, Schacter says. “They reflect our interpretations of our experiences, and are not literal recordings of what happened.” 

Memories are complicated even more by the influence of stored knowledge and past associations. When our brains don’t have all the information they need to relay a full memory, they fill the gaps with educated guesses based on what they already think is true. That’s when things get confused in our memory-retrieval process.

Our brains prefer to fill information gaps with inferences or assumptions rather than leave them vacant, says Christopher Dwyer, Ph.D., assistant lecturer in applied psychology at the Technological University of the Shannon: Midlands Midwest in Ireland. This process is known as confabulation. It’s rooted in ancient survival instincts that encourage your mind to play it safe at even the smallest possibility of danger. “Humans generally don’t like uncertainty or confusion because it implies an ‘unknown’—and people fear the unknown,” Dwyer says.

This bias toward a complete picture of the past applies to the present, too. New information passes through these same cognitive filters as we internalize it, for many of the same survival-motivated reasons. Dwyer says: “There’s no guarantee that [new] information will be processed in a way that’s complete or accurate.” If new information contradicts something we already believe, we might twist it to fit the pattern we’re used to seeing. We could receive factual information “but choose not to accept it,” Dwyer says.

Instead, we might dismiss or mold the incoming information to be “good enough” to fit what we already believe to be true, even if, in Dwyer’s words, that information is “not entirely accurate and sometimes, just plain wrong.” 

Dwyer says this warped way of thinking can also confirm false information, particularly if the falsehood coincides with a perspective or attitude we already hold.

We’re unreliable on our own, but others might be even worse: Our memories are vulnerable to external suggestibility as well, says Elizabeth Loftus, Ph.D., distinguished professor of psychological science at the University of California, Irvine, who specializes in cognitive psychology, human memory, and psychology and law. Being inclined to believe or act on the ideas of others can be tied to many things: heightened emotions, low self-esteem, personal assertiveness, even age. But it can also depend on how much we trust the source of the idea (see sidebar). When someone we trust—a family member, politician, or social-media influencer—spreads misinformation, it can lead to another kind of pseudo-memory, false not because our memories are inaccurate, but rather because the base information was never true.

Misinformation creates a form of collective false memories. Today, a mere 26 percent of Americans are “very confident” they can differentiate between fake news and reputable news, per the statistics and data company Statista. Overall media trust is poor in the United States. Among 92,000 people surveyed in 46 different countries for a 2021 Reuters Institute report, Americans ranked last in media trust: Just 29 percent of surveyed Americans said they trusted the news overall, while only 44 percent said they trusted the news they use. As mistrust and misinformation spread, the alternate realities of the Mandela Effect seem more and more real.

As for the original Mandela Effect, the most likely explanation for Fiona Broome’s (and others’) mistake is that she confused Mandela with Steve Biko, a different anti-apartheid activist imprisoned at the same time as Mandela. Biko had actually died in prison, in 1977. 

Each instance of the Mandela Effect, from Curious George’s tail to Uncle Pennybags’s monocle, might have an objective truth, but the psychological origins of each mix-up are specific to each individual. Do you remember the monocle because you assume older people have poor eyesight? Or do you conflate top hats with monocles, perhaps? If the Mandela Effect created a glitch in your personal matrix, the takeaway might not be to question your reality. It might be to question your assumptions. 

Planting False Memories on Purpose

Let’s just say Inception isn’t totally far-fetched. It’s possible for someone to plant false memories or misinformation in your mind on purpose—with you utterly convinced that information is the truth. 

Consider the case of a Wisconsin woman named Nadean Cool. In 1986, Cool sought trauma therapy from Kenneth Olson, a psychiatrist. Olson used suggestive techniques, including hypnosis, to convince Cool she had repressed memories of being in a satanic cult, cannibalizing infants, and witnessing the murder of a child. Olson alleged Satan had possessed Cool, so he performed exorcisms on her. She came to believe she had as many as 120 personalities, including various angels as well as the devil. Years later, after Cool’s family helped her realize Olson’s malice, Cool sued her former doctor. The case settled out of court in 1997, and she won $2.4 million.

Cool later admitted to questioning some of Olson’s diagnoses during his treatment of her, but went along with his treatment suggestions because she was reliant on his care. Cool also claimed the medications Olson prescribed caused her to hallucinate, which may have made her more susceptible to his suggestions. Ultimately, Cool had fallen victim to Olson abusing his power as an authority figure.

From Popular Mechanics