Can Forgetting Help You Remember?

Four times a year, I attend the Yizkor service at synagogue. Yizkor in Hebrew denotes “remembrance,” and the official name of the service, Hazkarat Neshamot, means a “remembering of souls.” During the service, I call to mind loved ones who have died—parents, grandparents, uncles, aunts, close friends—reliving shared times that were cherished, and some that were fraught. I think about what I learned from these people, several of whom were in my life from my first moments of awareness. I recall being taught to swim by my father, hearing my pious Russian grandmother’s tearful account of the Kishinev pogrom, standing by my father’s bedside as a medical student in an underequipped community hospital as he suffered a fatal heart attack. The Yizkor service at my synagogue ends with the Kaddish, the mourner’s prayer, and with a call to perform deeds of loving-kindness in memory of the departed.

Many religions and cultures have rituals structured around remembrance, a fact that suggests how central the ability to remember is to our sense of self, both as individuals and as communities. But how accurate are our memories, and in what ways do they truly shape us? And why does some of what we remember come to us easily, even unbidden, while other things remain maddeningly just out of reach, seeming to slip even further away the more we struggle to summon them?

In “Why We Remember” (Doubleday), Charan Ranganath, a neuroscientist at U.C. Davis, writes that the question he always gets when he mentions that he studies memory is “Why am I so forgetful?” The title of his book is a riposte to this, a suggestion that it’s the wrong question to be asking. “The problem isn’t your memory, it’s that we have the wrong expectations for what memory is for in the first place,” he writes. “The mechanisms of memory were not cobbled together to help us remember the name of that guy we met at that thing.”

It has never been easier to fact-check our memories against an external record and find ourselves lacking, but Ranganath is intent on giving us a new way of understanding memory. He tracks how ideas about the phenomenon have developed in the course of more than a century of scientific inquiry, and lays out the state of current research. In common with many researchers studying cognition and behavior, he takes a broadly evolutionary view: “The various mechanisms that contribute to memory have evolved to meet the challenges of survival.” It’s easy enough to imagine how being able to retain knowledge about food sources or particular dangers could be lifesaving for our ancestors—“which berries were poisonous, which people were most likely to help or betray them,” as Ranganath puts it. But thinking of memory as an adaptive trait has a less obvious and perhaps more interesting corollary: “Viewed through this lens, it is apparent that what we often see as the flaws of memory are also its features.” In the right circumstances, apparently, forgetting has been useful, too.

The earliest scientist in Ranganath’s account is the German psychologist Hermann Ebbinghaus, who, in the late nineteenth century, attempted to put the study of memory on an objective footing by quantifying its effects. Acting as his own experimental subject, he set about seeing how much material he could memorize with a given amount of study. His test material, chosen for its lack of prior associations, was a welter of meaningless three-letter syllables. Ebbinghaus found that he could memorize sixty-four of these pseudo-words in forty-five minutes before becoming exhausted. However, when he measured his retention, he observed that he had forgotten nearly half the words after twenty minutes. Graphing the rate at which information was lost, he came up with the so-called forgetting curve, a concept that is still influential—for instance, in the design of spaced-repetition learning tools. The forgetting curve starts out steep—a huge amount of information vanishes within sixty minutes—and levels off over several days. As Ranganath notes, “Much of what you are experiencing right now will be lost in less than a day.”
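The forgetting curve lends itself to a quick numerical sketch. The snippet below assumes a toy exponential model of retention and an arbitrary per-review boost to memory “stability”; the functional form, the constants, and the idea that each review multiplies the stability are illustrative assumptions meant to show why spaced-repetition tools lean on the curve, not figures from Ebbinghaus or from Ranganath’s book.

```python
import math

def retention(hours_elapsed: float, stability_hours: float) -> float:
    """Fraction of studied material retained after a delay, in a toy
    exponential model of the forgetting curve: R = exp(-t / S)."""
    return math.exp(-hours_elapsed / stability_hours)

# With no review, retention drops steeply within the first hour or two.
print(f"20 minutes after a single study session: {retention(1 / 3, 1.0):.0%}")
print(f"one hour after a single study session:   {retention(1.0, 1.0):.0%}")

# Spaced-repetition tools schedule reviews so that each successful recall
# raises the stability S (the doubling factor here is an assumption),
# which flattens the curve over days rather than minutes.
stability = 1.0
for _ in range(6):
    stability *= 2.0
print(f"one day out, after six well-timed reviews: {retention(24.0, stability):.0%}")
```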

Ebbinghaus’s experiment drew a sharp line between remembering and forgetting, but, a generation later, Frederic Bartlett, a psychologist at the University of Cambridge, showed that the situation is more complicated. Not only do we fail to remember much of what happens to us; even things we remember are often wrong. In a famous experiment, volunteers were told a Native American folk tale called “The War of the Ghosts,” selected because it contained cultural details that would be alien to British students. Later, the students recalled the core of the tale but replaced some details with more culturally familiar ones. Instead of words such as “canoe” and “paddle,” they recalled “boat” and “oar”; they replaced “seal hunting” with “fishing.”

From these results, Bartlett concluded that memories are not a simple record of the past but, rather, an “imaginative reconstruction,” in which retrieved information is fleshed out with preëxisting knowledge to compose a story that feels coherent to us. With repeated acts of recall, Ranganath later writes, further alterations creep in, making the memory “like a copy of a copy of a copy, increasingly blurry and susceptible to distortions.” Subsequent research has borne out Bartlett’s insight about the imaginative nature of memory, showing that the neural circuits associated with imagination are active during the act of remembering. Ranganath guides us through the roles of various brain regions, particularly the hippocampus and parts of the prefrontal cortex, and describes some research of his own, which has helped demonstrate the role of the perirhinal cortex in imparting a sense of familiarity. (Notably, this sense can sometimes be triggered even when we are presented with something unfamiliar, leading us to experience déjà vu.)

The picture that emerges is one in which what we call “memory” is less a single thing than a web of interrelated functions. Emotion plays a significant role, particularly in the retrieval of “episodic memories.” The term was introduced in 1972 by an Estonian-born Canadian psychologist named Endel Tulving, who drew a distinction between two kinds of memory, episodic and semantic. Episodic memory happens when we recall experiences. Semantic memory is the retrieval of discrete facts or knowledge that isn’t reliant on summoning the experiential context in which the information was learned. Tulving wrote that episodic memory amounted to a form of “mental time travel,” as we enter a state of consciousness similar to the one we were in when the memories were stored.

Marcel Proust’s narrator, famously, had his episodic memory triggered by the taste of a madeleine dipped in tea. Smell can function in a similar way and, as Ranganath writes, so can music. He also speculates that nostalgia has its roots in episodic memory. According to him, research shows that, on average, people find it easier to recollect positive rather than negative memories, and this bias increases as we age. He even thinks that this “might explain older adults’ penchant for nostalgia.” But I wonder, too, whether nostalgia might have to do with the vicissitudes of the aging process, which may prompt us to recall wistfully the vitality of youth rather than the onset of arthritis in our hips or the formation of cataracts.

How can such an apparently haphazard system confer an advantage on us as a species? The answer starts to come into focus when Ranganath writes about attempts to make certain machine-learning models simulate the way that human brains learn. As information is fed in, the model gradually builds up a body of knowledge about a given area. Ranganath provides a hypothetical example:

“An eagle is a bird. It has feathers, wings, and a beak, and it flies.”

“A crow is a bird. It has feathers, wings, and a beak, and it flies.”

“A hawk is a bird. It has feathers, wings, and a beak, and it flies.”

Soon, he explains, the system will be able to use the examples it has been taught to deduce that a seagull, say, can fly. But it has problems making sense of information that doesn’t fit the pattern, such as “A penguin is a bird. It has feathers, wings, and a beak, and it swims.” Exceptions to the rule can cause what is known as “catastrophic interference,” in which learning the new piece of information causes the model to forget what it had previously learned. Overcoming this weakness requires training the computer on colossal amounts of data.
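The failure mode is easy to reproduce in miniature. The sketch below trains a tiny logistic-regression “does it fly?” model on the three flying birds and then on the penguin alone; the feature encoding, learning rate, and training schedule are my own illustrative assumptions, not anything from the book, but the collapse of the earlier knowledge is the catastrophic interference Ranganath describes.

```python
import numpy as np

# Features: [feathers, wings, beak, is_penguin]; target: 1 = flies, 0 = swims.
# (A hypothetical toy encoding of the book's example.)
eagle = crow = hawk = np.array([1.0, 1.0, 1.0, 0.0])
penguin = np.array([1.0, 1.0, 1.0, 1.0])

w = np.zeros(4)
b = 0.0

def predict(x):
    """Logistic estimate of the probability that a bird with features x flies."""
    return 1.0 / (1.0 + np.exp(-(w @ x + b)))

def train(examples, epochs=200, lr=0.5):
    """Plain stochastic gradient descent on the logistic loss."""
    global w, b
    for _ in range(epochs):
        for x, y in examples:
            error = y - predict(x)
            w += lr * error * x
            b += lr * error

# Phase 1: learn the general rule from the flying birds.
train([(eagle, 1), (crow, 1), (hawk, 1)])
print("eagle flies?", round(float(predict(eagle)), 2))   # close to 1.0

# Phase 2: train only on the exception, with no replay of the old examples.
train([(penguin, 0)])
print("after penguin-only training, eagle flies?",
      round(float(predict(eagle)), 2))                   # collapses toward 0

# Interleaving the old examples with the new one (replay) avoids the collapse,
# which is one reason such models must be trained on huge, mixed datasets.
```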

People, by contrast, take such contradictions in stride, something that Ranganath attributes to our ability to toggle between semantic and episodic memory. The general rule is stored in semantic memory, whereas episodic memory, not being designed to draw universals from across our experience, organizes events in a more idiosyncratic manner. The result is that our brains are much quicker to adjust to the real world. “They are wonderfully adapted to make use of the past, given the dynamic and unpredictable world in which we have evolved,” Ranganath writes. “The world around us is constantly changing, and it’s critical to update our memories to reflect these changes.”

Once we see memory as a dynamic phenomenon, rather than as a passive record, it becomes possible to understand how forgetting can also serve a purpose. “Forgetting isn’t a failure of memory; it’s a consequence of processes that allow our brains to prioritize information that helps us navigate and make sense of the world,” Ranganath writes. (It’s when we forget the wrong things, of course, that we get frustrated.) In certain circumstances, forgetting can even be part of the memorization process, and Ranganath spends a good deal of time on the power of “error-driven learning.” It seems that pushing our memory to failure can produce exactly the sort of salient experience that will then fix a piece of information in our mind.

Ranganath quotes Bartlett to the effect that “literal recall is extraordinarily unimportant” and makes clear that his book is “not a book about ‘how to remember everything.’ ” Nonetheless, an account of how memory works can hardly avoid giving a few tips. He advises us to think of our memories as “like a desk cluttered with crumpled-up scraps of paper. If you’d scribbled your online banking password on one of those scraps of paper, it will take a good deal of effort and luck to find it.” The key is to attach important memories to something distinctive, the equivalent of a “hot-pink Post-It note.” A related strategy is the memory-palace technique, in which one visualizes units of information as being arranged in a space that is already familiar, such as one’s childhood bedroom.

Perhaps the most useful tactic in memorization is “chunking,” a phenomenon identified by the pioneering mid-century cognitive psychologist George A. Miller. Miller noted how hard it was for us to hold more than a few pieces of information in our head simultaneously; he put the limit at about seven items at once, but subsequent research suggests that the true maximum is probably even lower. Fortunately, there’s what Ranganath calls a “huge loophole”: our brains are very flexible about what constitutes a single piece of information. A simple example is the way we remember telephone numbers. Breaking a ten-digit U.S. phone number into two groups of three plus a group of four reduces the number of “items” to be remembered from ten to three. At a larger scale, the most talented soccer or basketball players are able to “read” complicated arrangements of other players as single pieces of information. Likewise, many chess masters can take in the positions of the pieces on a board at a glance, because they are remembering not individual pieces on individual squares but larger patterns, based on their accumulated knowledge of the game. Tellingly, if the pieces are arranged randomly rather than having arisen out of actual gameplay, a chess master’s advantage in memorizing the position is dramatically reduced.
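As a back-of-the-envelope illustration of that arithmetic, here is a trivial sketch of the 3-3-4 grouping; the function name and the sample number are hypothetical, chosen only for demonstration.

```python
def chunk_us_phone_number(digits: str) -> list[str]:
    """Group a ten-digit U.S. number into the familiar 3-3-4 pattern,
    reducing ten separate items to hold in mind to three chunks."""
    if len(digits) != 10 or not digits.isdigit():
        raise ValueError("expected exactly ten digits")
    return [digits[:3], digits[3:6], digits[6:]]

print(chunk_us_phone_number("8005551212"))  # ['800', '555', '1212']
```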

Toward the end of his book, Ranganath expands his focus from the individual to examine the social aspect of memory. He cites a startling analysis of casual conversation which found that forty per cent of the time we spend talking to one another is taken up with storytelling of some kind. Whether spilling our entire past or just quickly catching up, we are essentially engaged in exchanging memories. It should come as no surprise that communication renders our memories even more malleable. “The very act of sharing our past experiences can significantly change what we remember and the meaning we derive from it,” Ranganath writes, and distortions multiply with each telling.

Another pioneering experiment by Frederic Bartlett examined the distortions that occur in “serial reproduction”—or what we would call a game of telephone. Bartlett showed student volunteers a drawing of an African shield and then had them redraw it from memory. He then gave those redrawings to a second group of volunteers, who reproduced them from memory in turn. As he repeated the process with group after group, he found not only that the results looked less and less like an African shield but also that they started to resemble a man’s face. Collectively, the volunteers were changing something unfamiliar into something familiar. More recent work on such serial distortions has shown that, over several iterations, elements of a story that fit common stereotypes get reinforced, while those that don’t fit fall away.

The psychologist Henry L. Roediger III has adopted the term “social contagion” to describe such memory distortions. He conducted an experiment in which pairs of people were shown a set of photos and then asked to recall what the pictures contained. However, only one individual in each pair was a true volunteer; the other was a plant, instructed to deliberately “recall” things that were not in the photos. The actual volunteers became “infected” by the misinformation, often themselves remembering items that hadn’t been in the pictures at all. Furthermore, the effect persisted even when they were warned that their partner’s recollections might be mistaken.

Our openness to influence and the tendency of serial reproduction to magnify social biases have dispiriting political implications. “Once distortions creep into our shared narratives, they can be incredibly difficult to root out,” Ranganath writes. It’s no wonder that conspiracy theories—about the 2020 election being stolen, about Barack Obama being born in Kenya—prove so resistant to repeated debunking. It also turns out that groups are disproportionately swayed by dominant members who speak confidently. Ranganath offers a crumb of comfort. Research shows that diverse groups remember more accurately than homogeneous ones do, and that groups also remember more fully if a wide range of group members contribute to discussion and if contributions from less powerful members are actively encouraged.
