Why are elephants grey and wrinkly?
Elephants never forget. That’s what they say. Setting aside questions like “How do they know?”, and the obvious brain comparison (both are grey and wrinkly), are they actually wrong?
Now think about what that means. Can your brain get full? What could it get full of? What, actually, is a memory? What does it look like? Is it a concept or a thing? Phew! And does that mean you won’t forget either?
When they say an elephant never forgets, did you ever think it might, really, be true?
In truth, these questions, and some others that they suggest, are harder than they look.
It’s impossible to understand memory and forgetting independently of each other; they are flip sides of the same coin, both directly related to learning. Is it true, for example, that we (or elephants, for that matter!) never forget anything? That our brains are vast, even infinite, storehouses of information? Think carefully. Have you forgotten things, or just not remembered them? There’s a difference. To put it another way, have you really lost information from your brain, or just can’t find where you put it?
If you take the brain-as-computer idea, does the brain have its own bazillion-gigabyte hard drive, held together by oodles of data cables, all of which, together, resembles warm, congealed porridge? Or are we simply on a limited-storage plan, which means we must delete some of the old to make space for the new? Can we upgrade? And if we must delete, how do we decide what to throw out and what to keep? Welcome to Forgetting 101.
Somewhere to start
Here’s a test. Think back to what you had for your last meal. Now think back to what you had for lunch 15 years ago on this date. Have you forgotten? Or did you never remember it in the first place, in which case you can’t have forgotten? You can see why this is so devilishly tricky. At what point does something even become a memory, so that we can genuinely say we’ve forgotten it? And how do we forget?
Over time, some common ideas have developed about what happens when we forget. These exclude excuses for not having done homework or taken out the trash, and answers to questions like “Mr Grommetwobbler, the members of the court would like to know where you were at precisely 9.47pm on the night in question…”. The ideas go like this.
Decay
Pretty straightforward, this one. Basically, it suggests that time destroys memories; they weaken with age, hence “decay”. Ironically, as an idea, it is itself weakening with age because, as a concept, it has some pretty large cavities. For example, with word lists, people can recall items at a second testing that they failed to recall at the first, exactly the opposite of what decay theory would predict.
Interference
I’m sure you’re familiar with the everyday idea of interference, and it’s the same here: in the process of committing something to memory, something else gets in the way. In the case of forgetting, one of two processes is in play.
Either old information gets in the way of making a new memory, or new information interferes with an old one. Not content with just describing these processes, we have impressive names for them too.
Old interferes with New = Proactive interference (the idea of moving forward)
New interferes with Old = Retroactive interference (the idea of moving backward)
Application: How much more exciting is it to say “Sorry, proactive interference complications” instead of “Sorry, I forgot”?
Retrieval failure
This is one of the most popular views (and not without good reason), and holds that memories have gone in, and been kept in, but our problem is in getting them out. In other, more impressive, words:
- Encoding, or getting it in – Pass
- Storage, keeping it in – Pass
- Retrieval, getting it out – Fail
Tied up in here too is some thinking about the conditions and cues in the learning environment, and whether these are available again in the recall environment.
Application: “Ah boss, all good on that deadline, except for the soft-wired file retrieval system brown out. Should be back up in a couple of hours. Keep you posted yeah?”.
Cue overload
The more things we collect under a single retrieval cue, the more likely we are to forget a particular one (or more) of the items linked to that cue. If, for example, you learned a 10-word list containing four furniture words, and the retrieval cue was “furniture”, you’d have a harder job recalling all four than if there had been only one furniture word for that cue.
Application: “Boss, hey, cue overload outage in the consolidation phase. Might take some time.” See below.
Consolidation
When we’re talking about memory, consolidation means stabilizing the memory. Most probably, every time we use a memory, our brain re-stores it from scratch, rebuilding it as it were, and so opening the door to interruption, interference or variation with each new version – which may also explain why memories change over time. And if a memory isn’t consolidated well in the first place, we are more likely to lose it.
Application: “Yeah boss, I hear you, would’ve got to that but for the unstable consolidation subroutine”.
Repression and suppression
A couple of variations here, but all arriving at the same spot. Repression, drawn from the work of the famous Sigmund Freud (rhymes with void, not food), means pushing an unpleasant memory out of awareness; its more deliberate cousin, suppression, ranges from gentle distraction to active, forceful avoidance. This view is less popular than it once was.
What none of these theories of forgetting really cover though, is why we forget. We usually consider it a curse rather than a blessing, but forgetting also has its uses.
So here’s the take home bit
It seems that the brain likes to worry about the usefulness of memories rather than their accuracy because, for the most part, close enough is good enough. But to keep up to date, it seems the brain also likes to make space for itself. To make room for new memories, it clears out old ones it can do without. After all, how much of what you currently remember is actually going to be useful one day? Let’s put a real-life spin on this.
How many of us never delete old emails, thinking they’ll be useful, or that we need to keep them for some other reason, when we really only deal with the most recent ones anyway? It’s always the last 50 or 100 that matter. Once they get beyond that, their use has come and gone and we could probably delete them. If the inbox had a storage limit, as some do, we’d be forced to. Something similar might be going on inside your brain – not because of storage limits, but for efficiency.
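The inbox analogy can be turned into a toy sketch. Below is a minimal, hypothetical Python model (the `BoundedInbox` class and its capacity are illustrative inventions, not anything from memory research) of a capacity-limited store that evicts the oldest, least-recently-used item when something new arrives – a loose stand-in for “delete the old to make space for the new”, not a model of actual brain storage.

```python
from collections import OrderedDict

class BoundedInbox:
    """A toy, capacity-limited store: when full, the oldest
    (least recently used) item is evicted to make room.
    Illustrative analogy only, not a model of the brain."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._items = OrderedDict()

    def add(self, key, value):
        # Storing a new item may force out the oldest one.
        if key in self._items:
            self._items.move_to_end(key)
        self._items[key] = value
        if len(self._items) > self.capacity:
            self._items.popitem(last=False)  # evict the oldest

    def recall(self, key):
        # Using an item refreshes it, so it survives longer --
        # loosely like a re-used memory being re-strengthened.
        if key not in self._items:
            return None  # "forgotten"
        self._items.move_to_end(key)
        return self._items[key]

inbox = BoundedInbox(capacity=3)
for i in range(5):
    inbox.add(f"mail{i}", f"message {i}")

print(inbox.recall("mail0"))  # None: evicted to make space
print(inbox.recall("mail4"))  # "message 4": still among the last few
```

Note that `recall` moves an item to the back of the eviction queue, so frequently used items stick around while untouched ones drift toward deletion – the same “recent 50 or 100” effect as the inbox.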
While making space is useful, your brain also likes efficiency: retaining every last shred of information takes energy and mental resources to maintain. In brain-resource terms, this is expensive.
The trick for you is to solidly encode and consolidate the memories you want, and leave the rest. We’ll spend more time on memory soon.
Impressive words to drop into the morning coffee chat
Decay, Proactive interference, Retroactive interference, Encoding, Storage, Retrieval, Consolidation, Repression, Suppression.
What have you noticed?
Want more like this? Subscribe for FREE to get Bite sized brains in your inbox!