The Science of Learning: Secrets of the Brain

As the most complex structure known to us, the brain has long been a prized object of study, and it's only relatively recently that we've developed the technology to examine it in real depth. In the process, we've gathered some key insights for successful learning.

If we are to learn how to learn, we must give primacy to brain research, specifically research into how the brain's comprehension and memory mechanisms respond to outside influences. Empirical findings from psychology and cognitive science continue to point to the most profitable methods for learning, largely independent of individual characteristics or propensities. While people may gravitate toward particular learning styles, all of us can benefit from specific approaches and techniques that best aid information transfer and retrieval. Whether you're trying to pass a difficult course this semester, sharpen your memory and recall, or learn how to compete effectively at a new sport, the mechanics of the brain reveal some “inside” information about best practices for achieving success.

We will be drawing from the work of Robert Bjork, director of the UCLA Learning and Forgetting Lab and one of the most influential researchers in cognition-driven learning. Successful Remembering and Successful Forgetting, a volume published in his honor, is encyclopedic in scope and a popular textbook choice for psychology programs around the country. This and other notable research that marries learning with the underlying science is interspersed throughout what follows.

Integration

A common learning tactic is to try to master a single concept before moving on to another. It turns out this is not very effective: psychology tells us an integrated approach to learning is much more productive than the rigidly linear one we tend to take.

Moving on to a new concept can reinforce our understanding of the ones before it, allowing us to reach our goals sooner, whether that is a deeper understanding of calculus or higher proficiency in tennis. For example, rather than sinking all of your time into practicing your long-distance drive in golf, you might mix in some putts, chips, pitches and bunker shots. The idea is not to treat any one concept as if it were independent; our time is used more efficiently when we work on several interrelated concepts together. This deepens our understanding of the topic and eases the absorption of higher-level material we have not yet encountered.
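
One way to picture the difference between blocked and interleaved practice is to lay the same practice items out in two orders, as in the minimal Python sketch below. The golf skills and the round-robin mixing scheme are purely illustrative assumptions, not a prescription from the research discussed here.

```python
from itertools import chain, zip_longest

# Hypothetical practice items for three related golf skills.
drives = ["drive 1", "drive 2", "drive 3"]
putts  = ["putt 1", "putt 2", "putt 3"]
chips  = ["chip 1", "chip 2", "chip 3"]

# Blocked practice: exhaust one skill before touching the next.
blocked = drives + putts + chips

# Interleaved practice: rotate through the skills round-robin,
# so each repetition of a skill is separated by work on the others.
interleaved = [
    item
    for item in chain.from_iterable(zip_longest(drives, putts, chips))
    if item is not None
]

print(blocked)      # drives, then putts, then chips
print(interleaved)  # drive, putt, chip, drive, putt, chip, ...
```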

Note that the order in which we learn certain concepts still matters, particularly for language acquisition, science and mathematics. This is why great care is taken in academia to structure curricula logically. Just be careful not to spend an inordinate amount of time on one concept before transitioning to the next.

Spacing Effect

Bjork also makes a clear distinction between memory retention and memory retrieval. In his view, we never actually forget the things we learn; once consolidated, a process in which the hippocampus plays a central role, they remain stored in the brain's long-term memory networks. The neuronal connections that support them simply become increasingly difficult to reactivate over time.

Though you might not be able to recall the precise address where you lived as a child, for example, it's still tucked away in the stygian recesses of your brain. Research has shown that once you are reminded of the address, you will be able to recall it far more easily and quickly than an entirely unfamiliar street address. In this way, retaining information is more of a default process, while retrieval is something we must work at to improve.

To do this, we should apply what psychologists call the “spacing effect” to our study routines. If we learn new material and review it too soon afterward, we limit our potential for long-term memory retrieval. Conversely, if we wait too long before revisiting it, we may be unable to retrieve the information unassisted, even though it still resides within our neural networks.

It seems there is a critical interval between initial exposure to study material and its subsequent revisitation that optimizes information recall. According to Frank Dempster, a research psychologist at the University of Nevada, this interval should grow progressively longer after each retrieval to maximize recall power.
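
As a rough illustration of this expanding-interval idea, the Python sketch below generates review dates whose gaps grow after each retrieval. The one-day starting gap and the doubling factor are arbitrary assumptions chosen for clarity; they are not values prescribed by Dempster or Bjork.

```python
from datetime import date, timedelta

def expanding_review_schedule(start, reviews=5, first_gap_days=1, growth=2.0):
    """Return review dates whose gaps grow after each retrieval.

    The initial gap (1 day) and growth factor (2x) are illustrative
    assumptions, not values taken from the research cited above.
    """
    schedule = []
    gap = first_gap_days
    current = start
    for _ in range(reviews):
        current = current + timedelta(days=round(gap))
        schedule.append(current)
        gap *= growth  # each successful retrieval earns a longer wait
    return schedule

# Example: study today, then review at gaps of 1, 2, 4, 8 and 16 days.
for review_date in expanding_review_schedule(date.today()):
    print(review_date.isoformat())
```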

One theme that runs through many of these studies is that the harder we have to work to pull information out of our memory banks, the easier that information will be to recall in the future. This is why reviewing material too soon after it is first presented is ineffective. Much as asking a cashier to swipe a declined card a second and third time rarely changes the outcome, gap-free study sessions accomplish little. Our routines should build in deliberate gaps.

Continuous studying in the run-up to an exam is thus cognitively disadvantageous. While we often think of back-to-back repetition as building immunity to forgetting, massed practice actually works against us in the long run.

Delayed Note-Taking

Most of us probably think that note-taking is essential to getting the most out of a class or lecture. The practice of vigorously jotting down every important-sounding tidbit we hear presents a two-fold problem, however. First, part of our cognitive capacity is spent putting our thoughts to paper when it could otherwise be devoted to concentrating more closely on the lecture; as any neuroscientist will tell you, our brains are not designed for competent multitasking. Second, merely regurgitating a lecture's exact words immediately after hearing them does not constitute learning. Both handicap our ability to absorb information and get the most out of a presentation.

A better practice is to defer your note-taking until after the class, ideally immediately after. Having absorbed the essence of the lecture, you should try to restate the key ideas in your own words. Your notes may well read like a stream of consciousness as you hurry to write everything down before you forget it, but this process establishes more potent neuronal connections, easing recall later on.

This method not only forces you to pay closer attention in class, it also lets you absorb more information than you would otherwise, both of which are easier when you are not writing while listening. It's simply too easy to transcribe the professor word for word or copy the PowerPoint slides. Again, more work now means better memory performance when you most need it.

Recording notes in this way also requires you to call upon knowledge and experience you already have, since you are not simply copying the lecturer's slides as you would in class. You're applying new information to prior knowledge (which, tangentially, is why analogies are so helpful). The reason is straightforward enough: restating and summarizing information in your own words forms new neural connections, integrating the newly acquired information into existing networks. If we think of something first, or form the idea ourselves, we are more likely to remember it. Any time you can place newly learned information appropriately within your existing cerebral storehouse, you solidify your understanding of it and thereby improve memory retrieval.

Be aware that this does not mean you should take no notes at all while information is being presented. Hard facts, such as statistics and other easy-to-forget numbers, are perfectly acceptable to write down; they may in fact grant you a more comprehensive, big-picture view of the topic down the road.

Re-Representation

This technique is closely linked to the one above. Research has long shown that taking information in one format and re-representing it in an alternative format speeds up the conversion of information into knowledge. This can mean reinterpreting text-heavy information as visual information or vice versa. For example, reconstituting something you just read in the form of a graph, chart or other diagram will help crystallize the concept in your mind. Likewise, taking the time to deconstruct statistical charts or graphs and interpret them in your own words will pay dividends during final exams. You will then have two memory retrieval cues to draw on: the text and the visuospatial representation of the text.

As part of their work in learning science, Diane Halpern and Milton Hakel discuss concept maps, a way of breaking down complex concepts into hierarchical illustrations. Consider a complex field of study like Euclidean geometry or developmental psychology. Every so often, it is helpful to concept map everything you've learned up to that point, taxonomizing and organizing what you've covered and how it all fits together. Over time, this equips you with a broader, big-picture framework for your area of study.
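
As a loose analogy, a concept map can be thought of as a small tree: each node is a concept, and its children are the sub-concepts it breaks down into. The geometry topics in the sketch below are illustrative placeholders rather than an example taken from Halpern and Hakel.

```python
# A concept map modelled as a nested dictionary (illustrative topics only).
concept_map = {
    "Euclidean geometry": {
        "Points, lines and planes": {},
        "Triangles": {
            "Congruence": {},
            "Pythagorean theorem": {},
        },
        "Circles": {
            "Arcs and chords": {},
        },
    }
}

def print_map(node, depth=0):
    """Print the hierarchy as an indented outline."""
    for concept, children in node.items():
        print("  " * depth + "- " + concept)
        print_map(children, depth + 1)

print_map(concept_map)
```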

Location and Activity

Thus far we’ve discussed the recommended hows and whens of learning, but the where is also important. Numerous studies have shown that the location in which you learn and the activity you take part in while learning also matter for a robust memory. If you change your study environment often, you are more likely to associate each environment with what you learned there, increasing the chance of recall. Spending all of your study time in the university library does not provide much environmental variability to attach information to.

In the same way, taking part in an activity promotes learning because it adds yet another retrieval cue to your memory register. This is why educators and lecturers sometimes turn the learning process into a game. Playing a political science version of Jeopardy helps cement the experience in students' minds, securing it a prominent place in the architecture of the brain. Similarly, kinesthetic learners often go on walks while reading a book because they prefer to stay active, and varying the location of those walks can compound the effect.

Visuals

The presence of visuals also aids memory retrieval, regardless of whether you self-identify as a visual learner. Whenever we attach imagery to information, we recruit the occipital lobe in the rear of our brains. Incorporating images such as graphs, flowcharts and other illustrations into our study or teaching materials adds one more retrieval cue for eliciting the desired synaptic response. The aforementioned concept maps and even the simple practice of sketching on paper what you’re learning will be useful in mastering more abstruse topics.


Wrap-Up

Given these insights, you may find it worthwhile to reengineer the way you learn, prepare for exams or approach practicing a new sport or skill. Because the applications are so broad, a better understanding of how the brain operates can lead to tangible improvements in many areas of our lives.

Key ideas:

  • Interleaving different but related concepts into our study sessions, as opposed to focusing all our efforts on one, can dramatically reduce total study time.
  • Similarly, practicing several tennis, golf or other sports techniques together can result in quicker improvements.
  • Being cognizant of the optimal period of time before revisiting studied material yields better learning by strengthening neuronal connections in our brain.
  • Assimilating fresh information into our existing knowledge networks by summarizing lectures and seminars ourselves enables a deeper understanding and furthers recall.
  • Distilling complex concepts into more accessible terms or illustrations builds comprehensive knowledge over time.
  • Lastly, remember that anything we can do to expand the number of retrieval cues attached to a piece of information promotes prompt and thorough recall. Visual imagery, varying the study location and inserting an activity into the standard routine all help ensure information is readily available when we need it.

Use these learning-based strategies to expand your cognitive toolkit and make more efficient use of your time.


Sources and further reading:

Everything You Thought You Knew About Learning Is Wrong
Applying the Science of Learning to the University and Beyond, Diane F. Halpern and Milton D. Hakel
The Spacing Effect: A Case Study in the Failure to Apply the Results of Psychological Research, Frank N. Dempster
10 facts about learning that are scientifically proven and interesting for teachers
Experiential Learning Theory: Previous Research and New Directions, David A. Kolb
Experiential learning: Experience as the Source of Learning and Development, David A. Kolb
Engines For Education, Roger C. Schank
Virtual Learning: A Revolutionary Approach to Building a Highly Skilled Workforce, Roger C. Schank


Feature image by CalicoStonewolf at DeviantArt