Make It Stick: The Science of Successful Learning / Chapter 3
Jumpers are staggered, but in the four turbulent seconds until your chute opens you have neither awareness nor control over your proximity to other jumpers. The incident, which amounted to nothing, thanks to her training, is telling nonetheless. Had it frightened her? Not at all, she said. Mia was prepared to handle it, and her confidence gave her the cool to “just sort of swim out.”
It’s one thing to feel confident of your knowledge; it’s something else to demonstrate mastery. Testing is not only a powerful learning strategy, it is a potent reality check on the accuracy of your own judgment of what you know how to do. When confidence is based on repeated performance, demonstrated through testing that simulates real-world conditions, you can lean into it. Facing the jump door may always reawaken feelings of terror, but the moment she’s out, Mia says, the fear evaporates.
How Learning Occurs
To help you understand how difficulty can be desirable, we’ll briefly describe here how learning occurs.
Let’s imagine you’re Mia, standing in a gravel pit watching a jump instructor explain and demonstrate the parachute landing fall. The brain converts your perceptions into chemical and electrical changes that form a mental representation of the patterns you’ve observed. This process of converting sensory perceptions into meaningful representations in the brain is still not perfectly understood. We call the process encoding, and we call the new representations within the brain memory traces. Think of notes jotted or sketched on a scratchpad, our short-term memory.
Much of how we run our day-to-day lives is guided by the ephemera that clutter our short-term memory and are, fortunately, soon forgotten—how to jigger the broken latch on the locker you used when you suited up at the gym today; remembering to stop for an oil change after your workout. But the experiences and learning that we want to salt away for the future must be made stronger and more durable—in Mia’s case, the distinctive moves that will enable her to hit the ground without breaking an ankle, or worse.3
Consolidation
The process of strengthening these mental representations for long-term memory is called consolidation. New learning is labile: its meaning is not fully formed and therefore is easily altered. In consolidation, the brain reorganizes and stabilizes the memory traces. This may occur over several hours or longer and involves deep processing of the new material, during which scientists believe that the brain replays or rehearses the learning, giving it meaning, filling in blank spots, and making connections to past experiences and to other knowledge already stored in long-term memory. Prior knowledge is a prerequisite for making sense of new learning, and forming those connections is an important task of consolidation. Mia’s considerable athletic skills, physical self-awareness, and prior experience represent a large body of knowledge to which the elements of a successful PLF would find many connections. As we’ve noted, sleep seems to help memory consolidation, but in any case, consolidation and transition of learning to long-term storage occur over a period of time.
An apt analogy for how the brain consolidates new learning may be the experience of composing an essay. The first draft is rangy, imprecise. You discover what you want to say by trying to write it. After a couple of revisions you have sharpened the piece and cut away some of the extraneous points. You put it aside to let it ferment. When you pick it up again a day or two later, what you want to say has become clearer in your mind. Perhaps you now perceive that there are three main points you are making. You connect them to examples and supporting information familiar to your audience. You rearrange and draw together the elements of your argument to make it more effective and elegant.
Similarly, the process of learning something often starts out feeling disorganized and unwieldy; the most important aspects are not always salient. Consolidation helps organize and solidify learning, and, notably, so does retrieval after a lapse of some time, because the act of retrieving a memory from long-term storage can both strengthen the memory traces and at the same time make them modifiable again, enabling them, for example, to connect to more recent learning. This process is called reconsolidation. This is how retrieval practice modifies and strengthens learning.
Suppose that on day 2 of jump school, you’re put on the spot to execute your parachute landing fall and you struggle to recall the correct posture and compose yourself—feet and knees together, knees slightly bent, eyes on the horizon—but in the reflex to break your fall you throw your arm out, forgetting to pull your elbows tight to your sides. You could have broken the arm or dislocated your shoulder if this were the real deal. This effort to reconstruct what you learned the day before is ragged, but in making it, critical elements of the maneuver come clearer and are reconsolidated for stronger memory. If you’re practicing something over and over in rapid-fire fashion, whether it’s your parachute landing fall or the conjugation of foreign verbs, you’re leaning on short-term memory, and very little mental effort is required. You show gratifying improvement rather quickly, but you haven’t done much to strengthen the underlying representation of those skills. Your performance in the moment is not an indication of durable learning. On the other hand, when you let the memory recede a little, for example by spacing or interleaving the practice, retrieval is harder, your performance is less accomplished, and you feel let down, but your learning is deeper and you will retrieve it more easily in the future.4
Retrieval
Learning, remembering, and forgetting work together in interesting ways. Durable, robust learning requires that we do two things. First, as we recode and consolidate new material from short-term memory into long-term memory, we must anchor it there securely. Second, we must associate the material with a diverse set of cues that will make us adept at recalling the knowledge later. Having effective retrieval cues is an aspect of learning that often goes overlooked. The task is more than committing knowledge to memory. Being able to retrieve it when we need it is just as important.
The reason we don’t remember how to tie knots even after we’ve been taught is that we don’t practice and apply what we’ve learned. Say you’re in the city park one day and come across an Eagle Scout teaching knots. On a whim you take an hour’s lesson. He demonstrates eight or ten specimens, explains what each is useful for, has you practice tying them, and sends you away with a short length of rope and a cheat sheet. You head home committed to learning these knots, but life is full, and you fail to practice them. They are soon forgotten, and this story could end there, with no learning. But then, as it happens, the following spring you buy a small fishing boat, and you want to attach an anchor on a line. With rope in hand and feeling mildly stumped, you recall from your lesson that there was a knot for putting a loop in the end of a line. You are now practicing retrieval. You find your cheat sheet and relearn how to tie a bowline. You put a small loop in the rope and then take the short end and draw it through, silently reciting the little memory device you were given: the rabbit comes up from his hole, goes around the tree, and goes back down. Retrieval again. A little snugging-up, and there you have your knot, a dandy piece of scoutcraft of the kind you’d always fancied knowing. Later, you put a piece of rope beside the chair where you watch TV and practice the bowline during commercials. You are doing spaced practice. Over the coming weeks you’re surprised at how many little jobs are easier if you have a piece of rope with a loop in the end. More spaced practice. By August you have discovered every possible use and purpose in your life for the bowline knot.
Knowledge, skills, and experiences that are vivid and hold significance, and those that are periodically practiced, stay with us. If you know you’re soon to throw yourself out of a troop transport, you listen up good when they’re telling you when and how to pull the rip cord on your reserve chute, or what can go wrong at twelve hundred feet and how to “just sort of swim out of it.” The mental rehearsal you conduct while lying in your bunk too tired to sleep and wishing the next day was already over and well-jumped is a form of spaced practice, and that helps you, too.
Extending Learning: Updating Retrieval Cues
There’s virtually no limit to how much learning we can remember as long as we relate it to what we already know. In fact, because new learning depends on prior learning, the more we learn, the more possible connections we create for further learning. Our retrieval capacity, though, is severely limited. Most of what we’ve learned is not accessible to us at any given moment. This limitation on retrieval is helpful to us: if every memory were always readily to hand, you would have a hard time sorting through the sheer volume of material to put your finger on the knowledge you need at the moment: where did I put my hat, how do I sync my electronic devices, what goes into a perfect brandy Manhattan?
Knowledge is more durable if it’s deeply entrenched, meaning that you have firmly and thoroughly comprehended a concept, it has practical importance or keen emotional weight in your life, and it is connected with other knowledge that you hold in memory. How readily you can recall knowledge from your internal archives is determined by context, by recent use, and by the number and vividness of cues that you have linked to the knowledge and can call on to help bring it forth.5
Here’s the tricky part. As you go through life, you often need to forget cues associated with older, competing memories so as to associate them successfully with new ones. To learn Italian in middle age, you may have to forget your high school French, because every time you think “to be” and hope to come up with the Italian essere, up pops être, despite your most earnest intentions. Traveling in England, you have to suppress your cues to drive on the right side of the road so you can establish reliable cues to stay on the left. Knowledge that is well entrenched, like real fluency in French or years of experience driving on the right side of the road, is easily relearned later, after a period of disuse or after being interrupted by competition for retrieval cues. It’s not the knowledge itself that has been forgotten, but the cues that enable you to find and retrieve it. The cues for the new learning, driving on the left, displace those for the old, driving on the right (if we are lucky).
The paradox is that some forgetting is often essential for new learning.6 When you change from a PC to a Mac, or from one Windows platform to another, you have to do enormous forgetting in order to learn the architecture of the new system and become adept at manipulating it so readily that your attention can focus on doing your work and not on working the machine. Jump school training provides another example: After their military service, many paratroopers take an interest in smoke jumping. Smokejumpers use different airplanes, different equipment, and different jump protocols. Having trained at the army’s jump school is cited as a distinct disadvantage for smoke jumping, because you have to unlearn one set of procedures that you have practiced to the point of reflex and replace them with another. Even in cases where both bodies of learning seem so similar to the uninitiated—jumping out of an airplane with a parachute on your back—you may have to forget the cues to a complex body of learning that you possess if you are to acquire a new one.
We know this problem of reassigning cues to memory from our own lives, even on the simplest levels. When our friend Jack first takes up with Joan, we sometimes call the couple “Jack and Jill,” as the cue “Jack and” pulls up the old nursery rhyme that’s so thoroughly embedded in memory. About the time we have “Jack and” reliably cuing “Joan,” alas, Joan throws him over, and he takes up with Jenny. Good grief! Half of the time that we mean to say Jack and Jenny we catch ourselves saying Jack and Joan. It would have been easier had Jack picked up with Katie, so that the trailing K sound in his name handed us off to the initiating K in hers, but no such luck. Alliteration can be a handy cue, or a subversive one. In all of this turmoil you don’t forget Jill, Joan, or Jenny, but you “repurpose” your cues so that you can keep pace with the changing opera of Jack’s life.7
It is a critical point that as you learn new things, you don’t lose from long-term memory most of what you have learned well in life; rather, through disuse or the reassignment of cues, you forget it in the sense that you’re unable to call it up easily. For example, if you’ve moved several times, you may not be able to recall a previous address from twenty years ago. But if you are given a multiple choice test for the address, you can probably pick it out easily, for it still abides, as it were, in the uncleaned closet of your mind. If you have ever immersed yourself in writing stories of your past, picturing the people and places of earlier days, you may have been surprised by the memories that started flooding back, things long forgotten now coming to mind. Context can unleash memories, as when the right key works to open an old lock.
In Marcel Proust’s Remembrance of Things Past, the narrator grieves over his inability to recall the days of his adolescence in the French village of his aunt and uncle, until one day the taste of a cake dipped in lime blossom tea brings it all rushing back, all the people and events he thought had long since been lost to time. Most people have experiences like Proust’s when a sight or sound or smell brings back a memory in full force, even some episode you have not thought about in years.8
Easier Isn’t Better
Psychologists have uncovered a curious inverse relationship between the ease of retrieval practice and the power of that practice to entrench learning: the easier knowledge or a skill is for you to retrieve, the less your retrieval practice will benefit your retention of it. Conversely, the more effort you have to expend to retrieve knowledge or skill, the more the practice of retrieval will entrench it.
Not long ago the California Polytechnic State University baseball team, in San Luis Obispo, became involved in an interesting experiment in improving their batting skills. They were all highly experienced players, adept at making solid contact with the ball, but they agreed to take extra batting practice twice a week, following two different practice regimens, to see which type of practice produced better results.
Hitting a baseball is one of the hardest skills in sports. It takes less than half a second for a ball to reach home plate. In this instant, the batter must execute a complex combination of perceptual, cognitive, and motor skills: determining the type of pitch, anticipating how the ball will move, and aiming and timing the swing to arrive at the same place and moment as the ball. This chain of perceptions and responses must be so deeply entrenched as to become automatic, because the ball is in the catcher’s mitt long before you can even begin to think your way through how to connect with it.
Part of the Cal Poly team practiced in the standard way. They practiced hitting forty-five pitches, evenly divided into three sets. Each set consisted of one type of pitch thrown fifteen times. For example, the first set would be fifteen fastballs, the second set fifteen curveballs, and the third set fifteen changeups. This was a form of massed practice. For each set of fifteen pitches, as the batter saw more of that type, he got gratifyingly better at anticipating the balls, timing his swings, and connecting. Learning seemed easy.
The rest of the team were given a more difficult practice regimen: the three types of pitches were randomly interspersed across the block of forty-five throws. For each pitch, the batter had no idea which type to expect. At the end of the forty-five swings, he was still struggling somewhat to connect with the ball. These players didn’t seem to be developing the proficiency their teammates were showing. The interleaving and spacing of different pitches made learning more arduous and feel slower.
The extra practice sessions continued twice weekly for six weeks. At the end, when the players’ hitting was assessed, the two groups had clearly benefited differently from the extra practice, and not in the way the players expected. Those who had practiced on the randomly interspersed pitches now displayed markedly better hitting relative to those who had practiced on one type of pitch thrown over and over. These results are all the more interesting when you consider that these players were already skilled hitters prior to the extra training. Bringing their performance to an even higher level is good evidence of a training regimen’s effectiveness.
Here again we see the two familiar lessons. First, that some difficulties that require more effort and slow down apparent gains—like spacing, interleaving, and mixing up practice—will feel less productive at the time but will more than compensate for that by making the learning stronger, more precise, and more enduring. Second, that our judgments of what learning strategies work best for us are often mistaken, colored by illusions of mastery.
When the baseball players at Cal Poly practiced curveball after curveball over fifteen pitches, it became easier for them to remember the perceptions and responses they needed for that type of pitch: the look of the ball’s spin, how the ball changed direction, how fast its direction changed, and how long to wait for it to curve. Performance improved, but the growing ease of recalling these perceptions and responses led to little durable learning. It is one skill to hit a curveball when you know a curveball will be thrown; it is a different skill to hit a curveball when you don’t know it’s coming. Baseball players need to build the latter skill, but they often practice the former, which, being a form of massed practice, builds performance gains on short-term memory. It was more challenging for the Cal Poly batters to retrieve the necessary skills when practice involved random pitches. Meeting that challenge made the performance gains painfully slow but also long lasting.
This paradox is at the heart of the concept of desirable difficulties in learning: the more effort required to retrieve (or, in effect, relearn) something, the better you learn it. In other words, the more you’ve forgotten about a topic, the more effective relearning will be in shaping your permanent knowledge.9
How Effort Helps
Effortful recall of learning, as happens in spaced practice, requires that you “reload” or reconstruct the components of the skill or material anew from long-term memory rather than mindlessly repeating them from short-term memory.10 During this focused, effortful recall, the learning is made pliable again: the most salient aspects of it become clearer, and the consequent reconsolidation helps to reinforce meaning, strengthen connections to prior knowledge, bolster the cues and retrieval routes for recalling it later, and weaken competing routes. Spaced practice, which allows some forgetting to occur between sessions, strengthens both the learning and the cues and routes for fast retrieval when that learning is needed again, as when the pitcher tries to surprise the batter with a curveball after pitching several fastballs. The more effort that is required to recall a memory or to execute a skill, provided that the effort succeeds, the more the act of recalling or executing benefits the learning.11
Massed practice gives us the warm sensation of mastery because we’re looping information through short-term memory without having to reconstruct the learning from long-term memory. But just as with rereading as a study strategy, the fluency gained through massed practice is transitory, and our sense of mastery is illusory. It’s the effortful process of reconstructing the knowledge that triggers reconsolidation and deeper learning.
Creating Mental Models
With enough effortful practice, a complex set of interrelated ideas or a sequence of motor skills fuse into a meaningful whole, forming a mental model somewhat akin to a “brain app.” Learning to drive a car involves a host of simultaneous actions that require all of our powers of concentration and dexterity while we are learning them. But over time, these combinations of cognition and motor skills—for example, the perceptions and maneuvers required to parallel park or manipulate a stick shift—become ingrained as sets of mental models associated with driving. Mental models are forms of deeply entrenched and highly efficient skills (seeing and unloading on a curveball) or knowledge structures (a memorized sequence of chess moves) that, like habits, can be adapted and applied in varied circumstances. Expert performance is built through thousands of hours of practice in your area of expertise, in varying conditions, through which you accumulate a vast library of such mental models that enables you to correctly discern a given situation and instantaneously select and execute the correct response.
Retrieval practice that you perform at different times and in different contexts and that interleaves different learning material has the benefit of linking new associations to the material. This process builds interconnected networks of knowledge that bolster and support mastery of your field. It also multiplies the cues for retrieving the knowledge, increasing the versatility with which you can later apply it.
Think of an experienced chef who has internalized the complex knowledge of how flavors and textures interact; how ingredients change form under heat; the differing effects to be achieved with a saucepan versus a wok, with copper versus cast iron. Think of the fly fisher who can sense the presence of trout and accurately judge the likely species, make the right choice of dry fly, nymph, or streamer, judge the wind, and know how and where to drop that fly to make the trout rise. Think of the kid on the BMX bike who can perform bunnyhops, tail whips, 180s, and wall taps off the features of an unfamiliar streetscape. Interleaving and variation mix up the contexts of practice and the other skills and knowledge with which the new material is associated. This makes our mental models more versatile, enabling us to apply our learning to a broader range of situations.
Fostering Conceptual Learning
How do humans learn concepts, for example the difference between dogs and cats? By randomly coming across dissimilar examples—Chihuahuas, tabby cats, Great Danes, picture book lions, calico cats, Welsh terriers. Spaced and interleaved exposure characterizes most of humans’ normal experience. It’s a good way to learn, because this type of exposure strengthens the skills of discrimination—the process of noticing particulars (a turtle comes up for air but a fish doesn’t)—and of induction: surmising the general rule (fish can breathe in water). Recall the interleaved study of birds in one case, and of paintings in another, that helped learners distinguish between bird types or the works of different painters while at the same time learning to identify underlying commonalities of the examples within a species or an artist’s body of work. When asked about their preferences and beliefs, the learners thought that the experience of studying multiple examples of one species of bird before studying examples of another species resulted in better learning. But the interleaved strategy, which was more difficult and felt clunky, produced superior discrimination of differences between types, without hindering the ability to learn commonalities within a type. As was true for the baseball players’ batting practice, interleaving produced difficulty in retrieving past examples of a particular species, which further solidified the learning of which birds are representative of a particular species.
The difficulty produced by interleaving provides a second type of boost to learning. Interleaved practice of related but dissimilar geometric solids requires that you notice similarities and differences in order to select the correct formula for computing the volume. It’s thought that this heightened sensitivity to similarities and differences during interleaved practice leads to the encoding of more complex and nuanced representations of the study material—a better understanding of how specimens or types of problems are distinctive and why they call for a different interpretation or solution. Why a northern pike will strike a spoon or a crankbait, say, but a bass will happily powder his nose until you see fit to throw him a grub or a popper.12
Improving Versatility
The retrieval difficulties posed by spacing, interleaving, and variation are overcome by invoking the same mental processes that will be needed later in applying the learning in everyday settings. By mimicking the challenges of practical experience, these learning strategies conform to the admonition to “practice like you play, and you’ll play like you practice,” improving what scientists call transfer of learning, which is the ability to apply what you’ve learned in new settings. In the Cal Poly batting practice experiment, the act of overcoming the difficulties posed by random types of pitches built a broader “vocabulary” of mental processes for discerning the nature of the challenge (e.g., what the pitcher is throwing) and selecting among possible responses than did the narrower mental processes sufficient for excelling during massed, nonvaried experience. Recall the grade school students who proved more adept at tossing beanbags into three-foot baskets after having practiced tossing into two- and four-foot baskets, compared to the students who only practiced tossing into three-foot baskets. Recall the increasing difficulty and complexity of the simulation training in jump school, or the cockpit simulator of Matt Brown’s business jet.
Priming the Mind for Learning
When you’re asked to struggle with solving a problem before being shown how to solve it, the subsequent solution is better learned and more durably remembered. When you’ve bought your fishing boat and are attempting to attach an anchor line, you’re far more likely to learn and remember the bowline knot than when you’re standing in a city park being shown the bowline by a Boy Scout who thinks you would lead a richer life if you had a handful of knots in your repertoire.
Other Learning Strategies That Incorporate Desirable Difficulties
We usually think of interference as a detriment to learning, but certain kinds of interference can produce learning benefits, and the positive effects are sometimes surprising. Would you rather read an article that has normal type or type that’s somewhat out of focus? Almost surely you would opt for the former. Yet when text on a page is slightly out of focus or presented in a font that is a little difficult to decipher, people recall the content better. Should the outline of a lecture follow the precise flow of a chapter in a textbook, or is it better if the lecture mismatches the text in some ways? It turns out that when the outline of a lecture proceeds in a different order from the textbook passage, the effort to discern the main ideas and reconcile the discrepancy produces better recall of the content. In another surprise, when letters are omitted from words in a text, requiring the reader to supply them, reading is slowed, and retention improves. In all of these examples, the change from normal presentation introduces a difficulty—disruption of fluency—that makes the learner work harder to construct an interpretation that makes sense. The added effort increases comprehension and learning. (Of course, learning will not improve if the difficulty completely obscures the meaning or cannot be overcome.)
The act of trying to answer a question or attempting to solve a problem rather than being presented with the information or the solution is known as generation. Even if you’re being quizzed on material you’re familiar with, the simple act of filling in a blank has the effect of strengthening your memory of the material and your ability to recall it later. In testing, being required to supply an answer rather than select from multiple choice options often provides stronger learning benefits. Having to write a short essay makes them stronger still.
Overcoming these mild difficulties is a form of active learning, where students engage in higher-order thinking tasks rather than passively receiving knowledge conferred by others.
When you’re asked to supply an answer or a solution to something that’s new to you, the power of generation to aid learning is even more evident. One explanation for this effect is the idea that as you cast about for a solution, retrieving related knowledge from memory, you strengthen the route to a gap in your learning even before the answer is provided to fill it and, when you do fill it, connections are made to the related material that is fresh in your mind from the effort. For example, if you’re from Vermont and are asked to name the capital of Texas you might start ruminating on possibilities: Dallas? San Antonio? El Paso? Houston? Even if you’re unsure, thinking about alternatives before you hit on (or are given) the correct answer will help you. (Austin, of course.) Wrestling with the question, you rack your brain for something that might give you an idea. You may get curious, even stumped or frustrated and acutely aware of the hole in your knowledge that needs filling. When you’re then shown the solution, a light goes on. Unsuccessful attempts to solve a problem encourage deep processing of the answer when it is later supplied, creating fertile ground for its encoding, in a way that simply reading the answer cannot. It’s better to solve a problem than to memorize a solution. It’s better to attempt a solution and supply the incorrect answer than not to make the attempt.14
The act of taking a few minutes to review what has been learned from an experience (or in a recent class) and asking yourself questions is known as reflection. After a lecture or reading assignment, for example, you might ask yourself: What are the key ideas? What are some examples? How do these relate to what I already know? Following an experience where you are practicing new knowledge or skills, you might ask: What went well? What could have gone better? What might I need to learn for better mastery, or what strategies might I use the next time to get better results?
Reflection can involve several cognitive activities we have discussed that lead to stronger learning. These include retrieval (recalling recently learned knowledge to mind), elaboration (for example, connecting new knowledge to what you already know), and generation (for example, rephrasing key ideas in your own words or visualizing and mentally rehearsing what you might do differently next time).
One form of reflection that is gaining currency in classroom settings is called “write to learn.” In essence, students reflect on a recent class topic in a brief writing assignment, where they may express the main ideas in their own words and relate them to other concepts covered in class, or perhaps outside class. (For an example, read in Chapter 8 about the “learning paragraphs” Mary Pat Wenderoth assigns her students in her human physiology course.) The learning benefits from the various cognitive activities that are engaged during reflection (retrieval, elaboration, generation) have been well established through empirical studies.
An interesting recent study specifically examined “write to learn” as a learning tool. Over eight hundred college students in several introductory psychology classes listened to lectures throughout the semester. Following the presentation of a key concept within a given lecture, the instructor asked students to write to learn. Students generated their own written summaries of the key ideas, for example restating concepts in their own words and elaborating on the concepts by generating examples of them. For other key concepts presented during the lecture, students were shown a set of slides summarizing the concepts and spent a few minutes copying down key ideas and examples verbatim from the slide.
What was the result? On exams administered during the semester, the students were asked questions that assessed their understanding of the key concepts that they had worked on learning. They scored significantly (approximately half a letter grade) better on the ones they had written about in their own words than on those they had copied, showing that it was not simply exposure to the concepts that produced the learning benefit. In follow-up tests approximately two months later to measure retention, the benefits of writing to learn as a form of reflection had dropped but remained robust.15

Failure and the Myth of Errorless Learning
In the 1950s and 1960s, the psychologist B. F. Skinner advocated the adoption of “errorless learning” methods in education in the belief that errors by learners are counterproductive and result from faulty instruction. The theory of errorless learning gave rise to instructional techniques in which learners were spoonfed new material in small bites and immediately quizzed on them while they still remained on the tongue, so to speak, fresh in short-term memory and easily spit out onto the test form. There was virtually no chance of making an error. Since those days we’ve come to understand that retrieval from short-term memory is an ineffective learning strategy and that errors are an integral part of striving to increase one’s mastery over new material. Yet in our Western culture, where achievement is seen as an indicator of ability, many learners view errors as failure and do what they can to avoid committing them. The aversion to failure may be reinforced by instructors who labor under the belief that when learners are allowed to make errors it’s the errors that they will learn.16 This is a misguided impulse. When learners commit errors and are given corrective feedback, the errors are not learned. Even strategies that are highly likely to result in errors, like asking someone to try to solve a problem before being shown how to do it, produce stronger learning and retention of the correct information than more passive learning strategies, provided there is corrective feedback. Moreover, people who are taught that learning is a struggle that often involves making errors will go on to exhibit a greater propensity to tackle tough challenges and will tend to see mistakes not as failures but as lessons and turning points along the path to mastery. To see the truth of this, look no further than the kid down the hall who is deeply absorbed in working his avatar up through the levels of an action game on his Xbox video console.
A fear of failure can poison learning by creating aversions to the kinds of experimentation and risk taking that characterize striving, or by diminishing performance under pressure, as in a test setting. In the latter instance, students who have a high fear of making errors when taking tests may actually do worse on the test because of their anxiety. Why? It seems that a significant portion of their working memory capacity is expended to monitor their performance (How am I doing? Am I making mistakes?), leaving less working memory capacity available to solve the problems posed by the test. “Working memory” refers to the amount of information you can hold in mind while working through a problem, especially in the face of distraction. Everyone’s working memory is severely limited, some more than others, and larger working memory capacities correlate with higher IQs.
To explore this theory about how fear of failure reduces test performance, sixth graders in France were given very difficult anagram problems that none of them could solve. After struggling unsuccessfully with the problems, half of the kids received a ten-minute lesson in which they were taught that difficulty is a crucial part of learning, errors are natural and to be expected, and practice helps, just as in learning to ride a bicycle. The other kids were simply asked how they had gone about trying to solve the anagrams. Then both groups were given a difficult test whose results provided a measure of working memory. The kids who had been taught that errors are a natural part of learning showed significantly better use of working memory than did the others. These children did not expend their working memory capacity in agonizing over the difficulty of the task. The theory was further tested in variations of the original study. The results support the finding that difficulty can create feelings of incompetence that engender anxiety, which in turn disrupts learning, and that “students do better when given room to struggle with difficulty.”17 These studies point out that not all difficulties in learning are desirable ones. Anxiety while taking a test seems to represent an undesirable difficulty. These studies also underscore the importance of learners understanding that difficulty in learning new things is not only to be expected but can be beneficial. To this point, the French study stands on the shoulders of many others, among the foremost being the works of Carol Dweck and of Anders Ericsson, both of whom we discuss in Chapter 7 in relation to the topic of increasing intellectual abilities. Dweck’s work shows that people who believe that their intellectual ability is fixed from birth, wired in their genes, tend to avoid challenges at which they may not succeed, because failure would appear to be an indication of lesser native ability. 
By contrast, people who are helped to understand that effort and learning change the brain, and that their intellectual abilities lie to a large degree within their own control, are more likely to tackle difficult challenges and persist at them. They view failure as a sign of effort and as a turn in the road rather than as a measure of inability and the end of the road. Anders Ericsson’s work investigating the nature of expert performance shows that to achieve expertise requires thousands of hours of dedicated practice in which one strives to surpass one’s current level of ability, a process in which failure becomes an essential experience on the path to mastery.
The study of the French sixth graders received wide publicity and inspired the staging of a “Festival of Errors” by an elite graduate school in Paris, aimed at teaching French schoolchildren that making mistakes is a constructive part of learning: not a sign of failure but of effort. Festival organizers argued that modern society’s focus on showing results has led to a culture of intellectual timorousness, starving the kind of intellectual ferment and risk-taking that produced the great discoveries that mark French history.
It doesn’t require a great conceptual leap to get from Paris’s “Festival of Errors” to San Francisco’s “FailCon,” where technology entrepreneurs and venture capitalists meet once a year to study failures that gave them critical insights they needed in order to pivot in their business strategies so as to succeed. Thomas Edison called failure the source of inspiration, and is said to have remarked, “I’ve not failed. I’ve just found 10,000 ways that don’t work.” He argued that perseverance in the face of failure is the key to success.
Failure underlies the scientific method, which has advanced our understanding of the world we inhabit. The qualities of persistence and resiliency, where failure is seen as useful information, underlie successful innovation in every sphere and lie at the core of nearly all successful learning. Failure points to the need for redoubled effort, or liberates us to try different approaches. Steve Jobs, in his remarks to the Stanford University graduating class of 2005, spoke of being fired at age thirty in 1985 from Apple Computer, which he had cofounded. “I didn’t see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life.” It’s not the failure that’s desirable, it’s the dauntless effort despite the risks, the discovery of what works and what doesn’t that sometimes only failure can reveal. It’s trusting that trying to solve a puzzle serves us better than being spoon-fed the solution, even if we fall short in our first attempts at an answer.
An Example of Generative Learning
As we said earlier, the process of trying to solve a problem without the benefit of having been taught how is called generative learning, meaning that the learner is generating the answer rather than recalling it. Generation is another name for old-fashioned trial and error. We’re all familiar with the stories of skinny kids in Silicon Valley garages messing around with computers and coming out billionaires. We would like to serve up a different kind of example here: Minnesota’s Bonnie Blodgett.
Bonnie is a writer and a self-taught ornamental gardener in a constant argument with a voice in her head that keeps nattering about all the ways her latest whim is sure to go haywire and embarrass her. While she is a woman of strong aesthetic sensibilities, she is also one of epic doubts. Her “learning style” might be called leap-before-you-look-because-if-you-look-first-you-probably-won’t-like-what-you-see. Her garden writing appears under the name “The Blundering Gardener.” This moniker is a way of telling her voices of doubt to take a hike, because whatever the consequences of the next whim, she’s already rolling up her sleeves. “Blundering means that you get going on your project before you have figured out how to do it in the proper way, before you know what you’re getting into. For me, the risk of knowing what you’re getting into is that it becomes an overwhelming obstacle to getting started.”18 Bonnie’s success shows how struggling with a problem makes for strong learning, and how a sustained commitment to advancing in a particular field of endeavor through trial-and-error effort leads to complex mastery and greater knowledge of the interrelationships of things. When we spoke, she had just traveled to southern Minnesota to meet with a group of farmers who wanted her gardening insights on a gamut of issues ranging from layout and design to pest control and irrigation. In the years since she first sank her spade, Bonnie’s garden writing has won national recognition and found a devoted following far and wide through many outlets, and her garden has become a destination for other gardeners.
She came to ornamental gardening about the time she found herself eyeballing middle age. She had no training, just a burning desire to get her hands dirty making beautiful spaces on the corner lot of the home she shares with her husband in a historic neighborhood of St. Paul.
“The experience of creating beauty calms me down,” she says, but it’s strictly a discovery process. She has always been a writer, and some years after having launched herself into the garden, she began publishing the Garden Letter, a quarterly for northern gardeners in which she chronicles her exploits, mishaps, lessons, and successes. She writes the same way that she gardens, with boldness and self-effacing humor, passing along the entertaining snafus and unexpected insights that are the fruits of experience. In calling herself the Blundering Gardener, she is giving herself and us, her readers, permission to make mistakes and get on with it.
Note that in writing about her experiences, Bonnie is engaging two potent learning processes beyond the act of gardening itself. She is retrieving the details and the story of what she has discovered—say, about an experiment in grafting two species of fruit trees—and then she is elaborating by explaining the experience to her readers, connecting the outcome to what she already knows about the subject or has learned as a result.
Her leap-taking impulses have taken her through vast swaths of the plant kingdom, of course, and deeply into the Latin nomenclature and the classic horticultural literature. These impulses have also drawn her into the aesthetics of space and structure and the mechanics thereof: building stone walls; digging and wiring water features; putting a cupola on the garage; building paths, stairs, and gates; ripping out a Gothic picket fence and reusing the wood to create something more open and with stronger horizontal lines to pull down the soaring verticality of her three-story Victorian house and connect it with the gardens that surround it; making the outdoor spaces airier and more easily seen from the street, while still circumscribed, so as to impart that essential sense of privacy that makes a garden a room of its own. Her spaces are idiosyncratic and asymmetrical, giving the illusion of having evolved naturally, yet they cohere, through the repetition of textures, lines, and geometry.
A simple example of how she has backed into more and more complex mastery is the manner in which she came to embrace plant classification and the Latin terminology. “When I started, the world of plants was a completely foreign language to me. I would read gardening books and be completely lost. I didn’t know what plant names were, common or Latin. I wasn’t thinking about learning this stuff, ever. I’m like, Why would you want to do that? Why wouldn’t you just get outside and dig a hole and put something in it?” What she relished were pictures that gave her ideas and passages of text where the designers used phrases like “my process” in describing how they had achieved the desired effect. It was the possessive pronoun, my process, that affirmed Bonnie in her headlong rush to learn by doing. The notion is that every gardener’s process is uniquely his or her own. Bonnie’s process did not involve taking direction from experts, much less mastering the Linnaean taxonomy or the Latin names of what she stuck in holes and dragged her water hose to. But as she thrashed around, working to achieve in dirt the magical spaces that danced in her mind, she came to Latin and Linnaeus despite herself.
“You begin to discover that the Latin names are helpful. They can give you a shortcut to understanding the nature of the plants, and they can help you remember. Tardiva, which is a species name, comes after hydrangea, which is a genus.” Bonnie had taken Latin in high school, along with French, and of course English, and the cues to those memories began to reawaken. “I can easily see that tardiva means late, like tardy. The same word comes after many plant varieties, so you see the genus and then the species is tardiva, and now you know that particular plant is a late bloomer. So you begin to realize that the Latin names are a way of helping you remember, and you find yourself using them more and more. Also you remember plants better, because it’s second nature to you that procumbens means prostrate, crawling on the ground. It makes sense. So now it’s not so hard to remember that particular species name when it’s attached to a genus. It’s also important to know the Latin names because then you can be absolutely specific about a plant. Plants have common names, and common names are regional. Actaea racemosa has a common name of black cohosh, but it’s also known as snakeroot, and those names are often given to other plants. There’s only one Actaea racemosa.” Gradually, and despite her inclination to resist, she came to grasp the classical taxonomy of ornamental plants and to appreciate how Linnaeus’s schema frames family connections and communicates attributes.
Bonnie said that the farmers she had recently met were particularly interested in what she has learned about the advantages of composting and earthworms over chemical fertilizers for building nutrients and soil aeration, and how to get strong root growth on low rations of water through a homemade system of drip irrigation. She paused in recounting her meeting with them, reflecting on how all of this knowledge has sneaked up on her. It was never something she set out to conquer. “Look, blundering’s really not a bad thing. It’s a good thing in that you get stuff done. A lot of people, when they contemplate the enormity of the task and they see all that’s entailed, they’re stopped in their tracks.” Of course, in some settings—like learning to jump out of airplanes and walk away with your life—blundering is not the optimal learning strategy.
Elizabeth and Robert Bjork, who coined the phrase “desirable difficulties,” write that difficulties are desirable because “they trigger encoding and retrieval processes that support learning, comprehension, and remembering. If, however, the learner does not have the background knowledge or skills to respond to them successfully, they become undesirable difficulties.”19 Cognitive scientists know from empirical studies that testing, spacing, interleaving, variation, generation, and certain kinds of contextual interference lead to stronger learning and retention. Beyond that, we have an intuitive sense of what kinds of difficulties are undesirable but, for lack of the needed research, we cannot yet be definitive.
Clearly, impediments that you cannot overcome are not desirable. Outlining a lesson in a sequence different from the one in the textbook is not a desirable difficulty for learners who lack the reading skills or language fluency required to hold a train of thought long enough to reconcile the discrepancy. If your textbook is written in Lithuanian and you don’t know the language, this hardly represents a desirable difficulty. To be desirable, a difficulty must be something learners can overcome through increased effort.
Intuitively it makes sense that difficulties that don’t strengthen the skills you will need, or the kinds of challenges you are likely to encounter in the real-world application of your learning, are not desirable. Having somebody whisper in your ear while you read the news may be essential training for a TV anchor. Being heckled by role-playing protestors while honing your campaign speech may help train up a politician. But neither of these difficulties is likely to be helpful for Rotary Club presidents or aspiring YouTube bloggers who want to improve their stage presence. A cub towboat pilot on the Mississippi might be required in training to push a string of high-riding empty barges into a lock against a strong side wind. A baseball player might practice hitting with a weight on his bat to strengthen his swing. You might teach a football player some of the principles of ballet for learning balance and movement, but you probably would not teach him the techniques for an effective golf drive or backhand tennis serve.
Is there an overarching rule that determines the kinds of impediments that make learning stronger? Time and further research may yield an answer. But the kinds of difficulties we’ve just described, whose desirability is well documented, offer a large and diverse toolkit already at hand.
Learning is at least a three-step process: initial encoding of information is held in short-term working memory before being consolidated into a cohesive representation of knowledge in long-term memory. Consolidation reorganizes and stabilizes memory traces, gives them meaning, and makes connections to past experiences and to other knowledge already stored in long-term memory. Retrieval updates learning and enables you to apply it when you need it.
Learning always builds on a store of prior knowledge. We interpret and remember events by building connections to what we already know.
Long-term memory capacity is virtually limitless: the more you know, the more possible connections you have for adding new knowledge.
Because of the vast capacity of long-term memory, having the ability to locate and recall what you know when you need it is key; your facility for calling up what you know depends on the repeated use of the information (to keep retrieval routes strong) and on your establishing powerful retrieval cues that can reactivate the memories.
Periodic retrieval of learning helps strengthen connections to the memory and the cues for recalling it, while also weakening routes to competing memories. Retrieval practice that’s easy does little to strengthen learning; the more difficult the practice, the greater the benefit.
When you recall learning from short-term memory, as in rapid-fire practice, little mental effort is required, and little long-term benefit accrues. But when you recall it after some time has elapsed and your grasp of it has become a little rusty, you have to make an effort to reconstruct it. This effortful retrieval both strengthens the memory and makes the learning pliable again, leading to its reconsolidation. Reconsolidation helps update your memories with new information and connect them to more recent learning.
Repeated effortful recall or practice helps integrate learning into mental models, in which a set of interrelated ideas or a sequence of motor skills are fused into a meaningful whole that can be adapted and applied in later settings. Examples are the perceptions and manipulations involved in driving a car or in knocking a curveball out of the ballpark.
When practice conditions are varied or retrieval is interleaved with the practice of other material, we increase our abilities of discrimination and induction and the versatility with which we can apply the learning in new settings at a later date. Interleaving and variation build new connections, expanding and more firmly entrenching knowledge in memory and increasing the number of cues for retrieval.
Trying to come up with an answer rather than having it presented to you, or trying to solve a problem before being shown the solution, leads to better learning and longer retention of the correct answer or solution, even when your attempted response is wrong, so long as corrective feedback is provided.
5 Avoid Illusions of Knowing
AT THE ROOT of our effectiveness is our ability to grasp the world around us and to take the measure of our own performance. We’re constantly making judgments about what we know and don’t know and whether we’re capable of handling a task or solving a problem. As we work at something, we keep an eye on ourselves, adjusting our thinking or actions as we progress.
Monitoring your own thinking is what psychologists call metacognition (meta is Greek for “about”). Learning to be accurate self-observers helps us to stay out of blind alleys, make good decisions, and reflect on how we might do better next time. An important part of this skill is being sensitive to the ways we can delude ourselves. One problem with poor judgment is that we usually don’t know when we’ve got it. Another problem is the sheer scope of the ways our judgment can be led astray.1 In this chapter we discuss perceptual illusions, cognitive biases, and distortions of memory that commonly mislead people. Then we suggest techniques for keeping your judgment squared with reality.
The consequences of poor judgment fill the daily papers. During the summer of 2008, three stickup artists in Minneapolis had a system going of phoning in large fast-food orders and then relieving the delivery man of all the goods and cash he carried. As a livelihood it was a model of simplicity. They kept at it, failing to consider the wisdom of always placing their orders from the same two cell phones and taking delivery at the same two addresses.
David Garman, a Minneapolis cop, was working undercover that summer. “It was getting more aggressive. At the beginning, it was ‘maybe they had a gun,’ then all of a sudden there were a couple of guns, and then they were hurting the people when they were robbing them.”
It was a night in August when Garman got a call about a large order phoned in to a Chinese restaurant. He organized a small team on short notice and prepared to pose as the delivery guy. He pulled on a bulletproof vest, covered it with a casual shirt, and shoved his .45 automatic into his pants. While his colleagues staked out positions near the delivery address, Garman picked up the food, drove there, and parked with his brights shining on the front door. He’d cut a slit in the bottom of the food bag and tucked a .38 inside to rest in his hand as he carried the package. “The .38 has a covered hammer on it, so I can shoot it in a bag. If I were to put the automatic in there, it’d jam and I’d be screwed.”

So I walk up with the package and I say, “Hey, sir, did you order some food?” He says, “Yup,” and I’m thinking this guy’s really just going to pay me and I’m going to be out of here, and this is going to be the dumbest thing we’ve ever done. I’m thinking if he hands me $40, I don’t even know how much this food is. But he turns his head to look halfway back and two other guys start to come up, and as they’re walking towards me they flip hoods over their heads. That’s when I know it’s game time. The first guy whips a gun out of his pocket and racks it and puts it to my head all in one motion, saying, “Give me everything you’ve got motherfucker or I’ll kill you.” I ended up shooting him through the bag. It was four rounds.2

Not such a great livelihood after all. The guy was hit low and survived, although he is a lesser man as a result. Garman would have aimed higher if the food package hadn’t been so heavy, and he took a lesson from the experience: he’s better prepared for the next time, though he’d rather we didn’t describe just how.
We like to think we’re smarter than the average doodle, and even if we’re not, we feel affirmed in this delusion each year when the newest crop of Darwin Awards circulates by email, that short list of self-inflicted fatalities caused by spectacularly poor judgment, as in the case of the attorney in Toronto who was demonstrating the strength of the windows in his twenty-two-story office tower by throwing his shoulder against the glass when he broke it and fell through. The truth is that we’re all hardwired to make errors in judgment. Good judgment is a skill one must acquire, becoming an astute observer of one’s own thinking and performance. We start at a disadvantage for several reasons. One is that when we’re incompetent, we tend to overestimate our competence and see little reason to change. Another is that, as humans, we are readily misled by illusions, cognitive biases, and the stories we construct to explain the world around us and our place within it. To become more competent, or even expert, we must learn to recognize competence when we see it in others, become more accurate judges of what we ourselves know and don’t know, adopt learning strategies that get results, and find objective ways to track our progress.
Two Systems of Knowing
In his book Thinking, Fast and Slow, Daniel Kahneman describes our two analytic systems. What he calls System 1 (or the automatic system) is unconscious, intuitive, and immediate. It draws on our senses and memories to size up a situation in the blink of an eye. It’s the running back dodging tackles in his dash for the end zone. It’s the Minneapolis cop, walking up to a driver he’s pulled over on a chilly day, taking evasive action even before he’s fully aware that his eye has seen a bead of sweat run down the driver’s temple.
System 2 (the controlled system) is our slower process of conscious analysis and reasoning. It’s the part of thinking that considers choices, makes decisions, and exerts self-control. We also use it to train System 1 to recognize and respond to particular situations that demand reflexive action. The running back is using System 2 when he walks through the moves in his playbook. The cop is using it when he practices taking a gun from a shooter. The neurosurgeon is using it when he rehearses his repair of the torn sinus.
System 1 is automatic and deeply influential, but it is susceptible to illusion, and you depend on System 2 to help you manage yourself: by checking your impulses, planning ahead, identifying choices, thinking through their implications, and staying in charge of your actions. When a guy in a restaurant walks past a mother with an infant and the infant cries out “Dada!” that’s System 1. When the blushing mother says, “No, dear, that’s not Dada, that’s a man,” she is acting as a surrogate System 2, helping the infant refine her System 1.
System 1 is powerful because it draws on our accumulated years of experience and our deep emotions. System 1 gives us the survival reflex in moments of danger, and the astonishing deftness earned through thousands of hours of deliberate practice in a chosen field of expertise. In the interplay between Systems 1 and 2—the topic of Malcolm Gladwell’s book Blink—your instantaneous ability to size up a situation plays against your capacity for skepticism and thoughtful analysis. Of course, when System 1’s conclusions arise out of misperception or illusion, they can steer you into trouble. Learning when to trust your intuition and when to question it is a big part of how you improve your competence in the world at large and in any field where you want to be expert. It’s not just the dullards who fall victim. We all do, to varying degrees. Pilots, for example, are susceptible to a host of perceptual illusions. They are trained to beware of them and to use their instruments to know that they’re getting things right.
A frightening example with a happy ending is China Airlines Flight 006 on a winter day in 1985. The Boeing 747 was 41,000 feet above the Pacific, almost ten hours into its eleven-hour flight from Taipei to LA, when engine number 4 lost power. The plane began to lose airspeed. Rather than taking manual control and descending below 30,000 feet to restart the engine, as prescribed in the flight book, the crew held at 41,000 with the autopilot engaged and attempted a restart. Meanwhile, loss of the outboard engine gave the plane asymmetrical thrust. The autopilot tried to correct for this and keep the plane level, but as the plane continued to slow it also began to roll to the right. The captain was aware of the deceleration, but not the extent to which the plane had entered a right bank; his System 1 clue would have been his vestibular reflex—how the inner ear senses balance and spatial orientation—but because of the plane’s trajectory, he had the sensation of flying level. His System 2 clues would have been a glimpse at the horizon and his instruments. Correct procedure called for applying left rudder to help raise the right wing, but his System 2 focus was on the airspeed indicator and on the efforts of the first officer and engineer to restart the engine.
As its bank increased, the plane descended through 37,000 feet into high clouds, which obscured the horizon. The captain switched off the autopilot and pushed the nose down to get more speed, but the plane had already rolled beyond 45 degrees and now turned upside down and fell into an uncontrolled descent. The crew were confused by the situation. They understood the plane was behaving erratically but were unaware they had overturned and were in a dive. They could no longer discern thrust from engines 1–3 and concluded those engines had quit as well. The plane’s dive was evident from their flight gauges, but the angle was so unlikely the crew decided the gauges had failed. At 11,000 feet they broke through the clouds, astonished to see that they were roaring toward earth. The captain and first officer both pulled back hard on the stick, exerting enormous forces on the plane but managing to level off. Landing gear hung from the plane’s belly, and they’d lost one of their hydraulic systems, but all four engines came to life, and the captain was able to fly on, diverting successfully to San Francisco. An inspection revealed just how severe their maneuver had been. Strains five times the force of gravity had bent the plane’s wings permanently upward, broken two landing gear struts, and torn away two landing gear doors and large parts of the rear horizontal stabilizers.