
Thursday, December 29, 2016

Thwart Stress Effects on Memory

It is well known that stress can impair memory. Everyone has had some experience of this kind. A student suffering test anxiety is likely to see grades suffer. In high-stakes social or business interactions, stress may well cause memory to fail us, as when Presidential candidate Rick Perry forgot the name of the agency he wanted to abolish if elected, or when we forget a friend’s name in the middle of making a social introduction. How does stress do this? Is there anything we can do about it?
First, we need to know what stressful events do to the body and brain. Brain freezes, like Rick Perry's, probably occur because thinking becomes so preoccupied with the stress-inducing stimuli that other thoughts cannot emerge. But other kinds of stress-induced memory impairment come from the well-known “fight or flight” response, in which stress triggers the release of adrenalin into the blood stream. Adrenalin has many bodily effects that support fight or flight, such as raising heart rate and blood pressure and increasing arousal, perhaps to the point of anxiety and fear. The increased attentiveness may have a fleeting beneficial effect on memory, as has been demonstrated in laboratory experiments. But the other effects of adrenalin on anxiety and distress are likely to impair memory.
The other thing that happens during stress is that the anterior pituitary gland releases ACTH, which in turn activates another part of the adrenal gland to dump cortisol into the blood stream. In the short term, cortisol can have many beneficial effects for combatting stress, such as mobilizing white blood cells and enhancing the immune system. But cortisol binds to cells in the brain’s hippocampus, the area that converts new experiences into memory. This binding actually disrupts the memory-forming process. Ultimately, if stress continues, the synaptic regions deteriorate, making the impairment permanent.
The effects of both adrenalin and cortisol were revealed in an interesting study of mild social stress. Here, the focus was on a theory of how stress effects on memory might be thwarted by a learning technique called forced retrieval. Prior research with students had shown that the usual study technique of re-reading notes or text is not nearly as effective as requiring the learner to actively retrieve the information, as one might do with flash cards, for example. Just a few months ago, I posted a blog on this forced-retrieval phenomenon as a key element in “strategic studying.”
This new research was aimed at testing the possibility that forced retrieval might protect learners from the memory deficits caused by stress. On the first day of the study, 120 subjects studied a list of 30 nouns, or images of nouns, one at a time. Then one half of the group restudied the items while the other half practiced retrieval by recalling as many items as they could (but without feedback telling them whether they got it right). On the next day, half of each group were stressed by being required to solve hard math problems and to give speeches in front of two judges and three peers. Then they were tested. Twenty minutes later they took a second test on items that had not been tested on the first test. The results revealed that retrieval practice yielded better results.


  
On the first test, given immediately after the stress, the stressed learners who had merely studied the items the day before remembered fewer of the items. But there was no such effect on the stressed learners who had used retrieval practice during the initial learning. This protective effect of retrieval practice was also evident on the second test 25 minutes later. In fact, the retrieval-practice effect was larger than on the first test, even though different items were tested. You may have noticed that the stressed study group did worse on the second test than on the first. This is attributed to a mild effect of adrenalin, which, as mentioned above, can have some benefit for memory. Adrenalin’s action is immediate and is apparently swamped on the second test by the delayed release of cortisol, which shows up by that time. Students might note that the magnitude of the difference may appear small, but in percentage terms it could amount to more than two letter grades (compare the two stressed groups on the delayed test).
To explain why forced retrieval works, the authors speculate that it provides better initial encoding. That is, the new information is registered more strongly if you make yourself try to retrieve it. This is consistent with the everyday experience most of us have had wherein information that strongly grabs our attention is more likely to be remembered. Forced retrieval is a way to make ourselves pay better attention to what we are trying to learn.

Readers wanting to learn more about improving memory are urged to check “Memory Medic’s” books, Memory Power 101 and Better Grades, Less Effort.
               
Sources:

Klemm, W. R. (2016). Strategic studying. October 9, http://thankyoubrain.blogspot.com/2016/10/strategic-studying.html

Smith, Amy M. et al. (2016). Retrieval practice protects memory against acute stress. Science. 354 (6315), 1046-1047.


Saturday, December 10, 2016

Base Relationships on the Present, Not the Past

Everyone has feelings about those who have been close to them: parents, siblings, spouses, and colleagues. Those feelings are usually formed from memories of past interactions with those people. When those memories are negative, they can poison relationships and lead to terrible results: family feuds, alienated siblings, estrangement between children and parents, divorce, lawsuits, and assorted vendettas. The saddest part of all is that research is showing that many of these negative memories can be wrong.

Memories are seldom fully literal. Memories are constructed, not recorded like an audio tape. The brain decides how an experience is to be packaged as a narrative to remember. We even generate fictions for experiences that do not involve our own interpersonal relationships. Witness the conflicting stories about how many planes struck the World Trade Center, or the imagined Ferguson "hands up, don't shoot" incident. The criminal justice system now downplays eyewitness testimony because so much of it has proven unreliable in the past. Often this happens when experiences are intense and complex, causing the over-taxed brain to jam them unthinkingly into its already formed store of memories.

Construction of false memory is especially likely during childhood, for several inevitable reasons:
·         Children do not process reality as readily or correctly as adults.
·         The brain circuitry of children changes dramatically as brains grow and re-wire, which causes many memories to be lost or corrupted.
·         Constant replay of the memory over the years leads to further alteration of the memory and the repetition confirms the memory, even when it is wrong.

A recent article in the Wall Street Journal says that we categorize memories to help define ourselves. The author says this is a good thing because it is a method for bolstering one's ego. We may, for example, construct memories to help us think of ourselves as superior, righteous, or likable. But others will construct memories that confirm a pre-existing low self-esteem, thinking of oneself as a victim, incorrigible, unlikable, or whatever. This is a well-studied phenomenon that psychologists call confirmation bias. For better or worse, we transform real experiences into memories that are a "creative blend" that mixes fact and fiction.

When we construct memories that put a negative spin on past interactions with others, we build a negative attitude toward them. Negative attitudes about others are hard to hide. Then as subsequent relationship experiences occur, they too get the negative spin, adding to the storehouse of false memories that can grow into hostility. Rubbing salt into mental wounds by rehearsing grievances year after year intensifies the memory and reinforces belief in it. Apologies and forgiveness become harder and harder to generate.

Why does the brain work this way? A Harvard study revealed that the same areas of the brain are used for remembering past events and for imagining events. A University of Dayton study showed another reason: people have an unconscious incentive to create false memories to protect themselves from threats to their beliefs about themselves. As a relatively benign example, college students who opposed increased tuition, after writing an essay that required them to defend a tuition increase, misremembered their initial opposition.

More serious consequences result when, as a Northwestern U. psychology professor explains, people exaggerate the negativity or misery of past experiences to impress themselves and others by their endurance of suffering or "escape" from it. Such exaggeration also occurs as responses to real-time events, as for example when people put the worst possible spin on a current experience. It makes them seem to be a bigger victim and coping with it seems like a bigger achievement.

A University of Utah psychologist says false memories take on more meaning and apparent justification when recounted to others. So as if the false memory were not bad enough, we use it to poison the reputation of others. A child who thinks parents or siblings were unfair gains validation by telling friends about the presumed mistreatment. A worker may put a negative spin on an annual review and may feel better if he uses that memory to discredit the boss in the eyes of others.
The damage in such cases is three-fold: 1) lying to oneself prevents dealing with real solutions, 2) damaging the reputation of others is mean-spirited and unjust, and 3) spreading this kind of falsehood ultimately destroys the reputation of the perpetrator.

"Bury the hatchet" is sound advice. The more promising way to have good relationships is to base them on the present and to nurture them in positive ways for the future.

Dr. Klemm is author of the recent book,
Mental Biology (New York: Prometheus).

Sources:

Krokos, Dan. (2012). False Memory. New York: Hyperion.


Shellenbarger, Sue (2016). How inaccurate memories can be good for you. Wall Street Journal. July 27. 

Wednesday, November 23, 2016

To Remember Multiple Items: Put Them in Related Groups

Memory formation and recall are greatly influenced by how items of information relate to each other. In the case of words, different words with related categorical meanings are often easier to remember as a unit. For example, in a list of words that includes spinach, cabbage, and lettuce, recalling any one of these words will assist recall of the other two, because they share the related meaning of green vegetables. They are "semantically clustered."

Several studies have shown that when people are asked to memorize a list of words and recall them in any order, they tend to recall words that are related. For example, in a list that contains animal names, flowers, grocery items, and historical events, a person who recalls "cat" is also likely to recall the "dog" item that was in the list. This happens because we all have a tendency to organize things by groups. Few of us capitalize on the power of this approach with a deliberate strategy to do so.

In a study of whether semantic clustering helps older people to remember, a comparison was made between 132 younger subjects (ages 18-30) and 120 older ones (60-84). In the experiment subjects were asked to memorize two lists of words, one with words presented one at a time and the other all at once so that subjects could see what words might reasonably be clustered. For the whole-list presentation, subjects were instructed on how they might use clustering.

When the groups were instructed to use semantic clustering on the second list of words, both groups showed clear and comparable improvement in recalling words presented in a whole list, as opposed to presentation one at a time. The beneficial effect was reflected in faster recall responses and in working memory capacity.

Bottom line: to remember a list of items, try to group similar items in your mind. Try it with your grocery list next time you go to the grocery store.
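For readers who like a concrete picture, here is a minimal sketch in Python of what clustering a list looks like. It is only an illustration of the memory strategy, not anything from the cited studies; the grocery items and the categories are invented examples.

```python
# A toy illustration of semantic clustering: group a grocery list by
# category before trying to memorize it, so related items can be
# rehearsed together as units. Items and categories are invented examples.
from collections import defaultdict

CATEGORY = {
    "spinach": "green vegetables",
    "cabbage": "green vegetables",
    "lettuce": "green vegetables",
    "milk": "dairy",
    "cheese": "dairy",
    "apples": "fruit",
    "bananas": "fruit",
}

def cluster(items):
    """Return the items grouped by their semantic category."""
    groups = defaultdict(list)
    for item in items:
        groups[CATEGORY.get(item, "other")].append(item)
    return dict(groups)

grocery_list = ["milk", "spinach", "apples", "cheese",
                "lettuce", "bananas", "cabbage"]
for category, members in cluster(grocery_list).items():
    print(f"{category}: {', '.join(members)}")
```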

Sources:

Manning, J. R., and Kahana, M. J. (2012). Interpreting semantic clustering effects in free recall. Memory. doi.org/10.1080/09658211.2012.683010.


Kuhlmann, B. G., and D. R. Touron. (2016). Aging and memory improvement through semantic clustering: The role of list-presentation format. Psychology and Aging. 31(7): 771-785.

Wednesday, October 26, 2016

Learning to Be Dishonest

In this time of Presidential elections, what better time could there be to write a post about dishonesty? What makes people dishonest? What makes some people more dishonest than others?
          Any attitude or behavior, if sufficiently rehearsed, becomes a habit. Once formed, habits automate attitude or behavior, producing mental “knee-jerk” responses to the events of life. So the key to honorable behavior, for example, is to think carefully about the attitudes and behaviors one is repeating. If they contribute to personal integrity, habit is a good thing. If repeated attitudes and behaviors are teaching you to be dishonest, you will have done it to yourself, and made it lasting.
          A clear example of teaching oneself to be dishonorable comes from a new British university study showing that people become desensitized to lying. The experiment involved creating scenarios whereby people could lie. In the experiment with 80 people, pairs in separate rooms viewed a photograph of a jar filled with pennies. The photo was clear only for one person, whose task it was to advise the other person how many pennies were in the jar. The person making the estimate was told that the reward would vary on each trial, without knowing critical details about the built-in incentive structure. No feedback was provided. The more the advice was deliberately exaggerated, the more financial reward was to be given. Conditions were manipulated so that lying could benefit both partners, benefit the advising partner at the expense of the other partner, or benefit the advising partner only. There were features of the design that I think could have been improved, but that is beyond the scope of this post.
The greatest lying occurred when it benefited only the lying person. Dishonesty persisted at lower levels if the partner also benefited. There was zero lying under conditions where lying was punished by a lower reward while the partner benefited.
People's lies grew bolder the more they lied. Brain scans revealed that activity in a key emotional center of the brain, the amygdala, became less active and desensitized as the dishonesty grew. In essence, the brain was being trained to lie. Thus, a little bit of dishonesty might be viewed as a slippery slope that can lead one to grow more dishonest. 
Emotions are at the core of the issue. Normally, we tend to feel guilty when doing something we know is wrong, like lying. But as we get in the habit of lying, the associated shame or guilt habituates. We get used to it and our conscience doesn't bother us so much. So, we are less constrained in our future behavior. We can't always be brutally honest, but it is now clear that each little lie or dishonest act can escalate and negatively change the person we are.
Another possibility is that positive reinforcement of behavior is involved. A well-known principle of behavior is that one tends to repeat behavior that is rewarded. Thus, if a person benefits from lying, he will likely do more of it. However, the brain area most associated with positive reinforcement, the nucleus accumbens, did not show any change in activity. The authors still asserted that lying was motivated by self-interest, because the greatest lying occurred when only the adviser benefited. However, the experiment was designed so that subjects could not know when their advice was being rewarded. Thus, the likely remaining explanation is that they just adapted to lying and it didn't bother them so much to exaggerate their estimates.
The absence of feedback was a crucial part of the design. But the authors point out that in the real world, the extent of dishonesty is greatly affected by feedback in terms of whether the deceiving person thinks there will be benefit or punishment.

Source:

Garrett, N. et al. (2016). The brain adapts to dishonesty. Nature Neuroscience. 24 October. doi: 10.1038/nn.4426

Sunday, October 09, 2016

Strategic Studying

School has started, and many students are discovering that they are not doing as well as expected. Parents and teachers may be chiding them about working harder. It might be more helpful to urge them to work smarter. This brings us to the matter of how students study.
My impression is that many students do not study effectively. Everyone knows that it is a bad idea to try to study while listening to music, watching TV, or frequently interrupting to check e-mail or Facebook and Twitter. One aspect of studying that is often under-valued is the way students test themselves to see how much they have learned. Typically, they "look over" the assigned learning content (notes, on-line videos, or reading assignments). Most students do not realize how important it is to force themselves to recall. In part, this is because they are conditioned by multiple-choice tests to recall passively, that is, to recognize a correct answer when it is presented, as opposed to generating the correct answer in the first place.
Studies of student learning practices reveal how important it is to memory formation to retrieve the information you are trying to memorize. For example, a 2008 study evaluated study and testing effects on memorizing foreign-language word pairs in one learning session of four trials, as one might do with flash cards.
A large recall improvement occurred if each repeated study attempt required active recall at that time, as opposed to just looking at the correct definition. Applying this finding to all kinds of learning suggests that learners should force themselves to recall what they think they have learned. Just looking at content again and again may not promote long-term learning.
Next, the investigators wanted to know whether recall is affected by focusing only on the word pairs that were incorrectly recalled. In a flash-card scenario, this is equivalent to re-studying only the words that were missed in the previous attempt. The test conditions involved Study (S) (looking at each word and its paired word) and Test (T) (forced recall of each word in the pair), applied either to all 40 word pairs or only to the word pairs that were not recalled in the previous trial. The learners ran through the deck four successive times.
At the end of this learning phase, students in each group were also asked to predict how many word pairs they would be able to remember a week later. It turned out that irrespective of the learning condition, predictions were inaccurate. This confirms my own experience that students are frequently poor judges of how much they know.
As for the effectiveness of initial learning, all four groups achieved perfect scores after four trials, with the largest improvement between the first and second trials. So all of them learned the material. The issue at hand was how well they remembered when quizzed later. On a test given a week later, the two groups in which forced-recall testing was repeated in each study trial showed final recall about four standard deviations higher than the other two conditions, ranging from 63 to 95% correct a week later. Thus, it seems that forced-recall testing is more important for forming memories than the studying itself. What this indicates is that learning occurs during forced-recall testing, and retrieval practice should be part of the initial study process.
In 2015, another group of researchers replicated these findings and further examined the effects of the varied spacing in the first study. That is, in the 2008 study, the two conditions in which testing was repeated in each trial took more time because all 40 word pairs were tested. The second group of investigators was surprised that the earlier study seemed to diminish the importance of repeated studying compared with repeated testing. One problem might have been that the original study design was "between subjects," in which scores were averaged across students in different test conditions. This design meant that the elapsed time varied among the groups, because it took more time to complete four cycles in which all 40 word pairs were studied and tested than when only non-recalled items were studied and/or tested. So this new study used a "within subjects" design in which every learner experienced all four ST conditions, each on 10 different word pairs.
The results replicated the earlier findings on the value of forced-recall testing. That is, the two conditions that included self-testing in each of the four study cycles produced the best recall after one week. Moreover, the condition in which all word pairs were re-studied and re-tested yielded about twice as many recalled word pairs as the condition in which only non-recalled words were re-studied and re-tested. Thus it appears that restudying items that have been correctly recalled earlier is far from useless.
Both studies make it clear that how well a learner remembers soon after learning provides no assurance of how much will be remembered after a week (or longer) delay. In these studies, optimal learning occurred when an initial learning session included repeated study and forced-recall testing of all items at least four times in a row. Of course, we only have data for 40 items, and long-term memory might be affected differently for smaller or larger sets of learning material.
Bottom line:
·         Just looking over learning material can be ineffective for long-term memory.
·         Right after learning an item of information, force yourself to recall it and check to see if you got it right.
·         Conduct forced-recall testing of all the information, not just the items that were previously recalled correctly (a small sketch of such a drill follows below).
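For readers who want the drill spelled out concretely, here is a minimal flash-card sketch in Python. It is my own illustration, not the procedure used in the cited experiments: every cycle forces a recall attempt on every pair before the answer is shown, in the spirit of the repeated study-and-test condition described above. The word pairs are invented examples.

```python
# A toy forced-recall flash-card drill (an illustration, not the
# cited studies' procedure). Every pass through the deck requires the
# learner to generate each answer before seeing it, rather than
# merely re-reading the pairs. Word pairs are invented examples.
WORD_PAIRS = {
    "casa": "house",
    "perro": "dog",
    "libro": "book",
    "manzana": "apple",
}

def drill(pairs, cycles=4):
    """Run several study-and-test cycles over the whole deck."""
    for cycle in range(1, cycles + 1):
        correct = 0
        for cue, answer in pairs.items():
            guess = input(f"Cycle {cycle} - {cue} = ? ").strip().lower()
            if guess == answer:
                correct += 1
                print("Correct.")
            else:
                # Show the answer only after a retrieval attempt,
                # so the next cycle still demands active recall.
                print(f"The answer is: {answer}")
        print(f"Cycle {cycle}: {correct} of {len(pairs)} recalled.\n")

if __name__ == "__main__":
    drill(WORD_PAIRS)
```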
Study should be strategic. These and other learning and memory aids are found in my inexpensive e-book on learning skills, Better Grades, Less Effort (Smashwords.com) or the more comprehensive book, Memory Power 101 (Skyhorse).

Sources:

Karpicke, J. D., and Roediger, H. L. III (2008). The critical importance of retrieval for learning. Science. 319, 966-968.


Soderstrom, N. C. et al. (2016). The critical importance of retrieval—and spacing—for learning. Psychol. Science. Dec. 16. doi:10.1177/0956797615617778.

Wednesday, September 14, 2016

Note-taking 101

The Fall return to school is a good time to remind students and parents about learning strategies. Lectures still dominate teaching approaches. In spite of such teaching reforms as "hands-on" learning, small-group collaborations, project-based learning, and others, teachers generally can't resist the temptation to be a "sage on the stage" instead of a "guide on the side." Maybe that's a good thing, because many students are not temperamentally equipped to be active learners. Rather, they have been conditioned by television and movies, as well as by their former teachers, to function passively, as an audience. Students are even conditioned to be passive by the way we test learning with multiple-choice questions, which require passive recognition of a provided correct answer among three or four incorrect ones.
The other major teaching device, reading, is also problematic. Too many students don't like to read academic material. They want somebody to spoon-feed the information to them. Most lectures are just that—spoon feeding.
Given that the dominance of lecturing is not likely to change any time soon, shouldn't teachers focus more on showing students how to learn from lectures? It seems there is an implicit assumption that passive listening will suffice to understand and remember what is presented in lectures. The problem is, however, that deep learning requires active, not passive, engagement. Students need to parse lecture content to identify what they don't understand, don't know already, and can't figure out from what they do already know. This has to happen in real time, as a given lecture proceeds. Even if the lecture is taped, seeing it again still requires active engagement for optimal learning.
So how should students engage with lectures? Traditionally, this means taking notes. But I wonder if note-taking is a dying art. I don't see many students taking notes from web pages or YouTube videos. Or from textbooks (highlighting is a poor substitute). Or from tweets or text messages. My concern was reinforced the other day when I gave a lecture on improving learning and memory to college students. The lecture was jam-packed with more information than anyone could remember from one sitting. Yet I did not see a single one of the 58 students taking notes. Notably, the class's regular professor, who had invited me to give the lecture, was vigorously taking notes throughout.
An explanation of how to take notes is provided in my e-book, Better Grades, Less Effort (Smashwords.com). Just what is it that I think is valuable about note taking? First and foremost is the requirement for engagement. Students have to pay attention well enough to make decisions about the portion of the lecture that will need to be studied later. Paying attention is essential for encoding information. Nobody can remember anything that never registered in the first place.
Next, note taking requires thinking about the material to decide what needs to be captured for later study. This hopefully generates questions that can be raised and answered during the lecture. In the college class I just mentioned, not one student asked a question, even though I interrupted the lecture four times to try to pry out questions. Notably, after the lecture, about a dozen students came to me to ask questions.
A benefit of hand-written note-taking is that students create a spatial layout of the information they think they will need to study. A well-established principle of learning is that where information is located provides important cues about what the information is. The spatial layout of script and diagrams on a page allows the information to be visualized, creating an opportunity for a rudimentary form of photographic memory, in which a student can imagine in the mind's eye just where on the page certain information is, and that alone makes it easier to memorize and recall what the information is.
This brings me to the important point of visualization. Pictures are much easier to remember than words. Hand-written notes allow the student to represent verbalized ideas as drawings or diagrams. If you have ever had to learn the Krebs cycle of cellular energy production in a biology class, for example, you know how much easier it is to remember the cycle if it is drawn rather than described in paragraph form.
This is a good place to mention note-taking with a laptop computer. Students are being encouraged to use laptops or tablet computers to take notes. Two important consequences of typing notes should be recognized. One problem is that for touch typists, taking notes on a laptop is a relatively mindless and rote process in which letters are banged out more or less on autopilot. A good typist does not have to think. Hand-written notes inevitably engage thinking and decisions about what to write down, how to represent the information, and where on the page to put specific items. Typing also tempts the learner to record more information than can be readily memorized.
One of the earliest tests of the hypothesis about learning from handwriting was an experiment with elementary children learning how to spell. A comparison of writing words on a 3 x 5 card, laying out words with letter tiles, or typing them on a keyboard revealed that the handwriting group achieved higher test scores when tested after having four days to study. These results have been confirmed in other similar studies.
One follow-up study compared the effects of typed and handwritten note-taking in 72 college undergraduates watching a documentary video. Again, students who wrote notes by hand scored higher on the test.
The most recent experiment involved hundreds of students from two universities and compared learning efficacy in two groups of students, one taking notes on a laptop and the other by hand writing. Results from lectures on a wide range of topics across three experiments in a classroom setting revealed that the students making hand-written notes remembered more of the facts, had a deeper understanding, and were better at integrating and applying the information. The improvement over typing notes was still present in a separate trial where typing students were warned about being mindless and urged to think and type a synthesis of the ideas. Handwritten note benefits persisted in another trial where students were allowed to study their notes before being tested a week later.
Though multiple studies show the learning benefits of handwriting over typing, schools are dropping the teaching of cursive and encouraging students to use tablets and laptops. 
Why is it so hard for educators to learn?

Sources


Cunningham, A. E., & Stanovich, K. E. (1990). Early spelling acquisition: Writing beats the computer. Journal of Educational Psychology, 82(1), 159-162. doi:10.1037/0022-0663.82.1.159

Duran, Karen S. and Frederick, Christina M. (2013). Information comprehension: handwritten vs. typed notes. URHS, Vol. 12, http://www.kon.org/urc/v12/duran.html


Mueller, Pam A., and Oppenheimer, Daniel M. (2014). The pen is mightier than the keyboard. Psychological Science. April 23. doi: 10.1177/0956797614524581. http://pss.sagepub.com/content/early/2014/04/22/0956797614524581

Friday, August 26, 2016

The Perils of Multi-tasking

We live in the age of multitasking. Though it is a phenomenon of the young, older folks are being dragged into the age by the digital revolution in mobile electronic devices. Youngsters, as digital natives, are wired to multi-task, but they don't realize how multitasking impairs their thinking skills. We call our phones "smart," but they can actually make us dumb. This may be one of the reasons that underperformance in school is so common.


Older folks tend to be amazed and awed by the multitasking ability of the young. But those in all generations should realize that multitasking does not make you smarter or more productive.
In school, multitasking interferes with learning. In the workplace, multitasking interferes with productivity and promotes stress and fatigue. Multitasking creates an illusion of parallel activity, but actually it requires mental switching from one task to another. This drains the glucose fuel needed by the brain, making the brain less efficient and creating the feeling of being tired.

Neuroscientist Daniel Levitin reminds us that multitasking is stressful, as indicated by increased secretion of cortisol and adrenalin. He cites work showing that IQ can temporarily drop 10 points during multitasking. A brain-scan study showed that new information gets processed in the wrong parts of the brain rather than in the hippocampus, where it should go in order to be remembered. The most insidious aspect of multitasking is that it programs the brain to operate in this mode, creating a debilitating thinking habit that is permanent.

Constant switching creates a distractible state of never being fully present. It trains the brain to have a short attention span and shrinks working memory capacity. This is especially pernicious in young people, who are most likely to multi-task and whose brains are the most susceptible to programming of bad habits.

Multitasking not only becomes a habit, it is addictive. I see many youngsters who seem to have withdrawal symptoms if they can't check their phone messages every few minutes. Each message sends an associated signal that someone thinks you are important enough to contact. This provides powerfully rewarding personal affirmation. Worse yet, like slot-machine payoffs, the reinforcement occurs randomly, which is the most effective way to condition behavior. It turns us into trained seals.
Why does anybody engage in behaviors that can turn them into a trained seal? One study indicates that susceptibility to task switching depends on the existing mental state. The researchers monitored 32 information workers, split nearly evenly by gender, in their work environment for five days. Workers were more likely to switch off task to Facebook or face-to-face conversations when they were doing rote tasks, which were presumably boring. When they were focused, they were more likely to switch to e-mail. Time wasted on Facebook and e-mail increased in proportion to the amount of task switching. Overall, the workers switched to Facebook an average of 21 times per day and to e-mail 74 times. Though the total time spent off-task was small (about 10 minutes on Facebook and 35 minutes on e-mail), the excessive task switching must surely have degraded productivity on the primary work tasks. Why does anybody need to check Facebook 21 times a day or e-mail 74 times a day? This is compulsive behavior that has affected the entire workforce like an infectious disease.

How does one break the multitasking habit? The most obvious way is to reduce the opportunity. Turn off the cell phone. You do not have to be accessible to everyone at every instant. Don't launch the mail app, and when it is open, turn off the feature that notifies you about the arrival of each new message. If you don't need a computer or the Internet for the task you are working on, don't turn on your electronic devices. If a computer is needed, don't launch the browser until you actually need it.

Be more aware of your current mental state, because it affects your distractibility. If doing boring work, find ways to make it less boring and thus less tempting to switch tasks. If you are doing work that is engaging, make it a goal to stay focused for longer and longer times on such work. Set goals for increasing the time spent on task. You should at least be able to sustain focus for 30 minutes. Just as multitasking can condition bad habits, mental discipline can condition good attentiveness and thinking habits.

Sources:

Levitin, Daniel J. 2015. Why the modern world is bad for your brain. The Guardian. Jan. 18.

Mark, G. et al. 2015. Focused, aroused, but so distractible: A temporal perspective on multitasking and communications.  ACM Digital Library. https://www.ics.uci.edu/~gmark/Home_page/Research_files/CSCW%202015%20Focused.pdf

Mark, Gloria. 2015. Multitasking in the Digital Age. doi:10.2200/S00635ED1V01Y201503HCI029. Morgan and Claypool.


Tuesday, August 02, 2016

Who is Responsible? You or Your Neurons?

Do you deserve credit for your honest achievements and blame for your failures? No, say an increasing number of philosophers and scientists. They say that everything you do is commanded from your unconscious mind, which you can't consciously control. The conscious "you" is just a superfluous observer. Free will is thus regarded as an illusion (Fig. 1). My new book, "Making a Scientific Case for Conscious Agency and Free Will" (Academic Press), challenges the science used to justify these counter-intuitive ideas.






Figure 1. Illustration of the concept that free will is an illusion. In this view, the actions that your brain commands come from the mechanical gears of an unconscious mind. Conscious mind is informed after the fact, creating the illusion that one's conscious mind commanded the act.






How free will is defined affects the conclusion about whether humans have any free will. As defined here, free will exists when a person generates thinking and behaviors that are neither stereotyped nor predetermined, and yet not random. My book identifies and explains many actions of brain that are unlikely to be performed solely by unconscious thinking. Reason and creativity are obvious exemplars of such free will.

More fundamental is the issue of just who the conscious you is. My book presents the argument that consciousness is not just a state of observation, like a movie fan passively watching a film in which participation is not possible. Rather, consciousness may be a distinct being.

I argue that consciousness can do things because the neurons that create consciousness are part of the over-all global brain workspace. The outputs of their firing cannot be isolated from the command centers of brain. Indeed, we should realize that these neurons are part of the neocortical executive control centers. When those firing patterns enable consciousness, they enable capability for explicit observation and executive action at the same time.

Our human beingness exists as the firing patterns in the neural networks of brain. The patterns are obviously different when we are unconscious, as in sleep or anesthesia. When those patterns change in certain measurable ways, they create consciousness. Compared to the unconscious state, our beingness during consciousness is more amenable to change and more able to initiate thought and action. In that sense, we are a different being when conscious, one that can influence its own nature through explicit thought. Explicit awareness can be attributed to a being acting like an avatar on behalf of brain and body that can command action in the present, facilitate formation of memories, and program circuitry for the future.

Freedom of action in these firing patterns comes from several sources. One is the enormous amount of statistical degrees of freedom in neural networks. Every possible choice has a certain probability that it will be made, and no one option is inevitable at any given moment of choice. A more direct kind of freedom comes from the inherent self-organizing capacity of neuronal networks. The book explores the mechanisms by which neural circuits make choices and decisions and proposes chaos dynamics as one way the brain can generate free will.


Conscious choices are indeed influenced by unconscious biases, but we can be aware of predilections and countermand them. Choices are not necessarily pre-ordained, and thus they manifest the kind of free will that is most relevant to everyday life. The issue of free will is not so much whether we have any, but how able we are to develop and use the free will capacity we have.

Saturday, July 09, 2016

Chronic Pain May Be a Memory Problem

After an injury or pain-inducing experience, the body often heals itself, but chronic pain may continue even after healing. Institute of Medicine surveys suggest that some 116 million American adults are in chronic pain. Chronic pain is often accompanied by such emotions as anxiety and depression and by a significant reduction in quality of life. Drugs like opiates, steroids, and non-steroidal anti-inflammatories can be very effective in reducing acute pain, but may have little or no effect once post-healing chronic pain sets in.

How can pain persist when the original cause is gone? Clues have emerged from brain scans of chronic pain patients, which show no sign of augmented activity in pain-mediating areas but do show increased activity in emotional and motivational areas of brain. The thought has now emerged in several research labs that chronic pain may actually be a memory. As if the chronic pain itself were not bad enough, the pain learning process may induce degenerative changes in emotional circuitry.
The idea dates back to the work of Pavlov over 100 years ago revealing that animals experiencing painful stimuli learn to associate that pain with other ongoing events, called conditioning stimuli, which include the associated emotional distress. The animals remember both the pain and the negative emotion, even when neither is any longer present. But until the last few years, nobody seems to have applied these findings to the issue of chronic pain in humans.

The idea is that a prolonged period of acute pain strengthens the emotional pathways that are activated during pain, and continuously reinforces the signals so that they do not go away even after the physical pain is gone. This process might even be thought of as a kind of addiction. Many theorists believe that the usual addictions, as to opiates, nicotine, etc. have a large learning and memory component.

We have known for a long time that pain can induce huge emotional distress. Numerous anecdotes establish that unpleasant emotional states are magnified by pain. But we also know that thoughts and emotions can regulate pain. For example, a mother's kiss may reduce a child's pain from a sudden injury better than any analgesic. In the heat of combat, a wounded soldier may feel no pain until after the attack is over. These pain-suppressing effects are not just psychological but even include inhibition of pain signals as they arise in the spinal cord.

Notably, one of the key brain areas involved in pain is the hippocampus, which is crucially involved in forming memories. The hippocampus is also a linchpin in the neural circuitry that processes emotions and mediates stress.

You might think that this is a perverse feature of nature. But actually the process has its uses. Pain provides a teaching signal that makes one want to avoid such situations in the future. But in chronic pain the lesson becomes so well entrenched that the pain memory cannot be extinguished.
If this theory is correct, it means that the usual treatments for chronic pain need to focus on memory mechanisms. Minimizing the pain while healing is in progress should reduce the likelihood of developing chronic pain memories.

But of course, prevention is not always easy to accomplish. Today, physicians are more aware of the addictiveness of the most reliable pain killers: opiates. They tend to cut short use of opiates in order to prevent drug addiction.

One possible treatment may be akin to emerging treatments for post-traumatic stress disorder (PTSD). Development of PTSD is reduced if morphine is given immediately after an acute trauma. A beta-blocking drug, propranolol, can have a similar preventive effect, presumably because it blocks memory reconsolidation. Whenever you recall a memory, it must be re-stored. While it is consciously "on-line," the memory is vulnerable to modification, and a new and perhaps less traumatic version of the memory can be saved. In PTSD therapy, the idea is to recall the memory and then block its reconsolidation with drugs that prevent memory consolidation.

Another possibility is to target the synaptic biochemistry involved in pain. Neuronal NMDA receptor molecules are involved in the emotional component of acute pain, and one drug that acts on these receptors, D-cycloserine, has been shown in animal studies to inhibit pain-related behavior for weeks afterward. There is also a protein kinase enzyme that mediates the emotional distress of pain. Animal studies show that there is a peptide that inhibits this enzyme and in the process reduces pain-related behavior. Work is underway in several laboratories trying to identify appropriate molecular targets in chronic-pain pathways so that appropriate drug therapies can be developed.

Sources:

Apkarian, A. V., Baliki, M. N., and Geha, P. Y. (2009). Towards a theory of chronic pain. Prog. Neurobiology. 87, 81-97.

Mansour, A. R. et al. (2014). Chronic pain: the role of learning and brain plasticity. Restorative Neurology and Neuroscience. 32, 129-139.

Melzack, R., and Wall, P. D. (1965). Pain mechanisms: a new theory. Science. 150, 971-979.


Sandkühler, J., and Lee, J. (2013). How to erase memory traces of pain and fear. Trends in Neurosciences. 36(6), 343-352.

Readers of this column will be interested in "Memory Medic's" e-book, "Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine" (available in all formats from Smashwords.com). The book, devoted exclusively to memory issues in seniors, includes a review of many of the ideas in these columns over the last five years.

Saturday, June 25, 2016

Better Aging through Chemistry: A Daily Anti-aging Regimen.

The price we pay for living is dying. That is, to stay alive, our body must burn oxygen, and that process inevitably yields toxic metabolites called free radicals. Free radicals are highly reactive because their outer shell of electrons is incomplete. Atoms with incomplete outer electron shells are attracted to other such atoms; that is, they share electrons to form a chemical bond. An atom that has a full outer shell tends not to enter into chemical reactions.

The damage comes from the free radical stripping electrons off of target atoms and converting them into a chain-reaction production of free radicals. This changes the target atoms so that their normal function is disabled. Such damage occurs in all sorts of molecules, including the vital molecules RNA and DNA.

So how do anti-oxidant chemicals help? They neutralize radicals by donating electrons to complete the outer shells of free radicals without becoming free radicals themselves. Think of anti-oxidants as scavengers that go around scooping up free radicals and neutralizing them.

Fortunately, nature provides us with chemicals that reduce the amount of free radicals. These are called anti-oxidants because they neutralize free radicals by donating one of their own electrons, ending the electron-"stealing" reaction. The antioxidant nutrients thus reduce cell and tissue damage. The best way to get these anti-oxidants is through eating a good diet. However, as we age, diet is often insufficient to provide enough anti-oxidants, and we need to increase our intake with supplement pills or capsules.

The table below suggests a daily regimen of healthful chemicals: anti-oxidants plus a couple of other chemicals that slow aging even though they are not anti-oxidants. The idea is that combining different types of anti-oxidants and other substances known to slow aging should expand the breadth of their coverage and produce additive beneficial effects. Maybe they would act synergistically so that the benefits are super-additive—that is, more than the sum of the benefits of each individual anti-oxidant. This idea has never been tested to my knowledge, but it seems so plausible that I think we would all benefit from the combination. The most benefit might come when the anti-oxidants are taken on an empty stomach, because some portion of an anti-oxidant can be inactivated or sequestered by binding with food, reducing its absorption into the blood stream. Avoid adding sugar, as many are tempted to do with the coffee, tea, or chocolate. I recommend using an artificial sweetener.



Omega-3 fatty acids are powerfully anti-inflammatory. Inflammation is a major cause of aging, and these fatty acids, found also in deep sea fish, have well-proven strong benefits on aging.
Finally, I add that other factors also have major anti-aging effects, such as regular exercise and weight control. Regular doctor checkups become increasingly necessary as one ages.

I have written about some of these anti-oxidants before (see references below). Two of the substances on my wellness list, cocoa and melatonin, have not been discussed in my previous blogs. In animal studies, cocoa has been shown to improve memory and to increase brain levels of a chemical (brain-derived neurotrophic factor) that promotes connections between neurons. A recent study in seniors revealed that 900 mg of cocoa powder per day for three months produced significant improvements in formal thinking tests. Brain scans showed measurable increases of cerebral blood volume in the hippocampus, the area of the brain that promotes memory formation.

Melatonin has two benefits. It is not only a powerful anti-oxidant; taken just before bedtime, it also helps you get sounder, more restful sleep.

I can't say that this regimen will make you live longer. But it will make you live better. I know this from personal experience, now that I am about to turn 82. If you have health problems, this regimen will surely help. However, you should check with your physician to identify anything on this list that would be contra-indicated for your particular problem.

Sources:

http://thankyoubrain.blogspot.com/2010/04/vitamin-d-wonder-vitamin.html
http://thankyoubrain.blogspot.com/2010/03/vitamin-d-and-memory.html
http://thankyoubrain.blogspot.com/2009/02/eat-your-bblueberries-but-not-with.html
http://thankyoubrain.blogspot.com/2010/04/resveratrol-red-wine-magic-chemical.html
http://thankyoubrain.blogspot.com/2014/06/health-benefits-of-resveratrol-new.html
http://thankyoubrain.blogspot.com/2009/01/caffeine-or-nap-which-helps-memory.html
http://thankyoubrain.blogspot.com/2010/02/more-on-benefits-of-blueberries.html
http://thankyoubrain.blogspot.com/2007/04/omega-3-fatty-acid-supplements-improve.html

Readers wanting to know more about slowing aging and boosting brain function should get Memory Medic's e-book " Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine" for only 99 cents at Smashwords.com.


Thursday, June 09, 2016

Two New Models for School Choice

Across the nation, there are three common ways to increase school choice: charter schools, vouchers (subsidies) to help pay for private school, and tax credits for companies that donate supplemental funds for voucher programs. All three approaches have serious deficiencies, and I propose two better alternatives.
But first, what is wrong with the current options? Charter schools are relatively unregulated, compared to regular public schools. They often are special-purpose schools that do not offer solutions for the broad swath of typical students. A common problem with school-choice options such as vouchers and tax credits is that such bills will be tied up in court challenges on the grounds that public money is being diverted to private, profit-making schools. The widely popular Nevada program, for example, is now held up in court.
Voucher programs provide only part, often around half, of the cost of private schools. In the Nevada plan, the state transfers up to $5,700 per child directly to the parents. But the national average private school tuition is approximately $9,518 per year. The shortfall means that only families with the means to pay the difference can take advantage of the program. In other words, voucher programs are a subsidy that clearly discriminates against the poor and minorities. How can that survive court challenge?
Also, think about what happens to a public school when all the middle class students transfer out. Public schools can be undermined in another way as well. In Nevada, for example, all the funds that normally would go to the public school are transferred, thus removing support for overhead costs of running a school (utilities, janitorial service, physical plant maintenance, etc.).  In addition, a new bureaucracy has to be created to administer the program and monitor allowable expenditures by every participating parent.
Arm waving and lip service will not do. We must seek better options. One option is to privatize the management of public schools. An innovative approach has been enabled by Louisiana Senate Bill 432, passed on May 12, 2016, which transfers oversight of charter schools to local school boards. In the New Orleans Parish district, historically and shamefully inadequate, there is now the opportunity to put schools under contract management. Basically, the program allows the school district to convert all public schools into charter schools, with safeguards against abuse provided by the supervision of the Orleans Parish school board and by state law. Each school can have complete autonomy over all areas of school operations, such as school programming, instruction, curriculum, materials and texts, business operations, and personnel management. Further details are provided in the documents listed below.
Parents can send their children to any school in the district, and all schools must use the district-wide enrollment and expulsion system. Schools that develop so much excellence that enrollment limits are reached can create lottery admission policies. This puts enormous pressure on the local board to hire contractors who can upgrade the performance of the other schools. Multiple contractors are not only allowed but encouraged, and boards have the authority to run competitive bidding processes that ensure competition among the various schools. It also makes sense to allow students to transfer from one public school to another, or even to a public school in an adjacent county. Florida just passed such a law.
In Louisiana, safeguards include the requirement that each charter school must have independent third-party administration and monitoring of state high-stakes tests. The state Department of Education can withhold funding from any school districts that under-perform or abuse these new liberties. Local boards have authority to close a charter school. The state superintendent of education can rescind the charter for any school that is being inappropriately protected by a local school board.
A second option that I propose is to break up mega-enrollment schools into smaller schools-within-a-school as separate units that face open competition for enrollment. Carving out smaller schools would increase school choice because there would be more schools and more competition. They could be managed in the usual ways or as in New Orleans by independent, competing contractors. Note that the philosophy is akin to that used in premier universities like Oxford, where separate small, relatively autonomous "colleges" are embedded within the university.
Inner city schools with enrollments of several thousand or more are common after the sixth grade. This has helped to create the dysfunction in inner city schools. It is a well-documented fact that smaller schools produce better student learning. That is why you don't see private mega-schools. Super-sized schools breed attitude and behavior problems and are bad for education quality because:

  • Students become part of a herd collective, losing individuality and personal attention from teachers who know them well.
  • Students have less opportunity to hone leadership skills or to participate in key extracurricular activities.
  • Behavior and security problems are greater. Teenagers have enough trouble "finding themselves" emotionally and socially without being swallowed up as just another number passed from teacher to teacher who can't possibly know much about everybody's learning and emotional needs and problems.
  • Students in mega-schools face fierce and demotivating academic and social competition. Only a few get to participate in the popular extracurricular activities. School can cease to be fun. Almost a third of public school students quit, and many more just drift through. Minorities are especially harmed.
In this school-within-a-school model, the small schools may grow too large because of population growth in the community. But if that happens, the district can build new schools with the same school-within-a-school philosophy.
Small schools can still have the same amenities as mega-schools if the districts create shared facilities, such as cafeterias, stadiums, sports arenas, gyms, band rooms, vocational education shops, special-needs or advanced-placement teachers, administrative staff, etc. With shared facilities, construction costs are reduced. Support staff might actually be cut if many facilities were shared. For academic instruction in low-enrollment subjects, like calculus, a small band of roving specialty teachers could serve several small schools.
There is no justification for extra administrators. The principal and staff that now serve a school of three thousand can just as readily serve five schools of 600 each. The core of quality education lies in the teachers, who will do their best if they have autonomy and competition.
In summary, educational policy wonks and legislatures should stop pursuing controversial and flawed choice options and consider these two models. Both offer more choice, do not discriminate against the poor and minorities, do not undermine public schools, easily pass court challenge, and are likely to produce better educated children. The public does not have to embrace subsidies of private schools to get more school choice.  
Districts may be slow to implement such reforms because many superintendents and their boards have comfortable, cozy relationships. But once parents have access to better options, they will elect the kind of board members who demand real change.

Readers interested in my efforts to improve a child's success in school might be interested in my e-book, "Better Grades, Less Effort" at Smashwords.com or "Memory Power 101" at bookstores everywhere.

  

Friday, May 27, 2016

The Pen is Mightier Than the Keyboard

I have written several earlier posts on the value of teaching and learning cursive. A recent infographic provides a nice summary of the advantages of handwriting over the keyboard. Handwriting engages the brain more deeply in creative thinking.

Among its many advantages claimed in the infographic, handwriting:
  • Provides children with a clearer understanding of how letters form words, sentences, and meanings
  • Teaches reading skills
  • Improves memory retention
  • Promotes critical and creative thinking (note taking, mind maps, etc.)


And now there is a slick new way of teaching cursive, invented by Linda Shrewsbury. She analyzed all the alphabet letters to see whether certain pen strokes were common to many of them. She found that handwriting of every letter could be mastered by learning just four simple pen strokes. So she wrote a book, Cursive Logic, that explains how to learn cursive by first learning these four basic strokes. Instead of spending hours, days, and weeks learning how to copy each letter in an attractive and readable way, you practice the four strokes (which can be mastered in less than an hour). Then, with those mastered, you quickly learn how to apply the strokes to the letters in four short lessons.

Monday, May 23, 2016

A Potential New Area for PTSD Research

Post-traumatic stress disorder (PTSD) is a common form of fear memory, in which a pervasive emotional stress is created by remembering experiences that evoked fear. If our brains could forget the fear memory, PTSD would decay away. Why can't we forget fear memories? In part, it is because they keep getting rehearsed, and much of this rehearsal occurs during our dreams. One major normal function of sleep is to help the brain to strengthen memory of things, good and bad, that happened during wakefulness.

Recent animal research suggests how the brain accomplishes this memory strengthening (called consolidation). More importantly, consolidation is manipulable. The study began with the established understanding that memories are of two kinds: explicit (episodic) and implicit (procedural). Fear memories are episodic; that is, we remember the episodes in our life that were traumatic. Episodic memories are laid down by a structure in the brain known as the hippocampus, a part of the cerebral cortex that is folded underneath the main cortex and has a different internal structure and connections with other parts of the brain. Moreover, the hippocampus exerts its consolidation effect when it generates a voltage rhythm, known as theta, of roughly 6-10 waves per second that also contains nested higher frequencies (gamma) of about 30-90 waves per second.
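
To picture what "nested" means here, the short Python sketch below builds a slow theta-like wave and lets a faster gamma-like wave swell and fade with each theta cycle. The specific numbers (7 and 40 waves per second, two seconds of signal) are my own illustrative choices from the ranges quoted above, not values from the study.

    import numpy as np

    # Illustrative parameters chosen from the ranges quoted above, not from the study.
    sample_rate = 1000                       # samples per second
    t = np.arange(0, 2, 1.0 / sample_rate)   # two seconds of signal

    theta = np.sin(2 * np.pi * 7 * t)        # slow theta-like rhythm (~7 waves/second)
    gamma = np.sin(2 * np.pi * 40 * t)       # faster gamma-like rhythm (~40 waves/second)

    # "Nesting": let the gamma amplitude swell near each theta crest, so the fast
    # waves ride on top of the slow wave instead of running independently of it.
    gamma_envelope = (theta + 1) / 2         # 0 at theta troughs, 1 at theta crests
    nested = theta + 0.3 * gamma_envelope * gamma

Plotting nested shows small bursts of fast ripples riding each slow crest, which is the general kind of pattern described above for the hippocampal rhythm.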

With this background of information, researchers at McGill University in Canada* decided to see how fear memory might be affected by disrupting hippocampal theta rhythm, which in sleep occurs during the REM (dream) stage of sleep. The study was conducted in mice, monitored during their sleep, soon after they were trained to remember certain objects and also after they had learned a conditioned fear memory. The object-learning task was to remember where a novel object had been placed (the hippocampus is also known to provide the brain with spatial location information). The other learning task, and the one relevant to PTSD, involved exposing awake mice to a sound warning followed by electrical shock to their feet. They manifested the associated fear learning by freezing all movement as soon as the sound cue was heard, before the foot-shock was actually delivered.

The key part of the experiment was the ability to shut down theta activity. Other workers had shown that neurons can be made responsive to laser light by infecting them with a virus that causes them to make a light-sensitive protein (fused to a fluorescent protein so the affected cells can be seen). The location of the neurons that drive theta rhythm is known, so the researchers injected such a virus into that area and also implanted a fiber-optic probe that could deliver laser light onto those neurons. Neuronal activity in this area could be stopped whenever laser light activated the protein.

For both the object-location and conditioned-fear memories, testing recall on the next day revealed that memory formation was prevented when theta activity was blocked during REM sleep within a critical four-hour period immediately after initial learning. Similar disruption of activity during the non-dream, non-theta stage of sleep did not prevent either form of memory.

Even if this laser-light technique could be used in humans (and theoretically it could), you might object that the approach would not work because it is usually not practical to begin formal therapy within four hours of an emotionally traumatic experience. But a common current PTSD therapy is based on the established phenomenon of re-consolidation of memory. Every time you recall a memory, it has to be re-stored, and during re-storage it is susceptible to modification (by talk therapy, for example). The revised memory can replace the original fear memory. A therapist could have a patient recall the bad experience, go to sleep right away, and receive light blocking of theta to disrupt the re-storage of the bad memory. Perhaps a simpler approach would be to get good dream sleep soon after talk therapy, which might help cement the revised, less traumatic memory.

*It was at McGill, about a half-century ago, that the role of the hippocampus in memory formation was first discovered.


Source:

Boyce, Richard, et al. (2016). Causal evidence for the role of REM sleep theta rhythm in contextual memory consolidation. Science, 352, 812-815.


For more information about learning and memory, consult Memory Medic’s recent book, Memory Power 101.

Saturday, April 30, 2016

Why Isn't Common Core Working?

First, the facts: Common Core (CC) is not working, as measured by its own standards and metrics. After seven years of implementation in 40 states, the Associated Press now summarizes the National Report Card, which reveals that two-thirds of graduating seniors are not ready for college: seventy-five percent failed the math test and sixty-three percent failed the reading test.

These dismal findings are no surprise, as we get similar reports every year during CC's reign. Everybody seems to have an explanation, which too often is an excuse—like we don't spend enough money on schools. That conclusion is easily refuted by extensive documentation, and I won't take the time to rehash that evidence here. But let's look at some possible explanations that are widely shared and perhaps real:

Teaching to the Test. The problem with CC is not so much with its standards but with the testing regimen that has been captured by two publishing houses. The federal government education bureaucrats ("educrats") have turned schools into test factories for CC-based testing. In other areas of politics, we would call that crony capitalism. The focus of teaching in many schools is to teach students to pass multiple-choice tests limited to specific standards in only two areas, math and English. In the old days, we practiced learning the multiplication tables; today, kids practice taking tests—again and again. If teaching to the test worked, maybe we could endorse the practice. But it clearly doesn't work. Why? This leads us to other explanations.

One Size Fits All. Federal educrats treat our hugely heterogeneous population as if it were homogeneous. If you live in the Southwest, you know that this part of the country is largely Mexicanized, with huge numbers of students who don't even speak English. The country as a whole is a mixture of suburbia and ghettos. The government promotes multi-culturalism while at the same time demanding that our schools produce a cookie-cutter product. We have Red and Blue states that seem to be moving further apart. We have growing disparities in personal wealth, aspirations, and family structure. It is a fool's errand to think that one size fits all is the remedy for education.

Political Correctness. CC is notorious for its PC curriculum, which contains significant elements of anti-Americanism and leftist doctrine that have little to do with education. Moreover, for many students, such PC is demotivating. Kids do have a capacity for spotting when they are being manipulated by adults. They do not like it, especially when it is imposed in school.

State-centric versus Student-Centric Education. Students live in a different mental world than adults. Our standards of learning are not inherently theirs. Whatever it is we say they must learn has to be put in a context that is meaningful to them. Math, for example, taught as an isolated subject, has little attraction for most students, especially when the only purpose is to pass a federally mandated exam. However, when taught as a necessary component of a shop class or of classes in other subjects, math acquires a relevance that even students can value. Language arts, when studied as an end in itself, is hardly as motivating as when students learn it to accomplish their own purposes, like debating with peers, writing persuasive blogs and social media posts, or contributing to school publications. I think that educrats have forgotten what it is like to be a youngster.

Trashing Memorization. CC was designed to abandon the old emphasis on memorization and focus on teaching thinking skills. This is most evident in math instruction. Learning to think is of course admirable, but why then do we not see improvement on the tests designed to measure thinking skills? Do educrats not know that you think with what you know, and what you know is what you have memorized?

I have professor colleagues who criticize me for trying to be a "Memory Medic" and help students learn how to memorize more effectively. Teachers seem reluctant to teach memory skills, or maybe they don't know what the skills are. Even if teachers can teach such skills, their principals and superintendents set the demands that are focused on teaching to the test. Teaching learning skills these days is an alien concept.

What schools need to focus on is helping students to develop expertise in something. That may be in band, art, vocational classes, farm projects, or any area where skills are valued. CC does none of that. The real world needs and rewards expertise. Of course, experts can think well in their field of expertise. And why is that? They know their subject.

When a student memorizes information, she acquires not only subject-matter mastery but also the personal knowledge of success. Nothing is more motivating than success. A student owns that success; nobody can take it away. Federal exams remind students of their ignorance. And we expect that to be motivating?

When I went to school decades ago, school was fun, because I was learning cool stuff and nobody was on my back all year long to make the teacher and school look good with my test scores. Today, a lot of kids hate school. I would too.


"Memory Medic" has three recent books on memory:

1. "Memory Power 101" (Skyhorse) - for a general audience at http://www.skyhorsepublishing.com/book/?GCOI=60239100060310http://www.skyhorsepublishing.com/book/?GCOI=60239100060310

2. "Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine"- an inexpensive e-book for boomers and seniors in all formats at Smashwords.com, https://www.smashwords.com/books/view/496252https://www.smashwords.com/books/view/496252


3. "Better Grades, Less Effort" - an inexpensive e-book for students at Smashwords.com, https://www.smashwords.com/books/view/24623https://www.smashwords.com/books/view/24623

Tuesday, April 26, 2016

The One Best Way to Remember Anything

As explained in my memory-improvement book, "Memory Power 101," the most powerful way to remember something is to construct a mental-image representation. All the memory books I have read make the same point. The professional memorizers, "memory athletes" who can memorize the sequence of four shuffled decks of cards in five minutes, all use some form of mental imaging that converts each card into a mental-picture representation.

Now a recent study involving seven experiments documents the power of mental images by comparing memory accuracy when a drawing was made with accuracy when it was not. College-student volunteers were asked to memorize a list of words, each of which was chosen to be easily drawn. Words were presented one at a time on a video monitor, and students were randomly prompted either to write the name of the object or to make a drawing of it. Each word presentation was timed, and a warning buzzer indicated it was time to stop and get ready for the next word display. At the end of the list, a two-minute filler task was presented wherein each student classified 60 sound tones, selected at random, as low, medium, or high in frequency. Then a surprise test was given wherein students were asked to verbally recall in one minute as many words as they could, in any order, whether written or drawn.
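
To make the procedure easier to follow, here is a small Python sketch of the trial flow just described. The list length, timings, and prompt wording are placeholders of my own, not the published parameters.

    import random

    def run_session(words, tones=60, recall_seconds=60):
        """Sketch of the described procedure; all parameters are illustrative placeholders."""
        condition = {}
        for word in words:
            # Words appear one at a time; the prompt to write or draw is assigned at random.
            condition[word] = random.choice(["write", "draw"])
            print(f"{condition[word].upper()} '{word}' until the buzzer sounds.")

        # Two-minute filler task: classify random tones as low, medium, or high in pitch.
        for _ in range(tones):
            _ = random.choice(["low", "medium", "high"])

        # Surprise free-recall test: name as many words as possible, in any order.
        print(f"Recall as many of the words as you can in {recall_seconds} seconds.")
        return condition

    run_session(["apple", "balloon", "kite"])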

In the first two experiments students remembered about twice as many words when a drawing had been made than when just the word had been written. Three other experiments demonstrated that drawing was more effective because the encoding was deeper. For example, one experiment was conducted like the first two, but included a third condition in which the subjects were to write a list of the physical characteristics of the word (for example, for apple, one might write red, round, tasty, chewy, etc.). This presumably provides a deeper level of encoding than just writing or drawing the word. Results revealed that drawing was still more effective than either writing a list of attributes or writing the word.

Another important experiment compared drawing and writing with simply forming a mental image without drawing it. Again, drawing produced the best results, although more words were remembered when mentally imaged than when written.

A follow-on experiment substituted an actual picture of the word instead of requiring the student to actively imagine an image. Here again, best results occurred with drawing, with seeing pictures being more effective than writing the word.

In a sixth experiment, drawing was still superior to writing even if the list of words was made longer or if the encoding time was reduced. In the last experiment, drawing was still beneficial in a way that could not be explained solely by the fact that drawings are more distinctive than writing a word.
The benefits of drawing were seen within and across individuals and across different conditions. The researchers concluded that drawing improves memory by encouraging a seamless integration of semantic, visual, and motor aspects of a memory trace. That makes sense to me.

The processes involved here that account for better memory are 1) elaborating the item to be remembered, 2) making a mental image of it or an alias for it, 3) the motor act of drawing the image, and 4) the reinforcing feedback of thinking about the drawing.

Studies like this have enormous practical application for everyday remembering. The principle is that whenever you have something you need to remember, make a mental-image representation of it and then draw it. For example, if you have to remember somebody named "Mike," make a mental image of the person speaking into a microphone (mike). Then roughly draw Mike's main facial features alongside a microphone. There are all sorts of formal schemes for making mental images, even for numbers, as explained in my book. This present study indicates that the making of a mental image is powerfully reinforced when you try to draw it.

To some extent, this memory principle is used in elementary school, where drawing is a huge part of the curriculum. As students get older, teachers abandon drawing and usually so do the students. Perhaps educators need to revisit the idea that drawing has educational value at all grade levels.

Source:

Kluger, Jeffrey (2016). Here's the memory trick that science says works. Time, April 22. http://time.com/4304589/memory-picture-draw/


Wammes, Jeffrey D., et al. (2016). The drawing effect: Evidence for reliable and robust memory benefits in free recall. The Quarterly Journal of Experimental Psychology, 69 (9), 1752-1776. DOI: 10.1080/17470218.2015.1094494