Wednesday, November 23, 2016

To Remember Multiple Items: Put Them in Related Groups

Memory formation and recall are greatly influenced by how items of information relate to each other. In the case of words, items with related categorical meanings are often easier to remember as a unit. For example, in a list that includes spinach, cabbage, and lettuce, recalling any one of these words will assist recall of the other two, because all three share the meaning "green vegetables." They are "semantically clustered."

Several studies have shown that when people are asked to memorize a list of words and recall them in any order, they tend to recall words that are related. For example, in a list that contains animal names, flowers, grocery items, and historical events, a person who recalls "cat" is also likely to recall the "dog" item that was in the list. This happens because we all have a tendency to organize things by groups. Few of us capitalize on the power of this approach with a deliberate strategy to do so.

In a study of whether semantic clustering helps older people remember, a comparison was made between 132 younger subjects (ages 18-30) and 120 older ones (ages 60-84). In the experiment, subjects were asked to memorize two lists of words, one presented one word at a time and the other presented all at once, so that subjects could see which words might reasonably be clustered. For the whole-list presentation, subjects were instructed on how they might use clustering.

When instructed to use semantic clustering on the second list, both age groups showed clear and comparable improvement in recalling words presented as a whole list, as opposed to one at a time. The benefit was reflected in faster recall responses and in working-memory capacity.

Bottom line: to remember a list of items, try to group similar items in your mind. Try it with your grocery list next time you go to the grocery store.
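As a concrete (and admittedly playful) illustration of the strategy, here is a minimal Python sketch that groups a grocery list by a hand-made category map. The categories and items are my own examples, not materials from the studies cited below.

from collections import defaultdict

# Hypothetical category map; in your head, the "map" is simply your knowledge of what goes together.
CATEGORIES = {
    "spinach": "green vegetables", "cabbage": "green vegetables", "lettuce": "green vegetables",
    "milk": "dairy", "yogurt": "dairy", "cheese": "dairy",
    "chicken": "meat", "ground beef": "meat",
}

def cluster(items):
    """Group list items by semantic category so related items can be rehearsed together."""
    groups = defaultdict(list)
    for item in items:
        groups[CATEGORIES.get(item, "other")].append(item)
    return dict(groups)

grocery_list = ["milk", "spinach", "chicken", "lettuce", "cheese", "cabbage", "yogurt"]
for category, members in cluster(grocery_list).items():
    print(f"{category}: {', '.join(members)}")

Doing the same grouping mentally before you shop gives each recalled item a cue for the rest of its cluster.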

Sources:

Manning, J. R., and Kahana, M. J. (2012). Interpreting semantic clustering effects in free recall. Memory. doi.org/10.1080/09658211.2012.683010.


Kuhlmann, B. G., and Touron, D. R. (2016). Aging and memory improvement through semantic clustering: The role of list-presentation format. Psychology and Aging, 31(7), 771-785.

Wednesday, October 26, 2016

Learning to Be Dishonest

In this season of Presidential elections, what better time could there be to write a post about dishonesty? What makes people dishonest? What makes some people more dishonest than others?
Any attitude or behavior, if sufficiently rehearsed, becomes a habit. Once formed, habits automate attitudes and behaviors, producing mental "knee-jerk" responses to the events of life. So the key to honorable behavior, for example, is to think carefully about the attitudes and behaviors one is repeating. If the repetition contributes to personal integrity, habit is a good thing. If repeated attitudes and behaviors are teaching you to be dishonest, you will have done it to yourself, and you will have made it lasting.
A clear example of teaching oneself to be dishonorable comes from a new British university study showing that people become desensitized to lying. The experiment created scenarios in which people could lie. Eighty people participated in pairs; the partners sat in separate rooms and viewed a photograph of a jar filled with pennies. The photo was clear for only one person, whose task was to advise the other person how many pennies were in the jar. The person making the estimate was told that the reward would vary on each trial, without knowing critical details of the built-in incentive structure, and no feedback was provided. The more the advice was deliberately exaggerated, the greater the financial reward. Conditions were manipulated so that lying could benefit both partners, benefit the advising partner at the expense of the other partner, or benefit the advising partner only. There were features of the design that I think could have been improved, but that is beyond the scope of this post.
The greatest lying occurred when it benefited only the lying person. Dishonesty persisted at lower levels if the partner also benefited. There was zero lying under conditions where lying was punished by a lower reward while the partner benefited.
People's lies grew bolder the more they lied. Brain scans revealed that activity in a key emotional center of the brain, the amygdala, became less active and desensitized as the dishonesty grew. In essence, the brain was being trained to lie. Thus, a little bit of dishonesty might be viewed as a slippery slope that can lead one to grow more dishonest. 
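The paper's own computational analysis is not reproduced here, but the "slippery slope" idea can be caricatured in a toy model of my own: if the emotional cost of lying habituates a little with every lie while the reward for exaggeration stays constant, the chosen exaggeration drifts upward. All parameters below are arbitrary illustrations, not values from the study.

# Toy "slippery slope" model (my illustration, not the authors' analysis).
# The adviser picks the exaggeration whose payoff (reward minus emotional cost) is largest.
# Each lie habituates the emotional cost a little, so larger lies become tolerable over trials.

def best_exaggeration(cost_per_unit, reward_per_unit=1.0, max_exaggeration=60):
    options = range(max_exaggeration + 1)
    return max(options, key=lambda x: reward_per_unit * x - cost_per_unit * x ** 1.5)

cost = 0.5          # initial emotional "sting" per unit of exaggeration (arbitrary)
habituation = 0.9   # the sting shrinks by 10% after every lie (arbitrary)
for trial in range(1, 11):
    lie = best_exaggeration(cost)
    if lie > 0:
        cost *= habituation
    print(f"trial {trial:2d}: exaggeration = {lie} pennies")

Running it shows the preferred exaggeration creeping from a couple of pennies to more than ten over ten trials, a mechanical analog of the escalation reported above.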
Emotions are at the core of the issue. Normally, we tend to feel guilty when doing something we know is wrong, like lying. But as we get in the habit of lying, the associated shame or guilt habituates. We get used to it and our conscience doesn't bother us so much. So, we are less constrained in our future behavior. We can't always be brutally honest, but it is now clear that each little lie or dishonest act can escalate and negatively change the person we are.
Another possibility is that positive reinforcement of behavior is involved. A well-known principle of behavior is that one tends to repeat behavior that is rewarded. Thus, if a person benefits from lying, he will likely do more of it. However, the brain area most associated with positive reinforcement, the nucleus accumbens, did not show any change in activity. The authors still asserted that lying was motivated by self-interest, because the greatest lying occurred when only the adviser benefited. However, the experiment was designed so that subjects could not know when their advice was being rewarded. Thus, the likely remaining explanation is that they just adapted to lying and it didn't bother them so much to exaggerate their estimates.
The absence of feedback was a crucial part of the design. But the authors point out that in the real world, the extent of dishonesty is greatly affected by feedback in terms of whether the deceiving person thinks there will be benefit or punishment.

Source:

Garrett, N. et al. (2016). The brain adapts to dishonesty. Nature Neuroscience. 24 October. doi: 10.1038/nn.4426

Sunday, October 09, 2016

Strategic Studying

School has started, and many students are discovering that they are not doing as well as expected. Parents and teachers may be chiding them about working harder. It might be more helpful to urge them to work smarter. This brings us to the matter of how students study.
My impression is that many students do not study effectively. Everyone knows it is a bad idea to try to study while listening to music, watching TV, or frequently interrupting to check e-mail, Facebook, or Twitter. One aspect of studying that is often under-valued is the way students test themselves to see how much they have learned. Typically, they "look over" the assigned learning content (notes, on-line videos, or reading assignments). Most students do not realize how important it is to force themselves to recall. In part, this is because they are conditioned by multiple-choice tests to recall passively, that is, to recognize a correct answer when it is presented, as opposed to generating the correct answer in the first place.
Studies of student learning practices reveal how important to memory formation it is to retrieve information you are trying to memorize. For example, a 2008 study evaluated study and testing effects on memorizing foreign-language word pairs in one learning session of four trials, as one might do for example with flash cards.
A large recall improvement occurred if each repeated study attempt required active recall at that time, as opposed to just looking at the correct definition. Applying this finding to all kinds of learning suggests that learners should force themselves to recall what they think they have learned. Just looking at content again and again may not promote long-term learning.
Next, the investigators wanted to know whether recall is affected by focusing only on the word pairs that were incorrectly recalled. In a flash-card scenario, this is equivalent to re-studying only the words missed on the previous pass. The conditions combined Study (S), looking at each word and its paired word, and Test (T), forced recall of the paired word, applied either to all 40 word pairs or only to the pairs not recalled on the previous trial. The learners ran through the deck four successive times.
At the end of this learning phase, students in each group were also asked to predict how many word pairs they would be able to remember a week later. It turned out that irrespective of the learning condition, predictions were inaccurate. This confirms my own experience that students are frequently poor judges of how much they know.
As for the effectiveness of initial learning, all four groups achieved perfect scores after four trials, with the largest improvement between the first and second trial. So they all learned the material; the issue at hand was how well they remembered when quizzed later. On a test a week later, final recall in the two groups where forced-recall testing was repeated in each study trial was about four standard deviations higher than in the other two conditions, with scores ranging from 63 to 95% correct. Thus, it seems that forced-recall testing matters more for forming memories than the studying itself. In other words, learning occurs during forced-recall testing, and retrieval practice should be part of the initial study process.
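For readers who want to try this themselves, here is a minimal Python sketch of a flash-card drill in the spirit of the "study all, test all" condition. The word pairs are ordinary placeholder examples, not the materials used in the experiments, and the script is my own illustration. Setting DROP_RECALLED to True mimics the weaker strategy of revisiting only the items missed on the previous pass.

import random

# Placeholder word pairs (ordinary Spanish-English examples, not the experiment's materials).
PAIRS = {"gato": "cat", "perro": "dog", "rojo": "red", "libro": "book"}

DROP_RECALLED = False  # True mimics the weaker strategy of dropping already-recalled items
TRIALS = 4             # the studies discussed here used four study/test cycles

to_test = list(PAIRS)
for trial in range(1, TRIALS + 1):
    random.shuffle(to_test)
    missed = []
    for cue in to_test:
        answer = input(f"Trial {trial}: what pairs with '{cue}'? ").strip().lower()
        if answer == PAIRS[cue]:
            print("Right!")
        else:
            print(f"No: {cue} -> {PAIRS[cue]}")   # the feedback doubles as re-study
            missed.append(cue)
    to_test = missed if DROP_RECALLED else list(PAIRS)
    if not to_test:
        break

The design choice that matters is simply that every pass forces recall of every item before the answer is shown.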
In 2015, another group of researchers replicated these findings and further examined the effects of the varied spacing in the first study. That is, in the 2008 study, the two conditions in which all 40 word pairs were tested on every trial took more time. The second group of investigators was surprised that the earlier study seemed to diminish the importance of repeated studying compared with repeated testing. One problem might have been that the original study used a "between subjects" design, in which scores were averaged across students assigned to different test conditions. That design meant the elapsed time varied among the groups, because it took more time to complete four cycles in which all 40 word pairs were studied and tested than it did when only non-recalled items were studied and/or tested. So the new study used a "within subjects" design in which every learner experienced all four study/test conditions, each on a different set of 10 word pairs.
The results replicated the earlier findings on the value of forced-recall testing. That is, the two conditions that included self-testing in each of the four study cycles produced the most recall after one week. Moreover, the condition that re-studied and re-tested all word pairs yielded about twice as many recalled word pairs as the condition that re-studied and re-tested only non-recalled words. Thus it appears that re-studying items that have already been recalled correctly is far from useless.
Both studies make it clear that how well a learner remembers soon after learning provides no assurance of how much will be remembered after a week (or longer) delay. In these studies, optimal learning occurred when an initial learning session included repeated study and forced-recall testing of all items at least four times in a row. Of course, we only have data for 40 items, and long-term memory might be affected differently for smaller or larger sets of learning material.
Bottom line:
· Just looking over learning material can be ineffective for long-term memory.
· Right after learning an item of information, force yourself to recall it and check to see whether you got it right.
· Conduct forced-recall testing of all the information, not just the items that were previously recalled correctly.
Study should be strategic. These and other learning and memory aids are found in my inexpensive e-book on learning skills, Better Grades, Less Effort (Smashwords.com) or the more comprehensive book, Memory Power 101 (Skyhorse).

Sources:

Karpicke, J. D., and Roediger, H. L., III (2008). The critical importance of retrieval for learning. Science, 319, 966-968.


Soderstrom, N. C., et al. (2016). The critical importance of retrieval—and spacing—for learning. Psychological Science, Dec. 16. doi:10.1177/0956797615617778.

Wednesday, September 14, 2016

Note-taking 101

The Fall return to school is a good time to remind students and parents about learning strategies. Lectures still dominate teaching approaches. In spite of such teaching reforms as "hands-on" learning, small-group collaborations, project-based learning, and others, teachers generally can't resist the temptation to be a "sage on the stage" instead of a "guide on the side." Maybe that's a good thing, because many students are not temperamentally equipped to be active learners. Rather, they have been conditioned by television and movies, as well as by their former teachers, to function passively, as an audience. Students are even conditioned to be passive by the way we test learning with multiple-choice questions, which require only passive recognition of a provided correct answer among three or four incorrect ones.
The other major teaching device, reading, is also problematic. Too many students don't like to read academic material. They want somebody to spoon-feed the information to them. Most lectures are just that: spoon-feeding.
Given that the dominance of lecturing is not likely to change any time soon, shouldn't teachers focus more on showing students how to learn from lectures? It seems there is an implicit assumption that passive listening will suffice to understand and remember what is presented in lectures. The problem is, however, that deep learning requires active, not passive, engagement. Students need to parse lecture content to identify what they don't understand, don't know already, and can't figure out from what they do already know. This has to happen in real time, as a given lecture proceeds. Even if the lecture is taped, seeing it again still requires active engagement for optimal learning.
So how should students engage with lectures? Traditionally, this means taking notes. But I wonder if note-taking is a dying art. I don't see many students taking notes from web pages or YouTube videos. Or textbooks (highlighting is a poor substitute). Or tweets or text messages. My concern was reinforced the other day when I gave a lecture on improving learning and memory to college students. The lecture was jam-packed with more information than anyone could remember from one sitting. Yet I did not see a single one of the 58 students taking notes. Notably, the class's regular professor, who had invited me to give the lecture, was vigorously taking notes throughout.
An explanation of how to take notes is provided in my e-book, Better Grades, Less Effort (Smashwords.com). Just what is it that I think is valuable about note taking? First and foremost is the requirement for engagement. Students have to pay attention well enough to make decisions about the portion of the lecture that will need to be studied later. Paying attention is essential for encoding information. Nobody can remember anything that never registered in the first place.
Next, note taking requires thinking about the material to decide what needs to be captured for later study. This hopefully generates questions that can be raised and answered during the lecture. In the college class I just mentioned, not one student asked a question, even though I interrupted the lecture four times to try and pry out questions. Notably, after the lecture, about a dozen students came to me to ask questions.
A benefit of hand-written note-taking is that students create a spatial layout of the information they think they will need to study. A well-established principle of learning is that where information is located provides important cues about what the information is. The spatial layout of script and diagrams on a page allows the information to be visualized, creating an opportunity for a rudimentary form of photographic memory: a student can imagine in the mind's eye just where on the page certain information sits, and that alone makes it easier to memorize and recall what the information is.
This brings me to the important point of visualization. Pictures are much easier to remember than words. Hand-written notes allow the student to represent verbalized ideas as drawings or diagrams. If you have ever had to learn the Krebs cycle of cellular energy production in a biology class, for example, you know how much easier it is to remember the cycle when it is drawn rather than described in paragraph form.
This is a good place to mention note-taking with a laptop computer. Students are being encouraged to use laptops or tablet computers to take notes. Two important consequences of typing notes should be recognized. One problem is that for touch typists, taking notes on a laptop is a relatively mindless and rote process in which letters are banged out more or less on autopilot. A good typist does not have to think. Hand-written notes inevitably engage thinking and decisions about what to write down, how to represent the information, and where on the page to put specific items. Typing also tempts the learner to record more information than can be readily memorized.
One of the earliest tests of the hypothesis about learning from handwriting was an experiment with elementary-school children learning how to spell. A comparison of writing words on a 3 x 5 card, laying out words with letter tiles, and typing them on a keyboard revealed that the handwriting group achieved the highest test scores when tested after having four days to study the notes. These results have been confirmed in other similar studies.
One follow-up study with college undergraduates compared the effects of typed and handwritten note-taking in 72 undergraduates watching a documentary video. Again, students who wrote notes by hand scored higher on the test.
The most recent experiment involved hundreds of students from two universities and compared learning efficacy in two groups of students, one taking notes on a laptop and the other by hand writing. Results from lectures on a wide range of topics across three experiments in a classroom setting revealed that the students making hand-written notes remembered more of the facts, had a deeper understanding, and were better at integrating and applying the information. The improvement over typing notes was still present in a separate trial where typing students were warned about being mindless and urged to think and type a synthesis of the ideas. Handwritten note benefits persisted in another trial where students were allowed to study their notes before being tested a week later.
Though multiple studies show the learning benefits of handwriting over typing, schools are dropping the teaching of cursive and encouraging students to use tablets and laptops. 
Why is it so hard for educators to learn?

Sources


Cunningham, A. E., & Stanovich, K. E. (1990). Early spelling acquisition: Writing beats the computer. Journal of Educational Psychology, 82(1), 159-162. doi:10.1037/0022-0663.82.1.159

Duran, Karen S. and Frederick, Christina M. (2013). Information comprehension: handwritten vs. typed notes. URHS, Vol. 12, http://www.kon.org/urc/v12/duran.html


Mueller, Pam A., and Oppenheimer, Daniel M. (2014). The pen is mightier than the keyboard. Psychological Science. April 23. doi:10.1177/0956797614524581. http://pss.sagepub.com/content/early/2014/04/22/0956797614524581

Friday, August 26, 2016

The Perils of Multi-tasking

We live in the age of multitasking. Though multitasking is mostly a phenomenon of the young, older folks are being dragged into this age by the digital revolution in mobile electronic devices. Youngsters, as digital natives, are wired to multi-task, but they don't realize how multitasking impairs their thinking skills. We call our phones "smart," but they can actually make us dumb. This may be one of the reasons that underperformance in school is so common.


Older folks tend to be amazed and awed by the multitasking ability of the young. But those in all generations should realize that multitasking does not make you smarter or more productive.
In school, multitasking interferes with learning. In the workplace, multitasking interferes with productivity and promotes stress and fatigue. Multitasking creates an illusion of parallel activity, but actually it requires mental switching from one task to another. This drains the glucose fuel needed by the brain, making the brain less efficient and creating the feeling of being tired.

Neuroscientist Daniel Levitin reminds us that multitasking is stressful, as indicated by increased secretion of cortisol and adrenalin. He cites work showing that IQ can temporarily drop 10 points during multitasking. A brain-scan study showed that new information gets processed in the wrong parts of the brain, rather than in the hippocampus, where it needs to go in order to be remembered. The most insidious aspect of multitasking is that it programs the brain to operate in this mode, creating a debilitating thinking habit that is permanent.

Constant switching creates a distractible state of never being fully present. It trains the brain to have a short attention span and shrinks working memory capacity. This is especially pernicious in young people, who are most likely to multi-task and whose brains are the most susceptible to programming of bad habits.

Multitasking not only becomes a habit, it is addictive. I see many youngsters who seem to have withdrawal symptoms if they can't check their phone messages every few minutes. Each message carries the signal that someone thinks you are important enough to contact. This provides powerfully rewarding personal affirmation. Worse yet, like slot-machine payoffs, the reinforcement occurs randomly, which is the most effective way to condition behavior. It turns us into trained seals.
Why does anybody engage in behaviors that can turn them into a trained seal? One study indicates that susceptibility to task switching depends on the existing mental state. The researchers monitored 32 information workers, split nearly evenly by gender, in their work environment for five days. Workers were more likely to switch off task to Facebook or face-to-face conversations when they were doing rote tasks, which were presumably boring. When they were focused, they were more likely to switch to e-mail. Time wasted on Facebook and e-mail increased in proportion to the amount of task switching. Overall, the workers switched to Facebook an average of 21 times per day and to e-mail 74 times. Though the total time spent off-task was small (about 10 minutes on Facebook and 35 minutes on e-mail), the excessive task switching must surely have degraded the productivity of the primary work tasks. Why does anybody need to check Facebook 21 times a day or e-mail 74 times a day? This is compulsive behavior that has affected the entire workforce like an infectious disease.
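To put those numbers in perspective, here is a quick back-of-envelope calculation. The eight-hour workday is my assumption; the study reports only the counts and the total minutes.

# Back-of-envelope arithmetic from the figures quoted above.
# The eight-hour workday is my assumption; the study reports only counts and total minutes.
fb_switches, email_switches = 21, 74
fb_minutes, email_minutes = 10, 35
workday_minutes = 8 * 60

print(f"average Facebook check : {fb_minutes * 60 / fb_switches:.0f} seconds")
print(f"average e-mail check   : {email_minutes * 60 / email_switches:.0f} seconds")
print(f"one self-interruption roughly every "
      f"{workday_minutes / (fb_switches + email_switches):.1f} minutes of the workday")

Each individual check lasts only about half a minute, but together they amount to a self-interruption roughly every five minutes.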

How does one break the multitasking habit? The most obvious way is to reduce the opportunity. Turn off the cell phone. You do not have to be accessible to everyone at every instant. Don't launch the mail app, and when it is open, turn off the feature that notifies you about the arrival of each new message. If you don't need a computer or the Internet for the task you are working on, don't turn on your electronic devices. If a computer is needed, don't launch the browser until you actually need it.

Be more aware of your current mental state, because it affects your distractibility. If doing boring work, find ways to make it less boring and thus less tempting to switch tasks. If you are doing work that is engaging, make it a goal to stay focused for longer and longer times on such work. Set goals for increasing the time spent on task. You should at least be able to sustain focus for 30 minutes. Just as multitasking can condition bad habits, mental discipline can condition good attentiveness and thinking habits.
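One low-tech way to set and stretch those on-task goals is a simple timer that lengthens each focus block. This is just an illustrative sketch; the starting length and the increment are arbitrary choices, not prescriptions from the sources below.

import time

# Minimal focus-block timer. Each completed block gets a little longer,
# up to the 30-minute goal mentioned above. Lengths are arbitrary starting points.
block_minutes = 15.0
increment = 5.0
goal_minutes = 30.0

while block_minutes <= goal_minutes:
    print(f"Focus for {block_minutes:.0f} minutes: no phone, no e-mail, no browser.")
    time.sleep(block_minutes * 60)   # stay on task until the timer runs out
    print("Block finished. Take a short break, then continue.")
    block_minutes += increment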

Sources:

Levitin, Daniel J. 2015. Why the modern world is bad for your brain. The Guardian. Jan. 18.

Mark, G. et al. 2015. Focused, aroused, but so distractible: A temporal perspective on multitasking and communications.  ACM Digital Library. https://www.ics.uci.edu/~gmark/Home_page/Research_files/CSCW%202015%20Focused.pdf

Mark, Gloria. 2015. Multitasking in the Digital Age. doi:10.2200/S00635ED1V01Y201503HCI029. Morgan and Claypool.


Tuesday, August 02, 2016

Who is Responsible? You or Your Neurons?

Do you deserve credit for your honest achievements and blame for your failures? No, say an increasing number of philosophers and scientists. They say that everything you do is commanded from your unconscious mind, which you can't consciously control. The conscious "you" is just a superfluous observer. Free will is thus regarded as an illusion (Fig. 1). My new book, "Making a Scientific Case for Conscious Agency and Free Will" (Academic Press), challenges the science used to justify these counter-intuitive ideas.






Figure 1. Illustration of the concept that free will is an illusion. In this view, the actions that your brain commands come from the mechanical gears of an unconscious mind. Conscious mind is informed after the fact, creating the illusion that one's conscious mind commanded the act.






How free will is defined affects the conclusion about whether humans have any free will. As defined here, free will exists when a person generates thinking and behaviors that are neither stereotyped nor predetermined, and yet not random. My book identifies and explains many actions of brain that are unlikely to be performed solely by unconscious thinking. Reason and creativity are obvious exemplars of such free will.

More fundamental is the issue of just who the conscious you is. My book presents the argument that consciousness is not just a state of observation, like a movie fan passively watching a film in which participation is not possible. Rather, consciousness may be a distinct being.

I argue that consciousness can do things because the neurons that create consciousness are part of the over-all global brain workspace. The outputs of their firing cannot be isolated from the command centers of brain. Indeed, we should realize that these neurons are part of the neocortical executive control centers. When those firing patterns enable consciousness, they enable capability for explicit observation and executive action at the same time.

Our human beingness exists as the firing patterns in the neural networks of brain. The patterns are obviously different when we are unconscious, as in sleep or anesthesia. When those patterns change in certain measurable ways, they create consciousness. Compared to the unconscious state, our beingness during consciousness is more amenable to change and more able to initiate thought and action. In that sense, we are a different being when conscious, one that can influence its own nature through explicit thought. Explicit awareness can be attributed to a being acting like an avatar on behalf of brain and body that can command action in the present, facilitate formation of memories, and program circuitry for the future.

Freedom of action in these firing patterns comes from several sources. One is the enormous amount of statistical degrees of freedom in neural networks. Every possible choice has a certain probability that it will be made, and no one option is inevitable at any given moment of choice. A more direct kind of freedom comes from the inherent self-organizing capacity of neuronal networks. The book explores the mechanisms by which neural circuits make choices and decisions and proposes chaos dynamics as one way the brain can generate free will.
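The book's treatment of chaos dynamics is not reproduced here, but the generic point, that a fully deterministic system can behave in ways that are neither stereotyped nor practically predictable, is easy to show with the textbook logistic map. This Python sketch is a standard illustration of deterministic chaos, not a model of neural circuitry.

# Logistic map, x_{n+1} = r * x_n * (1 - x_n): a textbook example of deterministic chaos.
# Two trajectories that start almost identically end up completely different,
# so the behavior is lawful yet unpredictable in practice.
r = 3.9
x_a, x_b = 0.500000, 0.500001   # nearly identical starting states

for step in range(1, 31):
    x_a = r * x_a * (1 - x_a)
    x_b = r * x_b * (1 - x_b)
    if step % 5 == 0:
        print(f"step {step:2d}: {x_a:.6f} vs {x_b:.6f} (difference {abs(x_a - x_b):.6f})")

After a few dozen steps the two trajectories bear no resemblance to each other, even though the rule and the starting points are almost identical.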


Conscious choices are indeed influenced by unconscious biases, but we can be aware of predilections and countermand them. Choices are not necessarily pre-ordained, and thus they manifest the kind of free will that is most relevant to everyday life. The issue of free will is not so much whether we have any, but how able we are to develop and use the free will capacity we have.

Saturday, July 09, 2016

Chronic Pain May Be a Memory Problem

After an injury or pain-inducing experience, the body often heals itself, but chronic pain may continue even after healing. Institute of Medicine surveys suggest that some 116 million American adults are in chronic pain. Chronic pain is often accompanied by such emotions as anxiety and depression, and by a significant reduction in quality of life. Drugs like opiates, steroids, and non-steroidal anti-inflammatories can be very effective in reducing acute pain, but they may have little or no effect once post-healing chronic pain sets in.

How can pain persist when the original cause is gone? Clues have emerged from brain scans of chronic-pain patients, which show no sign of augmented activity in pain-mediating areas but do show increased activity in emotional and motivational areas of the brain. The thought has now emerged in several research labs that chronic pain may actually be a memory. As if the chronic pain itself were not bad enough, the pain learning process may induce degenerative changes in emotional circuitry.
The idea dates back to the work of Pavlov over 100 years ago, which revealed that animals experiencing painful stimuli learn to associate that pain with other ongoing events, called conditioned stimuli, and with the accompanying emotional distress. The animals remember both the pain and the negative emotion, even when neither is any longer present. But until the last few years, nobody seems to have applied these findings to the issue of chronic pain in humans.
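A crude way to see the Pavlovian logic is the classic Rescorla-Wagner learning rule, in which the associative strength V of a cue is nudged toward the outcome it predicts. The sketch below is a generic textbook model with arbitrary parameters, offered only as an illustration; it is not a model taken from the papers cited here.

# Rescorla-Wagner rule: V <- V + alpha * (lambda - V),
# where lambda = 1 when pain occurs and 0 when it does not.
alpha = 0.3   # learning rate (arbitrary)
V = 0.0       # associative strength between the context cue and pain

# Acquisition: the cue is repeatedly paired with pain.
for trial in range(10):
    V += alpha * (1.0 - V)
print(f"after 10 pairings with pain : V = {V:.2f}")

# Extinction: the cue occurs without pain.
for trial in range(5):
    V += alpha * (0.0 - V)
print(f"after 5 pain-free trials    : V = {V:.2f}")

In this vanilla model the association also fades once the pain stops; the argument of the work discussed here is precisely that chronic-pain circuitry resists that extinction step.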

The idea is that a prolonged period of acute pain strengthens the emotional pathways that are activated during pain, and continuously reinforces the signals so that they do not go away even after the physical pain is gone. This process might even be thought of as a kind of addiction. Many theorists believe that the usual addictions, as to opiates, nicotine, etc. have a large learning and memory component.

We have known for a long time that pain can induce huge emotional distress. Numerous anecdotes establish that unpleasant emotional states are magnified by pain. But we also know that thoughts and emotions can regulate pain. For example, a mother's kiss may reduce a child's pain from a sudden injury better than any analgesic. In the heat of combat, a wounded soldier may feel no pain until after the attack is over. These pain-suppressing effects are not just psychological but even include inhibition of pain signals as they arise in the spinal cord.

Notably, one of the key brain areas involved in pain is the hippocampus, which is crucially involved in forming memories. The hippocampus is also a linchpin in the neural circuitry that processes emotions and mediates stress.

You might think that this is a perverse feature of nature. But actually the process has its uses. Pain provides a teaching signal that makes one want to avoid such situations in the future. But in chronic pain the lesson becomes so well entrenched that the pain memory cannot be extinguished.
If this theory is correct, it means that the usual treatments for chronic pain need to focus on memory mechanisms. Minimizing the pain while healing is in progress should reduce the likelihood of developing chronic pain memories.

But of course, prevention is not always easy to accomplish. Today, physicians are more aware of the addictiveness of the most reliable pain killers: opiates. They tend to cut short use of opiates in order to prevent drug addiction.

One possible treatment may be akin to emerging treatments for post-traumatic stress disorder (PTSD). Development of PTSD is reduced if morphine is given immediately after an acute trauma. A beta-blocking drug, propranolol, can have a similar preventive effect, presumably because it blocks memory reconsolidation. Whenever you recall a memory, it must be re-stored. While it is consciously "on-line," the memory is vulnerable to modification, and a new and perhaps less traumatic version can be saved in its place. In PTSD therapy, you might recall the memory and have its reconsolidation blocked by drugs that prevent memory consolidation.

Another possibility is to target the synaptic biochemistry involved in pain. Neuronal NMDA receptor molecules are involved in the emotional component of acute pain, and one drug that acts on these receptors, D-cycloserine, has been shown in animal studies to inhibit pain-related behavior for weeks afterward. There is also a protein kinase enzyme that mediates the emotional distress of pain. Animal studies show that there is a peptide that inhibits this enzyme and in the process reduces pain-related behavior. Work is underway in several laboratories trying to identify appropriate molecular targets in chronic-pain pathways so that appropriate drug therapies can be developed.

Sources:

Apkarian, A. V., Baliki, M. N., and Geha, P. Y. (2009). Towards a theory of chronic pain. Prog. Neurobiology. 87, 81-97.

Mansour, A. R. et al. (2014). Chronic pain: the role of learning and brain plasticity. Restorative Neurology and Neuroscience. 32, 129-139.

Melzack, R., and Wall, P. D. (1965). Pain mechanisms: a new theory. Science. 150, 971-979.


Sandkühler, J., and Lee, J. (2013). How to erase memory traces of pain and fear. Trends in Neurosciences. 36(6), 343-352.

Readers of this column will be interested in Memory Medic's e-book, "Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine" (available in all formats from Smashwords.com). The book, devoted exclusively to memory issues in seniors, includes a review of many of the ideas in these columns over the last five years.

Saturday, June 25, 2016

Better Aging through Chemistry: A Daily Anti-aging Regimen.

The price we pay for living is dying. That is, to stay alive, our body must burn oxygen, and that process inevitably yields toxic metabolites called free radicals. Free radicals are highly reactive because their outer shell of electrons is incomplete. Atoms with incomplete outer shells are attracted to other atoms; that is, they share or transfer electrons to form chemical bonds. An atom that has a full outer shell tends not to enter into chemical reactions.

The damage comes from a free radical stripping electrons off target atoms, which converts those targets into new free radicals and sets off a chain reaction. This changes the target atoms so that their normal function is disabled. Such damage occurs in all sorts of molecules, including the vital molecules RNA and DNA.

So how do anti-oxidant chemicals help? They neutralize radicals by donating electrons to complete the outer shells of free radicals without themselves becoming highly reactive radicals. Think of anti-oxidants as scavengers that go around scooping up free radicals and neutralizing them.

Fortunately, nature provides us with chemicals that reduce the amount of free radicals. These are called anti-oxidants because they neutralize free radicals by donating one of their own electrons, ending the electron-"stealing" reaction. The antioxidant nutrients thus reduce cell and tissue damage. The best way to get these anti-oxidants is through eating a good diet. However, as we age, diet is often insufficient to provide enough anti-oxidants, and we need to increase our intake with supplement pills or capsules.
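For readers who want the chemistry spelled out, the chain reaction and its interruption can be written schematically. The example below is textbook lipid peroxidation; the post itself does not single out any particular reaction. Here LH is an unsaturated lipid, R• is any initiating radical, and TocOH is vitamin E (alpha-tocopherol).

% Schematic lipid-peroxidation chain and antioxidant termination (textbook example).
\begin{align*}
\text{Initiation:}  &\quad \mathrm{LH} + \mathrm{R}^{\bullet} \rightarrow \mathrm{L}^{\bullet} + \mathrm{RH} \\
\text{Propagation:} &\quad \mathrm{L}^{\bullet} + \mathrm{O_2} \rightarrow \mathrm{LOO}^{\bullet} \\
                    &\quad \mathrm{LOO}^{\bullet} + \mathrm{LH} \rightarrow \mathrm{LOOH} + \mathrm{L}^{\bullet} \\
\text{Termination:} &\quad \mathrm{LOO}^{\bullet} + \mathrm{TocOH} \rightarrow \mathrm{LOOH} + \mathrm{TocO}^{\bullet}
\end{align*}

The tocopheroxyl radical (TocO•) is stable enough that it does not continue the chain, which is exactly the "scavenger" behavior described above.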

The table below suggests a daily regimen of healthful chemicals: anti-oxidants plus a couple of other substances that slow aging even though they are not anti-oxidants. The idea is that combining different types of anti-oxidants and other substances known to slow aging should expand the breadth of their coverage and produce additive benefits. Maybe they would even act synergistically, so that the benefits are super-additive, that is, more than the sum of the benefits of each individual anti-oxidant. To my knowledge this idea has never been tested, but it seems so plausible that I think we would all benefit from the combination. The most benefit might occur when the anti-oxidants are taken on an empty stomach, because some portion of an anti-oxidant can be inactivated or sequestered by binding with food, which reduces its absorption into the blood stream. Avoid using sugar, as many are tempted to do with the coffee, tea, or chocolate; I recommend using artificial sweetener.

[Table not reproduced here: the suggested daily regimen of anti-oxidants and other anti-aging supplements.]
Omega-3 fatty acids are powerfully anti-inflammatory. Inflammation is a major cause of aging, and these fatty acids, also found in deep-sea fish, have well-documented benefits against aging.
Finally, I add that other factors also have major anti-aging effects, such as regular exercise and weight control. Regular doctor checkups become increasingly necessary as one ages.

I have written about some of these anti-oxidants before (see the references below). Two of the substances on my wellness list, cocoa and melatonin, have not been discussed in my previous blogs. In animal studies, cocoa has been shown to improve memory and to increase brain levels of a chemical (brain-derived neurotrophic factor) that promotes connections between neurons. A recent study in seniors revealed that 900 mg of cocoa powder per day for three months produced significant improvements on formal thinking tests. Brain scans showed measurable increases of cerebral blood volume in the hippocampus, the area of the brain that promotes memory formation.

Melatonin has two benefits. It is not only a powerful anti-oxidant; taken just before bedtime, it also helps you have a sounder, more restful sleep.

I can't say that this regimen will make you live longer. But it will make you live better. I know this from personal experience, now that I am about to turn 82. If you have health problems, this regimen will surely help. However, you should check with your physician to identify anything on this list that would be contra-indicated for your particular problem.

Sources:

http://thankyoubrain.blogspot.com/2010/04/vitamin-d-wonder-vitamin.html
http://thankyoubrain.blogspot.com/2010/03/vitamin-d-and-memory.html
http://thankyoubrain.blogspot.com/2009/02/eat-your-bblueberries-but-not-with.html
http://thankyoubrain.blogspot.com/2010/04/resveratrol-red-wine-magic-chemical.html
http://thankyoubrain.blogspot.com/2014/06/health-benefits-of-resveratrol-new.html
http://thankyoubrain.blogspot.com/2009/01/caffeine-or-nap-which-helps-memory.html
http://thankyoubrain.blogspot.com/2010/02/more-on-benefits-of-blueberries.html
http://thankyoubrain.blogspot.com/2007/04/omega-3-fatty-acid-supplements-improve.html

Readers wanting to know more about slowing aging and boosting brain function should get Memory Medic's e-book "Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine" for only 99 cents at Smashwords.com.