
Friday, December 29, 2017

On Making Right Choices

Our lives are filled with choices. Sometimes we make reasoned choices and sometimes we make irrational ones. The drivers of irrational choices were examined in a series of studies by Daniel Kahneman and Amos Tversky, work for which Kahneman later won the Nobel Prize in Economics. Their experiments showed that humans will make irrational choices when the cost-benefit relations are manipulated in certain ways. They established two generic modes of cognitive function: an intuitive mode in which judgments and decisions are made automatically and rapidly, and a controlled mode, which is deliberate and slower. Cost-benefit parameters need not involve money, but they often do, ranging from "should I wait for the new cars to go on sale" to "how much am I willing to save for retirement."

I had the good fortune back in the early 1970s, when these Nobel Prize discoveries were being made, to be part of a team at Texas A&M that documented and elucidated the founding concepts of "behavioral economics." We used rigorously controlled experiments with rats in an economic environment in which we commoditized their food and drink. Prices were set in terms of how many lever presses they had to make to get an item. By the way, they normally prefer root beer over Tom Collins mix (without the alcohol). But what they "bought" was readily manipulated by changing the cost and the amount of the item they could get. Under certain cost-benefit conditions, they made stupid choices, even to the point of making themselves sick. Our widely cited paper apparently stimulated drug companies to adopt this approach with lab animals to test new drugs for their potential to be addictive.
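
To give a feel for the kind of analysis this involves, here is a minimal sketch (with invented numbers, not data from our study) of how demand and its price elasticity can be computed when the "price" is counted in lever presses:

```python
# Toy illustration with invented numbers (not data from our 1975 study):
# estimating own-price elasticity of demand when the "price" of root beer
# is the number of lever presses required per unit.
import math

# (lever presses per unit, units consumed per session) -- hypothetical values
demand = [(5, 60), (10, 45), (20, 30), (40, 18)]

# Arc elasticity between successive price points:
# percent change in quantity divided by percent change in price (log form)
for (p1, q1), (p2, q2) in zip(demand, demand[1:]):
    elasticity = (math.log(q2) - math.log(q1)) / (math.log(p2) - math.log(p1))
    print(f"price {p1} -> {p2} presses: elasticity {elasticity:.2f}")

# Values between 0 and -1 mean demand is inelastic: consumption falls,
# but less than proportionally to the increase in price.
```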

A recent review of behavioral economics emphasizes that its foundational principles may help in the treatment of maladaptive choice-making, as occurs in the neglect of preventive medical practices, drug addictions, obesity, and assorted compulsions. The heart of the matter is that such choices entail how much one values a desired target (like chocolate cake) and how much one values the future consequences of delaying or minimizing immediate consumption. Choices are irrational and maladaptive when a person is inadequately sensitive to long-term consequences and is controlled mostly by immediate desires.

Choices are a gamble. You can't know for certain you have made the right choice. But being paralyzed with indecision is no solution. Reason helps you understand the odds.

The mind is a strange and wonderful thing. 
Read about it in my book, Mental Biology.

Sources:
Jarmolowicz, D. P., Reed, D. D., Reed, F. D. G., and Bickel, W. K. (2015). The behavioral and neuroeconomics of reinforcer pathologies: Implications for managerial and health decision making. Managerial and Decision Economics. DOI: 10.1002/mde.2716


Kagel, J. H., Rachlin, H., Green, L., Battalio, R. C., Basmann, R. L., and Klemm, W. R. (1975). Experimental studies of consumer demand behavior using laboratory animals. Economic Inquiry. 13, 22-38.

Friday, December 15, 2017

The Production Effect in Memory


When you encounter new information and want to remember it, the formation of a memory is enormously affected by what happens immediately afterwards. The most common problem is that you think of something else, and that something else erases what you just learned from your working memory “scratchpad” before it has time to set up in long-term memory. The way to avoid this problem is to do something with the new learning right away. Memory scientists often call this a “production” effect. That is, if you produce something from the new learning right away, it not only reduces interfering distractions but also strengthens the encoding and speeds up memory formation into lasting form.

Common production activities include using the new information in a new way, such as applying it to solve a problem. This option is not always available, but there are other approaches, such as hearing the same information at the same time you read or see it. The most common production method may be taking handwritten notes during the presentation of the new information. I have noticed that many college students do not take notes or have poor note-taking skills. Many apparently have never been taught how to take notes.

A recent study compared the effects of reading silently, hearing somebody else read aloud, hearing a recording of yourself reading aloud, and actually reading aloud at the time of learning. The last two conditions tested whether the mouth and tongue movements of reading aloud at the time of reading had any effect. They do.

The study divided 75 college students into these four groups and had them participate in two 15-minute sessions separated by two weeks. In the first session, they read a list of 160 words presented one at a time on a computer screen, saying each word aloud into a microphone. They were not told why the words were being recorded, nor what would happen in the return session two weeks later.

In the follow-up session, students were randomly presented 20 of the words from the first session, according to the four conditions (read silently, hear another person say the words, hear their own recording, or actively say each word). Immediately afterward, students took a self-paced recognition test to see how many of the studied words they recognized.

Upon testing, a clear gradient of improvement was evident with increasing production: poorest recognition occurred with silent reading, and best recognition occurred with actively saying the words aloud.

Why is reading aloud more effective than hearing yourself or others read? The authors concluded that the self-reference and self-control involved in speaking produce more engagement with the words. The deeper the engagement, the better the memory. They also credit self-reference for why rehearsal helps memory formation: we do it ourselves and do it in our "mind's ear."

I think there are other implications of these findings. Other research establishes that rehearsal should require forced recall rather than just passively looking over the study material. The data shown here suggest that rehearsal would be even more effective if we forced ourselves to recall by stating the material out loud.

Note that this study measured recognition memory. This is similar to what students do when taking a multiple-choice test: they are given prompts to see if they recognize the correct choice. This is much less demanding than requiring the student to generate the right answer "from scratch," as in a fill-in-the-blank question. I would expect that open-ended testing would reveal an even greater benefit from production effects such as reading aloud. Optimal benefit would probably come from reading aloud from notes that the student took at the time of initial exposure to the new information.

Source:


Forrin, N. D., and MacLeod, C. M. (2017). This time it's personal: the memory benefit of hearing oneself. Memory. DOI: 10.1080/09658211.2017.1383434

Sunday, October 29, 2017

How to Learn Critical Thinking. Learning How to Think Critically Makes You Smart.

Some readers may think you have to be smart to think critically. But the reverse also holds: learning how to think critically makes you smarter. The assumption is that one can learn to think critically (that is, become smart). The assumption is correct. Here, I hope to show you how you can become smarter by learning critical thinking skills.

Require Yourself to Think Critically


When you read or listen to others talk, force yourself to become more attentive and engaged with the information. Asking questions ensures engagement.

Learn and Look for Common Thinking Errors


Unfortunately, most adults are not taught formal logic, even in college. College logic courses are electives and are often made confusing by abstruse premises, propositions, and equations. But common-sense logic can suffice. I have posted a list of common thinking errors elsewhere (1). Here are some of the more serious ones:

APPEAL TO AUTHORITY OR CONSENSUS: attempting to justify the conclusion by quoting an authority in its support or on the basis of how many people hold the same view.

ARGUMENT SELECTIVITY: glossing over alternative perspectives (often called "cherry picking"). It is not only fair but usually helpful to include opposing positions when making arguments to support a position. Opposing arguments, even when wrong overall, usually contain some grain of truth that needs to be accommodated.

CIRCULAR REASONING: reasoning in which the conclusion of an argument is also used as a premise to support it. This usually happens when evidence is missing or glossed over.

COGNITIVE SHORTCUT BIAS: doggedly sticking with a favored view or argument for a position, when other more fruitful possibilities exist. Even chess masters, for example, may use an established gambit when a better tactic is available.

CONFUSING CORRELATION WITH CAUSATION: asserting that when two things happen together, and especially when one occurs just before the other, one thing causes the other. Without more direct evidence of causation, this assumption is not justified. Both events could be caused by something else. Example: rain and lightning go together, but neither causes the other.

EXCLUSIVITY CONFUSION: failure to recognize elements of compatibility in multiple, apparently conflicting ideas or facts. It is important to know whether they are independent, compatible, or mutually exclusive. Example: the concepts of evolution and creationism, as they are typically used, are mutually exclusive. However, stated in other ways, they have shared elements of agreement.

FALSE ANALOGY: explaining an idea with an analogy that is not parallel, as in comparing apples and oranges. While analogies and metaphors are powerful rhetorical tools, they are not equivalent to what they reference.

JUMPING TO CONCLUSIONS: using only a few facts for a definitive conclusion. The most common situation is failure to consider alternatives. An associated cause is failure to question and test assumptions used to arrive at a conclusion.

OVER-GENERALIZATION: assuming that what is true for one case is true for others. Example: some scientists studying free will claim that the decision-making process for a simple button press is the same as for more complex decisions.

Learn Specific Strategies


Be Aware of Your Thinking. Think about how you think. This is the art of introspection: being aware of such things as your own degree of alertness, attentiveness, bias, emotional state, exploration of interpretation options, and self-assurance.

Train Yourself to Focus. In today's multi-tasking world, students commonly lack the ability to concentrate. They are easily distracted, don't listen well, and are not very effective at extracting meaning from what they read.

Use Evidence-based Reasoning. Don't confuse opinion with fact. When others make a claim, don't accept it without supporting evidence. Even then, look for contrary evidence that may have been omitted.

Identify What Is Missing. In conversation or reading, the most important points may be what is not stated. This is especially true when someone is trying to persuade you of their viewpoint.

Ask Questions and Provide Your Own Answers. I had a professor at Notre Dame, C. S. Bachofer, who built a whole course on this principle. For every reading assignment, he required the students to ask a provocative question about the reading and then write how it might be answered. Fellow students debated each other's questions and answers. Developing this as a thinking habit will ensure that you become a more critical thinker, learn more, and provide some degree of enlightenment to others with whom you interact.


Professor Klemm is the author of a 2017 book, "The Learning Skills Cycle: A Way to Rethink Educational Reform." New York: Rowman & Littlefield.
https://rowman.com/ISBN/9781475833225/The-Learning-Skills-Cycle-A-Way-to-Rethink-Education


(1) Klemm, W. R. (2014). Analytical thinking—logic errors 101. http://thankyoubrain.blogspot.com/2014/10/analytical-thinking-logic-errors-101.html

Saturday, September 23, 2017

Aging Shrinks the Brain

In most people, the brain gets smaller with age. It is not so much that neurons die but that their terminals and synaptic junctions shrivel. A known cause is over-secretion of cortisol under stress, but there may also be other age-related causes.
However, shrinkage with age is not inevitable. Certain people are "super-agers," defined as adults over 80 with memory at least as good as that of normal middle-aged adults. A usually reliable index of decline in memory ability is the degree of brain shrinkage, specifically loss of cortical volume. Brain-scan studies show that super-agers have a thicker cortex than others of the same age. Thus, either their cortex has not shrunk as much as that of the average elderly, or they had more to start with. It is possible that something about the lifestyle of super-agers protected them from brain atrophy, but there is no practical way to know how much cortical volume the elderly had in their youth. The second option, however, has been tested in a study that compared the rate of cortical aging in 36 adults averaging 83 years of age. The investigators recruited super-agers and normal elderly and tested them in an initial visit and again 18 months later. Cognitive and memory tests and brain scans at both visits provided a basis for tracking the rate of aging.
Super-agers scored higher on cognitive and memory tests than the average group at both the beginning and end of the study period. This suggests that they may have been endowed with more mental capability when they were young. But it also indicates that super-agers are more resistant to age-induced mental decline. The two groups did not differ in any other neuropsychological measures, education, or estimated IQ.
A clear difference between the two groups appeared in cortical volume. The average-memory group had over twice as much cortical shrinkage over the 18 months as the super-agers. Some in the average group lost as much as 3.4% of cortical volume per year. If that continued over the next 10 years, they would suffer a devastating loss of over 30% of cortical volume.
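As a back-of-envelope check on that arithmetic (my own calculation, not a figure from the study): applied each year to the original volume, $10 \times 3.4\% = 34\%$; if each year's loss instead compounds on the already-shrunken volume, $1 - (1 - 0.034)^{10} \approx 29\%$. Either way, close to a third of the cortex would be gone in a decade.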
Unfortunately, the study did not examine the lifestyles of the two groups. The super-agers may have just had good genes, or they may have been more mentally active over their lifetimes and had healthier diets, more exercise, and less stress than those in the average group. Notably, some shrinkage did occur in the super-agers, at an average rate of 1.06% per year. Yet they still scored as well as the average 50-year-old on various cognitive and memory tests. It is possible that some shrinkage is a good thing, reflecting perhaps a pruning of neural circuitry as the brain learns and develops more efficiency. Pruning is a conspicuous phenomenon in the brains of fetuses and infants as maturation progresses. Obviously, too much pruning can leave neural circuitry with insufficient resources.
These results also emphasize that age discrimination is not defensible. Each elderly person's mental competence has to be judged on its own merits, not on a negative stereotype of the elderly.

Sources:

Rogalski, E. J., et al. (2013). Youthful memory capacity in old brains. J. Cognitive Neuroscience. 25(1), 29-36.


Cook, Amanda H. et al. (2017). Rates of cortical atrophy in adults 80 years and older with superior vs. average episodic memory. JAMA. 317(13), 1373-1375.

Wednesday, September 13, 2017

Teaching Children to Be Honorable

No one is born honorable. If you doubt me, just watch little kids at play. Scholarly research confirms the point. Michael Lewis, a prominent psychologist at Rutgers, has conducted many studies of how children naturally lie and deceive and are even encouraged to do so by well-meaning parents. Certain ways of expressing emotion are taught as acceptable while others are not. One experiment, for example, showed that children are usually taught to express sadness when their mother leaves them with a babysitter. But in reality the children were not sad and recovered quickly after the mother left. Other examples are how children are taught to express responses to minor insults or injuries. Some kids are taught that it is all right to overreact.
Lewis and his colleagues conducted some classic experiments with young children that revealed how their tendency toward false behavior changed with age. They secretly videotaped children in an honesty test in which they were told not to peek at a toy that was placed behind them. The child was told that the adult had to leave the room for a few minutes, but that when she came back the two would play with the toy.
One hundred percent of children as young as two peeked at the toy. As evidence that self-control grows with age, only 35% of six-year-olds peeked. However, lying seemed to increase with age as children learned to perceive a benefit from lying. When asked if they peeked, 38% of two-year-olds lied. But among six-year-olds, one hundred percent of the peekers lied about it. Boys generally had less self-control in resisting peeking, but no sex differences occurred in the extent of lying.
Clear correlations were seen with other aspects of cognitive function. For example, how quickly a child yielded to the temptation of peeking varied with IQ. Those who peeked sooner had lower IQ scores. They also had less emotional intelligence, that is, were less able to name the emotions revealed by pictures of human faces and less able to predict the kind of emotion that would be generated by certain experiences.
However, lying varied directly with IQ and emotional intelligence: smarter kids were more likely to lie. Moreover, Lewis and others contend that lying and deception are normal and even good. Lying seems to be associated with pro-social behavior and with creativity.

Doing What Comes Naturally

Lewis and others think that children should not be condemned for their dishonorable behavior. It comes from self-serving biology. Human weakness is most evident in children, and they will often do things they know they should not do.
It is hard to know how a child really feels, because parents are continually teaching them how they should express feelings and reactions to life events. When children become adults, this lifetime of conditioning about expressing emotions creates problems for mental-health workers treating them, because true feelings are so buried and masked.
Children learn to think and behave in untruthful ways for three reasons:
1. To avoid negative consequences or punishment.
2. To protect the ego from assaults on their sense of self-worth or confidence.
3. To benefit themselves or take advantage of others.
Children also learn self-deception at rates that vary with age. The development of self-esteem is at play here. A child learns to avoid or minimize honest judgments that unnecessarily diminish their self-esteem. At the same time, a child could learn that honest self-appraisal serves the useful purpose of avoiding future mistakes or taking some necessary action.
Experimentally, pretend play provides a paradigm for studying self-deception. Very young children imitate the actions of others around them. As they get a little older, they pretend that one toy is doing something with another toy, as for example, toy soldiers engaging in battle.
Pretend play begins at around age one. Lewis gives the example of a one-year-old who may pretend to talk on the phone after seeing his mother do so. By age two or three, the child might pretend that her doll is talking on the phone. By three years of age, a child is able to consider success or failure of the pretend scenarios and to assign blame or credit for them. At this point, self-conscious emotions have emerged that lead to shame for failure and pride in success.
Children readily learn to seek self-benefit  and to take advantage of others, as when a child lies about a misdeed and blames it on an innocent, such as a sibling. Unfortunately, little research seems to have been done on childhood development of this level of dishonesty. How does it change with age? What factors promote it? Or mitigate it? The social consequences are profound.
Children are biologically wired to behave falsely. Where do they learn moral values and respect for truth? Traditionally, this was in houses of worship. But as many parents have left formal religion, this teaching is increasingly absent. We know that the teaching of children has lasting effects, good or bad. Both the Jesuits and Lenin are credited with claiming, "Give me a child until age seven, and I have that child for life."
Knowledge and life experience do change what a person accepts as true. Highly intelligent kids can figure a lot of this out on their own. But they need to question, and most humans are prone to take things at face value. This was the basis of the life of Socrates, whose mission was to show people the importance of asking questions and of introspection. In my decades of teaching at the college level, I have learned that most students are intellectually compliant and do not question. Maybe they are indoctrinated to be this way. After all, they have had 12 years of taking multiple-choice quizzes in which each question is deemed the right question with only one correct answer.
In most cases, behaving untruthfully is stupid, because we may eventually get caught. When that happens, who will trust us again? We are made still more stupid by life experiences that create the illusion that clever false behavior works. We learn our counterproductive attitudes and behaviors, and worse yet, we reinforce them by repetition and turn them into bad habits. We do stupid things because our brains have been programmed by our learning experiences to keep doing things that are not in our best long-term interest.
Michael Lewis and some other psychologists think childhood lying, deception, and other forms of untruthfulness are normal developmental features that even help children become more emotionally and socially competent adults. I vigorously disagree. The price to be paid for accepting childhood dishonor is that children learn dishonorable habits. Children usually have a selfish reason for being untruthful and dishonorable. If parents and other adults do not correct such bad behavior as it occurs, because they believe it is normal for that age, children can become spoiled brats who grow up to be self-absorbed adults who feel entitled, make excuses, shift blame to innocents, and accept and spread false narratives. They may demand "safe spaces" or even react violently when their views are not accepted by others.

Why Honor Needs to Be Taught Early

Social pathologies are rooted in flaws of conscience. These flaws begin in childhood, for it is in childhood that people develop, or fail to develop, their conscience. What's right and what's wrong must be taught. In an era when the divorce rate is over 50%, when out-of-wedlock births exceed 70% in some racial groups, and when more and more kids are raised by a single mom, it is not surprising that this country has so much dysfunctional behavior and crime. I read somewhere that the U.S. has a higher percentage of its people in prison than any other country. Kids learn their values from somewhere; if not from a loving two-parent family, then from a gang of peers. Lack of conscience causes much youth violence, peer cruelty, stealing, cheating, sexual promiscuity, and substance abuse.
Children have a particularly hard time knowing their feelings and limitations. That is why adult nurturing is so important to help children learn how to “grow up.” It is also why children, especially teenagers, are so prone to angst and poor choices. Indeed, a critical element of growing up is self-awareness, seeing things as they really are, and resolving the things that cause problems and unhappiness. All children develop attitudes, emotions, beliefs and behaviors that need some degree of correction. In that sense, all children have a “bad brain,” and they need to be taught how to develop it constructively.

Childhood Training

Children have to learn right from wrong and then develop the discipline and character to do the right. I have a new book that should inspire readers to include in their life purpose a pursuit of truth that goes beyond the usual casual and superficial level. Each of us needs and should want to know how to detect, understand, and deal with dishonorable behavior in others, lest others betray or exploit us. Likewise, each of us needs to understand our own character weaknesses, so that we can become better, more respected, and trusted people. Children need to know the seven deadly forms of untruthfulness, which I identify as:
Lying
Deception
Pretension
Denial
Cheating
Withholding
Delusion
A useful acrostic might be: “Low-Down People Don’t Care What (they) Do”
How does one teach honorable behavior? First, note the assertion that honor is something a person can learn to embrace. If parents don't teach right and wrong, children may not learn it. Teaching anything can involve a mixture of positive and negative reinforcement. For misbehaving children, spanking used to be the standard remedy. It does not always work and is subject to abuse. Alternatives include withholding privileges. Young people frequently think they have a right to much of what they want in life. A loving parent may need to take away these "rights" and dole them back out as "privileges" at the pace that the child grows morally.
A parent can structure rewards for good behavior. You can keep score on a calendar or in a notebook of progress at developing a certain desirable behavior. The Boy Scout idea of doing a good deed every day has great merit. Too bad it is considered old-fashioned. The merit-badge concept in both Boy and Girl Scouts is sound psychological practice for instilling the desire to do the right thing and get recognition for it.
The best teaching is not telling but prompting children to question what is appropriate. For example, a parent who finds her child cheating in school should ask, "Have you thought about what might be wrong with cheating?" A child who steals another kid's lunch money should be challenged with, "Was it fair for you to take that money? Would you mind if another kid took your lunch money?" And when a child is caught in a lie, a parent could say, "Please don't lie to me. I need to trust you. Don't you want me to be able to trust you?"
Of course, the best approach is to prevent wrong behavior by setting a good personal example. We all know that children learn more from what we do than what we say. From family and personal relationships to practicing one's religion, what could be more destructive than hypocrisy?
Seeking recognition or rewards for good character is itself an unworthy motive. We should do the right thing for the right reasons. When a child's reason is self-serving, such as seeking praise or reward, a child may be deceiving herself and others about her real character.

In today's world, we are becoming accustomed to concealment, half-truths, misrepresentation, spin, fake news, and other forms of untruthfulness. Social media spread these distortions like a viral epidemic. I have a new inexpensive e-book at Amazon that addresses the issues: To Tell the Truth. Save Us from Concealment, Half-truths, Misrepresentation, Spin, and Fake News. This book aims to show why truth matters, identifies seven kinds of falsehood, explains the common causes, and suggests many ways we can reduce the falsehoods we commit. A concluding chapter presents an ethics model that can be used for a variety of real-world situations.
Reference

Lewis, Michael. (2015). The origins of lying and deception in everyday life. American Scientist. 103: 128-135.

Saturday, August 26, 2017

Do We See the World Like a Movie?

We have the feeling that we experience the world as a continuously sampled data stream. If we perceive multiple objects or events seemingly at the same time, we may actually be multiplexing the several data streams; that is, we take a sample from one data stream, switch to take a sample from the next stream, and so on, all on a millisecond time scale.

But another possibility is that we perceive objects and events like a movie frame, where the brain takes working-memory snapshots and plays them in succession. Like still frames in a movie, if played at a high-enough speed, the frames will blend in our mind to give the illusion of continuous monitoring.

In either case, we have to account for working memory. We can hold only a small amount of information in working memory at any one instant, as when dialing a seven-digit phone number you just looked up. In the phone-number case, does the brain accumulate and buffer the representation of each digit until reaching the working-memory holding capacity and then report it to consciousness as a set? Or is each digit transferred to consciousness and concatenated until the working-memory capacity is filled?

A profound recent model of perception addresses the issue of continuous versus movie-like perception, but unfortunately it did not take working memory into consideration. The model did address the issue of how consciousness integrates the static and dynamic aspects of the object of attention. For example, when viewing a white, moving baseball, consciousness apparently tracks both the static white color and shape of the ball and its movement at the same time. Are these two visual features bundled together and made available to consciousness on a continual basis or as a batch frame?
A related issue is the so-called flash-lag illusion. Displaying a moving object and a stationary light flash at the same time and location creates the illusion that the flash is lagging. There is some debate over why this happens, but it does argue against continuous monitoring of linked objects.

Another phenomenon that argues against continuous monitoring is the “color phi” phenomenon. Here, if two differently colored disks are shown at two locations in rapid succession, a viewer perceives just one disk that moves from the first location to the second, and the color of the first disk changes along the illusory path of movement. But the viewer cannot know in advance what the color and location of the second disk is. The brain must construct that perception after the fact.
Another way of studying fusion phenomena is to show two differently colored disks in rapid succession at the same location. In this case, an initial red disk followed by a green disk is perceived as a single yellow disk. A viewer cannot consciously recognize the individual properties if there is not enough time between the two disks. This suggests that the information is batch processed unconsciously and only later made available to conscious awareness. Transcranial magnetic stimulation can disrupt the fusion, but only within about 400 milliseconds after the first stimulus, presumably while the processing is still unconscious. Since the presentation of the two disks takes only about 60 milliseconds, unconscious processing of the fusion must take some 340 milliseconds before the results become available for conscious recognition.

Similar fusion can occur in other sense modalities. For example, the "cutaneous rabbit" effect is a somatosensory fusion illusion in which touch stimulation first at the wrist, followed quickly by stimulation near the elbow, produces the feeling of touch along the nerve pathway between the two points, as if a rabbit were hopping along the nerve. There is no way for the conscious mind to know the pathway without the second touch near the elbow actually occurring. Perception of that pathway information is delayed until the information has been processed unconsciously.

So while these examples argue against continuous conscious monitoring of sensation, they don't fit well with the movie-frame idea either. We can distinguish two visual stimuli only 3 milliseconds apart, yet a snapshot model that samples stimuli, say, every 40 milliseconds would miss the second stimulus. To reconcile these conflicting possibilities, the authors advance a two-step model in which sensations are processed unconsciously at high speed, but the conscious percept is reported periodically, or is read out when unconscious activity reaches a certain threshold or when there is top-down demand. This fits data from others showing that conscious awareness is delayed after the actual sensory event. For visual stimuli, this delay can be as long as 400 milliseconds.
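
As a toy illustration of that reconciliation (my own sketch, not the authors' model; the 3-ms gap, 40-ms frame, and 50-ms readout interval are assumptions echoing the figures above):

```python
# Toy comparison (not the authors' model): a pure snapshot sampler fuses two
# stimuli that fall in the same frame, while a fast unconscious stage keeps
# their timing and a periodic "conscious" readout merely reports it late.
stimuli_ms = [100, 103]  # two stimuli only 3 ms apart

def snapshot_model(stimuli, frame_ms=40):
    """Bin stimuli into fixed frames; events within one frame are fused."""
    frames = {}
    for t in stimuli:
        frames.setdefault(t // frame_ms, []).append(t)
    return frames  # both stimuli land in the same frame -> the gap is lost

def two_stage_model(stimuli, readout_ms=50):
    """Keep millisecond timing unconsciously; report at the next readout tick."""
    report_time = (max(stimuli) // readout_ms + 1) * readout_ms
    return {"percept": sorted(stimuli), "reported_at_ms": report_time}

print(snapshot_model(stimuli_ms))   # {2: [100, 103]} -- one fused frame
print(two_stage_model(stimuli_ms))  # both events preserved, report delayed to 150 ms
```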

Here the question of interest is why sensory awareness might require a mixture of continuous monitoring and periodic reporting of immediately prior data segments. Continuous monitoring and processing permits high temporal resolution. Snapshot reporting conserves neural resources because information accumulates as a batch (a few bytes) before becoming available to consciousness. The really interesting question is what, if anything, happens to that string of movie-like snapshots captured in consciousness. How do these frames affect subsequent unconscious processing in the absence of further sensory input? Can unconscious processes capture and operate on the frames of conscious data? Or can successive frames of conscious data be processed batch-wise in consciousness? A useful analogy might be whole-word reading. A beginning reader must sound out each letter in a word, which is comparable to high-resolution time tracking of sensory input. Whole-word reading allows a more efficient capture of meaning because the meaning has been batch pre-processed.

How do these ideas fit with the claim of other scholars that consciousness is just an observer witnessing the movie of life as it occurs? That assumption ignores the role that consciousness might have in reasoning, making decisions, and issuing commands. I argue this point elsewhere in my books, Mental Biology and Making a Scientific Case for Conscious Agency and Free Will.

Research claimed to show that free will is illusory needs reinterpretation in light of this two-step model of perception. Those experiments typically involve asking a subject to make a simple movement, like pressing a button, whenever they "freely" want to do so. They are to note when they made the decision by looking at a large, high-resolution clock. At the same time, their brain activity is monitored before, during, and after the chain of events.

The first event is the intention to press the button. Intention is a conscious event. Was it preceded by unconscious high-resolution processing? If so, what was the need for high resolution? Or maybe this is just the way the brain is built to operate. The button-press decision is a slow, deliberative process, which perhaps could be handled consciously as a slow progression of successive frames of conscious thought. Critics may say that there is no such thing as conscious processing, but there is no evidence for that conjecture. Once an intent is consciously realized, the subject is then thinking about when to make the press. This decision may well be determined unconsciously, but again there is no need for high temporal resolution. Moreover, there are intervening conscious steps, where the subject may think to himself, "I just did a press. Shouldn't I wait? Is there any point in making many presses with short intervals? Or with long intervals? Or with some random mixture?" Are each of these questions answered by the two-step model of sensory processing? However the decision developed, corresponding brain electrical activity is available to be measured.

Then there is the actual button press, the conscious realization that it has occurred, and the conscious registration of the time on the clock when the subject thought the decision to press was made. Does the two-step model apply here? If so, there must be considerable delay between what actually happened in the brain and when the subject eventually became aware of the conscious thoughts.

The point is that the two-stage model of perception may have profound implications beyond sensation, involving ideation, reasoning, decision-making, and voluntary behavior. I have corresponded with the lead author to verify that I have a correct understanding of the publication. He said that his group does plan to study the implications for working memory and for free will.

Source:

Herzog, M. H., Kammer, T., and Scharnowski, F. (2016). Time slices: What is the duration of a percept? PLoS Biology. April 12. http://dx.doi.org/10.1371/journal.pbio.1002433


Tuesday, August 15, 2017

Is Your Brain Older Than You Are?

"You are as old as you think you are," the saying goes. Well, not quite. You, that is the inner you in your brain, is as old as your brain is. But your brain age may or may not correlate with chronological age.

The other day at my gym workout, I again saw a young black guy, built like Captain America, whose workout schedule sometimes overlaps with mine. We had not met, and out of the blue he came up to me and said, “You are my inspiration. You inspire me to be able to work out like you when I get your age.” Wow! I inspire somebody! Then my balloon popped when I realized that he knew I was old just by looking at me. My body may not look like I’m 83, but I guess hair loss and the lines in my face betray me.

The point of this story is that bodily organs do not all age at the same rate. Skin ages rather conspicuously in most older people. Specific organs may age at different rates depending on what they have been exposed to: skin and sun, liver and alcohol, lungs and smoking, fat tissue and too many calories. The brain may age more rapidly than other organs if you damage it with drugs or concussion, clog its small arteries with high cholesterol, or shrivel its synaptic connections through lack of mental stimulation or failure to cope with stress.

Is there some biological equivalent of tree rings to show how old your brain actually is? A scientist at Imperial College London, James Cole, is developing an interesting approach for estimating brain age. Moreover, the technique seems to predict approximately when you will die.

In the study thus far, MRI brain scans were taken of 2,001 people between 18 and 90 years of age. A computer algorithm evaluated these scans to construct a frame of reference for what is normal at a given age. Then scans from 669 adults, all born in 1936, were compared against the norms to determine whether those 81-year-old brains were normal for that age.
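The general logic can be sketched in a few lines (an illustrative toy, not Cole's actual pipeline; all feature values and weights below are made up): fit a model that predicts age from scan features in a reference sample, then score a new scan and take the difference from chronological age as the "brain age gap."

```python
# Minimal sketch of brain-age estimation (illustrative only; not Cole's pipeline).
# Idea: learn a mapping from scan features to chronological age in a reference
# sample, then apply it to a new scan; brain age gap = predicted - actual age.
import numpy as np

rng = np.random.default_rng(0)

# Reference sample: rows = subjects, columns = imaging features
# (think regional gray-matter volumes); ages span 18-90 as in the study.
n_subjects = 2001
ages = rng.uniform(18, 90, n_subjects)
weights = np.array([-0.8, -0.5, 0.3, -0.2, 0.1])          # made-up effect sizes
features = ages[:, None] * weights + rng.normal(0, 5, (n_subjects, 5))

# Ordinary least squares: age ~ intercept + features
X = np.column_stack([np.ones(n_subjects), features])
coef, *_ = np.linalg.lstsq(X, ages, rcond=None)

def predicted_brain_age(scan):
    return float(np.concatenate(([1.0], scan)) @ coef)

# One hypothetical 81-year-old whose scan resembles a typical ~88-year-old brain
subject_age = 81
subject_scan = np.array([-70.0, -44.0, 26.5, -17.5, 8.8])
gap = predicted_brain_age(subject_scan) - subject_age
print(f"brain age gap: {gap:+.1f} years")   # positive gap = an 'older'-looking brain
```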

The people whose brains were older than normal performed more poorly on fitness measures such as lung function, walking speed, and fluid intelligence. They also had increased risk of dying sooner. Predictions became more reliable when the brain-scan data were combined with the methylation of blood DNA, a marker of life experience effects on gene expression.

Another group of workers at UCLA had determined that these kinds of gene changes predict the risk of mortality. This group, headed by Steve Horvath, evaluated these gene expression changes in various tissues of a 112-year-old woman and found that her brain was younger than her other tissues. A "young" brain will help you to live longer and also have a better quality of life.

There are two take-home implications of such research. The first is that lifestyle and environmental influences affect one's aging and that not all tissues age at the same rate. The second is that it may now be possible to test which interventions to slow brain aging actually work. We currently think brain aging is slowed by exercise, antioxidants, healthy diets, and reduced stress. Having objective measures of aging in general, and of the brain in particular, will help us decide how well such preventive measures work. There is also the possibility that such measurement tools may help us identify who is aging too fast and why, which in turn may lead to better therapy.

While we wait on technology, there is one symptom of excessive brain aging we can all notice: memory loss. As the title of my book suggests, memory is the canary in your brain's coal mine.


Get the most out of life as you age. You can slow brain aging by following the advice in Memory Medic's inexpensive e-book, "Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine." It is available in Kindle at Amazon and all formats at Smashwords.com.



Sources:

Kwon, Diana (2017). How to tell a person's "brain age." The Scientist. May 22.


Cole, James H., et al. (2015). Prediction of brain age suggests accelerated atrophy after traumatic brain injury. Annals of Neurology. 77(4), 571-581. doi: 10.1002/ana.24367. http://onlinelibrary.wiley.com/doi/10.1002/ana.24367/full

Friday, August 04, 2017

Mental Down-time Affects Memory

Research has shown that recent experiences are reactivated during sleep and wakeful rest. This "downtime" recall of memories is part of the process for consolidating long-term memory and serves as memory rehearsal that can strengthen the memory. Thus, the old saying, "all work and no play makes Jack a dull boy," might be re-framed, "all work and no rest makes Jack a poor learner."
To expand on this idea, a study was conducted to test whether this memory enhancing effect of mental downtime applied to new learning of related material. In other words, does downtime help form memories for new experiences as well as it does for recent past experiences? The researchers hypothesized that the degree to which memory processes are engaged during mental downtime determines whether or not prior knowledge promotes or interferes with new learning.

To test this idea, human adults were trained to learn face-object pairs over four repetitions. This initial learning was followed by fMRI brain scans while subjects engaged in passive mental downtime and then during a new learning period in which a new set of face-object pairs was presented, except that the same objects were used as before, providing a learning task that overlapped with and related to the first. There was also a new task in which both faces and objects differed from those in the first task. After scanning, subjects completed a cued recall test for memory of the new learning task.

In the initial learning task, all subjects achieved near-perfect recall by the last of the four repetitions. The fMRI data of interest were the activity levels in the face-recognition areas of the cerebral cortex during the mental downtime, where the level of neural activity predicted memorization in the new learning task, for both overlapping and unrelated face-object pairs. That is, if some face-area fMRI activity was present during the downtime, learning of the related new material was more effective.

New learning of face-object pairs was better when the new pairs overlapped the earlier pre-training pairs, suggesting that the initial learning was reactivated during mental rest and used to promote the new learning. However, this did not occur in nearly half of the subjects, in whom recall was actually poorer than for the original pairs. This process is well known from other studies and is termed proactive interference. In other words, prior learning may help or hinder related new learning, depending on the situation and individual differences. It appears that prior learning promotes new learning when the original learning is particularly strong. Strong initial learning is better reactivated during downtime and is more available to contribute to the learning of related new material.
Bottom line: the right kind of mental rest can help strengthen memories and make it easier to learn related new information. During mental rest, it probably helps to avoid new learning tasks, allowing the brain to work on the residual effect of the initial learning. Such rest probably works best on initial memories that are strongly encoded.

As for practical application in education, the authors suggested that before presenting new information, it would help for learners to recall some related things they already know. Their example was for a professor to begin a lecture by asking students questions on some aspects of the lecture that students should already know something about. I would add some additional tactics:

1. Strengthen initial encoding by at least four forced-recall attempts at the time of initial learning. Add to the strengthening by using mental images and mnemonic devices.
2.  Introduce breaks in presenting information, with a mental rest period in between.
3. Avoid new learning or mental challenges during the down-time period.
4. Review information presented in the past that relates to new information that is to be learned (as in reviewing past lecture notes before a new lecture).
5. Periodically think about what you have learned as it might relate to what you want to learn next.

Readers may also want to read Memory Medic's e-book for students, Better Grades, Less Effort (available at Smashwords.com), or the paperbacks available at Amazon and bookstores: for parents and teachers, The Learning Skills Cycle, and for a general audience, Memory Power 101.

Source:


Schlichting, Margaret L., and Preston, Alison R. (2014). Memory reactivation during rest supports upcoming learning of related content. Proc. Nat. Acad. Sci. (USA). 111 (44), 15845-15850

Thursday, July 13, 2017

A Possible Remedy for Depression

In the United States, some 5-7% of the population is clinically depressed in any given year. Over a lifetime, there are high odds that each of us has been depressed at some point. Sadly for seniors, the likelihood can increase with age.
A new treatment approach that combines mindfulness meditation and aerobic exercise seems promising. In a recent study, 22 patients clinically diagnosed with major depressive disorder were put on a treatment regimen that began with 30 minutes of mindfulness meditation followed by 30 minutes of aerobic exercise. Thirty people without depression symptoms served as a comparison group. In the meditation sessions, patients were told to focus on the present moment and on their slow, deep breathing, and to exclude mind-wandering and intrusive thoughts. Exercise was on a treadmill or stationary bicycle.
At the end of eight weeks, patients were assessed again for depression symptoms, and symptoms decreased on average by 40%. An electrically evoked brain-wave response characteristic of executive control function was notably increased in the clinically depressed group.
As with any illness, an ounce of prevention is worth a pound of cure. In the case of depression, two approaches can help. The first and foremost is to live a life of worthy purpose that gives life meaning and genuine pleasure. It is hard to be depressed when you believe that you make a positive difference in the lives of others. Of course, your efforts will fail from time to time, and people will not always value your efforts on their behalf. But you can take comfort in knowing that you mean well and are on the right track.
The second approach is to avoid the cues that remind you of negative experiences. I have written several related posts at this archived site (http://thankyoubrain.blogspot.com); type "depression" in the search field at upper right. I have argued that continual rehearsal of negative emotions, which can be done explicitly or implicitly, is the driver of clinical depression. As a neuroscientist, I know that rehearsal of thoughts and feelings strengthens the mediating synapses and circuits. Consciously rehearsing bad events and our depressive responses to them cements depression in neural circuitry.
So, it would seem important to focus on ways to block the retrieval cues. One solution that sometimes works is to change environments. Even if you don’t know what the depression cues are, you know they can somehow be embedded in the current environment and lifestyle. Maybe the problem is with some of the people you run around with. People who drag you down are not all that hard to spot. Avoid them. Maybe the problem is with your career or work environment, which has saddled you with too many depressing experiences. Staying in that environment assures that depression triggering cues will be encountered again.
It is not always feasible to change dealings with certain people, or the environment or lifestyle. You may not be able to change jobs or careers for economic or other practical reasons. In those cases, it helps to promote recall of happy experiences as a substitute.
Common experience and a great deal of formal research have shown the usefulness of "happy thoughts" as a way to boost positive mood. Here, the trick is to enhance recall of the buried memories of happy experiences. The same neural mechanisms involved in rehearsal and recall of depressing experiences are involved. Cues that recall happy experiences do so at the expense of cues that would trigger depressive feelings.
Recent research emphasizes the importance of memory as therapy for depression. Depressed patients were trained to use one of two memory techniques for strengthening the memory of happy events in their lives. Both methods were equally effective when recall was tested right after the training. But a week later, experimenters made a surprise phone call to each patient and asked them to recall the happy thoughts again. This time, clearly better recall occurred in the patients who had used the method of loci. If we can generalize these results, it means that patients can alleviate their depression by training their brains to be more effective at remembering positive events. Your life should be more satisfying and less depressing when you consciously train your brain to remember the good times.


Sources:

Alderman, B. L., et al. (2016). MAP training: combining meditation and aerobic exercise reduces depression and rumination while enhancing synchronized brain activity. Transl. Psychiatry. 6(e276). doi: 10.1038/tp.2015.225

Dalgleish, T., et al. (2013). Method-of-loci as a mnemonic device to facilitate access to self-affirming personal memories for individuals with depression. Clinical Psychological Science. Feb. 12. DOI: 10.1177/21677026112468111.


"Memory Medic" has four books on improving learning and memory:

For parents and teachers: The Learning Skills Cycle.
For students: Better Grades, Less Effort
For everyone's routine living: Memory Power 101
For seniors: Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine


For details and reviews, see Memory Medic's web site: WRKlemm.com

Monday, July 03, 2017

Memory Training Produces Lasting Effects



I first got interested in memory training at age 15 when my dad was a salesman for the Dale Carnegie leadership course, which included a section on memory training. My dad taught me some tricks that enabled me to memorize the gist of what was on every page of a magazine, by page number, in 30 minutes. I used to put on demonstrations for prospective enrollees. Before the recruitment meeting started, the leader would tell the audience, "Everybody see Billy here. Stand up Billy. I am going to give him this latest magazine issue, which he has never seen, and let him study it for 30 minutes. Then we will interrupt the meeting and you can ask him what is on any given page. Or you can tell him what is on a page, and he can tell you the page number." To my own astonishment, I could do it and it was not that hard. The basic gimmick was first to memorize a number code that converted page numbers into a visual image. For example, the code for 20 was "noose," as in a hangman's noose. Then I would convert the content on page 20 to an image or image series that captured the gist of the content. Then I would link the page-code image and the content image. For example, if the content on page 20 was about Elvis joining the army and his boot camp experiences, I would picture Elvis, guitar and costume, being trucked off in a military truck to a boot camp, where they put him through gymnastic exercises, marching, and simulated combat, and then they hung him. This idea and many other mnemonic devices are explained more fully in my book, Memory Power 101.
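For readers who want to see the mechanics, here is a minimal sketch of the peg-image idea (my own toy code, not the Dale Carnegie materials; apart from "noose" for 20, the pegs are illustrative choices):

```python
# Toy sketch of the peg-image mnemonic described above. The only peg taken
# from the text is 20 = "noose"; the others are illustrative, and any fixed
# number-to-image code would work.
peg_images = {
    18: "dove",
    19: "tub",
    20: "noose",
    21: "net",
}

memorized = {}   # page number -> vivid linking scene

def encode_page(page: int, gist: str) -> str:
    """Link the page's peg image with an image capturing the page's content."""
    scene = f"{peg_images[page]} linked with: {gist}"
    memorized[page] = scene
    return scene

encode_page(20, "Elvis trucked to boot camp, drills and marching, then the hangman's noose")

# Recall by page number...
print(memorized[20])
# ...or recall the page from the content by searching the stored scenes.
print([page for page, scene in memorized.items() if "Elvis" in scene])
```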
At the time, I wondered if this kind of mental exercise would have some sort of spill-over, lasting effect. Hopefully, it would help me in school. I think it did (I never made less than an A), but I never had an objective way to verify that.
Most readers have probably heard about "memory athletes," people who use mental imaging mnemonic devices to accomplish astonishing feats of memory. Such athletes can, for example, memorize in five minutes 550 words or the sequence of four shuffled decks of cards.
Until now, there were few studies of whether the brains of such athletes are changed in any lasting way by the memory training.
One indication of lasting change had been reported in London taxi drivers who were revealed by brain scans to have an enlarged hippocampus, a large paired structure in the brain that forms memories and also maps spatial locations (London streets are convoluted in their layout and notoriously difficult to learn).
A more direct test of brain change has recently been reported. In the first experiment, 23 of the top 50 world-ranked memory athletes were compared with matched controls of similar age, gender, and IQ. Brains were scanned in all subjects under two conditions: first, while they were relaxed and letting their minds wander, and second, while they were trying to memorize a list of 72 words.
Not surprisingly, the memory champions missed only two words on average when recalling the list 20 minutes later, whereas the controls missed nearly half. The brain scans also revealed distinctive patterns of connectivity among various brain regions in the memory champions.
Investigators then wanted to know if memory training of the controls would produce lasting changes in them. The controls were separated into three groups: one was asked to practice the method-of-loci memory technique for half an hour every day for six weeks. A second group practiced a very challenging working-memory task, the dual n-back, in which they had to track a sequence of spoken letters while also paying attention to the locations of a square moving on a computer screen, and identify whenever a letter or position matched one that had appeared earlier in the sequence. The last group just lived their normal lives, without memory training, for the six-week test period.
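To make the dual n-back task concrete, here is a toy scorer for a 2-back version (my own example sequences, not the training software used in the study):

```python
# Toy scorer for a dual 2-back trial (illustrative sequences, not the study's software).
# On each step the trainee hears a letter and sees a square at a grid position;
# a "hit" is any step whose letter and/or position repeats the one n steps back.
def dual_n_back_hits(letters, positions, n=2):
    letter_hits = [i for i in range(n, len(letters)) if letters[i] == letters[i - n]]
    position_hits = [i for i in range(n, len(positions)) if positions[i] == positions[i - n]]
    return letter_hits, position_hits

letters   = ["C", "H", "C", "K", "C", "K"]                      # spoken stream
positions = [(0, 1), (2, 2), (0, 1), (1, 0), (2, 2), (1, 0)]    # square locations
print(dual_n_back_hits(letters, positions))  # letters match at steps 2, 4, 5; positions at 2, 5
```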
When tested right after training on memorizing a random list of words, only the method-of-loci group showed improved memory. Comparison of brain scans before and after the six weeks revealed connectivity changes much like those of the memory champions. Also, the change in connectivity was a reliable predictor of how well they performed on the memory test. Moreover, the connectivity changes and improved memory ability persisted for at least four months afterward.
The authors of the study could not explain why dual n-back training had no lasting effect (other than making subjects better at n-back tests), which might have been expected because it is such a demanding task. But I think the reason is that n-back training involves a different aspect of memory that does not generalize to memorizing word lists.
Anyway, I feel better now that my teenage memory experiences have served me well in the succeeding years. This is consistent with what I learned about neuroplasticity as an adult neuroscientist: the brain has to change to store what you learn in memory. How that happens is explained in another book of mine, Mental Biology.

Sources:

http://www.worldmemorychampionships.com/

Dresler, M., et al (2017). Mnemonic training reshapes brain networks to support superior memory. Neuron, 93: 1-9.

Klemm, W. R. (2012) Memory Power 101. New York: Skyhorse.

Klemm, W. R. (2014). Mental Biology: The New Science of How the Brain and Mind Relate. New York: Prometheus.

See rave reviews of "Memory Medic's" books at WRKlemm.com
Available at Amazon, Barnes and Noble, and the publisher web sites.

Tuesday, June 20, 2017

Learning Stuff While Missing the Point

As a college professor for many decades, I am always amazed at how many students pass exams while having so little understanding. If I taught math, it would probably be different, because the task in math is to solve problems, which you can't do if you don't understand how to construct and solve the appropriate equations. But in most other subjects, it is remarkable how much students can learn with so little understanding.
This problem also exists in the real world outside of academia. I have recently become engaged as a volunteer tutor in our community's citizenship preparation class for immigrants. This past week the topic was George Washington, and the two instructors spent a lot of time teaching trivial things, such as when he was born, where he was born, what he was (general, president), the name of his home. Nothing was presented about his philosophy about freedom and government. I had to remind the teachers and the class that after he had done such a good job in his two terms as President, many citizens pressed him to become king. He, of course, refused. I don't know what he said to the petitioners, but I can guess he thought to himself, "We just spent years fighting where many of our fellows died to create a new country based on freedom. You turkeys missed the whole point. You didn't learn a damn thing."
During that same class period, the instructors taught about our holidays; that is, what and when they are, but not why they exist. For example, we talked about the Presidents' Day holiday. During the tutoring session, I asked the immigrants at my table why we celebrate all the Presidents, even though most of them had conspicuous human weaknesses, and many of them had views and policies that the immigrants would not have supported or voted for. Blank stares encircled our table. I had to remind everybody that we honor Presidents we don't like because more than half the country did like them. If you understand anything about freedom, you have to respect every President, because otherwise you disrespect over half the country and, worse yet, the principle of democratic government. Otherwise, you are leading the country down the jungle path of becoming a banana republic (which, of course, is what these Hispanic immigrants are used to).
There are real-world lessons today in the world of Trump. When you popularize the idea of his assassination and shout in rage "He is not my President," you are shouting at your fellow citizens who insist that he is their President and should be yours too. Dishonoring the man dishonors the office and the fundamental philosophy of our governing principles. This is vastly more important than knowing what was being taught about the holiday.
The right lessons about our government are apparently not being taught to citizens in our K-12 schools. Numerous polls have uniformly revealed that the typical high school graduate knows very little about U.S. history. School history textbooks are roundly criticized for inaccuracy, bias, and omissions. What little is learned concerns the flaws in our past, such as the treatment of Indians, slavery, and the Vietnam War. I have verified this in conversations with my grandchildren. The young people I talk to know nothing about the Federalist Papers. They have little appreciation for how creative the ideas in the Constitution were at the time and how they have had at least some impact everywhere in the world. They know very little about what our "greatest generation" did in World War II to save the world from despotism.
The larger point, the need to understand the factoids you are learning, applies in all aspects of life: school, workplace training, and relationships with people of different backgrounds. With everything we learn, we should get in the habit of asking ourselves certain questions:
· Do I understand what this means?
· How much can I learn from it, not just of it?
· What are the limitations of this information? Where is it wrong or incomplete?
· What are the implications of this information?
· To what good purpose can I put this information?
Understanding is much more demanding and valuable than just knowing. I might add as the "Memory Medic" that this perspective on learning makes it easier to remember what you learn. The best way to remember factoids is the thinking required to understand them.



"Memory Medic" has four books 

on improving learning and memory:



For parents and teachers: The Learning Skills Cycle.
For students: Better Grades, Less Effort
For everyone's routine living: Memory Power 101
For seniors: Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine

For details and reviews, see Memory Medic's web site: WRKlemm.com