
Thursday, January 11, 2018

Scoring Wisdom

Most people believe that one becomes wiser with age and experience, though people obviously vary across a wide spectrum from foolish to wise. We all have opinions about our own degree of wisdom compared to others, but is there an objective way to measure wisdom?
A group of researchers at U.C. San Diego believes that wisdom can be objectively measured. They tested their ideas on 524 adults, aged 25-104 years, selected from an ongoing longitudinal investigation called the Successful Aging Evaluation (SAGE) study. The study population involved nearly equal numbers of males and females, with more than three-fourths identifying as non-Latino white. A majority had some college education. The study was funded by three grants from the National Institute of Mental Health, the Veterans Administration, and the Stein Institute for Research on Aging.
The researchers developed a series of questions that focused on physical, cognitive, and psychosocial aspects of successful aging across the adult lifespan. Collectively, the answers provide a numerical index of wisdom that can be used to compare and judge people on the basis of presumed wisdom. Participants rated a set of statements, agreeing or disagreeing with each on a scale of one to five. The statements presumably tested the degree of wisdom, covering six specific domains: 1) prosocial attitudes and behaviors such as empathy, altruism, and social cooperation, 2) social decision-making/pragmatic knowledge of life, 3) emotional regulation, 4) reflection/self-understanding, 5) tolerance of diverse values, and 6) ability to deal effectively with uncertainty and ambiguity in life.
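The paper's actual scoring key is not reproduced here, but a generic Likert-scale index of this kind works roughly like the following sketch. The item ratings, domain names, and reverse-keying below are hypothetical illustrations, not the real SD-WISE items:

```python
# Sketch of generic Likert-scale scoring (hypothetical items, NOT the
# actual SD-WISE key): each statement is rated 1-5, reverse-keyed items
# are flipped, and domain means are averaged into an overall index.

def score_item(rating: int, reverse: bool = False) -> int:
    """Return the scored value of a 1-5 rating, flipping reverse-keyed items."""
    if not 1 <= rating <= 5:
        raise ValueError("ratings must be on a 1-5 scale")
    return 6 - rating if reverse else rating

def wisdom_index(domain_ratings: dict[str, list[tuple[int, bool]]]) -> float:
    """Average each domain's scored items, then average the domain means."""
    domain_means = [
        sum(score_item(r, rev) for r, rev in items) / len(items)
        for items in domain_ratings.values()
    ]
    return sum(domain_means) / len(domain_means)

# Illustrative ratings for three of the six domains; each tuple is
# (rating, reverse_keyed).
ratings = {
    "prosocial": [(4, False), (5, False)],
    "decision_making": [(3, False), (2, True)],  # second item reverse-keyed
    "emotional_regulation": [(4, False), (4, False)],
}
print(round(wisdom_index(ratings), 2))  # → 4.0
```

Averaging within domains first keeps a domain with many items from dominating the overall index.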
Factor analysis revealed that the scale reliably measured wisdom as defined by the questions. Thus, their questionnaire makes effective distinctions between individuals’ differing degrees of wisdom.
Limitations of the study are that responses were self-reported, not measured empirically by others. Also, the demographic was narrow (Caucasians with some higher education). Some of the assumptions could be questioned. For example, is a sense of well-being always a reliable indicator of wisdom? A person could feel good because of lucky circumstance or because of delusion. Is it always wise to be tolerant of diverse values, especially if it leads to political correctness run amok or acceptance of an evil that needs to be overcome? How wise is it to accept ambiguity if it means avoiding the hard work of solving important problems?
That brings us to wisdom itself, which is hard to define. However, we think we know it when we see it. Certainly we should seek to be wise, but not without a lot of hard thought about what that means.
The potential value of wisdom-scoring questionnaires is that they can have a teaching function of helping to show people what wisdom is by identifying its specific domains in a tangible way that could guide the striving for wisdom. Another value could be clinical evaluation of mental deterioration with age. Finally, such questionnaires could be used in screening people for suitability for admission into prestigious universities, hiring in industries requiring emotional and cognitive maturity, or acceptance into certain social groups. However, the judgmental use of such questionnaires opens the door to manipulation by the people taking the test and discrimination by those using the test results for personal judgment.
The researchers promote their "San Diego Wisdom Scale (SD-WISE)" as a new way to judge people. Society already has multiple ways to judge people: IQ scores, SAT scores, "likes" and "followers" on social media—and now on wisdom! Such indices have some valid uses, but the possibilities for abuse are enormous. Why are we always looking for ways to judge people? When people must be judged, why not emphasize what they actually do, not what their test score is?
Get the most out of life as you age. Get my e-book, "Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine." I present research evidence to show that doing the things that help your memory will also help your brain's general functions. You can delay and may even prevent age-induced mental decline. Authoritative, well researched and documented, this book provides in-depth explanations on topics such as brain aging, relationships of memory with other brain functions, how to reduce absent-mindedness, the diseases of aging, and diet and supplements that do and do not help memory. 
Source:


Thomas, M. L. (2017). A new scale for assessing wisdom based on common domains and a neurobiological model: The San Diego Wisdom Scale (SD-Wise). J. Psychiatric Res. Sep 8. DOI: http://dx.doi.org/10.1016/j.jpsychires.2017.09.005

Friday, December 29, 2017

On Making Right Choices

Our lives are filled with making choices. Sometimes we make reasoned choices and sometimes we make irrational choices. The drivers of irrational choices were examined in a series of studies by Daniel Kahneman and Amos Tversky, work for which Kahneman won the Nobel Prize in Economics. Their experiments showed that humans will make irrational choices when the cost-benefit relations are manipulated in certain ways. They established two generic modes of cognitive function: an intuitive mode in which judgments and decisions are made automatically and rapidly, and a controlled mode, which is deliberate and slower. Cost-benefit parameters need not involve money, but they often do, as in questions from "should I wait for the new cars to go on sale?" to "how much am I willing to save for retirement?"

I had the good fortune back in the early 1970s, when these Nobel Prize discoveries were being made, to be part of a team at Texas A&M that documented and elucidated the founding "behavioral economics" concepts. We used rigorously controlled experiments with rats in an economic environment where we commoditized their food and drink. Prices were set in terms of how many lever presses they had to make to get an item. By the way, they normally prefer root beer over Tom Collins mix (without the alcohol). But what they "bought" was readily manipulated by changing the cost and the amount of the item they could get. Under certain cost-benefit conditions, they made stupid choices, even to the point of making themselves sick. Our widely cited paper apparently stimulated drug companies' present-day use of our approach with lab animals to test new drugs for their potential to be addictive.
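The core analysis in experiments like these is a demand curve: how consumption falls as the lever-press "price" rises. A minimal sketch, using made-up numbers rather than our actual data, estimates demand elasticity as the slope of log consumption against log price:

```python
import math

# Hypothetical demand data (illustrative only): price = lever presses
# per unit of the commodity, quantity = units consumed per session.
prices = [1, 2, 4, 8, 16]
quantities = [100, 70, 45, 25, 12]

# Demand elasticity is the slope of log(quantity) vs. log(price);
# estimate it with an ordinary least-squares fit.
xs = [math.log(p) for p in prices]
ys = [math.log(q) for q in quantities]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
    (x - mx) ** 2 for x in xs
)

# A negative slope means consumption falls as price rises; a magnitude
# below 1 would indicate relatively inelastic demand.
print(f"estimated elasticity: {slope:.2f}")
```

With these invented numbers the estimate comes out near -0.76, i.e., somewhat inelastic demand, the pattern typical of staple commodities.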

A recent review of behavioral economics emphasizes that its foundational principles may help in the treatment of maladaptive choice-making, as occurs in preventive medical practice, drug addiction, obesity, and assorted compulsions. The heart of the matter is that such choices entail how much one values a desired target (like chocolate cake) and how much one values the future consequences of delaying or minimizing immediate consumption. Choices are irrational and maladaptive when a person is inadequately sensitive to long-term consequences and is controlled mostly by immediate desires.

Choices are a gamble. You can't know for certain you have made the right choice. But being paralyzed with indecision is no solution. Reason helps you understand the odds.

The mind is a strange and wonderful thing. 
Read about it in my book, Mental Biology.

Sources:
Jarmolowicz, D. P., Reed, D. D., Reed, F. D. G.,  and Bickel, W. K. (2015) The behavioral and neuroeconomics of reinforcer pathologies: Implications for managerial and health decision making. Managr. Decis. Econ. DOI: 10.1002/mde.2716


Kagel, J. H., Rachlin, H., Green, L., Battalio, R. C., Basmann, R. L., and Klemm, W. R. (1975) Experimental studies of consumer demand behavior using laboratory animals. Economic Inquiry. 13, 22-38.

Friday, December 15, 2017

The Production Effect in Memory


When you encounter new information and want to remember it, the formation of a memory is enormously affected by what happens immediately afterwards. The most common problem is that you think of something else, and that something else erases what you just learned from your working memory “scratchpad” before it has time to set up in long-term memory. The way to avoid this problem is to do something with the new learning right away. Memory scientists often call this a “production” effect. That is, if you produce something from the new learning right away, it not only reduces interfering distractions but also strengthens the encoding and speeds up memory formation into lasting form.

Common production activities include using the new information in some kind of activity, such as solving a problem. This option is not always available, but there are other approaches, such as hearing the same information at the same time you read or see it. The most common production method may be taking handwritten notes during the presentation of the new information. I have noticed that many college students do not take notes or have poor note-taking skills. Many apparently have not been taught how to take notes.

A recent study compared the effects of silently reading, hearing somebody else read aloud, hearing a recording of yourself reading aloud, and actually reading aloud at the time of learning. The last two conditions tested whether the actual mouth and tongue movements of reading aloud had any effect. They do.

The study divided 75 college students into these four groups, in which they participated in two 15-minute sessions separated by two weeks. In the first session, they read a list of 160 words presented one at a time on a computer screen. They were to see each word and say it aloud into a microphone. They were not told why they were recording the sound of the words, nor what was to happen in the return session two weeks later.

In the follow-up session, students were randomly presented 20 of the words from the first session, according to the four groups (read silently, hear another say the words, hear the self-recording, or actively say each word). Immediately after this, students took a self-paced recognition test to identify how many of the study words were recognized.

Upon testing, a clear gradient of improvement was evident with increasing production: poorest recognition occurred with silent reading, and best recognition occurred with actively saying the words.

Why is reading aloud more effective than hearing yourself or others read? The authors concluded that the self-reference and self-control involved in speaking produce more engagement with the words. The deeper the engagement, the better the memory. They also credit self-reference for why rehearsal helps memory formation: we do it ourselves, in our “mind’s ear.”

I think there are other implications of these findings. Other research establishes that rehearsal should involve forced recall, rather than just passively looking over the study material. The data shown here suggest that rehearsal would be even more effective if we forced ourselves to recall by stating the material out loud.

Note that this study measured recognition memory. This is similar to what students do when taking a multiple-choice test: they are given prompts to see if they recognize the correct choice. This is much less demanding than requiring the student to generate the right answer “from scratch,” as in a fill-in-the-blank type of question. I would expect that open-ended testing would reveal an even greater benefit from production effects, such as reading aloud. Optimal benefit would probably come from reading aloud from notes that the student took at the time of initial exposure to the new information.

Source:


Forrin, N. D., and MacLeod, C. M. (2017) This time it’s personal: the memory benefit of hearing oneself. Memory. DOI: 10.1080/09658211.2017.1383434.

Sunday, October 29, 2017

How to Learn Critical Thinking. Learning How to Think Critically Makes You Smart.

Some readers may think you have to be smart to think critically. But the corollary is that learning how to think critically makes you smart. The assumption is that one can learn to think critically, and that assumption is correct. Here, I hope to show how you can become smarter by learning critical-thinking skills.

Require Yourself to Think Critically


When you read or listen to others talk, force yourself to become more attentive and engaged with the information. Asking questions ensures engagement.

Learn and Look for Common Thinking Errors


Unfortunately, most adults are not taught formal logic, even in college. College logic courses are electives and are made confusing by abstruse premises, propositions, and equations. But common-sense logic can suffice. I have posted a list of common thinking errors elsewhere (1). Here are some of the more serious ones:

APPEAL TO AUTHORITY OR CONSENSUS: attempting to justify the conclusion by quoting an authority in its support or on the basis of how many people hold the same view.

ARGUMENT SELECTIVITY: glossing over alternative perspectives (often called “cherry picking”). It is not only fair but usually helpful to include opposing positions when making arguments to support a position. Opposing arguments, even when wrong overall, usually contain some grain of truth that needs to be accommodated.

CIRCULAR REASONING: reasoning where the premise of an argument or a conclusion is used as support for the argument. Usually this happens when evidence is missing or glossed over.

COGNITIVE SHORTCUT BIAS: doggedly sticking with a favored view or argument for a position, when other more fruitful possibilities exist. Even chess masters, for example, may use an established gambit when a better tactic is available.

CONFUSING CORRELATION WITH CAUSATION: asserting that when two things happen together, especially when one occurs just before the other, one thing causes the other. Without more direct evidence of causation, this assumption is not justified. Both events could be caused by something else. Example: rain and lightning go together, but neither causes the other.

EXCLUSIVITY CONFUSION: failure to recognize elements of compatibility in multiple apparently conflicting ideas or facts. It is important to know whether they are independent, compatible, or mutually exclusive. Example: concepts of evolution and creationism, as they are typically used, are mutually exclusive. However, stated in other ways, they have shared elements of agreement.

FALSE ANALOGY: explaining an idea with an analogy that is not parallel, as in comparing apples and oranges. While analogies and metaphors are powerful rhetorical tools, they are not equivalent to what they reference.

JUMPING TO CONCLUSIONS: using only a few facts for a definitive conclusion. The most common situation is failure to consider alternatives. An associated cause is failure to question and test assumptions used to arrive at a conclusion.

OVER-GENERALIZATION: assuming that what is true for one is true for something else. Example: some scientists studying free will claim that the decision-making process for making a button press is the same for more complex decisions.

Learn Specific Strategies


Be Aware of Your Thinking. Think about how you think. This is the art of introspection: being aware of such things as one's own degree of alertness, attentiveness, bias, emotional state, openness to alternative interpretations, and self-assurance.

Train Yourself to Focus. In today's multi-tasking world, students commonly lack the ability to concentrate. They are easily distracted, don't listen well, and are not very effective at extracting meaning from what they read.

Use Evidence-Based Reasoning. Don't confuse opinion with fact. When others make a claim, don't accept it without supporting evidence. Even then, look for contrary evidence that may have been omitted.

Identify What Is Missing. In conversation or reading, the most important points may be what is not stated. This is especially true when someone is trying to persuade you of their viewpoint.

Ask Questions and Provide Your Own Answers. I had a professor at Notre Dame, C. S. Bachofer, who built a whole course on this principle. For every reading assignment, he required the students to ask a provocative question about the reading and then write how it might be answered. Fellow students debated each other's questions and answers. Developing this as a thinking habit will ensure that you become a more critical thinker, learn more, and provide some degree of enlightenment to others with whom you interact.


Professor Klemm is author of a 2017 book, "The Learning Skills Cycle: A Way to Rethink Educational Reform." New York: Rowman & Littlefield.
https://rowman.com/ISBN/9781475833225/The-Learning-Skills-Cycle-A-Way-to-Rethink-Education


(1) Klemm, W. R. (2014). Analytical thinking—logic errors 101. http://thankyoubrain.blogspot.com/2014/10/analytical-thinking-logic-errors-101.html

Saturday, September 23, 2017

Aging Shrinks the Brain

In most people, the brain gets smaller with age. It is not so much that neurons die but that their terminals and synaptic junctions shrivel. A known cause is over-secretion of cortisol under stress, but there may also be other age-related causes.
However, shrinkage with age is not inevitable. Certain people are "super-agers," defined as adults over 80 with memory at least as good as that of normal middle-aged adults. A usually reliable index of decline in memory ability is the degree of brain shrinkage, specifically loss of cortical volume. Brain-scan studies show that super-agers have thicker layers of cortex than do others of the same age. Thus, either their cortex has not shrunk as much as in the average elderly, or they had more to start with. It is possible that something about the lifestyle of super-agers protected them from brain atrophy, but there is no convenient way to know how much cortical volume the elderly had in their youth. The second option, however, has been tested in a study that compared the rate of cortical aging in 36 adults averaging 83 years of age. The investigators recruited super-agers and normal elderly and tested them in an initial visit and again 18 months later. Cognitive and memory tests and brain scans at both visits provided a basis for tracking the rate of aging.
Super-agers scored higher on cognitive and memory tests than the average group at both the beginning and end of the study period. This suggests that they may have been endowed with more mental capability when they were young. But it also indicates that super-agers are more resistant to age-induced mental decline. The two groups did not differ in any other neuropsychological measures, education, or estimated IQ.
A clear difference in cortical volume change emerged between the two groups. The average-memory group had over twice as much cortical shrinkage over the 18 months as did the super-agers. Some in the average group lost as much as 3.4% of cortical volume per year. If that continued over the next 10 years, they would suffer a devastating loss of over 30% in cortical volume.
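That projection is easy to check. A quick sketch of the arithmetic (illustrative only, not from the study) compares the simple additive projection with compounding, where each year's loss comes out of what remains:

```python
# Projecting a 3.4%-per-year loss of cortical volume over 10 years.
rate, years = 0.034, 10

linear_loss = rate * years * 100                  # additive: 3.4% x 10
compound_loss = (1 - (1 - rate) ** years) * 100   # each year shrinks the remainder

print(f"linear: {linear_loss:.1f}%, compound: {compound_loss:.1f}%")
```

Either way the answer is close to a third of cortical volume: 34% additively, about 29% compounded.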
Unfortunately, the study did not examine the lifestyles of the two groups. The super-agers may have just had good genes, or may have been more mentally active over their lifetimes and had healthier diets, more exercise, and less stress than those in the average group. Notably, some shrinkage did occur in the super-agers, at an average rate of 1.06% per year. Yet they still scored as well as average 50-year-olds on various cognitive and memory tests. It is possible that some shrinkage is a good thing, reflecting perhaps a pruning of neural circuitry as the brain learns and develops more efficiency. Pruning is a conspicuous phenomenon in the brains of the fetus and infant as maturation progresses. Obviously, too much pruning can leave neural circuitry with insufficient resources.
These results also emphasize that age discrimination is not defensible. Each elderly person's mental competence has to be judged on its own merits, not on a negative stereotype of the elderly.

Sources:

Rogalski, E. J. et al. (2013) Youthful memory capacity in old brains. J. Cognitive Neuroscience. 25(1), 29-36.


Cook, Amanda H. et al. (2017). Rates of cortical atrophy in adults 80 years and older with superior vs. average episodic memory. JAMA. 317(13), 1373-1375.

Wednesday, September 13, 2017

Teaching Children to Be Honorable

No one is born honorable. If you doubt me, just watch little kids at play. Scholarly research has confirmed the point. Michael Lewis, a prominent psychologist at Rutgers, has conducted many studies of how children naturally lie and deceive and are even encouraged to do so by well-meaning parents. Certain ways of expressing emotion are taught as acceptable while others are not. One experiment, for example, showed that children are usually taught to express sadness when their mother leaves them with a babysitter. But in reality the children were not sad and recovered quickly after the mother left. Other examples are how children are taught to express responses to minor insults or injuries. Some kids are taught that it is all right to overreact.
Lewis and his colleagues conducted some classic experiments with young children that revealed how their tendency for false behavior changes with age. They secretly videotaped children in an honesty test in which the children were told not to peek at a toy that was placed behind them. Each child was told that the adult had to leave the room for a few minutes, but that when she came back the two would play with the toy.
One hundred percent of children as young as two peeked at the toy. As evidence that self-control grows with age, only 35% of six-year-olds peeked. However, lying seemed to increase with age as children learned to perceive a benefit from it. When asked if they peeked, 38% of two-year-olds lied. But among six-year-olds, one hundred percent of peekers lied about it. Boys generally had less self-control in resisting peeking, but no sex differences occurred in the extent of lying.
Clear correlations were seen with other aspects of cognitive function. For example, how quickly a child yielded to the temptation of peeking varied with IQ. Those who peeked sooner had lower IQ scores. They also had less emotional intelligence, that is, were less able to name the emotions revealed by pictures of human faces and less able to predict the kind of emotion that would be generated by certain experiences.
However, the lying varied directly with IQ and emotional intelligence. Smarter kids were more likely to lie. Moreover, Lewis and others contend that lying and deception are normal and good. Lying seems to be associated with pro-social behavior and with creativity.

Doing What Comes Naturally

Lewis and others think that children should not be condemned for their dishonorable behavior. It comes from self-serving biology. Human weakness is most evident in children, and they will often do things they know they should not do.
It is hard to know how a child really feels, because the parents are continually teaching them how they should express feelings and reactions to life events. When children become adults, the lifetime of conditioning about expressing emotions creates problems for mental health workers to treat patients because true feelings are so buried and masked.
Children learn to think and behave in untruthful ways for three reasons:
1.      Avoid negative consequences or punishment.
2.      Protect the ego from assaults on their sense of self-worth or confidence.
3.      Benefit oneself or take advantage of others.
Children also learn self-deception at rates that vary with age. The development of self-esteem is at play here. A child learns to avoid or minimize honest judgments that unnecessarily diminish their self-esteem. At the same time, a child could learn that honest self-appraisal serves the useful purpose of avoiding future mistakes or taking some necessary action.
Experimentally, pretend play provides a paradigm for studying self-deception. Very young children imitate the actions of others around them. As they get a little older, they pretend that one toy is doing something with another toy, as for example, toy soldiers engaging in battle.
Pretend play begins at around age one. Lewis gives the example of a one-year-old pretending to talk on the phone after seeing his mother do so. By age two or three, the child might pretend that her doll is talking on the phone. By three years of age, a child is able to consider success or failure of the pretend scenarios and to assign blame or credit for them. At this point, self-conscious emotions have emerged that lead to shame in failure and pride in success.
Children readily learn to seek self-benefit and to take advantage of others, as when a child lies about a misdeed and blames it on an innocent, such as a sibling. Unfortunately, little research seems to have been done on the childhood development of this level of dishonesty. How does it change with age? What factors promote it? Or mitigate it? The social consequences are profound.
Children are biologically wired to behave falsely. Where do they learn moral values and respect for truth? Traditionally, this was in houses of worship. But as many parents have left formal religion, this teaching is increasingly absent. Yet we know that the teaching of children has lasting effects, good or bad. Both the Jesuits and Lenin have claimed, "Give me a child until age seven, and I have that child for life."
Knowledge and life experience do change what a person thinks as true. Highly intelligent kids can figure a lot of this out on their own. But they need to question, and most humans are prone to take things at face value. This formed the basis of the life of Socrates, whose mission was to show people the importance of asking questions and introspection. In my decades of teaching at the college level, I have learned that most students are intellectually compliant and do not question. Maybe they are indoctrinated to be this way. After all, they have had 12 years of taking multiple-choice quizzes where each question is deemed to be the right question that has only one correct answer.
In most cases, behaving untruthfully is stupid, because we may eventually get caught. When that happens, who will trust us again? We are made still more stupid by life experiences that create the illusion that clever false behavior works. We learn our counterproductive attitudes and behaviors, and worse yet, we reinforce them by repetition and turn them into bad habits. We do stupid things because our brains have been programmed by our learning experiences to keep doing things that are not in our best long-term interest.
Michael Lewis and some other psychologists think childhood lying, deception, and other forms of untruthfulness are normal developmental features that even help children become more emotionally and socially competent as adults. I vigorously disagree. The price to be paid for accepting childhood dishonor is that children learn dishonorable habits. Children usually have a selfish reason for being untruthful and dishonorable. If parents and other adults do not correct such behavior as it occurs, because they believe it is normal for that age, children can become spoiled brats who grow up to be self-absorbed adults who feel entitled, make excuses, shift blame to innocents, and accept and spread false narratives. They may demand "safe spaces," or even react violently when their views are not accepted by others.

Why Honor Needs to Be Taught Early

Social pathologies are rooted in flaws of conscience. These flaws begin in childhood, for it is in childhood that people develop, or fail to develop, their conscience. What’s right and what’s wrong must be taught. In an era where the divorce rate is over 50%, where out-of-wedlock births exceed 70% in some racial groups, and where more and more kids are raised by a single mom, it is not surprising that this country has so much dysfunctional behavior and crime. I read somewhere that the U.S. has a higher percentage of its people in prison than any other country. Kids learn their values from somewhere; if not from a loving two-parent family, then from a gang of peers. Lack of conscience causes much youth violence, peer cruelty, stealing, cheating, sexual promiscuity, and substance abuse.
Children have a particularly hard time knowing their feelings and limitations. That is why adult nurturing is so important to help children learn how to “grow up.” It is also why children, especially teenagers, are so prone to angst and poor choices. Indeed, a critical element of growing up is self-awareness, seeing things as they really are, and resolving the things that cause problems and unhappiness. All children develop attitudes, emotions, beliefs and behaviors that need some degree of correction. In that sense, all children have a “bad brain,” and they need to be taught how to develop it constructively.

Childhood Training

Children have to learn right from wrong and then develop the discipline and character to do the right. I have a new book that should inspire readers to include in their life purpose a pursuit of truth that goes beyond the usual casual and superficial level. Each of us needs and should want to know how to detect, understand, and deal with dishonorable behavior in others, lest others betray or exploit us. Likewise, each of us needs to understand our own character weaknesses, so that we can become better, more respected, and trusted people. Children need to know the seven deadly forms of untruthfulness, which I identify as:
Lying
Deception
Pretension
Denial
Cheating
Withholding
Delusion
A useful acrostic might be: “Low-Down People Don’t Care What (they) Do”
How does one teach honorable behavior? First, note the assertion that honor is something a person can learn to embrace. If parents don’t teach right and wrong, children may not learn it. Teaching anything can involve a mixture of positive and negative reinforcement. For misbehaving children, spanking used to be the standard remedy. That does not always work, and is subject to abuse. Alternatives include withholding privileges. Young people frequently think they have a right to much of what they want in life. A loving parent may need to take away these “rights” and dole them back out as “privileges” at the pace that children grow morally.
A parent can structure rewards for good behavior. You can keep score on a calendar or in a notebook of progress at developing a certain desirable behavior. The Boy Scout idea of doing a good deed every day has great merit. Too bad it is considered old-fashioned. The merit-badge concept in both Boy and Girl Scouts is sound psychological practice for instilling the desire to do the right thing and get recognition for it.
The best teaching is not telling but prompting children to question what is appropriate. For example, a parent who finds her child cheating in school should ask, “Have you thought about what might be wrong with cheating?” Or a child who steals another kid’s lunch money should be challenged with, “Was it fair for you to take that money? Would you mind if another kid took your lunch money?” Or when a child is caught in a lie, a parent could say, “Please don’t lie to me. I need to trust you. Don’t you want me to be able to trust you?”
Of course, the best approach is to prevent wrong behavior by setting a good personal example. We all know that children learn more from what we do than what we say. From family and personal relationships to practicing one's religion, what could be more destructive than hypocrisy?
Seeking recognition or rewards for good character is itself an unworthy motive. We should do the right thing for the right reasons. When a child's reason is self-serving, such as seeking praise or reward, a child may be deceiving herself and others about her real character.

In today's world, we are becoming accustomed to concealment, half-truths, misrepresentation, spin, fake news, and other forms of untruthfulness. Social media spread these distortions like a viral epidemic. I have a new inexpensive e-book at Amazon that addresses the issues: To Tell the Truth. Save Us from Concealment, Half-truths, Misrepresentation, Spin, and Fake News. This book aims to show why truth matters, identifies seven kinds of falsehood, explains the common causes, and suggests many ways we can reduce the falsehoods we commit. A concluding chapter presents an ethics model that can be used for a variety of real-world situations.
Reference

Lewis, Michael. (2015). The origins of lying and deception in everyday life. American Scientist. 103: 128-135.

Saturday, August 26, 2017

Do We See the World Like a Movie?

We have the feeling that we experience the world as a continuously sampled data stream. If we perceive multiple objects or events seemingly at the same time, we may actually be multiplexing the several data streams; that is, we take a sample from one data stream, switch to take a sample from the next stream, and so on, all on a millisecond time scale.

But another possibility is that we perceive objects and events like a movie frame, where the brain takes working-memory snapshots and plays them in succession. Like still frames in a movie, if played at a high-enough speed, the frames will blend in our mind to give the illusion of continuous monitoring.

In either case, we have to account for working memory. That is, we can only hold a small amount of information in working memory at any one instant, as in being able to dial a seven-digit phone number you just looked up. In the phone-number case, does the brain accumulate and buffer the representation of each integer until reaching the working-memory holding capacity and then report it to consciousness as a set? Or is each integer transferred to consciousness and concatenated until the working-memory capacity is filled?
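These two possibilities can be caricatured in code. This is only an illustrative sketch of the two hypotheses, not any published model; the seven-item capacity follows the phone-number example above.

```python
WM_CAPACITY = 7  # the seven-digit phone number from the example above

def batch_transfer(digits, capacity=WM_CAPACITY):
    """Hypothesis 1: digits accumulate in an unconscious buffer and
    reach consciousness only once, as a complete set."""
    buffer = list(digits[:capacity])   # unconscious accumulation
    return [buffer]                    # a single conscious event

def incremental_transfer(digits, capacity=WM_CAPACITY):
    """Hypothesis 2: each digit is passed to consciousness as it
    arrives and concatenated there until capacity is filled."""
    conscious, events = [], []
    for d in digits[:capacity]:
        conscious = conscious + [d]    # one conscious event per digit
        events.append(conscious)
    return events

number = [5, 5, 5, 1, 2, 3, 4]
print(len(batch_transfer(number)))       # 1 conscious event
print(len(incremental_transfer(number))) # 7 conscious events
```

Either strategy ends with the same seven digits available; they differ only in how many times consciousness is updated along the way, which is the question the experiments below bear on.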

A profound recent model of perception addresses the issue of continuous versus movie-like perception, but unfortunately it did not take working memory into consideration. The model did address the issue of how consciousness integrates the static and dynamic aspects of the object of attention. For example, when viewing a white, moving baseball, consciousness apparently tracks both the static white color and shape of the ball and its movement at the same time. Are these two visual features bundled together and made available to consciousness on a continual basis or as a batch frame?
A related issue is the so-called flash-lag illusion. Displaying a moving object and a stationary light flash at the same time and location creates the illusion that the flash is lagging. There is some debate over why this happens, but it does argue against continuous monitoring of linked objects.

Another phenomenon that argues against continuous monitoring is the “color phi” phenomenon. Here, if two differently colored disks are shown at two locations in rapid succession, a viewer perceives just one disk that moves from the first location to the second, and the color of the first disk changes along the illusory path of movement. But the viewer cannot know in advance what the color and location of the second disk are. The brain must construct that perception after the fact.
Another way of studying fusion phenomena is to show two differently colored disks in rapid succession at the same location. In this case, an initial red disk followed by a green disk will be perceived as only one yellow disk. A viewer cannot consciously recognize the individual properties if there is not enough time between the two disks. This suggests that information is batch-processed unconsciously and later made available to conscious awareness. Transcranial magnetic stimulation can disrupt the fusion, but only for about 400 milliseconds after the first stimulus, when presumably the processing is unconscious. Since the presentation of the two disks takes only about 60 msecs, the unconscious processing of the fusion must take some 340 milliseconds before the results become available for conscious recognition.
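A minimal sketch of this temporal fusion might look like the following. The 100-msec fusion window and the additive color mixing are my own illustrative assumptions; only the idea that stimuli arriving too close together merge into one percept comes from the study described above.

```python
def fuse(c1, c2):
    # additive color mixing, clipped at 255 (red + green -> yellow)
    return tuple(min(255, a + b) for a, b in zip(c1, c2))

def perceive(stimuli, fusion_window_ms=100):
    """Stimuli are (onset_ms, rgb) pairs.  Any stimulus arriving within
    the fusion window of the previous percept is merged into it."""
    percepts = []
    for t, color in stimuli:
        if percepts and t - percepts[-1][0] < fusion_window_ms:
            last_t, last_color = percepts[-1]
            percepts[-1] = (last_t, fuse(last_color, color))
        else:
            percepts.append((t, color))
    return [color for _, color in percepts]

RED, GREEN = (255, 0, 0), (0, 255, 0)
print(perceive([(0, RED), (60, GREEN)]))   # [(255, 255, 0)] -- one yellow disk
print(perceive([(0, RED), (500, GREEN)]))  # two separate disks
```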

Similar fusion can occur with other sense modalities. For example, the “cutaneous rabbit” effect is a somatosensory fusion illusion in which touch stimulation first at the wrist, followed quickly by stimulation near the elbow, produces the feeling of touch along the nerve pathway between the two points, as if a rabbit were hopping along the nerve. There is no way for the conscious mind to know the pathway without the second touch near the elbow actually occurring. Perception of that pathway information is delayed until the information has been processed unconsciously.

So while these examples argue against continuous conscious monitoring of sensation, they don’t really fit well with the movie-frame idea either. We can distinguish two visual stimuli only 3 msecs apart, but a snapshot model that samples stimuli, say, every 40 msecs would miss the second stimulus. So to reconcile these conflicting possibilities, the authors advance a two-step model in which sensations are processed unconsciously at high speed, but the conscious percept is reported periodically, or is read out when unconscious activity reaches a certain threshold or when there is top-down demand. This fits data from others showing that conscious awareness is delayed after the actual sensory event. For visual stimuli, this delay can be as long as 400 msecs.
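The difference between a naive snapshot model and the two-step model can be made concrete with a toy simulation. The numbers here (1-msec unconscious resolution, a 40-msec frame, a 400-msec readout delay) are taken loosely from the figures quoted above; the code is an illustration, not the authors' model.

```python
def snapshot_percepts(event_times_ms, frame_ms=40):
    """Naive snapshot model: events falling into the same 40-msec frame
    are fused, so two flashes 3 msecs apart register as one percept."""
    return len({t // frame_ms for t in event_times_ms})

def two_step_percepts(event_times_ms, resolution_ms=1, readout_delay_ms=400):
    """Two-step model: unconscious processing resolves events at high
    temporal resolution, but the finished percept reaches consciousness
    only after a delay."""
    distinct = len({t // resolution_ms for t in event_times_ms})
    conscious_at = max(event_times_ms) + readout_delay_ms
    return distinct, conscious_at

flashes = [10, 13]                  # two stimuli 3 msecs apart
print(snapshot_percepts(flashes))   # 1 -- the second flash is lost
print(two_step_percepts(flashes))   # (2, 413) -- both resolved, reported late
```

The two-step version preserves the fine temporal discrimination that the snapshot version throws away, at the cost of a long lag before conscious report, which matches the delayed-awareness data above.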

Here the question of interest is why sensory awareness might require a mixture of continuous monitoring and periodic reporting of immediately prior data segments. Continuous monitoring and processing permits high temporal resolution. Snapshot reporting conserves neural resources because information accumulates as a batch (a few bytes) before becoming available to consciousness. The really interesting question is what, if anything, happens to that string of movie-like snapshots that are captured in consciousness. How do these frames affect subsequent unconscious processing in the absence of further sensory input? Can unconscious processes capture and operate on the frames of conscious data? Or can successive frames of conscious data be processed batch-wise in consciousness? A useful analogy might be whole-word reading. A beginning reader must sound out each letter in a word, which is comparable to the high-resolution time tracking of sensory input. However, whole-word reading allows the more efficient capture of meaning because meaning has been batch pre-processed.

How do these ideas fit with the claim of other scholars that consciousness is just an observer witnessing the movie of life as it occurs? That assumption ignores the role that consciousness might have in reasoning, making decisions, and issuing commands. I argue this point elsewhere in my books, Mental Biology and Making a Scientific Case for Conscious Agency and Free Will.

Research claimed to show that free will is illusory needs reinterpretation in light of this two-step model of perception. Those experiments typically involved asking a subject to make a simple movement, such as pressing a button, whenever they “freely” want to do so. They are to note when they made the decision by looking at a large, high-resolution clock. At the same time, their brain activity is monitored before, during, and after the chain of events.

The first event is the intention to press the button. Intention is a conscious event. Was it preceded by unconscious high-resolution processing? If so, what was the need for high resolution? Or maybe this is just the way the brain is built to operate. The button-press decision-making is a slow, deliberative process, which perhaps could be handled consciously as a slow progression of successive frames of conscious thought. Critics may say that there is no such thing as conscious processing, but there is no evidence for that conjecture. Once an intent is consciously realized, the subject is now thinking about when to make the press. This decision may well be determined unconsciously, but again there is no need for high temporal resolution. Moreover, there are intervening conscious steps, where the subject may think to himself, “I just did a press. Shouldn’t I wait? Is there any point in making many presses with short intervals? Or with long intervals? Or with some random mixture?” Is each of these questions answered by the two-step model of sensory processing? However the decision developed, corresponding brain electrical activity is available to be measured.

Then, there is the actual button press, the conscious realization that it has occurred, and the conscious registration of the time on the clock when the subject thought the decision to press was made. Does the two-step model apply here? If so, there must be considerable timing delays between what actually happened consciously in the brain and when the subject eventually realized those conscious thoughts.

The point is that the two-stage model of perception may have profound implications beyond sensation that involve ideation, reasoning, decision-making, and voluntary behavior. I have corresponded with the lead author to verify that I have a correct understanding of the publication. He said that his group does plan to study the implications for working memory and for free will.

Source:

Herzog, M. H., Kammer, T., and Scharnowski, F. (2017). Time slices: What is the duration of a percept? PLoS Biology. April 12. http://dx.doi.org/10.1371/journal.pbio.1002433


Tuesday, August 15, 2017

Is Your Brain Older Than You Are?

"You are as old as you think you are," the saying goes. Well, not quite. You, that is, the inner you in your brain, are as old as your brain is. But your brain age may or may not correlate with your chronological age.

The other day at my gym workout, I again saw a young black guy, built like Captain America, whose workout schedule sometimes overlaps with mine. We had not met, and out of the blue he came up to me and said, “You are my inspiration. You inspire me to be able to work out like you when I get your age.” Wow! I inspire somebody! Then my balloon popped when I realized that he knew I was old just by looking at me. My body may not look like I’m 83, but I guess hair loss and the lines in my face betray me.

The point of this story is that bodily organs do not all age at the same rate. Skin ages rather conspicuously in most older people. Specific organs may age at different rates depending on what they have been exposed to, for example, skin and sun, liver and alcohol, lungs and smoking, or fat tissue and too many calories. The brain may age more rapidly than other organs if you damage it with drugs or concussion, or clog its small arteries with high cholesterol, or shrivel its synaptic connections through lack of mental stimulation or failure to cope with stress.

Is there some biological equivalent of tree rings to show how old your brain actually is? A scientist at Imperial College London, James Cole, is developing an interesting approach for estimating brain age. Moreover, the technique seems to predict approximately when you will die.

In the study thus far, MRI brain scans were taken of 2,001 people between 18 and 90 years of age. A computer algorithm evaluated these scans to construct a frame of reference for what is normal for a given age. Then the scans from 669 adults, all born in 1936, were compared against the norms to determine whether the 81-year-old brains were normal for that age.

The people whose brains were older than normal performed more poorly on fitness measures such as lung function, walking speed, and fluid intelligence. They also had increased risk of dying sooner. Predictions became more reliable when the brain-scan data were combined with the methylation of blood DNA, a marker of life experience effects on gene expression.
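The logic of a brain-age comparison can be sketched as a simple regression exercise. The scalar "atrophy score" below is invented for illustration; the actual study applied a far more elaborate algorithm to whole MRI scans.

```python
import random
import statistics as st

random.seed(0)

# Normative sample: an invented scalar "atrophy score" that grows
# with chronological age plus noise.
ages = [random.uniform(18, 90) for _ in range(2001)]
scores = [0.5 * a + random.gauss(0, 2.0) for a in ages]

# Least-squares line predicting age from the scan score: the "norm".
mean_s, mean_a = st.fmean(scores), st.fmean(ages)
slope = (sum((s - mean_s) * (a - mean_a) for s, a in zip(scores, ages))
         / sum((s - mean_s) ** 2 for s in scores))
intercept = mean_a - slope * mean_s

def brain_age(score):
    return slope * score + intercept

# An 81-year-old whose scan looks like a typical 90-year-old's:
gap = brain_age(0.5 * 90) - 81
print(f"brain-age gap: {gap:+.1f} years")  # positive = older-looking brain
```

A positive brain-age gap flags a brain older than its owner's chronological age, which is the quantity the study found to predict poorer fitness and earlier death.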

Another group of workers at UCLA had determined that these kinds of gene changes predict the risk of mortality. This group, headed by Steve Horvath, evaluated these gene expression changes in various tissues of a 112-year-old woman and found that her brain was younger than her other tissues. A "young" brain will help you to live longer and also have a better quality of life.

There are two take-home implications of such research. The first is that lifestyle and environmental influences affect one's aging and that not all tissues age at the same rate. The second is that it may now be possible to test which interventions to slow brain aging actually work. We currently think brain aging is slowed by exercise, antioxidants, healthy diets, and reduced stress. Having objective measures for aging in general, and brain aging in particular, will help us decide how well such preventive measures work. There is also the possibility that such measurement tools may help us identify who is aging too fast and why, which in turn may lead to better therapy.

While we wait on technology, there is one symptom of excessive brain aging we can all notice: memory loss. As the title of my book suggests, memory is the canary in your brain's coal mine.


Get the most out of life as you age. You can slow brain aging by following the advice in Memory Medic's inexpensive e-book, "Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine." It is available in Kindle at Amazon and all formats at Smashwords.com.



Sources:

Kwon, Diana (2017). How to tell a person's "brain age." The Scientist. May 22.


Cole, James H. et al. (2015). Prediction of brain age suggests accelerated atrophy after traumatic brain injury. Annals of Neurology, 77(4), 571-581. doi: 10.1002/ana.24367. http://onlinelibrary.wiley.com/doi/10.1002/ana.24367/full