
Sunday, November 11, 2018

Two New Discoveries to Explain Why Exercise Is Good for You

Have you noticed that so many elderly people seem frail, walk slowly, and seem to lack energy? If this describes you, you can hardly avoid noticing it. These problems are preventable. For 25 years, I jogged at least a mile and a half three times a week. This was crucial for helping me stop smoking. I don’t know why, except that I could not smoke and jog at the same time. Also, the 15-30 minute recovery time reminded me just how bad the smoking had been for my health.

Why did I quit jogging? The jogging messed up my joints. So I took up swimming, but since I sink like a lead mannequin, that was just too much work. So now I have joined a gym, where I use the elliptical, treadmill, and muscle-building machines. This environment helps because I have companions in my discomfort, and occasionally I get the satisfaction of comparing myself to the few “90-pound weaklings” who show up.

We have known for many years that exercise is good for you, especially as you get older. Known benefits of exercise include:

  • Relieves stress and promotes a sense of well-being (well, at least after the soreness wears off).
  • Improves heart and cardiovascular function (if the damage is already done, don’t expect huge improvements).
  • Promotes weight loss (pushing away from the table is the best exercise for this effect).
  • Strengthens bones (reduces loss of bone density in old age, though high-impact exercise may damage joints).
  • Lowers blood sugar and helps insulin work better.
  • Helps you quit smoking (ever try to smoke while jogging? Ha!).
  • Improves mood and resists depression (ever heard of “runner’s high”? It comes from release of endogenous opiates).
  • Releases proteins and other chemicals that improve the structure and function of your brain (memory ability improves too).
  • Improves your sleep (I mean, besides making you really tired; to reduce interference from soreness, take acetaminophen before bedtime).
  • Reduces your risk of some cancers, including colon, breast, uterine, and lung cancer.

What was not as well known until recently was the effect of exercise on the immune system. Recent research indicates that exercise in older age can prevent the immune system from declining and protect people against infections. A recent study followed 125 long-distance cyclists and found that some of those in their 80s had the immune systems of 20-year-olds. Maybe this is one reason exercise can help prevent cancer.

The key indicator was the level of T cells in the blood. T cells, named after the thymus where they mature, are a type of white blood cell that coordinates immune responses (among other things, helper T cells stimulate the B cells that make antibodies). As people age, the thymus gland, located in the chest behind the breastbone, shrinks, and its output of new T cells declines. The study revealed that the cyclists were producing T cells at the same level as 20-year-olds, whereas a comparison group of inactive older adults was producing very few. Thus, it would seem, though this was not tested in the study, that physically active seniors would also respond better to vaccines than sedentary people.

The other new discovery is the importance of exercise for brain white matter integrity. White matter electrically insulates nerve fibers, which has two effects: 1) it speeds communication in neural networks, and 2) it reduces “cross talk” among adjacent fibers. The study compared people averaging 65 years of age who were mentally normal with those who had mild cognitive impairment, a risk factor for later development of Alzheimer’s disease. In both groups, investigators measured cardiovascular function with a standard measure of heart and respiratory fitness, the VO2 max test. They also used brain scans to measure white matter integrity. Levels of physical activity were positively associated with white matter (WM) integrity and cognitive performance in normal adults and even in patients with mild cognitive impairment.

Given all this, how much more reason do you need to get off the couch and start moving? Besides, at the end of a good workout, it feels so good to quit.

Memory Medic's latest book is for seniors: "Improve Your Memory for a Healthy Brain: Memory Is the Canary in Your Brain's Coal Mine," available in inexpensive e-book format. See also his recent books, "Memory Power 101" (Skyhorse) and "Mental Biology: The New Science of How the Brain and Mind Relate" (Prometheus).
Ding, Kan, et al. (2018). Cardiorespiratory fitness and white matter neuronal fiber integrity in mild cognitive impairment. Journal of Alzheimer's Disease, 61(2), 729-739.

Duggal, Niharika A., et al. (2018). Major features of immune senescence, including reduced thymic output, are ameliorated by high levels of physical activity in adulthood. Aging Cell, 8 March.

Thursday, July 12, 2018

The Better Things Get, the Worse They May Seem

“Too much of a good thing” and “it’s all relative” now take on new meaning. A new research report of seven studies suggests an explanation for the paradox that humans misjudge the extent of a changing situation. This report, published in the June 29th issue of the premier journal Science, demonstrated that people often respond to diminished prevalence of a stimulus by expanding their perception of its prevalence. For example, when subjects looked at a matrix panel of blue and purple dots and the experimenter reduced the percentage of blue dots, the subjects began to see purple dots as blue. Or when shown panels of threatening faces mixed with neutral faces in which the percentage of angry faces became rarer, they began to see neutral faces as threatening. Or when unethical requests of the subjects were made rarer, subjects began to regard innocuous requests as unethical. In other words, reduced prevalence of a certain stimulus created a bias for finding more of that stimulus than actually existed.

The investigators began with the blue/purple dot test. When they saw the biasing effect of reducing incidence of blue dots, they wondered if this same principle applied to other kinds of stimuli and to more abstract comparisons. The bias showed up also in their test with angry and neutral faces and in the test with unethical and innocuous requests.
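The drift the investigators describe can be illustrated with a toy simulation (my own sketch for illustration, not the authors' analysis code): a judge classifies dots as "blue" by comparing hue to a criterion that slowly adapts toward the recent average of what it has seen. When blue dots become rare, the criterion drifts down, and formerly borderline purple dots start getting called blue.

```python
import random

random.seed(1)

def simulate(p_blue, n_trials=2000, adapt=0.01):
    """Judge dots on a hue scale (0 = purple, 1 = blue).

    The criterion adapts toward the running average of recent stimuli,
    so when blue (high-hue) dots become rare, the criterion drifts
    down and borderline purple dots start getting judged 'blue'.
    """
    criterion = 0.5
    borderline_called_blue = 0
    borderline_seen = 0
    for _ in range(n_trials):
        if random.random() < p_blue:
            hue = random.uniform(0.6, 1.0)   # clearly blue dot
        else:
            hue = random.uniform(0.0, 0.55)  # purple, some borderline
        if 0.4 <= hue <= 0.55:               # track the borderline purples
            borderline_seen += 1
            if hue > criterion:
                borderline_called_blue += 1
        criterion += adapt * (hue - criterion)  # drift toward recent stimuli
    return borderline_called_blue / borderline_seen

# With many blue dots, few borderline purples get called blue; when blue
# dots become rare, the criterion drops and many more do.
common = simulate(p_blue=0.5)
rare = simulate(p_blue=0.05)
print(common, rare)
assert rare > common
```

The adaptive-criterion rule here is an assumption chosen to reproduce the reported bias; the original studies measured the bias behaviorally without committing to this particular mechanism.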

Everyday experiences suggested this research. For example, others had reported that when unprovoked attacks and invasions decline, the perception of new instances receives magnified judgment. I might speculate that the empowerment of women by the women’s rights movement has made recent incidents of sexual harassment more notable than would have been the case years ago, when such incidents were not so unexpected. Or perhaps the current outrage over illegal immigrant children separated from their parents, and attempts to close the border, are magnified by the fact that so many have already been reunited and set free in the U.S.

The authors rightly concluded, “These results may have sobering implications. Many organizations and institutions are dedicated to identifying and reducing the prevalence of social problems, from unethical research to unwarranted aggression. But our studies suggest that even well-meaning agents may sometimes fail to recognize the success of their own efforts.”

They add reference citations showing that societies have made “extraordinary” progress in solving a wide range of social problems, yet the majority of people think the world is getting worse. In prosperous countries like the U.S., social problems usually continue to improve. However, many people in such environments seem to keep finding more and more things to complain about. For example, as the economy improves, it seems increasingly easy to find poverty or wealth gaps. As civil rights improve, it seems easy to find abuses and even to misinterpret neutral events as abuse. Thus, despite all progressive efforts, the problems seem intractable, when in fact they are not. Flawed judgment caused by the changing prevalence of social problems contaminates our politics.

We tend to cling to old myths when they no longer apply as well as before. This diminishes appreciation of the successes of government policy. In the U.S., the growing hostility of citizens toward their country may actually be the result of the improvements in the country. Compounding the problem is the common feeling that it is not politically correct to consider that this kind of bias might exist. Even when a person knows of this bias, sometimes it is of political benefit to keep contentious issues alive.

Help kiddos get ready for school. 
My popular, 5-star book, Better Grades, Less Effort, is now available in hard copy from all bookstores and Amazon.


Levari, David E. et al. (2018). Prevalence-induced concept change in human judgment. Science. 360(6396), 1462-1467.

Thursday, June 21, 2018

Consciousness Explanation. Part II

In an earlier post, “Where Neuroscience Stands in Understanding Consciousness,” I presented a summary of the progress occurring in neuroscientific understanding of consciousness.

Now a recent report in the May issue of Science adds to a growing understanding of how the brain generates conscious recognition. The study examined neural impulse discharge responses of the monkey brain to visual stimuli. Electrodes were implanted in the four visual cortex areas that are sequentially activated by visual stimuli. The stimulus was a circular spot of varying contrast in the lower left area of the visual field. Monkeys were cued when a stimulus was delivered, though whether they saw it or not varied with the spot’s contrast against the visual background. Monkeys were trained to report when they knew they saw the spot by shifting their gaze from a central fixation point to the spot’s prior location some 450 ms earlier. Monkeys reported unrecognized spots by shifting the gaze to the right of the default fixation point. Investigators imposed the delay for reporting to eliminate the response being a simple reflex saccade. A longer delay would have been more convincing, but it might have taxed the monkeys’ working memory, given their easy distractibility.

As expected, spots of sufficient contrast evoked impulse discharge in each of the four visual cortex regions. Whether or not the monkey reported actually seeing the spot depended on whether there was also increased impulse discharge in the region of frontal cortex that had implanted electrodes. No doubt, other non-monitored frontal areas might also have been activated under conditions where recognition was reported. The point is that conscious recognition requires activation of widely separated brain areas at the same time.

Not demonstrated here is how the frontal activity interacts with activity in the visual cortex areas, but that certainly could be predicted from studies in my lab, reported in 2000. We showed that conscious realization of alternate perceptions of ambiguous figures in humans occurred when brain electrical activity (EEG) over the visual areas of the scalp became highly synchronized, over a wide range of frequencies, with multiple frontal areas, both in the same and even the opposite hemisphere. Figure 1 shows the topography of coherence change at the moment of realization for the upper frequency band of 25-50 Hz.

Figure 1. Topographic summary of p<0.01-level coherence increases across all 10 ambiguous figures, all subjects, in the 25-50 Hz band. Each square matches a given electrode and shows how activity at that location became more coherent with activity at other locations at the instant the subject consciously realized the alternate perception in any of 10 ambiguous figures. From Klemm et al. (2000). Widespread coherence increases were also seen in the band below 25 Hz.
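For readers unfamiliar with the coherence measure behind Figure 1, here is a minimal sketch using synthetic signals (not our data): two "channels" that share a common 40 Hz oscillation show high magnitude-squared coherence in the band containing that rhythm, while bands carrying only independent noise do not.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 250                      # sampling rate in Hz, typical for EEG
t = np.arange(0, 20, 1 / fs)  # 20 seconds of data

# Two "channels" sharing a common 40 Hz oscillation plus independent noise
shared = np.sin(2 * np.pi * 40 * t)
ch1 = shared + rng.normal(0, 1, t.size)
ch2 = shared + rng.normal(0, 1, t.size)

# Magnitude-squared coherence: 0 (independent) to 1 (perfectly phase-locked)
f, Cxy = coherence(ch1, ch2, fs=fs, nperseg=512)

gamma = Cxy[(f >= 38) & (f <= 42)].mean()    # band with the shared rhythm
baseline = Cxy[(f >= 5) & (f <= 15)].mean()  # band with independent noise only
print(gamma, baseline)
assert gamma > baseline
```

The frequencies, amplitudes, and band edges here are arbitrary choices for the demonstration; the point is only that coherence isolates frequency bands where two recordings are reliably phase-locked.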

Thus, it seems that a meaningful detectable signal, which need not be limited to vision, not only activates its immediate neural targets, but those target cells can trigger feed-forward to trigger activity in more frontal areas. Feedback from those frontal areas can set up time-locked oscillatory coupling across wide expanses of cortex that is apparently necessary for conscious recognition. The time locking probably amplifies the signals to the threshold for conscious realization.

The distributed signal processing does not necessarily mean consciousness requires huge expanses of neural tissue. Recall from the split-brain studies in Roger Sperry’s lab that even half a brain can be fully conscious of the stimuli it can receive. The magic of consciousness seems to lie in the qualitative nature of data sharing, not in the volume of tissue involved.
Thus, the major issue is how oscillatory coupling of otherwise isolated circuitry amplifies signals to become consciously recognized. “Amplify” may be a misleading word, inasmuch as there is no compelling evidence as yet that consciousness is related to having more nerve impulses per unit of time. The impulses certainly don’t get bigger, because their voltage magnitude is constrained by concentration and electrostatic gradients. Rather, the secret may lie in the controlled timing of impulses. A likely form of amplification results from the reverberation of activity among coherent neuronal ensembles, which could have the effect of sustaining the stimulus long enough to be consciously detected, that is, for the brain to “see” what the eyes were looking at.

Consciousness may also simply be a matter of improving the signal-to-noise ratio. Time-locked, reverberating activity should be more isolated and protected from random activity which is unreliably associated with a given stimulus. Intuitively, that is what we sense in daily experiences. When we look at a tree, the cognitive noise of the multitude of tree signals may obscure our seeing the bird in the tree until, by accident or intent, we are able to see the bird.
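The signal-to-noise idea can be made concrete with a textbook illustration (my own sketch, not data from these studies): averaging N time-locked repetitions of the same response leaves the locked waveform intact while random noise shrinks roughly as 1/sqrt(N), which is one plausible way reverberation could lift a weak stimulus above a detection threshold.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0, 1, 200)
signal = np.sin(2 * np.pi * 5 * t)   # a time-locked "evoked" waveform

def snr_after_averaging(n_trials, noise_sd=3.0):
    """Average n_trials noisy repetitions of the same time-locked signal."""
    trials = signal + rng.normal(0, noise_sd, (n_trials, t.size))
    avg = trials.mean(axis=0)
    noise = avg - signal             # residual noise left after averaging
    return signal.std() / noise.std()

# Residual noise drops ~1/sqrt(N), so signal-to-noise ratio grows with
# the number of time-locked repetitions.
print(snr_after_averaging(1), snr_after_averaging(100))
assert snr_after_averaging(100) > snr_after_averaging(4) > snr_after_averaging(1)
```

This is the same principle used to extract evoked potentials from the EEG by trial averaging; whether the brain exploits reverberation this way for conscious detection is, as the text says, a hypothesis.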

This still leaves us with an incomplete answer. What is it about amplifying or reducing background noise that makes stimuli consciously recognizable? Where is the “who” in the brain that does the realizing? When my brain sees or hears something, it is “I” who consciously see or hear it. How does my brain create my “I,” and where in my brain is my “I”? One possibility is that the unconscious brain can release a set of unique network activity that operates much like an avatar, giving the brain a functionality it otherwise would not have. I elaborated this idea in my post, “The Avatar Theory of Consciousness.”

How does this avatar “I” find a stimulus that it recognizes? Is it searching for it, like a searchlight scanning across the cortex for stimulus-induced activity? Or maybe it is not “looking for” sensation but rather is triggered into temporary existence when a stimulus crosses the threshold needed to launch consciousness. The monkey experiments support the latter option. However a stimulus becomes recognized, the awareness may outlast the trigger. We often consciously think about the meaning of a momentary stimulus, integrate it with memories, and develop beliefs, intentions, and responses, either cognitive or behavioral or both.

One more thing needs mention. In the monkey experiments, it was clear that the monkeys were continuously awake, even when they were not detecting presented stimuli. Thus, being awake is not the same as being conscious. We know this also from human experiments on inattentional blindness, which reveal that consciousness depends on selective attention. Wakefulness is a necessary condition for consciousness but not, by itself, sufficient.

For more explanation of brain function, see my book:


Van Vugt, Bram, et al. (2018). The threshold for conscious report: signal loss and response bias in visual and frontal cortex. Science. 360 (6388), 537-542.

Klemm, W. R., Li, T. H., and Hernandez, J. L. (2000). Coherent EEG indicators of cognitive binding during ambiguous figure tasks. Consciousness & Cognition. 9, 66-85.

Wednesday, May 30, 2018

The Joys of Consciousness

You take time to be alone, valuing your personal time.
You meditate.
You feel light and buoyant.
You feel spiritually uplifted.
You find a solution to a problem.
You have a fresh new idea.
You notice something beautiful.
You walk outside in nature and feel refreshed.
You engage in physical activity that's invigorating.
You are playful and take time to play.
There is a moment of pure joy.

You smile in appreciation.
You respect someone else's boundaries without being asked.
You lift someone else's spirits.
You make another person laugh.
You give someone a helping hand.
You do something kind.
You forgive a slight.
You offer yourself in service to someone in need.
You feel a close bond with another person.
You cherish another person.

from The Healing Self by Deepak Chopra and Rudolph E. Tanzi
     re-sequenced to show the joys of nurturing oneself and then nurturing others.

Sunday, May 27, 2018

IQ Changes in Teenagers

Common wisdom asserts that your IQ is fixed. Of course, the various “multiple intelligences” change with personal life experiences and growth, but we usually consider the standard IQ score to be inherent and unchangeable. But even the standard IQ measure changes during different life stages. Clearly, the IQ of young children changes as they mature. Several studies even show that working-memory training can raise the IQ of elementary-school children. More than one analyst claims that a rigorous PhD program can raise IQ in adults. Most obvious is the decline of IQ in those elderly who do not age well because of disease.

A neglected segment along the age spectrum is the teenage years. Now, evidence indicates that this age group experiences IQ changes ranging from a decline to an increase. A study of this issue shows that both verbal and non-verbal IQ scores in teenagers relate closely to the developmental changes that occur in brain structure during the teenage years. Longitudinal brain-imaging studies in the same individuals reveal that either increases or decreases in IQ occur coincident with structural changes in cerebral grey matter that occur in teenagers.

The study conducted MRI brain scans and IQ tests on 33 normal adolescents in their early teenage years and then again in their late teenage years. A wide range of IQs was noted: 77 to 135 at the first testing and 87 to 143 at the second. For any given individual, IQ scores changed by as much as -20 to +23 points for verbal IQ and -28 to +17 for non-verbal IQ. Correlation analysis revealed that increases in IQ were associated with increases in cortical density and volume in brain regions involved in verbal and movement functions.

The implications are profound, especially as they relate to the local environment of a given teenager. What happens during the teenage years apparently changes brain structure and mental ability. Many influences likely damage the brain, such as drug abuse, social stress, or poor education and intellectual stimulation. Conversely, the data indicate that positive benefits to both brain structure and mental capability can result from a mentally healthy environment and rich educational experience.
The data suggest that all the emphasis on pre-school and “Head Start” initiatives may diminish our attention to the key role played by middle school and early high school. This confirms what many of us have long suspected, namely that our society tends to insufficiently nurture “late bloomers.” Maybe the early high achievers who fail to live up to their promise do so because we wrongly assume they can manage without much help. Parents, educators, and education policy makers need to take notice.
Few books can change a person's future. One of them could be my book, Better Grades, Less Effort, which explains the learning tips and tricks that I used to become valedictorian after a high school teacher said my modest IQ did not justify the high grades I was making. Teachers predicted I "would have trouble with college." Really? I went on to be an Honors student in three universities, graduating early with a D.V.M. degree and securing a PhD in two and a half years. My IQ documented that I was not so smart. I believe that poor learning skills are what hold back most students from superior achievement. This book can change a person's life, as learning how to learn changed mine. I suspect it helped my brain development as well.


Ramsden, Sue, et al. (2011). Verbal and non-verbal intelligence changes in the teenage brain. Nature, May 17. doi:10.1038/nature10514.

Tuesday, May 01, 2018

The “Production Effect” Aids Memory

The hardest memory task I ever had was to give an 18-minute TED talk from memory. I remember struggling with remembering my core ideas and their sequence. To solve this problem, my first task was to create some slides, which the TED format allows. The directors even show the slides on a monitor at the foot of the stage that only the speaker can see. Looking at each slide as it advanced helped provide cues in the proper order, but to be effective, slides must not have much text, and in no case can a given slide reveal on its own the associated content. I still had a memorization problem. Then I remembered the “production effect,” which basically is a way to strengthen memory by actually forcing the recall in the appropriate setting. In other words, I needed to rehearse by actually giving the speech, vocalizations, mannerisms, and all, in front of a mirror.

The usual thing we think of about improving memory is the need for rehearsal, especially the kind of rehearsal where you force recall at spaced intervals after the initial learning. But another factor in improving memory is to strengthen the initial encoding at the time of learning. Actually, this is common sense. We all have experienced the case where we remember an intense experience primarily because it is intense. In other words, the intensity strengthened the encoding.

A well-known technique is to use the “production effect.” Basically, this means that encoding is strengthened by generating what you are learning at the time of learning by speaking it, singing it, drawing it, or deploying it in some way (as in “hands on”). Handwriting or typing the information strengthens encoding, and studies have shown that handwriting is more effective than typing. Any of these approaches is much more effective than silent reading, viewing, or listening.

Many such studies confirm the effect. For example, in one study, saying aloud each word in a to-be-memorized word list improved recall by more than 15% compared with silent reading. Nearly the same improvement occurs from merely mouthing the words.

Why this works to improve memory probably relates to the fact that more attentiveness and processing is required in production than in just silent reading or listening. One common explanation is that production makes each item more distinctive. That is, by saying it, drawing it, or whatever, the item acquires more features and becomes more distinctive.

As far as I know, the production effect has been studied only with respect to rote memory tasks. I should think that it would be even more powerful if applied when using mnemonics. For example, if you are using the “memory palace,” as you place an item to be memorized on a room object in your mind’s eye, you might actually describe out loud what you are imagining.

The production effect should also be useful during forced retrieval rehearsals as well, as I did in learning my TED talk. I am not aware of experiments that test use of production in rehearsal. Anytime you retrieve a memory item, it is an opportunity to re-learn it in a sense, and the information gets re-consolidated. So, if you speak, draw, or use another production effect during forced recall, you further strengthen the encoding and subsequent consolidation.

Whether you are a student seeking better grades, a professional trying to stay at the top of your game, or a senior hoping to stave off mental decline, my book Memory Power 101 is your key to developing and maintaining a sharper mind. The book shares Memory Medic's decades of professional experience in education and neuroscience.


Bodner, Glen E., and MacLeod, Colin M. (2016). The benefits of studying by production … and of studying production: Introduction to the Special Issue on the Production Effect in Memory. Canadian Journal of Experimental Psychology, 70(2), 89-92.

MacLeod, Colin M., and Bodner, Glen E. (2017) The production effect in memory. Current Directions in Psychological Science. 26(4), 390-395.

Friday, April 20, 2018

Take the Stress out of School

Got kids or grandkids in school? Are you in school or college? This blog is for you. I don’t have to tell you that school is stressful, what with hard courses, tough teachers, and high-stakes tests. The stress is understandable, but also counterproductive. Anxiety and other negative emotions interfere with learning, remembering, and test taking. So why don’t schools put more emphasis on helping students cope?

Good teachers do help students cope by making their explanations as simple and clear as possible. But unless they lower their standards, which benefits nobody, school will still be stressful. Research has shown some things that teachers and students could do to reduce stress and improve academic performance.

The most obvious is to understand the principles for efficient and effective learning, as I have tried to outline in my latest book, The Learning Skills Cycle. My two other books focus specifically on improving memorization skills.

The American Psychological Association's 2013 report reminded us of the study, which I blogged about earlier, showing that a student's test anxiety is reduced, and the test score raised, by writing about the anxiety before the test. Another study showed that a student's attitude toward their anxiety can reverse the negative effect. When taught to re-interpret the symptoms, such as sweaty palms and racing pulse, as signs of excitement and being “up” for the test rather than fear, students perform better on the test.

Mindfulness meditation can also relieve stress, but it has to be done diligently, which many younger students can’t do well. Sometimes teachers say that just having students take a few slow, deep breaths will help them do better on tests. The neurons that mediate slow breathing also impinge on the cortex and moderate excessive activity. A more systematic approach to teaching children how to meditate has been developed by James Butler in the Austin, Texas, school district. He has developed a 36-week curriculum that teaches teachers effective ways to teach mindfulness to students.

The usual excuse schools give for not promoting mindfulness meditation is that it violates separation of church and state. But this kind of meditation is not religious. It rests on sound neurophysiology and is not kooky.


Klemm, W. R. (2017). The Learning Skills Cycle. A Way to Rethink Education Reform. New York: Rowman and Littlefield.

Klemm, W. R. (2016). Better Grades, Less Effort. (e-book)

Klemm, W. R. (2012) Memory Power 101. New York: Skyhorse

Monday, April 02, 2018

Where We Stand in Understanding Consciousness

Many scientists, even physical scientists, assert that the Holy Grail of science is to understand human consciousness. The state is hard even to define: it is one in which we know what we believe, know, and imagine, know what we decide and plan, and feel what we feel. That, of course, explains nothing.

The problem in understanding is not only that the mechanisms must surely be complicated, but also that we don’t have good non-invasive experimental tools. There are only two useful ones: a metabolic proxy of neural electrical activity (functional MRI, or fMRI) and scalp monitoring of electrical activity (the electroencephalogram, or EEG, or its magnetic-field counterpart). Among the problems with fMRI is that it is only an indirect measure of the actual signaling within the brain that generates thought and feeling and enables consciousness. Its time resolution is about one second or more, whereas signaling in the brain occurs on a millisecond scale. Although the EEG monitors activity on the appropriate time scale, it has very poor spatial resolution: voltage fields over various regions of cortex overlap, because each voltage spreads, in progressively diminished amplitude, through the conductive medium of the brain from its source of generation to distant recording sites. And although the EEG does monitor the appropriate target (electrical activity), that activity is an envelope of the algebraically summed signals, nerve impulses and their associated postsynaptic potentials, from the heterogeneous neuronal ensembles nearest the sensing electrodes.


Nonetheless, we do know many useful things about brain function that are surely involved in conscious functioning. Neuroscientists have discovered much of this in lower animals from invasive procedures that are not permissible in humans. In summary, we can list the following brain functions that are relevant to consciousness:

  • The brain is a network of richly inter-connected networks.
  • Functions are modular. Different networks have different and shifting primary functions, and some may be selectively recruited when their function is needed.
  • Some networks can perform multiple functions, depending on which other networks have recruited them into action.
  • Some aspects of functional connectivity of different networks differ in unconscious and conscious states.
  • Wakefulness and consciousness are not the same. Wakefulness is necessary but not sufficient for consciousness.
  • A great deal has been learned about the neural mechanisms causing wakefulness, but that has not helped much in understanding consciousness.
  • The messaging signals of the brain are nerve impulses and their neurotransmitter postsynaptic effects.
  • The summed voltages of the messaging have electrostatic effects that alter the excitability of the neurons within the voltage field.
  • The frequency of bursts of impulses and their EEG envelope impose important effects on gating and throughput of information as it propagates and is modified throughout the global workspace of networks.
  • There are multiple neural correlates of consciousness, but we have not identified with certainty which ones are necessary and sufficient for consciousness.

Oscillatory electrical activity is thought to have a key role in selective routing of information in the brain. Oscillations seem to modulate excitability, depending on phase relationships of linked neuronal ensembles. Two prominent hypotheses have been advanced as crucial for consciousness, and they are not mutually exclusive:

  • Phase-locked activity in two or more ensembles (coherence)
  • Inhibitory gating that directs pathways for propagation within networks.
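The first hypothesis, phase-locked activity, is commonly quantified with a phase-locking value (PLV), a standard measure in the field. This sketch (synthetic signals, my own illustration) extracts instantaneous phase with the Hilbert transform and averages the unit vectors of the phase difference: a value near 1 means the two ensembles are locked, even at a constant lag; a value near 0 means their phases drift independently.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(7)
fs = 1000                     # sampling rate, Hz
t = np.arange(0, 2, 1 / fs)   # 2 seconds of data

def plv(x, y):
    """Phase-locking value: |mean of unit vectors at the phase difference|."""
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.exp(1j * (phase_x - phase_y)).mean())

# Two ensembles oscillating at 40 Hz with a fixed phase lag: locked.
locked_a = np.sin(2 * np.pi * 40 * t)
locked_b = np.sin(2 * np.pi * 40 * t + 0.8)   # constant lag, still locked

# Same base frequency but with independently drifting phase: unlocked.
drift = np.cumsum(rng.normal(0, 0.1, t.size))  # slow random phase walk
unlocked = np.sin(2 * np.pi * 40 * t + drift)

print(plv(locked_a, locked_b), plv(locked_a, unlocked))
assert plv(locked_a, locked_b) > 0.9
assert plv(locked_a, unlocked) < plv(locked_a, locked_b)
```

Note that PLV and the coherence measure differ: coherence also weights amplitude covariation, while PLV isolates phase consistency, which is the quantity the first hypothesis emphasizes.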

The key to discovering mechanisms of consciousness is to identify all the neural correlates and then winnow the list to those that are both necessary and sufficient for consciousness. Sometimes, important discoveries occur when you study the opposite of what you want to study. This principle is manifest in studies of brain function during various states of unconsciousness (like anesthesia, coma, or non-dream sleep). A recent review of research compared the neural correlates of unconsciousness with those of consciousness. The evaluation showed disrupted connectivity and greater modularity in the brain during unconscious states, which inhibits the efficient integration of information required for consciousness. Additionally, the review made the key point that the neural correlates of consciousness that matter are the ones that occur in conscious but not unconscious states. Of particular relevance are the correlates related to functional connectivity among networks, because multiple lines of evidence reveal that this connectivity degrades during unconscious states and returns when consciousness resumes.

In rodents, multi-array recordings in visual cortex indicate that connectivity patterns are the same during anesthesia as in wakefulness. Perhaps this indicates that rodents do not have the needed network architecture to enable consciousness. They can be awake but not conscious. Being awake is clearly necessary for consciousness, but not sufficient. In addition, at any given instant we are only consciously aware of the specific cognitive targets to which we attend (if you don't believe me, see the classic YouTube basketball-game video on inattentional blindness).

Statistical co-variation of activity in linked networks is a measure of functional connectivity. The activity in linked networks may randomly jitter, or it may be in phase or locked at certain time lags. Operationally, the connectivity may enable one group of neurons to mediate or modulate activity in another for past, present, or future operations. The temporal dynamics of these processes differ depending on the state of consciousness.
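
One common way to operationalize this covariation is lagged cross-correlation: correlate one network's activity against time-shifted copies of another's and look for a peak at some lag, which suggests one network leading the other. A minimal sketch with synthetic signals; the 20-sample delay and noise level are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n, true_lag = 1000, 20

# Network A's activity drives network B with a 20-sample delay plus noise.
a = rng.standard_normal(n)
b = np.roll(a, true_lag) + 0.5 * rng.standard_normal(n)

def lagged_correlations(x, y, max_lag):
    """Pearson correlation of x against y shifted by each candidate lag."""
    return {lag: np.corrcoef(x, np.roll(y, -lag))[0, 1]
            for lag in range(-max_lag, max_lag + 1)}

corrs = lagged_correlations(a, b, max_lag=50)
best_lag = max(corrs, key=corrs.get)
print(best_lag)  # 20: B lags A by 20 samples, as constructed
```

In real recordings a peak at a nonzero lag is only suggestive of directed influence; confounds such as a common driver cannot be ruled out by correlation alone.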

A very popular view on consciousness among neuroscientists these days is that higher-order thinking, especially conscious thinking, is mediated by extracellular voltage fields that oscillate in the range of 12 to 60 or more waves per second. Changes in oscillatory frequency and coherent coupling of the oscillations among various pools of neurons are thought to reflect the nature and intensity of thought.

The issue arises as to how these voltages, commonly called field potentials, can influence the underlying nerve impulse activity that causes the oscillation in the first place. The messages of thought are carried in patterns of nerve impulses flowing in neural networks. Field potentials are not signaling, at least not directly. They may well indirectly influence messaging by electrostatically biasing networks to be more or less able to generate and propagate nerve impulse traffic.

Neuroscientists attach much importance to the temporal dynamics of EEG voltage frequencies. For example, at one time neuroscientists believed that 40/sec synchrony was critical to consciousness, but later studies revealed that this synchrony can be maintained and even enhanced during anesthesia. Later, investigators thought they had found a crucial role for higher frequency gamma synchrony, but that too is now called into question. This gamma synchrony can be present or even enhanced during unconsciousness. However, the spatial extent of synchrony may be the meaningful correlate of consciousness. Widespread synchrony breaks down during unconsciousness, while more localized synchrony remains intact or even enhanced.

Numerous studies show a breakdown of functional connectivity during various states of unconsciousness. For example, fMRIs reveal cortico-cortical and thalamocortical disconnections during sleep, general anesthesia, and pathological states. EEG analysis shows similar connectivity breakdowns. Additionally, the repertoire of possible connectivity configurations that can be accessed diminishes during unconscious states and is restored as consciousness resumes. This obviously limits the robustness of information processing that can occur in unconsciousness. Conscious selective attentiveness likely requires a different repertoire of connectivity than inattentive consciousness.
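
The shrinking "repertoire" can be illustrated by counting how many distinct connectivity patterns a brain visits over time. A toy sketch, treating each time window's connectivity as a binary pattern of which region pairs are correlated (the patterns below are fabricated purely for illustration):

```python
# Each tuple flags which of three region pairs (A-B, A-C, B-C) are
# functionally connected during one time window.
conscious_windows = [(1, 1, 0), (1, 0, 1), (0, 1, 1),
                     (1, 1, 1), (1, 0, 0), (0, 1, 0)]
anesthetized_windows = [(1, 0, 0), (1, 0, 0), (0, 0, 0),
                        (1, 0, 0), (0, 0, 0), (1, 0, 0)]

# Repertoire size = number of distinct patterns visited.
repertoire_conscious = len(set(conscious_windows))        # 6 distinct patterns
repertoire_anesthetized = len(set(anesthetized_windows))  # 2 distinct patterns
print(repertoire_conscious, repertoire_anesthetized)
```

Real studies quantify this with many more regions and continuous-valued connectivity, but the intuition is the same: fewer accessible configurations means less flexible information processing.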

Neuroscientists are also discovering that, beyond multi-area coherence within a given frequency band, phase relationships between two different frequency bands can modulate network communication. Cross-frequency coupling of alpha and beta oscillations with higher-frequency gamma oscillations can amplify, inhibit, or gate the flow of nerve impulses throughout circuitry.
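
This kind of phase-amplitude coupling is often quantified with a modulation index in the style of Canolty and colleagues: take the low-frequency phase and the high-frequency amplitude envelope, and measure how unevenly the amplitude is distributed over phase. A minimal sketch, with a synthetic 10 Hz "alpha" phase modulating a 40 Hz "gamma" envelope (all values fabricated for illustration):

```python
import numpy as np

fs = 1000
t = np.arange(0, 2, 1 / fs)          # 2 s sampled at 1 kHz
alpha_phase = 2 * np.pi * 10 * t     # phase of a 10 Hz "alpha" rhythm

# A gamma amplitude envelope that waxes and wanes with alpha phase (coupled),
# versus a flat envelope (uncoupled). In real data these would be extracted
# by band-pass filtering plus a Hilbert transform.
coupled_env = 1 + np.cos(alpha_phase)
flat_env = np.ones_like(t)

def modulation_index(phase, amplitude):
    """|mean(amplitude * e^{i*phase})| normalized by mean amplitude.
    Near 0 when amplitude is independent of phase; larger when the
    amplitude is concentrated at particular phases."""
    return np.abs(np.mean(amplitude * np.exp(1j * phase))) / np.mean(amplitude)

print(modulation_index(alpha_phase, coupled_env))  # ~0.5
print(modulation_index(alpha_phase, flat_env))     # ~0
```

The normalization keeps the index comparable across signals of different overall power, which matters when comparing conscious and unconscious recordings.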

Future advancements will surely include more emphasis on monitoring functional connectivity as the brain shifts into and out of various states of consciousness and unconsciousness. I think, however, that we will not make definitive progress in consciousness research until we remedy one deficiency of theory and another of tactical methodology.

The theory deficiency lies in models of neural networks. Computer models of man-made networks yield interesting results, but they are probably not relevant. Brains do not work with the same principles that computers do. Moreover, brain networks have intrinsic plasticity that cannot yet be duplicated by computers.

The method deficiency is that we have no non-invasive way to monitor the actual signaling in even a significant fraction of all the neurons in all the networks. Moreover, even if we had a way to monitor individual neurons noninvasively, it would likely be necessary to selectively monitor neurons in defined circuits. Ultimately, we may confirm that some things are just not knowable. Surely, however, we can learn more than we do now.


Bonnefond, Mathilde et al. (2017). Communication between brain areas based on nested oscillations. eNeuro, 4(2), ENEURO.0153-16.2017.

Mashour, George A., and Hudetz, Anthony G. (2018). Neural correlates of unconsciousness in large-scale brain networks. Trends in Neurosciences. 41(3), 150-160.