Blog



Neurofeedback Training of Amygdala Increases Emotional Regulation
By Jason von Stietz, M.A.
September 18, 2016
Getty Images

 

The amygdala plays a key role in emotional regulation. Researchers from Tel-Aviv University examined whether neurofeedback aimed at downregulating amygdala activity, measured with EEG, improves emotional regulation in healthy participants. The study was discussed in a recent article in NeuroScientistNews:

 

Training the brain to treat itself is a promising therapy for traumatic stress. The training, called neurofeedback, uses an auditory or visual signal that corresponds to the activity of a particular brain region and can guide people to regulate their own brain activity.

 

However, treating stress-related disorders requires accessing the brain's emotional hub, the amygdala, which is located deep in the brain and difficult to reach with typical neurofeedback methods. This type of activity has typically only been measured using functional magnetic resonance imaging (fMRI), which is costly and poorly accessible, limiting its clinical use.

 

A study published in Biological Psychiatry tested a new imaging method that provided reliable neurofeedback on the level of amygdala activity using electroencephalography (EEG), and allowed people to alter their own emotional responses through self-regulation of its activity.

 

"The major advancement of this new tool is the ability to use a low-cost and accessible imaging method such as EEG to depict deeply located brain activity," said both senior author Dr. Talma Hendler of Tel-Aviv University in Israel and The Sagol Brain Center at Tel Aviv Sourasky Medical Center, and first author Jackob Keynan, a PhD student in Hendler's laboratory, in an email toBiological Psychiatry.

 

The researchers built upon a new imaging tool they had developed in a previous study that uses EEG to measure changes in amygdala activity, indicated by its "electrical fingerprint." With the new tool, 42 participants were trained to reduce auditory feedback corresponding to their amygdala activity using any mental strategies they found effective.
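For readers curious how such a feedback loop is wired together, here is a minimal, purely illustrative Python sketch. It assumes a hypothetical amygdala_efp_score() function standing in for the study's EEG "electrical fingerprint" model (which is not reproduced here) and simply maps the estimated activity to the loudness of an auditory cue, so that quieter feedback signals successful downregulation.

```python
# Minimal sketch of an auditory neurofeedback loop (illustrative only).
# `amygdala_efp_score` is a hypothetical stand-in for the study's EEG
# "electrical fingerprint" model; here it is just band power from
# simulated EEG, not the published method.
import numpy as np

FS = 256                      # sampling rate (Hz) of the simulated EEG
EPOCH_SEC = 3                 # length of each feedback window

def amygdala_efp_score(eeg_epoch: np.ndarray) -> float:
    """Hypothetical activity estimate: mean power in the 4-12 Hz band."""
    spectrum = np.abs(np.fft.rfft(eeg_epoch)) ** 2
    freqs = np.fft.rfftfreq(eeg_epoch.size, d=1.0 / FS)
    band = (freqs >= 4) & (freqs <= 12)
    return float(spectrum[band].mean())

def score_to_volume(score: float, baseline: float) -> float:
    """Map the current score to feedback loudness in [0, 1]:
    quieter feedback means the trainee is succeeding at downregulation."""
    return float(np.clip(score / (2.0 * baseline), 0.0, 1.0))

rng = np.random.default_rng(0)
baseline = np.mean([amygdala_efp_score(rng.standard_normal(FS * EPOCH_SEC))
                    for _ in range(20)])

for trial in range(5):                                  # stand-in for a run
    eeg_epoch = rng.standard_normal(FS * EPOCH_SEC)     # would be real EEG
    score = amygdala_efp_score(eeg_epoch)
    volume = score_to_volume(score, baseline)
    print(f"trial {trial}: score={score:.1f}  feedback volume={volume:.2f}")
```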

 

During this neurofeedback task, the participants learned to modulate their own amygdala electrical activity. This also led to improved downregulation of the amygdala's blood-oxygen-level-dependent (BOLD) signal, an indicator of regional activation measured with fMRI.

 

In another experiment with 40 participants, the researchers showed that learning to downregulate amygdala activity could actually improve behavioral emotion regulation. They showed this using a behavioral task invoking emotional processing in the amygdala. The findings show that with this new imaging tool, people can modify both the neural processes and behavioral manifestations of their emotions.

 

"We have long known that there might be ways to tune down the amygdala through biofeedback, meditation, or even the effects of placebos," said John Krystal, Editor of Biological Psychiatry. "It is an exciting idea that perhaps direct feedback on the level of activity of the amygdala can be used to help people gain control of their emotional responses."

 

The participants in the study were healthy, so the tool still needs to be tested in the context of real-life trauma. However, according to the authors, this new method has huge clinical implications.

 

The approach "holds the promise of reaching anyone anywhere," said Hendler and Keynan. The mobility and low cost of EEG contribute to its potential for a home-stationed bedside treatment for recent trauma patients or for stress resilience training for people prone to trauma.

 

Read the original article Here

Comments (2)
Ultrasound Used to "Jumpstart" Patient's Brain
By Jason von Stietz, M.A.
September 9, 2016
Photo Credit: Martin Monti/UCLA

 

Researchers at UCLA examined the use of ultrasound to stimulate the brain of a 25-year-old man in a coma. The treatment involved directing low-intensity acoustic energy to the patient’s thalamus, which is significantly underactivated in coma patients. The study was discussed in a recent article in Medical Xpress:

 

The technique uses sonic stimulation to excite the neurons in the thalamus, an egg-shaped structure that serves as the brain's central hub for processing information.
 

"It's almost as if we were jump-starting the neurons back into function," said Martin Monti, the study's lead author and a UCLA associate professor of psychology and neurosurgery. "Until now, the only way to achieve this was a risky surgical procedure known as deep brain stimulation, in which electrodes are implanted directly inside the thalamus," he said. "Our approach directly targets the thalamus but is noninvasive."

 

Monti said the researchers expected the positive result, but he cautioned that the procedure requires further study on additional patients before they determine whether it could be used consistently to help other people recovering from comas.

 

"It is possible that we were just very lucky and happened to have stimulated the patient just as he was spontaneously recovering," Monti said.

 

A report on the treatment is published in the journal Brain Stimulation. This is the first time the approach has been used to treat severe brain injury.

 

The technique, called low-intensity focused ultrasound pulsation, was pioneered by Alexander Bystritsky, a UCLA professor of psychiatry and biobehavioral sciences in the Semel Institute for Neuroscience and Human Behavior and a co-author of the study. Bystritsky is also a founder of Brainsonix, a Sherman Oaks, California-based company that provided the device the researchers used in the study.

 

That device, about the size of a coffee cup saucer, creates a small sphere of acoustic energy that can be aimed at different regions of the brain to excite brain tissue. For the new study, researchers placed it by the side of the man's head and activated it 10 times for 30 seconds each, in a 10-minute period.

 

Monti said the device is safe because it emits only a small amount of energy—less than a conventional Doppler ultrasound.

 

Before the procedure began, the man showed only minimal signs of being conscious and of understanding speech—for example, he could perform small, limited movements when asked. By the day after the treatment, his responses had improved measurably. Three days later, the patient had regained full consciousness and full language comprehension, and he could reliably communicate by nodding his head "yes" or shaking his head "no." He even made a fist-bump gesture to say goodbye to one of his doctors.

 

"The changes were remarkable," Monti said.The technique targets the thalamus because, in people whose mental function is deeply impaired after a coma, thalamus performance is typically diminished. And medications that are commonly prescribed to people who are coming out of a coma target the thalamus only indirectly.

 

Under the direction of Paul Vespa, a UCLA professor of neurology and neurosurgery at the David Geffen School of Medicine at UCLA, the researchers plan to test the procedure on several more people beginning this fall at the Ronald Reagan UCLA Medical Center. Those tests will be conducted in partnership with the UCLA Brain Injury Research Center and funded in part by the Dana Foundation and the Tiny Blue Dot Foundation.

 

If the technology helps other people recovering from coma, Monti said, it could eventually be used to build a portable device—perhaps incorporated into a helmet—as a low-cost way to help "wake up" patients, perhaps even those who are in a vegetative or minimally conscious state. Currently, there is almost no effective treatment for such patients, he said.

 

Read the original article Here

Comments (0)
The Brain of a Musician: Case Study of Sting
By Jason von Stietz, M.A.
August 27, 2016
Photo Credit: Owen Egan

 

What does musical ability look like in the brain? Researchers at McGill University were presented with a unique opportunity. In an effort to understand how a musician’s brain operates, they were able to perform brain imaging on the multiple Grammy Award-winning artist Sting. The study was discussed in a recent article in NeuroScientistNews:

 

What does the 1960s Beatles hit "Girl" have in common with Astor Piazzolla's evocative tango composition "Libertango"?

 

Probably not much, to the casual listener. But in the mind of one famously eclectic singer-songwriter, the two songs are highly similar. That's one of the surprising findings of an unusual neuroscience study based on brain scans of the musician Sting.

 

The paper, published in the journal Neurocase, uses recently developed imaging-analysis techniques to provide a window into the mind of a masterful musician. It also represents an approach that could offer insights into how gifted individuals find connections between seemingly disparate thoughts or sounds, in fields ranging from arts to politics or science.

 

"These state-of the-art techniques really allowed us to make maps of how Sting's brain organizes music," says lead author Daniel Levitin, a cognitive psychologist at McGill University. "That's important because at the heart of great musicianship is the ability to manipulate in one's mind rich representations of the desired soundscape."

 

Lab tour with a twist

 

The research stemmed from a serendipitous encounter several years ago. Sting had read Levitin's book This Is Your Brain on Music and was coming to Montreal to play a concert. His representatives contacted Levitin and asked if he might take a tour of the lab at McGill. Levitin—whose lab has hosted visits from many popular musicians over the years—readily agreed. And he added a unique twist: "I asked if he also wanted to have his brain scanned. He said 'yes'."

 

So it was that McGill students in the Stewart Biology Building one day found themselves sharing an elevator with the former lead singer of The Police, who has won 16 Grammy Awards, including one in 1982 for the hit single "Don't Stand So Close To Me."

 

Both functional and structural scans were conducted in a single session at the brain imaging unit of McGill's Montreal Neurological Institute, the hot afternoon before a Police concert. A power outage knocked the entire campus off-line for several hours, threatening to cancel the experiment. Because it takes over an hour to reboot the fMRI machine, time grew short. Sting generously agreed to skip his soundcheck in order to do the scan.

 

Levitin then teamed up with Scott Grafton, a leading brain-scan expert at the University of California at Santa Barbara, to use two novel techniques to analyze the scans. The techniques, known as multivoxel pattern analysis and representational dissimilarity analysis, showed which songs Sting found similar to one another and which he found dissimilar—based not on tests or questionnaires, but on activations of brain regions.

 

"At the heart of these methods is the ability to test if patterns of brain activity are more alike for two similar styles of music compared to different styles. This approach has never before been considered in brain imaging experiments of music," notes Scott Grafton.

 

Unexpected connections

 

"Sting's brain scan pointed us to several connections between pieces of music that I know well but had never seen as related before," Levitin says. Piazzola's "Libertango" and the Beatles' "Girl" proved to be two of the most similar. Both are in minor keys and include similar melodic motifs, the paper reveals. Another example: Sting's own "Moon over Bourbon Street" and Booker T. and the MG's "Green Onions," both of which are in the key of F minor, have the same tempo (132 beats per minute) and a swing rhythm.

 

The methods introduced in this paper, Levitin says, "can be used to study all sorts of things: how athletes organize their thoughts about body movements; how writers organize their thoughts about characters; how painters think about color, form and space."
 

 

Read the original article Here

Comments (0)
The Myth of the Second Head Trauma Cure
By Jason von Stietz, M.A.

 

According to cartoons and other popular media, the only cure for head-trauma-induced amnesia is a second head trauma. When a cartoon character such as Fred Flintstone loses his memory after a lump on the head, it is only a second lump that brings it back. The idea seems silly to modern scientists and clinicians. But where did this myth come from? Its history was discussed in a recent article in Drexel Now:

 

While that worked in “The Flintstones” world and many other fictional realms, the medical community knows that like doesn’t cure like when it comes to head trauma.

 

However, a shockingly large share of the general public endorses the Flintstones solution, with 38–46 percent believing that a second blow to the head could cure amnesia, according to Drexel’s Mary Spiers. And, believe it or not, that belief was spurred by members of the medical community dating as far back as the early 19th century.

 

Spiers, PhD, associate professor in the College of Arts and Sciences’ Department of Psychology, traced the origins of the double-trauma amnesia cure belief in a paper for Neurology titled, “The Head Trauma Amnesia Cure: The Making of a Medical Myth.”

 

For a long time, scientists worked to figure out why the brain had two hemispheres.

 

“Studying the brain in the past was very challenging for several reasons,” Spiers explained. “There was no way to look into the living brain, as powerful functional imaging now allows us to do. Also, many people, including physicians, philosophers and those in the arts, speculated about the function of the brain, the soul and consciousness, so there were many competing ideas.”

 

At one point, scientists landed on the idea that it was a double organ, like a person’s eyes or ears, two pieces that were redundant — doing the same work.

 

Around the turn of the 19th century, a French scientist named Francois Xavier Bichat decided that the two hemispheres acted in synchrony. One side mirrored the other, and vice versa.

 

As such, he reasoned that an injury to one side of the head would throw off the other, “untouched” side.

 

“[Bichat] seriously proposed the notion that a second blow could restore the wits of someone who had a previous concussion,” Spiers wrote in her paper. “Bichat justified this idea by reasoning that hemispheres that are in balance with each other functioned better, while those out of balance cause perceptual and intellectual confusion.”

 

Bichat never cited any specific cases to back up his theory and, ironically enough, he died of a probable head injury in 1802.

 

“From my reading of Bichat’s work, it seems that he felt that the second trauma amnesia cure was a common occurrence and didn’t need the citation of an individual case,” Spiers said. “This was not unusual at the time, to forgo evidence like that.”

 

Despite the lack of backing for his claims, Bichat’s ideas continued on after his death and became known as Bichat’s Law of Symmetry. Books in the following decades cited brain asymmetry as the root of different mental health issues.

 

Compounding the symmetry idea was the then-dominant belief that human memories could never be lost. Notably, Samuel Taylor Coleridge, a philosopher rather than a physician, was credited with popularizing that idea.

 

It wasn’t until the mid-1800s that scientists began to realize that taking a hit to the head might destroy memories completely. A second blow wasn’t likely to jump-start the brain, they realized, but to cause further damage.

 

By this time, however, enough anecdotes about curing amnesia with a second head trauma were floating around from otherwise respectable scientists that the theory invaded the general public’s consciousness. With “no hard and fast lines between scientific and popular writing,” myths like the second trauma amnesia cure circulated out of control, according to Spiers.

 

Even as modern scientists began to fully understand the brain, the theory still stuck with a large portion of the public, resulting in the lumps we continue to see on cartoon characters’ heads.

 

“One of the issues we see in the persistence of this myth is that understanding how the brain forgets, recovers and/or loses information is a complicated matter that is still being studied by brain scientists,” Spiers said. “As individuals, we may have had the experience of a ‘memory jog’ or cue that reminds us of a long-forgotten memory. Because our own experiences serve as powerful evidence to us, this reinforces the myth that all memories are forever stored in the brain and only need some sort of jolt to come back.”

 

But, obviously, that jolt isn’t exactly advisable.

 

“In the case of a traumatic brain injury, learning and memory may be temporarily or permanently impaired due to swelling and injured or destroyed neurons,” Spiers concluded. “Some memories may return as the brain recovers, but a second brain injury is never a good treatment for a first brain injury.”

 

Read the original Here

 

Comments (0)
Study Identifies Biomarkers for Alzheimer's Disease
By Jason von Stietz, M.A.
August 15, 2016
Getty Images

 

Researchers at the University of Wisconsin–Madison have identified biomarkers helpful in predicting the development of Alzheimer’s disease. The researchers analyzed data from 175 participants, which included brain scans, measures of cognitive abilities, and genotyping. The study was discussed in a recent article in NeuroscientistNews:

 

This approach – which statistically analyzes a panel of biomarkers – could help identify people most likely to benefit from drugs or other interventions to slow the progress of the disease. The study was published in the August edition of the journal Brain.

 

"The Alzheimer's Association estimates that if we had a prevention that merely pushed back the clinical symptoms of the disease by five years, it would almost cut in half the number of people projected to develop dementia,'' says Annie Racine, a doctoral candidate and the study's lead author. "That translates into millions of lives and billions of dollars saved."

 

Dr. Sterling Johnson, the study's senior author, says that while brain changes – such as the buildup of beta-amyloid plaques and tangles of another substance called tau – are markers of the disease, not everyone with these brain changes develops Alzheimer's symptoms.

 

"Until now, we haven't had a great way to use the biomarkers to predict who was going to develop clinical symptoms of the disease,'' Johnson says. "Although the new algorithm isn't perfect, now we can say with greater certainty who is at increased risk and more likely to decline."

 

The research team recruited 175 older adults at risk for Alzheimer's disease, and used statistical algorithms to categorize them into four clusters based on different patterns and profiles of pathology in their brains. Then, the researchers analyzed cognitive data from the participants to investigate whether these cluster groups differed on their cognitive abilities measured over as many as 10 years.
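The article does not say which clustering algorithm the team used, so the sketch below illustrates the general idea with ordinary k-means on a standardized panel of placeholder biomarker features; treat it as a conceptual demo rather than the study's method.

```python
# Illustrative clustering of a biomarker panel into four profile groups.
# The study's actual algorithm and features are not specified here; k-means
# and these placeholder features are assumptions for demonstration only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n_participants = 175
# Placeholder panel of five markers (e.g. CSF amyloid-beta, CSF tau,
# hippocampal volume, white-matter lesion load, global atrophy).
panel = rng.standard_normal((n_participants, 5))

features = StandardScaler().fit_transform(panel)   # put markers on one scale
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)

for cluster in range(4):
    print(f"cluster {cluster}: {np.sum(labels == cluster)} participants")
```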

 

As it turns out, the biomarker panels were predictive of cognitive decline over time. One cluster in particular stood out. The group that had a biomarker profile consistent with Alzheimer's – abnormal levels of tau and beta-amyloid in their cerebrospinal fluid – showed the steepest decline on cognitive tests of memory and language over the 10 years of testing. About two-thirds of the 22 people sorted into this group were also positive for the APOE4 gene—the greatest known risk factor for sporadic Alzheimer's disease—compared with about one-third in the other clusters.

 

At the other end of the spectrum, the largest group, 76 people, were sorted into a cluster that appears to be made up of healthy aging individuals. They showed normal levels on the five biomarkers and did not decline cognitively over time.

 

In between, there were two clusters that weren't classified as Alzheimer's but whose members didn't seem to be aging normally either. A group of 32 people showed signs of mixed Alzheimer's and vascular dementia. They had some of the amyloid changes in their cerebrospinal fluid, but also showed lesions in their brains' white matter, which indicate scarring from small ischemic events that can be thought of as minor, silent strokes.

 

The other cluster of 45 people had signs of brain atrophy, with brain imaging showing that the hippocampus, the brain's memory center, was significantly smaller than in the other groups. The authors speculate this group could have intrinsic brain vulnerability or could be affected by some other process that differentiates them from healthy aging. Both of the in-between clusters showed non-specific decline on a test of global cognitive functioning, which further differentiates them from the healthy aging cluster.

 

The study participants came from a group of more than 1,800 people enrolled in two studies – the Wisconsin Alzheimer's Disease Research Center (WADRC) study and the Wisconsin Registry for Alzheimer's Prevention (WRAP). Both groups are enriched for people at risk for getting Alzheimer's because about ¾ of participants have a parent with the disease.

 

"This study shows that just having a family history doesn't mean you are going to get this disease," Johnson says. "Some of the people in our studies are on a trajectory for Alzheimer's, but more of them are aging normally, and some are on track to have a different type of brain disease." A comprehensive panel of biomarkers – such as the one evaluated in this study – could help to predict those variable paths, paving the way for early interventions to stop or slow the disease.

 

The authors of the study are affiliated with the WADRC, the Wisconsin Alzheimer's Institute, the Institute on Aging, the Waisman Center, and the Neuroscience and Public Policy program, all at UW-Madison; and the Geriatrics Research Education and Clinical Center at the William S. Middleton Veterans Hospital.

 

Read the original Here

Comments (0)
New Imaging Tool Measures Synaptic Density
By Jason von Stietz, M.A.
July 29, 2016
Photo Credit: Finnema et al. (2016). Science Translational Medicine

 

Previously, researchers have only been able to study the synaptic changes caused by brain disorders through autopsy. Recently, however, researchers from Yale developed a new approach to brain scanning that allows for the measurement of synaptic density. The study was discussed in a recent article in Medical Xpress:

 

The study was published July 20 in Science Translational Medicine.

 

Certain changes in synapses—the junctions between nerve cells in the brain—have been linked with brain disorders. But researchers have only been able to evaluate synaptic changes during autopsies. For their study, the research team set out to develop a method for measuring the number of synapses, or synaptic density, in the living brain.

 

To quantify synapses throughout the brain, professor of radiology and biomedical imaging Richard Carson and his coauthors combined PET scanning technology with biochemistry. They developed a radioactive tracer that, when injected into the body, binds with a key protein that is present in all synapses across the brain. They observed the tracer through PET imaging and then applied mathematical tools to quantify synaptic density.
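As a very rough illustration of that last quantification step, the sketch below computes a simple target-to-reference uptake ratio as a proxy for specific tracer binding; the study itself relied on formal kinetic modeling, so this is only a conceptual stand-in with simulated values.

```python
# Conceptual stand-in for PET quantification: a target-to-reference uptake
# ratio as a crude proxy for specific tracer binding. The study's actual
# kinetic modeling is more involved; regions and values here are simulated.
import numpy as np

rng = np.random.default_rng(3)
target_region = 1.8 + 0.1 * rng.standard_normal(1000)     # e.g. cortex voxels
reference_region = 1.0 + 0.1 * rng.standard_normal(1000)  # low-binding region

uptake_ratio = target_region.mean() / reference_region.mean()
print(f"target/reference uptake ratio = {uptake_ratio:.2f}")
```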

 

The researchers used the imaging technique in both baboons and humans. They confirmed that the new method did serve as a marker for synaptic density. It also revealed synaptic loss in three patients with epilepsy compared to healthy individuals.

 

"This is the first time we have synaptic density measurement in live human beings," said Carson, who is senior author on the study. "Up to now any measurement of synaptic density was postmortem."

 

The finding has several potential applications. With this noninvasive method, researchers may be able to follow the progression of many brain disorders, including epilepsy and Alzheimer's disease, by measuring changes in synaptic density over time. Another application may be in assessing how pharmaceuticals slow the loss of neurons. "This opens the door to follow the natural evolution of synaptic density with normal aging and follow how drugs can alter synapses or synapse formation."

 

Carson and his colleagues plan future studies involving PET imaging of synapses to research epilepsy and other brain disorders, including Alzheimer's disease, schizophrenia, depression, and Parkinson's disease. "There are many diseases where neuro-degeneration comes into play," he noted.

 

Read the original article Here

Comments (0)
Impact of Baby's Cries on Cognition
By Jason von Stietz, M.A.
July 20, 2016
Getty Images

 

People often refer to “parental instincts” as an innate drive to care for one’s offspring. However, we know little about the role cognition plays in this instinct. Researchers at the University of Toronto examined the impact of the sound of a baby’s cries on performance during a cognitive task. The study was discussed in a recent article in Neuroscience Stuff:

 

“Parental instinct appears to be hardwired, yet no one talks about how this instinct might include cognition,” says David Haley, co-author and Associate Professor of psychology at U of T Scarborough.

 

“If we simply had an automatic response every time a baby started crying, how would we think about competing concerns in the environment or how best to respond to a baby’s distress?”

 

The study looked at the effect infant vocalizations—in this case audio clips of a baby laughing or crying—had on adults completing a cognitive conflict task. The researchers used the Stroop task, in which participants were asked to rapidly identify the color of a printed word while ignoring the meaning of the word itself. Brain activity was measured using electroencephalography (EEG) during each trial of the cognitive task, which took place immediately after a two-second audio clip of an infant vocalization.
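The trial structure described above, a two-second infant vocalization followed by a color-word Stroop item, can be sketched as a simple schedule. The sketch below omits audio playback and EEG recording entirely and uses illustrative word and color lists.

```python
# Illustrative schedule for the cue + Stroop design: each trial pairs a
# two-second infant vocalization (cry or laugh) with a congruent or
# incongruent color-word Stroop item. No audio/EEG here, just the structure.
import random
from itertools import product

random.seed(0)
words = ["RED", "GREEN", "BLUE"]
cues = ["cry", "laugh"]

trials = []
for cue, word in product(cues, words):
    congruent_color = word.lower()
    incongruent_color = random.choice([w.lower() for w in words if w != word])
    trials.append({"cue": cue, "word": word, "ink": congruent_color,
                   "condition": "congruent"})
    trials.append({"cue": cue, "word": word, "ink": incongruent_color,
                   "condition": "incongruent"})

random.shuffle(trials)
for t in trials[:6]:                          # show the first few trials
    print(f"play 2 s {t['cue']} clip -> show {t['word']} in {t['ink']} ink "
          f"({t['condition']}); correct response = {t['ink']}")
```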

 

The brain data revealed that the infant cries reduced attention to the task and triggered greater cognitive conflict processing than the infant laughs. Cognitive conflict processing is important because it controls attention—one of the most basic executive functions needed to complete a task or make a decision, notes Haley, who runs U of T’s Parent-Infant Research Lab.

 

“Parents are constantly making a variety of everyday decisions and have competing demands on their attention,” says Joanna Dudek, a graduate student in Haley’s Parent-Infant Research Lab and the lead author of the study.

 

“They may be in the middle of doing chores when the doorbell rings and their child starts to cry. How do they stay calm, cool and collected, and how do they know when to drop what they’re doing and pick up the child?”

 

A baby’s cry has been shown to cause aversion in adults, but it could also create an adaptive response by “switching on” the cognitive control parents use in effectively responding to their child’s emotional needs while also addressing other demands in everyday life, adds Haley.

 

“If an infant’s cry activates cognitive conflict in the brain, it could also be teaching parents how to focus their attention more selectively,” he says.

 

“It’s this cognitive flexibility that allows parents to rapidly switch between responding to their baby’s distress and other competing demands in their lives—which, paradoxically, may mean ignoring the infant momentarily.”

 

The findings add to a growing body of research suggesting that infants occupy a privileged status in our neurobiological programming, one deeply rooted in our evolutionary past. But, as Haley notes, it also reveals an important adaptive cognitive function in the human brain.

 

Read the original article here

Comments (0)
Hypoconnectivity Found In Brains Of Those With Intermittent Explosive Disorder
By Jason von Stietz, M.A.
July 15, 2016
Photo Credit: Lee et al., Neuropsychopharmacology

 

Why do those with anger issues tend to misunderstand social situations? Researchers at the University of Chicago Medical School compared the brains of those suffering from intermittent explosive disorder (IED) to the brains of healthy controls using diffusion tensor imaging. Findings indicated that brains of people with IED showed less white matter connecting the frontal cortex to the parietal lobes. The study was discussed in a recent article in Medical Xpress: 

 

In a new study published in the journal Neuropsychopharmacology, neuroscientists from the University of Chicago show that white matter in a region of the brain called the superior longitudinal fasciculus (SLF) has less integrity and density in people with IED than in healthy individuals and those with other psychiatric disorders. The SLF connects the brain's frontal lobe—responsible for decision-making, emotion and understanding consequences of actions—with the parietal lobe, which processes language and sensory input.

 

"It's like an information superhighway connecting the frontal cortex to the parietal lobes," said Royce Lee, MD, associate professor of psychiatry and behavioral neuroscience at the University of Chicago and lead author of the study. "We think that points to social cognition as an important area to think about for people with anger problems."

 

Lee and his colleagues, including senior author Emil Coccaro, MD, Ellen C. Manning Professor and Chair of Psychiatry and Behavioral Neuroscience at UChicago, used diffusion tensor imaging, a form of magnetic resonance imaging (MRI) that measures the volume and density of white matter connective tissue in the brain. Connectivity is a critical issue because the brains of people with psychiatric disorders usually show very few physical differences from healthy individuals.
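Diffusion tensor imaging typically summarizes white-matter "integrity" with metrics such as fractional anisotropy (FA), computed from the three eigenvalues of the diffusion tensor fitted in each voxel. The sketch below shows that standard calculation on made-up eigenvalues; it is a generic DTI illustration, not necessarily the exact measure reported in the paper.

```python
# Fractional anisotropy (FA) from the three eigenvalues of a diffusion tensor,
# a standard DTI measure of white-matter "integrity". Eigenvalues are made up.
import numpy as np

def fractional_anisotropy(l1: float, l2: float, l3: float) -> float:
    lam = np.array([l1, l2, l3], dtype=float)
    numerator = np.sqrt(((lam - lam.mean()) ** 2).sum())
    denominator = np.sqrt((lam ** 2).sum())
    return float(np.sqrt(1.5) * numerator / denominator)

# Highly anisotropic voxel (coherent fiber bundle) vs. nearly isotropic voxel.
print(fractional_anisotropy(1.7e-3, 0.3e-3, 0.3e-3))   # ~0.80
print(fractional_anisotropy(0.8e-3, 0.75e-3, 0.7e-3))  # ~0.07
```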

 

"It's not so much how the brain is structured, but the way these regions are connected to each other," Lee said. "That might be where we're going to see a lot of the problems in psychiatric disorders, so white matter is a natural place to start since that's the brain's natural wiring from one region to another."

 

People with anger issues tend to misunderstand the intentions of other people in social situations. They think others are being hostile when they are not and make the wrong conclusions about their intentions. They also don't take in all the data from a social interaction, such as body language or certain words, and notice only those things that reinforce their belief that the other person is challenging them.

 

Decreased connectivity between regions of the brain that process a social situation could lead to the impaired judgment that escalates to an explosive outburst of anger. The discovery of connectivity deficits in a specific region of the brain like the SLF provides an important starting point for more research on people with IED, as well as those with borderline personality disorder, who share similar social and emotional problems and appear to have the same abnormality in the SLF.

 

"This is another example of tangible deficits in the brains of those with IED that indicate that impulsive aggressive behavior is not simply 'bad behavior' but behavior with a real biological basis that can be studied and treated," Coccaro said.

 

Read the original article Here

Comments (0)
Strategy-Based Training Might Help Mild Cognitive Impairment
By Jason von Stietz, M.A.
June 26, 2016
Photo Credit: Neuroscience News

 

Making sense of everyday spoken and written language is an ongoing daily challenge for individuals with mild cognitive impairment (MCI), which can represent a preclinical stage of Alzheimer’s disease. Researchers at the University of Texas at Dallas and the University of Illinois at Urbana-Champaign investigated the usefulness of strategy-based reasoning training in improving cognitive ability. The study was discussed in a recent article in Neuroscience News:

 

“Changes in memory associated with MCI are often disconcerting, but cognitive challenges such as lapses in sound decision-making and judgment can have potentially worse consequences,” said Dr. Sandra Bond Chapman, founder and chief director at the Center for BrainHealth and Dee Wyly Distinguished University Professor in the School of Behavioral and Brain Sciences. “Interventions that mitigate cognitive deterioration without causing side effects may provide an additive, safe option for individuals who are worried about brain and memory changes.”

 

For the study, 50 adults ages 54-94 with amnestic MCI were randomly assigned to either a strategy-based, gist reasoning training group or a new-learning control group. Each group received two hour-long training sessions each week. The gist reasoning group received and practiced strategies on how to absorb and understand complex information, and the new-learning group used an educational approach to teach and discuss facts about how the brain works and what factors influence brain health.

 

Strategies in the gist reasoning training group focused on higher-level brain functions such as strategic attention — the ability to block out distractions and irrelevant details and focus on what is important; integrated reasoning — the ability to synthesize new information by extracting a memorable essence, pearl of wisdom, or take-home message; and innovation — the ability to appreciate diverse perspectives, derive multiple interpretations and generate new ideas to solve problems.

 

Pre- and post-training assessments measured changes in cognitive functions between the two groups. The gist reasoning group improved in executive function (i.e., strategic attention to recall more important items over less-important ones) and memory span (i.e., how many details a person can hold in their memory after one exposure, such as a phone number). The new learning group improved in detail memory (i.e., a person’s ability to remember details from contextual information). Those in the gist reasoning group also saw gains in concept abstraction, or an individual’s ability to process and abstract relationships to find similarities (e.g., how are a car and a train alike).

 

“Our findings support the potential benefit of gist reasoning training as a way to strengthen cognitive domains that have implications for everyday functioning in individuals with MCI,” said Dr. Raksha Mudar, study lead author and assistant professor at the University of Illinois at Urbana-Champaign. “We are excited about these preliminary findings, and we plan to study the long-term benefits and the brain changes associated with gist reasoning training in subsequent clinical trials.”

 

“Extracting sense from written and spoken language is a key daily life challenge for anyone with brain impairment, and this study shows that gist reasoning training significantly enhances this ability in a group of MCI patients,” said Dr. Ian Robertson, T. Boone Pickens Distinguished Scientist at the Center for BrainHealth and co-director of The Global Brain Health Initiative. “This is the first study of its kind and represents a very important development in the growing field of cognitive training for age-related cognitive and neurodegenerative disorders.”

 

“Findings from this study, in addition to our previous Alzheimer’s research, support the potential for cognitive training, and specifically gist reasoning training, to impact cognitive function for those with MCI,” said Audette Rackley, head of special programs at the Center for BrainHealth. “We hope studies like ours will aid in the development of multidimensional treatment options for an ever-growing number of people with concerns about memory in the absence of dementia.”

 

Read the original article Here

Comments (0)
Regular Exercise Protects Cognition in Older Adults
By Jason von Stietz, M.A.
June 17, 2016
Photo Credit: Getty Images

 

As baby boomers continue to age, the need for methods of maintaining cognitive abilities in older adults continues to grow. Researchers at the University of Melbourne found that regular exercise was the strongest protective factor against memory decline among the lifestyle factors they studied. Consistent exercise in any form, even something as simple as walking, led to cognitive benefits. The study was discussed in a recent article in Medical Xpress:

 

University of Melbourne researchers followed 387 Australian women from the Women's Healthy Ageing Project for two decades. The women were aged 45 to 55 when the study began in 1992.

 

The research team made note of their lifestyle factors, including exercise and diet, education, marital and employment status, number of children, mood, physical activity and smoking.

 

The women's hormone levels, cholesterol, height, weight, body mass index and blood pressure were recorded 11 times throughout the study. Hormone replacement therapy was factored in.

 

They were also asked to learn a list of 10 unrelated words and attempt to recall them half an hour later, known as an Episodic Verbal Memory test.
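Scoring such a delayed recall test is simply a matter of counting how many of the studied words come back after the delay; a toy sketch with placeholder word lists:

```python
# Toy scoring of a delayed verbal recall test: count studied words that the
# participant reproduces after the delay. Word lists are placeholders.
studied = {"apple", "river", "candle", "mirror", "garden",
           "pencil", "cloud", "ladder", "violin", "button"}
recalled = {"river", "candle", "cloud", "ladder", "pocket"}  # one intrusion

correct = studied & recalled
print(f"recalled {len(correct)}/{len(studied)} words: {sorted(correct)}")
```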

 

When measuring the amount of memory loss over 20 years, frequent physical activity, normal blood pressure and high levels of good cholesterol were all strongly associated with better recall of the words.

 

Study author Associate Professor Cassandra Szoeke, who leads the Women's Healthy Ageing Project, said that once dementia occurs, it is irreversible, and that in the study more weekly exercise was associated with better memory.

 

"We now know that brain changes associated with dementia take 20 to 30 years to develop," Associate Professor Szoeke said.

 

"The evolution of cognitive decline is slow and steady, so we needed to study people over a long time period. We used a verbal memory test because that's one of the first things to decline when you develop Alzheimer's Disease."

 

Regular exercise of any type, from walking the dog to mountain climbing, emerged as the number one protective factor against memory loss. Associate Professor Szoeke said that the best effects came from cumulative exercise, that is, how much you do and how often over the course of your life.

 

"The message from our study is very simple. Do more physical activity, it doesn't matter what, just move more and more often. It helps your heart, your body and prevents obesity and diabetes and now we know it can help your brain.

 

"It could even be something as simple as going for a walk, we weren't restrictive in our study about what type."

 

But the key, she said, was to start as soon as possible.

 

"We expected it was the healthy habits later in life that would make a difference but we were surprised to find that the effect of exercise was cumulative. So every one of those 20 years mattered.

 

"If you don't start at 40, you could miss one or two decades of improvement to your cognition because every bit helps. That said, even once you're 50 you can make up for lost time."

 

Read the original article Here

Comments (0)