
Strategy-Based Training Might Help Mild Cognitive Impairment
By Jason von Stietz, M.A.
June 26, 2016
Photo Credit: Neuroscience News

 

Making sense of everyday spoken and written language is an ongoing daily challenge for individuals with mild cognitive impairment (MCI), a preclinical stage of Alzheimer's disease. Researchers at the University of Texas at Dallas and the University of Illinois at Urbana-Champaign investigated whether strategy-based reasoning training can improve cognitive ability. The study was discussed in a recent article in Neuroscience News:

 

“Changes in memory associated with MCI are often disconcerting, but cognitive challenges such as lapses in sound decision-making and judgment can have potentially worse consequences,” said Dr. Sandra Bond Chapman, founder and chief director at the Center for BrainHealth and Dee Wyly Distinguished University Professor in the School of Behavioral and Brain Sciences. “Interventions that mitigate cognitive deterioration without causing side effects may provide an additive, safe option for individuals who are worried about brain and memory changes.”

 

For the study, 50 adults ages 54-94 with amnestic MCI were randomly assigned to either a strategy-based, gist reasoning training group or a new-learning control group. Each group received two hour-long training sessions each week. The gist reasoning group received and practiced strategies on how to absorb and understand complex information, and the new-learning group used an educational approach to teach and discuss facts about how the brain works and what factors influence brain health.

 

Strategies in the gist reasoning training group focused on higher-level brain functions such as strategic attention — the ability to block out distractions and irrelevant details and focus on what is important; integrated reasoning — the ability to synthesize new information by extracting a memorable essence, pearl of wisdom, or take-home message; and innovation — the ability to appreciate diverse perspectives, derive multiple interpretations and generate new ideas to solve problems.

 

Pre- and post-training assessments measured changes in cognitive functions between the two groups. The gist reasoning group improved in executive function (i.e., strategic attention to recall more important items over less-important ones) and memory span (i.e., how many details a person can hold in their memory after one exposure, such as a phone number). The new learning group improved in detail memory (i.e., a person’s ability to remember details from contextual information). Those in the gist reasoning group also saw gains in concept abstraction, or an individual’s ability to process and abstract relationships to find similarities (e.g., how are a car and a train alike).

 

“Our findings support the potential benefit of gist reasoning training as a way to strengthen cognitive domains that have implications for everyday functioning in individuals with MCI,” said Dr. Raksha Mudar, study lead author and assistant professor at the University of Illinois at Urbana-Champaign. “We are excited about these preliminary findings, and we plan to study the long-term benefits and the brain changes associated with gist reasoning training in subsequent clinical trials.”

 

“Extracting sense from written and spoken language is a key daily life challenge for anyone with brain impairment, and this study shows that gist reasoning training significantly enhances this ability in a group of MCI patients,” said Dr. Ian Robertson, T. Boone Pickens Distinguished Scientist at the Center for BrainHealth and co-director of The Global Brain Health Initiative. “This is the first study of its kind and represents a very important development in the growing field of cognitive training for age-related cognitive and neurodegenerative disorders.”

 

“Findings from this study, in addition to our previous Alzheimer’s research, support the potential for cognitive training, and specifically gist reasoning training, to impact cognitive function for those with MCI,” said Audette Rackley, head of special programs at the Center for BrainHealth. “We hope studies like ours will aid in the development of multidimensional treatment options for an ever-growing number of people with concerns about memory in the absence of dementia.”

 

Read the original article here.

Regular Exercise Protects Cognition in Older Adults
By Jason von Stietz, M.A.
June 17, 2016
Photo Credit: Getty Images

 

As baby boomers continue to age, the need for methods of maintaining cognitive abilities in older adults continues to grow. Researchers at the University of Melbourne found that regular exercise is the most effective approach to preventing cognitive decline. Consistent exercise in any form, even as simple as walking, led to cognitive benefits. The study was discussed in a recent article in Medical Xpress: 

 

University of Melbourne researchers followed 387 Australian women from the Women's Healthy Ageing Project for two decades. The women were aged 45 to 55 when the study began in 1992.

 

The research team made note of their lifestyle factors, including exercise and diet, education, marital and employment status, number of children, mood, physical activity and smoking.

 

The women's hormone levels, cholesterol, height, weight, Body Mass Index and blood pressure were recorded 11 times throughout the study. Hormone replacement therapy was also factored in.

 

The women were also asked to learn a list of 10 unrelated words and attempt to recall them half an hour later, a measure known as an Episodic Verbal Memory test.
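As an illustrative aside, scoring a delayed-recall test like this amounts to counting how many studied words come back after the delay. The sketch below is a hypothetical implementation for illustration only; the word list and function names are assumptions, not the study's actual materials.

```python
# Hypothetical sketch of scoring an episodic verbal memory test:
# the score is how many of the studied words are correctly recalled
# after the delay (order does not matter, repetitions count once,
# and words that were never studied are ignored as intrusions).

STUDY_LIST = ["drum", "curtain", "bell", "coffee", "school",
              "parent", "moon", "garden", "hat", "farmer"]

def recall_score(recalled_words):
    """Count unique correct recalls, ignoring case and repetitions."""
    studied = {w.lower() for w in STUDY_LIST}
    recalled = {w.lower() for w in recalled_words}
    return len(studied & recalled)

# A participant's recall, including one repetition ("bell")
# and one intrusion ("river").
score = recall_score(["moon", "bell", "Coffee", "bell",
                      "river", "hat", "drum", "garden"])
```

Set intersection handles the bookkeeping (duplicates and intrusions) in one step, which is why the sketch normalizes both lists to lowercase sets before comparing.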

 

When the researchers measured the amount of memory loss over the 20 years, frequent physical activity, normal blood pressure and high levels of "good" (HDL) cholesterol were all strongly associated with better recall of the words.

 

Study author Associate Professor Cassandra Szoeke, who leads the Women's Healthy Ageing Project, said that once dementia occurs, it is irreversible. "In our study, more weekly exercise was associated with better memory," she said.

 

"We now know that brain changes associated with dementia take 20 to 30 years to develop," Associate Professor Szoeke said.

 

"The evolution of cognitive decline is slow and steady, so we needed to study people over a long time period. We used a verbal memory test because that's one of the first things to decline when you develop Alzheimer's Disease."

 

Regular exercise of any type, from walking the dog to mountain climbing, emerged as the number one protective factor against memory loss. Associate Professor Szoeke said that the best effects came from cumulative exercise: how much you do and how often over the course of your life.

 

"The message from our study is very simple. Do more physical activity, it doesn't matter what, just move more and more often. It helps your heart, your body and prevents obesity and diabetes and now we know it can help your brain.

 

"It could even be something as simple as going for a walk; we weren't restrictive in our study about what type."

 

But the key, she said, was to start as soon as possible.

 

"We expected it was the healthy habits later in life that would make a difference but we were surprised to find that the effect of exercise was cumulative. So every one of those 20 years mattered.

 

"If you don't start at 40, you could miss one or two decades of improvement to your cognition because every bit helps. That said, even once you're 50 you can make up for lost time."

 

Read the original article here.

Depression in Low SES Adolescents Linked to Epigenetics
By Jason von Stietz, M.A.
June 7, 2016
Photo Credit: Getty Images

 

Research findings have long pointed to a link between poverty and depression in high-risk adolescents. Researchers at Duke University identified one mechanism involving epigenetic modification of gene expression. These epigenetic changes resulted in adolescents with more active amygdalae, which led to higher rates of depression. The study was discussed in a recent article in Medical Xpress:

 

The results are part of a growing body of work that may lead to biological predictors that could guide individualized depression-prevention strategies.

 

Adolescents growing up in households with lower socioeconomic status were shown to accumulate greater quantities of a chemical tag on a depression-linked gene over the course of two years. These "epigenetic" tags work by altering the activity of genes. The more chemical tags an individual had near a gene called SLC6A4, the more responsive was their amygdala—a brain area that coordinates the body's reactions to threat—to photographs of fearful faces as they underwent functional MRI brain scans. Participants with a more active amygdala were more likely to later report symptoms of depression.

 

"This is some of the first research to demonstrate that low socioeconomic status can lead to changes in the way genes are expressed, and it maps this out through brain development to the future experience of depression symptoms," said the study's first author Johnna Swartz, a Duke postdoctoral researcher in the lab of Ahmad Hariri, a Duke professor of psychology and neuroscience.

 

Adolescence is rarely an easy time for anyone. But growing up in a family with low socioeconomic status or SES—a metric that incorporates parents' income and education levels—can add chronic stressors such as family discord and chaos, and environmental risks such as poor nutrition and smoking.

 

"These small daily hassles of scraping by are evident in changes that build up and affect children's development," Swartz said.

 

The study included 132 non-Hispanic Caucasian adolescents in the Teen Alcohol Outcomes Study (TAOS) who were between 11 and 15 years old at the outset of the study and came from households that ranged from low to high SES. About half of the participants had a family history of depression.

 

"The biggest risk factor we have currently for depression is a family history of the disorder," said study co-author Douglas Williamson, principal investigator of TAOS and professor of psychiatry and behavioral sciences at Duke. "Our new work reveals one of the mechanisms by which such familial risk may be manifested or expressed in a particular group of vulnerable individuals during adolescence."

 

The group's previous work, published last year in the journal Neuron, had shown that fMRI scan activity of the amygdala could signal who is more likely to experience depression and anxiety in response to stress several years later. That study included healthy college-aged participants of Hariri's ongoing Duke Neurogenetics Study (DNS), which aims to link genes, brain activity, and other biological markers to a risk for mental illness.

 

This study asked whether higher activity in the same brain area could predict depression in the younger, at-risk TAOS participants. Indeed, about one year later, these individuals (now between 14 and 19 years of age) were more likely to report symptoms of depression, especially if they had a family history of the disorder.

 

Swartz said the new study examined a range of socioeconomic status and did not focus specifically on families affected by extreme poverty or neglect. She said the findings suggest that even modestly lower socioeconomic status is associated with biological differences that elevate adolescents' risk for depression.

 

Most of the team's work so far has focused on epigenetic chemical tags near the SLC6A4 gene because it helps control the brain's levels of serotonin, a neurochemical involved in clinical depression and other mood disorders. The more marks present just upstream of this gene, the less likely it is to be active.

 

In 2014, Williamson and Hariri first showed that the presence of marks near the SLC6A4 gene can predict the way a person's amygdala responds to threat. That study included both Williamson's TAOS and Hariri's DNS participants, but had looked at the chemical tags at a single point in time.

 

Looking at the changes in these markers over an extended time is a more powerful way to understand an individual's risk for depression, said Hariri, who is also a member of the Duke Institute for Brain Sciences.

 

The team is now searching the genome for new markers that would predict depression. Ultimately, a panel of markers used in combination will lead to more accurate predictions, Swartz said.

 

They also hope to expand the age ranges of the study to include younger individuals and to continue following the TAOS participants into young adulthood.

 

"As they enter into young adulthood they are going to be experiencing more problems with depression or anxiety—or maybe substance abuse," Hariri said. "The extent to which our measures of their genomes and brains earlier in their lives continue to predict their relative health is something that's very important to know and very exciting for us to study."

 

Read the original article here.

Sleep Patterns Uncovered Using Smartphone Application
By Jason von Stietz, M.A.
May 31, 2016
Getty Images

 

What is the most powerful influence on our sleep patterns? Circadian rhythms? The needs and demands of society? And why is jet lag so difficult to overcome? Researchers from the University of Michigan studied sleep patterns by employing a free app. The findings were discussed in a recent article in Medical Xpress:

 

A pioneering study of worldwide sleep patterns combines math modeling, mobile apps and big data to parse the roles society and biology each play in setting sleep schedules.

 

The study, led by University of Michigan mathematicians, used a free smartphone app that reduces jetlag to gather robust sleep data from thousands of people in 100 nations. The researchers examined how age, gender, amount of light and home country affect the amount of shut-eye people around the globe get, when they go to bed, and when they wake up.

 

Among their findings is that cultural pressures can override natural circadian rhythms, with the effects showing up most markedly at bedtime. While morning responsibilities like work, kids and school play a role in wake-time, the researchers say they're not the only factor. Population-level trends agree with what they would expect from current knowledge of the circadian clock.

 

"Across the board, it appears that society governs bedtime and one's internal clock governs wake time, and a later bedtime is linked to a loss of sleep," said Daniel Forger, who holds faculty positions in mathematics at the U-M College of Literature, Science, and the Arts, and in the U-M Medical School's Department of Computational Medicine and Bioinformatics. "At the same time, we found a strong wake-time effect from users' biological clocks—not just their alarm clocks. These findings help to quantify the tug-of-war between solar and social timekeeping."

 

When Forger talks about internal or biological clocks, he's referring to circadian rhythms—fluctuations in bodily functions and behaviors that are tied to the planet's 24-hour day. These rhythms are set by a grain-of-rice-sized cluster of 20,000 neurons behind the eyes. They're regulated by the amount of light, particularly sunlight, our eyes take in.

 

Circadian rhythms have long been thought to be the primary driver of sleep schedules, even after the advent of artificial light and 9-to-5 work schedules. The new research helps to quantify the role that society plays.

 

Here's how Forger and colleague Olivia Walch arrived at their findings. Several years ago, they released an app called Entrain that helps travelers adjust to new time zones. It recommends custom schedules of light and darkness. To use the app, you have to plug in your typical hours of sleep and light exposure, and are given the option of submitting your information anonymously to U-M.

 

The quality of the app's recommendations depended on the accuracy of the users' information, and the researchers say this motivated users to be particularly careful in reporting their lighting history and sleep habits.

 

With information from thousands of people in hand, they then analyzed it for patterns. Any correlations that bubbled up, they put to the test in what amounts to a circadian rhythm simulator. The simulator—a mathematical model—is based on the field's deep knowledge of how light affects the brain's suprachiasmatic nucleus (that's the cluster of neurons behind the eyes that regulates our internal clocks). With the model, the researchers could dial the sun up and down at will to see if the correlations still held in extreme conditions.
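To make the "dial the sun up and down" idea concrete, here is a toy sketch of a light-driven circadian phase oscillator. It is not the researchers' actual model (their simulator is based on detailed equations for the suprachiasmatic nucleus); every function name, parameter value, and the crude phase-response term are assumptions chosen purely for illustration.

```python
# Toy sketch (NOT the study's model): a circadian phase that drifts at its
# intrinsic period but is nudged toward the solar day by light exposure.
import math

def simulate_phase(light_schedule, hours=240, dt=0.1, tau=24.2, gain=0.05):
    """Integrate a circadian phase (radians) under a light schedule.

    light_schedule(t) -> light intensity in [0, 1] at hour t.
    The intrinsic period tau is slightly longer than 24 h, so without
    light the clock free-runs; light input pulls the phase toward
    alignment with the external day.
    """
    omega = 2 * math.pi / tau          # intrinsic angular velocity (rad/h)
    phase, t = 0.0, 0.0
    while t < hours:
        # Crude phase-response: light advances or delays the clock
        # depending on how far it lags or leads the solar phase.
        solar_phase = 2 * math.pi * (t % 24) / 24
        dphase = omega + gain * light_schedule(t) * math.sin(solar_phase - phase)
        phase += dphase * dt
        t += dt
    return phase % (2 * math.pi)

def daylight(t):
    """Simple schedule: full light between hours 8 and 20 of each day."""
    return 1.0 if 8 <= (t % 24) < 20 else 0.0

# "Dialing the sun up and down": compare ten days of normal daylight
# against ten days of constant darkness.
entrained = simulate_phase(daylight)
free_run = simulate_phase(lambda t: 0.0)
```

The point of such a simulator is exactly what the article describes: once the model is fixed, the light input can be set to extreme or artificial schedules at will, and one can check whether a correlation observed in the app data still holds.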

 

"In the real world, bedtime doesn't behave how it does in our model universe," Walch said. "What the model is missing is how society affects that."

 

The spread of national averages of sleep duration ranged from a minimum of around 7 hours, 24 minutes of sleep for residents of Singapore and Japan to a maximum of 8 hours, 12 minutes for those in the Netherlands. That's not a huge window, but the researchers say every half hour of sleep makes a big difference in terms of cognitive function and long-term health.

 

The findings, the researchers say, point to an important lever for the sleep-deprived—a set that the Centers for Disease Control and Prevention is concerned about. A recent CDC study found that across the U.S., one in three adults aren't getting the recommended minimum of seven hours. Sleep deprivation, the CDC says, increases the risk of obesity, diabetes, high blood pressure, heart disease, stroke and stress.

 

The U-M researchers also found that:

 

  • Middle-aged men get the least sleep, often getting less than the recommended 7 to 8 hours.
  • Women schedule more sleep than men, about 30 minutes more on average. They go to bed a bit earlier and wake up later. This is most pronounced in ages between 30 and 60.
  • People who spend some time in the sunlight each day tend to go to bed earlier and get more sleep than those who spend most of their time in indoor light.
  • Habits converge as we age. Sleep schedules were more similar among the older-than-55 set than those younger than 30, which could be related to a narrowing window in which older individuals can fall and stay asleep.

 

Sleep is more important than a lot of people realize, the researchers say. Even if you get six hours a night, you're still building up a sleep debt, says Walch, a doctoral student in the mathematics department and a co-author on the paper.

 

"It doesn't take that many days of not getting enough sleep before you're functionally drunk," she said. "Researchers have figured out that being overly tired can have that effect. And what's terrifying at the same time is that people think they're performing tasks way better than they are. Your performance drops off but your perception of your performance doesn't."

 

Aside from the findings themselves, the researchers say the work demonstrates that mobile technology can be a reliable way to gather massive data sets at very low cost.

 

"This is a cool triumph of citizen science," Forger said.

 

Read the original article here.

Role of Frontal Cortex in Perceptual Decision-Making
By Jason von Stietz, M.A.
May 22, 2016
 

 

Why do we sometimes not see what is directly in front of us? Researchers at the Georgia Institute of Technology and the University of California, Berkeley studied the role of the frontal cortex in perceptual decision-making using transcranial magnetic stimulation. The study was published in the peer-reviewed journal Proceedings of the National Academy of Sciences and later discussed in a recent article in Medical Xpress:

 

A sportscaster lunges forward. "Interception! Drew Brees threw the ball right into the opposing linebacker's hands! Like he didn't even see him!"

 

The quarterback likely actually did not see the defender standing right in front of him, said Dobromir Rahnev, a psychologist at the Georgia Institute of Technology. Rahnev leads a research team making new discoveries about how the brain organizes visual perception, including how it leaves things out even when they're plainly in sight.

 

Rahnev and researchers from the University of California, Berkeley have come up with a rough map of the frontal cortex's role in controlling vision. They published their findings on Monday, May 9, 2016, in the journal Proceedings of the National Academy of Sciences.

 

Thinking cap

 

The frontal cortex is often seen as our "thinking cap," the part of the brain scientists associate with thinking and making decisions. But it's not commonly connected with vision. "Some people believe that the frontal cortex is not involved," said Rahnev, an assistant professor at the School of Psychology. The new research adds to previous evidence that it is, he said.

 

The lack of association with that part of the brain may stem from the fact that other brain regions transform information coming from the eyes into sight, and still others make sense of it by, for example, identifying the objects in a scene.

 

But the thinking cap of the brain controls and oversees this whole process, making it as essential to how we see as those other areas, Rahnev said. How that works also accounts for why we sometimes miss things right in front of us.

 

A camera it's not

 

"We feel that our vision is like a camera, but that is utterly wrong," Rahnev said. "Our brains aren't just seeing, they're actively constructing the visual scene and making decisions about it." Sometimes the frontal cortex isn't expecting to see something, so although it's in plain sight, it blots it out of consciousness.

 

To test the frontal cortex's involvement in vision, the researchers ran a two-part experiment.

 

First, they observed which regions of the brain—in particular the frontal cortex—lit up with activity while healthy volunteers completed visual tasks corresponding to three basic stages of conscious visual perception.

 

Second, they inhibited those same regions using magnetic stimulation to confirm their involvement in each visual stage.

 

Believing is part of seeing

 

The first stage of the visual perception the researchers tested for was selection, Rahnev said. That's when the brain picks out part of the vast array of available visual stimuli to actually pay attention to.

 

In the case of the football quarterback, this might mean focusing on the route the receiver takes.

 

The second stage is combination, he said. The brain merges the visual information it processed with other material. "The quarterback's brain is putting what he actually sees together with expectations based on the play he called," Rahnev said.

 

Then comes evaluation. The quarterback needs to decide whether to release the ball given everything he has processed.

 

Expecting a blocker to stop the defending player (which didn't happen), he may have blotted him out of perception and thrown the ball right at him. Interception.

 

"The frontal cortex sends a signal to move your attention onto the object you select," Rahnev said. "It does some of the combining with other information, and then it's probably the primary evaluator of what you think you saw."

 

Simple vision brain map

 

In experiments, during a functional MRI scan, different parts of the frontal cortex of the participants lit up, corresponding to each vision function.

 

The back of the frontal cortex activated during selection; its midsection lit up during combination, and the front, or anterior, part cranked up during evaluation.

 

That's how the researchers arrived at a kind of vision map of the frontal cortex. "It's a rudimentary map," Rahnev said. "A very simple one that just says, 'This is the back. This is the middle. This is the front.'"

 

The critical evidence

 

The critical evidence for this map came from the use of magnetic stimulation. When the researchers used it to inhibit the back and middle of the frontal cortex separately, subjects became less able to complete the corresponding functions of selection and combination.

 

When they stimulated the front, the opposite happened. Subjects were slightly but significantly better able to evaluate the accuracy of what they thought they saw.

 

"This is a really clear demonstration of the role that the frontal cortex, which is usually seen as the seat of thought, plays in controlling vision."

 

Sorry officer!

 

And there is a practical takeaway for health and safety. Instead of the quarterback telling the coach, "I swear I didn't see that coming," often it's motorists telling police officers the same thing after a car accident.

 

Distraction is often the culprit, because it overtaxes the organization of perception, Rahnev said. These three functions are going on all the time in multiple scenarios as our brains process the world around us.

 

But add too much to the pile, like texting behind the wheel, Rahnev said, and "you can run right into a parked car without ever seeing it."

 

Read the original article here.

Do Mirror Neurons Influence Action Recognition?
By Jason von Stietz, M.A.
May 20, 2016
Getty Images

 

Mirror neurons are thought to be the switching point between visual and motor centers in the brain. Mirror neurons help our brain interpret the actions of others. When someone raises a fist toward us, do they mean to give us a friendly greeting (a fist bump) or do they mean to attack us? Researchers at the Max Planck Institute studied the role of mirror neurons in action recognition. The study was discussed in a recent article in Medical Xpress:

 

It is suspected that mirror neurons enable us to empathize and put ourselves 'in other people's shoes'. When we see that someone has been injured, we also experience internal suffering: these special neurons cause what we see to be simulated in our brain in a way that makes us feel as though we are experiencing it in our own bodies.

 

In perception research, it is assumed that mirror neurons enable people to re-enact a movement they have seen in their own motor system. This internal recreation of what we have seen probably enables us to infer the meaning of the observed action. The mirror neurons act as the switching point between the motor and visual areas of the brain. If the motor system really is the determining factor in classifying an action, then perception should also be open to manipulation by our own execution of an action.

 

Attack or greeting?

 

In their study, the researchers analyzed the mechanism by which the brain recognizes an action. To do this, they showed the test subjects two different movements: a punch and a greeting gesture known as the 'fist bump', practised by young men in particular. The researchers arranged the scenario as realistically as possible. A life-sized avatar was shown on a screen facing the test subjects. Using 3D glasses, the subjects were able to see their virtual partners in three dimensions – the avatar's movements appeared as though they were unfolding within the test subjects' reach.

 

All the test subjects were required to do was to decide whether they were being presented with an aggressive punch or well-intentioned greeting. However, the scientists made the conditions more difficult by combining the two gestures in a single movement. The avatar's intentions were thus a matter of interpretation.

 

The question behind the experiment then was whether people allow themselves to be influenced by their own motor system when interpreting the actions of others. The test subjects were manipulated in different ways in the experiment: they could observe a clearly identifiable action played in a continuous loop on a screen. They became active at the same time themselves by carrying out air punches, for example. They were then asked to assess how the indefinable movement of the avatar should be interpreted.

 

I only believe what I also see

 

When the two sensory stimuli were pitted against each other (that is, the test subjects saw a fist bump in front of them while carrying out a punch movement themselves), the visual impression was the clear winner. The subjects' own movement did not have any influence on their perception. Contrary to what was previously assumed, the motor system had little or no influence on the participants' assessment of the movement. To the astonishment of the scientists, the mirror neurons associated with the motor system clearly did not play any major role in the action recognition process.

 

With their experimental setup, the team was able for the first time to study the contribution of the motor system to action recognition during social interaction, and thereby also to test the existing theory on the interaction between mirror neurons and stimulus processing. "Contrary to what was previously assumed, the mirror neurons do not have a particularly significant influence on the interpretation of an action. Visual perception is far more important for our brain—in social situations, we rely almost exclusively on what we see," says the head of the study, Stephan de la Rosa, summarizing the study findings.

 

Read the original article here.

Memory Impairment Related To Brain Signal Between Seizures
By Jason von Stietz, M.A.
May 15, 2016
Getty Images

 

Many patients with epilepsy suffer from cognitive deficits. Researchers at New York University Langone Medical Center conducted an animal-model study examining how signals sent from the hippocampus to the cortex relate to impaired memory in seizure patients. The study was published in Nature Medicine and discussed in a recent article in NeuroScientistNews:

 

Continually between seizures, brain cells in epileptic patients send signals that make "empty memories," perhaps explaining the learning problems faced by up to 40 percent of patients. This is the finding of a study in rats and humans led by researchers at New York University (NYU) Langone Medical Center and published in Nature Medicine.

 

"Our study sheds the first light on the mechanisms by which epilepsy hijacks a normal brain process, disrupting the signals needed to form memories," says study lead author and NYU Langone pediatric neurologist Jennifer Gelinas, MD, PhD. "Many of my patients feel that cognitive problems have at least as much impact on their lives as seizures, but we have nothing to offer them beyond seizure control treatments. We hope to change that."

 

The study results revolve around two brain regions, the hippocampus and cortex, shown by past studies to exchange precise signals as each day's experiences are converted into permanent memories during sleep. The study authors found that epileptic signals come from the hippocampus not as part of normal memory consolidation, but instead as meaningless commands that the cortex must process like memories.

 

Study rats experiencing such abnormal signals had significant difficulties navigating to places where they had previously found water. Furthermore, the degree of abnormal hippocampal-cortical signaling in study animals tracked closely with the level of memory impairment.

 

The study also looked at data from epilepsy patients who had their brain signals monitored as part of surgery preparation. Researchers found that rats and humans with epilepsy experienced similar abnormal hippocampal discharges between seizures that resembled, but out-competed, normal memory-forming communication between brain regions.

 

Given the tens of milliseconds delay observed between hippocampus signals and the response from the cortex, researchers see a time window during which an implanted device might interrupt disease-related signals, and have launched a related design effort.

 

Foundation Built over Decades

 

Starting in 1989, senior author of the study and NYU Langone neuroscientist Gyorgy Buzsaki, PhD, established the theory that memories form in two stages: one while awake and another during sleep, when the day's events are replayed.

 

As the latest step in that work, Buzsaki also led a study published in the journal Science last month that explained key mechanisms behind hippocampal-cortical memory consolidation.

 

Countering the idea that most neurons contribute equally as memories form, his team found that a few strongly active "rigid" neurons perform the same way before and after experiences, while a second set of rarely contributing "plastic" neurons behave differently before and after opportunities for memory consolidation.

 

"We seem to have evolved with both a stable template of neurons that process what is the same about the things we encounter, and a second group that can learn with new experiences," says Buzsaki. "This new understanding of memory consolidation made possible our insights into epilepsy."

 

Buzsaki has shown that the hippocampus processes information in rhythmic cycles, with thousands of nerve signals sent regularly and within milliseconds of each other. By firing in synchrony, brain cells cooperate to achieve complex signals, but only if this wave is sculpted, with signals afforded proper strengths and placed in order. Unfortunately, the synchronous nature of hippocampal signaling creates risk, says Buzsaki, because without proper control it can convey powerful nonsense messages to the rest of the brain.

 

Read the original article Here

Autism Biomarker for Boys Found
By Jason von Stietz, M.A.
April 29, 2016
Credit: George Washington University

 

Researchers at the George Washington University investigated the use of brain scans in measuring the progress of treatments for Autism Spectrum Disorder in boys. Findings indicated that a biomarker related to brain circuitry involved in social perception may help clinicians more quickly diagnose and treat difficult-to-diagnose patients. The study was discussed in a recent article in Medical Xpress: 

 

Researchers have developed a new method to map and track the function of brain circuits affected by autism spectrum disorder (ASD) in boys using brain imaging. The technique will provide clinicians and therapists with a physical measure of the progress patients are making with behavioral and/or drug treatments - a tool that has been elusive in autism treatment until this point.

 

For the first time, doctors would be able to quantify how that brain circuit is working in their patients and assess the effectiveness of an intervention. The research is outlined in a paper, "Quantified Social Perception Circuit Activity as a Neurobiological Marker of Autism Spectrum Disorder," published Wednesday in JAMA Psychiatry. The paper focuses on the use of biomarkers, measurable indicators of a biological condition, to measure the function of the social perception circuit of the brain.

 


"This is significant because biomarkers give us a 'why' for understanding autism in boys that we haven't had before," said Kevin Pelphrey, a co-author of the paper, who is the Carbonell Family Professor in Autism and Neurodevelopmental Disorders and director of the Autism and Neurodevelopmental Disorders Institute at the George Washington University. "We can now use functional biomarkers to identify what treatments will be effective for individual cases and measure progress."

 


Researchers analyzed a series of 164 images from each of 114 individuals and discovered the brain scans of the social perception circuits only indicated ASD in boys. This new research has the potential to improve treatment for ASD by measuring changes in the social perception brain circuit in response to different interventions. The researchers found the brain scan data can be an effective indicator of function of the circuit in younger children and older patients alike.

 

The research is particularly relevant for ASD patients who are difficult to diagnose and treat, as it can provide a more definitive diagnosis and help in developing a treatment program when it is not clear whether behavioral treatment, drug treatment, or a combination of the two will be most effective.

 

"The behavioral symptoms of ASD are so complex and varied it is difficult to determine whether a new treatment is effective, especially within a realistic time frame," said Malin Björnsdotter, assistant professor at the University of Gothenburg and lead author of the paper. "Brain function markers may provide the specific and objective measures required to bridge this gap."

 

A Path to Widespread Use of Brain Scans?

 

In addition to helping to identify the most effective ASD treatment for an individual, this research provides evidence that brain imaging is an important intervention tool. Currently, functional MRI, the type of brain scan used in this study, is not a standard part of ASD treatment, as there is not enough evidence linking the scan to effective treatments. The Autism and Neurodevelopmental Disorders Institute at GW aims to make significant contributions toward the establishment of evidence-based therapies for ASD.

 


 

 

"This kind of imaging can help us answer the question, 'On day one of treatment, will this child benefit from a 16-week behavioral intervention?'" Dr. Pelphrey said. "Answering that question will help parents save time and money on diagnosis and treatments."

 

Following the study, Dr. Pelphrey and his colleagues will test their findings at the next level: studying a larger pool of people with autism and other neurological disorders in collaboration with Children's National Medical Center to see if the scan can successfully distinguish ASD from other disorders and track treatment progress.

 

The authors emphasized that this research is still in the earliest days, pointing out that doctors' offices and most hospitals do not have the specialized imaging equipment necessary to carry out the brain scans used by the team involved in this study.

 

"To really help patients we need to develop inexpensive, easy-to-use techniques that can be applied in any group, including infants and individuals with severe behavioral problems," said Dr. Björnsdotter. "This study is a first step toward that goal."

 

While this method currently only works for boys with autism, the researchers are leading a large-scale, nationwide study of girls with autism to identify equivalent techniques that will work for them. The group expects to have the initial results from that study later this year.

 

Read the original article Here

Anticholinergics Linked to Changes in Brain and Cognitive Impairment
By Jason von Stietz, M.A.
April 26, 2016
Getty Images

 

Are over-the-counter drugs completely safe? Researchers at Indiana University School of Medicine used positron emission tomography (PET) to study the relationship between anticholinergic drugs and cognitive decline. Anticholinergics are commonly used as sleep aids as well as treatments for hypertension and cardiovascular disease. Findings linked the use of anticholinergics to physical changes in the brain and to cognitive impairment. The study was discussed in a recent article in Medical Xpress:  

 

Older adults might want to avoid using a class of drugs commonly found in over-the-counter products such as nighttime cold medicines because of their links to cognitive impairment, a research team led by scientists at Indiana University School of Medicine has recommended.

 

Using brain imaging techniques, the researchers found lower metabolism and reduced brain sizes among study participants taking the drugs known to have an anticholinergic effect, meaning they block acetylcholine, a nervous system neurotransmitter.

 

Previous research found a link between anticholinergic drugs and cognitive impairment and an increased risk of dementia. The new paper, published in the journal JAMA Neurology, is believed to be the first to study the potential underlying biology of those clinical links using neuroimaging measurements of brain metabolism and atrophy.

 

"These findings provide us with a much better understanding of how this class of drugs may act upon the brain in ways that might raise the risk of cognitive impairment and dementia," said Shannon Risacher, Ph.D., assistant professor of radiology and imaging sciences, first author of the paper, "Association Between Anticholinergic Medication Use and Cognition, Brain Metabolism, and Brain Atrophy in Cognitively Normal Older Adults."

 

"Given all the research evidence, physicians might want to consider alternatives to anticholinergic medications if available when working with their older patients," Dr. Risacher said.

 

Drugs with anticholinergic effects are sold over the counter and by prescription as sleep aids and for many chronic diseases including hypertension, cardiovascular disease, and chronic obstructive pulmonary disease.

 

A list of anticholinergic drugs and their potential impact is at http://www.agingbraincare.org/uploads/products/ACB_scale_-_legal_size.pdf.

 

Scientists have linked anticholinergic drugs to cognitive problems among older adults for at least 10 years. A 2013 study by scientists at the IU Center for Aging Research and the Regenstrief Institute found that drugs with a strong anticholinergic effect cause cognitive problems when taken continuously for as few as 60 days. Drugs with a weaker effect could cause impairment within 90 days.

 

The current research project involved 451 participants, 60 of whom were taking at least one medication with medium or high anticholinergic activity. The participants were drawn from a national Alzheimer's research project—the Alzheimer's Disease Neuroimaging Initiative—and the Indiana Memory and Aging Study.

 

To identify possible physical and physiological changes that could be associated with the reported effects, researchers assessed the results of memory and other cognitive tests, positron emission tomography (PET) scans measuring brain metabolism, and magnetic resonance imaging (MRI) scans of brain structure.

 

The cognitive tests revealed that patients taking anticholinergic drugs performed worse than older adults not taking the drugs on short-term memory and some tests of executive function, which cover a range of activities such as verbal reasoning, planning, and problem solving.

 

Anticholinergic drug users also showed lower levels of glucose metabolism—a biomarker for brain activity—in both the overall brain and in the hippocampus, a region of the brain associated with memory and which has been identified as affected early by Alzheimer's disease.

 

The researchers also found significant links between brain structure revealed by the MRI scans and anticholinergic drug use, with the participants using anticholinergic drugs having reduced brain volume and larger ventricles, the cavities inside the brain.

 

"These findings might give us clues to the biological basis for the cognitive problems associated with anticholinergic drugs, but additional studies are needed if we are to truly understand the mechanisms involved," Dr. Risacher said.

 

Read the original article Here

Study Links Affective Understanding to Attraction
By Jason von Stietz, M.A.
April 8, 2016
Getty Images

 

What is it that leads to attraction? Researchers conducted an experiment utilizing fMRI scans of participants watching videos of women. Participants found a woman more attractive, an effect reflected in the brain's reward center, when they perceived themselves to understand her emotional experience. The findings were discussed in a recent article in Medical Xpress: 

 

(Medical Xpress)—A team of researchers with members from a large number of institutions in Germany has conducted a study that has revealed more about the way interpersonal attraction works in the brain. In their paper published in Proceedings of the National Academy of Sciences, the group describes two experiments they conducted with volunteers, their results and what they believe was revealed about the nature of the mechanism of attraction between people.

 

Almost everyone has experienced near-instant attraction to someone else, whether of a social or sexual nature, but few are able to pin down exactly why they felt that attraction. Based on two experiments they conducted with human volunteers, the researchers suggest it may have to do with matching neural circuitry.

 

To learn more about attraction, the researchers ran two experiments. The first consisted of showing 19 male and 21 female volunteers videos of six different women as they experienced fear or sadness. The volunteers were asked to choose which emotion was being shown, and then to mark down how confident they were in their choice. To gauge how much attraction the volunteers felt for the women in the videos, they were asked to enlarge a picture of each woman both before and after seeing her in the video. Each was also asked to answer questions about each woman, such as how much they would like to meet her in real life and whether she would understand them.

The second experiment was run with a different set of volunteers who were also asked to watch the women in the videos, but the second group did so while undergoing fMRI imaging; the researchers were specifically looking for activity in the part of the brain known to be associated with rewards.

 

The final phase of the study involved combining data from both experiments to see if any patterns might emerge. The researchers report that most of the volunteers were able to identify the emotions being portrayed, and that the more confident a volunteer was of having identified the correct emotion, the more attracted to the woman he or she felt. This was confirmed in the fMRI scans: reward centers in the volunteers' brains lit up more when they watched women whose emotions they felt they could read better.

 

The researchers propose that their results suggest that in addition to physical attractiveness, people are attracted to other people due to their own feelings of similarity to another person, which gives them a feeling of understanding, or connectedness.

 

Read the original article Here
