Blog



tDCS Increases Honest Behavior
By Jason von Stietz, M.A.
April 21, 2017
Getty Images

 

Can honesty be strengthened like a muscle? Researchers at the University of Zurich led a study examining the relationship between honesty and a form of non-invasive brain stimulation known as transcranial direct current stimulation (tDCS). Findings indicated that tDCS applied over the right dorsolateral prefrontal cortex increased honesty in situations in which it is tempting to cheat for personal gain. The study was discussed in a recent article in Medical Xpress:

 

Honesty plays a key role in social and economic life. Without honesty, promises are not kept, contracts are not enforced, taxes remain unpaid. Despite the importance of honesty for society, its biological basis remains poorly understood. Researchers at the University of Zurich, together with colleagues from Chicago and Boston, now show that honest behavior can be increased by means of non-invasive brain stimulation. The results of their research highlight a deliberation process between honesty and self-interest in the right dorsolateral prefrontal cortex (rDLPFC).

 

Occasional lies for material self-interest

 

In their die-rolling experiment, the participants could increase their earnings by cheating rather than telling the truth. The researchers found that people cheated a significant amount of the time. However, many participants also stuck to the truth. "Most people seem to weigh motives of self-interest against honesty on a case-by-case basis; they cheat a little but not on every possible occasion," explains Michel Maréchal, UZH Professor for Experimental Economics. However, about 8% of the participants cheated whenever possible and maximized their profit.
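The article does not spell out the mechanics of the task, but the aggregate logic of die-rolling honesty paradigms of this kind can be sketched as follows. Participants privately roll a die and self-report the outcome, with certain outcomes paying out; individual lies are undetectable, but cheating shows up as an excess of paying reports over the rate expected by chance. The function name, payoff rule, and cheating model below are hypothetical illustrations, not details from the study:

```python
import random

def simulate_reports(n_participants, cheat_prob, payoff_rolls={4, 5, 6}, rng=None):
    """Fraction of participants reporting a paying roll in a die-rolling honesty task.

    Each participant privately rolls a fair die; rolls in `payoff_rolls` pay out.
    With probability `cheat_prob`, a participant who rolled a non-paying number
    reports a paying one instead. Individual reports are unverifiable; only the
    aggregate rate of paying reports reveals cheating.
    """
    rng = rng or random.Random(0)  # fixed seed for a reproducible sketch
    paying_reports = 0
    for _ in range(n_participants):
        roll = rng.randint(1, 6)
        if roll in payoff_rolls or rng.random() < cheat_prob:
            paying_reports += 1
    return paying_reports / n_participants

# With 3 paying faces out of 6, fully honest reporting yields ~50% paying reports;
# any sustained excess above that baseline measures cheating in the aggregate.
honest = simulate_reports(100_000, cheat_prob=0.0)
dishonest = simulate_reports(100_000, cheat_prob=0.3)
```

Under this model the expected paying-report rate is 0.5 + 0.5 × cheat_prob, so a population cheating 30% of the time would report paying rolls about 65% of the time.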

 

Fewer lies through brain stimulation

 

The researchers applied transcranial direct current stimulation over a region in the right dorsolateral prefrontal cortex (rDLPFC). This noninvasive brain stimulation method makes brain cells more sensitive, i.e., more likely to be active. When the researchers applied this stimulation during the task, participants were less likely to cheat. However, the number of consistent cheaters remained the same. Christian Ruff, UZH Professor of Neuroeconomics, points out: "This finding suggests that the stimulation mainly reduced cheating in participants who actually experienced a moral conflict, but did not influence the decision-making process in those who were committed to maximizing their earnings."

 

Conflict between money and morals

 

The researchers found that the stimulation only affected the process of weighing up material versus moral motives. They found no effects for other types of conflict that do not involve moral concerns (i.e., financial decisions involving risk, ambiguity, and delayed rewards). Similarly, an additional experiment showed that the stimulation did not affect honest behavior when cheating led to a payoff for another person instead of oneself and the conflict was therefore between two moral motives. The pattern of results suggests that the stimulated neurobiological process specifically resolves trade-offs between material self-interest and honesty.

 

Developing an understanding of the biological basis of behavior

 

According to the researchers, these findings are an important first step in identifying the brain processes that allow people to behave honestly. "These brain processes could lie at the heart of individual differences and possibly pathologies of honest behavior," explains Christian Ruff. Finally, the new results raise the question of the degree to which honest behavior is based on biological predispositions, which may be crucial for the legal system. Michel Maréchal summarizes: "If breaches of honesty indeed represent an organic condition, our results question to what extent people can be made fully liable for their wrongdoings."

 


Read the original article Here

Mental Health Issues Stigmatized Among Providers
By Jason von Stietz, M.A.
April 15, 2017
Getty Images

 

Mental health providers often help people work through their most traumatic and personal issues. Although they attempt to act as a model of mental health, their own wellbeing can often suffer. Mental health providers often work long hours, receive inadequate support in the workplace, and are expected to remain resilient in the face of trauma. In other words, admitting to their own mental health issues and seeking help can be highly stigmatized. This phenomenon was discussed in a recent article in The Guardian:   

 

Understanding around mental health is improving: campaigns such as Time To Change have drawn public awareness to the issue, and employers are realising the effects of dedicated wellbeing support for staff – which has led to a 30% reduction in mental health-related absences. But surprisingly, stigma still exists for those working in mental healthcare themselves.

 

Many people working in the sector are reluctant to talk about their own experiences, says Elizabeth Cotton, an academic at Middlesex University researching the topic. She was one of them. “I walked a thin line between being a competent professional and feeling like a fraud at managing my own mental health at work.”

 

This experience also rings true for Sarah Jones*, a peer support manager for a mental health charity. In previous jobs she was forced to actively hide her mental health problems.

 

“I have friends who work within the sector who have very little support at work, and have told me they just wouldn’t disclose their mental health issues to colleagues or managers due to fear of judgment or discrimination,” she says.

 

Mental health workers are often expected to be “self-reliant [and] cope in the face of traumatic and emotionally challenging work,” says Ruth Allen, chief executive of the British Association of Social Workers.

 

There is an expectation that because these care workers are supporting mentally ill people, their own mental health must be under control. This can put immense pressure on them.

 

“Even when I was working in a psychiatric hospital and suffering from mental health issues, I didn’t feel I could trust my supervisor to get beyond the notion that I was just being ‘a bit anxious’,” says Laura Sharp*, a social worker who works in adult and child care. “I was working with suicidal people all day every day, but I was told to be less emotive.”

 

Sharp says this experience made her work – which often involved restraining suicidal patients from self harm or cleaning up blood – traumatising. “I felt neither contained nor safe, physically, emotionally or professionally.”

 

It is widely assumed that workers in the sector will be less susceptible to trauma and are somehow desensitised to the emotionally difficult work they carry out, says Allen. These unrealistic expectations can often be "a dangerous contributor to less compassionate and engaged care and support," she warns.

 

Allen also says that people working in the mental health sector often find that the job does not meet their expectations. “They often are in a different culture, working with a financial regime that is at odds with their personal commitment,” she says. This creates a stressful situation: “staff either bend themselves out of shape, become exhausted trying to make the system work, or simply leave,” she says.

 

Other elements of the job can exacerbate or catalyse this distress. Cuts to funding and stretched services, particularly in the NHS, can cause internal support infrastructure to crumble, meaning workers needing help can also fall through the cracks. A survey (pdf) from the British Psychoanalytic Council and UK Council for Psychotherapy found that therapists felt burnt out and distressed because of their work, while not receiving adequate supervision to help deal with these issues.

 

Mental health workers, who often work punishingly long hours to begin with, are working additional hours to compensate for lack of staffing and rising demand, says Allen. “Staff are left feeling like they have no time to adapt to changes and perform at their best.”

 

Sharp, who has since got a new job, thinks the sector needs a further push in looking after employees’ mental health, which will ultimately benefit service users. “I still think every day about the patients I worked with and how the system is failing them,” she says. “I’m a social worker and I still love my job. But I only managed to get this love back by leaving a job I had initially felt passionate about. That’s not right.”

 

The sector needs to do more to support mental health issues in staff. Since changing jobs, Jones has also found an employer that views her own experience of mental health problems as an asset. She says the support at her current work has helped her to thrive and eradicated her feelings of shame around mental illness.

 

*Some names have been changed

 

Read the original article Here

Can Fast fMRI Monitor Brain Activity of Thoughts?
By Jason von Stietz, M.A.
March 30, 2017
Photo Credit: Lewis et al

 

Can fMRI detect human thought? Researchers from Harvard utilized recent developments in fast fMRI techniques to examine rapid oscillations of brain activity during human thought. The researchers showed participants rapidly oscillating images and monitored the corresponding oscillations of brain activity in the visual cortex. These findings mark the first steps toward better studying neural networks as they produce human thought. The study, funded by the National Institutes of Health, was discussed in a recent press release:

 

By significantly increasing the speed of functional MRI (fMRI), NIBIB-funded researchers have been able to image rapidly fluctuating brain activity during human thought. fMRI measures changes in blood oxygenation, which were previously thought to be too slow to detect the subtle neuronal activity associated with higher order brain functions. The new discovery that fast fMRI can detect rapid brain oscillations is a significant step towards realizing a central goal of neuroscience research: mapping the brain networks responsible for human cognitive functions such as perception, attention, and awareness.

 

“A critical aim of the President’s BRAIN Initiative is to move neuroscience into a new realm where we can identify and track functioning neural networks non-invasively,” explains Guoying Liu, Ph.D., Director of the NIBIB program in Magnetic Resonance Imaging. “This work demonstrates the potential of fMRI for mapping healthy neural networks as well as those that may contribute to neurological diseases such as dementia and other mental health disorders, which are significant national and global health problems.” 

 

fMRI works by detecting local increases in oxygen as blood is delivered to a working part of the brain. The technique has been instrumental for identifying which areas in the brain control functions such as vision, hearing, or touch. However, standard fMRI can only detect the blood flow coming to replenish an area of the brain several seconds after it has performed a function. It was generally accepted that this was the limit of what could be detected by fMRI—identification of a region in the brain that had responded to a large stimulus, such as a continuous 30 second “blast” of bright light. 

 

Combining several new techniques, Jonathan R. Polimeni, Ph.D., senior author of the study, and his colleagues at Harvard’s Athinoula A. Martinos Center for Biomedical Imaging, applied fast fMRI in an effort to track neuronal networks that control human thought processes, and found that they could now measure rapidly oscillating brain activity. The results of this groundbreaking work are reported in the October 2016 issue of the Proceedings of the National Academy of Sciences.

 

The researchers used fast fMRI in human volunteers observing a rapidly fluctuating checkerboard pattern. The fast fMRI was able to detect the subtle and very rapid oscillations in cerebral blood flow in the brain’s visual cortex as the volunteers observed the changing pattern.

 

“The oscillating checkerboard pattern is a more ‘naturalistic’ stimulus, in that its timing is similar to the very subtle neural oscillations made during normal thought processes,” explains Polimeni. “The fast fMRI detects the induced neural oscillations that allow the brain to understand what the eye is observing: the changing checkerboard pattern. These subtle oscillations were completely undetectable with standard fMRI. This exciting result opens the possibility of using fast fMRI to image neural networks as they guide the process of human thought.” 

 

One such possibility is suggested by first author of the study Laura D. Lewis, Ph.D. “This technique now gives us a method for obtaining much more detailed information about the complex brain activity that takes place during sleep, as well as other dynamic switches in brain states, such as when under anesthesia and during hallucinations.” 

 

Concludes Polimeni, “It had always been thought that fMRI had the potential to play a major role in these types of studies. Meaningful progress in cognitive neuroscience depends on mapping patterns of brain activity, which are constantly and rapidly changing with every experience we have. Thus, we are extremely excited to see our work contribute significantly to achieving this goal.”

 

Read the original article Here

Interhemispheric Connectivity Related to Creativity
By Jason von Stietz, M.A.
March 23, 2017
Shutterstock

 

Popular media often suggest that the “right brain” is more associated with creativity and artistic ability. Researchers at Duke University and the University of Padova examined the relationship between measures of creativity and diffusion tensor imaging scans of the connections among 68 brain regions. Findings indicated that greater interhemispheric connectivity was related to higher creativity. The study was discussed in a recent article in Medical Xpress:

 

For the study, statisticians David Dunson of Duke University and Daniele Durante of the University of Padova analyzed the network of white matter connections among 68 separate brain regions in healthy college-age volunteers.

 

The brain's white matter lies underneath the outer grey matter. It is composed of bundles of wires, or axons, which connect billions of neurons and carry electrical signals between them.

 

A team led by neuroscientist Rex Jung of the University of New Mexico collected the data using an MRI technique called diffusion tensor imaging, which allows researchers to peer through the skull of a living person and trace the paths of all the axons by following the movement of water along them. Computers then comb through each of the 1-gigabyte scans and convert them to three-dimensional maps—wiring diagrams of the brain.

 

Jung's team used a combination of tests to assess creativity. Some were measures of a type of problem-solving called "divergent thinking," or the ability to come up with many answers to a question. They asked people to draw as many geometric designs as they could in five minutes. They also asked people to list as many new uses as they could for everyday objects, such as a brick or a paper clip. The participants also filled out a questionnaire about their achievements in ten areas, including the visual arts, music, creative writing, dance, cooking and science.

 

The responses were used to calculate a composite creativity score for each person.

 

Dunson and Durante trained computers to sift through the data and identify differences in brain structure.

 

They found no statistical differences in connectivity within hemispheres, or between men and women. But when they compared people who scored in the top 15 percent on the creativity tests with those in the bottom 15 percent, high-scoring people had significantly more connections between the right and left hemispheres.

 

The differences were mainly in the brain's frontal lobe.

 

Dunson said their approach could also be used to predict the probability that a person will be highly creative simply based on his or her brain network structure. "Maybe by scanning a person's brain we could tell what they're likely to be good at," Dunson said.
 

 

The study is part of a decade-old field, connectomics, which uses network science to understand the brain. Instead of focusing on specific brain regions in isolation, connectomics researchers use advanced brain imaging techniques to identify and map the rich, dense web of links between them.

 

Dunson and colleagues are now developing statistical methods to find out whether brain connectivity varies with I.Q., whose relationship to creativity is a subject of ongoing debate.

 

In collaboration with neurology professor Paul Thompson at the University of Southern California, they're also using their methods for early detection of Alzheimer's disease, to help distinguish it from normal aging.

 

By studying the patterns of interconnections in healthy and diseased brains, they and other researchers also hope to better understand dementia, epilepsy, schizophrenia and other neurological conditions such as traumatic brain injury or coma.

 

"Data sharing in neuroscience is increasingly more common as compared to only five years ago," said Joshua Vogelstein of Johns Hopkins University, who founded the Open Connectome Project and processed the raw data for the study.

 

Just making sense of the enormous datasets produced by brain imaging studies is a challenge, Dunson said.

 

Most statistical methods for analyzing brain network data focus on estimating properties of single brains, such as which regions serve as highly connected hubs. But each person's brain is wired differently, and techniques for identifying similarities and differences in connectivity across individuals and between groups have lagged behind.

 

The study appears online and will be published in a forthcoming issue of the journal Bayesian Analysis.

 

Read the original article Here



 

Underestimating the Value Of Being in Someone Else's Shoes
By Jason von Stietz, M.A.
March 19, 2017
Getty Images

 

Do we need to walk a mile in people’s shoes to understand them? Although people are often confident in their understanding of others’ emotions, recent research found that individuals often overestimate the accuracy of that understanding. Furthermore, when individuals simulate the experiences of others and infer their emotional responses, they are significantly more accurate. The study was discussed in a recent article in Medical Xpress:

 

We tend to believe that people telegraph how they're feeling through facial expressions and body language and we only need to watch them to know what they're experiencing—but new research shows we'd get a much better idea if we put ourselves in their shoes instead. The findings are published in Psychological Science, a journal of the Association for Psychological Science.

 

"People expected that they could infer another's emotions by watching him or her, when in fact they were more accurate when they were actually in the same situation as the other person. And this bias persisted even after our participants gained firsthand experience with both strategies," explain study authors Haotian Zhou (Shanghai Tech University) and Nicholas Epley (University of Chicago).

 

To explore how we go about understanding others' minds, Zhou, Epley, and co-author Elizabeth Majka (Elmhurst College) decided to focus on two potential mechanisms: theorization and simulation. When we theorize about someone's experience, we observe their actions and make inferences based on our observations. When we simulate someone's experience, we use our own experience of the same situation as a guide.

 

Based on previous research showing that people tend to assume their feelings 'leak out' through their behavior, Zhou, Epley, and Majka hypothesized that people would overestimate the usefulness of theorizing about another person's experience. And given that we tend to think that individual experiences are unique, the researchers also hypothesized that people would underestimate the usefulness of simulating another person's experience.

 

In one experiment, the researchers asked 12 participants to look at a series of 50 pictures that varied widely in emotional content, from very negative to positive. A webcam recorded their faces as these "experiencers" rated their emotional feelings for each picture. The researchers then brought in a separate group of 73 participants and asked them to predict the experiencers' ratings for each picture. Some of these "predictors" simulated the experience, looking at each picture; others theorized about the experience, looking at the webcam recording of the experiencer; and a third group were able to simulate and theorize at the same time, looking at both the picture and accompanying recording.

 

The results revealed that the predictors were much more accurate when they saw the pictures just as the experiencer had than they were when they saw the recording of the experiencer's face. Interestingly, seeing both the picture and the recording simultaneously yielded no additional benefit—being able to simulate the experience seemed to underlie participants' accuracy.

 

Despite this, people didn't seem to appreciate the benefit of simulation. In a second experiment, only about half of the predictors who were allowed to choose a strategy opted to use simulation. As before, predictors who simulated the rating experience were much more accurate in predicting the experiencer's feelings, regardless of whether they chose that strategy or were assigned to it.

 

In a third experiment, the researchers allowed for dynamic choice, assuming that predictors may increase in accuracy over time if they were able to choose their strategy before each trial. The results showed, once again, that simulation was the better strategy across the board—still, participants who had the ability to choose opted to simulate only about 48% of the time.

 

A fourth experiment revealed that simulation was the better strategy even when experiencers had been told to make their reactions as expressive and "readable" as possible.

 

"Our most surprising finding was that people committed the same mistakes when trying to understand themselves," Zhou and Epley note.

 

Participants in a fifth experiment expected they would be more accurate if they got to watch the expressions they had made while looking at emotional pictures one month earlier—but the findings showed they were actually better at estimating how they had felt if they simply viewed the pictures again.

 

"They dramatically overestimated how much their own face would reveal, and underestimated the accuracy they would glean from being in their own past shoes again," the researchers explain.

 

Although reading other people's mental states is an essential part of everyday life, these experiments show that we don't always pick the best strategy for the task.

 

According to Zhou and Epley, these findings help to shed light on the tactics that people use to understand each other.

 

"Only by understanding why our inferences about each other sometimes go astray can we learn how to understand each other better," the researchers conclude.

 


Read the original article Here


 

Brain Scans May Predict Adolescent Drug Use
By Jason von Stietz, M.A.
February 26, 2017
Getty Images

 

Is it possible to predict who will face significant drug problems in adolescence? Some adolescents who battle with addiction broadcast all the warning signs, whereas others appear to show no warnings at all. In an effort to predict drug use, researchers examined the relationship between brain scans, novelty seeking, and future drug use. The study was discussed in a recent article in Medical Xpress:

 

There's an idea out there of what a drug-addled teen is supposed to look like: impulsive, unconscientious, smart, perhaps – but not the most engaged. While personality traits like that could signal danger, not every adolescent who fits that description becomes a problem drug user. So how do you tell who's who?

 

There's no perfect answer, but researchers report February 21 in Nature Communications that they've found a way to improve our predictions – using brain scans that can tell, in a manner of speaking, who's bored by the promise of easy money, even when the kids themselves might not realize it.

 

That conclusion grew out of a collaboration between Brian Knutson, a professor of psychology at Stanford, and Christian Büchel, a professor of medicine at Universitätsklinikum Hamburg Eppendorf. With support from the Stanford Neurosciences Institute's NeuroChoice program, which Knutson co-directs, the pair started sorting through an intriguing dataset covering, among other things, 144 European adolescents who scored high on a test of what's called novelty seeking – roughly, the sorts of personality traits that might indicate a kid is at risk for drug or alcohol abuse.

 

Novelty seeking in a brain scanner

 

Novelty seeking isn't inherently bad, Knutson said. On a good day, the urge to take a risk on something new can drive innovation. On a bad day, however, it can lead people to drive recklessly, jump off cliffs and ingest whatever someone hands out at a party. And psychologists know that kids who score high on tests of novelty seeking are on average a bit more likely to abuse drugs. The question was, could there be a better test, one both more precise and more individualized, that could tell whether novelty seeking might turn into something more destructive?

 

Knutson and Büchel thought so, and they suspected that a brain-scanning test called the Monetary Incentive Delay Task, or MID, could be the answer. Knutson had developed the task early in his career as a way of targeting a part of the brain now known to play a role in mentally processing rewards like money or the high of a drug.

 

The task works like this. People lie down in an MRI brain scanner to play a simple video game for points, which they can eventually convert to money. More important than the details of the game, however, is this: At the start of each round, each player gets a cue about how many points he stands to win during the round. It's at that point that players start to anticipate future rewards. For most people, that anticipation alone is enough to kick the brain's reward centers into gear.

 

A puzzle and the data to solve it

 

This plays out differently – and a little puzzlingly – in adolescents who use drugs. Kids' brains in general respond less when anticipating rewards, compared with adults' brains. But that effect is even more pronounced when those kids use drugs, which suggests one of two things: Either drugs suppress brain activity, or the suppressed brain activity somehow leads youths to take drugs.

 

If it's the latter, then Knutson's task could predict future drug use. But no one was sure, mainly because no one had measured brain activity in non-drug-using adolescents and compared it to eventual drug use.

 

No one, that is, except Büchel. As part of the IMAGEN consortium, he and colleagues in Europe had already collected data on around 1,000 14-year-olds as they went through Knutson's MID task. They had also followed up with each of them two years later to find out if they'd become problem drug users – for example, if they smoked or drank on a daily basis or ever used harder drugs like heroin. Then, Knutson and Büchel focused their attention on 144 adolescents who hadn't developed drug problems by age 14 but had scored in the top 25 percent on a test of novelty seeking.

 

Lower anticipation

 

Analyzing that data, Knutson and Büchel found they could correctly predict whether youngsters would go on to abuse drugs about two-thirds of the time based on how their brains responded to anticipating rewards. This is a substantial improvement over behavioral and personality measures, which correctly distinguished future drug abusers from other novelty-seeking 14-year-olds about 55 percent of the time, only a little better than chance.

 

"This is just a first step toward something more useful," Knutson said. "Ultimately the goal – and maybe this is pie in the sky – is to do clinical diagnosis on individual patients" in the hope that doctors could stop drug abuse before it starts, he said.

 

Knutson said the study first needs to be replicated, and he hopes to follow the kids to see how they do further down the line. Eventually, he said, he may be able not just to predict drug abuse, but also better understand it. "My hope is the signal isn't just predictive, but also informative with respect to interventions."

 

Read the original article Here


 

Gist Reasoning and Traumatic Brain Injury
By Jason von Stietz, M.A.
February 24, 2017
BigStock

 

The ability to understand the “gist” of things is an important and likely underappreciated skill. Gist reasoning, or the ability to look at complex, concrete information and make deeper-level abstract interpretations, is an essential part of each person’s daily activities. Researchers at Texas Woman’s University and The University of Texas at Dallas studied the relationship between gist reasoning and cognitive deficits in individuals with traumatic brain injury. The study was discussed in a recent article in Medical Xpress:

 

The study, published in Journal of Applied Biobehavioral Research, suggests the gist reasoning test may be sensitive enough to help doctors and clinicians identify previously undiagnosed cognitive changes that could explain the daily life difficulties experienced by TBI patients and subsequently guide appropriate therapies.

 

The gist reasoning measure, called the Test of Strategic Learning, accurately identified 84.7 percent of chronic TBI cases, a much higher rate than more traditional tests that accurately identified TBI between 42.3 percent and 67.5 percent of the time.

 

"Being able to 'get the gist' is essential for many day-to-day activities such as engaging in conversation, understanding meanings that are implied but not explicitly stated, creating shopping lists and resolving conflicts with others," said study lead author Dr. Asha Vas of Texas Woman's University who was a postdoctoral fellow at the Center for BrainHealth at the time of the study. "The gist test requires multiple cognitive functions to work together."

 

The study featured 70 participants ages 18 to 55, including 30 who had experienced a moderate to severe chronic traumatic brain injury at least one year prior. All the participants had similar socioeconomic status, educational backgrounds and IQ.

 

Researchers were blinded to the participants' TBI status while administering four different tests that measure abstract thinking—the ability to understand the big picture, not just recount the details of a story or other complex information. Researchers used the results to predict which participants were in the TBI group and which were healthy controls.

 

During the cognitive tests, the majority of the TBI group easily recognized abstract or concrete information when given prompts in a yes-no format. But the TBI group performed much worse than controls on tests, including gist reasoning, that required deeper level processing of information with fewer or no prompts.

 

The gist reasoning test consists of three texts that vary in length (from 291 to 575 words) and complexity. The test requires the participant to provide a synopsis of each of the three texts.

 

Vas provided an example of what "getting the gist" means using Shakespeare's play Romeo and Juliet.

 

"There are no right or wrong answers. The test relies on your ability to derive meaning from important story details and arrive at a high-level summary: Two young lovers from rival families scheme to build a life together and it ends tragically. You integrate existing knowledge, such as the concept of love and sacrifice, to create a meaning from your perspective. Perhaps, in this case, 'true love does not conquer all,'" she said.

 

Past studies have shown that higher scores on the gist reasoning test in individuals in chronic phases of TBI correlate to better ability to perform daily life functions.

 

"Perhaps, in the future, the gist reasoning test could be used as a tool to identify other cognitive impairments," said Dr. Jeffrey Spence, study co-author and director of biostatistics at the Center for BrainHealth. "It may also have the potential to be used as a marker of cognitive changes in aging."

 

 

Read the original article Here

 

Comments (0)
Long-Term Effects of TBI Studied
By Jason von Stietz, M.A.
February 19, 2017
Getty Images

 

The long-term effects of traumatic brain injury (TBI) are just now beginning to be understood. Researchers from Cincinnati Children's Hospital Medical Center examined the impact of TBI on pediatric patients about seven years after the injury. The findings indicated that children who have suffered a mild to moderate TBI are twice as likely to have issues with attention as their healthy counterparts. The study was discussed in a recent article in Medical Xpress:

 

In a study to be presented Friday Feb. 10 at the annual meeting of the Association of Academic Physiatrists in Las Vegas, researchers from Cincinnati Children's will present research on long-term effects of TBI—an average of seven years after injury. Patients with mild to moderate brain injuries are two times more likely to have developed attention problems, and those with severe injuries are five times more likely to develop secondary ADHD. These researchers are also finding that the family environment influences the development of these attention problems.

 

  • Parenting and the home environment exert a powerful influence on recovery. Children with severe TBI in optimal environments may show few effects of their injuries, while children with milder injuries from disadvantaged or chaotic homes often demonstrate persistent problems.
  • Early family response may be particularly important for long-term outcomes, suggesting that working to promote effective parenting may be an important early intervention.
  • Certain skills that can affect social functioning, such as speed of information processing, inhibition, and reasoning, show greater long-term effects.
  • Many children do very well long-term after brain injury, and most do not have across-the-board deficits.

More than 630,000 children and teenagers in the United States are treated in emergency rooms for TBI each year. But predictors of recovery following TBI, particularly the roles of genes and environment, are unclear. These environmental factors include family functioning, parenting practices, home environment, and socioeconomic status. Researchers at Cincinnati Children's are working to identify genes important to recovery after TBI and understand how these genes may interact with environmental factors to influence recovery.

 

  • They will be collecting salivary DNA samples from more than 330 children participating in the Approaches and Decisions in Acute Pediatric TBI Trial.
  • The primary outcome will be global functioning at 3, 6, and 12 months post injury, and secondary outcomes will include a comprehensive assessment of cognitive and behavioral functioning at 12 months post injury.
  • This project will provide information to inform individualized prognosis and treatment plans.

Using neuroimaging and other technologies, scientists are also learning more about brain structure and connectivity related to persistent symptoms after TBI. In a not-yet-published Cincinnati Children's study, for example, researchers investigated the structural connectivity of brain networks following aerobic training. The recovery of structural connectivity they discovered suggests that aerobic training may lead to improvement in symptoms.

 

Over the past two decades, investigators at Cincinnati Children's have conducted a series of studies to develop and test interventions to improve cognitive and behavioral outcomes following pediatric brain injury. They developed an innovative web-based program that provides family-centered training in problem-solving, communication, and self-regulation.

 

  • Across a series of randomized trials, online family problem-solving treatment has been shown to reduce behavior problems and executive dysfunction (impaired management of cognitive processes) in older children with TBI and, over the longer term, to improve everyday functioning in 12- to 17-year-olds.
  • Web-based parenting skills programs targeting younger children have resulted in improved parent-child interactions and reduced behavior problems. In a pilot trial of computerized attention and memory training, children showed improvements in sustained attention and parent-reported executive function behaviors. These intervention studies suggest several avenues for working to improve short- and long-term recovery following TBI.

 

Read the original article Here


 

Comments (0)
Amygdalar Activity Predicts Cardiovascular Events, Study Finds
By Jason von Stietz, M.A.
January 31, 2017
Tawakol et al, 2017, The Lancet

 

It is well known that emotional stress is related to risk of cardiovascular disease. However, the mechanism of this phenomenon has not always been clearly understood. Researchers at Massachusetts General Hospital, Icahn School of Medicine at Mount Sinai, and Tufts University collaborated to investigate the relationship between amygdalar activity and cardiovascular events. Findings indicated that amygdalar activity predicted bone-marrow activity and arterial inflammation. The study was discussed in a recent article in Medical Xpress: 

 

"While the link between stress and heart disease has long been established, the mechanism mediating that risk has not been clearly understood," says Ahmed Tawakol, MD, co-director of the Cardiac MR PET CT Program in the MGH Division of Cardiology and lead author of the paper. "Animal studies have shown that stress activates bone marrow to produce white blood cells, leading to arterial inflammation, and this study suggests an analogous path exists in humans. Moreover, this study identifies, for the first time in animal models or humans, the region of the brain that links stress to the risk of heart attack and stroke."

 

The paper reports on two complementary studies. The first, conducted at MGH, analyzed imaging and medical records data from almost 300 individuals who had PET/CT brain imaging, primarily for cancer screening, using a radiopharmaceutical called FDG that both measures the activity of areas within the brain and also reflects inflammation within arteries. All participants in that study had no active cancer or cardiovascular disease at the time of imaging and each had information in their medical records on at least three additional clinical visits in the two to five years after imaging. The second study, conducted at the Translational and Molecular Imaging Institute (TMII) at ISMMS in New York, enrolled 13 individuals with a history of post-traumatic stress disorder, who were evaluated for their current levels of perceived stress and received FDG-PET scanning to measure both amygdala activity and arterial inflammation.

 

Among participants in the larger, longitudinal study, 22 experienced a cardiovascular event—such as a heart attack, stroke or episodes of angina—in the follow-up period; and the prior level of activity in the amygdala strongly predicted the risk of a subsequent cardiovascular event. That association remained significant after controlling for traditional cardiovascular risk factors, and after controlling for presence of symptom-free atherosclerosis at the time of imaging. The association became even stronger when the team used a more stringent definition of cardiovascular events—major adverse cardiovascular events.


 

Amygdalar activity was also associated with the timing of events: those with the highest levels of activity had events sooner than those with less extreme elevation. Greater amygdalar activity was also linked to elevated activity of the blood-cell-forming tissue in the bone marrow and spleen, and to increased arterial inflammation. In the smaller study, participants' current stress levels were strongly associated with both amygdalar activity and arterial inflammation.

 

Co-senior author Zahi A. Fayad, PhD, vice-chair for Research in the Department of Radiology and the director of TMII at ISMMS in New York, says, "This pioneering study provides more evidence of a heart-brain connection, by elucidating a link between resting metabolic activity in the amygdala, a marker of stress, and subsequent cardiovascular events independently of established cardiovascular risk factors. We also show that amygdalar activity is related to increased perceived stress and to increased vascular inflammation and hematopoietic activity."

 

Tawakol adds, "These findings suggest several potential opportunities to reduce cardiovascular risk attributable to stress. It would be reasonable to advise individuals with increased risk of cardiovascular disease to consider employing stress-reduction approaches if they feel subjected to a high degree of psychosocial stress. However, large trials are still needed to confirm that stress reduction improves cardiovascular disease risk. Further, pharmacological manipulation of the amygdalar-bone marrow-arterial axis may provide new opportunities to reduce cardiovascular disease. In addition, increased stress associates with other diseases, such as cancer and inflammatory conditions, including rheumatoid arthritis and psoriasis. So it will be important to evaluate whether calming this stress mechanism produces benefits in those diseases as well."


 

Read the original article Here

Comments (0)
Weaker Connectivity in Brains of Those with Bipolar Disorder
By Jason von Stietz, M.A.
January 29, 2017
Getty Images

 

According to the National Institute of Mental Health, 2.6 percent of the U.S. adult population suffers from bipolar disorder. Bipolar disorder is a debilitating illness that can involve several years of mental health treatment before a proper diagnosis is given. Australian researchers recently conducted a study investigating the use of MRI scans of the brain in detecting biomarkers for the disorder. Findings indicated that participants suffering from bipolar disorder had weaker connectivity in emotional centers of the brain, such as right-sided fronto-temporal and temporal areas, than their healthy counterparts. The study was discussed in a recent article in MedicalXpress: 

 

It is hoped the findings will lead to new tools to identify and manage those at risk before the onset of the disorder and help reduce its impact once it develops.

 

The study, published today in the prestigious Nature journal Molecular Psychiatry, was a collaboration between researchers from QIMR Berghofer Medical Research Institute in Brisbane and UNSW in Sydney.

 

Researchers conducted MRI scans on the brains of three groups: people who had been diagnosed with bipolar disorder; people who had a first-degree relative (parent, sibling or child) with bipolar and who were at high genetic risk themselves; and people unaffected by bipolar disorder.

 

They found networks of weaker connections between different brain regions in both the bipolar and high-risk groups, as well as disturbances in the connections responsible for regulating emotional and cognitive processes.

 

"We know that changes in these brain wiring patterns will impact upon a person's capacity to perform key emotional and cognitive functions," said study author Scientia Professor Philip Mitchell from UNSW's School of Psychiatry.

 

"Each year we will be following up with participants from this study who are at high genetic risk of developing bipolar disorder, to see if the brain changes identified in MRI scans reveal who will develop episodes of mania," Professor Mitchell said.

 

Bipolar disorder is a debilitating illness affecting about one in 70 Australians. It typically involves unstable mood swings between manic 'highs' and depressive 'lows'. The age of onset is usually between 18 and 30 years.

 

Professor Michael Breakspear, from QIMR Berghofer and Brisbane's Metro North Hospital and Health Service, said the research team hoped to use the findings to develop a way of identifying those at risk of bipolar before the onset of the disorder.

 

"At the moment we don't have any markers or tests for predicting who is at risk of developing bipolar disorder, as we do for heart disease," Professor Breakspear said.

 

"If we can develop a tool to identify and confirm those who are at the very highest risk, then we can advise them on how to minimise their risk of developing bipolar, for example, by avoiding illicit drugs and minimising stress. These discoveries may open the door to starting people on medication before the illness, to reduce the risk of manic episodes before the first one occurs.

 

"Our long-term goal is to develop imaging-based diagnostic tests for bipolar. At the moment, diagnosis relies on the opinion of a doctor. Recent UNSW-led research found an average delay of six years between when a person with bipolar experiences their first manic episode and when they receive a correct diagnosis," Professor Breakspear said.

 

"Many people are incorrectly diagnosed with depression or other disorders. This delays the start of proper treatment with medications that are specific to bipolar disorder. Bipolar has the highest suicide rate of any mental illness, so it's crucial that we diagnose people correctly straight away so they can start receiving the right treatment."

 

The study was funded by the National Health and Medical Research Council and the Landsdowne Foundation.

 

 

Read the original article Here
 

Comments (0)