
Sleep Patterns Uncovered Using Smartphone Application
By Jason von Stietz, M.A.
May 31, 2016

 

What most powerfully influences our sleep patterns? Circadian rhythms? The needs and demands of society? And why is jet lag so difficult to overcome? Researchers from the University of Michigan studied sleep patterns using a free smartphone app. The findings were discussed in a recent article in Medical Xpress:

 

A pioneering study of worldwide sleep patterns combines math modeling, mobile apps and big data to parse the roles society and biology each play in setting sleep schedules.

 

The study, led by University of Michigan mathematicians, used a free smartphone app that reduces jetlag to gather robust sleep data from thousands of people in 100 nations. The researchers examined how age, gender, amount of light and home country affect the amount of shut-eye people around the globe get, when they go to bed, and when they wake up.

 

Among their findings is that cultural pressures can override natural circadian rhythms, with the effects showing up most markedly at bedtime. While morning responsibilities like work, kids and school play a role in wake-time, the researchers say they're not the only factor. Population-level trends agree with what they would expect from current knowledge of the circadian clock.

 

"Across the board, it appears that society governs bedtime and one's internal clock governs wake time, and a later bedtime is linked to a loss of sleep," said Daniel Forger, who holds faculty positions in mathematics at the U-M College of Literature, Science, and the Arts, and in the U-M Medical School's Department of Computational Medicine and Bioinformatics. "At the same time, we found a strong wake-time effect from users' biological clocks—not just their alarm clocks. These findings help to quantify the tug-of-war between solar and social timekeeping."

 

When Forger talks about internal or biological clocks, he's referring to circadian rhythms—fluctuations in bodily functions and behaviors that are tied to the planet's 24-hour day. These rhythms are set by a grain-of-rice-sized cluster of 20,000 neurons behind the eyes. They're regulated by the amount of light, particularly sunlight, our eyes take in.

 

Circadian rhythms have long been thought to be the primary driver of sleep schedules, even after the advent of artificial light and 9-to-5 work schedules. The new research helps to quantify the role that society plays.

 

Here's how Forger and colleague Olivia Walch arrived at their findings. Several years ago, they released an app called Entrain that helps travelers adjust to new time zones. It recommends custom schedules of light and darkness. To use the app, you have to plug in your typical hours of sleep and light exposure, and are given the option of submitting your information anonymously to U-M.

 

The quality of the app's recommendations depended on the accuracy of the users' information, and the researchers say this motivated users to be particularly careful in reporting their lighting history and sleep habits.

 

With information from thousands of people in hand, they then analyzed it for patterns. Any correlations that bubbled up, they put to the test in what amounts to a circadian rhythm simulator. The simulator—a mathematical model—is based on the field's deep knowledge of how light affects the brain's suprachiasmatic nucleus (that's the cluster of neurons behind the eyes that regulates our internal clocks). With the model, the researchers could dial the sun up and down at will to see if the correlations still held in extreme conditions.
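The article doesn't publish the researchers' model, but the idea of "dialing the sun up and down" in a circadian simulator can be illustrated with a toy phase-oscillator sketch. Everything below is invented for illustration: the parameter values, function names, and the simple sinusoidal phase-response term are assumptions, not details from the study.

```python
import math

def simulate_clock(light_schedule, tau=24.2, coupling=0.1, hours=240, dt=0.1):
    """Toy circadian model: an oscillator with intrinsic period `tau` (hours)
    nudged toward the 24-hour light cycle by a light-dependent term.
    `light_schedule(t)` returns light intensity (0..1) at time t in hours."""
    phase = 0.0  # clock phase in radians
    history = []
    t = 0.0
    for _ in range(round(hours / dt)):
        light = light_schedule(t)
        # Intrinsic drift plus a crude phase-response term: light pulls the
        # internal phase toward the phase of the external 24-hour day.
        dphase = (2 * math.pi / tau) + coupling * light * math.sin(
            2 * math.pi * t / 24.0 - phase)
        phase += dphase * dt
        t += dt
        history.append(phase % (2 * math.pi))
    return history

# "Dial the sun up and down at will": light on from 08:00 to 22:00 each day.
day_light = lambda t: 1.0 if 8 <= t % 24 < 22 else 0.0
trace = simulate_clock(day_light)
```

Changing `day_light` (a shifted schedule, constant darkness, extreme day lengths) is the simulator analogue of testing whether a correlation still holds under extreme conditions.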

 

"In the real world, bedtime doesn't behave how it does in our model universe," Walch said. "What the model is missing is how society affects that."

 

The spread of national averages of sleep duration ranged from a minimum of around 7 hours, 24 minutes of sleep for residents of Singapore and Japan to a maximum of 8 hours, 12 minutes for those in the Netherlands. That's not a huge window, but the researchers say every half hour of sleep makes a big difference in terms of cognitive function and long-term health.
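The reported spread between the national extremes is quick arithmetic to check; the snippet below is purely illustrative.

```python
def to_minutes(hours, minutes):
    """Convert an hours-and-minutes sleep duration to total minutes."""
    return 60 * hours + minutes

singapore_japan = to_minutes(7, 24)  # shortest national average: 444 minutes
netherlands = to_minutes(8, 12)      # longest national average: 492 minutes
spread = netherlands - singapore_japan
print(spread)  # 48-minute gap between the national extremes
```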

 

The findings, the researchers say, point to an important lever for the sleep-deprived—a set that the Centers for Disease Control and Prevention is concerned about. A recent CDC study found that across the U.S., one in three adults aren't getting the recommended minimum of seven hours. Sleep deprivation, the CDC says, increases the risk of obesity, diabetes, high blood pressure, heart disease, stroke and stress.

 

The U-M researchers also found that:

 

  • Middle-aged men get the least sleep, often getting less than the recommended 7 to 8 hours.
  • Women schedule more sleep than men, about 30 minutes more on average. They go to bed a bit earlier and wake up later. This is most pronounced in ages between 30 and 60.
  • People who spend some time in the sunlight each day tend to go to bed earlier and get more sleep than those who spend most of their time in indoor light.
  • Habits converge as we age. Sleep schedules were more similar among the older-than-55 set than those younger than 30, which could be related to a narrowing window in which older individuals can fall and stay asleep.

 

Sleep is more important than a lot of people realize, the researchers say. Even if you get six hours a night, you're still building up a sleep debt, says Walch, a doctoral student in the mathematics department and a co-author of the paper.
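The "sleep debt" idea can be made concrete with a back-of-the-envelope calculation against the CDC's recommended seven-hour minimum mentioned above; the function name and one-week horizon are illustrative choices, not from the study.

```python
def sleep_debt(nightly_hours, recommended=7.0, nights=7):
    """Cumulative shortfall (in hours) versus a recommended nightly minimum."""
    return max(0.0, recommended - nightly_hours) * nights

debt = sleep_debt(6.0)  # six hours a night for a week
print(debt)  # 7.0 hours of accumulated debt -- a full night's sleep
```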

 

"It doesn't take that many days of not getting enough sleep before you're functionally drunk," she said. "Researchers have figured out that being overly tired can have that effect. And what's terrifying at the same time is that people think they're performing tasks way better than they are. Your performance drops off but your perception of your performance doesn't."

 

Aside from the findings themselves, the researchers say the work demonstrates that mobile technology can be a reliable way to gather massive data sets at very low cost.

 

"This is a cool triumph of citizen science," Forger said.

 

Read the original article here.

 

 

 

Role of Frontal Cortex in Perceptual Decision-Making
By Jason von Stietz, M.A.
May 22, 2016
 

 

Why do we sometimes not see what is directly in front of us? Researchers at the Georgia Institute of Technology and the University of California, Berkeley studied the role of the frontal cortex in perceptual decision-making using transcranial magnetic stimulation. The study was published in the peer-reviewed journal Proceedings of the National Academy of Sciences and later discussed in a recent article in Medical Xpress:

 

A sportscaster lunges forward. "Interception! Drew Brees threw the ball right into the opposing linebacker's hands! Like he didn't even see him!"

 

The quarterback likely did not actually see the defender standing right in front of him, said Dobromir Rahnev, a psychologist at the Georgia Institute of Technology. Rahnev leads a research team making new discoveries about how the brain organizes visual perception, including how it leaves things out even when they're plainly in sight.

 

Rahnev and researchers from the University of California, Berkeley have come up with a rough map of the frontal cortex's role in controlling vision. They published their findings on Monday, May 9, 2016, in the Proceedings of the National Academy of Sciences.

 

Thinking cap

 

The frontal cortex is often seen as our "thinking cap," the part of the brain scientists associate with thinking and making decisions. But it's not commonly connected with vision. "Some people believe that the frontal cortex is not involved," said Rahnev, an assistant professor at the School of Psychology. The new research adds to previous evidence that it is, he said.

 

The lack of association with that part of the brain may stem from the fact that other brain areas transform information coming from the eyes into sight, and still others make sense of it by, for example, identifying the objects in a scene.

 

But the thinking cap of the brain controls and oversees this whole process, making it as essential to how we see as those other areas, Rahnev said. How that works also accounts for why we sometimes miss things right in front of us.

 

A camera it's not

 

"We feel that our vision is like a camera, but that is utterly wrong," Rahnev said. "Our brains aren't just seeing, they're actively constructing the visual scene and making decisions about it." Sometimes the frontal cortex isn't expecting to see something, so although it's in plain sight, it blots it out of consciousness.

 

To test the frontal cortex's involvement in vision, the researchers ran a two-part experiment.

 

First, they observed which regions of the brain—in particular the frontal cortex—lit up with activity while healthy volunteers completed visual tasks corresponding to three basic stages of conscious visual perception.

 

Second, they inhibited those same regions using magnetic stimulation to confirm their involvement in each visual stage.

 

Believing is part of seeing

 

The first stage of visual perception the researchers tested for was selection, Rahnev said. That's when the brain picks out part of the vast array of available visual stimuli to actually pay attention to.

 

In the case of the football quarterback, this might mean focusing on the route the receiver takes.

 

The second stage is combination, he said. The brain merges the visual information it processed with other material. "The quarterback's brain is putting what he actually sees together with expectations based on the play he called," Rahnev said.

 

Then comes evaluation. The quarterback needs to decide whether to release the ball given everything he has processed.

 

Expecting a blocker to stop the defending player (which didn't happen), he may have blotted him out of perception and thrown the ball right at him. Interception.

 

"The frontal cortex sends a signal to move your attention onto the object you select," Rahnev said. "It does some of the combining with other information, and then it's probably the primary evaluator of what you think you saw."

 

Simple vision brain map

 

In experiments, during a functional MRI scan, different parts of the frontal cortex of the participants lit up, corresponding to each vision function.

 

The back of the frontal cortex activated during selection; its midsection lit up during combination, and the front, or anterior, part cranked up during evaluation.

 

That's how the researchers arrived at a kind of vision map of the frontal cortex. "It's a rudimentary map," Rahnev said. "A very simple one that just says, 'This is the back. This is the middle. This is the front.'"

 

The critical evidence

 

The critical evidence for this map came from the use of magnetic stimulation. When the researchers used it to inhibit the back and middle of the frontal cortex separately, subjects became less able to complete the corresponding functions of selection and combination.

 

When they stimulated the front, the opposite happened. Subjects were slightly but significantly better able to evaluate the accuracy of what they thought they saw.

 

"This is a really clear demonstration of the role that the frontal cortex, which is usually seen as the seat of thought, plays in controlling vision."

 

Sorry officer!

 

And there is a practical takeaway for health and safety. Instead of the quarterback telling the coach, "I swear I didn't see that coming," often it's motorists telling police officers the same thing after a car accident.

 

Distraction is often the culprit, because it overtaxes the organization of perception, Rahnev said. These three functions are going on all the time, in multiple scenarios, as our brains process the world around us.

 

But add too much to the pile, like texting behind the wheel, Rahnev said, and "you can run right into a parked car without ever seeing it."

 

Read the original article here.

 

Do Mirror Neurons Influence Action Recognition?
By Jason von Stietz, M.A.
May 20, 2016

 

Mirror neurons are thought to be the switching point between the visual and motor centers of the brain, helping us interpret the actions of others. When someone raises a fist toward us, do they mean to give us a friendly greeting (a fist bump) or to attack us? Researchers at the Max Planck Institute studied the role of mirror neurons in action recognition. The study was discussed in a recent article in Medical Xpress:

 

It is suspected that mirror neurons enable us to empathize and put ourselves 'in other people's shoes'. When we see that someone has been injured, we also experience internal suffering: these special neurons cause what we see to be simulated in our brain in a way that makes us feel as though we are experiencing it in our own bodies.

 

In perception research, it is assumed that mirror neurons enable people to replay a movement they have seen in their own motor system. This internal recreation of what we have seen probably enables us to infer the meaning of the observed action. The mirror neurons act as the switching point between the motor and visual areas of the brain. Conversely, if the motor system helps determine how an action is classified, then perception should also be open to manipulation by our own execution of an action.

 

Attack or greeting?

 

In their study, the researchers analyzed the mechanism by which the brain recognizes an action. To do this, they showed the test subjects two different movements: a punch and a greeting gesture known as the 'fist bump', practised by young men in particular. The researchers arranged the scenario as realistically as possible. A life-sized avatar was shown on a screen facing the test subjects. Using 3D glasses, the subjects were able to see their virtual partners in three dimensions – the avatar's movements appeared as though they were unfolding within the test subjects' reach.

 

All the test subjects were required to do was to decide whether they were being presented with an aggressive punch or well-intentioned greeting. However, the scientists made the conditions more difficult by combining the two gestures in a single movement. The avatar's intentions were thus a matter of interpretation.

 

The question behind the experiment, then, was whether people allow themselves to be influenced by their own motor system when interpreting the actions of others. The test subjects were manipulated in different ways in the experiment: they could observe a clearly identifiable action played in a continuous loop on a screen. At the same time, they became active themselves, for example by throwing air punches. They were then asked to judge how the ambiguous movement of the avatar should be interpreted.

 

I only believe what I also see

 

When the two sensory stimuli were played out against each other, that is, when the test subjects saw a fist bump in front of them while carrying out a punch movement themselves, the visual impression was the clear winner. The subjects' own movements did not influence their perception. Contrary to what was previously assumed, the motor system had little or no influence on the participants' assessment of the movement. To the astonishment of the scientists, the mirror neurons associated with the motor system clearly did not have any major role to play in the action recognition process.

 

With their experimental setup, the team was able to study the contribution of the motor system to action recognition during social interaction for the first time, and thereby also to test the existing theory on the interaction between mirror neurons and stimulus processing. "Contrary to what was previously assumed, the mirror neurons do not have a particularly significant influence on the interpretation of an action. Visual perception is namely far more important for our brain – in social situations, we rely almost exclusively on what we see," says the head of the study, Stephan de la Rosa, summarizing the study findings.

 

Read the original article here.

Memory Impairment Related To Brain Signal Between Seizures
By Jason von Stietz, M.A.
May 15, 2016

 

Many patients with epilepsy suffer from cognitive deficits. Researchers at New York University Langone Medical Center conducted an animal-model study examining how signals sent from the hippocampus to the cortex relate to impaired memory in seizure patients. The study was published in Nature Medicine and discussed in a recent article in NeuroScientistNews:

 

Between seizures, brain cells in epileptic patients continually send signals that create "empty memories," perhaps explaining the learning problems faced by up to 40 percent of patients. This is the finding of a study in rats and humans led by researchers at New York University (NYU) Langone Medical Center and published in Nature Medicine.

 

"Our study sheds the first light on the mechanisms by which epilepsy hijacks a normal brain process, disrupting the signals needed to form memories," says study lead author and NYU Langone pediatric neurologist Jennifer Gelinas, MD, PhD. "Many of my patients feel that cognitive problems have at least as much impact on their lives as seizures, but we have nothing to offer them beyond seizure control treatments. We hope to change that."

 

The study results revolve around two brain regions, the hippocampus and cortex, shown by past studies to exchange precise signals as each day's experiences are converted into permanent memories during sleep. The study authors found that epileptic signals come from the hippocampus, not as part of normal memory consolidation, but instead as meaningless commands that the cortex must process like memories.

 

Study rats experiencing such abnormal signals had significant difficulties navigating to places where they had previously found water. Furthermore, the degree of abnormal hippocampal-cortical signaling in study animals tracked closely with the level of memory impairment.

 

The study also looked at data from epilepsy patients who had their brain signals monitored as part of surgery preparation. Researchers found that rats and humans with epilepsy experienced similar abnormal hippocampal discharges between seizures that resembled, but out-competed, normal memory-forming communication between brain regions.

 

Given the tens of milliseconds delay observed between hippocampus signals and the response from the cortex, researchers see a time window during which an implanted device might interrupt disease-related signals, and have launched a related design effort.
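The article gives no design details for such a device, but the closed-loop idea, detect a hippocampal discharge and respond within the tens-of-milliseconds delay before the cortex reacts, can be sketched. All names, thresholds, and latencies below are hypothetical illustrations, not specifications from the researchers.

```python
def detect_discharges(signal, threshold, fs):
    """Return times (ms) where the trace first crosses `threshold` --
    a stand-in for interictal-discharge detection. `fs` is the sampling
    rate in Hz."""
    events = []
    above = False
    for i, sample in enumerate(signal):
        if sample >= threshold and not above:
            events.append(1000.0 * i / fs)  # sample index -> milliseconds
        above = sample >= threshold
    return events

# Synthetic 1 kHz trace: quiet baseline with one sharp discharge at 50 ms.
fs = 1000
trace = [0.1] * 50 + [2.5, 3.0, 2.0] + [0.1] * 50
events = detect_discharges(trace, threshold=2.0, fs=fs)

# If the hippocampus-to-cortex delay is, say, ~40 ms and the device needs
# ~20 ms to detect and stimulate, an intervention fits inside the window.
window_ms, device_latency_ms = 40, 20
can_intervene = all(device_latency_ms < window_ms for _ in events)
```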

 

Foundation Built over Decades

 

Senior author of the study and NYU Langone neuroscientist Gyorgy Buzsaki, PhD, had established, starting in 1989, the theory that memories form in two stages: one while we are awake and another in which the day's events are replayed during sleep.

 

As the latest step in that work, Buzsaki also led a study published in the journal Science last month that explained key mechanisms behind hippocampal-cortical memory consolidation.

 

Countering the idea that most neurons contribute equally as memories form, his team found that a few strongly active "rigid" neurons perform the same way before and after experiences, while a second set of rarely contributing "plastic" neurons behave differently before and after opportunities for memory consolidation.

 

"We seem to have evolved with both a stable template of neurons that process what is the same about the things we encounter, and a second group that can learn with new experiences," says Buzsaki. "This new understanding of memory consolidation made possible our insights into epilepsy."

 

Buzsaki has shown that the hippocampus processes information in rhythmic cycles, with thousands of nerve signals sent regularly and within milliseconds of each other. By firing in synchrony, brain cells cooperate to achieve complex signals, but only if this wave is sculpted, with signals afforded proper strengths and placed in order. Unfortunately, the synchronous nature of hippocampal signaling creates risk, says Buzsaki, because without proper control it can convey powerful nonsense messages to the rest of the brain.

 

Read the original article here.
