Chapter 16.




By Gary Stix Implantation of electrodes deep within the brain is now commonly performed for treatment of the neurological disorders Parkinson’s disease and essential tremor. But the use of deep-brain stimulation, as it is known, is expanding. It is now being assessed in as many as 200 patients for major depression—and is being considered for other disorders such as anorexia. Helen Mayberg, a neurologist from Emory University, has pioneered the use of imaging techniques to understand the functioning of different brain circuits to determine how to tailor various treatments for depression, including deep-brain stimulation, to a patient’s needs. Learn about her work in “Deep-Brain Stimulation: A Decade of Progress with Helen Mayberg,” a webinar presented by the Brain & Behavior Research Foundation. © 2015 Scientific American

Keyword: Depression
Link ID: 20584 - Posted: 02.16.2015

By Tia Ghose A woman who had persistent headaches found there was a strange culprit for her pain: a Pilates class that caused her brain fluid to leak, according to a new case report. The brain fluid leak led to a persistent, worsening headache that was alleviated only when the 42-year-old British woman lay down, according to the report, which was published in December in the Journal of Medical Case Reports. Though doctors never identified the exact location of the leak, the patient improved after a few weeks of bed rest and pain relievers. Cerebrospinal fluid is a clear liquid that flows between the brain and its outer covering, and between the spinal cord and its outer covering; both of these coverings are called the dura. This fluid cushions the brain and spinal cord and helps clear metabolic waste from the brain. However, sometimes holes can emerge in the dura, said Dr. Amber Luong, an otolaryngologist at the University of Texas Health Sciences Center in Houston. "The most common cause [of such leaks is] trauma, like a car accident," Luong told Live Science. Often, cerebrospinal fluid leaks out of a person's nose because there is a crack in the base of the skull and a tear in the dura lining the brain. One telltale sign of a cerebrospinal fluid leak is clear, metallic-tasting fluid coming out of just one nostril, Luong said. (The woman in this case did not experience this symptom because her tear was around the spinal cord, not the brain.)

Keyword: Brain Injury/Concussion
Link ID: 20579 - Posted: 02.16.2015

By Dr. Lisa Sanders. On Wednesday, we challenged Well readers to take on the case of a 21-year-old college student with chronic headaches who suddenly became too dizzy to walk. She had a medical history that was complicated by back surgery and a subsequent infection, and chronic headaches after a car accident. More than 300 of you wrote in with suggested diagnoses, but only a handful of you noticed the clue that led the medical student who saw the patient to the right answer. The cause of the young woman’s dizziness was… postural tachycardia syndrome, or POTS. The first reader to make this diagnosis was Theresa Baker, a retired bookkeeper and mother from Philomath, Ore. She said she immediately recognized the disorder because her young niece has suffered from it for over a decade; the niece's episodes of dizziness and fainting had started when she was just 13. Well done, Ms. Baker! The Diagnosis: Postural tachycardia syndrome — also called postural orthostatic tachycardia syndrome — is an unusual condition in which simply being upright causes symptoms of lightheadedness, sometimes to the point of fainting, along with an increase in heart rate to faster than 130 beats per minute, all of which improve when the patient lies down. These basic symptoms are often accompanied by fatigue, which is often worst after any type of exertion, along with a loss of concentration, blurred or tunnel vision, difficulty sleeping or nausea. POTS is considered a syndrome rather than a disease because it has many possible causes. It can be transient — a side effect of certain medications or a result of loss of conditioning, acute blood loss or dehydration — and in these cases it resolves when the trigger is removed. Other types of POTS are more persistent — which turned out to be the case for this patient — lasting months or years. © 2015 The New York Times Company

Keyword: Miscellaneous
Link ID: 20575 - Posted: 02.13.2015

By David Tuller The Institute of Medicine on Tuesday proposed a new name and new diagnostic criteria for the condition that many still call chronic fatigue syndrome. Experts generally agree that the disease has a physical basis, but they have struggled for decades to characterize its symptoms. The new report may help improve diagnosis, but the recommendations are unlikely to end the long, contentious debate over who has the condition and what may be causing it. An institute panel recommended that the illness be renamed “systemic exertion intolerance disease,” a term that reflects what patients, clinicians and researchers all agree is a core symptom: a sustained depletion of energy after minimal activity, called postexertional malaise. The new name “really describes much more directly the key feature of the illness, which is the inability to tolerate both physical and cognitive exertion,” said Dr. Peter Rowe, a member of the panel and a pediatrician at Johns Hopkins who treats children with the condition. An alternate name for the illness, myalgic encephalomyelitis, meaning “brain and spinal cord inflammation with muscle pain,” was coined decades ago. Many experts now refer to the condition as M.E./C.F.S. About one million people in the United States are believed to have the syndrome. Many say they have been accused of imagining or exaggerating their symptoms, and many doctors have long viewed it as a psychological illness. The authors urged that doctors take patients’ physical complaints seriously. “This is not a figment of their imagination,” said Dr. Ellen Wright Clayton, the chairwoman of the Institute of Medicine panel and a professor of pediatrics and law at Vanderbilt University. Patients attribute much of their mistreatment to the name “chronic fatigue syndrome,” chosen by the Centers for Disease Control in 1988. © 2015 The New York Times Company

Keyword: Depression
Link ID: 20573 - Posted: 02.13.2015

Scientists have uncovered more than 90 new gene regions that could help explain why some people are more likely to put on weight than others. The team scoured DNA libraries of more than 300,000 people, constructing the largest-ever genetic map of obesity. Looking for consistent patterns, they found a link with genes involved in brain processes, suggesting obesity could partly have a neurological basis. The results are published in the journal Nature. Researchers from the international GIANT consortium (Genetic Investigation of ANthropometric Traits) analysed the genetics behind body mass index (a ratio of weight to height). And in a separate Nature paper they looked specifically at how genetics influences where fat is distributed around the body. Fat around the abdomen, for example, can cause more health problems than fat carried around the thighs. Some 33 newly pinpointed gene regions were linked to body fat distribution - giving further clues about why some people are pear-shaped while others put on weight more around the tummy. They also identified more than 60 genetic locations that influence body mass index - tripling the number previously known. And some of these regions have links with the nervous system. © 2015 BBC

Keyword: Obesity; Genes & Behavior
Link ID: 20572 - Posted: 02.13.2015

Ewen Callaway A mysterious group of humans from the east stormed western Europe 4,500 years ago — bringing with them technologies such as the wheel, as well as a language that is the forebear of many modern tongues, suggests one of the largest studies of ancient DNA yet conducted. Vestiges of these eastern émigrés exist in the genomes of nearly all contemporary Europeans, according to the authors, who analysed genome data from nearly 100 ancient Europeans [1]. The first Homo sapiens to colonize Europe were hunter-gatherers who arrived from Africa, by way of the Middle East, around 45,000 years ago. (Neanderthals and other archaic human species had begun roaming the continent much earlier.) Archaeology and ancient DNA suggest that farmers from the Middle East started streaming in around 8,000 years ago, replacing the hunter-gatherers in some areas and mixing with them in others. But last year, a study of the genomes of ancient and contemporary Europeans found echoes not only of these two waves from the Middle East, but also of an enigmatic third group that they said could be from farther east [2] (see 'Ancient European genomes reveal jumbled ancestry'). Ancient genes: To further pin down the origins of this ghost lineage, a team led by David Reich, an evolutionary and population geneticist at Harvard Medical School in Boston, Massachusetts, analysed nuclear DNA from the bodies of 69 individuals who lived across Europe between 8,000 and 3,000 years ago. They also examined previously published genome data from another 25 ancient Europeans, including Ötzi, the 5,300-year-old 'ice man' who was discovered on the Italian-Austrian border. © 2015 Nature Publishing Group

Keyword: Language
Link ID: 20571 - Posted: 02.13.2015

By Devin Powell Dog owners may think their pets can tell a smile from a frown, but scientific evidence has been lacking. Now, researchers have trained dogs from a variety of breeds to look at a pair of photos arranged side by side—one showing the upper half of a woman’s face looking happy and the other showing the upper half of the same woman’s face looking angry—and pick out the happy expression by touching their snouts to it (pictured). When then shown the lower halves of the faces or pieces of other people’s faces, the perceptive pooches could still easily discern happy from angry. Another group of canines similarly learned to identify angry faces. Dogs in a previous study that distinguished expressions on whole faces could have done so using simple visual clues that reappeared in every face: the white of teeth in a smile, for instance, or creases in angry skin. Identifying emotions from photos of different parts of the face requires a more holistic understanding of expression, argue the authors of the new study, published online today in Current Biology. While primates are known to recognize faces, dogs may have been especially adapted for emotional sensitivity to humans during their domestication. The researchers plan to investigate how common this ability is by testing pigs and other animals. © 2015 American Association for the Advancement of Science.

Keyword: Emotions; Evolution
Link ID: 20570 - Posted: 02.13.2015

By Denise Grady However bad you thought smoking was, it’s even worse. A new study adds at least five diseases and 60,000 deaths a year to the toll taken by tobacco in the United States. Before the study, smoking was already blamed for nearly half a million deaths a year in this country from 21 diseases, including 12 types of cancer. The new findings are based on health data from nearly a million people who were followed for 10 years. In addition to the well-known hazards of lung cancer, artery disease, heart attacks, chronic lung disease and stroke, the researchers found that smoking was linked to significantly increased risks of infection, kidney disease, intestinal disease caused by inadequate blood flow, and heart and lung ailments not previously attributed to tobacco. Even though people are already barraged with messages about the dangers of smoking, researchers say it is important to let the public know that there is yet more bad news. “The smoking epidemic is still ongoing, and there is a need to evaluate how smoking is hurting us as a society, to support clinicians and policy making in public health,” said Brian D. Carter, an epidemiologist at the American Cancer Society and the first author of an article about the study, which appears in The New England Journal of Medicine. “It’s not a done story.” In an editorial accompanying the article, Dr. Graham A. Colditz, from Washington University School of Medicine in St. Louis, said the new findings showed that officials in the United States had substantially underestimated the effect smoking has on public health. He said smokers, particularly those who depend on Medicaid, had not been receiving enough help to quit. © 2015 The New York Times Company

Keyword: Drug Abuse
Link ID: 20569 - Posted: 02.13.2015

By Amy Ellis Nutt When we tell stories about our lives, most of us never have our memories questioned. NBC's Brian Williams, like other high-profile people in the past, is finding out what happens when questions arise. Williams's faux pas – retelling a story of his helicopter coming under fire in Iraq a dozen years ago when it was actually the helicopter flying ahead of him – was much like Hillary Rodham Clinton's during the 2008 presidential campaign. Her story was about coming under fire during a visit to an airfield in Bosnia 12 years earlier. George W. Bush also misremembered when, on several occasions, he told audiences that on 9/11 he watched the first plane fly into the north tower of the World Trade Center on TV, just before entering that classroom in Florida to read a book to school kids. In each case, these were highly emotional moments. Williams's helicopter made an emergency landing in the desert behind the aircraft that was hit; Clinton was made to don a flak jacket and was told her airplane might not be able to land at the airport in Bosnia because of sniper fire in the area; and Bush was told by an aide about the first crash into World Trade Center just before entering the classroom. That each of those memories was false created huge public relations headaches for Clinton and Williams. But the fact is that false memories are not that uncommon, especially when they involve highly emotional events. Scientists have been telling us for years that memory of autobiographical events, also known as episodic memory, is pliable and even unreliable. The consensus from neuroimaging studies and laboratory experiments is that episodic memory is not like replaying a film but more like reconstructing an event from bits and pieces of information. Memories are stored in clusters of neurons called engrams, and the proteins responsible for storing those memories, scientists say, are modified and changed just by the reconstruction process of remembering.

Keyword: Learning & Memory
Link ID: 20566 - Posted: 02.09.2015

Madeline Bonin Bats and moths have been evolving to one-up each other for 65 million years. Many moths can hear bats’ ultrasonic echolocation calls, making it easy for the insects to avoid this predator. A few species of bat have developed echolocation calls that are outside the range of the moths’ hearing, making it harder for the moths to evade them [1]. But humans short-circuit this evolutionary arms race every time they turn on a porch light, according to a study in the Journal of Applied Ecology [2]. In field experiments, ecologist Corneile Minnaar of the University of Pretoria and his colleagues examined the diet of Cape serotine bats (Neoromicia capensis) both in the dark and under artificial light in a national park near Pretoria. The bat, an insect-eating species common in South Africa, has an echolocation call that moths can hear. Minnaar and his team determined both the species and quantity of available insect prey at the test sites using a hand-held net and a stationary trap. Cape serotine bats do not normally eat many moths. As the scientists expected, the bats caught more during the lighted trials than in the dark. What was surprising, however, was the discovery that the insects formed a greater share of the bats' diet during the lighted trials: the percentage of moths eaten in bright areas was six times larger than in dark zones, even though moths represented a smaller share of the total insect population under the lights than in the shade. © 2015 Nature Publishing Group

Keyword: Hearing; Evolution
Link ID: 20565 - Posted: 02.09.2015

by Andy Coghlan Apple's the word. Chimpanzees can learn to grunt "apple" in two chimp languages – a finding that questions how unique our own language abilities are. Researchers kept records of the vocalisations of a group of adult chimps from the Netherlands before and after their move to Edinburgh zoo. Three years later, recordings show, the Dutch chimps had picked up the pronunciation of their Scottish hosts. The finding challenges the prevailing theory that chimp words for objects are fixed because they result from excited, involuntary outbursts. Humans can easily learn foreign words that refer to a specific object, and it was assumed that chimps and other animals could not, perhaps owing to their different brain structure. This has long been argued to be one of the talents making humans unique. The assumption has been that animals do not have control over the sounds they make, whereas we socially learn the labels for things – which is what separates us from animals, says Katie Slocombe of the University of York, UK. But this may be wrong, it seems. "The important thing we've now shown is that with the food calls, they changed the structure to fit in with their new group members, so the Dutch calls for 'apple' changed to the Edinburgh ones," says Slocombe. "It's the first time call structure has been dissociated from emotional outbursts." © Copyright Reed Business Information Ltd.

Keyword: Language; Evolution
Link ID: 20560 - Posted: 02.07.2015

by Sandrine Ceurstemont Malte Andersson from the University of Gothenburg in Sweden has been testing whether Norwegian lemmings (Lemmus lemmus) deter predators by warning them of their aggressive nature with their shrieks. The vivid markings on their fur also indicate to predators that this critter isn't for eating. Having such warning colours – a phenomenon known as aposematism – is common in insects, snakes and frogs, but unusual in herbivorous mammals. This combination of hues made the lemmings easier to spot than their plain-looking neighbours, grey-sided voles. When a predator, played by humans in Andersson's test, was far away, the lemmings preferred to go unnoticed, he found. But when predators got closer, to within a few metres, the lemmings were much more likely to give out a warning call than their browner relatives. The conspicuous colours, aggressive calls and threatening postures together let predators know to expect a fight, and potentially damage, if they attempt to eat a Norwegian lemming. In contrast with the voles, these lemmings aggressively resist attacks by predatory birds. © Copyright Reed Business Information Ltd.

Keyword: Aggression
Link ID: 20558 - Posted: 02.07.2015

By Angelina Fanous After the height of the Ice Bucket Challenge last fall, I found myself at a dinner party where the conversation turned to A.L.S. — amyotrophic lateral sclerosis — the disease for which millions were dousing themselves to raise awareness and money. “Would you rather have A.L.S., Alzheimer’s, or Parkinson’s?” someone asked. All those diseases are devastating, but A.L.S. is unique in that it usually kills within two to three years of diagnosis. It was just a game to my friends, all of whom are in their 20s. Everyone chose A.L.S., agreeing that it would be the fastest and therefore easiest death. But I stayed silent. I hadn’t yet told my friends that I had been diagnosed with A.L.S. in July — two months after my 29th birthday. Had I been healthy, I might have answered A.L.S., too. But since my diagnosis, all I have wanted is more time. When I first noticed I couldn’t type with my left hand, the doctors narrowed it down to two options: a treatable autoimmune disease or A.L.S. They initially began treating me for the autoimmune disease. About once a month, we shut down my immune system so it would stop attacking my central nervous system. But with no immune system I made regular visits to the E.R. “At least it’s not A.L.S.,” I consoled myself. When the treatment didn’t work and the weakness spread to my left leg and right hand, A.L.S. was the only remaining possibility. Still, I did that socially acceptable but also borderline insane thing where I sought second, third and fourth opinions. I voluntarily subjected myself to excruciating medical tests. I got shocked with electricity, had my spinal fluid drained, and underwent a surgery to remove a piece of my muscles and nerves, all in the hopes of finding a different diagnosis. All of the tests confirmed the diagnosis of A.L.S. © 2015 The New York Times Company

Keyword: ALS-Lou Gehrig's Disease
Link ID: 20556 - Posted: 02.05.2015

Alison Abbott Fabienne never found out why she went into labour three months too early. But on a quiet afternoon in June 2007, she was hit by accelerating contractions and was rushed to the nearest hospital in rural Switzerland, near Lausanne. When her son, Hugo, was born at 26 weeks of gestation rather than the typical 40, he weighed just 950 grams and was immediately placed in intensive care. Three days later, doctors told Fabienne that ultrasound pictures of Hugo's brain indicated that he had had a severe haemorrhage from his immature blood vessels. “I just exploded into tears,” she says. Both she and her husband understood that the prognosis for Hugo was grim: he had a very high risk of cerebral palsy, a neurological condition that can lead to a life of severe disability. The couple agreed that they did not want to subject their child to that. “We immediately told the doctors that we did not want fierce medical intervention to keep him alive — and saw the relief on the doctors' faces,” recalls Fabienne, who requested that her surname not be used. That night was the most tortured of her life. The next day, however, before any change had been made to Hugo's treatment, his doctors proposed a new option to confirm the diagnosis: a brain scan using magnetic resonance imaging (MRI). This technique, which had been newly adapted for premature babies, would allow the doctors to predict the risk of cerebral palsy more accurately than with ultrasound alone, which has a high false-positive rate. Hugo's MRI scan showed that the damage caused by the brain haemorrhage was limited, and his risk of severe cerebral palsy was likely to be relatively low. So just 24 hours after their decision to let his life end, Hugo's parents did an about-turn. They agreed that the doctors should try to save him. © 2015 Nature Publishing Group

Keyword: Development of the Brain
Link ID: 20555 - Posted: 02.05.2015

By Amanda Baker While we all may vary on just how much time we like spending with other people, humans are overall very social beings. Scientists have already found this to be reflected in our health and well-being – with social isolation being associated with more depression, worse health, and a shorter life. Looking even deeper, they find evidence of our social nature reflected in the very structure of our brains. Just thinking through your daily interactions with your friends or siblings probably gives you dozens of examples of times when it was important to interpret or predict the feelings and behaviors of other people. Our brains agree. Over time parts of our brains have been developed specifically for those tasks, but apparently not all social interaction was created equally. When researchers study the brains of people trying to predict the thoughts and feelings of others, they can actually see a difference in the brain activity depending on whether that person is trying to understand a friend versus a stranger. Even at the level of blood flowing through your brain, you treat people you know well differently than people you don’t. These social interactions also extend into another important area of the brain: the nucleus accumbens. This structure is key in the reward system of the brain, with activity being associated with things that leave you feeling good. Curious if this could have a direct connection with behavior, one group of scientists studied a very current part of our behavior as modern social beings: Facebook use. © 2015 Scientific American

Keyword: Development of the Brain
Link ID: 20554 - Posted: 02.05.2015

By Andrea Anderson and Victoria Stern Blood type may affect brain function as we age, according to a new large, long-term study. People with the rare AB blood type, present in less than 10 percent of the population, have a higher than usual risk of cognitive problems as they age. University of Vermont hematologist Mary Cushman and her colleagues used data from a national study called REGARDS, which has been following 30,239 African-American and Caucasian individuals older than 45 since 2007. The aim of the study is to understand the heavy stroke toll seen in the southeastern U.S., particularly among African-Americans. Cushman's team focused on information collected twice yearly via phone surveys that evaluate cognitive skills such as learning, short-term memory and executive function. The researchers zeroed in on 495 individuals who showed significant declines on at least two of the three phone survey tests. When they compared that cognitively declining group with 587 participants whose mental muster remained robust, researchers found that impairment in thinking was roughly 82 percent more likely in individuals with AB blood type than in those with A, B or O blood types, even after taking their race, sex and geography into account. The finding was published online last September in Neurology. The seemingly surprising result has some precedent: past studies suggest non-O blood types are linked to elevated incidence of heart disease, stroke and blood clots—vascular conditions that could affect brain function. Yet these cardiovascular consequences are believed to be linked to the way non-O blood types coagulate, which did not seem to contribute to the cognitive effects described in the new study. The researchers speculate that other blood-group differences, such as how likely cells are to stick to one another or to blood vessel walls, might affect brain function. © 2015 Scientific American

Keyword: Alzheimers
Link ID: 20552 - Posted: 02.05.2015

By Monique Brouillette When the first four-legged creatures emerged from the sea roughly 375 million years ago, the transition was anything but smooth. Not only did they have to adjust to the stress of gravity and the dry environment, but they also had to wait another 100 million years to evolve a fully functional ear. But two new studies show that these creatures weren’t deaf; instead, they may have used their lungs to help them hear. Fish hear easily underwater, as sound travels in a wave of vibration that freely passes into their inner ears. If you put a fish in air, however, the difference in the density of the air and tissue is so great that sound waves will mostly be reflected. The modern ear adapted by channeling sound waves onto an elastic membrane (the eardrum), causing it to vibrate. But without this adaptation, how did the first land animals hear? To answer this question, a team of Danish researchers looked at one of the closest living relatives of early land animals, the African lungfish (Protopterus annectens). As its name suggests, the lungfish is equipped with a pair of air-breathing lungs. But like the first animals to walk on land, it lacks a middle ear. The researchers wanted to determine if the fish could sense sound pressure waves underwater, so they filled a long metal tube with water and placed a loudspeaker at one end. They played sounds into the tube in a range of frequencies and carefully positioned the lungfish in areas of the tube where the sound pressure was high. Monitoring the brain stem and auditory nerve activity in the lungfish, the researchers were surprised to discover that the fish could detect pressure waves in frequencies above 200 Hz. © 2015 American Association for the Advancement of Science

Keyword: Hearing; Evolution
Link ID: 20551 - Posted: 02.05.2015

by Jacob Aron Ever struggled to tell the difference between two shades of paint? When it comes to colour, one person's peach is another's puce, but there are 11 basic colours that we all agree on. Now it seems two more should be in the mix: lilac and turquoise. In 1969, two researchers looked at 100 languages and found that all had words for black, white, red, green, yellow, blue, brown, purple, pink, orange and grey. These terms pass a number of tests: they refer to easily distinguishable colours, are widely used and are single words. [Image: a chart of shades divided into the basic colours (D. Mylonas/L. MacDonald)] We might quibble over which shade is cream or peach, for example, but everyone knows yellow when they see it. There are exceptions - Russian and Greek speakers have separate words for light and dark blue. Now Dimitris Mylonas of Queen Mary University of London and Lindsay MacDonald of University College London say the same applies to two more colours, in the case of English speakers, at least. For the past seven years, they've been running an online test in which people name a range of shades – you can try it for yourself. Results from 330 participants were analysed to pick out basic names. These were ranked in a number of ways, such as how often each colour name came up and whether the name was unique to one shade or common to many. Lilac and turquoise came ninth and tenth overall, beating white, red and orange. The only measure turquoise didn't score highly on was the time it took people to enter an answer, says Mylonas. "Our observers had problems spelling it correctly." © Copyright Reed Business Information Ltd.

Keyword: Vision
Link ID: 20550 - Posted: 02.05.2015

by Bethany Brookshire The windup before the pitch. The take-away before the golf swing. When you learn to pitch a softball, swing a golf club or shoot a basketball, you learn that preparation is important. You also learn about follow-through — the upswing of the golf club or the bend in the elbow after a softball pitch. It’s the preparation and the execution that get the ball across the plate, so why should we care about follow-through? In theory, once the ball has left your hands or sailed away from your club or racket, there’s no movement you could make that could affect what happens next. So while some follow-through might be important to diffuse the energy you just put into your shot, it shouldn’t really matter whether you swing your golf club up in an arc, whip it off to the side or club your opponent over the head with it. But follow-through is in fact quite important, and not just as an extension of the movements that preceded it. Consistent follow-through actually helps performance, report neuroscientist Ian Howard and colleagues at the University of Plymouth in England. The finding gives coaches some science to back up their training, and helps scientists understand how the brain accesses motor memories. Howard has always been interested in how the brain learns movement tasks. “The first study we did looked at the preparation movement — you move backwards and then you move forwards [as in a golf swing],” he says. His lab found that the preparation before a particular motion had a strong effect on how our brains learn and recall motor movements. © Society for Science & the Public 2000 - 2015.

Keyword: Movement Disorders
Link ID: 20549 - Posted: 02.05.2015

By Maria Konnikova R. T. first heard about the Challenger explosion as she and her roommate sat watching television in their Emory University dorm room. A news flash came across the screen, shocking them both. R. T., visibly upset, raced upstairs to tell another friend the news. Then she called her parents. Two and a half years after the event, she remembered it as if it were yesterday: the TV, the terrible news, the call home. She could say with absolute certainty that that’s precisely how it happened. Except, it turns out, none of what she remembered was accurate. R. T. was a student in a class taught by Ulric Neisser, a cognitive psychologist who had begun studying memory in the seventies. Early in his career, Neisser became fascinated by the concept of flashbulb memories—the times when a shocking, emotional event seems to leave a particularly vivid imprint on the mind. William James had described such impressions, in 1890, as “so exciting emotionally as almost to leave a scar upon the cerebral tissues.” The day following the explosion of the Challenger, in January, 1986, Neisser, then a professor of cognitive psychology at Emory, and his assistant, Nicole Harsch, handed out a questionnaire about the event to the hundred and six students in their ten o’clock psychology 101 class, “Personality Development.” Where were the students when they heard the news? Whom were they with? What were they doing? The professor and his assistant carefully filed the responses away. In the fall of 1988, two and a half years later, the questionnaire was given a second time to the same students. It was then that R. T. recalled, with absolute confidence, her dorm-room experience. But when Neisser and Harsch compared the two sets of answers, they found barely any similarities.

Keyword: Learning & Memory
Link ID: 20548 - Posted: 02.05.2015