Chapter 13. Memory, Learning, and Development




Boer Deng

The ability of the bizarre prion protein to cause an array of degenerative brain conditions may help solve a puzzle in Alzheimer's research — why the disease sometimes kills within a few years, but usually causes a slow decline that can take decades. By adopting tools used to study the prion protein, PrP, researchers have found variations in the shape of a protein involved in Alzheimer’s that may influence how much damage it causes in the brain. At the Prion 2015 meeting, held on 26–29 May in Fort Collins, Colorado, neuroscientist Lary Walker described how he has borrowed a technique from prion research to study different ‘strains’ of the amyloid-β protein, which accumulates in clumps in the brains of people with Alzheimer’s. It may be that differences between the strains account for variations in the disease’s symptoms and rate of progression. “The Alzheimer’s field has not been paying enough attention to what’s happening in the prion field,” says Walker, who is based at Emory University in Atlanta, Georgia. Similarities between rare prion diseases and common neurodegenerative diseases such as Alzheimer’s have been noted for decades: both are thought to involve proteins in the nervous system that change shape and clump together. In prion diseases, a misfolded, often foreign, protein induces cascading malformation of the native prion protein in a patient’s brain. In Alzheimer’s, proteins called tau and amyloid-β accumulate within and around nerve cells, though what triggers that process — and the role of the deposits in the disease — is unclear. © 2015 Nature Publishing Group

Keyword: Alzheimers; Prions
Link ID: 21000 - Posted: 05.30.2015

A patient tormented by suicidal thoughts gives his psychiatrist a few strands of his hair. She derives stem cells from them to grow budding brain tissue harboring the secrets of his unique illness in a petri dish. She uses the information to genetically engineer a personalized treatment to correct his brain circuit functioning. Just sci-fi? Yes, but... An evolving “disease-in-a-dish” technology, funded by the National Institutes of Health (NIH), is bringing closer the day when such a seemingly futuristic personalized medicine scenario might not seem so far-fetched. Scientists have perfected mini cultured 3-D structures that grow and function much like the outer mantle – the key working tissue, or cortex — of the brain of the person from whom they were derived. Strikingly, these “organoids” buzz with neuronal network activity. Cells talk with each other in circuits, much as they do in our brains. Sergiu Pasca, M.D., of Stanford University, Palo Alto, CA, and colleagues debuted what they call “human cortical spheroids” online May 25, 2015, in the journal Nature Methods. Prior to the new study, scientists had developed a way to study neurons differentiated from stem cells derived from patients’ skin cells — using a technology called induced pluripotent stem cells (iPSCs). They had even produced primitive organoids by coaxing neurons and support cells to organize themselves, mimicking the brain’s own architecture. But these lacked the complex circuitry required to even begin to mimic the workings of our brains.

Keyword: Development of the Brain
Link ID: 20998 - Posted: 05.30.2015

by Andy Coghlan

A man in his mid-50s with Parkinson's disease had fetal brain cells injected into his brain last week. He is the first person in nearly 20 years to be treated this way – and could recover full control of his movements in roughly five years. "It seemed to go fine," says Roger Barker of the University of Cambridge, who is leading the international team that is reviving the procedure. The treatment was pioneered 28 years ago in Sweden, but two trials in the US reported no significant benefit within the first two years following the injections, and the procedure was abandoned in favour of deep brain stimulation treatments. What these trials overlooked is that it takes several years for fetal cells to "bed in" and connect properly to the recipient's brain. Many Swedish and North American recipients improved dramatically, around three years or more after the implants – long after the trials had finished. "In the best cases, patients who had the treatment pretty much went back to normal," says Barker. After the fetal cells were wired up properly in their brains, they started producing the brain signalling chemical dopamine – low levels of this cause the classic Parkinson's symptom of uncontrolled movements. In fact, the cells produced so much dopamine that many patients could stop taking their Parkinson's drugs. "The prospect of not having to take medications for Parkinson's is fantastic," says James Beck of the Parkinson's Disease Foundation in the US. © Copyright Reed Business Information Ltd

Keyword: Parkinsons; Stem Cells
Link ID: 20989 - Posted: 05.27.2015

Children developed better fine-motor skills when the clamping of their umbilical cord at birth was delayed several minutes compared with just seconds, according to a new randomized trial. Delaying clamping allows fetal blood circulating in the placenta to be transfused to the infant, which has been shown to reduce iron deficiency at four to six months of age. Now the longer-term benefits of a delay are becoming clearer. Researchers in Sweden randomly assigned 382 full-term infants born after low-risk pregnancies to have their cords clamped either at least three minutes after delivery or within 10 seconds of birth. When the children were four years old, a psychologist assessed them on standard tests of IQ, motor skills and behaviour. The parents also filled in questionnaires about their child's communication and social skills. "Delayed cord clamping compared with early cord clamping improved scores and reduced the number of children having low scores in fine-motor skills and social domains," the study's lead author, Dr. Ola Andersson of Uppsala University in Sweden, and his co-authors said in Tuesday's issue of JAMA Pediatrics. The fine-motor skill tests showed those in the delayed clamping group had a more mature pencil grip. There was also a difference in boys, who researchers said are generally more prone to iron deficiency than girls: boys showed more improvements in fine-motor skills with delayed clamping. Andersson said delayed cord clamping can have quite an effect on the amount of iron in the blood, which is important for brain development just after birth. ©2015 CBC/Radio-Canada.

Keyword: Development of the Brain
Link ID: 20988 - Posted: 05.27.2015

By Tina Hesman Saey

Combatants in the age-old battle of nature versus nurture may finally be able to lay down their arms. On average, both nature and nurture contribute roughly equally to determining human traits. Researchers compiled data from half a century’s worth of studies on more than 14 million pairs of twins. The researchers measured heritability — the amount of variation in a characteristic that can be attributed to genes — for a wide variety of human traits including blood pressure, the structure of the eyeball and mental or behavioral disorders. All traits are heritable to some degree, the researchers report May 18 in Nature Genetics. Traits overall had an average heritability of 49 percent, meaning it’s a draw between genes and environment. Individual traits can be more strongly influenced by one or the other. (By the numbers: 100 percent of the human traits examined have some genetic component, and genes account for 49 percent of the variability in human traits on average.) T.J.C. Polderman et al. Meta-analysis of the heritability of human traits based on fifty years of twin studies. Nature Genetics. Published online May 18, 2015. doi:10.1038/ng.3285. © Society for Science & the Public 2000 - 2015.
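As a rough illustration of how twin studies turn correlations into a heritability figure like the 49 percent average above, here is Falconer's classic approximation, a minimal sketch rather than the full variance-component models fitted in the meta-analysis itself; the example correlations below are hypothetical.

\[ h^2 \approx 2\,(r_{MZ} - r_{DZ}), \qquad c^2 \approx 2\,r_{DZ} - r_{MZ}, \qquad e^2 \approx 1 - r_{MZ} \]

Here \(r_{MZ}\) and \(r_{DZ}\) are the trait correlations within identical (monozygotic) and fraternal (dizygotic) twin pairs, \(h^2\) is heritability, \(c^2\) the shared-environment share and \(e^2\) the unique-environment share. With hypothetical correlations \(r_{MZ} = 0.75\) and \(r_{DZ} = 0.50\), the formula gives \(h^2 \approx 0.50\), \(c^2 \approx 0.25\) and \(e^2 \approx 0.25\), in the same ballpark as the average reported in the study.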

Keyword: Genes & Behavior; Development of the Brain
Link ID: 20985 - Posted: 05.27.2015

By Esther Hsieh

Imagine you are enjoying your golden years, driving to your daily appointment for some painless brain zapping that is helping to stave off memory loss. That's the hope of a new study, in which people who learned associations (such as a random word and an image) after transcranial magnetic stimulation (TMS) were better able to learn more pairings days and weeks later—with no further stimulation needed. TMS uses a magnetic coil placed on the head to increase electrical signaling a few centimeters into the brain. Past studies have found that TMS can boost cognition and memory during stimulation, but this is the first to show that such gains can last even after the TMS regimen is completed. In the new study, which was published in Science, neuroscientists first used brain imaging to identify the associative memory network of 16 young, healthy participants. This network, based around the hippocampus, glues together things such as sights, places, sounds and time to form a memory, explains neuroscientist Joel Voss of Northwestern University, a senior author of the paper. Next, the researchers applied TMS behind the left ear of each participant for 20 minutes for five consecutive days to stimulate this memory network. To see if participants' associative memory improved, one day after the stimulation regimen finished they were tested for their ability to learn random words paired with faces. Subjects who had had TMS performed 33 percent better, compared with those who received placebo treatments, such as sham stimulation. © 2015 Scientific American

Keyword: Learning & Memory; Brain imaging
Link ID: 20977 - Posted: 05.25.2015

By Tara Haelle

Thousands of infants each year die in their cribs from sudden infant death syndrome (SIDS) for reasons that have remained largely a mystery. A study published May 25 provides strong evidence that oxygen deprivation plays a big role. One reason the cause of SIDS has been so difficult to study is the sheer number of variables researchers have had to account for: whether the infant sleeps face down, breathes secondhand smoke or has an illness, as well as whether the child has an unidentified underlying susceptibility. To isolate the effects of oxygen concentration, researchers from the University of Colorado compared the rate of SIDS in infants living at high altitudes, where the air is thin, with that of infants living closer to sea level. Infants at high altitudes, they found, were more than twice as likely to die from SIDS. It was “very clever of the authors,” says Michael Goodstein, a pediatrician and member of the 2010–2011 Task Force on Sudden Infant Death Syndrome who was not involved in the study. “The authors did a good job controlling for other variables,” he adds. Beyond the risk of living at high altitudes, the study suggests a common link among different risk factors for SIDS. For example, the authors note that sleeping on the stomach and exposure to tobacco smoke can also contribute to hypoxia—insufficient oxygen reaching the tissues. Similarly, past research has suggested that sleeping on soft surfaces may shift the chin down, partly obstructing the airway, which might cause an infant to breathe in less oxygen. It’s unclear how hypoxia might contribute to SIDS, but it could have to do with a buildup of carbon dioxide in the tissues when a child does not wake up. © 2015 Scientific American

Keyword: Sleep; Development of the Brain
Link ID: 20975 - Posted: 05.25.2015

Nala Rogers

Alzheimer’s disease may have evolved alongside human intelligence, researchers report in a paper posted this month on bioRxiv. The study finds evidence that 50,000 to 200,000 years ago, natural selection drove changes in six genes involved in brain development. This may have helped to increase the connectivity of neurons, making modern humans smarter as they evolved from their hominin ancestors. But that new intellectual capacity was not without cost: the same genes are implicated in Alzheimer's disease. Kun Tang, a population geneticist at the Shanghai Institutes for Biological Sciences in China who led the research, speculates that the memory disorder developed as ageing brains struggled with new metabolic demands imposed by increasing intelligence. Humans are the only species known to develop Alzheimer's; the disease is absent even in closely related primate species such as chimpanzees. Tang and his colleagues searched modern human DNA for evidence of this ancient evolution. They examined the genomes of 90 people with African, Asian or European ancestry, looking for patterns of variation driven by changes in population size and natural selection.

Marked by selection

The analysis was tricky, because the two effects can mimic each other. To control for the effects of population changes ― thereby isolating the signatures of natural selection — the researchers estimated how population sizes changed over time. Then they identified genome segments that did not match up with the population history, revealing the DNA stretches that were most likely shaped by selection. © 2015 Nature Publishing Group

Keyword: Alzheimers; Intelligence
Link ID: 20971 - Posted: 05.23.2015

Athletes who lose consciousness after concussions may be at greater risk for memory loss later in life, a small study of retired National Football League players suggests. Researchers compared memory tests and brain scans for former NFL players and a control group of people who didn't play college or pro football. After concussions that resulted in lost consciousness, the football players were more likely to have mild cognitive impairment and brain atrophy years later. "Our results do suggest that players with a history of concussion with a loss of consciousness may be at greater risk for cognitive problems later in life," senior study author Munro Cullum, chief of neuropsychology at the University of Texas Southwestern Medical Center in Dallas, said by email. "We are at the early stages of understanding who is actually at risk at the individual level." Cullum and colleagues recruited 28 retired NFL players living in Texas: eight who were diagnosed with mild cognitive impairment and 20 who didn't appear to have any memory problems. They ranged in age from 36 to 79, and were an average of about 58 years old. All but three former athletes experienced at least one concussion, and they typically had more than three. Researchers compared these men to 27 people who didn't play football but were similar in age, education, and mental capacity to the retired athletes, including six with cognitive impairment. These men were 41 to 77 years old, and about 59 on average. ©2015 CBC/Radio-Canada

Keyword: Brain Injury/Concussion; Learning & Memory
Link ID: 20965 - Posted: 05.21.2015

Scientists at Mayo Clinic in Jacksonville, Florida, created a novel mouse that exhibits the symptoms and neurodegeneration associated with the most common genetic forms of frontotemporal dementia (FTD) and amyotrophic lateral sclerosis (ALS, Lou Gehrig’s disease), both of which are caused by a mutation in a gene called C9ORF72. The study was partially funded by the National Institutes of Health and published in the journal Science. More than 30,000 Americans live with ALS, which destroys nerves that control essential movements, including speaking, walking, breathing and swallowing. After Alzheimer’s disease, FTD is the most common form of early onset dementia. It is characterized by changes in personality, behavior and language due to loss of neurons in the brain’s frontal and temporal lobes. Patients with mutations in the chromosome 9 open reading frame 72 (C9ORF72) gene have all or some symptoms associated with both disorders. “Our mouse model exhibits the pathologies and symptoms of ALS and FTD seen in patients with the C9ORF72 mutation,” said Leonard Petrucelli, Ph.D., chair and Ralph and Ruth Abrams Professor of the Department of Neuroscience at Mayo Clinic and a senior author of the study. “These mice could greatly improve our understanding of ALS and FTD and hasten the development of effective treatments.” To create the model, Ms. Jeannie Chew, a Mayo Graduate School student and member of Dr. Petrucelli’s team, injected the brains of newborn mice with a disease-causing version of the C9ORF72 gene. As the mice aged, they became hyperactive, anxious, and antisocial, in addition to having problems with movement that mirrored patient symptoms.

Keyword: ALS-Lou Gehrig's Disease ; Alzheimers
Link ID: 20964 - Posted: 05.21.2015

By Susan Cosier

Once a memory is lost, is it gone forever? Most research points to yes. Yet a study published in the online journal eLife now suggests that traces of a lost memory might remain in a cell's nucleus, perhaps enabling future recall or at least the easy formation of a new, related memory. The current theory accepted by neurobiologists is that long-term memories live at synapses, which are the spaces where impulses pass from one nerve cell to another. Lasting memories are dependent on a strong network of such neural connections; memories weaken or fade if the synapses degrade. In the new study, researchers at the University of California, Los Angeles, studied sea slugs' neurons in a cell culture dish. Over several days the neurons spontaneously formed a number of synapses. The scientists then administered the neurotransmitter serotonin to the neurons, causing them to create many more synapses—the same process by which a living creature would form a long-term memory. When they inhibited a memory-forming enzyme and checked the neurons after 48 hours, the number of synapses had returned to the initial number—but they were not the same individual synapses as before. Some of the original and some of the new synapses retracted to create the exact number the cells started with. The finding is surprising because it suggests that a nerve cell body “knows” how many synapses it is supposed to form, meaning it is encoding a crucial part of memory. The researchers also ran a similar experiment on live sea slugs, in which they found that a long-term memory could be totally erased (as gauged by its synapses being destroyed) and then re-formed with only a small reminder stimulus—again suggesting that some information was being stored in a neuron's body. © 2015 Scientific American

Keyword: Learning & Memory
Link ID: 20958 - Posted: 05.20.2015

by Clare Wilson

Does this qualify as irony? Our bodies need iron to be healthy – but too much could harm our brains by bringing on Alzheimer's disease. If that's the case, measuring people's brain iron levels could help identify those at risk of developing the disease. And since we already have drugs that lower iron, we may be able to put the brakes on. Despite intense efforts, the mechanisms behind this form of dementia are still poorly understood. For a long time the main suspect has been a protein called beta-amyloid, which forms distinctive plaques in the brain, but drugs that dissolve it don't result in people improving.

Not so good ferrous

Studies have suggested that people with Alzheimer's also have higher iron levels in their brains. Now it seems that high iron may hasten the disease's onset. Researchers at the University of Melbourne in Australia followed 144 older people with mild cognitive impairment over a period of seven years. To gauge how much iron was in their brains, they measured ferritin, a protein that binds to the metal, in their cerebrospinal fluid. For every extra nanogram per millilitre of ferritin people had at the start of the study, they were diagnosed with Alzheimer's on average three months earlier. The team also found that the biggest risk gene for Alzheimer's, ApoE4, was strongly linked with higher iron, suggesting this is why carrying the gene makes you more vulnerable. Iron is highly reactive, so it probably subjects neurons to chemical stress, says team member Scott Ayton. © Copyright Reed Business Information Ltd

Keyword: Alzheimers
Link ID: 20957 - Posted: 05.20.2015

By PAM BELLUCK

The largest analysis to date of amyloid plaques in people’s brains confirms that the presence of the substance can help predict who will develop Alzheimer’s and determine who has the disease. Two linked studies, published Tuesday in JAMA, also support the central early role in Alzheimer’s of beta amyloid, the protein that creates plaques. Data from nearly 9,500 people on five continents shows that amyloid can appear 20 to 30 years before symptoms of dementia, that the vast majority of Alzheimer’s patients have amyloid and that the ApoE4 gene, known to increase Alzheimer’s risk, greatly accelerates amyloid accumulation. The findings also confirm that amyloid screening, by PET scan or cerebrospinal fluid test, can help identify people for clinical trials of drugs to prevent Alzheimer’s. Such screening is increasingly used in research. Experts say previous trials of anti-amyloid drugs on people with dementia failed because their brains were already too damaged or because some patients, not screened for amyloid, may not have had Alzheimer’s. “The papers indicate that amyloid imaging is important to be sure that the drugs are being tested on people who have amyloid,” said Dr. Roger Rosenberg, the director of the Alzheimer’s Disease Center at the University of Texas Southwestern Medical Center at Dallas, who wrote an editorial about the studies. Dr. Samuel Gandy, an Alzheimer’s researcher at Mount Sinai Hospital, who was not involved in the research, said doctors “can feel fairly confident that amyloid is due to Alzheimer’s.” But he and others cautioned against screening most people without dementia because there is not yet a drug that prevents or treats Alzheimer’s, and amyloid scans are expensive and typically not covered by insurance. © 2015 The New York Times Company

Keyword: Alzheimers
Link ID: 20956 - Posted: 05.20.2015

by Ashley Yeager

This guest post is by SN's web producer Ashley Yeager, who can't remember ever not knowing how to swim.

Sometimes my brother-in-law will scoop up my 2-year-old niece and fly her around like Superwoman. She’ll start kicking her legs and swinging her arms like she’s swimming — especially when we say, “paddle, paddle, paddle.” My niece, Baby D, loves the water. She often looks like one of the kids captured in famed photographer Seth Casteel’s new book, Underwater Babies. But she probably won’t remember her first trips to the pool — she was only a few months old when her mom first took her swimming. Part of my sister’s reasoning for such an early start was standard water safety. Every day in the United States, accidental drowning claims the lives of two children under the age of 14 years. Our family spends a lot of time at the pool and the beach, so making sure Baby D is protected is a priority. But there’s another reason my sister was keen to get Baby D to the pool. Loosely based on something our mother told us, it’s that learning to swim early in life may give kids a head start in developing balance, body awareness and maybe even language and math skills. Mom may have been right. A multi-year study released in 2012 suggests that kids who take swim lessons early in life appear to hit certain developmental milestones well before their nonswimming peers. In the study, Australian researchers surveyed about 7,000 parents about their children’s development and gave 177 kids aged 3 to 5 years standard motor, language, memory and attention tests. Compared with kids who didn’t spend much time in the water, kids who had taken swim lessons seemed to be more advanced at tasks like running and climbing stairs and standing on their tiptoes or on one leg, along with drawing, handling scissors and building towers out of blocks. © Society for Science & the Public 2000 - 2015.

Keyword: Development of the Brain; Learning & Memory
Link ID: 20955 - Posted: 05.20.2015

An octopus filmed off the coast of Kalaoa in Hawaii has shown that even cephalopods can get into a game of peekaboo. In the footage, shot last month by the GoPro camera of diver Timothy Ewing, the octopus bobs up and down behind a rock as Ewing does the same in an effort to take the animal's picture. It's clear from the video that the octopus is wary of Ewing and his big, light-equipped camera — but the animal is also very curious. “Octopus are one of the more intelligent creatures in the ocean. Sometimes they are too curious for their own good. If you hide from them they will come out and look for you," the diver wrote in his online posting of the video. Ewing explained to CaliforniaDiver.com that the encounter wasn't limited to the time captured on his GoPro. "I was interacting with that octopus for about 10 minutes before I took the video," Ewing told CaliforniaDiver.com. "I normally mount my GoPro to my big camera housing, however I always carry a small tripod with me to use with the GoPro for stationary shots like this or selfie videos." The octopus, found worldwide in tropical, subtropical and temperate areas, is known for its smarts and striking ability to camouflage itself. When it feels threatened, pigment cells in its skin allow it to change color instantly to blend in with its surroundings. The animals can also adapt their skin texture and body posture to further match their background. © 2015 Discovery Communications, LLC.

Keyword: Learning & Memory; Intelligence
Link ID: 20954 - Posted: 05.20.2015

By James Gorman and Robin Lindsay

Before human ancestors started making stone tools by chipping off flakes to fashion hand axes and other implements, their ancestors may have used plain old stones, as animals do now. And even that simple step required the intelligence to see that a rock could be used to smash open a nut or an oyster and the muscle control to do it effectively. Researchers have been rigorous in documenting every use of tools they have found in animals, like crows, chimpanzees and dolphins. And they are now beginning to look at how tools are used by modern primates — part of the scientists’ search for clues about the evolution of the kind of delicate control required to make and use even the simplest hand axes. Monkeys do not exhibit human dexterity with tools, according to Madhur Mangalam of the University of Georgia, one of the authors of a recent study of how capuchin monkeys in Brazil crack open palm nuts. “Monkeys are working as blacksmiths,” he said. “They’re not working as goldsmiths.” But they are not just banging away haphazardly, either. Mr. Mangalam, a graduate student who is interested in “the evolution of precise movement,” reported in a recent issue of Current Biology on how capuchins handle stones. His adviser and co-author was Dorothy M. Fragaszy, the director of the Primate Behavior Laboratory at the university. Using video of the capuchins’ lifting rocks with both hands to slam them down on the hard palm nuts, he analyzed how high a monkey lifted a stone and how fast it brought it down. He found that the capuchins adjusted the force of a strike according to the condition of the nut after the previous strike. © 2015 The New York Times Company

Keyword: Evolution; Learning & Memory
Link ID: 20952 - Posted: 05.19.2015

Monica Tan

The age-old question of whether human traits are determined by nature or nurture has been answered, a team of researchers say. Their conclusion? It’s a draw. By collating almost every twin study across the world from the past 50 years, researchers determined that the average variation for human traits and disease is 49% due to genetic factors and 51% due to environmental factors. University of Queensland researcher Beben Benyamin from the Queensland Brain Institute collaborated with researchers at VU University of Amsterdam to collate 2,748 studies involving more than 14.5 million pairs of twins. “Twin studies have been conducted for more than 50 years but there is still some debate in terms of how much the variation is due to genetic or environmental factors,” Benyamin said. He said the study showed the conversation should move away from nature versus nurture, instead looking at how the two work together. “Both are important sources of variation between individuals,” he said. While the studies averaged an almost even split between nature and nurture, there was wide variation within the 17,800 separate traits and diseases examined by the studies. For example, the risk for bipolar disorder was found to be 68% due to genetics and only 32% due to environmental factors. Weight maintenance was 63% due to genetics and 37% due to environmental factors. In contrast, risk for eating disorders was found to be 40% genetic and 60% environmental, whereas the risk for mental and behavioural disorders due to use of alcohol was 41% genetic and 59% environmental. © 2015 Guardian News and Media Limited

Keyword: Genes & Behavior
Link ID: 20948 - Posted: 05.19.2015

RACHEL MARTIN, HOST: For most of her life, Cole Cohen had a hard time with all kinds of things. She'd get lost all of the time. She couldn't do math to save her life. The whole concept of time was hard for her to grasp. Her parents took her to doctor after doctor, and there were all kinds of tests and experiments with medication, but no real diagnosis until she was 26 years old. Cole Cohen got her first MRI and finally, there was an explanation. There was a hole in her brain; a hole in her brain the size of a lemon. Her memoir, titled "Head Case," is a darkly funny exploration of what that discovery meant to her. Cole Cohen joins us now. Thanks so much for being with us.

COLE COHEN: Thank you for having me, Rachel.

MARTIN: Let's talk about what life was like before this revelation. I mentioned your propensity to get lost. We're not talking about being in a new place and getting confused as a lot of us might do. You got lost in, like, big box stores that you had been to before. Can you describe that sensation, that feeling of not knowing where you are in a situation like that?

COHEN: Yeah. I know that sensation every time I go grocery shopping. You know, you want to get a jar of peanut butter. You have a memory of where that jar of peanut butter is, and I just don't have that in my brain. I don't store that information. So it's like a discovery every time.

MARTIN: I'd love for you to read an example of one of the symptoms. You have a hard time with numbers, even references to numbers. And you write about this in the book when you're taking driver's ed. Do you mind reading that bit? © 2015 NPR

Keyword: Learning & Memory
Link ID: 20942 - Posted: 05.18.2015

By JIM DWYER

The real world of our memory is made of bits of true facts, surrounded by holes that we Spackle over with guesses and beliefs and crowd-sourced rumors. On the dot of 10 on Wednesday morning, Anthony O’Grady, 26, stood in front of a Dunkin’ Donuts on Eighth Avenue in Manhattan. He heard a ruckus, some shouts, then saw a police officer chase a man into the street and shoot him down in the middle of the avenue. Moments later, Mr. O’Grady spoke to a reporter for The New York Times and said the wounded man was in flight when he was shot. “He looked like he was trying to get away from the officers,” Mr. O’Grady said. Another person on Eighth Avenue then, Sunny Khalsa, 41, had been riding her bicycle when she saw police officers and the man. Shaken by the encounter, she contacted the Times newsroom with a shocking detail. “I saw a man who was handcuffed being shot,” Ms. Khalsa said. “And I am sorry, maybe I am crazy, but that is what I saw.” At 3 p.m. on Wednesday, the Police Department released a surveillance videotape that showed that both Mr. O’Grady and Ms. Khalsa were wrong. Contrary to what Mr. O’Grady said, the man who was shot had not been trying to get away from the officers; he was actually chasing an officer from the sidewalk onto Eighth Avenue, swinging a hammer at her head. Behind both was the officer’s partner, who shot the man, David Baril. And Ms. Khalsa did not see Mr. Baril being shot while in handcuffs; he is, as the video and still photographs show, freely swinging the hammer, then lying on the ground with his arms at his side. He was handcuffed a few moments later, well after he had been shot. © 2015 The New York Times Company

Keyword: Learning & Memory
Link ID: 20939 - Posted: 05.16.2015

By Jonathan Webb, Science reporter, BBC News

A cluster of cells in the brain of a fly can track the animal's orientation like a compass, a study has revealed. Fixed in place on top of a spherical treadmill, a fruit fly walked on the spot while neuroscientists peered into its brain using a microscope. Watching the neurons fire inside a donut-shaped brain region, they saw activity sweep around the ring to match the direction the animal was headed. Mammals have similar "head direction cells" but this is a first for flies. The findings are reported in the journal Nature. Crucially, the compass-like activity took place not only when the animal was negotiating a virtual-reality environment, in which screens gave the illusion of movement, but also when it was left in the dark. "The fly is using a sense of its own motion to pick up which direction it's pointed," said senior author Dr Vivek Jayaraman, from the Howard Hughes Medical Institute's Janelia Research Campus. In some other insects, such as monarch butterflies and locusts, brain cells have been observed firing in a way that reflects the animal's orientation to the pattern of polarised light in the sky - a "sun compass". But the newly discovered compass in the fly brain works more like the "head direction cells" seen in mammals, which rapidly set up a directional system for the animal based on landmarks in the surrounding scene. "A key thing was incorporating the fly's own movement," Dr Jayaraman told the BBC. "To see that its own motion was relevant to the functioning of this compass - that was something we could only see if we did it in a behaving animal." © 2015 BBC

Keyword: Learning & Memory
Link ID: 20933 - Posted: 05.14.2015