Chapter 13. Memory, Learning, and Development

Nicola Davis Scientists have discovered 17 separate genetic variations that increase the risk of a person developing depression. The findings, which came from analysing DNA data collected from more than 300,000 people, are the first genetic links to the disease found in people of European ancestry. The scientists say the research will contribute to a better understanding of the disease and could eventually lead to new treatments. They also hope it will reduce the stigma that can accompany depression. According to NICE, up to 10% of people seen by practitioners in primary care have clinical depression, with symptoms including a continuously low mood, low self-esteem, difficulties making decisions and lack of energy. Both environmental and genetic factors are thought to be behind depression, with the interaction between the two also thought to be important. But with a large number of genetic variants each thought to make a tiny contribution to the risk of developing the condition, unravelling their identities has proved challenging. While previous studies have turned up a couple of regions in the genome of Chinese women that might increase the risk of depression, those variants didn’t appear to play a role in depression for people of European ancestry. © 2016 Guardian News and Media Limited

Keyword: Depression; Genes & Behavior
Link ID: 22504 - Posted: 08.02.2016

By Andy Coghlan Mysterious shrunken cells have been spotted in the human brain for the first time, and appear to be associated with Alzheimer’s disease. “We don’t know yet if they’re a cause or consequence,” says Marie-Ève Tremblay of Laval University in Québec, Canada, who presented her discovery at the Translational Neuroimmunology conference in Big Sky, Montana, last week. The cells appear to be withered forms of microglia – the cells that keep the brain tidy and free of infection, normally by pruning unwanted brain connections or destroying abnormal and infected brain cells. But the cells discovered by Tremblay appear much darker when viewed using an electron microscope, and they seem to be more destructive. “It took a long time for us to identify them,” says Tremblay, who adds that these shrunken microglia do not show up with the same staining chemicals that normally make microglia visible under the microscope. Compared with normal microglia, the dark cells appear to wrap much more tightly around neurons and the connections between them, called synapses. “It seems they’re hyperactive at synapses,” says Tremblay. Where these microglia are present, synapses often seem shrunken and in the process of being degraded. Tremblay first discovered these dark microglia in mice, finding that they increase in number as mice age, and appear to be linked to a number of things, including stress, the neurodegenerative condition Huntington’s disease and a mouse model of Alzheimer’s disease. “There were 10 times as many dark microglia in Alzheimer’s mice as in control mice,” says Tremblay. © Copyright Reed Business Information Ltd.

Keyword: Alzheimers; Glia
Link ID: 22503 - Posted: 08.02.2016

By Katherine S. Pollard When the first human genome sequence was published in 2001, I was a graduate student working as the statistics expert on a team of scientists. Hailing from academia and biotechnology, we aimed to discover differences in gene expression levels between tumors and healthy cells. Like many others, I had high hopes for what we could do with this enormous text file of more than 3 billion As, Cs, Ts, and Gs. Ambitious visions of a precise wiring diagram for human cells and imminent cures for disease were commonplace among my classmates and professors. But I was most excited about a different use of the data, and I found myself counting the months until the genome of a chimpanzee would be sequenced. Chimps are our closest living relatives on the tree of life. While their biology is largely similar to ours, we have many striking differences, ranging from digestive enzymes to spoken language. Humans also suffer from an array of diseases that do not afflict chimpanzees or are less severe in them, including autism, schizophrenia, Alzheimer’s disease, diabetes, atherosclerosis, AIDS, rheumatoid arthritis, and certain cancers. I had long been fascinated with hominin fossils and the way the bones morphed into different forms over evolutionary time. But those skeletons cannot tell us much about the history of our immune system or our cognitive abilities. So I started brainstorming about how to extend the statistical approaches we were using for cancer research to compare human and chimpanzee DNA. My immodest goal was to identify the genetic basis for all the traits that make humans unique. © 1986-2016 The Scientist

Keyword: Evolution; Genes & Behavior
Link ID: 22502 - Posted: 08.02.2016

By Bahar Gholipour After reflexively reaching out to grab a hot pan falling from the stove, you may be able to withdraw your hand at the very last moment to avoid getting burned. That is because the brain's executive control can step in to break a chain of automatic commands. Several new lines of evidence suggest that the same may be true when it comes to the reflex of recollection—and that the brain can halt the spontaneous retrieval of potentially painful memories. Within the brain, memories sit in a web of interconnected information. As a result, one memory can trigger another, making it bubble up to the surface without any conscious effort. “When you get a reminder, the mind's automatic response is to do you a favor by trying to deliver the thing that's associated with it,” says Michael Anderson, a neuroscientist at the University of Cambridge. “But sometimes we are reminded of things we would rather not think about.” Humans are not helpless against this process, however. Previous imaging studies suggest that the brain's frontal areas can dampen the activity of the hippocampus, a crucial structure for memory, and therefore suppress retrieval. In an effort to learn more, Anderson and his colleagues recently investigated what happens after the hippocampus is suppressed. They asked 381 college students to learn pairs of loosely related words. Later, the students were shown one word and asked to recall the other—or to do the opposite and to actively not think about the other word. Sometimes between these tasks they were shown unusual images, such as a peacock standing in a parking lot. © 2016 Scientific American

Keyword: Learning & Memory
Link ID: 22500 - Posted: 08.01.2016

Aaron E. Carroll I remember thinking, after my pregnant wife’s water broke, minutes after I went to bed, anguishing really, over one thought as we drove to the hospital: “I’m never going to be well rested again.” If there’s one thing all new parents wish for, it’s a good night’s sleep. Unfortunately, infants sometimes make that impossible. They wake up repeatedly, needing to be fed, changed and comforted. Eventually, they reach an age when they should sleep through the night. Some don’t, though. What to do with them continues to be a topic of heated debate in parenting circles. One camp believes that babies should be left to cry it out. These people place babies in their cribs at a certain time, after a certain routine, and don’t interfere until the next morning. No matter how much the babies scream or cry, parents ignore them. After all, if babies learn that tantrums lead to the appearance of a loved one, they will continue that behavior in the future. The official name for this intervention is “Extinction.” The downside, of course, is that it’s unbelievably stressful for parents. Many can’t do it. And not holding fast to the plan can make everything worse. Responding to an infant’s crying after an extended period of time makes the behavior harder to extinguish. To a baby, it’s like a slot machine that hits just as you’re ready to walk away; it makes you want to play more. A modification of this strategy is known as “Graduated Extinction.” Parents allow their infant to cry it out for a longer period each night, until infants eventually put themselves to sleep. On the first night, for instance, parents might commit to not entering the baby’s room for five minutes. The next night, 10 minutes. Then 15, and so on. Or, they could increase the increments on progressive checks each night. When they do go in the room, it’s only to check and make sure the baby is O.K. – no picking up or comforting. This isn’t meant to be a reward for crying, but to allow parents to be assured that nothing is wrong. © 2016 The New York Times Company

Keyword: Sleep; Development of the Brain
Link ID: 22499 - Posted: 08.01.2016

By Richard Kemeny Sleep is essential for memory. Mounting evidence continues to support the notion that the nocturnal brain replays, stabilizes, reorganizes, and strengthens memories while the body is at rest. Recently, one particular facet of this process has piqued the interest of a growing group of neuroscientists: sleep spindles. For years these brief bursts of brain activity have been largely ignored. Now it seems that examining these neuronal pulses could help researchers better understand—perhaps even treat—cognitive impairments. Sleep spindles are a defining characteristic of stage 2 non-rapid eye movement (NREM) sleep. These electrical bursts of 10 to 16 Hz last only around a second, and are known to occur in the human brain thousands of times per night. Generated by a thin net of neurons enveloping the thalamus, spindles appear across several regions of the brain, and are thought to perform various functions, including maintaining sleep in the face of disturbances in the environment. It appears they are also a fundamental part of the process by which the human brain consolidates memories during sleep. A memory formed during the day is stored temporarily in the hippocampus, before being spontaneously replayed during the night. Information about the memory is distributed and integrated into the neocortex through an orchestra of slow waves, spindles, and rapid hippocampal ripples. Spindles, it seems, could be a guiding force—providing the plasticity and coordination needed for this delicate, interregional transfer of information. © 1986-2016 The Scientist
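
The quoted numbers (10 to 16 Hz bursts lasting around a second) are, in fact, how spindles are typically picked out of an EEG trace. Below is a minimal sketch of that classic envelope-threshold approach: band-pass the signal, take the amplitude envelope, and keep supra-threshold runs of roughly spindle length. The sampling rate, threshold, and duration bounds are illustrative assumptions, not values from the article, and this is not a validated clinical detector.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def detect_spindles(eeg, fs=256, lo=10.0, hi=16.0,
                    thresh_sd=2.0, min_dur=0.5, max_dur=2.0):
    """Return (start, end) sample indices of candidate sleep spindles."""
    b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
    band = filtfilt(b, a, eeg)            # isolate the 10-16 Hz component
    env = np.abs(hilbert(band))           # instantaneous amplitude envelope
    above = env > env.mean() + thresh_sd * env.std()

    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i                     # burst begins
        elif not flag and start is not None:
            if min_dur <= (i - start) / fs <= max_dur:
                events.append((start, i)) # kept: lasts roughly a second
            start = None
    return events
```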

Keyword: Sleep; Learning & Memory
Link ID: 22494 - Posted: 07.30.2016

By Tanya Lewis The tangled buildup of tau protein in brain cells is a hallmark of the cognitive decline linked with Alzheimer’s disease. Antibodies have been shown to block tau’s spread, but some scientists worry they could also fuel inflammation. Now, researchers from Genentech in San Francisco and colleagues have found that an antibody’s ability to recruit immune cells—known as its effector function—is not necessary for stopping tau’s spread, the team reported today (July 28) in Cell Reports. “Our results suggest that, given that effector function is not required for efficacy [in treating tau accumulation], going without it could offer a safer approach for immunotherapy,” study coauthor Gai Ayalon of Genentech told The Scientist. Alzheimer’s disease causes a characteristic constellation of pathologies: accumulation of amyloid-β plaques outside neurons, neurofibrillary tangles of tau inside brain cells, and chronic inflammation. Clinical research has mostly focused on targeting amyloid-β with antibody therapies, and several treatments based on this approach are currently in clinical trials. But recent efforts have zeroed in on tau as a new potential target. Antibodies are known to spur the brain’s defense system, microglia, to absorb and degrade tau, but their recruitment of immune cells may also worsen inflammation. Ayalon and colleagues wondered whether effector function was necessary for stopping tau’s spread. © 1986-2016 The Scientist

Keyword: Alzheimers; Neuroimmunology
Link ID: 22492 - Posted: 07.30.2016

By ANDREW POLLACK A new type of drug for Alzheimer’s disease failed to slow the rate of decline in mental ability and daily functioning in its first large clinical trial. There was a hint, though, that it might be effective for certain patients. The drug, called LMTX, is the first one with its mode of action — trying to undo so-called tau tangles in the brain — to reach the final stage of clinical trials. So the results of the study were eagerly awaited. The initial reaction to the outcome was disappointment, with perhaps a glimmer of hopefulness. Over all, the patients who received LMTX, which was developed by TauRx Therapeutics, did not have a slower rate of decline in mental ability or daily functioning than those in the control group. However, the drug did seem to work for the subset of patients — about 15 percent of those in the study — who took LMTX as their only therapy. The other 85 percent of patients took an existing Alzheimer’s drug in addition to either LMTX or a placebo. “There were highly significant, clinically meaningful, large effects in patients taking the drug as monotherapy, and no effect in patients taking it as an add-on,” Claude Wischik, a founder and the chief executive of TauRx, said in an interview. He spoke from Toronto, where the results were being presented at the Alzheimer’s Association International Conference. Dr. Wischik said a second clinical trial sponsored by the company, whose results will be announced later, found the same phenomenon. He said the company planned to apply for approval of LMTX to be used by itself. But some experts not involved in the study were skeptical about drawing conclusions from a small subset of patients, especially since there was no obvious explanation why LMTX would be expected to work only in patients not getting other drugs. © 2016 The New York Times Company

Keyword: Alzheimers
Link ID: 22488 - Posted: 07.28.2016

Ian Sample and Nicky Woolf When Bill Gates pulled on a red and white-striped cord to upturn a bucket of iced water positioned delicately over his head, the most immediate thought for many was not, perhaps, of motor neurone disease. But the ice bucket challenge, the charity campaign that went viral in the summer of 2014 and left scores of notable persons from Gates and Mark Zuckerberg to George W. Bush and Anna Wintour shivering and drenched, has paid off in the most spectacular way. Dismissed by some at the time as “slacktivism” - an exercise that appears to do good while achieving very little - the ice bucket challenge raised more than $115m (£88m) for motor neurone disease in a single month. Now, scientists funded with the proceeds have discovered a gene variant associated with the condition. In the near term the NEK1 gene variant, described in the journal Nature Genetics this week, will help scientists understand how the incurable disorder, known also as Amyotrophic Lateral Sclerosis (ALS) or Lou Gehrig’s disease, takes hold. Once the mechanisms are more clearly elucidated, it may steer researchers on a path towards much-needed treatments. The work may never have happened were it not for the curious appeal of the frozen water drenchings. The research grants that scientists are awarded do not get close to the €4m the study required. Instead, Project MinE, which aims to unravel the genetic basis of the disease and ultimately find a cure, was funded by the ALS Association through ice bucket challenge donations. © 2016 Guardian News and Media Limited

Keyword: ALS-Lou Gehrig's Disease ; Genes & Behavior
Link ID: 22487 - Posted: 07.28.2016

By Gretchen Reynolds Learning requires more than the acquisition of unfamiliar knowledge; that new information or know-how, if it’s to be more than ephemeral, must be consolidated and securely stored in long-term memory. Mental repetition is one way to do that, of course. But mounting scientific evidence suggests that what we do physically also plays an important role in this process. Sleep, for instance, reinforces memory. And recent experiments show that when mice and rats jog on running wheels after acquiring a new skill, they learn much better than sedentary rodents do. Exercise seems to increase the production of biochemicals in the body and brain related to mental function. Researchers at the Donders Institute for Brain, Cognition and Behavior at Radboud University in the Netherlands and the University of Edinburgh have begun to explore this connection. For a study published this month in Current Biology, 72 healthy adult men and women spent about 40 minutes undergoing a standard test of visual and spatial learning. They observed pictures on a computer screen and then were asked to remember their locations. Afterward, the subjects all watched nature documentaries. Two-thirds of them also exercised: Half were first put through interval training on exercise bicycles for 35 minutes immediately after completing the test; the others did the same workout four hours after the test. Two days later, everyone returned to the lab and repeated the original computerized test while an M.R.I. machine scanned their brain activity. Those who exercised four hours after the test recognized and recreated the picture locations most accurately. Their brain activity was subtly different, too, showing a more consistent pattern of neural activity. The study’s authors suggest that their brains might have been functioning more efficiently because they had learned the patterns so fully. But why delaying exercise for four hours was more effective than an immediate workout remains mysterious. By contrast, rodents do better in many experiments if they work out right after learning. © 2016 The New York Times Company

Keyword: Learning & Memory
Link ID: 22486 - Posted: 07.28.2016

Jon Hamilton Two studies released at an international Alzheimer's meeting Tuesday suggest doctors may eventually be able to screen people for this form of dementia by testing the ability to identify familiar odors, like smoke, coffee and raspberry. In both studies, people who were in their 60s and older took a standard odor detection test. And in both cases, those who did poorly on the test were more likely to already have — or go on to develop — problems with memory and thinking. "The whole idea is to create tests that a general clinician can use in an office setting," says Dr. William Kreisl, a neurologist at Columbia University, where both studies were done. The research was presented at the Alzheimer's Association International Conference in Toronto. Currently, any tests that are able to spot people in the earliest stages of Alzheimer's are costly and difficult. They include PET scans, which can detect sticky plaques in the brain, and spinal taps that measure the levels of certain proteins in spinal fluid. The idea of an odor detection test arose, in part, from something doctors have observed for many years in patients with Alzheimer's, Kreisl says. "Patients will tell us that food does not taste as good," he says. The reason is often that these patients have lost the ability to smell what they eat. That's not surprising, Kreisl says, given that odor signals from the nose have to be processed in areas of the brain that are among the first to be affected by Alzheimer's disease. But it's been tricky to develop a reliable screening test using odor detection. © 2016 npr

Keyword: Alzheimers; Chemical Senses (Smell & Taste)
Link ID: 22485 - Posted: 07.27.2016

By PAM BELLUCK “Has the person become agitated, aggressive, irritable, or temperamental?” the questionnaire asks. “Does she/he have unrealistic beliefs about her/his power, wealth or skills?” Or maybe another kind of personality change has happened: “Does she/he no longer care about anything?” If the answer is yes to one of these questions — or others on a new checklist — and the personality or behavior change has lasted for months, it could indicate a very early stage of dementia, according to a group of neuropsychiatrists and Alzheimer’s experts. They are proposing the creation of a new diagnosis: mild behavioral impairment. The idea is to recognize and measure something that some experts say is often overlooked: Sharp changes in mood and behavior may precede the memory and thinking problems of dementia. The group made the proposal on Sunday at the Alzheimer’s Association International Conference in Toronto, and presented a 38-question checklist that may one day be used to identify people at greater risk for Alzheimer’s. “I think we do need something like this,” said Nina Silverberg, the director of the Alzheimer’s Disease Centers program at the National Institute on Aging, who was not involved in creating the checklist or the proposed new diagnosis. “Most people think of Alzheimer’s as primarily a memory disorder, but we do know from years of research that it also can start as a behavioral issue.” Under the proposal, mild behavioral impairment (M.B.I.) would be a clinical designation preceding mild cognitive impairment (M.C.I.), a diagnosis created more than a decade ago to describe people experiencing some cognitive problems but who can still perform most daily functions. © 2016 The New York Times Company

Keyword: Alzheimers
Link ID: 22480 - Posted: 07.26.2016

By Sharon Begley, STAT For the first time ever, researchers have managed to reduce people’s risk for dementia — not through a medicine, special diet, or exercise, but by having healthy older adults play a computer-based brain-training game. The training nearly halved the incidence of Alzheimer’s disease and other devastating forms of cognitive and memory loss in older adults a decade after they completed it, scientists reported on Sunday. If the surprising finding holds up, the intervention would be the first of any kind — including drugs, diet, and exercise — to do that. “I think these results are highly, highly promising,” said George Rebok of the Johns Hopkins Bloomberg School of Public Health, an expert on cognitive aging who was not involved in the study. “It’s exciting that this intervention pays dividends so far down the line.” The results, presented at the Alzheimer’s Association International Conference in Toronto, come from the government-funded ACTIVE (Advanced Cognitive Training for Independent and Vital Elderly) study. Starting in 1998, ACTIVE’s 2,832 healthy older adults (average age at the start: 74) received one of three forms of cognitive training, or none, and were evaluated periodically in the years after. In actual numbers, 14 percent of ACTIVE participants who received no training had dementia 10 years later, said psychologist Jerri Edwards of the University of South Florida, who led the study. Among those who completed up to ten 60-to-75-minute sessions of computer-based training in speed-of-processing — basically, how quickly and accurately they can pay attention to, process, and remember brief images on a computer screen — 12.1 percent developed dementia. Of those who completed all 10 initial training sessions plus four booster sessions a few years later, 8.2 percent developed dementia. © 2016 Scientific American
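
The “nearly halved” claim can be checked with simple arithmetic on the incidence figures quoted above. The sketch below just computes relative risk reductions from those reported percentages; it is not a reanalysis of the ACTIVE data.

```python
control = 0.140   # no training: 14% developed dementia
partial = 0.121   # up to 10 speed-of-processing sessions
boosted = 0.082   # all 10 sessions plus 4 booster sessions

for label, p in [("training only", partial), ("training + boosters", boosted)]:
    rrr = (control - p) / control
    print(f"{label}: relative risk reduction = {rrr:.0%}")

# training only:       ~14% reduction
# training + boosters: ~41% reduction, i.e. incidence "nearly halved"
```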

Keyword: Alzheimers; Learning & Memory
Link ID: 22479 - Posted: 07.26.2016

By Tim Page When I returned to California, I brought my diaries into the back yard every afternoon and read them through sequentially, with the hope of learning more about the years before my brain injury. I remembered much of what I’d done professionally, and whatever additional information I needed could usually be found on my constantly vandalized Wikipedia page. Here was the story of an awkward, imperious child prodigy who made his own films and became famous much too early; a music explainer who won a Pulitzer Prize; a driven and obsessive loner whose fascinations led to collaborations with Glenn Gould, Philip Glass and Thomas Pynchon. In 2000, at age 45, I was diagnosed with Asperger’s syndrome. In retrospect, the only surprise is that it took so long. But the diaries offered a more intimate view. Reading them was slow going, and I felt as though my nose was pressed up against the windowpane of my own life. The shaggy-dog accretion of material — phone numbers, long-ago concert dates, coded references to secret loves — all seemed to belong to somebody else. My last clear memory was of a muggy, quiet Sunday morning in July, three months earlier, as I waited for a train in New London, Conn. It was 11:13 a.m., and the train was due to arrive two minutes later. I was contented, proud of my punctuality and expecting an easy ride to New York in the designated “quiet car,” with just enough time to finish whatever book I was carrying. There would be dinner in Midtown with a magical friend, followed by overnight family visits in Baltimore and Washington, and then a flight back to Los Angeles and the University of Southern California, at which point a sabbatical semester would be at an end.

Keyword: Stroke; Learning & Memory
Link ID: 22478 - Posted: 07.26.2016

Dean Burnett On July 31st 2016, this blog will have been in existence for four years exactly. A huge thanks to everyone who’s made the effort to read it in that time (an alarming number of you). Normally there’d be a post on the day to mark the occasion, but this year the 31st is a) a Sunday, and b) my birthday, so even if I could be bothered to work that day, it’s unlikely anyone would want to read it. However, today also marks the ridiculously-unlikely-but-here-we-are American release of my book. How did it get to this point? I’ve been a “professional” science writer now for four years, and I’ve been involved in neuroscience, in one guise or another, since 2000, the year I started my undergraduate degree. In that time, I’ve heard/encountered some seriously bizarre claims about how the brain works. Oftentimes it was me not understanding what was being said, or misinterpreting a paper, or just my own lack of competence. Sometimes, it was just a media exaggeration. However, there have been occasions when a claim made about the brain thwarts all my efforts to find published evidence or even a rational basis for it, leaving me scratching my head and wondering “where the hell did THAT come from?” Here are some of my favourites. In the past, one terabyte of storage capacity would have seemed incredibly impressive. But Moore’s law has put paid to that. My home desktop PC presently has 1.5 TB of storage space, and that’s over seven years old. Could my own clunky desktop be, in terms of information capacity, smarter than me? Apparently. Some estimates put the capacity of the human brain as low as 1TB. A lifetime’s worth of memories wouldn’t fill a modern-day hard drive? That seems far-fetched, at least at an intuitive level.
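
The 1TB figure invites a back-of-envelope check. Every number in the sketch below is an assumption: the neuron count is a common rough estimate, the synapse count an order-of-magnitude guess, and bits-per-synapse is genuinely unknown, which is exactly why published capacity estimates span several orders of magnitude.

```python
NEURONS = 86e9                # ~86 billion neurons (common rough estimate)
SYNAPSES_PER_NEURON = 1_000   # order-of-magnitude assumption
BITS_PER_SYNAPSE = (1, 4.7)   # 1 bit vs. ~4.7 bits (one published estimate)

for bits in BITS_PER_SYNAPSE:
    terabytes = NEURONS * SYNAPSES_PER_NEURON * bits / 8 / 1e12
    print(f"{bits} bit(s) per synapse -> ~{terabytes:.0f} TB")

# 1   bit/synapse  -> ~11 TB
# 4.7 bits/synapse -> ~51 TB, already far beyond the 1TB claim
```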

Keyword: Development of the Brain
Link ID: 22477 - Posted: 07.26.2016

By Dave Dormer Transporting babies deprived of oxygen at birth to a neonatal intensive care unit in Calgary will soon be safer thanks to a new portable cooling device. The Foothills hospital is one of the first facilities in Canada to acquire one, and doctors hope it will help prevent brain injuries: reducing a baby's temperature, a treatment called therapeutic hypothermia, can prevent damage to brain tissue and promote healing. "The period immediately following birth is critical. We have about a six-hour window to lower these babies' temperatures to prevent neurological damage," said Dr. Khorshid Mohammad, the neonatal neurocritical care project lead who spearheaded the initiative. "The sooner we can do so, and the more consistent we can make the temperature, the more protective it is and the better their chances of surviving without injury." Since about 2008, doctors have used cooling blankets and gel packs to lower a baby's temperature to 33.5 C from the normal 37 C for 72 hours in order to prevent brain damage. "With those methods, it can be difficult to maintain a stable temperature," said Mohammad. ©2016 CBC/Radio-Canada.

Keyword: Development of the Brain
Link ID: 22476 - Posted: 07.26.2016

By Andy Coghlan The final brain edit before adulthood has been observed for the first time. MRI scans of 300 adolescents and young adults have shown how the teenage brain upgrades itself to become quicker – but that errors in this process may lead to schizophrenia in later life. The editing process that takes place in teen years seems to select the brain’s best connections and networks, says Kirstie Whitaker at the University of Cambridge. “The result is a brain that’s sleeker and more efficient.” When Whitaker and her team scanned brains from people between the ages of 14 and 24, they found that two major changes take place in the outer layer of the brain – the cortex – at this time. As adolescence progresses, this layer of grey matter gets thinner – probably because unwanted or unused connections between neurons – called synapses – are pruned back. At the same time, important neurons are upgraded. The parts of these cells that carry signals down towards synapses are given a sheath that helps them transmit signals more quickly – a process called myelination. “It may be that pruning and myelination are part of the maturation of the brain,” says Steven McCarroll at Harvard Medical School. “Pruning involves removing the connections that are not used, and myelination takes the ones that are left and makes them faster,” he says. McCarroll describes this as a trade-off – by pruning connections, we lose some flexibility in the brain, but the proficiency of signal transmission improves. © Copyright Reed Business Information Ltd.

Keyword: Development of the Brain
Link ID: 22474 - Posted: 07.26.2016

By Lizzie Wade Neandertals and modern humans had a lot in common—at least enough to have babies together fairly often. But what about their brains? To answer that question, scientists have looked at how Neandertal and modern human brains developed during the crucial time of early childhood. In the first year of life, modern human infants go through a growth spurt in several parts of the brain: the cerebellum, the parietal lobes, and the temporal lobes—key regions for language and social interaction. Past studies suggested baby Neandertal brains developed more like the brains of chimpanzees, without concentrated growth in any particular area. But a new study casts doubt on that idea. Scientists examined 15 Neandertal skulls, including one newborn and a pair of children under the age of 2. By carefully imaging the skulls, the team determined that Neandertal temporal lobes, frontal lobes, and cerebellums did, in fact, grow faster than the rest of the brain in early life, a pattern very similar to modern humans, they report today in Current Biology. Scientists had overlooked that possibility, the researchers say, because Neandertals and Homo sapiens have such differently shaped skulls. Modern humans’ rounded skull is a telltale marker of the growth spurt, for example, whereas Neandertals’ skulls were relatively flat on the top. If Neandertals did, in fact, have fast developing cerebellums and temporal and frontal lobes, they might have been more skilled at language and socializing than assumed, scientists say. This could in turn explain how the children of Neandertal–modern human pairings fared well enough to pass down their genes to so many of us living today. © 2016 American Association for the Advancement of Science

Keyword: Evolution; Development of the Brain
Link ID: 22473 - Posted: 07.26.2016

By Tanya Lewis Scientists have made significant progress toward understanding how individual memories are formed, but less is known about how multiple memories interact. Researchers from the Hospital for Sick Children in Toronto and colleagues studied how memories are encoded in the amygdalas of mice. Memories formed within six hours of each other activate the same population of neurons, whereas distinct sets of brain cells encode memories formed farther apart, in a process whereby neurons compete with their neighbors, according to the team’s study, published today (July 21) in Science. “Some memories naturally go together,” study coauthor Sheena Josselyn of the Hospital for Sick Children told The Scientist. For example, you may remember walking down the aisle at your wedding ceremony and, later, your friend having a bit too much to drink at the reception. “We’re wondering about how these memories become linked in your mind,” Josselyn said. When the brain forms a memory, a group of neurons called an “engram” stores that information. Neurons in the lateral amygdala—a brain region involved in memory of fearful events—are thought to compete with one another to form an engram. Cells that are more excitable or have higher expression of the transcription factor CREB—which is critical for the formation of long-term memories—at the time the memory is being formed will “win” this competition and become part of a memory. © 1986-2016 The Scientist
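
A toy sketch of the allocation idea described above, offered as an illustration rather than the study's actual model: when a memory is encoded, the most excitable neurons (here a stand-in for high CREB expression) win the competition and join the engram, so two memories formed close together in time, before excitability has drifted, end up sharing neurons. Population size, engram size, and the drift term are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 1000, 100                      # toy population; 100 winners per engram
excitability = rng.random(N)          # stand-in for CREB-driven excitability

def encode(excitability, k=K):
    """The k most excitable neurons win the competition for the engram."""
    return set(np.argsort(excitability)[-k:])

memory_a = encode(excitability)
memory_b = encode(excitability + 0.01 * rng.random(N))  # soon after: little drift
memory_c = encode(rng.random(N))                        # much later: reshuffled

print(len(memory_a & memory_b))  # large overlap -> the memories become linked
print(len(memory_a & memory_c))  # ~K*K/N = 10 by chance -> distinct engrams
```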

Keyword: Learning & Memory
Link ID: 22467 - Posted: 07.23.2016

By Minaz Kerawala For years, gamers, athletes and even regular people trying to improve their memory have resorted, with electrified enthusiasm, to "brain zapping" to gain an edge. The procedure, called transcranial direct current stimulation (tDCS), uses a battery and electrodes to deliver electrical pulses to the brain, usually through a cap or headset fitted close to the scalp. Proponents say these currents are beneficial for a range of neurological conditions like Alzheimer's and Parkinson's diseases, stroke and schizophrenia, but experts are warning that too little is known about the safety of tDCS. "You might end up with a placement of electrodes that doesn't do what you think it does and could potentially have long-lasting effects," said Matthew Krause, a neuroscientist at the Montreal Neurological Institute. All functions of the brain—thought, emotion and coordination—are carried out by neurons using pulses of electricity. "The objective of all neuroscience is to influence these electrical processes," Krause said. The brain's activity can be influenced by drugs that alter its electrochemistry or by external electric fields. While mind-altering headsets may seem futuristic, tDCS is not a new procedure. Much of the pioneering work in the field was done in Montreal by Dr. Wilder Penfield in the 1920s and 30s. ©2016 CBC/Radio-Canada.

Keyword: Alzheimers
Link ID: 22464 - Posted: 07.21.2016