Chapter 13. Memory, Learning, and Development
By ANDREW POLLACK

A new type of drug for Alzheimer’s disease failed to slow the rate of decline in mental ability and daily functioning in its first large clinical trial. There was a hint, though, that it might be effective for certain patients.

The drug, called LMTX, is the first one with its mode of action — trying to undo so-called tau tangles in the brain — to reach the final stage of clinical trials, so the results of the study were eagerly awaited. The initial reaction to the outcome was disappointment, with perhaps a glimmer of hope.

Overall, the patients who received LMTX, which was developed by TauRx Therapeutics, did not have a slower rate of decline in mental ability or daily functioning than those in the control group. However, the drug did seem to work for the subset of patients — about 15 percent of those in the study — who took LMTX as their only therapy. The other 85 percent of patients took an existing Alzheimer’s drug in addition to either LMTX or a placebo.

“There were highly significant, clinically meaningful, large effects in patients taking the drug as monotherapy, and no effect in patients taking it as an add-on,” Claude Wischik, a founder and the chief executive of TauRx, said in an interview. He spoke from Toronto, where the results were being presented at the Alzheimer’s Association International Conference.

Dr. Wischik said a second clinical trial sponsored by the company, whose results will be announced later, found the same phenomenon. He said the company planned to apply for approval of LMTX to be used by itself. But some experts not involved in the study were skeptical about drawing conclusions from a small subset of patients, especially since there was no obvious explanation why LMTX would be expected to work only in patients not getting other drugs.

© 2016 The New York Times Company
Link ID: 22488 - Posted: 07.28.2016
Ian Sample and Nicky Woolf

When Bill Gates pulled on a red-and-white-striped cord to upturn a bucket of iced water positioned delicately over his head, the most immediate thought for many was not, perhaps, of motor neurone disease. But the ice bucket challenge, the charity campaign that went viral in the summer of 2014 and left scores of notable figures, from Gates and Mark Zuckerberg to George W. Bush and Anna Wintour, shivering and drenched, has paid off in the most spectacular way.

Dismissed by some at the time as “slacktivism” – an exercise that appears to do good while achieving very little – the ice bucket challenge raised more than $115m (£88m) for motor neurone disease in a single month. Now, scientists funded with the proceeds have discovered a gene variant associated with the condition.

In the near term, the NEK1 gene variant, described in the journal Nature Genetics this week, will help scientists understand how the incurable disorder, also known as amyotrophic lateral sclerosis (ALS) or Lou Gehrig’s disease, takes hold. Once the mechanisms are more clearly elucidated, it may steer researchers on a path towards much-needed treatments.

The work might never have happened were it not for the curious appeal of the frozen drenchings. Typical research grants do not come close to the €4m the study required. Instead, Project MinE, which aims to unravel the genetic basis of the disease and ultimately find a cure, was funded by the ALS Association through ice bucket challenge donations.

© 2016 Guardian News and Media Limited
By Gretchen Reynolds

Learning requires more than the acquisition of unfamiliar knowledge; that new information or know-how, if it’s to be more than ephemeral, must be consolidated and securely stored in long-term memory. Mental repetition is one way to do that, of course. But mounting scientific evidence suggests that what we do physically also plays an important role in this process. Sleep, for instance, reinforces memory. And recent experiments show that when mice and rats jog on running wheels after acquiring a new skill, they learn much better than sedentary rodents do. Exercise seems to increase the production of biochemicals in the body and brain related to mental function.

Researchers at the Donders Institute for Brain, Cognition and Behavior at Radboud University in the Netherlands and the University of Edinburgh have begun to explore this connection. For a study published this month in Current Biology, 72 healthy adult men and women spent about 40 minutes undergoing a standard test of visual and spatial learning. They observed pictures on a computer screen and then were asked to remember their locations. Afterward, the subjects all watched nature documentaries. Two-thirds of them also exercised: half were put through interval training on exercise bicycles for 35 minutes immediately after completing the test; the others did the same workout four hours after the test.

Two days later, everyone returned to the lab and repeated the original computerized test while an M.R.I. machine scanned their brain activity. Those who exercised four hours after the test recognized and recreated the picture locations most accurately. Their brain activity was subtly different, too, showing a more consistent pattern of neural activity. The study’s authors suggest that their brains might have been functioning more efficiently because they had learned the patterns so fully.
But why delaying exercise for four hours was more effective than an immediate workout remains mysterious. By contrast, rodents do better in many experiments if they work out right after learning. © 2016 The New York Times Company
Keyword: Learning & Memory
Link ID: 22486 - Posted: 07.28.2016
Jon Hamilton

Two studies released at an international Alzheimer's meeting Tuesday suggest doctors may eventually be able to screen people for this form of dementia by testing the ability to identify familiar odors, like smoke, coffee and raspberry. In both studies, people who were in their 60s and older took a standard odor detection test. And in both cases, those who did poorly on the test were more likely to already have — or go on to develop — problems with memory and thinking.

"The whole idea is to create tests that a general clinician can use in an office setting," says Dr. William Kreisl, a neurologist at Columbia University, where both studies were done. The research was presented at the Alzheimer's Association International Conference in Toronto.

Currently, any tests that are able to spot people in the earliest stages of Alzheimer's are costly and difficult. They include PET scans, which can detect sticky plaques in the brain, and spinal taps that measure the levels of certain proteins in spinal fluid.

The idea of an odor detection test arose, in part, from something doctors have observed for many years in patients with Alzheimer's, Kreisl says. "Patients will tell us that food does not taste as good," he says. The reason is often that these patients have lost the ability to smell what they eat. That's not surprising, Kreisl says, given that odor signals from the nose have to be processed in areas of the brain that are among the first to be affected by Alzheimer's disease. But it's been tricky to develop a reliable screening test using odor detection.

© 2016 npr
By PAM BELLUCK

“Has the person become agitated, aggressive, irritable, or temperamental?” the questionnaire asks. “Does she/he have unrealistic beliefs about her/his power, wealth or skills?” Or maybe another kind of personality change has happened: “Does she/he no longer care about anything?”

If the answer is yes to one of these questions — or others on a new checklist — and the personality or behavior change has lasted for months, it could indicate a very early stage of dementia, according to a group of neuropsychiatrists and Alzheimer’s experts. They are proposing the creation of a new diagnosis: mild behavioral impairment. The idea is to recognize and measure something that some experts say is often overlooked: sharp changes in mood and behavior may precede the memory and thinking problems of dementia.

The group made the proposal on Sunday at the Alzheimer’s Association International Conference in Toronto, and presented a 38-question checklist that may one day be used to identify people at greater risk for Alzheimer’s.

“I think we do need something like this,” said Nina Silverberg, the director of the Alzheimer’s Disease Centers program at the National Institute on Aging, who was not involved in creating the checklist or the proposed new diagnosis. “Most people think of Alzheimer’s as primarily a memory disorder, but we do know from years of research that it also can start as a behavioral issue.”

Under the proposal, mild behavioral impairment (M.B.I.) would be a clinical designation preceding mild cognitive impairment (M.C.I.), a diagnosis created more than a decade ago to describe people experiencing some cognitive problems but who can still perform most daily functions.

© 2016 The New York Times Company
Link ID: 22480 - Posted: 07.26.2016
By Sharon Begley, STAT

For the first time, researchers have managed to reduce people’s risk for dementia — not through a medicine, special diet, or exercise, but by having healthy older adults play a computer-based brain-training game. The training nearly halved the incidence of Alzheimer’s disease and other devastating forms of cognitive and memory loss in older adults a decade after they completed it, scientists reported on Sunday. If the surprising finding holds up, the intervention would be the first of any kind — including drugs, diet, and exercise — to do that.

“I think these results are highly, highly promising,” said George Rebok of the Johns Hopkins Bloomberg School of Public Health, an expert on cognitive aging who was not involved in the study. “It’s exciting that this intervention pays dividends so far down the line.”

The results, presented at the Alzheimer’s Association International Conference in Toronto, come from the government-funded ACTIVE (Advanced Cognitive Training for Independent and Vital Elderly) study. Starting in 1998, ACTIVE’s 2,832 healthy older adults (average age at the start: 74) received one of three forms of cognitive training, or none, and were evaluated periodically in the years that followed.

In actual numbers, 14 percent of ACTIVE participants who received no training had dementia 10 years later, said psychologist Jerri Edwards of the University of South Florida, who led the study. Among those who completed up to ten 60-to-75-minute sessions of computer-based training in speed of processing — basically, how quickly and accurately they can pay attention to, process, and remember brief images on a computer screen — 12.1 percent developed dementia. Of those who completed all ten initial training sessions plus four booster sessions a few years later, 8.2 percent developed dementia.

© 2016 Scientific American
By Tim Page

When I returned to California, I brought my diaries into the back yard every afternoon and read them through sequentially, with the hope of learning more about the years before my brain injury. I remembered much of what I’d done professionally, and whatever additional information I needed could usually be found on my constantly vandalized Wikipedia page. Here was the story of an awkward, imperious child prodigy who made his own films and became famous much too early; a music explainer who won a Pulitzer Prize; a driven and obsessive loner whose fascinations led to collaborations with Glenn Gould, Philip Glass and Thomas Pynchon. In 2000, at age 45, I was diagnosed with Asperger’s syndrome. In retrospect, the only surprise is that it took so long.

But the diaries offered a more intimate view. Reading them was slow going, and I felt as though my nose was pressed up against the windowpane of my own life. The shaggy-dog accretion of material — phone numbers, long-ago concert dates, coded references to secret loves — all seemed to belong to somebody else.

My last clear memory was of a muggy, quiet Sunday morning in July, three months earlier, as I waited for a train in New London, Conn. It was 11:13 a.m., and the train was due to arrive two minutes later. I was contented, proud of my punctuality and expecting an easy ride to New York in the designated “quiet car,” with just enough time to finish whatever book I was carrying. There would be dinner in Midtown with a magical friend, followed by overnight family visits in Baltimore and Washington, and then a flight back to Los Angeles and the University of Southern California, at which point a sabbatical semester would be at an end.
Dean Burnett

On July 31st 2016, this blog will have been in existence for four years exactly. A huge thanks to everyone who’s made the effort to read it in that time (an alarming number of you). Normally there’d be a post on the day to mark the occasion, but this year the 31st is a) a Sunday, and b) my birthday, so even if I could be bothered to work that day, it’s unlikely anyone would want to read it. However, today also marks the ridiculously-unlikely-but-here-we-are American release of my book. How did it get to this point?

I’ve been a “professional” science writer now for four years, and I’ve been involved in neuroscience, in one guise or another, since 2000, the year I started my undergraduate degree. In that time, I’ve encountered some seriously bizarre claims about how the brain works. Oftentimes it was me not understanding what was being said, or misinterpreting a paper, or just my own lack of competence. Sometimes, it was just a media exaggeration. However, there have been occasions when a claim made about the brain thwarts all my efforts to find published evidence or even a rational basis for it, leaving me scratching my head and wondering “where the hell did THAT come from?” Here are some of my favourites.

In the past, one terabyte of storage capacity would have seemed incredibly impressive. But Moore’s law has put paid to that. My home desktop PC presently has 1.5 TB of storage space, and that’s over seven years old. Could my own clunky desktop be, in terms of information capacity, smarter than me? Apparently. Some estimates put the capacity of the human brain as low as 1 TB. A lifetime’s worth of memories wouldn’t fill a modern-day hard drive? That seems far-fetched, at least at an intuitive level.
Keyword: Development of the Brain
Link ID: 22477 - Posted: 07.26.2016
By Dave Dormer

Transporting babies deprived of oxygen at birth to a neonatal intensive care unit in Calgary will soon be safer thanks to a new portable cooling device. The Foothills hospital is one of the first facilities in Canada to acquire one, and doctors hope it will help prevent brain injuries: reducing a baby's temperature, a treatment called therapeutic hypothermia, can prevent damage to brain tissue and promote healing.

"The period immediately following birth is critical. We have about a six-hour window to lower these babies' temperatures to prevent neurological damage," said Dr. Khorshid Mohammad, the neonatal neurocritical care project lead who spearheaded the initiative. "The sooner we can do so, and the more consistent we can make the temperature, the more protective it is and the better their chances of surviving without injury."

Since about 2008, doctors have used cooling blankets and gel packs to lower a baby's temperature to 33.5 C from the normal 37 C for 72 hours in order to prevent brain damage. "With those methods, it can be difficult to maintain a stable temperature," said Mohammad.

©2016 CBC/Radio-Canada.
Keyword: Development of the Brain
Link ID: 22476 - Posted: 07.26.2016
By Andy Coghlan

The final brain edit before adulthood has been observed for the first time. MRI scans of 300 adolescents and young adults have shown how the teenage brain upgrades itself to become quicker – but errors in this process may lead to schizophrenia in later life.

The editing process that takes place in the teen years seems to select the brain’s best connections and networks, says Kirstie Whitaker at the University of Cambridge. “The result is a brain that’s sleeker and more efficient.”

When Whitaker and her team scanned brains from people between the ages of 14 and 24, they found that two major changes take place in the outer layer of the brain – the cortex – at this time. As adolescence progresses, this layer of grey matter gets thinner, probably because unwanted or unused connections between neurons, called synapses, are pruned back. At the same time, important neurons are upgraded: the parts of these cells that carry signals down towards synapses are given a sheath that helps them transmit signals more quickly, a process called myelination.

“It may be that pruning and myelination are part of the maturation of the brain,” says Steven McCarroll at Harvard Medical School. “Pruning involves removing the connections that are not used, and myelination takes the ones that are left and makes them faster,” he says. McCarroll describes this as a trade-off: by pruning connections, we lose some flexibility in the brain, but the proficiency of signal transmission improves.

© Copyright Reed Business Information Ltd.
Keyword: Development of the Brain
Link ID: 22474 - Posted: 07.26.2016
By Lizzie Wade

Neandertals and modern humans had a lot in common—at least enough to have babies together fairly often. But what about their brains? To answer that question, scientists have looked at how Neandertal and modern human brains developed during the crucial time of early childhood.

In the first year of life, modern human infants go through a growth spurt in several parts of the brain: the cerebellum, the parietal lobes, and the temporal lobes—key regions for language and social interaction. Past studies suggested baby Neandertal brains developed more like the brains of chimpanzees, without concentrated growth in any particular area. But a new study casts doubt on that idea.

Scientists examined 15 Neandertal skulls, including one newborn and a pair of children under the age of 2. By carefully imaging the skulls, the team determined that Neandertal temporal lobes, frontal lobes, and cerebellums did, in fact, grow faster than the rest of the brain in early life—a pattern very similar to modern humans, they report today in Current Biology. Scientists had overlooked that possibility, the researchers say, because Neandertals and Homo sapiens have such differently shaped skulls. Modern humans’ rounded skull is a telltale marker of the growth spurt, for example, whereas Neandertals’ skulls were relatively flat on top.

If Neandertals did, in fact, have fast-developing cerebellums and temporal and frontal lobes, they might have been more skilled at language and socializing than assumed, scientists say. This could in turn explain how the children of Neandertal–modern human pairings fared well enough to pass down their genes to so many of us living today.

© 2016 American Association for the Advancement of Science
By Tanya Lewis

Scientists have made significant progress toward understanding how individual memories are formed, but less is known about how multiple memories interact. Researchers from the Hospital for Sick Children in Toronto and colleagues studied how memories are encoded in the amygdalas of mice. Memories formed within six hours of each other activate the same population of neurons, whereas distinct sets of brain cells encode memories formed farther apart, in a process whereby neurons compete with their neighbors, according to the team’s study, published today (July 21) in Science.

“Some memories naturally go together,” study coauthor Sheena Josselyn of the Hospital for Sick Children told The Scientist. For example, you may remember walking down the aisle at your wedding ceremony and, later, your friend having a bit too much to drink at the reception. “We’re wondering about how these memories become linked in your mind,” Josselyn said.

When the brain forms a memory, a group of neurons called an “engram” stores that information. Neurons in the lateral amygdala—a brain region involved in memory of fearful events—are thought to compete with one another to form an engram. Cells that are more excitable or have higher expression of the transcription factor CREB—which is critical for the formation of long-term memories—at the time the memory is being formed will “win” this competition and become part of a memory.

© 1986-2016 The Scientist
Keyword: Learning & Memory
Link ID: 22467 - Posted: 07.23.2016
By Minaz Kerawala

For years, gamers, athletes and even regular people trying to improve their memory have resorted, with electrified enthusiasm, to "brain zapping" to gain an edge. The procedure, called transcranial direct current stimulation (tDCS), uses a battery and electrodes to deliver electrical pulses to the brain, usually through a cap or headset fitted close to the scalp.

Proponents say these currents are beneficial for a range of neurological conditions like Alzheimer's and Parkinson's diseases, stroke and schizophrenia, but experts warn that too little is known about the safety of tDCS. "You might end up with a placement of electrodes that doesn't do what you think it does and could potentially have long-lasting effects," said Matthew Krause, a neuroscientist at the Montreal Neurological Institute.

All functions of the brain—thought, emotion and coordination—are carried out by neurons using pulses of electricity. "The objective of all neuroscience is to influence these electrical processes," Krause said. The brain's activity can be influenced by drugs that alter its electrochemistry or by external electric fields.

While mind-altering headsets may seem futuristic, tDCS is not a new procedure. Much of the pioneering work in the field was done in Montreal by Dr. Wilder Penfield in the 1920s and '30s.

©2016 CBC/Radio-Canada.
Link ID: 22464 - Posted: 07.21.2016
By TRIP GABRIEL

Do you remember June 27, 2015? If you knew you had been on a sailboat, and that the weather was miserable, and that afterward you had a beer with the other sailors, would you expect to recall — even one year later — at least a few details?

I was on that boat, on a blustery Saturday on Long Island Sound. But every detail is missing from my memory, as if snipped out by an overzealous movie editor. The earliest moment I recall from the day is lying in an industrial tube with a kind of upturned colander over my face, fighting waves of claustrophobia. My mind was densely fogged, but I understood that I was in an M.R.I. machine. Someone was scanning my brain. Other hazy scenes followed: being wheeled into a hospital room. My wife, Alice, hovering in the background. A wall clock that read minutes to midnight, an astonishing piece of information. What had happened to the day?

Late that night, alone in the room, I noticed two yellow Post-its on the bedside table in Alice’s writing: “You have a condition called transient global amnesia. It will last Hours not DAYS. You’re going to be fine. Your CT scan was clear. You sailed today and drove yourself home,” the note read in part.

I had never heard of transient global amnesia, a rare condition in which you are suddenly unable to recall recent events. Its causes are unknown. Unlike other triggers of memory loss, like a stroke or epileptic seizures, the condition is considered harmless, and an episode does not last long. “We don’t understand why it happens,” a neurologist would later tell me. “There are a million theories.”

© 2016 The New York Times Company
Keyword: Learning & Memory
Link ID: 22456 - Posted: 07.19.2016
By Alice Klein

Blame grandpa. A study in mice shows that the grandsons of obese males are more susceptible to the detrimental health effects of junk food, even if their fathers are lean and healthy. The finding adds to evidence that new traits can be passed down the family line without being permanently recorded in a family’s genes – a phenomenon called transgenerational epigenetics.

Last year, a study found that the DNA in the sperm of obese men is modified in thousands of places, and that these sperm also contain short pieces of RNA. These are epigenetic modifications – they don’t affect the precise code of genes, but instead may affect how active particular genes are.

Now Catherine Suter at Victor Chang Cardiac Research Institute in Sydney and her team have investigated the longer-term effects of paternal obesity. To do this, they mated obese male mice with lean female mice. They found that, compared with the offspring of lean males, both the sons and grandsons of the obese males were more likely to show the early signs of fatty liver disease and diabetes when given a junk food diet. The same effect wasn’t seen in daughters or granddaughters.

Even when the sons of the obese males were fed a healthy diet and kept at a normal weight, their sons still had a greater tendency to develop obesity-related conditions when exposed to a junk diet.

© Copyright Reed Business Information Ltd.
By William Kenower

My youngest son, Sawyer, used to spend far more time relating to his imagination than he did to the world around him. He would run back and forth humming, flapping his hands and thumping on his chest. By the time he was in first grade, attempts to draw him out of his pretend world to join his classmates or do some class work led to explosions and timeouts. At 7 he was given a diagnosis of being on the autism spectrum.

That was when my wife, Jen, learned about the practice called joining. The idea behind it, which she discovered in Barry Neil Kaufman’s book “Son-Rise,” is brilliant in its simplicity. We wanted Sawyer to be with us. We did not want him to live in this bubble of his own creation. And so, instead of telling him to stop pretending and join us, we started pretending and joined him.

The first time Jen joined him, the first time she ran beside him humming and thumping her chest, he stopped running, stopped thumping, stopped humming and, without a single word from us, turned to her and said, “What are you doing?” We took turns joining him every day, and a week later we got an email from his special education teacher telling us to keep doing whatever we were doing. He’d gone from five timeouts a day to one in a week. The classroom was the same, the work was the same – all that was different was that we had found a way to say to him in a language he could understand, “You’re not wrong.”

Emboldened by our success, we set about becoming more fluent in this language. For the next couple of years we taught ourselves to join him constantly. This meant that whatever we were doing had to stop whenever we heard him running back and forth and humming. But we could not join him simply to get him to stop running and thumping and humming. We had to join him without any judgment or impatience. That was the trickiest part. The desire to fix him was great. I had come to believe that there were broken people in need of fixing.
Sometimes, I looked like one of those people. I was a 40-year-old unpublished writer working as a waiter. My life reeked of failure. Many days I looked in the mirror and asked, “What is wrong with me?” © 2016 The New York Times Company
Link ID: 22451 - Posted: 07.16.2016
James M. Broadway

“Where did the time go?” middle-aged and older adults often remark. Many of us feel that time passes more quickly as we age, a perception that can lead to regrets. According to psychologist and BBC columnist Claudia Hammond, “the sensation that time speeds up as you get older is one of the biggest mysteries of the experience of time.” Fortunately, our attempts to unravel this mystery have yielded some intriguing findings.

In 2005, for instance, psychologists Marc Wittmann and Sandra Lenhoff, both then at Ludwig Maximilian University of Munich, surveyed 499 participants, ranging in age from 14 to 94 years, about the pace at which they felt time moving—from “very slowly” to “very fast.” For shorter durations—a week, a month, even a year—the subjects’ perception of time did not appear to increase with age. Most participants felt that the clock ticked by quickly. But for longer durations, such as a decade, a pattern emerged: older people tended to perceive time as moving faster. When asked to reflect on their lives, the participants older than 40 felt that time elapsed slowly in their childhood but then accelerated steadily through their teenage years into early adulthood.

There are good reasons why older people may feel that way. When it comes to how we perceive time, humans can estimate the length of an event from two very different perspectives: a prospective vantage, while an event is still occurring, or a retrospective one, after it has ended. In addition, our experience of time varies with whatever we are doing and how we feel about it. In fact, time does fly when we are having fun. Engaging in a novel exploit makes time appear to pass more quickly in the moment. But if we remember that activity later on, it will seem to have lasted longer than more mundane experiences.

© 2016 Scientific American
Helen Haste

The American psychologist and educationist Jerome Bruner, who has died aged 100, repeatedly challenged orthodoxies and generated novel directions. His elegant, accessible writing reached wide audiences. His colleague Rom Harré described his lectures as inspiring: “He darted all over the place, one topic suggested another and so on through a thrilling zigzag.” To the charge that he was always asking impossible questions, Jerry replied: “They are pretty much impossible, but the search for the impossible is part of what intelligence is about.” He was willing to engage with controversy, both on academic issues and in education politics.

Blind at birth because of cataracts, Jerry gained his sight after surgery at the age of two. He credited this for his sense that we actively interpret and organise our world rather than passively react to it – a theme that he continued to develop in different ways. His first work lay in perception, when he resumed research at Harvard after the second world war. He found that children’s judgments of the size of coins and coin-like disks varied: poorer children overestimated the size of the coins. This contributed to the emerging “new look” movement in psychology, involving values, intentions and interpretation in contrast to the then dominant behaviourist focus on passive learning, reward and punishment.

His professorship at Harvard came in 1952, and by the middle of the decade a computer metaphor began to influence psychology – the “cognitive revolution”. With Jacqueline Goodnow and George Austin, Jerry published A Study of Thinking (1956).

© 2016 Guardian News and Media Limited
By Andy Coghlan

There once was a brainy duckling. It could remember whether the shapes or colours it saw just after hatching were the same as or different from each other. The feat surprised the researchers, who were initially sceptical about whether the ducklings could grasp such complex concepts as “same” and “different”. The fact that they could suggests the ability to think in an abstract way may be far more common in nature than expected, and not just restricted to humans and a handful of animals with big brains.

“We were completely surprised,” says Alex Kacelnik at the University of Oxford, who conducted the experiment along with his colleague Antone Martinho III.

Kacelnik and Martinho reasoned that ducklings might be able to grasp patterns relating to shape or colour as part of the array of sensory information they absorb soon after hatching. Doing so would allow them to recognise their mothers and siblings and distinguish them from all others – abilities vital for survival. In ducklings, goslings and other species that depend for survival on following their mothers, newborns learn quickly – a process called filial imprinting. Kacelnik wondered whether this would enable them to be tricked soon after hatching into “following” objects or colours instead of their natural mother, and into recognising those same patterns in future.

© Copyright Reed Business Information Ltd.
Laura Sanders

If you’ve ever watched a baby purse her lips to hoot for the first time, or flash a big, gummy grin when she sees you, or surprise herself by rolling over, you’ve glimpsed the developing brain in action. A baby’s brain constructs itself into something that controls the body, learns and connects socially.

Spending time with an older person, you may notice signs of slippage. An elderly man might forget why he went into the kitchen, or fail to anticipate the cyclist crossing the road, or muddle medications with awkward and unfamiliar names. These are the signs of the gentle yet unrelenting neural erosion that comes with normal aging.

These two seemingly distinct processes — development and aging — may actually be linked. Hidden in the brain-building process, some scientists now suspect, are the blueprints for the brain’s demise. The way the brain is built, recent research suggests, informs how it will decline in old age.

That the end can be traced to the beginning sounds absurd: a sturdily constructed brain stays strong for decades. During childhood, neural pathways make connections in a carefully choreographed order. But in old age, this sequence plays in reverse, brain scans reveal. In both appearance and behavior, old brains seem to drift backward toward earlier stages of development. What’s more, some of the same cellular tools are involved in both processes.

© Society for Science & the Public 2000 - 2016
Keyword: Development of the Brain
Link ID: 22440 - Posted: 07.14.2016