Chapter 7. Life-Span Development of the Brain and Behavior
Nicola Davis
Scientists have discovered 17 separate genetic variations that increase the risk of a person developing depression. The findings, which came from analysing DNA data collected from more than 300,000 people, are the first genetic links to the disease found in people of European ancestry. The scientists say the research will contribute to a better understanding of the disease and could eventually lead to new treatments. They also hope it will reduce the stigma that can accompany depression. According to NICE, up to 10% of people seen by practitioners in primary care have clinical depression, with symptoms including a continuously low mood, low self-esteem, difficulties making decisions and lack of energy. Both environmental and genetic factors are thought to be behind depression, with the interaction between the two also thought to be important. But with a large number of genetic variants each thought to make a tiny contribution to the risk of developing the condition, unravelling their identity has proved challenging. While previous studies have turned up a couple of regions in the genome of Chinese women that might increase the risk of depression, the variants didn't appear to play a role in depression for people of European ancestry. © 2016 Guardian News and Media Limited
By Andy Coghlan
Mysterious shrunken cells have been spotted in the human brain for the first time, and appear to be associated with Alzheimer's disease. "We don't know yet if they're a cause or consequence," says Marie-Ève Tremblay of Laval University in Québec, Canada, who presented her discovery at the Translational Neuroimmunology conference in Big Sky, Montana, last week. The cells appear to be withered forms of microglia – the cells that keep the brain tidy and free of infection, normally by pruning unwanted brain connections or destroying abnormal and infected brain cells. But the cells discovered by Tremblay appear much darker when viewed using an electron microscope, and they seem to be more destructive. "It took a long time for us to identify them," says Tremblay, who adds that these shrunken microglia do not show up with the same staining chemicals that normally make microglia visible under the microscope. Compared with normal microglia, the dark cells appear to wrap much more tightly around neurons and the connections between them, called synapses. "It seems they're hyperactive at synapses," says Tremblay. Where these microglia are present, synapses often seem shrunken and in the process of being degraded. Tremblay first discovered these dark microglia in mice, finding that they increase in number as mice age, and appear to be linked to a number of things, including stress, the neurodegenerative condition Huntington's disease and a mouse model of Alzheimer's disease. "There were 10 times as many dark microglia in Alzheimer's mice as in control mice," says Tremblay. © Copyright Reed Business Information Ltd.
By Katherine S. Pollard
When the first human genome sequence was published in 2001, I was a graduate student working as the statistics expert on a team of scientists. Hailing from academia and biotechnology, we aimed to discover differences in gene expression levels between tumors and healthy cells. Like many others, I had high hopes for what we could do with this enormous text file of more than 3 billion As, Cs, Ts, and Gs. Ambitious visions of a precise wiring diagram for human cells and imminent cures for disease were commonplace among my classmates and professors. But I was most excited about a different use of the data, and I found myself counting the months until the genome of a chimpanzee would be sequenced. Chimps are our closest living relatives on the tree of life. While their biology is largely similar to ours, we have many striking differences, ranging from digestive enzymes to spoken language. Humans also suffer from an array of diseases that do not afflict chimpanzees or are less severe in them, including autism, schizophrenia, Alzheimer's disease, diabetes, atherosclerosis, AIDS, rheumatoid arthritis, and certain cancers. I had long been fascinated with hominin fossils and the way the bones morphed into different forms over evolutionary time. But those skeletons cannot tell us much about the history of our immune system or our cognitive abilities. So I started brainstorming about how to extend the statistical approaches we were using for cancer research to compare human and chimpanzee DNA. My immodest goal was to identify the genetic basis for all the traits that make humans unique. © 1986-2016 The Scientist
Aaron E. Carroll
I remember thinking, after my pregnant wife's water broke, minutes after I went to bed, anguishing really, over one thought as we drove to the hospital: "I'm never going to be well rested again." If there's one thing all new parents wish for, it's a good night's sleep. Unfortunately, infants sometimes make that impossible. They wake up repeatedly, needing to be fed, changed and comforted. Eventually, they reach an age when they should sleep through the night. Some don't, though. What to do with them continues to be a topic of heated debate in parenting circles. One camp believes that babies should be left to cry it out. These people place babies in their cribs at a certain time, after a certain routine, and don't interfere until the next morning. No matter how much the babies scream or cry, parents ignore them. After all, if babies learn that tantrums lead to the appearance of a loved one, they will continue that behavior in the future. The official name for this intervention is "Extinction." The downside, of course, is that it's unbelievably stressful for parents. Many can't do it. And not holding fast to the plan can make everything worse. Responding to an infant's crying after an extended period of time makes the behavior harder to extinguish. To a baby, it's like a slot machine that hits just as you're ready to walk away; it makes you want to play more. A modification of this strategy is known as "Graduated Extinction." Parents allow their infant to cry it out for a longer period each night, until infants eventually put themselves to sleep. On the first night, for instance, parents might commit to not entering the baby's room for five minutes. The next night, 10 minutes. Then 15, and so on. Or they could progressively lengthen the intervals between checks within each night. When they do go in the room, it's only to check and make sure the baby is O.K. – no picking up or comforting. This isn't meant to be a reward for crying, but to allow parents to be assured that nothing is wrong. © 2016 The New York Times Company
By Tanya Lewis
The tangled buildup of tau protein in brain cells is a hallmark of the cognitive decline linked with Alzheimer's disease. Antibodies have been shown to block tau's spread, but some scientists worry they could also fuel inflammation. Now, researchers from Genentech in San Francisco and colleagues have found that an antibody's ability to recruit immune cells—known as its effector function—is not necessary for stopping tau's spread, the team reported today (July 28) in Cell Reports. "Our results suggest that, given that effector function is not required for efficacy [in treating tau accumulation], going without it could offer a safer approach for immunotherapy," study coauthor Gai Ayalon of Genentech told The Scientist. Alzheimer's disease causes a characteristic constellation of pathologies: accumulation of amyloid-β plaques outside neurons, neurofibrillary tangles of tau inside brain cells, and chronic inflammation. Clinical research has mostly focused on targeting amyloid-β with antibody therapies, and several treatments based on this approach are currently in clinical trials. But recent efforts have zeroed in on tau as a new potential target. Antibodies are known to spur the brain's defense system, microglia, to absorb and degrade tau, but their recruitment of immune cells may also worsen inflammation. Ayalon and colleagues wondered whether effector function was necessary for stopping tau's spread. © 1986-2016 The Scientist
By ANDREW POLLACK
A new type of drug for Alzheimer's disease failed to slow the rate of decline in mental ability and daily functioning in its first large clinical trial. There was a hint, though, that it might be effective for certain patients. The drug, called LMTX, is the first one with its mode of action — trying to undo so-called tau tangles in the brain — to reach the final stage of clinical trials. So the results of the study were eagerly awaited. The initial reaction to the outcome was disappointment, with perhaps a glimmer of hopefulness. Over all, the patients who received LMTX, which was developed by TauRx Therapeutics, did not have a slower rate of decline in mental ability or daily functioning than those in the control group. However, the drug did seem to work for the subset of patients — about 15 percent of those in the study — who took LMTX as their only therapy. The other 85 percent of patients took an existing Alzheimer's drug in addition to either LMTX or a placebo. "There were highly significant, clinically meaningful, large effects in patients taking the drug as monotherapy, and no effect in patients taking it as an add-on," Claude Wischik, a founder and the chief executive of TauRx, said in an interview. He spoke from Toronto, where the results were being presented at the Alzheimer's Association International Conference. Dr. Wischik said a second clinical trial sponsored by the company, whose results will be announced later, found the same phenomenon. He said the company planned to apply for approval of LMTX to be used by itself. But some experts not involved in the study were skeptical about drawing conclusions from a small subset of patients, especially since there was no obvious explanation why LMTX would be expected to work only in patients not getting other drugs. © 2016 The New York Times Company
Ian Sample and Nicky Woolf
When Bill Gates pulled on a red-and-white-striped cord to upturn a bucket of iced water positioned delicately over his head, the most immediate thought for many was not, perhaps, of motor neurone disease. But the ice bucket challenge, the charity campaign that went viral in the summer of 2014 and left scores of notable persons from Gates and Mark Zuckerberg to George W. Bush and Anna Wintour shivering and drenched, has paid off in the most spectacular way. Dismissed by some at the time as "slacktivism" – an exercise that appears to do good while achieving very little – the ice bucket challenge raised more than $115m (£88m) for motor neurone disease in a single month. Now, scientists funded with the proceeds have discovered a gene variant associated with the condition. In the near term the NEK1 gene variant, described in the journal Nature Genetics this week, will help scientists understand how the incurable disorder, known also as amyotrophic lateral sclerosis (ALS) or Lou Gehrig's disease, takes hold. Once the mechanisms are more clearly elucidated, it may steer researchers on a path towards much-needed treatments. The work may never have happened were it not for the curious appeal of the frozen water drenchings. The research grants that scientists are awarded do not get close to the €4m the study required. Instead, Project MinE, which aims to unravel the genetic basis of the disease and ultimately find a cure, was funded by the ALS Association through ice bucket challenge donations. © 2016 Guardian News and Media Limited
Jon Hamilton
Two studies released at an international Alzheimer's meeting Tuesday suggest doctors may eventually be able to screen people for this form of dementia by testing the ability to identify familiar odors, like smoke, coffee and raspberry. In both studies, people who were in their 60s and older took a standard odor detection test. And in both cases, those who did poorly on the test were more likely to already have — or go on to develop — problems with memory and thinking. "The whole idea is to create tests that a general clinician can use in an office setting," says Dr. William Kreisl, a neurologist at Columbia University, where both studies were done. The research was presented at the Alzheimer's Association International Conference in Toronto. Currently, any tests that are able to spot people in the earliest stages of Alzheimer's are costly and difficult. They include PET scans, which can detect sticky plaques in the brain, and spinal taps that measure the levels of certain proteins in spinal fluid. The idea of an odor detection test arose, in part, from something doctors have observed for many years in patients with Alzheimer's, Kreisl says. "Patients will tell us that food does not taste as good," he says. The reason is often that these patients have lost the ability to smell what they eat. That's not surprising, Kreisl says, given that odor signals from the nose have to be processed in areas of the brain that are among the first to be affected by Alzheimer's disease. But it's been tricky to develop a reliable screening test using odor detection. © 2016 npr
By PAM BELLUCK
"Has the person become agitated, aggressive, irritable, or temperamental?" the questionnaire asks. "Does she/he have unrealistic beliefs about her/his power, wealth or skills?" Or maybe another kind of personality change has happened: "Does she/he no longer care about anything?" If the answer is yes to one of these questions — or others on a new checklist — and the personality or behavior change has lasted for months, it could indicate a very early stage of dementia, according to a group of neuropsychiatrists and Alzheimer's experts. They are proposing the creation of a new diagnosis: mild behavioral impairment. The idea is to recognize and measure something that some experts say is often overlooked: Sharp changes in mood and behavior may precede the memory and thinking problems of dementia. The group made the proposal on Sunday at the Alzheimer's Association International Conference in Toronto, and presented a 38-question checklist that may one day be used to identify people at greater risk for Alzheimer's. "I think we do need something like this," said Nina Silverberg, the director of the Alzheimer's Disease Centers program at the National Institute on Aging, who was not involved in creating the checklist or the proposed new diagnosis. "Most people think of Alzheimer's as primarily a memory disorder, but we do know from years of research that it also can start as a behavioral issue." Under the proposal, mild behavioral impairment (M.B.I.) would be a clinical designation preceding mild cognitive impairment (M.C.I.), a diagnosis created more than a decade ago to describe people experiencing some cognitive problems but who can still perform most daily functions. © 2016 The New York Times Company
By Sharon Begley, STAT
For the first time ever, researchers have managed to reduce people's risk for dementia — not through a medicine, special diet, or exercise, but by having healthy older adults play a computer-based brain-training game. The training nearly halved the incidence of Alzheimer's disease and other devastating forms of cognitive and memory loss in older adults a decade after they completed it, scientists reported on Sunday. If the surprising finding holds up, the intervention would be the first of any kind — including drugs, diet, and exercise — to do that. "I think these results are highly, highly promising," said George Rebok of the Johns Hopkins Bloomberg School of Public Health, an expert on cognitive aging who was not involved in the study. "It's exciting that this intervention pays dividends so far down the line." The results, presented at the Alzheimer's Association International Conference in Toronto, come from the government-funded ACTIVE (Advanced Cognitive Training for Independent and Vital Elderly) study. Starting in 1998, ACTIVE's 2,832 healthy older adults (average age at the start: 74) received one of three forms of cognitive training, or none, and were evaluated periodically in the years after. In actual numbers, 14 percent of ACTIVE participants who received no training had dementia 10 years later, said psychologist Jerri Edwards of the University of South Florida, who led the study. Among those who completed up to ten 60- to 75-minute sessions of computer-based training in speed-of-processing — basically, how quickly and accurately they can pay attention to, process, and remember brief images on a computer screen — 12.1 percent developed dementia. Of those who completed all ten initial training sessions plus four booster sessions a few years later, 8.2 percent developed dementia. © 2016 Scientific American
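A quick arithmetic check of the "nearly halved" claim is possible from the incidence figures quoted above. The sketch below (Python) uses only the percentages reported in the article, not the study's underlying data:

```python
# Dementia incidence 10 years on, as quoted from the ACTIVE study results.
incidence = {
    "no training": 0.14,                # 14 percent
    "10 initial sessions": 0.121,       # 12.1 percent
    "10 sessions + 4 boosters": 0.082,  # 8.2 percent
}

baseline = incidence["no training"]
for group, rate in incidence.items():
    # Relative risk reduction versus the untrained group.
    rrr = (baseline - rate) / baseline
    print(f"{group}: {rate:.1%} incidence, {rrr:.0%} relative reduction")
```

The booster group comes out about 41 percent below the untrained group, which is what the cautious "nearly halved" phrasing reflects; the initial-sessions-only group shows a much smaller relative reduction of roughly 14 percent.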
Dean Burnett
On July 31st 2016, this blog will have been in existence for four years exactly. A huge thanks to everyone who's made the effort to read it in that time (an alarming number of you). Normally there'd be a post on the day to mark the occasion, but this year the 31st is a) a Sunday, and b) my birthday, so even if I could be bothered to work that day, it's unlikely anyone would want to read it. However, today also marks the ridiculously-unlikely-but-here-we-are American release of my book. How did it get to this point? I've been a "professional" science writer now for four years, and I've been involved in neuroscience, in one guise or another, since 2000, the year I started my undergraduate degree. In that time, I've heard/encountered some seriously bizarre claims about how the brain works. Oftentimes it was me not understanding what was being said, or misinterpreting a paper, or just my own lack of competence. Sometimes, it was just a media exaggeration. However, there have been occasions when a claim made about the brain thwarts all my efforts to find published evidence or even a rational basis for it, leaving me scratching my head and wondering "where the hell did THAT come from?" Here are some of my favourites. In the past, one terabyte of storage capacity would have seemed incredibly impressive. But Moore's law has put paid to that. My home desktop PC presently has 1.5 TB of storage space, and that's over seven years old. Could my own clunky desktop be, in terms of information capacity, smarter than me? Apparently. Some estimates put the capacity of the human brain as low as 1TB. A lifetime's worth of memories wouldn't fill a modern-day hard drive? That seems far-fetched, at least at an intuitive level.
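The intuition holds up under a crude calculation. The sketch below shows why the 1 TB figure sits at the very low end of published estimates; the synapse count and bits-per-synapse values are ballpark assumptions drawn from the broader literature, not numbers from this post:

```python
# Back-of-the-envelope estimate of brain storage capacity.
# Both constants are rough assumptions; published estimates
# vary by orders of magnitude depending on the model used.

SYNAPSES = 1e14          # ~100 trillion synapses, a commonly cited count
BITS_PER_SYNAPSE = 4.7   # one published estimate of per-synapse capacity

capacity_bytes = SYNAPSES * BITS_PER_SYNAPSE / 8
print(f"~{capacity_bytes / 1e12:.0f} TB")  # prints "~59 TB"
```

Even these modest assumptions yield tens of terabytes, and looser ones push the estimate into petabytes, which is why the 1 TB claim strikes the author (and most neuroscientists) as far-fetched.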
By Dave Dormer
Transporting babies deprived of oxygen at birth to a neonatal intensive care unit in Calgary will soon be safer thanks to a new portable cooling device. The Foothills hospital is one of the first facilities in Canada to acquire one, and doctors hope it will help prevent brain injuries: reducing a baby's temperature, a treatment known as therapeutic hypothermia, can prevent damage to brain tissue and promote healing. "The period immediately following birth is critical. We have about a six-hour window to lower these babies' temperatures to prevent neurological damage," said Dr. Khorshid Mohammad, the neonatal neurocritical care project lead who spearheaded the initiative. "The sooner we can do so, and the more consistent we can make the temperature, the more protective it is and the better their chances of surviving without injury." Since about 2008, doctors have used cooling blankets and gel packs to lower a baby's temperature to 33.5 C from the normal 37 C for 72 hours in order to prevent brain damage. "With those methods, it can be difficult to maintain a stable temperature," said Mohammad. ©2016 CBC/Radio-Canada.
By Andy Coghlan
The final brain edit before adulthood has been observed for the first time. MRI scans of 300 adolescents and young adults have shown how the teenage brain upgrades itself to become quicker – but that errors in this process may lead to schizophrenia in later life. The editing process that takes place in the teen years seems to select the brain's best connections and networks, says Kirstie Whitaker at the University of Cambridge. "The result is a brain that's sleeker and more efficient." When Whitaker and her team scanned brains from people between the ages of 14 and 24, they found that two major changes take place in the outer layer of the brain – the cortex – at this time. As adolescence progresses, this layer of grey matter gets thinner – probably because unwanted or unused connections between neurons – called synapses – are pruned back. At the same time, important neurons are upgraded. The parts of these cells that carry signals down towards synapses are given a sheath that helps them transmit signals more quickly – a process called myelination. "It may be that pruning and myelination are part of the maturation of the brain," says Steven McCarroll at Harvard Medical School. "Pruning involves removing the connections that are not used, and myelination takes the ones that are left and makes them faster," he says. McCarroll describes this as a trade-off – by pruning connections, we lose some flexibility in the brain, but the proficiency of signal transmission improves. © Copyright Reed Business Information Ltd.
By Lizzie Wade
Neandertals and modern humans had a lot in common—at least enough to have babies together fairly often. But what about their brains? To answer that question, scientists have looked at how Neandertal and modern human brains developed during the crucial time of early childhood. In the first year of life, modern human infants go through a growth spurt in several parts of the brain: the cerebellum, the parietal lobes, and the temporal lobes—key regions for language and social interaction. Past studies suggested baby Neandertal brains developed more like the brains of chimpanzees, without concentrated growth in any particular area. But a new study casts doubt on that idea. Scientists examined 15 Neandertal skulls, including one newborn and a pair of children under the age of 2. By carefully imaging the skulls, the team determined that Neandertal temporal lobes, frontal lobes, and cerebellums did, in fact, grow faster than the rest of the brain in early life, a pattern very similar to modern humans, they report today in Current Biology. Scientists had overlooked that possibility, the researchers say, because Neandertals and Homo sapiens have such differently shaped skulls. Modern humans' rounded skull is a telltale marker of the growth spurt, for example, whereas Neandertals' skulls were relatively flat on the top. If Neandertals did, in fact, have fast-developing cerebellums and temporal and frontal lobes, they might have been more skilled at language and socializing than assumed, scientists say. This could in turn explain how the children of Neandertal–modern human pairings fared well enough to pass down their genes to so many of us living today. © 2016 American Association for the Advancement of Science
By Minaz Kerawala
For years, gamers, athletes and even regular people trying to improve their memory have resorted, with electrified enthusiasm, to "brain zapping" to gain an edge. The procedure, called transcranial direct current stimulation (tDCS), uses a battery and electrodes to deliver electrical pulses to the brain, usually through a cap or headset fitted close to the scalp. Proponents say these currents are beneficial for a range of neurological conditions like Alzheimer's and Parkinson's diseases, stroke and schizophrenia, but experts are warning that too little is known about the safety of tDCS. "You might end up with a placement of electrodes that doesn't do what you think it does and could potentially have long-lasting effects," said Matthew Krause, a neuroscientist at the Montreal Neurological Institute. All functions of the brain—thought, emotion and coordination—are carried out by neurons using pulses of electricity. "The objective of all neuroscience is to influence these electrical processes," Krause said. The brain's activity can be influenced by drugs that alter its electrochemistry or by external electric fields. While mind-altering headsets may seem futuristic, tDCS is not a new procedure. Much of the pioneering work in the field was done in Montreal by Dr. Wilder Penfield in the 1920s and 30s. ©2016 CBC/Radio-Canada.
By Alice Klein
Blame grandpa. A study in mice shows that the grandsons of obese males are more susceptible to the detrimental health effects of junk food, even if their fathers are lean and healthy. The finding adds to evidence that new traits can be passed down the family line without being permanently recorded in a family's genes – a phenomenon called transgenerational epigenetics. Last year, a study found that the DNA in the sperm of obese men is modified in thousands of places, and that these sperm also contain short pieces of RNA. These are epigenetic modifications – they don't affect the precise code of genes, but instead may affect how active particular genes are. Now Catherine Suter at the Victor Chang Cardiac Research Institute in Sydney and her team have investigated the longer-term effects of paternal obesity. To do this, they mated obese male mice with lean female mice. They found that, compared with the offspring of lean males, both the sons and grandsons of the obese males were more likely to show the early signs of fatty liver disease and diabetes when given a junk food diet. The same effect wasn't seen in daughters or granddaughters. Even when the sons of the obese males were fed a healthy diet and kept at a normal weight, their sons still had a greater tendency to develop obesity-related conditions when exposed to a junk diet. © Copyright Reed Business Information Ltd.
By William Kenower
My youngest son, Sawyer, used to spend far more time relating to his imagination than he did to the world around him. He would run back and forth humming, flapping his hands and thumping on his chest. By the time he was in first grade, attempts to draw him out of his pretend world to join his classmates or do some class work led to explosions and timeouts. At 7 he was given a diagnosis of being on the autism spectrum. That was when my wife, Jen, learned about the practice called joining. The idea behind it, which she discovered in Barry Neil Kaufman's book "Son-Rise," is brilliant in its simplicity. We wanted Sawyer to be with us. We did not want him to live in this bubble of his own creation. And so, instead of telling him to stop pretending and join us, we started pretending and joined him. The first time Jen joined him, the first time she ran beside him humming and thumping her chest, he stopped running, stopped thumping, stopped humming and, without a single word from us, turned to her and said, "What are you doing?" We took turns joining him every day, and a week later we got an email from his special education teacher telling us to keep doing whatever we were doing. He'd gone from five timeouts a day to one in a week. The classroom was the same, the work was the same – all that was different was that we had found a way to say to him in a language he could understand, "You're not wrong." Emboldened by our success, we set about becoming more fluent in this language. For the next couple of years we taught ourselves to join him constantly. This meant that whatever we were doing had to stop whenever we heard him running back and forth and humming. But we could not join him simply to get him to stop running and thumping and humming. We had to join him without any judgment or impatience. That was the trickiest part. The desire to fix him was great. I had come to believe that there were broken people in need of fixing. Sometimes, I looked like one of those people. I was a 40-year-old unpublished writer working as a waiter. My life reeked of failure. Many days I looked in the mirror and asked, "What is wrong with me?" © 2016 The New York Times Company
James M. Broadway
"Where did the time go?" middle-aged and older adults often remark. Many of us feel that time passes more quickly as we age, a perception that can lead to regrets. According to psychologist and BBC columnist Claudia Hammond, "the sensation that time speeds up as you get older is one of the biggest mysteries of the experience of time." Fortunately, our attempts to unravel this mystery have yielded some intriguing findings. In 2005, for instance, psychologists Marc Wittmann and Sandra Lenhoff, both then at Ludwig Maximilian University of Munich, surveyed 499 participants, ranging in age from 14 to 94 years, about the pace at which they felt time moving—from "very slowly" to "very fast." For shorter durations—a week, a month, even a year—the subjects' perception of time did not appear to increase with age. Most participants felt that the clock ticked by quickly. But for longer durations, such as a decade, a pattern emerged: older people tended to perceive time as moving faster. When asked to reflect on their lives, the participants older than 40 felt that time elapsed slowly in their childhood but then accelerated steadily through their teenage years into early adulthood. There are good reasons why older people may feel that way. When it comes to how we perceive time, humans can estimate the length of an event from two very different perspectives: a prospective vantage, while an event is still occurring, or a retrospective one, after it has ended. In addition, our experience of time varies with whatever we are doing and how we feel about it. In fact, time does fly when we are having fun. Engaging in a novel exploit makes time appear to pass more quickly in the moment. But if we remember that activity later on, it will seem to have lasted longer than more mundane experiences. © 2016 Scientific American
Helen Haste
The American psychologist and educationist Jerome Bruner, who has died aged 100, repeatedly challenged orthodoxies and generated novel directions. His elegant, accessible writing reached wide audiences. His colleague Rom Harré described his lectures as inspiring: "He darted all over the place, one topic suggested another and so on through a thrilling zigzag." To the charge that he was always asking impossible questions, Jerry replied: "They are pretty much impossible, but the search for the impossible is part of what intelligence is about." He was willing to engage with controversy, both on academic issues and in education politics. Blind at birth because of cataracts, Jerry gained his sight after surgery at the age of two. He credited this for his sense that we actively interpret and organise our world rather than passively react to it – a theme that he continued to develop in different ways. His first work lay in perception, when he resumed research at Harvard after the second world war. He found that children's judgments of the size of coins and coin-like disks varied: poorer children overestimated the size of the coins. This contributed to the emerging "new look" movement in psychology, involving values, intentions and interpretation in contrast to the then dominant behaviourist focus on passive learning, reward and punishment. His professorship at Harvard came in 1952, and by the middle of the decade a computer metaphor began to influence psychology – the "cognitive revolution". With Jacqueline Goodnow and George Austin, Jerry published A Study of Thinking (1956). © 2016 Guardian News and Media Limited
By Andy Coghlan
There once was a brainy duckling. It could remember whether the shapes or colours it saw just after hatching were the same as or different to each other. The feat surprised the researchers, who were initially sceptical about whether the ducklings could grasp such complex concepts as "same" and "different". The fact that they could suggests the ability to think in an abstract way may be far more common in nature than expected, and not just restricted to humans and a handful of animals with big brains. "We were completely surprised," says Alex Kacelnik at the University of Oxford, who conducted the experiment along with his colleague Antone Martinho III. Kacelnik and Martinho reasoned that ducklings might be able to grasp patterns relating to shape or colour as part of the array of sensory information they absorb soon after hatching. Doing so would allow them to recognise their mothers and siblings and distinguish them from all others – abilities vital for survival. In ducklings, goslings and other species that depend for survival on following their mothers, newborns learn quickly – a process called filial imprinting. Kacelnik wondered whether this would enable them to be tricked soon after hatching into "following" objects or colours instead of their natural mother, and recognising those same patterns in future. © Copyright Reed Business Information Ltd.