Chapter 13. Memory, Learning, and Development
Hannah Devlin, Science correspondent. Scientists have found the most definitive evidence yet that some people are destined to age quicker and die younger than others - regardless of their lifestyle. The findings could explain the seemingly random and unfair way that death is sometimes dealt out, and raise the intriguing future possibility of being able to extend the natural human lifespan. “You get people who are vegan, sleep 10 hours a day, have a low-stress job, and still end up dying young,” said Steve Horvath, a biostatistician who led the research at the University of California, Los Angeles. “We’ve shown some people have a faster innate ageing rate.” A higher biological age, regardless of actual age, was consistently linked to an earlier death, the study found. For the 5% of the population who age fastest, this translated to a roughly 50% greater than average risk of death at any age. Intriguingly, the biological changes linked to ageing are potentially reversible, raising the prospect of future treatments that could arrest the ageing process and extend the human lifespan. “The great hope is that we find anti-ageing interventions that would slow your innate ageing rate,” said Horvath. “This is an important milestone to realising this dream.” Horvath’s ageing “clock” relies on measuring subtle chemical changes, in which methyl compounds attach or detach from the genome without altering the underlying code of our DNA. © 2016 Guardian News and Media Limited
By Edd Gent, A brain-inspired computing component provides the most faithful emulation yet of connections among neurons in the human brain, researchers say. The so-called memristor, an electrical component whose resistance relies on how much charge has passed through it in the past, mimics the way calcium ions behave at the junction between two neurons in the human brain, the study said. That junction is known as a synapse. The researchers said the new device could lead to significant advances in brain-inspired—or neuromorphic—computers, which could be much better at perceptual and learning tasks than traditional computers, as well as far more energy efficient. "In the past, people have used devices like transistors and capacitors to simulate synaptic dynamics, which can work, but those devices have very little resemblance to real biological systems. So it's not efficient to do it that way, and it results in a larger device area, larger energy consumption and less fidelity," said study leader Joshua Yang, a professor of electrical and computer engineering at the University of Massachusetts Amherst. [10 Things You Didn't Know About the Brain] Previous research has suggested that the human brain has about 100 billion neurons and approximately 1 quadrillion (1 million billion) synapses. A brain-inspired computer would ideally be designed to mimic the brain's enormous computing power and efficiency, scientists have said. © 2016 Scientific American
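The defining property described above, a resistance that depends on how much charge has flowed through the device in the past, can be captured in a few lines. The sketch below is a generic toy model of an ideal memristor, not the diffusive device from the study; the parameter values (`r_on`, `r_off`, `q_max`) are made up for illustration.

```python
# Toy model of an ideal memristor: resistance depends on accumulated charge.
# Illustration only -- not the device reported in the study; all parameter
# values are hypothetical.

def simulate_memristor(voltages, dt=1e-3, r_on=100.0, r_off=16000.0,
                       q_max=1e-2):
    """Apply a voltage sequence; resistance drifts from r_off toward r_on
    as charge accumulates, loosely mimicking synaptic strengthening."""
    q = 0.0          # accumulated charge: the device's state variable
    resistances = []
    for v in voltages:
        x = min(abs(q) / q_max, 1.0)      # state fraction in [0, 1]
        r = r_off + (r_on - r_off) * x    # interpolate between limits
        i = v / r                         # Ohm's law at the current state
        q += i * dt                       # charge integrates the current
        resistances.append(r)
    return resistances

# Repeated identical voltage pulses drive resistance down, analogous to
# repeated activity strengthening a synapse ("potentiation").
rs = simulate_memristor([1.0] * 500)
print(rs[0], rs[-1])
```

Unlike a transistor or capacitor circuit built to imitate a synapse, the memory here is intrinsic: the state variable `q` persists between pulses, which is why a single device can stand in for a synaptic weight.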
By GRETCHEN REYNOLDS Before you skip another workout, you might think about your brain. A provocative new study finds that some of the benefits of exercise for brain health may evaporate if we take to the couch and stop being active, even just for a week or so. I have frequently written about how physical activity, especially endurance exercise like running, aids our brains and minds. Studies with animals and people show that working out can lead to the creation of new neurons, blood vessels and synapses and greater overall volume in areas of the brain related to memory and higher-level thinking. Presumably as a result, people and animals that exercise tend to have sturdier memories and cognitive skills than their sedentary counterparts. Exercise prompts these changes in large part by increasing blood flow to the brain, many exercise scientists believe. Blood carries fuel and oxygen to brain cells, along with other substances that help to jump-start desirable biochemical processes there, so more blood circulating in the brain is generally a good thing. Exercise is particularly important for brain health because it appears to ramp up blood flow through the skull not only during the actual activity, but throughout the rest of the day. In past neurological studies, when sedentary people began an exercise program, they soon developed augmented blood flow to their brains, even when they were resting and not running or otherwise moving. But whether those improvements in blood flow are permanent or how long they might last was not clear. So for the new study, which was published in August in Frontiers in Aging Neuroscience, researchers from the department of kinesiology at the University of Maryland in College Park decided to ask a group of exceedingly fit older men and women to stop exercising for a while. © 2016 The New York Times Company
Ramin Skibba. Physiologist Ivan Pavlov conditioned dogs to associate food with the sound of a buzzer, which left them salivating. Decades later, researchers discovered such training appears to block efforts to teach the animals to link other stimuli to the same reward. Dogs trained to expect food when a buzzer sounds can then be conditioned to salivate when they are exposed to the noise and a flash of light simultaneously. But light alone will not cue them to drool. This ‘blocking effect’ is well-known in psychology, but new research suggests that the concept might not be so simple. Psychologists in Belgium failed to replicate the effect in 15 independent experiments, they report this month in the Journal of Experimental Psychology. “For a long time, you tend to think, ‘It’s me — I’m doing something wrong, or messing up the experiment,’” says lead author Tom Beckers, a psychologist at the Catholic University of Leuven (KU Leuven) in Belgium. But after his student, co-author Elisa Maes, also could not replicate the blocking effect, and the team failed again in experiments in other labs, Beckers realized that “it can’t just be us”. The scientists do not claim that the blocking effect is not real, or that previous observations of it are wrong. Instead, Beckers thinks that psychologists do not yet know enough about the precise conditions under which it applies. © 2016 Macmillan Publishers Limited
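The textbook explanation of blocking comes from the Rescorla-Wagner model: learning is driven by prediction error, so once the buzzer fully predicts the food, the compound buzzer-plus-light trials generate almost no error for the light to absorb. A minimal simulation (learning rate and trial counts are illustrative, not taken from any of the experiments above):

```python
# Rescorla-Wagner simulation of the blocking effect.
# Each cue's associative strength V is updated by the shared prediction
# error: delta_V = alpha * (lam - sum of V over cues present on the trial).

def train(trials, V, alpha=0.3, lam=1.0):
    """trials: list of sets of cues present; reward of magnitude lam
    is delivered on every trial."""
    for cues in trials:
        v_total = sum(V[c] for c in cues)  # summed prediction of all cues
        error = lam - v_total              # prediction error on this trial
        for c in cues:
            V[c] += alpha * error          # present cues share the update
    return V

V = {"buzzer": 0.0, "light": 0.0}

# Phase 1: buzzer alone is paired with food; it comes to predict the reward.
train([{"buzzer"}] * 50, V)

# Phase 2: buzzer + light compound is paired with the same food.
train([{"buzzer", "light"}] * 50, V)

# The light gains almost no associative strength: it has been "blocked",
# because the buzzer already left no prediction error to learn from.
print(round(V["buzzer"], 3), round(V["light"], 3))
```

Under this account, any experimental condition that leaves residual prediction error during the compound phase (weaker pretraining, changed reward magnitude) should weaken or abolish blocking, which is one reason the replication failures above point to unmapped boundary conditions rather than a simple yes-or-no effect.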
Keyword: Learning & Memory
Link ID: 22701 - Posted: 09.27.2016
By Abdul-Kareem Ahmed In the world of recreational and professional sports, many athletes—particularly in contact sports—suffer concussions. These mild traumatic brain injuries cause headaches, memory problems and confusion, but usually resolve on their own with rest. Some players, however, especially after repeated concussions, continue to experience symptoms for many months—a phenomenon termed post-concussion syndrome. A few of these players will eventually develop chronic traumatic encephalopathy (CTE), a progressive neurodegenerative disease that causes dementia symptoms similar to Alzheimer’s disease. CTE can lead to personality changes, movement problems and, sometimes, mortality. CTE is diagnosed after death because it requires postmortem examination of a player’s brain. Post-concussion syndrome, in contrast, is diagnosed based on patient symptoms. To date, doctors do not have any objective tests to determine syndrome severity or relate it to the risk of developing CTE. Now, a group of researchers from Sweden and the U.K. say they have developed such a test, reporting their findings last week in JAMA Neurology. The test measures biomarkers in the cerebrospinal fluid—the colorless liquid that supports and suspends the brain and spinal cord—that appear to provide a measure of concussion severity and CTE risk. The researchers collected cerebrospinal fluid via spinal taps from 16 professional Swedish ice hockey players and a similar number of healthy individuals. The hockey players had all experienced post-concussion syndrome, causing nine of them to retire from the game. © 2016 Scientific American,
By KEN BELSON One of the frustrations of researchers who study chronic traumatic encephalopathy, the degenerative brain disease linked to repeated head hits, is that it can be detected only in autopsies, and not in the living. Researchers, though, have been trying to solve this problem in two primary ways: by identifying biomarkers linked to the disease that show up on imaging tests in certain locations in the brain, and by trying to locate in the blood the protein that is the hallmark of the disease. On Monday, two groups of researchers said they had made what they considered small steps in developing both methods. The announcements are small parts of much larger studies that will take years to bear fruit, if they ever do. Both methods have been questioned by detractors, some of whom say the hype is getting ahead of the science. Scientists, these critics note, have spent decades trying to find ways to accurately diagnose Alzheimer’s disease, which has some of the same characteristics as C.T.E. Still, at a medical conference in Boston on Monday, Robert Stern, a professor of neurology at Boston University, said technology developed by the company Quanterix (paid for in part with a grant from the N.F.L.) had identified elevated levels of tau proteins in blood samples of 96 former football players between 40 and 69 years old, compared with only 25 people of the same age in a control group. The results, which are part of a seven-year study and are under review for publication, are preliminary because they identify only the total amount of tau in the blood, not the amount of the specific tau linked to C.T.E. Additional tests are being done in Sweden to determine the amount of the C.T.E.-related tau in the blood samples, Stern said. Even so, Stern said, the blood samples from the 96 former players suggest that absorbing repeated head hits earlier in life can lead to higher concentrations of tau in the blood later. © 2016 The New York Times Company
By CONOR DOUGHERTY SAN FRANCISCO — Every now and again, when I’m feeling a little down, I go to Baseball-Reference.com and look up the San Francisco Giants’ box score from July 29, 2012. It’s an odd choice for a Giants fan. The Los Angeles Dodgers won, 4-0, completing a weekend sweep in which they outscored the Giants by 19-3 and tied them for the lead in the National League West. The Giants went on to win the World Series that year, but that’s not why I remember the July 29 game. I remember that afternoon because my mom, in the throes of Alzheimer’s, left the house she shared with my dad in the Noe Valley neighborhood, walked four or so miles and somehow ended up at AT&T Park. Then she went inside and watched her team. It took a while for me to believe this. When Mom told me she had gone to the park — my dad barely watches baseball, so the Giants have always been a thing between me and Mom — I assumed it was an old memory misplaced on a new day. But it turned out that Sunday game did overlap with the hours she had been out, and a month or so later my dad got a credit card bill with the charge for the ticket. I can’t tell you when Mom cheered or if she managed to find her seat. All I know is Clayton Kershaw struck out seven, the Giants had five hits, and even though I’ve committed these statistics to memory, I still like looking them up. On the chance that this hasn’t been clubbed into your head by now, the Giants have won the World Series in every even-numbered year this decade. And for reasons that I choose to see as cosmic, this run of baseball dominance has tracked my mom’s descent into Alzheimer’s. The disease doesn’t take people from you in a day or a week or a season. You get years of steady disappearance, with an indeterminate end. So for me and Mom and baseball, this decade has been a long goodbye. © 2016 The New York Times Company
Link ID: 22690 - Posted: 09.24.2016
By David Z. Hambrick, Fredrik Ullén, Miriam Mosing Elite-level performance can leave us awestruck. This summer, in Rio, Simone Biles appeared to defy gravity in her gymnastics routines, and Michelle Carter seemed to harness super-human strength to win gold in the shot put. Michael Phelps, meanwhile, collected 5 gold medals, bringing his career total to 23. In everyday conversation, we say that elite performers like Biles, Carter, and Phelps must be “naturals” who possess a “gift” that “can’t be taught.” What does science say? Is innate talent a myth? This question is the focus of the new book Peak: Secrets from the New Science of Expertise by Florida State University psychologist Anders Ericsson and science writer Robert Pool. Ericsson and Pool argue that, with the exception of height and body size, the idea that we are limited by genetic factors—innate talent—is a pernicious myth. “The belief that one’s abilities are limited by one’s genetically prescribed characteristics....manifests itself in all sorts of ‘I can’t’ or ‘I’m not’ statements,” Ericsson and Pool write. The key to extraordinary performance, they argue, is “thousands and thousands of hours of hard, focused work.” To make their case, Ericsson and Pool review evidence from a wide range of studies demonstrating the effects of training on performance. In one study, Ericsson and his late colleague William Chase found that, through over 230 hours of practice, a college student was able to increase his digit span—the number of random digits he could recall—from a normal 7 to nearly 80. In another study, the Japanese psychologist Ayako Sakakibara enrolled 24 children from a private Tokyo music school in a training program designed to train “perfect pitch”—the ability to name the pitch of a tone without hearing another tone for reference. With a trainer playing a piano, the children learned to identify chords using colored flags—for example, a red flag for CEG and a green flag for DGH. 
Then, the children were tested on their ability to identify the pitches of individual notes until they reached a criterion level of proficiency. By the end of the study, the children had seemed to acquire perfect pitch. Based on these findings, Ericsson and Pool conclude that the “clear implication is that perfect pitch, far from being a gift bestowed upon only a lucky few, is an ability that pretty much anyone can develop with the right exposure and training.” © 2016 Scientific American
Laura Sanders. In growing brains, billions of nerve cells must make trillions of precise connections. As they snake through the brain, nerve cell tendrils called axons use the brain’s stiffness to guide them on their challenging journey, a study of frog nerve cells suggests. The results, described online September 19 in Nature Neuroscience, show that along with chemical guidance signals, the brain’s physical properties help shape its connections. That insight may be key to understanding how nerve cells wire the brain, says study coauthor Kristian Franze. “I strongly believe that it’s not enough to look at chemistry,” says Franze, a mechanobiologist at the University of Cambridge. “We need to look at environmental factors, too.” The notion that physical features help guide axons is gaining momentum, says neuroscientist Samantha Butler of UCLA. “It’s a really intriguing study.” A better understanding of how nerve cells find their targets could help scientists coax new cells to grow after a spinal cord injury or design better materials for nerve cell implants. Franze and colleagues studied nerve cells from the retina of frogs. Experiments on cells in dishes suggested that axons, signal-transmitting tendrils led by tiny pioneering structures called growth cones, grew differently on hard and soft material. Axons grew longer and straighter on stiff surfaces and seemed to meander more on softer material. © Society for Science & the Public 2000 - 2016.
Keyword: Development of the Brain
Link ID: 22672 - Posted: 09.20.2016
By DAVID Z. HAMBRICK and ALEXANDER P. BURGOYNE ARE you intelligent — or rational? The question may sound redundant, but in recent years researchers have demonstrated just how distinct those two cognitive attributes actually are. It all started in the early 1970s, when the psychologists Daniel Kahneman and Amos Tversky conducted an influential series of experiments showing that all of us, even highly intelligent people, are prone to irrationality. Across a wide range of scenarios, the experiments revealed, people tend to make decisions based on intuition rather than reason. In one study, Professors Kahneman and Tversky had people read the following personality sketch for a woman named Linda: “Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they asked the subjects which was more probable: (A) Linda is a bank teller or (B) Linda is a bank teller and is active in the feminist movement. Eighty-five percent of the subjects chose B, even though logically speaking, A is more probable. (All feminist bank tellers are bank tellers, though some bank tellers may not be feminists.) In the Linda problem, we fall prey to the conjunction fallacy — the belief that the co-occurrence of two events is more likely than the occurrence of one of the events. In other cases, we ignore information about the prevalence of events when judging their likelihood. We fail to consider alternative explanations. We evaluate evidence in a manner consistent with our prior beliefs. And so on. Humans, it seems, are fundamentally irrational. But starting in the late 1990s, researchers began to add a significant wrinkle to that view. As the psychologist Keith Stanovich and others observed, even the Kahneman and Tversky data show that some people are highly rational. 
In other words, there are individual differences in rationality, even if we all face cognitive challenges in being rational. So who are these more rational people? Presumably, the more intelligent people, right? © 2016 The New York Times Company
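The logic behind the Linda problem is worth making concrete: a conjunction can never be more probable than either of its conjuncts, because every feminist bank teller is, by definition, also a bank teller. A quick Monte Carlo check, using made-up probabilities purely for illustration:

```python
# The conjunction rule: P(A and B) <= P(A), for any probabilities at all.
# The two probabilities below are hypothetical numbers for the Linda
# scenario, chosen only to illustrate the rule.
import random

random.seed(0)
p_teller = 0.05    # hypothetical: Linda is a bank teller
p_feminist = 0.90  # hypothetical: Linda is active in the feminist movement

n = 100_000
teller = feminist_teller = 0
for _ in range(n):
    is_teller = random.random() < p_teller
    is_feminist = random.random() < p_feminist
    teller += is_teller
    feminist_teller += is_teller and is_feminist  # counted only if BOTH hold

# Option B (teller AND feminist) can never outnumber option A (teller),
# since every case counted for B was also counted for A.
print(teller / n, feminist_teller / n)
```

The fallacy is seductive precisely because the description of Linda makes "feminist" feel representative; intuition ranks the richer story as more likely even though the arithmetic forbids it.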
By Catherine Caruso Most of us think little of hopping on Google Maps to look at everything from a bird’s-eye view of an entire continent to an on-the-ground view of a specific street, all carefully labeled. Thanks to a digital atlas published this week, the same is now possible with the human brain. Ed Lein and colleagues at the Allen Institute for Brain Science in Seattle have created a comprehensive, open-access digital atlas of the human brain, which was published this week in The Journal of Comparative Neurology. “Essentially what we were trying to do is to create a new reference standard for a very fine anatomical structural map of the complete human brain,” says Lein, the principal investigator on the project. “It may seem a little bit odd, but actually we are a bit lacking in types of basic reference materials for mapping the human brain that we have in other organisms like mouse or like monkey, and that is in large part because of the enormous size and complexity of the human brain.” The project, which spanned five years, focused on a single healthy postmortem brain from a 34-year-old woman. The researchers started with the big picture: They did a complete scan of the brain using two imaging techniques (magnetic resonance imaging and diffusion weighted imaging), which allowed them to capture both overall brain structure and the connectivity of brain fibers. Next the researchers took the brain and sliced it into 2,716 very thin sections for fine-scale, cellular analysis. They stained a portion of the sections with a traditional Nissl stain to gather information about general cell architecture. They then used two other stains to selectively label certain aspects of the brain, including structural elements of cells, fibers in the white matter, and specific types of neurons. © 2016 Scientific American
By Brian Owens It’s certainly something to crow about. New Caledonian crows are known for their ingenious use of tools to get at hard-to-reach food. Now it turns out that their Hawaiian cousins are adept tool-users as well. Christian Rutz at the University of St Andrews in the UK has spent 10 years studying the New Caledonian crow and wondered whether any other crow species are disposed to use tools. So he looked for crows that have similar features to the New Caledonian crow – a straight bill and large, mobile eyes that allow it to manipulate tools, much as archaeologists use opposable thumbs as an evolutionary signature for tool use in early humans. “The Hawaiian crow really stood out,” he says. “They look quite similar.” Hawaiian crows are extinct in the wild, but 109 birds still live in two captive breeding facilities in Hawaii. That meant Rutz was able to test pretty much every member of the species. He stuffed tasty morsels into a variety of holes and crevices in a log, and gave the birds a variety of sticks to see if they would use them to dig out the food. Almost all of them did, and most extracted the food in less than a minute, faster than the researchers themselves could. “It’s mind-blowing,” says Rutz. “They’re very good at getting the tool in the right position, and if they’re not happy with it they’ll modify it or make their own.” © Copyright Reed Business Information Ltd.
By Julia Shaw The brain, with its 100 billion neurons, allows us to do amazing things like learn multiple languages, or build things that send people into outer space. Yet despite this astonishing capacity, we routinely can’t remember where we put our keys, we forget why we went to the grocery store, and we fail when trying to recall personal life events. This apparent contradiction in functionality opens up the question of why we forget some things but remember others. Or, more fundamentally, what causes forgetting? This week my book ‘The Memory Illusion’ drops in Canada, and as a Canadian girl I want to celebrate this by showcasing some Canadian researchers who have given us insight into precisely this question. An article published recently in Psychological Science by Talya Sadeh and colleagues at the Rotman Research institute in Toronto addresses a long-running debate in the world of memory science; do we forget things because of decay or interference? Decay. Advocates of the decay account posit that our memories slowly disappear, fading because of a passage of time during which they have not been accessed. You can picture this much like a message written in sand, with every ocean wave that flows over the shore making the writing less legible until it eventually disappears entirely. The sand represents the web of brain cells that form a memory in the brain, and the ocean waves represent time passing. © 2016 Scientific American,
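The two accounts in that debate make different predictions: under decay, forgetting is a function of elapsed time alone; under interference, it depends on how much similar material is encoded in between, however long the interval. The toy contrast below is my own illustration of that distinction, not the model from the Sadeh paper, and the functional forms and constants are arbitrary.

```python
# Toy contrast of the two classic accounts of forgetting (illustration only;
# the functional forms and constants here are arbitrary).
import math

def strength_decay(t_elapsed, tau=10.0):
    """Decay account: memory strength fades with time, regardless of
    what happens during the interval (the 'ocean waves' in the analogy)."""
    return math.exp(-t_elapsed / tau)

def strength_interference(n_similar_items, k=0.5):
    """Interference account: strength falls with the number of similar
    memories encoded afterwards, regardless of elapsed time."""
    return 1.0 / (1.0 + k * n_similar_items)

# Same elapsed time, different amounts of interfering material:
quiet_wait = strength_interference(0)   # nothing similar encoded afterwards
busy_wait = strength_interference(20)   # many similar items encoded

print(strength_decay(0.0), strength_decay(20.0))
print(quiet_wait, busy_wait)
```

Experiments that hold time constant while varying the amount of intervening similar material (or vice versa) are what let researchers pull these two curves apart.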
Keyword: Learning & Memory
Link ID: 22651 - Posted: 09.13.2016
Laura Sanders. By sneakily influencing brain activity, scientists changed people’s opinions of faces. This covert neural sculpting relied on a sophisticated brain training technique in which people learn to direct their thoughts in specific ways. The results, published September 8 in PLOS Biology, support the idea that neurofeedback methods could help reveal how the brain’s behavior gives rise to perceptions and emotions. What’s more, the technique may ultimately prove useful for easing traumatic memories and treating disorders such as depression. The research is still at an early stage, says neurofeedback researcher Michelle Hampson of Yale University, but, she notes, “I think it has great promise.” Takeo Watanabe of Brown University and colleagues used functional MRI to measure people’s brain activity in an area called the cingulate cortex as participants saw pictures of faces. After participants had rated each face, a computer algorithm sorted their brain responses into patterns that corresponded to faces they liked and faces they disliked. With this knowledge in hand, the researchers then attempted to change people’s face preferences by subtly nudging brain activity in the cingulate cortex. In step 2 of the experiment, returning to the fMRI scanner, participants saw an image of a face that they had previously rated as neutral. Just after that, they were shown a disk. The goal, the participants were told, was simple: make the disk bigger by using their brains. They had no idea that the only way to make the disk grow was to think in a very particular way. © Society for Science & the Public 2000 - 2016
By GRETCHEN REYNOLDS A busy brain can mean a hungry body. We often seek food after focused mental activity, like preparing for an exam or poring over spreadsheets. Researchers speculate that heavy bouts of thinking drain energy from the brain, whose capacity to store fuel is very limited. So the brain, sensing that it may soon require more calories to keep going, apparently stimulates bodily hunger, and even though there has been little in the way of physical movement or caloric expenditure, we eat. This process may partly account for the weight gain so commonly seen in college students. Scientists at the University of Alabama at Birmingham and another institution recently experimented with exercise to counter such post-study food binges. Gary Hunter, an exercise physiologist at U.A.B., oversaw the study, which was published this month in the journal Medicine & Science in Sports & Exercise. Hunter notes that strenuous activity both increases the amount of blood sugar and lactate — a byproduct of intense muscle contractions — circulating in the blood and augments blood flow to the head. Because the brain uses sugar and lactate as fuel, researchers wondered if the increased flow of fuel-rich blood during exercise could feed an exhausted brain and reduce the urge to overeat. Thirty-eight healthy college students were invited to U.A.B.’s exercise lab to determine their fitness and metabolic rates — and to report what their favorite pizza was. Afterward, they sat quietly for 35 minutes before being given as much of their favorite pizza as they wanted, which established a baseline measure of self-indulgence. At a later date, the volunteers returned and spent 20 minutes tackling selections from college and graduate-school entrance exams. Hunter says this work has been used in other studies “to induce mental fatigue and hunger.” Next, half the students sat quietly for 15 minutes, before being given pizza. 
The rest of the volunteers spent those 15 minutes doing intervals on a treadmill: two minutes of hard running followed by about one minute of walking, repeated five times. This is the sort of brief but intensive routine, Hunter says, that should prompt the release of sugar and lactate into the bloodstream. These students were then allowed to gorge on pizza, too. But by and large, they did not overeat. © 2016 The New York Times Company
By Karen Zusi At least one type of social learning, or the ability to learn from observing others’ actions, is processed by individual neurons within a region of the human brain called the rostral anterior cingulate cortex (rACC), according to a study published today (September 6) in Nature Communications. The work is the first direct analysis in humans of the neuronal activity that encodes information about others’ behavior. “The idea [is] that there could be an area that’s specialized for processing things about other people,” says Matthew Apps, a neuroscientist at the University of Oxford who was not involved with the study. “How we think about other people might use distinct processes from how we might think about ourselves.” During the social learning experiments, the University of California, Los Angeles (UCLA) and CalTech–based research team recorded the activity of individual neurons in the brains of epilepsy patients. The patients were undergoing a weeks-long procedure at the Ronald Reagan UCLA Medical Center in which their brains were implanted with electrodes to locate the origin of their epileptic seizures. Access to this patient population was key to the study. “It’s a very rare dataset,” says Apps. “It really does add a lot to the story.” With data streaming out of the patients’ brains, the researchers taught the subjects to play a card game on a laptop. Each turn, the patients could select from one of two decks of face-down cards: the cards either gave $10 or $100 in virtual winnings, or subtracted $10 or $100. In one deck, 70 percent of the cards were winning cards, while in the other only 30 percent were. The goal was to rack up the most money. © 1986-2016 The Scientist
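The card task described above is, in effect, a two-armed bandit: one deck pays off 70% of the time, the other 30%, with cards worth ±$10 or ±$100. The deck structure below follows the article; the simple win-stay/lose-shift player is my own illustration of how a learner can discover the good deck from feedback, not the analysis used in the study.

```python
# Sketch of the two-deck card task: one deck wins 70% of the time, the
# other 30%; each card pays or costs $10 or $100 (values from the article).
# The win-stay/lose-shift learner is an illustrative strategy only.
import random

random.seed(1)

def draw(p_win):
    """Turn over one card: gain or lose $10 or $100, winning with
    probability p_win."""
    amount = random.choice([10, 100])
    return amount if random.random() < p_win else -amount

def play(n_turns=1000):
    decks = {"good": 0.7, "bad": 0.3}
    choice, total = "bad", 0          # start on the bad deck, worst case
    for _ in range(n_turns):
        payoff = draw(decks[choice])
        total += payoff
        if payoff < 0:                # lose-shift: switch decks after a loss
            choice = "good" if choice == "bad" else "bad"
        # win-stay: keep the same deck after a win
    return total

print(play())
```

Even this crude rule spends most turns on the good deck, because losses, which are more common on the bad deck, drive the player away from it; observing one's own outcomes in this way is the "self" signal against which the rACC neurons' encoding of other players' choices was compared.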
Hannah Devlin, Science correspondent. Babies born by caesarean section are more likely to be obese as adults, according to a study that suggests the way we are born could have a lasting impact on health. Birth by caesarean was linked to a 15% higher risk of obesity in children compared with vaginal birth. The scientists involved believe that babies born by caesarean miss out on exposure to bacteria in the birth canal that colonise the baby’s gut and may ultimately change the body’s metabolic rate - and even how hungry we feel. Audrey Gaskins, an epidemiologist at Harvard University and co-author of the new study, said: “Children born via C-section harbour less diverse gut bacteria and these patterns of less diversity have been linked to increased capacity for energy harvest by the gut microbiota. You can think of it as a slower metabolism.” Previous studies have found the same link, but were less able to rule out other factors, such as the mother’s weight or health. The latest research, which included 22,068 children born to 15,271 women, suggests that the link is not simply explained by overweight women or those with pregnancy complications such as high blood pressure being more likely to deliver by caesarean. The link remained after maternal weight was taken into account, and was more striking when siblings who had different types of births were compared. Within families, children born by caesarean were 64% more likely to be obese than their siblings born by vaginal delivery. “With siblings, they have the same mother and home environment so the genetics, the feeding environment, are all controlled for,” said Dr Gaskins. © 2016 Guardian News and Media Limited
By JANE E. BRODY As a woman of a certain age who consumes a well-balanced diet of all the usual food groups, including reasonable amounts of animal protein, I tend to dismiss advice to take a multivitamin supplement. I’ve been told repeatedly by nutrition experts that the overuse of dietary supplements for “nutritional insurance” has given Americans the most expensive urine in the world. I do take a daily supplement of vitamin D, based on considerable evidence of its multiple health benefits, especially for older people. However, based on advice from the National Academy of Medicine and an examination of accumulating research, I’m prompted to consider also taking a vitamin B12 supplement in hopes of protecting my aging brain. Animal protein foods — meat, fish, milk, cheese and eggs — are the only reliable natural dietary sources of B12, and I do get ample amounts of several in my regular diet. But now at age 75, I wonder whether I’m still able to reap the full benefit of what I ingest. You see, the ability to absorb B12 naturally present in foods depends on the presence of adequate stomach acid, the enzyme pepsin and a gastric protein called intrinsic factor to release the vitamin from the food protein it is attached to. Only then can the vitamin be absorbed by the small intestine. As people age, acid-producing cells in the stomach may gradually cease to function, a condition called atrophic gastritis. A century ago, researchers discovered that some people — most likely including Mary Todd Lincoln — had a condition called pernicious anemia, a deficiency of red blood cells ultimately identified as an autoimmune disease that causes a loss of stomach cells needed for B12 absorption. Mrs. Lincoln was known to behave erratically and was ultimately committed to a mental hospital. © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 22634 - Posted: 09.06.2016
By Jesse Singal Back in 2014, a bigoted African leader put J. Michael Bailey, a psychologist at Northwestern, in a strange position. Yoweri Museveni, the president of Uganda, had been issuing a series of anti-gay tirades, and — partially fueled by anti-gay religious figures from the U.S. — was considering toughening Uganda’s anti-gay laws. The rhetoric was getting out of control: “The commercialisation of homosexuality is unacceptable,” said Simon Lokodo, Uganda’s ethics minister. “If they were doing it in their own rooms we wouldn’t mind, but when they go for children, that’s not fair. They are beasts of the forest.” Eventually, Museveni said he would table the idea of new legislation until he better understood the science of homosexuality, and agreed to lay off Uganda’s LGBT population if someone could prove to him homosexuality was innate. That’s where Bailey comes in: He’s a leading sex researcher who has published at length on the question of where sexual orientation comes from. LGBT advocates began reaching out to him to explain the science of homosexuality and, presumably, denounce Museveni for his hateful rhetoric. But “I had issues with rushing out a scientific statement that homosexuality is innate,” he said in an email, because he’s not sure that’s quite accurate. While he did write articles, such as an editorial in New Scientist, explaining why he thought Museveni’s position didn’t make sense, he stopped short of calling homosexuality innate. He also realized that in light of some recent advances in the science of sexual orientation, it was time to publish an article summing up the current state of the field — gathering together all that was broadly agreed-upon about the nature and potential origins of sexual orientation. (In the meantime, Museveni did end up signing the anti-gay legislation, justifying his decision by reasoning that homosexuality “was learned and could be unlearned.”) © 2016, New York Media LLC.
Laura Sanders An experimental drug swept sticky plaques from the brains of a small number of people with Alzheimer’s disease over the course of a year. And preliminary results hint that this cleanup may have staved off mental decline. News about the new drug, an antibody called aducanumab, led to excitement as it trickled out of recent scientific meetings. A paper published online August 31 in Nature offers a more comprehensive look at the drug’s effects. “Overall, this is the best news that we’ve had in my 25 years doing Alzheimer’s clinical research,” study coauthor Stephen Salloway of Brown University said August 30 at a news briefing. “It brings new hope for patients and families most affected by the disease.” The results are the most convincing evidence yet that an antibody can reduce amyloid in the brain, says Alzheimer’s researcher Rachelle Doody of Baylor College of Medicine in Houston, who was not involved in the study. Still, experts caution that the results come from 165 people, a relatively small number. The seemingly beneficial effects could disappear in larger clinical trials, which are under way. “These new data are tantalizing, but they are not yet definitive,” says neuroscientist John Hardy of University College London. Like some other drug candidates for Alzheimer’s, aducanumab is an antibody that targets amyloid-beta, a sticky protein that accumulates in the brains of people with the disease. Delivered by intravenous injection, aducanumab appeared to get inside the brains of people with mild Alzheimer’s (average age about 73) and destroy A-beta plaques, the results suggest. After a year of exposure to the drug, A-beta levels had dropped. This reduction depended on the dose — the more drug, the bigger the decline in A-beta. In fact, people on the highest dose of the drug had almost no A-beta plaques in their brains after a year. © Society for Science & the Public 2000 - 2016.
Link ID: 22621 - Posted: 09.01.2016