Chapter 13. Memory, Learning, and Development




By Helen Thomson DON’T mind the gap. A woman has reached the age of 24 without anyone realising she was missing a large part of her brain. The case highlights just how adaptable the organ is. The discovery was made when the woman was admitted to the Chinese PLA General Hospital of Jinan Military Area Command in Shandong Province complaining of dizziness and nausea. She told doctors she’d had problems walking steadily for most of her life, and her mother reported that she hadn’t walked until she was 7 and that her speech only became intelligible at the age of 6. Doctors did a CAT scan and immediately identified the source of the problem – her entire cerebellum was missing. The space where it should be was empty of tissue. Instead it was filled with cerebrospinal fluid, which cushions the brain and provides defence against disease. The cerebellum – sometimes known as the “little brain” – is located underneath the two hemispheres. It looks different from the rest of the brain because it consists of much smaller and more compact folds of tissue. It represents about 10 per cent of the brain’s total volume but contains 50 per cent of its neurons. Although it is not unheard of to have part of your brain missing, either congenitally or from surgery, the woman joins an elite club of just nine people who are known to have lived without their entire cerebellum. A detailed description of how the disorder affects a living adult is almost non-existent, say doctors from the Chinese hospital, because most people with the condition die at a young age and the problem is only discovered on autopsy (Brain, doi.org/vh7). © Copyright New Scientist Ltd.

Keyword: Development of the Brain
Link ID: 24056 - Posted: 09.12.2017

Laura Sanders Peer inside the brain of someone learning. You might be lucky enough to spy a synapse pop into existence. That physical bridge between two nerve cells seals new knowledge into the brain. As new information arrives, synapses form and strengthen, while others weaken, making way for new connections. You might see more subtle changes, too, like fluctuations in the levels of signaling molecules, or even slight boosts in nerve cell activity. Over the last few decades, scientists have zoomed in on these microscopic changes that happen as the brain learns. And while that detailed scrutiny has revealed a lot about the synapses that wire our brains, it isn’t enough. Neuroscientists still lack a complete picture of how the brain learns. They may have been looking too closely. When it comes to the neuroscience of learning, zeroing in on synapse action misses the forest for the trees. A new, zoomed-out approach attempts to make sense of the large-scale changes that enable learning. By studying the shifting interactions between many different brain regions over time, scientists are beginning to grasp how the brain takes in new information and holds onto it. These kinds of studies rely on powerful math. Brain scientists are co-opting approaches developed in other network-based sciences, borrowing tools that reveal in precise, numerical terms the shape and function of the neural pathways that shift as human brains learn. © Society for Science & the Public 2000 - 2017.

Keyword: Learning & Memory
Link ID: 24041 - Posted: 09.06.2017

Laurel Hamers Zika’s damaging neurological effects might someday be enlisted for good — to treat brain cancer. In human cells and in mice, the virus infected and killed the stem cells that become a glioblastoma, an aggressive brain tumor, but left healthy brain cells alone. Jeremy Rich, a regenerative medicine scientist at the University of California, San Diego, and colleagues report the findings online September 5 in the Journal of Experimental Medicine. Previous studies had shown that Zika kills stem cells that generate nerve cells in developing brains (SN: 4/2/16, p. 26). Because of similarities between those neural precursor cells and stem cells that turn into glioblastomas, Rich’s team suspected the virus might also target the cells that cause the notoriously deadly type of cancer. In the United States, about 12,000 people are expected to be diagnosed with glioblastoma in 2017. (It’s the type of cancer U.S. Senator John McCain was found to have in July.) Even with treatment, most patients live only about a year after diagnosis, and tumors frequently recur. In cultures of human cells, Zika infected glioblastoma stem cells and halted their growth, Rich and colleagues report. The virus also infected full-blown glioblastoma cells but at a lower rate, and didn’t infect normal brain tissues. Zika-infected mice with glioblastoma either saw their tumors shrink or their tumor growth slow compared with uninfected mice. The virus-infected mice lived longer, too. In one trial, almost half of the mice survived more than six weeks after being infected with Zika, while all of the uninfected mice died within two weeks of receiving a placebo. © Society for Science & the Public 2000 - 2017.

Keyword: Glia
Link ID: 24039 - Posted: 09.06.2017

By Michael Le Page We are still evolving – very slowly. In the 20th century, people in the UK evolved to be less likely to smoke heavily, but the effect was tiny. So claims a study of 200,000 genomes. A population can be described as evolving when the frequency of gene variants changes over time. Because most people in rich countries now live well beyond reproductive age, some argue that we have stopped evolving because natural selection has been weakened. But several recent studies claim we are still evolving, albeit slowly. Now Joseph Pickrell at Columbia University in New York and his team have analysed human genome sequences to spot gene variants that are becoming rarer. One variant, of a gene called CHRNA3, is associated with heavier smoking in those that smoke, raising their risk of a smoking-related death. Comparing people over the age of 80 with people over the age of 60, Pickrell estimates that the variant has declined by 1 per cent between generations. However, his team was not able to prove this, as they did not have any genomic data from people under the age of 40. A variant of the ApoE4 gene that is known to increase the risk of late-onset Alzheimer’s disease, as well as cardiovascular disease, may also be getting rarer. © Copyright New Scientist Ltd.

Keyword: Alzheimers; Genes & Behavior
Link ID: 24038 - Posted: 09.06.2017

Anna Vlasits A sheen is starting to appear on Rocky Blumhagen’s forehead, just below his gray hair. He’s marching in place in a starkly lit room decked out with two large flatscreens. On both of the TVs, a volcano lets off steam through wide cracks glowing with lava, their roar muffling the Andean percussion and flutes on the soundtrack. Golden coins slide across the screen. Rocky reaches out his left hand, as if to grasp a coin from midair, and one of them disappears with a brrring. “I don’t know if I can do it,” he says to a guy named Josh sitting nearby in a felt-covered lounge chair. He looks up from his iPad, watching Rocky, age 66, grab, jog, kick, and reach his way through the videogame. “Keep it up,” Josh says as the heart monitor in the corner of the screen reads 129. Rocky and research assistant Josh Volponi are technically in a lab clinic at the University of California, San Francisco, but aside from the mannequin heads studded with electrodes, the room looks more like a man cave. But here, the videogames could halt the mental decay of aging. This is the premise that the university’s new research institute, named Neuroscape, was built to test. This is Rocky’s 18th training session at Neuroscape, founded by neuroscientist Adam Gazzaley. Rocky is fit for his age—he works as a substitute yoga instructor, after retiring from careers producing radio and performing Cole Porter songs—but as he makes it to the end of the level, he looks exhausted. The game cuts to an animation of a jungle, birds chirping and light playing through the canopy as a list of his past scores pops up. This round wasn’t his best. “I haven’t been here for a week,” he says. Volponi asks him to rate his physical exertion level. Rocky gives it a 15 out of 20; Volponi marks it on the iPad. “I feel rusty,” he says, wiping his hands on his orange exercise shorts.

Keyword: Alzheimers; Learning & Memory
Link ID: 24033 - Posted: 09.04.2017

Robin McKie Science Editor People who use genetic tests to trace their ancestry only to discover that they are at risk of succumbing to an incurable illness are being left to suffer serious psychological problems. Dementia researchers say the problem is particularly acute for those found to be at risk of Alzheimer’s disease, which has no cure or effective treatment. Yet these people are stumbling upon their status inadvertently after trying to find their Viking, Asian or ancient Greek roots. “These tests have the potential to cause great distress,” said Anna Middleton, head of society and ethics research at the Wellcome Genome Campus in Cambridge. “Companies should make counselling available, before and after people take tests.” The issue is raised in a paper by Middleton and others in the journal Future Medicine. A similar warning was sounded by Louise Walker, research officer at the Alzheimer’s Society. “Everyone has a right to know about their risk if they want to, but these companies have a moral responsibility to make sure people understand the meaning and consequences of this information. Anyone considering getting genetic test results should do so with their eyes open.” Alzheimer’s is linked to the build-up in the brain of clumps of a protein called amyloid. This triggers severe memory loss, confusion and disorientation. One gene, known as ApoE, affects this process and exists in three variants: E2, E3 and E4. Those possessing the last of these face an increased chance of getting the disease in late life. © 2017 Guardian News and Media Limited

Keyword: Alzheimers; Genes & Behavior
Link ID: 24031 - Posted: 09.04.2017

By Meghana Keshavan Inflammation has become one of the hottest buzzwords in medical science, pointed to as a culprit in causing or aggravating conditions ranging from allergy to autism to Alzheimer’s disease. But it’s far from clear that standard anti-inflammatory drugs, which have been around for decades, will help patients with those conditions, especially since they often come with dangerous side effects. So in labs across the country, scientists are trying to puzzle through the basic biology, understanding how inflammation leads to disease — and whether it’s possible to develop drugs that could interrupt that process. The latest evidence of inflammation’s broad role in disease came this past week, when a global clinical trial of 10,000 patients who had previous heart attacks showed that an anti-inflammatory drug from Novartis reduced their risk of further heart attacks or strokes. A surprise side effect: The drug also sharply cut the risk of lung cancer. That finding still needs to be confirmed with more research. But lead investigator Dr. Paul Ridker, a cardiologist at Brigham and Women’s Hospital, said he saw the trial as a clear indication of inflammation’s role in spurring cancer growth. The results, he said, turn “the way people look at oncology upside down.” Although inflammation has been studied for decades, there’s still a lot left to learn about this complex physiological condition. It’s basically an unnecessary state of hyperactivity in the body, in which the immune system’s reserve capacity is thrown into overdrive. This excess immune activation sends the wrong cellular signals to various parts of the body — and can wind up worsening conditions like diabetes, Alzheimer’s, and potentially even cancer. © 2017 Scientific American

Keyword: Alzheimers
Link ID: 24028 - Posted: 09.02.2017

Ewen Callaway Japanese researchers report promising results from an experimental therapy for Parkinson’s disease that involves implanting neurons made from ‘reprogrammed’ stem cells into the brain. A trial conducted in monkeys with a version of the disease showed that the treatment improved their symptoms and seemed to be safe, according to a report published on 30 August in Nature. The study’s key finding — that the implanted cells survived in the brain for at least two years without causing any dangerous effects in the body — provides a major boost to researchers’ hopes of testing stem-cell treatments for Parkinson’s in humans, say scientists. Jun Takahashi, a stem-cell scientist at Kyoto University in Japan who led the study, says that his team plans to begin transplanting neurons made from induced pluripotent stem (iPS) cells into people with Parkinson’s in clinical trials soon. The research is also likely to inform several other groups worldwide that are testing different approaches to treating Parkinson’s using stem cells, with trials also slated to begin soon. Parkinson’s is a neurodegenerative condition caused by the death of cells called dopaminergic neurons, which make a neurotransmitter called dopamine in certain areas of the brain. Because dopamine-producing brain cells are involved in movement, people with the condition experience characteristic tremors and stiff muscles. Current treatments address symptoms of the disease but not the underlying cause. © 2017 Macmillan Publishers Limited.

Keyword: Parkinsons; Stem Cells
Link ID: 24019 - Posted: 08.31.2017

By The Scientist Staff Researchers demonstrated that the mouse subiculum, a brain region associated with the hippocampus, is important for recalling certain types of memories, but it doesn’t appear to play a role in forming them. When they optogenetically turned off neurons within the subiculum, mice’s abilities to retrieve a memory they had previously formed was disrupted. Some scientists think that brain circuits responsible for forming memories are the same as those necessary for retrieving them, write the authors in their report. These data, however, offer evidence to the contrary. See D.S. Roy et al., “Distinct neural circuits for the formation and retrieval of episodic memories,” Cell, doi:10.1016/j.cell.2017.07.013, 2017. © 1986-2017 The Scientist

Keyword: Learning & Memory; Brain imaging
Link ID: 24014 - Posted: 08.31.2017

By DAVID DeSTENO, CYNTHIA BREAZEAL and PAUL HARRIS Why is educational technology such a disappointment? In recent years, parents and schools have been exposing children to a range of computer-mediated instruction, and adults have been turning to “brain training” apps to sharpen their minds, but the results have not been encouraging. A six-year research project commissioned by the Department of Education examined different cybertechnology programs across thousands of students in hundreds of schools and found little to no evidence that they improved academic performance. Unfortunately, it appears the same goes for cognitive-training programs. Lumos Labs, the company behind Lumosity, one of the leading programs in this area, agreed to pay $2 million to settle charges by the Federal Trade Commission that it misled customers with claims that Lumosity improved people’s performance in school and at work. In our view, the problem stems partly from the fact that the designers of these technologies rely on an erroneous set of assumptions about how the mind learns. Yes, the human brain is an amazing information processor, but it evolved to take in, analyze and store information in a specific way: through social interaction. For millenniums, the environs in which we learned best were social ones. It was through other people’s testimony or through interactive discourse and exploration with them that we learned facts about our world and new ways of solving problems. And it’s precisely because of this history that we can expect the mind to be socially tuned, meaning that it should rely on and incorporate social cues to facilitate learning. © 2017 The New York Times Company

Keyword: Learning & Memory
Link ID: 24007 - Posted: 08.29.2017

By Sameer Deshpande, Raiden Hasegawa, Christina Master, Amanda Rabinowitz, Dylan Small American football is the largest participation sport in U.S. high schools. Recently, many have expressed concern about the sport’s safety with some even calling for banning youth and high school tackle football. We recently published a study in JAMA Neurology suggesting that, in general, men who played high school football in 1950s Wisconsin did not have a higher risk of poor cognitive or emotional health later in life than those who did not play. Recent concerns about football’s safety have been driven largely by reports of chronic traumatic encephalopathy (CTE) among retired professional players. CTE is a neurodegenerative disease thought to result from repetitive head trauma with symptoms including memory loss, aggression, confusion and depression. A recent study in JAMA reported evidence of CTE in 110 of 111 deceased retired NFL players who donated their brains for posthumous examination. This important study adds to a larger body of work linking repetitive sports-related concussion with neurodegenerative disease. However, such research, which depends on brains donated by families of players many of whom were symptomatic before death, is not designed to establish the base rate of neurodegeneration among the larger population of football players. A critical question remains: what is the risk of later-life cognitive and emotional dysfunction for American high school football players? © 2017 Scientific American

Keyword: Brain Injury/Concussion; Development of the Brain
Link ID: 23994 - Posted: 08.25.2017

By NICHOLAS BAKALAR Studies have shown that obese women give birth to larger babies who are at risk for obesity and other metabolic problems later in life. Some have thought that the reason may be that obese mothers, whose bodies are rich in nutrients, somehow “overfeed” the fetus during gestation. A new study has found that this is unlikely. The study, in PLOS Medicine, looked at more than 10,000 mother-child pairs, following their offspring into early adulthood. Researchers had data on body mass index, education, occupation and smoking behavior for both mothers and fathers. They also did tests for 153 metabolic traits in the children, including levels of fats in the blood. They found that both maternal and paternal B.M.I. were associated strongly with the metabolic traits of their children. Since paternal B.M.I. cannot affect the fetus during its development, this suggests that familial traits, rather than any “programming” of the fetus in the womb, are the explanation for metabolic abnormalities in the children of obese mothers. The senior author, Deborah A. Lawlor, a professor of epidemiology at the University of Bristol in England, said obesity in pregnancy is dangerous for many reasons. But the evidence that the mother’s weight alone determines her children’s future metabolic health is weak, and putting all the burden on the pregnant woman is not helpful. “The whole family should have a healthy weight,” she said. © 2017 The New York Times Company

Keyword: Obesity; Development of the Brain
Link ID: 23991 - Posted: 08.25.2017

By James Gallagher People with higher levels of lithium in their drinking water appear to have a lower risk of developing dementia, say researchers in Denmark. Lithium is naturally found in tap water, although the amount varies. The findings, based on a study of 800,000 people, are not clear-cut. The highest levels cut risk, but moderate levels were worse than low ones. Experts said it was an intriguing and encouraging study that hinted at a way of preventing the disease. The study, at the University of Copenhagen, looked at the medical records of 73,731 Danish people with dementia and 733,653 without the disease. Tap water was then tested in 151 areas of the country. The results, published in JAMA Psychiatry, showed moderate lithium levels (between 5.1 and 10 micrograms per litre) increased the risk of dementia by 22% compared with low levels (below five micrograms per litre). However, those drinking water with the highest lithium levels (above 15 micrograms per litre) had a 17% reduction in risk. The researchers said: "This is the first study, to our knowledge, to investigate the association between lithium in drinking water and the incidence of dementia. "Higher long-term lithium exposure from drinking water may be associated with a lower incidence of dementia." Lithium is known to have an effect on the brain and is used as a treatment in bipolar disorder. However, the lithium in tap water is at much lower levels than is used medicinally. Experiments have shown the element alters a wide range of biological processes in the brain. © 2017 BBC.

Keyword: Alzheimers
Link ID: 23990 - Posted: 08.24.2017

By Alexander P. Burgoyne, David Z. Hambrick More than 60 years ago, Francis Crick and James Watson discovered the double-helical structure of deoxyribonucleic acid—better known as DNA. Today, for the cost of a Netflix subscription, you can have your DNA sequenced to learn about your ancestry and proclivities. Yet, while it is an irrefutable fact that the transmission of DNA from parents to offspring is the biological basis for heredity, we still know relatively little about the specific genes that make us who we are. That is changing rapidly through genome-wide association studies—GWAS, for short. These studies search for differences in people’s genetic makeup—their “genotypes”—that correlate with differences in their observable traits—their “phenotypes.” In a GWAS recently published in Nature Genetics, a team of scientists from around the world analyzed the DNA sequences of 78,308 people for correlations with general intelligence, as measured by IQ tests. The major goal of the study was to identify single nucleotide polymorphisms—or SNPs—that correlate significantly with intelligence test scores. Found in most cells throughout the body, DNA is made up of four molecules called nucleotides, referred to by their organic bases: cytosine (C), thymine (T), adenine (A), and guanine (G). Within a cell, DNA is organized into structures called chromosomes. Humans normally have 23 pairs of chromosomes, with one in each pair inherited from each parent. © 2017 Scientific American

Keyword: Intelligence; Genes & Behavior
Link ID: 23986 - Posted: 08.23.2017

By Madhumita Murgia CINCINNATI — Just before Christmas 2015, child psychiatrist Daniel Nelson noticed an unusual number of suicidal kids in the hospital emergency room. A 14-year-old girl with a parent addicted to opioids tried to choke herself with a seat belt. A 12-year-old transgender child hurt himself after being bullied. And a steady stream of kids arrived from the city’s west side, telling him they knew other kids — at school, in their neighborhoods — who had also tried to die. “I think there’s an increase in suicidal kids in Cincinnati,” Nelson told a colleague. “We need to start mapping this out.” So Nelson and his colleagues collected the addresses of 300 children admitted to Cincinnati Children’s Hospital with suicidal behavior over three months in early 2016, looking for patterns. Almost instantly, a disturbing one emerged: Price Hill, a poor community with a high rate of opioid overdoses, was home to a startling number of suicidal kids. “This is who is dying from opiates — people in their 20s and 30s. Think about what that population is,” Nelson said. “It’s parents.” Now Nelson is working with county coroners across the nation to try to corroborate his theory, that trauma from the nation’s opioid epidemic could help explain an extraordinary increase in suicide among American children. Since 2007, the rate of suicide has doubled among children 10 to 14, according to the Centers for Disease Control and Prevention. Suicide is the second-leading cause of death between the ages of 10 and 24. The suicide rate among older teenage girls hit a 40-year high in 2015, according to newly released data from the National Center for Health Statistics. © 1996-2017 The Washington Post

Keyword: Depression; Drug Abuse
Link ID: 23985 - Posted: 08.23.2017

By Steven Lubet There is a memorable episode in the now-classic sitcom Scrubs in which the conniving Dr. Kelso unveils a plan to peddle useless “full body scans” as a new revenue stream for the perpetually cash-strapped Sacred Heart Hospital. The irascible but ultimately patient-protecting Dr. Cox objects loudly. “I think showing perfectly healthy people every harmless imperfection in their body just to scare them into taking invasive and often pointless tests is an unholy sin,” he says. Undeterred, Kelso launches an advertising campaign that promotes the scans in a tear-jerking television commercial and a billboard screaming “YOU may already be DYING.” Alarmist medical advertising is pretty funny on television, but it can be far more troubling in real life. Although I’ve never been alerted to impending death, I recently received an advertisement from my own trusted health care provider warning that I may have Alzheimer’s disease, although I have no known symptoms and no complaints. As long-time patients at NorthShore University Health System, which is affiliated with the University of Chicago, my wife and I received two solicitations from its Center for Brain Health touting the development of “ways to slow brain aging and even prevent the onset of Alzheimer’s.” According to the ads, which arrived in both postcard and email form, there is “new hope for delaying — even preventing — aging brain diseases” through “genetic testing, advanced diagnostics, and lifestyle factors.” Copyright 2017 Undark

Keyword: Alzheimers
Link ID: 23978 - Posted: 08.19.2017

Nicola Davis The eternal sunshine of a spotless mind has come one step closer, say researchers working on methods to erase memories of fear. The latest study, carried out in mice, unpicks why certain sounds can stir alarming memories, and reveals a new approach to wiping such memories from the brain. The researchers say the findings could be used to either weaken or strengthen particular memories while leaving others unchanged. That, they say, could potentially be used to help those with cognitive decline or post-traumatic stress disorder by removing fearful memories while retaining useful ones, such as the sound of a dog’s bark. “We can use same approach to selectively manipulate only the pathological fear memory while preserving all other adaptive fear memories which are necessary for our daily lives,” said Jun-Hyeong Cho, co-author of the research from the University of California, Riverside. The research is the latest in a string of studies looking at ways to erase unpleasant memories, with previous work by scientists exploring techniques ranging from brain scans and AI to the use of drugs. Published in the journal Neuron by Cho and his colleague Woong Bin Kim, the research reveals how the team used genetically modified mice to examine the pathways between the area of the brain involved in processing a particular sound and the area involved in emotional memories, known as the amygdala. “These mice are special in that we can label or tag specific pathways that convey certain signals to the amygdala, so that we can identify which pathways are really modified as the mice learn to fear a particular sound,” said Cho. “It is like a bundle of phone lines,” he added. “Each phone line conveys certain auditory information to the amygdala.” © 2017 Guardian News and Media Limited

Keyword: Emotions; Learning & Memory
Link ID: 23974 - Posted: 08.18.2017

By Ingfei Chen, Spectrum In October 2010, Lisa and Eugene Jeffers learned that their daughter Jade, then nearly 2 and a half years old, has autism. The diagnosis felt like a double whammy. The parents were soon engulfed by stress from juggling Jade’s new therapy appointments and wrangling with their health insurance provider, but they now had an infant son to worry about, too. Autism runs in families. Would Bradley follow in his big sister’s footsteps? "We were on high alert,” Lisa Jeffers says. “There were times I would call his name, and he wouldn't look.” She says she couldn’t help but think: Is it because he's busy playing or because he has autism? In search of guidance, the parents signed Bradley up for a three-year study at the University of California, Davis (UC Davis) MIND Institute, a half-hour drive from their home near Sacramento. Researchers there wanted answers to some of the same questions the couple had: What are the odds that infants like Bradley—younger brothers or sisters of a child with autism—will be on the spectrum too? Could experts detect autism in these babies early on, so that they might benefit from early intervention? The infant-sibling study at UC Davis is one of more than 20 similar long-running investigations across the United States, Canada and United Kingdom, the first of which began around 2000. These ‘baby sib’ studies, which collectively have followed thousands of children, are among the most ambitious and expensive projects in autism research. Many of the scientists who run them anticipated that by tracking this special population, they would be able to spot signs of autism before age 1, and ultimately create an infant screen for the condition. © 2017 Scientific American

Keyword: Autism; Development of the Brain
Link ID: 23973 - Posted: 08.18.2017

By WILLIAM GRIMES Marian C. Diamond, a neuroscientist who overturned long-held beliefs by showing that environmental factors can change the structure of the brain and that the brain continues to develop throughout one’s life, died on July 25 at her home in Oakland, Calif. She was 90. Her son Richard Diamond confirmed the death. Dr. Diamond’s most celebrated study was of the preserved brain of Albert Einstein, in the 1980s, but it was her work two decades earlier, at the University of California, Berkeley, that had the most lasting impact. Dr. Diamond was an instructor at Cornell University in the late 1950s when she read a paper in Science magazine showing that rats who navigated mazes quickly had a different brain chemistry than slower rats. They showed much higher levels of acetylcholinesterase, an enzyme that accelerates the transmission of neural signals. “What a thrill I had when my mind jumped immediately to the question, ‘I wonder if the anatomy of these brains would also show a difference in learning ability?’ ” Dr. Diamond wrote in an autobiographical essay for the Society for Neuroscience. She was able to test her theory after joining a team at Berkeley led by Mark R. Rosenzweig, one of the authors of the Science paper. To gauge the effects of environment on performance, Dr. Rosenzweig and his colleagues had begun raising rats in so-called enriched cages, outfitted with ladders and wheels, in the company of other rats. The rats in a control group were raised alone in bare cages. © 2017 The New York Times Company

Keyword: Learning & Memory
Link ID: 23966 - Posted: 08.17.2017

Alice H. Eagly It’s no secret that Silicon Valley employs many more men than women in tech jobs. What’s much harder to agree on is why. The recent anti-diversity memo by a now former Google engineer has pushed this topic into the spotlight. The writer argued there are ways to explain the gender gap in tech that don’t rely on bias and discrimination – specifically, biological sex differences. Setting aside how this assertion would affect questions about how to move toward greater equity in tech fields, how well does his wrap-up represent what researchers know about the science of sex and gender? As a social scientist who’s been conducting psychological research about sex and gender for almost 50 years, I agree that biological differences between the sexes likely are part of the reason we see fewer women than men in the ranks of Silicon Valley’s tech workers. But the road between biology and employment is long and bumpy, and any causal connection does not rule out the relevance of nonbiological causes. Here’s what the research actually says. There is no direct causal evidence that biology causes the lack of women in tech jobs. But many, if not most, psychologists do give credence to the general idea that prenatal and early postnatal exposure to hormones such as testosterone and other androgens affect human psychology. In humans, testosterone is ordinarily elevated in males from about weeks eight to 24 of gestation and also during early postnatal development. © 2010–2017, The Conversation US, Inc.

Keyword: Sexual Behavior; Development of the Brain
Link ID: 23963 - Posted: 08.16.2017