Most Recent Links
By TRIP GABRIEL Do you remember June 27, 2015? If you knew you had been on a sailboat, and that the weather was miserable, and that afterward you had a beer with the other sailors, would you expect to recall — even one year later — at least a few details? I was on that boat, on a blustery Saturday on Long Island Sound. But every detail is missing from my memory, as if snipped out by an overzealous movie editor. The earliest moment I recall from the day is lying in an industrial tube with a kind of upturned colander over my face, fighting waves of claustrophobia. My mind was densely fogged, but I understood that I was in an M.R.I. machine. Someone was scanning my brain. Other hazy scenes followed: being wheeled into a hospital room. My wife, Alice, hovering in the background. A wall clock that read minutes to midnight, an astonishing piece of information. What had happened to the day? Late that night, alone in the room, I noticed two yellow Post-its on the bedside table in Alice’s writing: “You have a condition called transient global amnesia. It will last Hours not DAYS. You’re going to be fine. Your CT scan was clear. You sailed today and drove yourself home,” the note read in part. I had never heard of transient global amnesia, a rare condition in which you are suddenly unable to recall recent events. Its causes are unknown. Unlike other triggers of memory loss, like a stroke or epileptic seizures, the condition is considered harmless, and an episode does not last long. “We don’t understand why it happens,” a neurologist would later tell me. “There are a million theories.” © 2016 The New York Times Company
Keyword: Learning & Memory
Link ID: 22456 - Posted: 07.19.2016
By Alice Klein Blame grandpa. A study in mice shows that the grandsons of obese males are more susceptible to the detrimental health effects of junk food, even if their fathers are lean and healthy. The finding adds to evidence that new traits can be passed down the family line without being permanently recorded in a family’s genes – a phenomenon called transgenerational epigenetics. Last year, a study found that the DNA in the sperm of obese men is modified in thousands of places, and that these sperm also contain short pieces of RNA. These are epigenetic modifications – they don’t affect the precise code of genes, but instead may affect how active particular genes are. Now Catherine Suter at Victor Chang Cardiac Research Institute in Sydney and her team have investigated the longer-term effects of paternal obesity. To do this, they mated obese male mice with lean female mice. They found that, compared with the offspring of lean males, both the sons and grandsons of the obese males were more likely to show the early signs of fatty liver disease and diabetes when given a junk food diet. The same effect wasn’t seen in daughters or granddaughters. Even when the sons of the obese males were fed a healthy diet and kept at a normal weight, their sons still had a greater tendency to develop obesity-related conditions when exposed to a junk diet. © Copyright Reed Business Information Ltd.
Tina Hesman Saey ORLANDO, Fla. — Weight gain may depend on how an individual’s genes react to certain diets, a new study in mice suggests. Four strains of mice fared differently on four different diets, William Barrington of North Carolina State University in Raleigh reported July 15 at the Allied Genetics Conference. One strain, the A/J mouse, was nearly impervious to dietary changes. Those mice didn’t gain much weight or have changes in insulin or cholesterol no matter what they ate: a fat-and-carbohydrate-laden Western diet, traditional Mediterranean or Japanese diet (usually considered healthy) or very low-carbohydrate, fat-rich fare known as the ketogenic diet. In contrast, NOD/ShiLtJ mice gained weight on all but the Japanese diet. Those mice’s blood sugar shot up — a hallmark of diabetes — on a Mediterranean diet, but decreased on the Japanese diet. FVB/NJ mice didn’t get fat on the Western diet, but became obese and developed high cholesterol and other health problems on the ketogenic diet. The opposite was true for C57BL/6J mice. They became obese and developed cholesterol and other problems linked to heart disease and diabetes in people on the Western diet, but not on the ketogenic diet. They also fattened up on the Mediterranean diet. © Society for Science & the Public 2000 - 2016.
Link ID: 22454 - Posted: 07.19.2016
By Maggie Koerth-Baker Q: I want to hear what the loudest thing in the world is! — Kara Jo, age 5 No. No, you really don’t. See, there’s this thing about sound that even we grown-ups tend to forget — it’s not some glitter rainbow floating around with no connection to the physical world. Sound is mechanical. A sound is a shove — just a little one, a tap on the tightly stretched membrane of your eardrum. The louder the sound, the heavier the knock. If a sound is loud enough, it can rip a hole in your eardrum. If a sound is loud enough, it can plow into you like a linebacker and knock you flat on your butt. When the shock wave from a bomb levels a house, that’s sound tearing apart bricks and splintering glass. Sound can kill you. Consider this piece of history: On the morning of Aug. 27, 1883, ranchers on a sheep camp outside Alice Springs, Australia, heard a sound like two shots from a rifle. At that very moment, the Indonesian volcanic island of Krakatoa was blowing itself to bits 2,233 miles away. Scientists think this is probably the loudest sound humans have ever accurately measured. Not only are there records of people hearing the sound of Krakatoa thousands of miles away, there is also physical evidence that the sound of the volcano’s explosion traveled all the way around the globe multiple times. Now, nobody heard Krakatoa in England or Toronto. There wasn’t a “boom” audible in St. Petersburg. Instead, what those places recorded were spikes in atmospheric pressure — the very air tensing up and then releasing with a sigh, as the waves of sound from Krakatoa passed through. There are two important lessons about sound in there: one, you don’t have to be able to see the loudest thing in the world in order to hear it; and two, just because you can’t hear a sound doesn’t mean it isn’t there. Sound is powerful and pervasive and it surrounds us all the time, whether we’re aware of it or not.
Link ID: 22453 - Posted: 07.19.2016
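A back-of-envelope calculation supports the figures in the piece. Assuming sound travels at roughly 767 mph (its approximate sea-level speed; the true speed of an atmospheric pressure wave varies with altitude and temperature, so these numbers are only rough), the travel times work out as follows:

```python
# Back-of-envelope check on the Krakatoa figures in the article.
# The 767 mph speed of sound (~343 m/s at 20 °C) is an approximation.

SPEED_OF_SOUND_MPH = 767
DISTANCE_TO_ALICE_SPRINGS_MILES = 2233   # figure from the article
EARTH_CIRCUMFERENCE_MILES = 24901        # equatorial circumference

hours_to_alice_springs = DISTANCE_TO_ALICE_SPRINGS_MILES / SPEED_OF_SOUND_MPH
hours_per_global_lap = EARTH_CIRCUMFERENCE_MILES / SPEED_OF_SOUND_MPH

print(f"Sound reached Alice Springs about {hours_to_alice_springs:.1f} hours after the eruption")
print(f"Each trip around the globe took about {hours_per_global_lap:.0f} hours")
```

At roughly 32 hours per lap, barometers registering the pressure spike several times over the following days is consistent with the wave circling the globe repeatedly.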
By Diana Kwon Few things feel worse than not knowing when your next paycheck is coming. Economic insecurity has been shown to have a whole host of negative effects, including low self-esteem and impaired cognitive functioning. It turns out financial stress can also physically hurt, according to a paper published in February in Psychological Science. Eileen Chou, a public policy professor at the University of Virginia, and her collaborators began by analyzing a data set of 33,720 U.S. households and found that those with higher levels of unemployment were more likely to purchase over-the-counter painkillers. Then, using a series of experiments, the team discovered that simply thinking about the prospect of financial insecurity was enough to increase pain. For example, people reported feeling almost double the amount of physical pain in their body after recalling a financially unstable time in their life as compared with those who thought about a secure period. In another experiment, university students who were primed to feel anxious about future employment prospects removed their hand from an ice bucket more quickly (showing less pain tolerance) than those who were not. The researchers also found that economic insecurity reduced people's sense of control, which, in turn, increased feelings of pain. Chou and her colleagues suggest that because of this link between financial insecurity and decreased pain tolerance, the recent recession may have been a factor in fueling the prescription painkiller epidemic. Other experts are cautious about taking the findings that far. “I think the hypothesis [that financial stress causes pain] has a lot of merit, but it would be helpful to see additional rigorous evidence in a real-world environment,” says Heather Schofield, an economist at the University of Pennsylvania who was not involved in the study. © 2016 Scientific American,
By William Kenower My youngest son, Sawyer, used to spend far more time relating to his imagination than he did to the world around him. He would run back and forth humming, flapping his hands and thumping on his chest. By the time he was in first grade, attempts to draw him out of his pretend world to join his classmates or do some class work led to explosions and timeouts. At 7 he was given a diagnosis of being on the autism spectrum. That was when my wife, Jen, learned about the practice called joining. The idea behind it, which she discovered in Barry Neil Kaufman’s book “Son-Rise,” is brilliant in its simplicity. We wanted Sawyer to be with us. We did not want him to live in this bubble of his own creation. And so, instead of telling him to stop pretending and join us, we started pretending and joined him. The first time Jen joined him, the first time she ran beside him humming and thumping her chest, he stopped running, stopped thumping, stopped humming and, without a single word from us, turned to her and said, “What are you doing?” We took turns joining him every day, and a week later we got an email from his special education teacher telling us to keep doing whatever we were doing. He’d gone from five timeouts a day to one in a week. The classroom was the same, the work was the same – all that was different was that we had found a way to say to him in a language he could understand, “You’re not wrong.” Emboldened by our success, we set about becoming more fluent in this language. For the next couple of years we taught ourselves to join him constantly. This meant that whatever we were doing had to stop whenever we heard him running back and forth and humming. But we could not join him simply to get him to stop running and thumping and humming. We had to join him without any judgment or impatience. That was the trickiest part. The desire to fix him was great. I had come to believe that there were broken people in need of fixing. 
Sometimes, I looked like one of those people. I was a 40-year-old unpublished writer working as a waiter. My life reeked of failure. Many days I looked in the mirror and asked, “What is wrong with me?” © 2016 The New York Times Company
Link ID: 22451 - Posted: 07.16.2016
By Sara Chodosh There has long been debate about a link between serious blows to the head and the development of neurodegenerative diseases later in life. Research has made cases for and against a relationship between traumatic brain injuries and neurological ailments such as Alzheimer’s, Parkinson’s and general dementia. Now the question is drawing ever more scrutiny as the alarming extent of these injuries becomes better known—and new research is finally casting some light on this murky and often quietly terrifying topic. A large-scale analysis of three separate studies published this week in JAMA Neurology found no association between unconsciousness-causing traumatic brain injuries (TBI) and Alzheimer’s disease or general dementia—but it did find a strong association between TBI and Parkinson’s disease. “I can’t decide if the positive or negative findings are more surprising,” says one of the study’s investigators, physician and Alzheimer’s researcher Paul Crane at the University of Washington. The positive association his team found between Parkinson’s and TBI was not entirely novel, but Crane says the magnitude of the link was unexpected. The researchers found the risk of Parkinson’s rose threefold for people whose head injuries had caused them to go unconscious for more than an hour. The more contentious finding is the lack of an association between TBI and Alzheimer’s. Prior research has been divided on whether there is a link, but many of the previous studies have been smaller in scale and conducted less-comprehensive analyses. “Although early studies suggested a clear link between TBI and an increased risk for Alzheimer’s disease, this has not been replicated,” explains Frances Corrigan at the University of Adelaide, who studies how TBI influences neurodegeneration. © 2016 Scientific American,
Paula Span An estimated one zillion older people have a problem like mine. First: We notice age-related hearing loss. A much-anticipated report on hearing health from the National Academies of Sciences, Engineering and Medicine last month put the prevalence at more than 45 percent of those aged 70 to 74, and more than 80 percent among those over 85. Then: We do little or nothing about it. Fewer than 20 percent of those with hearing loss use hearing aids. I’ve written before about the reasons. High prices ($2,500 and up for a decent hearing aid, and most people need two). Lack of Medicare reimbursement, because the original 1965 law creating Medicare prohibits coverage. Time and hassle. Stigma. Both the National Academies and the influential President’s Council of Advisors on Science and Technology have proposed pragmatic steps to make hearing technology more accessible and affordable. But until there’s progress on those, many of us with mild to moderate hearing loss may consider a relatively inexpensive alternative: personal sound amplification products, or P.S.A.P.s. They offer some promise — and some perils, too. Unlike for a hearing aid, you don’t need an audiologist to obtain a P.S.A.P. You see these gizmos advertised on the back pages of magazines or on sale at drugstore chains. You can buy them online. © 2016 The New York Times Company
Link ID: 22449 - Posted: 07.16.2016
By SARAH MASLIN NIR Almost as soon as the young man crouching on a trash-strewn street in Brooklyn pulled out a crumpled dollar bill from his pocket and emptied its contents of dried leaves into a wrapper, he had company. A half-dozen disheveled men and women walked swiftly to where the young man was rolling a cigarette of a synthetic drug known as K2 to wait for a chance to share. The drug has been the source of an alarming and sudden surge in overdoses — over three days this week, 130 people across New York City were treated in hospital emergency rooms after overdosing on K2, almost equaling the total for the entire month of June, according to the city’s health department. About one-fourth of the overdoses, 33, took place on Tuesday along the border of Bedford-Stuyvesant and Bushwick, the same Brooklyn neighborhoods where, despite a heightened presence of police officers, people were again openly smoking the drug on Thursday. In response to the overdoses, the city is sending a health alert to emergency rooms and other health care providers warning about the drug. The outbreak comes after officials this spring lauded what they described as a successful campaign to severely curb the prevalence of K2. On Thursday, Gov. Andrew M. Cuomo announced that the State Police would step up enforcement against the drug and aggressively go after merchants who illegally sell it. The same day, just steps from where people were using the drug, clusters of police officers patrolled beneath the elevated subway tracks along a stretch where, the day before, five bodegas had been raided. K2 is typically sold by convenience stores, though the raids did not turn up any. © 2016 The New York Times Company
Keyword: Drug Abuse
Link ID: 22448 - Posted: 07.16.2016
James M. Broadway “Where did the time go?” middle-aged and older adults often remark. Many of us feel that time passes more quickly as we age, a perception that can lead to regrets. According to psychologist and BBC columnist Claudia Hammond, “the sensation that time speeds up as you get older is one of the biggest mysteries of the experience of time.” Fortunately, our attempts to unravel this mystery have yielded some intriguing findings. In 2005, for instance, psychologists Marc Wittmann and Sandra Lenhoff, both then at Ludwig Maximilian University of Munich, surveyed 499 participants, ranging in age from 14 to 94 years, about the pace at which they felt time moving—from “very slowly” to “very fast.” For shorter durations—a week, a month, even a year—the subjects' perception of time did not appear to increase with age. Most participants felt that the clock ticked by quickly. But for longer durations, such as a decade, a pattern emerged: older people tended to perceive time as moving faster. When asked to reflect on their lives, the participants older than 40 felt that time elapsed slowly in their childhood but then accelerated steadily through their teenage years into early adulthood. There are good reasons why older people may feel that way. When it comes to how we perceive time, humans can estimate the length of an event from two very different perspectives: a prospective vantage, while an event is still occurring, or a retrospective one, after it has ended. In addition, our experience of time varies with whatever we are doing and how we feel about it. In fact, time does fly when we are having fun. Engaging in a novel exploit makes time appear to pass more quickly in the moment. But if we remember that activity later on, it will seem to have lasted longer than more mundane experiences. © 2016 Scientific American,
By JOANNA KLEIN Jet lag may be the worst part of traveling. And it hits many people harder traveling east than west. Why they feel this way is unclear. But scientists recently developed a model that mimics special time-keeping cells in the body and offers a mathematical explanation for why traveling from west to east feels so much worse. It also offers insights on recovering from jet lag. Deep inside the brain, in a region called the hypothalamus (right above where our optic nerves cross) the internal clock is ticking. And approximately every 24 hours, 20,000 special pacemaker cells that inhabit this area, known as the suprachiasmatic nucleus, synchronize, signaling to the rest of the body whether it’s night or day. These cells know which signal to send because they receive light input from our environments — bright says wake, dark says sleep. But when you travel across multiple time zones, like flying from New York to Moscow, those little pacemaker cells that thought they knew the routine scramble around confused before they can put on their show. The whole body feels groggy because it’s looking for the time and can’t find it. The result: jet lag. Most of our internal clocks are a little bit slow, and in the absence of consistent light cues — like when you travel across time zones — the pacemaker cells in your body want to have a longer day, said Michelle Girvan, a physicist at the University of Maryland who worked on the model published in the journal Chaos on Tuesday. “This is all because the body’s internal clock has a natural period of slightly longer than 24 hours, which means that it has an easier time traveling west and lengthening the day than traveling east and shortening the day,” Dr. Girvan said. © 2016 The New York Times Company
Keyword: Biological Rhythms
Link ID: 22446 - Posted: 07.16.2016
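The asymmetry Dr. Girvan describes can be sketched with a toy model (this is not the model from the Chaos paper; the 24.6-hour period, one-hour daily entrainment limit, and quarter-hour recovery threshold below are invented for illustration). Because the clock's intrinsic period exceeds 24 hours, its daily drift helps a westward (delaying) shift and fights an eastward (advancing) one:

```python
def days_to_recover(shift_hours, tau=24.6, max_entrain=1.0, tol=0.25):
    """Toy circadian model. `shift_hours` is how far the body clock must
    advance to match local time: +6 after flying six zones east, -6 after
    flying six zones west. Each day the clock drifts (tau - 24) hours in
    the delay direction, then light exposure corrects up to `max_entrain`
    hours toward local time. Returns days until the residual mismatch
    falls below `tol`."""
    drift = tau - 24.0                 # intrinsic tendency to run long
    mismatch = float(shift_hours)
    days = 0
    while abs(mismatch) > tol:
        mismatch += drift              # clock delays a little each day
        correction = max(-max_entrain, min(max_entrain, mismatch))
        mismatch -= correction         # light pulls clock toward local time
        days += 1
        if days > 365:                 # safety valve for the sketch
            break
    return days

# Delaying (westward) recovery is aided by the drift; advancing fights it.
print(days_to_recover(-6), "days to recover after flying six zones west")
print(days_to_recover(+6), "days to recover after flying six zones east")
```

Under these made-up parameters the westward traveler recovers in a few days while the eastward traveler needs weeks; real recovery times differ, but the direction of the asymmetry is the point.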
Helen Haste The American psychologist and educationist Jerome Bruner, who has died aged 100, repeatedly challenged orthodoxies and generated novel directions. His elegant, accessible writing reached wide audiences. His colleague Rom Harré described his lectures as inspiring: “He darted all over the place, one topic suggested another and so on through a thrilling zigzag.” To the charge that he was always asking impossible questions, Jerry replied: “They are pretty much impossible, but the search for the impossible is part of what intelligence is about.” He was willing to engage with controversy, both on academic issues and in education politics. Blind at birth because of cataracts, Jerry gained his sight after surgery at the age of two. He credited this for his sense that we actively interpret and organise our world rather than passively react to it – a theme that he continued to develop in different ways. His first work lay in perception, when he resumed research at Harvard after the second world war. He found that children’s judgments of the size of coins and coin-like disks varied: poorer children overestimated the size of the coins. This contributed to the emerging “new look” movement in psychology, involving values, intentions and interpretation in contrast to the then dominant behaviourist focus on passive learning, reward and punishment. His professorship at Harvard came in 1952, and by the middle of the decade a computer metaphor began to influence psychology – the “cognitive revolution”. With Jacqueline Goodnow and George Austin, Jerry published A Study of Thinking (1956). © 2016 Guardian News and Media Limited
Nobody knows how the brain works. But researchers are trying to find out. One of the most eye-catching weapons in their arsenal is functional magnetic-resonance imaging (fMRI). In this, MRI scanners normally employed for diagnosis are used to study volunteers for the purposes of research. By watching people’s brains as they carry out certain tasks, neuroscientists hope to get some idea of which bits of the brain specialise in doing what. The results look impressive. Thousands of papers have been published, from workmanlike investigations of the role of certain brain regions in, say, recalling directions or reading the emotions of others, to spectacular treatises extolling the use of fMRI to detect lies, to work out what people are dreaming about or even to deduce whether someone truly believes in God. But the technology has its critics. Many worry that dramatic conclusions are being drawn from small samples (the faff involved in fMRI makes large studies hard). Others fret about over-interpreting the tiny changes the technique picks up. A deliberately provocative paper published in 2009, for example, found apparent activity in the brain of a dead salmon. Now, researchers in Sweden have added to the doubts. As they reported in the Proceedings of the National Academy of Sciences, a team led by Anders Eklund at Linköping University has found that the computer programs used by fMRI researchers to interpret what is going on in their volunteers’ brains appear to be seriously flawed. © The Economist Newspaper Limited 2016
Keyword: Brain imaging
Link ID: 22444 - Posted: 07.15.2016
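The flaw Eklund's team reported concerned parametric cluster-wise significance tests whose distributional assumptions fail on real fMRI data; in their evaluation, nonparametric permutation tests, which build the null distribution from the data itself rather than assuming its shape, kept false positives near the nominal rate. A minimal sketch of the permutation idea for a simple two-group comparison (purely illustrative; real fMRI inference operates on cluster statistics across thousands of voxels):

```python
import random

def permutation_pvalue(group_a, group_b, n_perm=2000, seed=0):
    """Two-sided permutation test on the difference of group means.
    Group labels are shuffled to build the null distribution empirically,
    instead of assuming a parametric form for it."""
    rng = random.Random(seed)
    observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a)
        if abs(diff) >= abs(observed):
            extreme += 1
    # The +1 correction keeps the p-value strictly positive.
    return (extreme + 1) / (n_perm + 1)

# A clear group difference yields a small p-value (hypothetical data):
patients = [5.0 + 0.1 * i for i in range(8)]
controls = [0.1 * i for i in range(8)]
print(permutation_pvalue(patients, controls))
```

Because the reference distribution comes from the data, the test stays calibrated even when the data violate the normality and smoothness assumptions that trip up parametric cluster inference.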
By Andy Coghlan There once was a brainy duckling. It could remember whether shapes or colours it saw just after hatching were the same as or different to each other. The feat surprised the researchers, who were initially sceptical about whether the ducklings could grasp such complex concepts as “same” and “different”. The fact that they could suggests the ability to think in an abstract way may be far more common in nature than expected, and not just restricted to humans and a handful of animals with big brains. “We were completely surprised,” says Alex Kacelnik at the University of Oxford, who conducted the experiment along with his colleague Antone Martinho III. Kacelnik and Martinho reasoned that ducklings might be able to grasp patterns relating to shape or colour as part of the array of sensory information they absorb soon after hatching. Doing so would allow them to recognise their mothers and siblings and distinguish them from all others – abilities vital for survival. In ducklings, goslings and other species that depend for survival on following their mothers, newborns learn quickly – a process called filial imprinting. Kacelnik wondered whether this would enable them to be tricked soon after hatching into “following” objects or colours instead of their natural mother, and recognising those same patterns in future. © Copyright Reed Business Information Ltd.
Rebecca Boyle Eliane Lucassen works the night shift at Leiden University Medical Center in the Netherlands, beginning her day at 6 p.m. Yet her own research has shown that this schedule might cause her health problems. “It’s funny,” the medical resident says. “Here I am, spreading around that it’s actually unhealthy. But it needs to be done.” Lucassen and Johanna Meijer, a neuroscientist at Leiden, report today in Current Biology that a constant barrage of bright light prematurely ages mice, playing havoc with their circadian clocks and causing a cascade of health problems. Mice exposed to constant light experienced bone-density loss, skeletal-muscle weakness and inflammation; restoring their health was as simple as turning the lights off. The findings are preliminary, but they suggest that people living in cities flooded with artificial light may face similar health risks. “We came to know that smoking was bad, or that sugar is bad, but light was never an issue,” says Meijer. “Light and darkness matter.” Disrupted patterns Many previous studies have hinted at a connection between artificial light exposure and health problems in animals and people. Epidemiological analyses have found that shift workers have an increased risk of breast cancer, metabolic syndrome and osteoporosis. People exposed to bright light at night are more likely to have cardiovascular disease and often don’t get enough sleep. © 2016 Macmillan Publishers Limited
Keyword: Biological Rhythms
Link ID: 22442 - Posted: 07.15.2016
Michael Egnor The most intractable question in modern neuroscience and philosophy of the mind is often phrased "What is consciousness?" The problem has been summed up nicely by philosopher David Chalmers as what he calls the Hard Problem of consciousness: How is it that we are subjects, and not just objects? Chalmers contrasts this hard question with what he calls the Easy Problem of consciousness: What are the neurobiological substrates underlying such things as wakefulness, alertness, attention, arousal, etc.? Chalmers doesn't mean, of course, that the neurobiology of arousal is easy. He merely means to show that even if we can understand arousal from a neurobiological standpoint, we haven't yet solved the hard problem: the problem of subjective experience. Why am I an I, and not an it? Chalmers's point is a good one, and I think that it has a rather straightforward solution. First, some historical background is necessary. "What is consciousness?" is a modern question. It wasn't asked before the 17th century, because no one before Descartes thought that the mind was particularly mysterious. The problem of consciousness was created by moderns. The scholastic philosophers, following Aristotle and Aquinas, understood the soul as the animating principle of the body. In a human being, the powers of the soul -- intellect, will, memory, perception, appetite, and such -- were no more mysterious than the other powers of the soul, such as respiration, circulation, etc. Of course, biology in the Middle Ages wasn't as advanced as it is today, so there was much they didn't understand about human physiology, but in principle the mind was just another aspect of human biology, not inherently mysterious. In modern parlance, the scholastics saw the mind as the Easy Problem, no more intractable than understanding how breathing or circulation work.
Link ID: 22441 - Posted: 07.15.2016
Laura Sanders If you’ve ever watched a baby purse her lips to hoot for the first time, or flash a big, gummy grin when she sees you, or surprise herself by rolling over, you’ve glimpsed the developing brain in action. A baby’s brain constructs itself into something that controls the body, learns and connects socially. Spending time with an older person, you may notice signs of slippage. An elderly man might forget why he went into the kitchen, or fail to anticipate the cyclist crossing the road, or muddle medications with awkward and unfamiliar names. These are the signs of the gentle yet unrelenting neural erosion that comes with normal aging. These two seemingly distinct processes — development and aging — may actually be linked. Hidden in the brain-building process, some scientists now suspect, are the blueprints for the brain’s demise. The way the brain is built, recent research suggests, informs how it will decline in old age. That the end can be traced to the beginning sounds absurd: A sturdily constructed brain stays strong for decades. During childhood, neural pathways make connections in a carefully choreographed order. But in old age, this sequence plays in reverse, brain scans reveal. In both appearance and behavior, old brains seem to drift backward toward earlier stages of development. What’s more, some of the same cellular tools are involved in both processes. © Society for Science & the Public 2000 - 2016
Keyword: Development of the Brain
Link ID: 22440 - Posted: 07.14.2016
Ramin Skibba Is Justin Bieber a musical genius or a talentless hack? What you 'belieb' depends on your cultural experiences. Some people like to listen to the Beatles, while others prefer Gregorian chants. When it comes to music, scientists find that nurture can trump nature. Musical preferences seem to be mainly shaped by a person’s cultural upbringing and experiences rather than biological factors, according to a study published on 13 July in Nature. “Our results show that there is a profound cultural difference” in the way people respond to consonant and dissonant sounds, says Josh McDermott, a cognitive scientist at the Massachusetts Institute of Technology in Cambridge and lead author of the paper. This suggests that other cultures hear the world differently, he adds. The study is one of the first to put an age-old argument to the test. Some scientists believe that the way people respond to music has a biological basis, because pitches that people often like have particular interval ratios. They argue that this would trump any cultural shaping of musical preferences, effectively making them a universal phenomenon. Ethnomusicologists and music composers, by contrast, think that such preferences are more a product of one’s culture. If a person’s upbringing shapes their preferences, then they are not a universal phenomenon. © 2016 Macmillan Publishers Limited
Jon Hamilton Letting mice watch Orson Welles movies may help scientists explain human consciousness. At least that's one premise of the Allen Brain Observatory, which launched Wednesday and lets anyone with an Internet connection study a mouse brain as it responds to visual information. "Think of it as a telescope, but a telescope that is looking at the brain," says Christof Koch, chief scientific officer of the Allen Institute for Brain Science, which created the observatory. The hope is that thousands of scientists and would-be scientists will look through that telescope and help solve one of the great mysteries of human consciousness, Koch says. "You look out at the world and there's a picture in your head," he says. "You see faces, you see your wife, you see something on TV." But how does the brain create those images from the chaotic stream of visual information it receives? "That's the mystery," Koch says. There's no easy way to study a person's brain as it makes sense of visual information. So the observatory has been gathering huge amounts of data on mice, which have a visual system that is very similar to the one found in people. The data come from mice that run on a wheel as still images and movies appear on a screen in front of them. For the mice, it's a lot like watching TV on a treadmill at the gym. But these mice have been genetically altered in a way that allows a computer to monitor the activity of about 18,000 neurons as they respond to different images. "We can look at those neurons and from that decode literally what goes through the mind of the mouse," Koch says. Those neurons were pretty active when the mice watched the first few minutes of Orson Welles' film noir classic Touch of Evil. The film is good for mouse experiments because "It's black and white and it has nice contrasts and it has a long shot without having many interruptions," Koch says. © 2016 npr
By Tanya Lewis In recent years, research on mammalian navigation has focused on the role of the hippocampus, a banana-shaped structure known to be integral to episodic memory and spatial information processing. The hippocampus’s primary output, a region called CA1, is known to be divided into superficial and deep layers. Now, using two-photon imaging in mice, researchers at Columbia University in New York have found these layers have distinct functions: superficial-layer neurons encode more-stable maps, whereas deep-layer brain cells better represent goal-oriented navigation, according to a study published last week (July 7) in Neuron. “There are lots of catalogued differences in sublayers of pyramidal cells” within the hippocampus, study coauthor Nathan Danielson of Columbia told The Scientist. “The question is, are the principal cells in each subregion doing the same thing? Or is there a finer level of granularity?” For the past few decades, scientists have been chipping away at an explanation of the brain’s “inner GPS.” The 2014 Nobel Prize in Physiology or Medicine honored the discovery of so-called place cells and grid cells in the hippocampus, which keep track of an individual’s location and coordinates in space, respectively. Since then, studies have revealed that neurons in different hippocampal regions have distinct genetic, anatomical, and physiological properties, said Attila Losonczy of Columbia, Danielson’s graduate advisor and a coauthor on the study. © 1986-2016 The Scientist
Keyword: Learning & Memory
Link ID: 22437 - Posted: 07.14.2016