Chapter 16


Links 41 - 60 of 2373

Laura Sanders Specialized cells that make up the brain’s GPS system have an expanding job description. In addition to mapping locations, these cells can keep track of distance and time, too, scientists report in the Nov. 4 Neuron. Those specialized cells, called grid cells, were thought to have a very specific job, says neuroscientist Loren Frank of the University of California, San Francisco. But, he says, the new study says, “not so fast, everybody.” These cells’ ability to detect time and distance is unexpected. “And I think it’s important,” Frank says. The growing to-do list of grid cells shows that the brain’s navigational system is surprisingly flexible. The discovery of grid cells, found in a part of the brain called the entorhinal cortex, was recognized with the Nobel Prize last year (SN Online: 10/6/14). These brain cells fire off regular signals as animals move around in space, partially forming an internal map of the environment. Neuroscientist Howard Eichenbaum of Boston University and colleagues wondered what those cells do when an animal stays put. By training rats to run on a treadmill, the researchers had a way to study grid cells as time and distance marched forward, but location remained the same. Unlike recently discovered “speed cells” (SN: 8/8/15, p. 8), these grid cells don’t change their firing rates to correspond to changes in the rats’ swiftness, the researchers found. Instead, these cells stay tuned to distance or time, or both. © Society for Science & the Public 2000 - 2015.

Keyword: Learning & Memory
Link ID: 21606 - Posted: 11.05.2015

Natasha Gilbert The eye-catching plumage of some male songbirds has long been explained as a result of sexual selection: brighter males compete more successfully for mates, so evolution favours their spread. Females, by contrast, remain drab. A new study turns this explanation on its head. Sexual-selection pressures drive females to evolve dull feathers more strongly than they drive males to become colourful, argues James Dale, an evolutionary ecologist at Massey University in Auckland, New Zealand. That surprising conclusion is based on a data set of plumage colour in nearly 6,000 songbirds, which Dale and his colleagues built. They used their data to ask how various potential evolutionary factors drive male and female plumage colour. If a particular songbird species was polygynous (that is, the males had more than one mate), displayed a large difference in size between males and females, and left care of the young mainly up to the females, then the researchers judged that sexual selection was likely to be an important factor in that species' evolution. The study, published in Nature, found that sexual selection does play an important role in creating colour differences between male and female plumage. But the contrast is largely driven by females evolving to become drab. “Females are the chief architect of the difference,” says Dale. © 2015 Nature Publishing Group

Keyword: Sexual Behavior; Evolution
Link ID: 21605 - Posted: 11.05.2015

Laura Sanders Blood tells a story about the body it inhabits. As it pumps through vessels, delivering nutrients and oxygen, the ruby red liquid picks up information. Hormones carried by blood can hint at how hungry a person is, or how scared, or how sleepy. Other messages in the blood can warn of heart disease or announce a pregnancy. When it comes to the brain, blood also seems to be more than a traveling storyteller. In some cases, the blood may be writing the script. A well-fed brain is crucial to survival. Blood ebbs and flows within the brain, moving into active areas in response to the brain’s demands for fuel. Now scientists have found clues that blood may have an even more direct and powerful influence. Early experiments suggest that, instead of being at the beck and call of nerve cells, blood can actually control them. This role reversal hints at an underappreciated layer of complexity — a layer that may turn out to be vital to how the brain works. The give-and-take between brain and blood appears to change with age and with illness, researchers are finding. Just as babies aren’t born walking, their developing brain cells have to learn how to call for blood. And a range of age-related disorders, including Alzheimer’s disease, have been linked to dropped calls between blood and brain, a silence that may leave patches of brain unable to do their jobs. © Society for Science & the Public 2000 - 2015

Keyword: Brain imaging
Link ID: 21604 - Posted: 11.05.2015

Doubts are emerging about one of our leading models of consciousness. It seems that brain signals thought to reflect consciousness are also generated during unconscious activity. A decade of studies have lent credence to the global neuronal workspace theory of consciousness, which states that when something is perceived unconsciously, or subliminally, that information is processed locally in the brain. In contrast, conscious perception occurs when the information is broadcast to a “global workspace”, or assemblies of neurons distributed across various brain regions, leading to activity over the entire network. Proponents of this idea, Stanislas Dehaene at France’s National Institute of Health and Medical Research in Gif-sur-Yvette and his colleagues, discovered that when volunteers view stimuli that either enter conscious awareness or don’t, their brains show identical EEG activity for the first 270 milliseconds. Then, if perception of the stimuli is subliminal, the brain activity peters out. However, when volunteers become conscious of the stimuli, there is a sudden burst of widespread brain activity 300 ms after the stimulus. This activity is characterised by an EEG signal called P3b, and has been called a neural correlate of consciousness. Brian Silverstein and Michael Snodgrass at the University of Michigan in Ann Arbor, and colleagues wondered if P3b could be detected during unconscious processing of stimuli. © Copyright Reed Business Information Ltd.

Keyword: Consciousness
Link ID: 21603 - Posted: 11.05.2015

Sara Reardon Military-service members can suffer brain injury and memory loss when exposed to explosions in enclosed spaces, even if they do not sustain overt physical injury. A strategy designed to improve memory by delivering brain stimulation through implanted electrodes is undergoing trials in humans. The US military, which is funding the research, hopes that the approach might help many of the thousands of soldiers who have developed deficits in their long-term memory as a result of head trauma. At the Society for Neuroscience meeting in Chicago, Illinois, on 17–21 October, two teams funded by the Defense Advanced Research Projects Agency presented evidence that such implanted devices can improve a person’s ability to retain memories. By mimicking the electrical patterns that create and store memories, the researchers found that gaps caused by brain injury can be bridged. The findings raise hopes that a ‘neuroprosthetic’ that automatically enhances flagging memory could aid not only brain-injured soldiers, but also people who have had strokes — or even those who have lost some power of recall through normal ageing. Because of the risks associated with surgically placing devices in the brain, both groups are studying people with epilepsy who already have implanted electrodes. The researchers can use these electrodes both to record brain activity and to stimulate specific groups of neurons. Although the ultimate goal is to treat traumatic brain injury, these people might benefit as well, says biological engineer Theodore Berger at the University of Southern California (USC) in Los Angeles. That is because repeated seizures can destroy the brain tissue needed for long-term-memory formation. © 2015 Nature Publishing Group

Keyword: Learning & Memory; Robotics
Link ID: 21600 - Posted: 11.04.2015

Scientists have come up with a questionnaire they say should help diagnose a condition called face blindness. Prosopagnosia, as doctors call it, affects around two in every 100 people in the UK and is the inability to recognise people by their faces alone. In its most extreme form, people cannot even recognise their family or friends. Milder forms, while still distressing, can be tricky to diagnose, which is why tests are needed. People with prosopagnosia often use non-facial cues to recognise others, such as their hairstyle, clothes, voice, or distinctive features. Some may be unaware they have the condition, instead believing they have a "bad memory for faces". But prosopagnosia is entirely unrelated to intelligence or broader memory ability. One [anonymous] person with prosopagnosia explains: "My biggest problem is seeing the difference between ordinary-looking people, especially faces with few specific traits. "I work at a hospital with an awful lot of employees and I often introduce myself to colleagues with whom I have worked several times before. I also often have problems recognising my next-door neighbour, even though we have been neighbours for eight years now. She often changes clothes, hairstyle and hair colour. When I strive to recognise people, I try to use technical clues like clothing, hairstyle, scars, glasses, their dialect and so on." Doctors can use computer-based tests to see if people can spot famous faces and memorise and recognise a set of unfamiliar faces. And now Drs Richard Cook and Punit Shah, of City University London and King's College London, have come up with a 20-item questionnaire to help measure the severity of someone's face blindness. © 2015 BBC

Keyword: Attention
Link ID: 21598 - Posted: 11.04.2015

By Christof Koch Artificial intelligence has been much in the news lately, driven by ever cheaper computer processing power that has become effectively a near universal commodity. The excitement swirls around mathematical abstractions called deep convolutional neural networks, or ConvNets. Applied to photographs and other images, the algorithms that implement ConvNets identify individuals from their faces, classify objects into one of 1,000 distinct categories (cheetah, husky, strawberry, catamaran, and so on)—and can describe whether they see “two pizzas sitting on top of a stove top oven” or “a red motorcycle parked on the side of the road.” All of this happens without human intervention. Researchers looking under the hood of these powerful algorithms are surprised, puzzled and entranced by the beauty of what they find. How do ConvNets work? Conceptually they are but one or two generations removed from the artificial neural networks developed by engineers and learning theorists in the 1980s and early 1990s. These, in turn, are abstracted from the circuits neuroscientists discovered in the visual system of laboratory animals. Already in the 1950s a few pioneers had found cells in the retinas of frogs that responded vigorously to small, dark spots moving on a stationary background, the famed “bug detectors.” Recording from the part of the brain's outer surface that receives visual information, the primary visual cortex, Torsten Wiesel and the late David H. Hubel, both then at Harvard University, found in the early 1960s a set of neurons they called “simple” cells. These neurons responded to a dark or a light bar of a particular orientation in a specific region of the visual field of the animal. © 2015 Scientific American
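The oriented “simple cells” Hubel and Wiesel described compute, in spirit, what a single convolutional filter in a ConvNet computes: a weighted sum over a small patch of the image, repeated at every position. A minimal sketch of that idea (the 3×3 kernel, toy images, and all names here are illustrative, not taken from any particular network in the article):

```python
import numpy as np

def convolve2d(image, kernel):
    """Naive 'valid' 2D cross-correlation: slide the kernel over the image
    and take a weighted sum at every position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# A filter that, like a "simple cell," prefers one orientation:
# positive weights on the left column, negative on the right.
vertical_edge = np.array([[1., 0., -1.],
                          [1., 0., -1.],
                          [1., 0., -1.]])

# Toy 5x5 images: a vertical light/dark boundary and a horizontal one.
vertical_bar = np.zeros((5, 5)); vertical_bar[:, :2] = 1.0
horizontal_bar = np.zeros((5, 5)); horizontal_bar[:2, :] = 1.0

v_resp = np.abs(convolve2d(vertical_bar, vertical_edge)).max()    # 3.0
h_resp = np.abs(convolve2d(horizontal_bar, vertical_edge)).max()  # 0.0
```

The filter responds strongly to the boundary at its preferred orientation and not at all to the perpendicular one, much as a simple cell responds to a bar of a particular orientation in its patch of the visual field. A ConvNet stacks many layers of such learned filters.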

Keyword: Vision; Robotics
Link ID: 21597 - Posted: 11.03.2015

Alison Abbott Europe’s troubled Human Brain Project (HBP) has secured guarantees of European Commission financing until at least 2019 — but some scientists are still not sure that they want to take part in the mega-project, which has been fraught with controversy since its launch two years ago. On 30 October, the commission signed an agreement formally committing to fund the HBP past April 2016, when its preliminary 30-month ‘ramp-up’ phase ends. The deal also starts a process to change the project’s legal status so as to spread responsibility across many participating institutions. The commission hopes that this agreement will restore lost confidence in the HBP, which aims to better understand the brain using information and computing technologies, primarily through simulation. Last year, hundreds of scientists signed a petition claiming that the project was being mismanaged and was running off its scientific course; they pledged to boycott it if their concerns were ignored. Since then, the HBP has been substantially reformed along lines recommended by a mediation committee. It has dissolved its three-person executive board, which had assumed most of the management power. And it has committed to including studies on cognitive neuroscience (which the triumvirate had wanted to eliminate). It also opened up for general competition some €8.9 million (US$10 million) of cash previously allocated only to project insiders, and in September selected four major projects in systems and cognitive neuroscience proposed by different groups around Europe. © 2015 Nature Publishing Group

Keyword: Brain imaging
Link ID: 21596 - Posted: 11.03.2015

By SINDYA N. BHANOO Some kinds of itching can be caused by the lightest of touches, a barely felt graze that rustles tiny hairs on the skin’s surface. This type of itch is created via a dedicated neural pathway, a new study suggests. The finding, which appears in the journal Science, could help researchers better understand chronic itchiness in conditions like eczema, diabetic neuropathy, multiple sclerosis and some cancers. The study also may help researchers determine why certain patients do not respond well to antihistamine drugs. “In the future, we may have some way to manipulate neuron activity to inhibit itching,” said Qiufu Ma, a neurobiologist at Harvard University and one of the study’s authors. In the study, Dr. Ma and his colleagues inhibited neurons that express neuropeptide Y, or NPY, in mice. When these neurons were suppressed and the mice were poked with a tiny filament, they fell into scratching fits. Normally, mice would not even respond to this sort of stimulus. “We start to see skin lesions — they don’t stop scratching,” Dr. Ma said. “It’s pretty traumatic.” The neurons only seem related to itches prompted by light touching, known as mechanically induced itches. Chemical itches, like those caused by a mosquito bite or an allergic reaction, are not transmitted by the same neurons. © 2015 The New York Times Company

Keyword: Pain & Touch
Link ID: 21593 - Posted: 11.03.2015

by Bethany Brookshire Cheese is a delicious invention. But if you saw the news last week, you might think it’s on its way to being classified as a Schedule II drug. Headlines proclaimed “Say cheese? All the time? Maybe you have an addiction,” “Cheese really is crack” and “Your cheese addiction is real.” Under the headlines, the stories referred to a study examining the addictive properties of various foods. Pizza was at the top. The reason? The addictive properties of cheese, which the articles claim contains “dangerous” opiate-like chemicals called casomorphins. But you can’t explain away your affinity for cheese by saying you’re addicted. The study in those stories, published earlier this year in PLOS ONE, did investigate which foods are most associated with addictive-like eating behaviors. Pizza did come out on top in one experiment. But the scientists who did the research say this has little to do with the delicious dairy products involved. Instead, they argue, the foods we crave the most are those processed to have high levels of sugars and fat, and it’s these ingredients that leave us coming back for another slice. The cheese? “I was horrified by the misstatements and the oversimplifications … and the statements about how it’s an excuse to overeat,” says Ashley Gearhardt of the University of Michigan in Ann Arbor, who led the study. “Liking is not the same as addiction. We like lots of things. I like hip-hop music and sunshine and my wiener dog, but I’m not addicted to her. I eat cheese every day. That doesn’t mean you’re addicted or it has addictive potential.” © Society for Science & the Public 2000 - 2015

Keyword: Drug Abuse
Link ID: 21592 - Posted: 11.02.2015

By Simon Makin Most people have felt depressed or anxious, even if those feelings have never become debilitating. And how many times have you heard someone say, “I'm a little OCD”? Clearly, people intuitively think that most mental illnesses have a spectrum, ranging from mild to severe. Yet most people do not know what it feels like to hallucinate—to see or hear things that are not really there—or to have delusions, persistent notions that do not match reality. You're psychotic, or you're not, according to conventional wisdom. Evidence is growing, however, that there may be no clear dividing line. Psychiatrists have long debated whether psychosis exists on a spectrum, and researchers have been investigating the question for more than a decade now. A 2013 meta-analysis, combining much of the existing data, by Jim van Os of Maastricht University in the Netherlands and Richard Linscott of the University of Otago in New Zealand, found the prevalence of hallucinations and delusions in the general population was 7.2 percent—much higher than the 0.4 percent prevalence of schizophrenia diagnoses found in recent studies. Now the most comprehensive epidemiological study of psychotic experiences to date, published in July in JAMA Psychiatry, has given researchers the most detailed picture yet of how many people have these experiences and how frequently. The results strongly imply a spectrum—and suggest that the standard treatment for a psychotic episode might be due for an overhaul. After ruling out experiences caused by drugs or sleep, the researchers determined that 5.8 percent of the respondents had psychotic experiences. Two thirds of these people had had only one type of episode, with hallucinations being four times more common than delusions. © 2015 Scientific American

Keyword: Schizophrenia
Link ID: 21591 - Posted: 11.02.2015

By Jennie Baird Last week’s news that Sesame Street was introducing the first autistic Muppet was met in my house with a resounding, “Huh?” “But there already is an autistic Muppet,” my high-functioning 14-year-old said. “Fozzie Bear.” I had never thought of Fozzie that way, but my son had a point. Fozzie is not good at taking social cues; he doesn’t read a room well and he tends to monologue and perseverate (to repeat himself long after the need has passed). He interprets figurative language as literal — remember that fork in the road in “The Muppet Movie?” He has a verbal tic he falls back on, “wokka-wokka.” And he hates to be separated from his hat for no obvious reason. I’ve tested this theory on friends and have seen the light bulb of recognition go off every time. Of course Fozzie has autism! One friend, a mother whose son is also on the spectrum even told me her family had the exact same conversation. Sesame Street hopes children will identify with their new character Julia, described as a “friend who has autism,” and appearing, for now, only in the book “We’re Amazing 1-2-3!” There is no question, the mere presence of Julia is a positive development. But she also introduces a rarely discussed complication of autism. Let’s call it the Fozzie Conundrum. I’m particularly sensitive to the Fozzie Conundrum now that my son attends regular honors classes in a regular public high school. Naturally sociable and charismatic — and with eight years of support and interventions from a team of terrific teachers and therapists at specialized schools — he can easily “pass” as a regular, funny, quirky teenager. © 2015 The New York Times Company

Keyword: Autism
Link ID: 21590 - Posted: 11.02.2015

Joanne Silberner Each year, nearly three times as many Americans die from suicide as from homicide. More Americans kill themselves than die from breast cancer. As Thomas Insel, longtime head of the National Institute of Mental Health, prepared to step down from his job in October, he cited the lack of progress in reducing the number of suicides as his biggest disappointment. While the homicide rate in the US has dropped 50 percent since the early 1990s, the suicide rate is higher than it was a decade ago. "That to me is unacceptable," Insel says. It hasn't been for lack of trying. The US has a national suicide hotline and there are suicide prevention programs in every state. There's screening, educational programs, and midnight walks to raise awareness. Yet over the last decade or so, the national suicide rate has increased. In 2003, the suicide rate was 10.8 per 100,000 people. In 2013, it was 12.6. An effort that began in Detroit in 2001 to treat depression, the most common cause of suicide, is offering hope. With a relentless focus on finding and treating people with depression, the Henry Ford Health System has cut the suicide rate among the people in its insurance plan dramatically. The story of the health system's success is a story of persistence, confidence, hope and a strict adherence to a very specific approach. © 2015 npr

Keyword: Depression
Link ID: 21589 - Posted: 11.02.2015

By Hanae Armitage Fake fingerprints might sound like just another ploy to fool the feds. But the world’s first artificial prints—reported today—have even cooler applications. The electronic material, which mimics the swirling designs imprinted on every finger, can sense pressure, temperature, and even sound. Though the technology has yet to be tested outside the lab, researchers say it could be key to adding sensation to artificial limbs or even enhancing the senses we already have. “It’s an interesting piece of work,” says John Rogers, materials scientist at the University of Illinois, Urbana-Champaign, who was not involved in the study. “It really adds to the toolbox of sensor types that can be integrated with the skin.” Electronic skins, known as e-skins, have been in development for years. There are several technologies used to mimic the sensations of real human skin, including sensors that can monitor health factors like pulse or temperature. But previous e-skins have been able to “feel” only two sensations: temperature and pressure. And there are additional challenges when it comes to replicating fingertips, especially when it comes to mimicking their ability to sense even minuscule changes in texture, says Hyunhyub Ko, a chemical engineer at Ulsan National Institute of Science and Technology in South Korea. So in the new study, Ko and colleagues started with a thin, flexible material with ridges and grooves much like natural fingerprints. This allowed them to create what they call a “microstructured ferroelectric skin.” The e-skin’s perception of pressure, texture, and temperature all come from a highly sensitive structure called an interlocked microdome array—the tiny domes sandwiched in the bottom two layers of the e-skin, also shown in the figure below. © 2015 American Association for the Advancement of Science

Keyword: Pain & Touch; Robotics
Link ID: 21588 - Posted: 10.31.2015

Laura Sanders A fly tickling your arm hair can spark a maddening itch. Now, scientists have spotted nerve cells in mice that curb this light twiddling sensation. If humans possess similar itch-busters, the results, published in the Oct. 30 Science, could lead to treatments for the millions of people who suffer from intractable, chronic itch. For many of these people, there are currently no good options. “This is a major problem,” says clinician Gil Yosipovitch of Temple University School of Medicine in Philadelphia and director of the Temple Itch Center. The new study shows that mice handle an itch caused by a fluttery touch differently than other kinds of itch. This distinction “seems to have clinical applications that clearly open our field,” Yosipovitch says. In recent years, scientists have made progress teasing apart the pathways that carry itchy signals from skin to spinal cord to brain (SN: 11/22/2008, p. 16). But those itch signals often originate from chemicals, such as those delivered by mosquitoes. All that’s needed to spark a different sort of itch, called mechanical itch, is a light touch on the skin. The existence of this kind of itch is no surprise, Yosipovitch says. Mechanical itch may help explain why clothes or even dry, scaly skin can be itchy. The new finding came from itchy mice engineered to lack a type of nerve cell in their spinal cords. Without prompting, these mice scratched so often that they developed sore bald patches on their skin. © Society for Science & the Public 2000 - 2015

Keyword: Pain & Touch
Link ID: 21587 - Posted: 10.31.2015

A single variation in the gene for brain-derived neurotrophic factor (BDNF) may influence obesity in children and adults, according to a new study funded by the National Institutes of Health. The study suggests that a less common version of the BDNF gene may predispose people to obesity by producing lower levels of BDNF protein, a regulator of appetite, in the brain. The authors propose that boosting BDNF protein levels may offer a therapeutic strategy for people with the genetic variation, which tends to occur more frequently in African Americans and Hispanics, than in non-Hispanic Caucasians. The study is published in the journal Cell Reports. Obesity in children and adults is a serious issue in the United States, contributing to health conditions such as heart disease, stroke and type 2 diabetes. Importantly, genetic factors can predispose a person to obesity, as well as influence the effectiveness of weight-loss strategies. The body relies on cells to process and store energy, and changes in genes that regulate these functions can cause an imbalance that leads to excessive energy storage and weight gain. “The BDNF gene has previously been linked to obesity, and scientists have been working for several years to understand how changes in this particular gene may predispose people to obesity,” said Jack A. Yanovski, M.D., Ph.D., one of the study authors and an investigator at NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). “This study explains how a single genetic change in BDNF influences obesity and may affect BDNF protein levels. Finding people with specific causes of obesity may allow us to evaluate effective, more-personalized treatments.”

Keyword: Obesity; Genes & Behavior
Link ID: 21585 - Posted: 10.31.2015

Heidi Ledford An analysis of 53 weight-loss studies that included more than 68,000 people has concluded that, despite their popularity, low-fat diets are no more effective than higher-fat diets for long-term weight loss. And overall, neither type of diet works particularly well. A year after their diets started, participants in the 53 studies were, on average, only about 5 kilograms (11 pounds) lighter. “That’s not that impressive,” says Kevin Hall, a physiologist at the US National Institute of Diabetes and Digestive and Kidney Diseases in Bethesda, Maryland. “All of these prescriptions for dieting seem to be relatively ineffective in the long term.” The study, published in The Lancet Diabetes & Endocrinology, runs counter to decades' worth of medical advice and adds to a growing consensus that the widespread push for low-fat diets was misguided. Nature looks at why low-fat diets were so popular and what diet doctors might prescribe next. Are the new findings a surprise? The advantages of low-fat diets have long been in question. “For decades we’ve been touting low-fat diets as the way to lose weight, but obesity has gone up,” says Deirdre Tobias, lead author of the study and an epidemiologist at Brigham and Women’s Hospital in Boston, Massachusetts. “It seemed evident that low-fat diets may not be the way to go.” © 2015 Nature Publishing Group

Keyword: Obesity
Link ID: 21584 - Posted: 10.31.2015

By KATHARINE Q. SEELYE NEWTON, N.H. — When Courtney Griffin was using heroin, she lied, disappeared, and stole from her parents to support her $400-a-day habit. Her family paid her debts, never filed a police report and kept her addiction secret — until she was found dead last year of an overdose. At Courtney’s funeral, they decided to acknowledge the reality that redefined their lives: Their bright, beautiful daughter, just 20, who played the French horn in high school and dreamed of living in Hawaii, had been kicked out of the Marines for drugs. Eventually, she overdosed at her boyfriend’s grandmother’s house, where she died alone. “When I was a kid, junkies were the worst,” Doug Griffin, 63, Courtney’s father, recalled in their comfortable home here in southeastern New Hampshire. “I used to have an office in New York City. I saw them.” When the nation’s long-running war against drugs was defined by the crack epidemic and based in poor, predominantly black urban areas, the public response was defined by zero tolerance and stiff prison sentences. But today’s heroin crisis is different. While heroin use has climbed among all demographic groups, it has skyrocketed among whites; nearly 90 percent of those who tried heroin for the first time in the last decade were white. And the growing army of families of those lost to heroin — many of them in the suburbs and small towns — are now using their influence, anger and grief to cushion the country’s approach to drugs, from altering the language around addiction to prodding government to treat it not as a crime, but as a disease. © 2015 The New York Times Company

Keyword: Drug Abuse
Link ID: 21583 - Posted: 10.31.2015

Susan Milius Electric eels are even more shocking than biologists thought. When prey fights back, eels just — curl their tails. Muscle has evolved “into a battery” independently in two groups of fishes, explains Kenneth Catania of Vanderbilt University in Nashville. Smaller species send out slight tingles of electric current that detect the fish’s surroundings in murky nighttime water. People can handle these small fishes and not feel even a tickle. But touching the bigger Electrophorus electricus (a member of a South American group of battery-included fishes) “is reminiscent of walking into an electric fence on a farm,” Catania says. (He knows, unintentionally, from experience.) The modified muscle that works as an electricity-generating organ in the eel has just on/off power. But eels have a unique way of intensifying the effect, Catania reports October 28 in Current Biology. Catania has tussled with eels using what he calls his electric eel chew toy — a dead fish on a stick with electrodes inside the carcass to measure current. When fighting difficult prey like the recalcitrant toy, eels curl their tails toward the fish struggling in their jaws. This bend puts the electrically negative tail-end of the long battery organ closer to the electrically positive front end, effectively concentrating the electric field on the prey. An eel’s tail curl can double the strength of the electric field convulsing the prey. © Society for Science & the Public 2000 - 2015.
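The field-doubling effect can be pictured with a toy superposition calculation (all distances and charge values here are illustrative stand-ins, not measurements from the study): for a prey item on the line between the eel's positive head pole and its negative tail pole, the fields from the two opposite poles point the same way, so their magnitudes add, and curling the tail close roughly doubles the field.

```python
def field_at_prey(d_head, d_tail, kq=1.0):
    """Magnitude of the summed inverse-square field at a point lying
    between two opposite point poles (both contributions point the
    same way there, so the magnitudes add)."""
    return kq / d_head**2 + kq / d_tail**2

# Prey gripped near the head (distance 1 unit from the + pole);
# tail either trailing far behind or curled up next to the prey.
straight = field_at_prey(d_head=1.0, d_tail=10.0)  # tail far: 1.01
curled = field_at_prey(d_head=1.0, d_tail=1.0)     # tail curled: 2.0
```

With the tail curled to the same distance as the head, the field at the prey is roughly twice what it is with the tail trailing far away, consistent with the doubling Catania measured with his instrumented "chew toy."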

Keyword: Animal Communication; Aggression
Link ID: 21581 - Posted: 10.29.2015

By Nicholas Bakalar Certain personality traits are often attributed to oldest, middle and youngest children. But a new study found that birth order itself had no effect on character, though it may slightly affect intelligence. Researchers analyzed three large ongoing collections of data including more than 20,000 people: a British study that follows the lives of people who were born in one particular week in 1958, a German study of private households started in 1984 and a continuing study of Americans born between 1980 and 1984. They searched for differences in extroversion, emotional stability, agreeableness, conscientiousness, self-reported intellect, IQ, imagination and openness to experience. They analyzed families with sisters and brothers, large and small age gaps and different numbers of siblings. They even looked to see if being a middle child correlated with any particular trait. But no matter how they sliced the data, they could find no association of birth order with any personality characteristic. The study, in Proceedings of the National Academy of Sciences, did find evidence that older children have a slight advantage in IQ scores, but the difference was apparent only in a large sample, with little significance for any individual. The lead author, Julia M. Rohrer, a graduate student at the University of Leipzig, said that birth order can have an effect — if your older brother bullied you, for example. “But these effects are highly idiosyncratic,” she said. “There is no such thing as a typical older, middle or younger sibling. It’s important to stop believing that you are the way you are because of birth order.” © 2015 The New York Times Company

Keyword: Development of the Brain; Intelligence
Link ID: 21578 - Posted: 10.29.2015