Most Recent Links
By Annie Sneed It's easy to recall events of decades past—birthdays, high school graduations, visits to Grandma—yet who can remember being a baby? Researchers have tried for more than a century to identify the cause of “infantile amnesia.” Sigmund Freud blamed it on repression of early sexual experiences, an idea that has been discredited. More recently, researchers have attributed it to a child's lack of self-perception, language or other mental equipment required to encode memories. Neuroscientists Paul Frankland and Sheena Josselyn, both at the Hospital for Sick Children in Toronto, do not think linguistics or a sense of self offers a good explanation, either. It so happens that humans are not the only animals that experience infantile amnesia. Mice and monkeys also forget their early childhood. To account for the similarities, Frankland and Josselyn have another theory: the rapid birth of many new neurons in a young brain blocks access to old memories. In a new experiment, the scientists manipulated the rate at which hippocampal neurons grew in young and adult mice. The hippocampus is the region in the brain that records autobiographical events. The young mice with slowed neuron growth had better long-term memory. Conversely, the older mice with increased rates of neuron formation had memory loss. Based on these results, published in May in the journal Science, Frankland and Josselyn think that rapid neuron growth during early childhood disrupts the brain circuitry that stores old memories, making them inaccessible. Young children also have an underdeveloped prefrontal cortex, another region of the brain that encodes memories, so infantile amnesia may be a combination of these two factors. © 2014 Scientific American
By Sandra G. Boodman At first the rash didn’t bother her, said Julia Omiatek, recalling the itchy red bumps that suddenly appeared one day on her palm, near the base of her first and third fingers. It was January 2013 — the dead of winter in Columbus, Ohio — so when the area reddened and cracked a few weeks later, she assumed her problem was simply dry skin and slathered on some cream. Omiatek, then 35, had little time to ponder the origin of her problem. An occupational therapist who works with adult patients, she was also raising two children younger than 3. A few weeks later when her lips swelled and the rash appeared on her face, she decided it was time to consult her dermatologist. Skin problems were nothing new; Omiatek was so allergic to nickel that her mother had had to sew cloth inside her onesies to prevent the metal snaps from touching her skin and causing a painful irritation. Over the years she had learned to avoid nickel and contend with occasional, inexplicable rashes that seemed to clear up when she used Elidel, a prescription cream that treats eczema. But this time the perpetually itchy rash didn’t go away, no matter what she did. Over the course of 11 months, she saw four doctors, three of whom said they didn’t know what was causing the stubborn eruption that eluded numerous tests. The fourth specialist took one look at her hand and figured it out. “The location was a tip-off,” said Matthew Zirwas, an assistant professor of dermatology at the Ohio State University Wexner Medical Center who specializes in treating unexplained rashes. Omiatek’s case was considerably less severe than that of many of the approximately 300 other patients he has treated for the same problem.
Keyword: Pain & Touch
Link ID: 19900 - Posted: 07.30.2014
By ANNA NORTH What does it mean to be lonely? It’s tempting to equate the feeling with a dearth of social interaction, but some people are now saying that it’s more complicated than that — and that true loneliness might be dangerous. In a story at Medium, Robin Marantz Henig busts some common loneliness myths. Lonely people aren’t necessarily weird or uncool: Ms. Henig cites a study of Ohio State undergrads showing that “those who called themselves lonely had just as much ‘social capital’ — defined by physical attractiveness, height, weight, socioeconomic status, and academic achievement — as their non-lonely peers.” And they may not be actually alone: “The students at Ohio State who were lonely belonged to as many clubs and had as many roommates as those who were ‘socially embedded.’ And while some studies indicate that living alone puts people at greater risk for loneliness, living with a spouse is not necessarily any protection.” Rather, loneliness may be psychological. The lonely, writes Ms. Henig, are more likely than others “to feel put upon and misunderstood” in social situations, to see “social danger even where none might exist.” She writes: “People grow lonely because of the gloomy stories they tell themselves. And, in a cruel twist, the loneliness itself can further distort their thinking, making them misread other people’s good intentions, which in turn causes them to withdraw to protect themselves from further rejection — and causes other people to keep them at arm’s length.” This distancing can have a physical impact; Ms. Henig argues that loneliness deserves further study, in part because it may increase the risk of high blood pressure, sleep problems and Alzheimer’s disease. © 2014 The New York Times Company
By Fikri Birey What’s the difference between you and a rat? The list is unsurprisingly long but now, we can cross a universal human experience — feelings of regret — off of it. A new study shows for the first time that rats regret bad decisions and learn from them. In addition to existentialist suggestions of a rat’s regret — and what that takes away from, or adds to, being “human” — the study is highly relevant to basic brain research. Researchers demonstrated that we can tap into complex internal states of rodents if we home in on the right behavior and the right neurons. There is a significant literature on what brain regions are representative of certain states, like reward predictions and value calculations, but the study, powered by a novel behavioral test, is able to put together such discrete behavioral correlates into a “rat” definition of regret. Finding better animal models of human behavior constitutes a long-standing challenge in neuroscience: It has been difficult to authentically recapitulate mental states in animal models of neuropsychiatric disorders. For example, an attempt to model depression in rodents can often go no further than relatively coarse approximations of core symptoms like guilt or sadness, which often translate to behaviors like social avoidance or anhedonia in rodents. The inability to efficiently approach the questions of mental abnormalities is a major problem. Depression is currently ranked as the leading cause of disability globally, and it’s estimated that by 2020, depression will lead 1.5 million people to end their lives by suicide. Now, thanks to a simple yet well-conceived series of experiments by Steiner and Redish, a compound behavior like regret is fully open to investigation. The investigators use a spatial decision-making set-up called “Restaurant Row”: an arena with four zones where four different flavors of food (banana, cherry, chocolate or unflavored) are introduced in sequence. © 2014 Scientific American
By Marek Kohn “You know how they say that we can only access 20% of our brain?” says the man who offers stressed-out writer Eddie Morra a fateful pill in the 2011 film Limitless. “Well, what this does, it lets you access all of it.” Morra is instantly transformed into a superhuman by the fictitious drug NZT-48. Granted access to all cognitive areas, he learns to play the piano in three days, finishes writing his book in four, and swiftly makes himself a millionaire. Limitless is what you get when you flatter yourself that your head houses the most complex known object in the universe, and you run away with the notion that it must have powers to match. A number of so-called ‘smart drugs’ or cognitive enhancers have captured attention recently, from stimulants such as modafinil to amphetamines (often prescribed under the name Adderall) and methylphenidate (also known by its brand name Ritalin). According to widespread news reports, students have begun using these drugs to enhance their performance in school and college, and are continuing to do so in their professional lives. Yet are these smart drugs all they are cracked up to be? Can they really make us more intelligent or help us learn more? Should we be asking deeper questions about what these pharmaceuticals can and can’t do? BBC © 2014
Posted by Celeste Biever | The four females and one male are on board a satellite as part of an experiment to investigate sexual activity and reproduction in microgravity carried out by Russia’s space agency. Roscosmos launched the lizards using a six-tonne Foton-M4 rocket on 19 July. But the fate of the tiny cosmonauts became uncertain when their satellite briefly lost contact with ground control on Thursday 24 July. Luckily, technicians managed to restore control on Saturday, and Roscosmos announced on its website that since then it has communicated with the satellite 17 times. “Contact is established, the prescribed commands have been conducted according to plan,” said Roscosmos chief Oleg Ostapenko. Keeping the geckos company are Drosophila fruit flies, as well as mushrooms, plant seeds and various microorganisms that are also being studied. There is also a special vacuum furnace on board, which is being used to analyse the melting and solidification of metal alloys in microgravity. Foton-M4 is set to carry out experiments over two months, and involves a “study of the effect of microgravity on sexual behaviour, the body of adult animals and embryonic development”, according to the website of the Institute of Medico-Biological Problems of the Russian Academy of Sciences, which has developed the project along with Roscosmos. Specific aims of the Gecko-F4 mission include:
- Create the conditions for sexual activity, copulation and reproduction of geckos in orbit
- Film the geckos’ sex acts and potential egg-laying, and maximise the likelihood that any eggs survive
- Detect possible structural and metabolic changes in the animals, as well as any eggs and foetuses
© 2014 Macmillan Publishers Limited.
Keyword: Sexual Behavior
Link ID: 19896 - Posted: 07.30.2014
By Ingrid Wickelgren One important function of your inner ear is stabilizing your vision when your head is turning. When your head turns one way, your vestibular system moves your eyes in the opposite direction so that what you are looking at remains stable. To see for yourself how your inner ears make this adjustment, called the vestibulo-ocular reflex, hold your thumb upright at arm’s length. Shake your head back and forth about twice per second while looking at your thumb. See that your thumb remains in focus. Now create the same relative motion by swinging your arm back and forth about five inches at the same speed. Notice that your thumb is blurry. To see an object clearly, the image must remain stationary on your retina. When your head turns, your vestibular system very rapidly moves your eyes in the opposite direction to create this stability. When the thumb moves, your visual system similarly directs the eyes to follow, but the movement is too slow to track a fast-moving object, causing blur. © 2014 Scientific American
By STEPHANIE FAIRYINGTON A few months ago, I was on a Manhattan-bound D train heading to work when a man with a chunky, noisy newspaper got on and sat next to me. As I watched him softly turn the pages of his paper, a chill spread like carbonated bubbles through the back of my head, instantly relaxing me and bringing me to the verge of sweet slumber. It wasn’t the first time I’d felt this sensation at the sound of rustling paper — I’ve experienced it as far back as I can remember. But it suddenly occurred to me that, as a lifelong insomniac, I might be able to put it to use by reproducing the experience digitally whenever sleep refused to come. Under the sheets of my bed that night, I plugged in some earphones, opened the YouTube app on my phone and searched for “Sound of pages.” What I discovered stunned me. There were nearly 2.6 million videos depicting a phenomenon called autonomous sensory meridian response, or A.S.M.R., designed to evoke a tingling sensation that travels over the scalp or other parts of the body in response to auditory, olfactory or visual forms of stimulation. The sound of rustling pages, it turns out, is just one of many A.S.M.R. triggers. The most popular stimuli include whispering; tapping or scratching; performing repetitive, mundane tasks like folding towels or sorting baseball cards; and role-playing, where the videographer, usually a breathy woman, softly talks into the camera and pretends to give a haircut, for example, or an eye examination. The videos span 30 minutes on average, but some last more than an hour. For those not wired for A.S.M.R. — and even for those who, like me, apparently are — the videos and the cast of characters who produce them — sometimes called “ASMRtists” or “tingle-smiths” — can seem weird, creepy or just plain boring. (Try pitching the pleasures of watching a nerdy German guy slowly and silently assemble a computer for 30 minutes.) © 2014 The New York Times Company
By Smitha Mundasad Health reporter, BBC News Scientists say a part of the brain, smaller than a pea, triggers the instinctive feeling that something bad is about to happen. Writing in the journal PNAS, they suggest the habenula plays a key role in how humans predict, learn from and respond to nasty experiences. And they question whether hyperactivity in this area is responsible for the pessimism seen in depression. They are now investigating whether the structure is involved in the condition. Animal studies have shown that the habenula fires up when subjects expect or experience adverse events. But in humans this tiny structure (less than 3mm in diameter) has proved difficult to see on scans. Inventing a technique to pinpoint the area, scientists at University College London put 23 people through MRI scanners to monitor their brain activity. Participants were shown a range of abstract pictures. A few seconds later, the images were linked to either punishment (painful electric shocks), reward (money) or neutral responses. For some images, a punishment or reward followed each time but for others this varied - leaving people uncertain whether they were going to feel pain or not. And when people saw pictures associated with shocks, the habenula lit up. And the more certain they were a picture was going to result in a punishment, the stronger and faster the activity in this area. Scientists suggest the habenula is involved in helping people learn when it is best to stay away from something and may also signal just how bad a nasty event is likely to be. BBC © 2014
By Jillian Rose Lim and LiveScience People who don't get enough sleep could be increasing their risk of developing false memories, a new study finds. In the study, when researchers compared the memory of people who'd had a good night's sleep with the memory of those who hadn't slept at all, they found that, under certain conditions, sleep-deprived individuals mix fact with imagination, embellish events and even "remember" things that never actually happened. False memories occur when people's brains distort how they remember a past event — whether it's what they did after work, how a painful relationship ended or what they witnessed at a crime scene. Memory is not an exact recording of past events, said Steven Frenda, a psychology Ph.D. student at the University of California, Irvine, who was involved in the study. Rather, fresh memories are constructed each time people mentally revisit a past event. During this process, people draw from multiple sources — like what they've been told by others, what they've seen in photographs or what they know as stereotypes or expectations, Frenda said. The new findings "have implications for people's everyday lives — recalling information for an exam, or in work contexts, but also for the reliability of eyewitnesses who may have experienced periods of restricted or deprived sleep," said Frenda, who noted that chronic sleep deprivation is on the rise. In a previous study, Frenda and his colleagues observed that people with restricted sleep (less than 5 hours a night) were more likely to incorporate misinformation into their memories of certain photos, and report they had seen video footage of a news event that didn't happen. In the current study, they wanted to see how a complete lack of sleep for 24 hours could influence a person's memory. © 2014 Scientific American
By Erik Schechter The folks who brought us the giant, smartphone-controlled cyborg cockroach are back—this time, with a wired-up scorpion. Be afraid. Backyard Brains, a small Michigan-based company dedicated to spreading the word about neuroscience, has been running surgical experiments on these deadly arachnids for the past two months, using electrical current to induce them to strike. Dylan Miller, a summer intern working on the project, insists it's the first time that an electrical current has ever been used to remotely induce a scorpion to strike with its pedipalps (claws) and tail. "I was originally looking at how scorpions sense the ground vibrations of their prey," says Miller, a neuroscience major at Michigan State University, "and I just kind of stumbled on this defensive response." In retrospect, it's easy to see how Miller got there. Scorpions use vibrations and their tactile sense to navigate the world, identifying both prey and predator. A touch on the leg, for instance, tells a scorpion that it's under attack, provoking a defensive fight-or-flight reaction—either fleeing from danger or going full-out Bruce Lee. In nature, the scorpion would have to be physically touched for that to happen. But in the lab, an electrode to the leg nerves and a tiny, remote-controlled function generator feeding a signal will do the trick. The scorpion experiments build on the earlier work Backyard Brains has done with cockroaches, namely RoboRoach. A Kickstarter project back in June 2013 and now a real for-sale home kit, RoboRoach enables purchasers to surgically implant a live roach with three sets of electrodes and then control its movement with a smartphone app via a Bluetooth control unit worn on the roach's back. The controversial kit has been criticized as cruel by people like cognitive ethologist Marc Bekoff, but the company argues that RoboRoach's educational "benefits outweigh the cost."
Undaunted by the criticism, Backyard Brains co-founder Gregory Gage was already tossing around the idea of robo-scorpions last October. ©2014 Hearst Communication, Inc
Link ID: 19891 - Posted: 07.29.2014
By CATHERINE SAINT LOUIS “This has happened before,” she tells herself. “It’s nowhere near as bad as before, and it will pass.” Robbie Pinter’s 21-year-old son, Nicholas, is upset again. He yells. He obsesses about something that can’t be changed. Even good news may throw him off. So Dr. Pinter breathes deeply, as she was taught, focusing on each intake and release. She talks herself through the crisis, reminding herself that this is how Nicholas copes with his autism and bipolar disorder. With these simple techniques, Dr. Pinter, who teaches English at Belmont University in Nashville, blunts the stress of parenting a child with severe developmental disabilities. Dr. Pinter, who said she descends from “a long line of the most nervous women,” credits her mindfulness practice with giving her the tools to cope with whatever might come her way. “It is very powerful,” she said. All parents endure stress, but studies show that parents of children with developmental disabilities, like autism, experience depression and anxiety far more often. Struggling to obtain crucial support services, the financial strain of paying for various therapies, the relentless worry over everything from wandering to the future — all of it can be overwhelming. “The toll stress-wise is just enormous, and we know that we don’t do a really great job of helping parents cope with it,” said Dr. Fred R. Volkmar, the director of the Child Study Center at Yale University School of Medicine. “Having a child that has a disability, it’s all-encompassing,” he added. “You could see how people would lose themselves.” But a study published last week in the journal Pediatrics offers hope. It found that just six weeks of training in simple techniques led to significant reductions in stress, depression and anxiety among these parents. © 2014 The New York Times Company
Using data from over 18,000 patients, scientists have identified more than two dozen genetic risk factors involved in Parkinson’s disease, including six that had not been previously reported. The study, published in Nature Genetics, was partially funded by the National Institutes of Health (NIH) and led by scientists working in NIH laboratories. (Image caption: A gene chip. Scientists used gene chips to help discover new genes that may be involved with Parkinson's disease.) “Unraveling the genetic underpinnings of Parkinson’s is vital to understanding the multiple mechanisms involved in this complex disease, and hopefully, may one day lead to effective therapies,” said Andrew Singleton, Ph.D., a scientist at the NIH’s National Institute on Aging (NIA) and senior author of the study. Dr. Singleton and his colleagues collected and combined data from existing genome-wide association studies (GWAS), which allow scientists to find common variants, or subtle differences, in the genetic codes of large groups of individuals. The combined data included 13,708 Parkinson’s disease cases and 95,282 controls, all of European ancestry. The investigators identified potential genetic risk variants, which increase the chances that a person may develop Parkinson’s disease. Their results suggested that the more variants a person has, the greater the risk, up to three times higher, for developing the disorder in some cases.
By DOUGLAS QUENQUA Like Pavlov’s dogs, most organisms can learn to associate two events that usually occur together. Now, a team of researchers says it has identified a gene that enables such learning. The scientists, at the University of Tokyo, found that worms could learn to avoid unpleasant situations as long as a specific insulin receptor remained intact. Roundworms were exposed to different concentrations of salt; some received food during the initial exposure, others did not. Later, when exposed to various concentrations of salt again, the roundworms that had been fed during the first stage gravitated toward their initial salt concentrations, while those that had been starved avoided them. But the results changed when the researchers repeated the experiment using worms with a defect in a particular receptor for insulin, a protein crucial to metabolism. Those worms could not learn to avoid the salt concentrations associated with starvation. “We looked for different forms of the receptor and found that a new one, which we named DAF-2c, functions in taste-aversion learning,” said Masahiro Tomioka, a geneticist at the University of Tokyo and an author of the study, which was published in the journal Science. “It turned out that only this form of the receptor can support learning” in roundworms. While human insulin receptors bear some resemblance to those of a roundworm, more study is needed to determine whether the human receptor plays a similar role in memory and decision-making, Dr. Tomioka said. But studies have suggested a link between insulin levels and Alzheimer’s disease in humans. © 2014 The New York Times Company
By Smitha Mundasad Health reporter, BBC News Scientists have discovered a central hub of brain cells that may put the brakes on a desire to eat, a study in mice shows. And switching on these neurons can stop feeding immediately, according to the Nature Neuroscience report. Researchers say the findings may one day contribute to therapies for obesity and anorexia. Experts say this sheds light on the many complex nerve circuits involved in appetite control. Scientists from the California Institute of Technology suggest the nerve cells act as a central switchboard, combining and relaying many different messages in the brain to help reduce food intake. Using laser beams, they were able to stimulate the neurons - leading to a complete and immediate stop to food consumption. Prof David Anderson, lead author of the study, told the BBC: "It was incredibly surprising. "It was like you could just flick a switch and prevent the animals from feeding." Researchers then used chemicals to mimic a variety of scenarios - including feelings of satiety, malaise, nausea and a bitter taste. They found the neurons were active in all situations, suggesting they may be integral in the response to many diverse stimuli. BBC © 2014
Link ID: 19887 - Posted: 07.28.2014
By James Phillips Our inner ear is a marvel. The labyrinthine vestibular system within it is a delicate, byzantine structure made up of tiny canals, crystals and pouches. When healthy, this system enables us to keep our balance and orient ourselves. Unfortunately, a study in the Archives of Internal Medicine found that 35 percent of adults over age 40 suffer from vestibular dysfunction. A number of treatments are available for vestibular problems. During an acute attack of vertigo, vestibular suppressants and antinausea medications can reduce the sensation of motion as well as nausea and vomiting. Sedatives can help patients sleep and rest. Anti-inflammatory drugs can reduce any damage from acute inflammation and antibiotics can treat an infection. If a structural change in the inner ear has loosened some of its particulate matter—for instance, if otolith (calcareous) crystals, which are normally in tilt-sensitive sacs, end up in the semicircular canals, making the canals tilt-sensitive—simple repositioning exercises in the clinic can shake the loose material, returning it where it belongs. After a successful round of therapy, patients no longer sense that they are tilting whenever they turn their heads. If vertigo is a recurrent problem, injecting certain medications can reduce or eliminate the fluctuating function in the affected ear. As a last resort, a surgeon can effectively destroy the inner ear—either by directly damaging the end organs or by cutting the eighth cranial nerve fibers, which carry vestibular information to the brain. The latter surgery involves removing a portion of the skull and shifting the brain sideways, so it is not for the faint of heart. © 2014 Scientific American
Link ID: 19886 - Posted: 07.28.2014
By PAUL VITELLO The conventional wisdom among animal scientists in the 1950s was that birds were genetically programmed to sing, that monkeys made noise to vent their emotions, and that animal communication, in general, was less like human conversation than like a bodily function. Then Peter Marler, a British-born animal behaviorist, showed that certain songbirds not only learned their songs, but also learned to sing in a dialect peculiar to the region in which they were born. And that a vervet monkey made one noise to warn its troop of an approaching leopard, another to report the sighting of an eagle, and a third to alert the group to a python on the forest floor. These and other discoveries by Dr. Marler, who died July 5 in Winters, Calif., at 86, heralded a sea change in the study of animal intelligence. At a time when animal behavior was seen as a set of instinctive, almost robotic responses to environmental stimuli, he was one of the first scientists to embrace the possibility that some animals, like humans, were capable of learning and transmitting their knowledge to other members of their species. His hypothesis attracted a legion of new researchers in ethology, as animal behavior research is also known, and continues to influence thinking about cognition. Dr. Marler, who made his most enduring contributions in the field of birdsong, wrote more than a hundred papers during a long career that began at Cambridge University, where he received his Ph.D. in zoology in 1954 (the second of his two Ph.D.s.), and that took him around the world conducting field research while teaching at a succession of American universities. Dr. Marler taught at the University of California, Berkeley, from 1957 to 1966; at Rockefeller University in New York from 1966 to 1989; and at the University of California, Davis, where he led animal behavior research, from 1989 to 1994. He was an emeritus professor there at his death. © 2014 The New York Times Company
By KATE MURPHY ONE of the biggest complaints in modern society is being overscheduled, overcommitted and overextended. Ask people at a social gathering how they are and the stock answer is “super busy,” “crazy busy” or “insanely busy.” Nobody is just “fine” anymore. When people aren’t super busy at work, they are crazy busy exercising, entertaining or taking their kids to Chinese lessons. Or maybe they are insanely busy playing fantasy football, tracing their genealogy or churning their own butter. And if there is ever a still moment for reflective thought — say, while waiting in line at the grocery store or sitting in traffic — out comes the mobile device. So it’s worth noting a study published last month in the journal Science, which shows how far people will go to avoid introspection. “We had noted how wedded to our devices we all seem to be and that people seem to find any excuse they can to keep busy,” said Timothy Wilson, a psychology professor at the University of Virginia and lead author of the study. “No one had done a simple study letting people go off on their own and think.” The results surprised him and have created a stir in the psychology and neuroscience communities. In 11 experiments involving more than 700 people, the majority of participants reported that they found it unpleasant to be alone in a room with their thoughts for just 6 to 15 minutes. Moreover, in one experiment, 64 percent of men and 15 percent of women began self-administering electric shocks when left alone to think. These same people, by the way, had previously said they would pay money to avoid receiving the painful jolt. It didn’t matter if the subjects engaged in the contemplative exercise at home or in the laboratory, or if they were given suggestions of what to think about, like a coming vacation; they just didn’t like being in their own heads. © 2014 The New York Times Company
By Michael Brooks Occasionally, scientific research comes up with banal findings that should nonetheless stop us in our tracks. For example, researchers recently published a study showing that a father’s brain will change its hormonal outputs and neural activity depending on his parenting duties. The conclusion of the research is, in essence, that men make good parents, too. Surely this is not news. Yet it does provide evidence that is sadly still useful. Those involved with issues of adoption, fathers’ rights, gay rights, child custody, and religion-fuelled bigotry will all benefit from understanding what we now know about what makes a good parent. The biggest enemy of progress has been the natural world, or at least our view of it. Females are the primary caregivers in 95 percent of mammal species. That is mainly because of lactation. Infants are nourished by their mothers’ milk, so it makes sense for most early caring to be done by females. Human beings, however, have developed more sophisticated means of nourishing and raising our offspring. Should the circumstances require a different set-up, we have ways to cope. It turns out that this is not just in terms of formula milk, nannies or day care: We also have a flexible brain. The new study, published in Proceedings of the National Academy of Sciences, scanned the brains of parents while they watched videos of their interactions with their children. The researchers found that this stimulated activity in two systems of the brain. One is an emotional network that deals with social bonding, ensures vigilance and coordinates responses to distress, providing chemical rewards for behaviours that maintain the child’s well-being. The other network is concerned with mental processing. It monitors the child’s likely state of mind, emotional condition, and future needs, allowing for planning. 2014 © The New Republic.
Keyword: Sexual Behavior
Link ID: 19883 - Posted: 07.26.2014
By MICHAEL INZLICHT and SUKHVINDER OBHI I FEEL your pain. These words are famously associated with Bill Clinton, who as a politician seemed to ooze empathy. A skeptic might wonder, though, whether he truly was personally distressed by the suffering of average Americans. Can people in high positions of power — presidents, bosses, celebrities, even dominant spouses — easily empathize with those beneath them? Psychological research suggests the answer is no. Studies have repeatedly shown that participants who are in high positions of power (or who are temporarily induced to feel powerful) are less able to adopt the visual, cognitive or emotional perspective of other people, compared to participants who are powerless (or are made to feel so). For example, Michael Kraus, a psychologist now at the University of Illinois at Urbana-Champaign, and two colleagues found that among full-time employees of a public university, those who were higher in social class (as determined by level of education) were less able to accurately identify emotions in photographs of human faces than were co-workers who were lower in social class. (While social class and social power are admittedly not the same, they are strongly related.) Why does power leave people seemingly coldhearted? Some, like the Princeton psychologist Susan Fiske, have suggested that powerful people don’t attend well to others around them because they don’t need them in order to access important resources; as powerful people, they already have plentiful access to those. We suggest a different, albeit complementary, reason from cognitive neuroscience. On the basis of a study we recently published with the researcher Jeremy Hogeveen, in the Journal of Experimental Psychology: General, we contend that when people experience power, their brains fundamentally change how sensitive they are to the actions of others. © 2014 The New York Times Company