Links for Keyword: Attention



Links 41 - 60 of 420

Sara Reardon Two monkeys sit at computer screens, eyeing one another as they wait for a promised reward: apple juice. Each has a choice — it can either select a symbol that results in juice being shared equally, or pick one that delivers most of the juice to itself. But being selfish is risky. If its partner also chooses not to share, neither gets much juice. This game, the ‘prisoner’s dilemma’, is a classic test of strategy that involves the simultaneous evaluation of an opponent’s thinking. Researchers have now discovered — and manipulated — specific brain circuits in rhesus macaques (Macaca mulatta) that seem to be involved in the animals’ choices, and in their assessments of their partners’ choices. Investigating the connections could shed light on how social context affects decision-making in humans, and how disorders that affect social skills, such as autism spectrum disorder, disrupt brain circuitry. “Once we have identified that there are particular neural signals necessary to drive the processes, we can begin to tinker,” says Michael Platt, a neurobiologist at Duke University in Durham, North Carolina. Neurobiologists Keren Haroush and Ziv Williams of Harvard Medical School in Boston, Massachusetts, zoomed in on neural circuits in rhesus macaques by implanting electrode arrays into a brain area called the dorsal anterior cingulate cortex (dACC), which is associated with rewards and decision-making. The arrays recorded the activity of hundreds of individual neurons. When the monkeys played the prisoner’s dilemma (see ‘A juicy experiment’) against a computer program, they rarely chose to cooperate. But when they played with another monkey that they could see, they were several times more likely to choose to share the juice. © 2014 Nature Publishing Group

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 19297 - Posted: 02.26.2014

Want to read someone’s mind? Look at their pupils. A person about to answer “yes” to a question, especially if they are more used to answering “no,” will have larger pupils than someone about to answer “no,” according to a new study. Normally, pupils dilate when a person is in a darkened environment to let more light into the eye and allow better vision. But pupil size can also be altered by levels of signaling chemicals naturally produced by the brain. In the study, published online this week in the Proceedings of the National Academy of Sciences, scientists observed the pupils of 29 people as they pressed a “yes” or “no” button to indicate whether they’d seen a difficult-to-detect visual cue on a screen in front of them. When a person was deciding how to answer—in the seconds before pressing a button—their pupils grew larger. And if a person was normally biased toward answering “no” when they weren’t sure about the visual cue, then the pupil change was even more profound in the decision-making seconds before a “yes” answer. The finding could lead to new ways to detect people’s intrinsic biases and how confident they are in a given answer, important variables in many sociological and psychological studies. © 2014 American Association for the Advancement of Science.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 19166 - Posted: 01.25.2014

By DAN HURLEY Two and a half millenniums ago, a prince named Siddhartha Gautama traveled to Bodh Gaya, India, and began to meditate beneath a tree. Forty-nine days of continuous meditation later, tradition tells us, he became the Buddha — the enlightened one. More recently, a psychologist named Amishi Jha traveled to Hawaii to train United States Marines to use the same technique for shorter sessions to achieve a much different purpose: mental resilience in a war zone. “We found that getting as little as 12 minutes of meditation practice a day helped the Marines to keep their attention and working memory — that is, the added ability to pay attention over time — stable,” said Jha, director of the University of Miami’s Contemplative Neuroscience, Mindfulness Research and Practice Initiative. “If they practiced less than 12 minutes or not at all, they degraded in their functioning.” Jha, whose program has received a $1.7 million, four-year grant from the Department of Defense, described her results at a bastion of scientific conservatism, the New York Academy of Sciences, during a meeting on “The Science of Mindfulness.” Yet mindfulness hasn’t long been part of serious scientific discourse. She first heard another scientist mention the word “meditation” during a lecture in 2005. “I thought, I can’t believe he just used that word in this audience, because it wasn’t something I had ever heard someone utter in a scientific context,” Jha said. Although pioneers like Jon Kabat-Zinn, now emeritus professor at the University of Massachusetts Medical Center, began teaching mindfulness meditation as a means of reducing stress as far back as the 1970s, all but a dozen or so of the nearly 100 randomized clinical trials have been published since 2005. And the most recent studies of mindfulness — the simple, nonjudgmental observation of a person’s breath, body or just about anything else — are taking the practice in directions that might have shocked the Buddha. In addition to military fitness, scientists are now testing brief stints of mindfulness training as a means to improve scores on standardized tests and lay down new connections between brain cells. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19142 - Posted: 01.16.2014

By Emilie Reas “Come on. Get out of the express checkout lane! That’s way more than twelve items, lady.” Without having to count, you can make a good guess at how many purchases the shopper in front of you is making. She may think she’s pulling a fast one, but thanks to the brain’s refined sense for quantity, she’s not fooling anyone. This ability to perceive numerosity – or number of items – does more than help prevent express lane fraud; it also builds the foundation for our arithmetic skills, the economic system and our concept of value. Until recently, it’s remained a puzzle how the brain allows us to so quickly and accurately judge quantity. Neuroscientists believe that neural representations of most high-level cognitive concepts – for example, those involved in memory, language or decision-making – are distributed, in a relatively disorganized manner, throughout the brain. In contrast, highly organized, specialized brain regions have been identified that represent most lower-level sensory information, such as sights, sounds, or physical touch. Such areas resemble maps, in that sensory information is arranged in a logical, systematic spatial layout. Notably, this type of neural topography has previously been observed only for the basic senses, never for a high-level cognitive function. Researchers from the Netherlands may have discovered an exception to this rule, as reported in their recently published Science paper: a small brain area that represents numerosity along a continuous “map.” Just as we organize numbers along a mental “number line,” with one at the left, increasing in magnitude to the right, so is quantity mapped onto space in the brain. One side of this brain region responds to small numbers, the adjacent region to larger numbers, and so on, with numeric representations increasing to the far end. © 2014 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19135 - Posted: 01.15.2014

by Helen Thomson DRAW a line across a page, then write on it what you had for dinner yesterday and what you plan to eat tomorrow. If you are a native English speaker, or hail from pretty much any European country, you no doubt wrote last night's meal to the left of tomorrow night's. That's because we construct mental timelines to represent and reason about time, and most people in the West think of the past as on the left, and the future as on the right. Arnaud Saj at the University of Geneva, Switzerland, and his colleagues wondered whether the ability to conjure up a mental timeline is a necessary part of reasoning about events in time. To investigate, they recruited seven Europeans with what's called left hemispatial neglect. That means they have damage to parts of the right side of their brain, limiting their ability to detect, identify and interact with objects in the left-hand side of space. They may eat from only the right side of a plate, shave just the right side of their face, and ignore numbers on the left side of a clock. The team also recruited seven volunteers who had damage to the right side of their brain but didn't have hemispatial neglect, and seven people with undamaged brains. All the volunteers took part in a variety of memory tests. First, they learned about a fictional man called David. They were shown pictures of what David liked to eat 10 years ago, and what he might like to eat in 10 years' time. Participants were then shown drawings of 10 of David's favourite foods, plus four food items they hadn't seen before. Participants had to say whether it was a food that David liked in the past or might like in future. The tests were repeated with items in David's apartment, and his favourite clothes. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19095 - Posted: 01.04.2014

Associated Press A sophisticated, real-world study confirms that dialing, texting or reaching for a cell phone while driving raises the risk of a crash or near-miss, especially for younger drivers. But the research also produced a surprise: Simply talking on the phone did not prove dangerous, as it has in other studies. This one did not distinguish between handheld and hands-free devices - a major weakness. And even though talking doesn't require drivers to take their eyes off the road, it's hard to talk on a phone without first reaching for it or dialing a number - things that raise the risk of a crash, researchers note. Earlier work with simulators, test tracks and cell phone records suggests that risky driving increases when people are on cell phones, especially teens. The 15- to 20-year-old age group accounts for 6 percent of all drivers but 10 percent of traffic deaths and 14 percent of police-reported crashes with injuries. For the new study, researchers at the Virginia Tech Transportation Institute installed video cameras, global positioning systems, lane trackers, gadgets to measure speed and acceleration, and other sensors in the cars of 42 newly licensed drivers 16 or 17 years old, and 109 adults with an average of 20 years behind the wheel. © 2014 Hearst Communications, Inc.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19091 - Posted: 01.04.2014

Tomas Jivanda Being pulled into the world of a gripping novel can trigger actual, measurable changes in the brain that linger for at least five days after reading, scientists have said. The new research, carried out at Emory University in the US, found that reading a good book may cause heightened connectivity in the brain and neurological changes that persist in a similar way to muscle memory. The changes were registered in the left temporal cortex, an area of the brain associated with receptivity for language, as well as the primary sensory motor region of the brain. Neurons of this region have been associated with tricking the mind into thinking it is doing something it is not, a phenomenon known as grounded cognition - for example, just thinking about running can activate the neurons associated with the physical act of running. “The neural changes that we found associated with physical sensation and movement systems suggest that reading a novel can transport you into the body of the protagonist,” said neuroscientist Professor Gregory Berns, lead author of the study. “We already knew that good stories can put you in someone else’s shoes in a figurative sense. Now we’re seeing that something may also be happening biologically.” Twenty-one students took part in the study, with all participants reading the same book - Pompeii, a 2003 thriller by Robert Harris, which was chosen for its page-turning plot. “The story follows a protagonist, who is outside the city of Pompeii and notices steam and strange things happening around the volcano,” said Prof Berns. “It depicts true events in a fictional and dramatic way. It was important to us that the book had a strong narrative line.” © independent.co.uk

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 19080 - Posted: 12.31.2013

Oliver Burkeman As we stumble again into the season of overindulgence – that sacred time of year when wine, carbs and sofas replace brisk walks for all but the most virtuous – a headline in the (excellent) new online science magazine Nautilus catches my eye: "What If Obesity Is Nobody's Fault?" The article describes new research on mice: a genetic alteration, it appears, can make them obese, despite eating no more than others. "Many of us unfortunately have had an attitude towards obese people [as] having a lack of willpower or self-control," one Harvard researcher is quoted as saying. "It's clearly something beyond that." No doubt. But that headline embodies an assumption that's rarely questioned. Suppose, hypothetically, obesity were solely a matter of willpower: laying off the crisps, exercising and generally bucking your ideas up. What makes us so certain that obesity would be the fault of the obese even then? This sounds like the worst kind of bleeding-heart liberalism, a condition from which I probably suffer (I blame my genes). But it's a real philosophical puzzle, with implications reaching far beyond obesity to laziness in all contexts, from politicians' obsession with "hardworking families" to the way people beat themselves up for not following through on their plans. We don't blame people for most physical limitations (if you broke your leg, it wouldn't be a moral failing to cancel your skydiving trip), nor for many other impediments: it's hardly your fault if you're born into educational or economic disadvantage. Yet almost everyone treats laziness and weakness of will as exceptions. If you can't be bothered to try, you've only yourself to blame. It's a rule some apply most harshly to themselves, mounting epic campaigns of self-chastisement for procrastinating, failing to exercise and so on. © 2013 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 19034 - Posted: 12.14.2013

By Emilie Reas Did you make it to work on time this morning? Go ahead and thank the traffic gods, but also take a moment to thank your brain. The brain’s impressively accurate internal clock allows us to detect the passage of time, a skill essential for many critical daily functions. Without the ability to track elapsed time, our morning shower could continue indefinitely. Without that nagging feeling to remind us we’ve been driving too long, we might easily miss our exit. But how does the brain generate this finely tuned mental clock? Neuroscientists believe that we have distinct neural systems for processing different types of time, for example, to maintain a circadian rhythm, to control the timing of fine body movements, and for conscious awareness of time passage. Until recently, most neuroscientists believed that this latter type of temporal processing – the kind that alerts you when you’ve lingered over breakfast for too long – is supported by a single brain system. However, emerging research indicates that the model of a single neural clock might be too simplistic. A new study, recently published in the Journal of Neuroscience by neuroscientists at the University of California, Irvine, reveals that the brain may in fact have a second method for sensing elapsed time. What’s more, the authors propose that this second internal clock not only works in parallel with our primary neural clock, but may even compete with it. Past research suggested that a brain region called the striatum lies at the heart of our central inner clock, working with the brain’s surrounding cortex to integrate temporal information. For example, the striatum becomes active when people pay attention to how much time has passed, and individuals with Parkinson’s Disease, a neurodegenerative disorder that disrupts input to the striatum, have trouble telling time. © 2013 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 10: Biological Rhythms and Sleep
Link ID: 18978 - Posted: 11.27.2013

Ed Yong A large international group set up to test the reliability of psychology experiments has successfully reproduced the results of 10 out of 13 past experiments. The consortium also found that two effects could not be reproduced. Psychology has been buffeted in recent years by mounting concern over the reliability of its results, after repeated failures to replicate classic studies. A failure to replicate could mean that the original study was flawed, the new experiment was poorly done or the effect under scrutiny varies between settings or groups of people. To tackle this 'replicability crisis', 36 research groups formed the Many Labs Replication Project to repeat 13 psychological studies. The consortium combined tests from earlier experiments into a single questionnaire — meant to take 15 minutes to complete — and delivered it to 6,344 volunteers from 12 countries. The team chose a mix of effects that represent the diversity of psychological science, from classic experiments that have been repeatedly replicated to contemporary ones that have not. Ten of the effects were consistently replicated across different samples. These included classic results from economics Nobel laureate and psychologist Daniel Kahneman at Princeton University in New Jersey, such as gain-versus-loss framing, in which people are more prepared to take risks to avoid losses, rather than make gains [1]; and anchoring, an effect in which the first piece of information a person receives can introduce bias to later decisions [2]. The team even showed that anchoring is substantially more powerful than Kahneman’s original study suggested. © 2013 Nature Publishing Group

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 18974 - Posted: 11.26.2013

by Anil Ananthaswamy Can you tickle yourself if you are fooled into thinking that someone else is tickling you? A new experiment says no, challenging a widely accepted theory about how our brains work. It is well known that we can't tickle ourselves. In 2000, Sarah-Jayne Blakemore of University College London (UCL) and colleagues came up with a possible explanation. When we intend to move, the brain sends commands to the muscles, but also predicts the sensory consequences of the impending movement. When the prediction matches the actual sensations that arise, the brain dampens down its response to those sensations. This prevents us from tickling ourselves (NeuroReport, DOI: 10.1097/00001756-200008030-00002). Jakob Hohwy of Monash University in Clayton, Australia, and colleagues decided to do a tickle test while simultaneously subjecting people to a body swap illusion. In this illusion, the volunteer and experimenter sat facing each other. The subject wore goggles that displayed the feed from a head-mounted camera. In some cases the camera was mounted on the subject's head, so that they saw things from their own perspective, while in others it was mounted on the experimenter's head, providing the subject with the experimenter's perspective. Using their right hands, both the subject and the experimenter held on to opposite ends of a wooden rod, which had a piece of foam attached to each end. The subject and experimenter placed their left palms against the foam at their end. Next, the subject and the experimenter took turns moving the rod with their right hands, causing the piece of foam to tickle both of their left palms. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 18954 - Posted: 11.21.2013

By Daisy Grewal How good are you at multi-tasking? The way you answer that question may tell you more than you think. According to recent research, the better people think they are at multitasking, the worse they actually are at it. And the more that you think you are good at it, the more likely you are to multi-task when driving. Maybe the problem of distracted driving has less to do with the widespread use of smartphones and more to do with our inability to recognize our own limits. A study by David Sanbonmatsu and his colleagues looked at the relationship between people’s beliefs about their own multi-tasking ability and their likelihood of using a cell phone when driving. Importantly, the study also measured people’s actual multi-tasking abilities. The researchers found that people who thought they were good at multi-tasking were actually the worst at it. They were also the most likely to report frequently using their cell phones when driving. This may help explain why warning people about the dangers of cell phone use when driving hasn’t done much to curb the behavior. The study is another reminder that we are surprisingly poor judges of our own abilities. Research has found that people overestimate their own qualities in a number of areas including intelligence, physical health, and popularity. Furthermore, the worse we are at something, the more likely we may be to judge ourselves as competent at it. Psychologists David Dunning and Justin Kruger have studied how incompetent people, ironically, are often unable to accurately judge their own incompetence. In one study, they found that people who scored the lowest on tests of grammar and logic were the most likely to overestimate their own abilities. The reverse was also true: the most competent people were the most likely to underestimate their abilities. And multi-tasking may be yet another area where incompetence breeds over-confidence. © 2013 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 18880 - Posted: 11.06.2013

Why do some people feel as though one of their body parts is not truly part of them and go to crazy lengths to get rid of it? Paul D. McGeoch answers: Certain people hold a deep desire to amputate a healthy limb. They are not psychotic, and they fully realize that what they want is abnormal. Nevertheless, they have felt from childhood that the presence of a specific limb, usually a leg, somehow makes their body “overcomplete.” Ultimately, many will achieve their desired amputation through self-inflicted damage or surgery. During the past few years my work with neuroscientists Vilayanur S. Ramachandran of U.C.S.D. and David Brang of Northwestern University, along with research by neuroscientist Peter Brugger of University Hospital Zurich in Switzerland, has transformed our understanding of this condition. Our findings suggest that a dysfunction of specific brain areas on the right side of the brain, which are involved in generating our body image, may explain the desire. Bizarre disorders of body image have long been known to arise after a stroke or other incident inflicts damage to the right side of the brain, particularly in the parietal lobe. The right posterior parietal cortex seems to combine several incoming streams of information—touch, joint position sense, vision and balance—to form a dynamic body image that changes as we interact with the world around us. In brain scans, we have found this exact part of the right parietal lobe to activate abnormally in individuals desiring limb removal. Because the primary sensory areas of the brain still function normally, sufferers are able to see and feel the limb in question. Yet they do not experience it as part of their body because the right posterior parietal lobe fails to adequately represent it. The mismatch between a person's actual physical body and his or her body image seems to cause ongoing arousal in the sympathetic nervous system, which may intensify the desire to remove the limb. Given that sufferers date these feelings to childhood, the right parietal dysfunction most likely is congenital or arises in early development. © 2013 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 18869 - Posted: 11.04.2013

Katherine Harmon Courage An infant's innate sense for numbers predicts how their mathematical aptitude will develop years later, a team of US researchers has found. Babies can spot if a set of objects increases or decreases in number — for instance, if the number of dots on a screen grows, even when dot size, colour and arrangement also change. But until recently, researchers could generally only determine the number sense of groups of babies, thus ruling out the ability to correlate this with later mathematics skills in individuals. In 2010, Elizabeth Brannon, a neuroscientist at Duke University in Durham, North Carolina, and her colleagues demonstrated that they could test and track infants' number sense over time [1]. To do this, six-month-old babies are presented with two screens. One shows a constant number of dots, such as eight, changing in appearance, and the other also shows changing dots but presents different numbers of them — eight sometimes and 16 other times, for instance. An infant who has a good primitive number sense will spend more time gazing at the screen that presents the changing number of dots. In the latest work, which is published in this week's Proceedings of the National Academy of Sciences [2], Brannon's team took a group of 48 children who had been tested at six months of age and retested them three years later, using the same dot test but also other standard maths tests for preschoolers — including some that assessed the ability to count, to tell which of two numbers is larger and to do basic calculations. © 2013 Nature Publishing Group

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 14: Attention and Consciousness
Link ID: 18822 - Posted: 10.22.2013

by Helen Thomson ONE moment you are alive. The next you are dead. A few hours later and you are alive again. Pharmacologists have discovered a mechanism that triggers Cotard's syndrome – the mysterious condition that leaves people feeling like they, or parts of their body, no longer exist. With the ability to switch the so-called walking corpse syndrome on and off comes the prospect of new insights into how conscious experiences are constructed. Acyclovir – also known by the brand name Zovirax – is a common drug used to treat cold sores and other herpes infections. It usually has no harmful side effects. However, about 1 per cent of people who take the drug orally or intravenously experience some psychiatric side effects, including Cotard's. These occur mainly in people who have renal failure. To investigate the potential link between acyclovir and Cotard's, Anders Helldén at Karolinska University Hospital in Stockholm and Thomas Lindén at the Sahlgrenska Academy in Gothenburg pooled data from Swedish drug databases along with hospital admissions. They identified eight people with acyclovir-induced Cotard's. One woman with renal failure began using acyclovir to treat shingles. She ran into a hospital screaming, says Helldén. After an hour of dialysis, she started to talk: she said the reason she was so anxious was that she had a strong feeling she was dead. After a few more hours of dialysis she said, "I'm not quite sure whether I'm dead any more but I'm still feeling very strange." Four hours later: "I'm pretty sure I'm not dead any more but my left arm is definitely not mine." Within 24 hours, the symptoms had disappeared. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 18805 - Posted: 10.17.2013

by Colin Barras A part of all of us loves sums. Eavesdropping on the brain while people go about their daily activity has revealed the first brain cells specialised for numbers. Josef Parvizi and his colleagues at Stanford University in California enlisted the help of three people with epilepsy whose therapy involved placing a grid of activity-recording electrodes on the surface of their brain. Neurons fired in a region called the intraparietal sulcus when the three volunteers performed arithmetic tests, suggesting that these neurons deal with numbers. The team continued to monitor brain activity while the volunteers went about their normal activity in hospital. Comparing video footage of their stay with their brain activity revealed that the neurons remained virtually silent for most of the time, bursting into life only when the volunteers talked about numbers or numerical concepts such as "more than" or "less than". There is debate over whether some neural populations perform many functions or are involved in very precise tasks. "We show here that there is specialisation for numeracy," says Parvizi. Journal reference: Nature Communications, DOI: 10.1038/ncomms3528 © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 18796 - Posted: 10.16.2013

Many people, I've heard talk, wonder what's going on inside Republican Speaker John Boehner's brain. For cognitive neuroscientists, Boehner's brain is a case study. At the same time, others are frustrated with Democrat Harry Reid. The Senate Majority Leader needs to take a tip from our founding fathers. Many of the intellectual giants who founded our democracy were both statesmen and scientists, and they applied the latest in scientific knowledge of their day to advantage in governing. The acoustics of the House of Representatives, now Statuary Hall, allowed John Quincy Adams and his comrades to eavesdrop on other members of congress conversing in whispers on the opposite side of the parabolic-shaped room. Senator Reid, in stark contrast, is still applying ancient techniques used when senators wore togas -- reason and argument -- and we all know how badly that turned out. The search for a path to compromise can be found in the latest research on the neurobiological basis of social behavior. Consider this new finding just published in the journal Brain Research. Oxytocin, a peptide produced in the hypothalamus of the brain and known to cement the strong bond between mother and child at birth, has been found to promote compromise in rivaling groups! This new research suggests that Congresswoman Nancy Pelosi could single-handedly end the Washington deadlock by spritzing a bit of oxytocin in her perfume and wafting it throughout the halls of congress. One can only imagine the loving effect this hormone would have on Senate Republican Ted Cruz, suddenly overwhelmed with an irresistible urge to bond with his colleagues, fawning for a cozy embrace like a babe cuddling in its mother's arms. And it is so simple! No stealthy spiking of the opponent's coffee (or third martini at lunch) would be required; oxytocin works when it is inhaled through the nasal passages as an odorless vapor. © 2013 TheHuffingtonPost.com, Inc.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 1: An Introduction to Brain and Behavior
Link ID: 18761 - Posted: 10.08.2013

By Bruce Bower Cartoon ghosts have scared up evidence that the ability to visualize objects in one’s mind materializes between ages 3 and 5. When asked to pick which of two mirror-image ghost cutouts or drawings fit in a ghost-shaped hole, few 3-year-olds, a substantial minority of 4-year-olds and most 5-year-olds regularly succeeded, say psychologist Andrea Frick of the University of Bern, Switzerland, and her colleagues. Girls performed as well as boys on the task, suggesting that men’s much-studied advantage over women in mental rotation doesn’t emerge until after age 5, the researchers report Sept. 17 in Cognitive Development. Mental rotation is a spatial skill regarded as essential for science and math achievement. Most tasks that researchers use to assess mental rotation skills involve pressing keys to indicate whether block patterns oriented at different angles are the same or different. That challenge overwhelms most preschoolers. Babies apparently distinguish block patterns from mirror images of those patterns (SN: 12/20/08, p. 8), but it’s unclear whether that ability enables mental rotation later in life. Frick’s team studied 20 children at each of three ages, with equal numbers of girls and boys. Youngsters saw two ghosts cut out of foam, each a mirror image of the other. Kids were asked to turn the ghosts in their heads and choose the one that would fit like a puzzle piece into a ghost’s outline on a board. Over seven trials, the ghosts were tilted at angles varying from the position of the outline. The researchers used three pairs of ghost cutouts, for a total of 21 trials. © Society for Science & the Public 2000 - 2013

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 8: Hormones and Sex
Link ID: 18706 - Posted: 09.26.2013

By Neuroskeptic Neuroscientists are interested in how brains interact socially. One of the main topics of study is ‘mentalizing’, aka ‘theory of mind’, the ability to accurately attribute mental states – such as beliefs and emotions – to other people. It is widely believed that the brain has specific areas for this – i.e. social “modules” (although today most neuroscientists are shy about using that word, it’s basically what’s at issue). But two new papers out this week suggest that people can still mentalize successfully after damage to “key parts of the theory of mind network”. Herbet et al., writing in Cortex, showed few effects of surgical removal of the right frontal lobe in 10 brain tumour patients. On two different mentalizing tasks, they showed that removal caused either no decline in performance, or only a transient one. Meanwhile, Michel et al. report that the left temporal pole is dispensable for mentalizing as well, in a single case report in the Journal of Cognitive Neuroscience. They describe a patient suffering from frontotemporal dementia (FTD), whose left temporal lobe was severely atrophied. He’d lost the use of language, but he performed quite normally on theory of mind tests adapted to be non-linguistic. In both papers, these patients don’t have those parts of the brain that are most activated in fMRI studies of mentalizing. Where the blobs on the brain normally go, they have no brain.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 18687 - Posted: 09.23.2013

By Melissa Hogenboom Science reporter, BBC News Smaller animals tend to perceive time in slow motion, a new study has shown. This means that they can observe movement on a finer timescale than bigger creatures, allowing them to escape from larger predators. Insects and small birds, for example, can see more information in one second than a larger animal such as an elephant. The work is published in the journal Animal Behaviour. "The ability to perceive time on very small scales may be the difference between life and death for fast-moving organisms such as predators and their prey," said lead author Kevin Healy, at Trinity College Dublin (TCD), Ireland. The reverse was found in bigger animals, which may miss things that smaller creatures can rapidly spot. In humans, too, there is variation among individuals. Athletes, for example, can often process visual information more quickly. An experienced goalkeeper would therefore be quicker than others in observing where a ball comes from. The speed at which humans absorb visual information is also age-related, said Andrew Jackson, a co-author of the work at TCD. "Younger people can react more quickly than older people, and this ability falls off further with increasing age." The team looked at the variation in time perception across a variety of animals. They gathered datasets from other teams who had used a technique called critical flicker fusion frequency, which measures the speed at which the eye can process light. BBC © 2013

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 18651 - Posted: 09.16.2013