Links for Keyword: Attention



Links 1 - 20 of 485

James M. Broadway “Where did the time go?” middle-aged and older adults often remark. Many of us feel that time passes more quickly as we age, a perception that can lead to regrets. According to psychologist and BBC columnist Claudia Hammond, “the sensation that time speeds up as you get older is one of the biggest mysteries of the experience of time.” Fortunately, our attempts to unravel this mystery have yielded some intriguing findings. In 2005, for instance, psychologists Marc Wittmann and Sandra Lenhoff, both then at Ludwig Maximilian University of Munich, surveyed 499 participants, ranging in age from 14 to 94 years, about the pace at which they felt time moving—from “very slowly” to “very fast.” For shorter durations—a week, a month, even a year—the subjects' sense that time was speeding up did not appear to increase with age. Most participants felt that the clock ticked by quickly. But for longer durations, such as a decade, a pattern emerged: older people tended to perceive time as moving faster. When asked to reflect on their lives, the participants older than 40 felt that time elapsed slowly in their childhood but then accelerated steadily through their teenage years into early adulthood. There are good reasons why older people may feel that way. When it comes to how we perceive time, humans can estimate the length of an event from two very different perspectives: a prospective vantage, while an event is still occurring, or a retrospective one, after it has ended. In addition, our experience of time varies with whatever we are doing and how we feel about it. In fact, time does fly when we are having fun. Engaging in a novel exploit makes time appear to pass more quickly in the moment. But if we remember that activity later on, it will seem to have lasted longer than more mundane experiences. © 2016 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 22447 - Posted: 07.16.2016

By SUNITA SAH A POPULAR remedy for a conflict of interest is disclosure — informing the buyer (or the patient, etc.) of the potential bias of the seller (or the doctor, etc.). Disclosure is supposed to act as a warning, alerting consumers to their adviser’s stake in the matter so they can process the advice accordingly. But as several recent studies I conducted show, there is an underappreciated problem with disclosure: It often has the opposite of its intended effect, not only increasing bias in advisers but also making advisees more likely to follow biased advice. When I worked as a physician, I witnessed how bias could arise from numerous sources: gifts or sponsorships from the pharmaceutical industry; compensation for performing particular procedures; viewing our own specialties as delivering more effective treatments than others’ specialties. Although most physicians, myself included, tend to believe that we are invulnerable to bias, thus making disclosures unnecessary, regulators insist on them, assuming that they work effectively. To some extent, they do work. Disclosing a conflict of interest — for example, a financial adviser’s commission or a physician’s referral fee for enrolling patients into clinical trials — often reduces trust in the advice. But my research has found that people are still more likely to follow this advice because the disclosure creates increased pressure to follow the adviser’s recommendation. It turns out that people don’t want to signal distrust to their adviser or insinuate that the adviser is biased, and they also feel pressure to help satisfy their adviser’s self-interest. Instead of functioning as a warning, disclosure can become a burden on advisees, increasing pressure to take advice they now trust less. © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 22416 - Posted: 07.09.2016

By Emily Rosenzweig Life deals most of us a consistent stream of ego blows, be they failures at work, social slights, or unrequited love. Social psychology has provided decades of insight into just how adept we are at defending ourselves against these psychic threats. We discount negative feedback, compare ourselves favorably to those who are worse off than us, attribute our failures to others, place undue value on our own strengths, and devalue opportunities denied to us–all in service of protecting and restoring our sense of self-worth. As a group, this array of motivated mental processes that support mood repair and ego defense has been called the “psychological immune system.” Particularly striking to social psychologists is our ability to remain blind to our use of these motivated strategies, even when it is apparent to others just how biased we are. However, there are times when we either cannot remain blind to our own psychological immune processes or find ourselves consciously wanting to use them expressly for the purpose of restoring our ego or our mood. What then? Can we believe a conclusion we reach even when we know that we arrived at it in a biased way? For example, imagine you’ve recently gone through a breakup and want to get over your ex. You decide to make a mental list of all of their character flaws in an effort to feel better about the relationship ending. A number of prominent social psychologists have suggested you’re out of luck—knowing that you’re focusing only on your ex’s worst qualities prevents you from believing the conclusion you’ve come to: that you’re better off without him or her. In essence, they argue that we must remain blind to our own biased mental processes in order to reap their ego-restoring benefits. And in many ways this closely echoes the position that philosophers like Mele have taken about the possibility of agentic self-deception. © 2016 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 22404 - Posted: 07.07.2016

Mo Costandi There’s much more to visual perception than meets the eye. What we see is not merely a matter of patterns of light falling on the retina, but rather is heavily influenced by so-called ‘top-down’ brain mechanisms, which can alter the visual information, and other types of sensory information, that enters the brain before it even reaches our conscious awareness. A striking example of this is a phenomenon called inattentional blindness, whereby narrowly focusing our attention on one visual stimulus makes us oblivious to other stimuli, even though they otherwise may be glaringly obvious, as demonstrated by the infamous ‘Invisible Gorilla’ study. Now researchers say they have discovered another extreme form of blindness, in which people fail to notice an unexpected image, even when it is shown by itself and staring them in the face. Marjan Persuh and Robert Melara of the City University of New York designed two experiments to investigate whether people’s prior expectations could block their awareness of meaningful and important visual stimuli. In the first, they recruited 20 student volunteers and asked them to perform a visual discrimination task. They were shown a series of images, consisting of successive pairs of faces, each of which was presented for half a second on a computer screen, and asked to indicate whether each pair showed faces of people of the same or different sex. Towards the end of each session, the participants were presented with a simple shape, which flashed onto the screen for one tenth of a second. They were then asked if they had seen anything new and, after replying, were told that a shape had indeed appeared, and asked to select the correct one from a display of four. This shape recognition task was then repeated in one final control trial. © 2016 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 22394 - Posted: 07.04.2016

By MOSHE BAR A FRIEND of mine has a bad habit of narrating his experiences as they are taking place. I tease him for being a bystander in his own life. To be fair, we all fail to experience life to the fullest. Typically, our minds are too occupied with thoughts to allow complete immersion even in what is right in front of us. Sometimes, this is O.K. I am happy not to remember passing a long stretch of my daily commute because my mind has wandered and my morning drive can be done on autopilot. But I do not want to disappear from too much of life. Too often we eat meals without tasting them, look at something beautiful without seeing it. An entire exchange with my daughter (please forgive me) can take place without my being there at all. Recently, I discovered how much we overlook, not just about the world, but also about the full potential of our inner life, when our mind is cluttered. In a study published in this month’s Psychological Science, the graduate student Shira Baror and I demonstrate that the capacity for original and creative thinking is markedly stymied by stray thoughts, obsessive ruminations and other forms of “mental load.” Many psychologists assume that the mind, left to its own devices, is inclined to follow a well-worn path of familiar associations. But our findings suggest that innovative thinking, not routine ideation, is our default cognitive mode when our minds are clear. In a series of experiments, we gave participants a free-association task while simultaneously taxing their mental capacity to different degrees. In one experiment, for example, we asked half the participants to keep in mind a string of seven digits, and the other half to remember just two digits. While the participants maintained these strings in working memory, they were given a word (e.g., shoe) and asked to respond as quickly as possible with the first word that came to mind (e.g., sock). © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 22360 - Posted: 06.25.2016

By Nancy Szokan Let’s begin by defining something psychologists call “ego depletion.” This is the idea that all of us have only a certain amount of self-control, and if we use up too much in one part of our lives, we will have less to use in others. An early example came from a 1998 study in which participants were tempted with a chocolate treat before being given a difficult puzzle: Those who resisted the temptation seemed to have used up some of their willpower, because they gave up on the puzzle faster than the treat eaters. There have been many subsequent studies about ego depletion, including its apparent effects on physical performance: In 2012, athletes who were given a difficult mental task before a physical challenge exhibited less determination to do well on the sports test than those who hadn’t done the puzzle. But recently a replication study (in which researchers repeat a published experiment to see if they come up with the same results) tested more than 2,000 participants at 24 labs and found the ego depletion effect to be very small or nonexistent. Which, as Lea Winerman reports, has led such psychologists as Michael Inzlicht of the University of Toronto to a crisis of confidence. Maybe, he thinks, ego depletion and the other social psychological effects he has made a career of studying are “proven” by unreliable research. “I used to think there were errors, but that the errors were minor and it was fine,” Winerman quotes Inzlicht as saying in the June issue of Monitor on Psychology, a publication of the American Psychological Association. “But as I started surveying the field, I started thinking we’ve been making some major mistakes.”

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 22337 - Posted: 06.20.2016

Alva Noë Sometimes the mind wanders. Thoughts pop into consciousness. Ideas or images are present when just a moment before they were not. Scientists recently have been turning their attention to making sense of this. One natural picture of the phenomenon goes something like this. Typically, our thoughts and feelings are shaped by what we are doing, by what there is around us. The world captures our attention and compels our minds this way or that. What explains the fact that you think of a red car when there is a red car in front of you is, well, the red car. And similarly, it is that loud noise that causes you to orient yourself to the commotion that is producing it. In such cases, we might say, the mind is coupled to the world around it and the world, in a way, plays us the way a person might play a piano. But sometimes, even without going to sleep, we turn away from the world. We turn inward. We are contemplative or detached. We decouple ourselves from the environment and we are set free, as it were, to let our minds play themselves. This natural picture has gained some support from the discovery of the so-called Default Mode Network. The DMN is a network of neural systems whose activation seems to be suppressed by active engagement with the world around us; DMN, in contrast, is activated (or rather, it tends to return to baseline levels of activity) precisely when we detach ourselves from what's going on around us. The DMN is the brain running in neutral. One of the leading hypotheses to explain mind-wandering and the emergence of spontaneous thoughts is that this is the result of the operation of the brain's Default Mode Network. (See this for a review of this literature.) © 2016 npr

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 22331 - Posted: 06.18.2016

By Rachel Feltman Archerfish are already stars of the animal kingdom for their stunning spit-takes. They shoot high-powered water jets from their mouths to stun prey, making them one of just a few fish species known to use tools. But by training Toxotes chatareus to direct those jets of spit at certain individuals, scientists have shown that the little guys have another impressive skill: They seem to be able to distinguish one human face from another, something never before witnessed in fish and spotted just a few times in non-human animals. The results, published Tuesday in the Nature journal Scientific Reports, could help us understand how humans got so good at telling each other apart. Or how most people got to be good at that, anyway. I'm terrible at it. It's generally accepted that the fusiform gyrus, a brain structure located in the neocortex, allows humans to tell one another apart with a speed and accuracy that other species can't manage. But there's some debate over whether human faces are so innately complex — and that distinguishing them is so much more difficult than other tricks of memory or pattern recognition — that this region of the brain is a necessary facilitator of the skill, one that evolved especially for it. Birds, which have been shown to distinguish humans from one another, lack the same structure. But some researchers still think that facial recognition might be something that humans learn — it's not an innate skill — and that the fusiform gyrus is just the spot where we happen to process all the necessary information.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 22299 - Posted: 06.08.2016

By Clare Wilson We’ve all been there: after a tough mental slog your brain feels as knackered as your body does after a hard workout. Now we may have pinpointed one of the brain regions worn out by a mentally taxing day – and it seems to also affect our willpower, so perhaps we should avoid making important decisions when mentally fatigued. Several previous studies have suggested that our willpower is a finite resource, and if it gets depleted in one way – like finishing a difficult task – we find it harder to make other good choices, like resisting a slice of cake. In a small trial, Bastien Blain at INSERM in Paris and his colleagues asked volunteers to spend six hours doing tricky memory tasks, while periodically choosing either a small sum of cash now, or a larger amount after a delay. As the day progressed, people became more likely to act on impulse and to pick an immediate reward. This didn’t happen in the groups that spent time doing easier memory tasks, reading or gaming. For those engaged in difficult work, fMRI brain scans showed a decrease in activity in the middle frontal gyrus, a brain area involved in decision-making. “That suggests this region is becoming less excitable, which could be impairing people’s ability to resist temptation,” says Blain. It’s involved in decisions like ‘Shall I have a beer with my friends tonight, or shall I save money to buy a bike next month,’ he says. Previous research has shown that children with more willpower in a similar type of choice test involving marshmallows end up as more successful adults, by some measures. “Better impulse control predicts your eventual wealth and health,” says Blain. The idea that willpower can be depleted is contentious, as some researchers have failed to replicate others’ findings. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 22292 - Posted: 06.07.2016

By Daniel Barron No matter where we call home, where we were raised, or what we ate for breakfast, our brains process information pretty much the same as anyone else’s in the world. Which makes sense—our genomes are 99.6-99.9% identical, which makes our brains nearly so. Look at a landscape or cityscape and comparable computations occur in your brain as in someone from another background or country. Consider my recent walk through China’s Zhangjiajie National Forest Park, an inspiration for James Cameron’s Avatar. Some of our first steps into the park involved a 1,070-foot ascent in the Bailong elevator, the world’s tallest outdoor elevator. Crammed within the carriage were travelers from Japan, India, China, the U.S.A., and Korea. No matter our origin, the Wulingyuan landscape didn’t disappoint: the towering red and green rock formations stretched towards the sky as they defied gravity. Gasps and awes were our linguistic currency while our visual cortices gleefully fired away. The approximately 3,000 quartzite sandstone pillars, with their unusual red and green contrasts, mesmerized our visual centers, demanding our attention. One of the brain’s earliest visual processing centers, V1, lies at the middle of the back of our head. V1 identifies simple forms like vertical, horizontal, and diagonal edges of contrasting intensities, or lines. Look at a vertical line, and neurons that are sensitive to vertical lines will fire more quickly; look at a horizontal line, and our horizontal neurons buzz away. © 2016 Scientific American
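To make that orientation selectivity concrete, here is a minimal sketch in Python (an illustration of the standard oriented-filter model of V1 simple cells, not code from the article): taking the dot product of an image patch with a vertical versus a horizontal kernel gives a large response only for the matching orientation, loosely like a neuron firing most for its preferred edge.

    import numpy as np

    # Oriented "receptive fields": a kernel that responds to vertical edges,
    # and its transpose, which responds to horizontal edges.
    vertical_kernel = np.array([[-1, 0, 1],
                                [-1, 0, 1],
                                [-1, 0, 1]], dtype=float)
    horizontal_kernel = vertical_kernel.T

    def filter_response(patch, kernel):
        """Sum of element-wise products: one 'neuron' responding to a 3x3 patch."""
        return float(np.sum(patch * kernel))

    # A patch containing a vertical edge: dark on the left, bright on the right.
    vertical_edge = np.array([[0, 0, 1],
                              [0, 0, 1],
                              [0, 0, 1]], dtype=float)

    print(filter_response(vertical_edge, vertical_kernel))    # strong response (3.0)
    print(filter_response(vertical_edge, horizontal_kernel))  # no response (0.0)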

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 22204 - Posted: 05.11.2016

By Sarah Kaplan Scientists have known for a while that stereotypes warp our perceptions of things. Implicit biases — those unconscious assumptions that worm their way into our brains, without our full awareness and sometimes against our better judgment — can influence grading choices from teachers, split-second decisions by police officers and outcomes in online dating. We can't even see the world without filtering it through the lens of our assumptions, scientists say. In a study published Monday in the journal Nature Neuroscience, psychologists report that the neurons that respond to things such as sex, race and emotion are linked by stereotypes, distorting the way we perceive people's faces before that visual information even reaches our conscious brains. "The moment we actually glimpse another person ... [stereotypes] are biasing that processing in a way that conforms to our already existing expectations," said Jonathan Freeman, a psychology professor at New York University and one of the authors of the report. Responsibility lies in two far-flung regions of the brain: the orbital frontal cortex, which rests just above the eyes and is responsible for rapid visual predictions and categorizations, and the fusiform cortex, which sits in the back of the brain and is involved in recognizing faces. When Freeman and his co-author, Ryan Stolier, had 43 participants look at images of faces in a brain scanner, they noticed that neurons seemed to be firing in similar patterns in both parts of the brain, suggesting that information from each part was influencing the other.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 22172 - Posted: 05.03.2016

By Stephen L. Macknik, Susana Martinez-Conde The renowned Slydini holds up an empty box for all to see. It is not really a box—just four connected cloth-covered cardboard walls, forming a floppy parallelogram with no bottom or top. Yet when the magician sets it down on a table, it looks like an ordinary container. Now he begins to roll large yellow sheets of tissue paper into balls. He claps his hands—SMACK!—as he crumples each new ball in a fist and then straightens his arm, wordlessly compelling the audience to gaze after his closed hand. He opens it, and ... the ball is still there. Nothing happened. Huh. Slydini's hand closes once more around the tissue, and it starts snaking around, slowly and gracefully, like a belly dancer's. The performance is mesmerizing. With his free hand, he grabs an imaginary pinch of pixie dust from the box to sprinkle on top of the other hand. This time he opens his hand to reveal that the tissue is gone! Four balls disappear in this fashion. Then, for the finale, Slydini tips the box forward and shows the impossible: all four balls have mysteriously reappeared inside. Slydini famously performed this act on The Dick Cavett Show in 1978. It was one of his iconic tricks. Despite the prestidigitator's incredible showmanship, though, the sleight only works because your brain cannot multitask. © 2016 Scientific American,

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 22114 - Posted: 04.19.2016

Noah Smith How do human beings behave in response to risk? That is one of the most fundamental unanswered questions of our time. A general theory of decision-making amid uncertainty would be the kind of scientific advance that comes only a few times a century. Risk is central to financial and insurance markets. It affects the consumption, saving and business investment that moves the global economy. Understanding human behavior in the face of risk would let us reduce accidents, retire more comfortably, get cheaper health insurance and maybe even avoid recessions. A number of our smartest scientists have tried to develop a general theory of risk behavior. John von Neumann, the pioneering mathematician and physicist, took a crack at it back in 1944, when he developed the theory of expected utility along with Oskar Morgenstern. According to this simple theory, people value a risky prospect by multiplying the probability that each outcome happens by how much they would value that outcome if it did, then adding up the results. This beautiful idea underlies much of modern economic theory, but unfortunately it doesn't work well in most situations. Alternative theories have been developed for specific applications. The psychologist Daniel Kahneman won a Nobel Prize for the creation of prospect theory, which says -- among other things -- that people measure outcomes relative to a reference point. That theory does a great job of explaining the behavior of subjects in certain lab experiments, and can help account for the actions of certain inexperienced consumers. But it is very difficult to apply generally, because the reference points are hard to predict in advance and may shift in unpredictable ways.
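For reference, the standard textbook forms of these two ideas (the article itself does not spell out the formulas) can be written compactly; the reference point r in the second line is exactly the quantity the article says is hard to predict in advance.

    % Expected utility (von Neumann & Morgenstern): weight each outcome's
    % utility by its probability and sum over outcomes.
    \mathrm{EU} = \sum_i p_i \, u(x_i)

    % Prospect theory (Kahneman & Tversky): outcomes are valued relative to a
    % reference point r, and probabilities pass through a weighting function w.
    V = \sum_i w(p_i) \, v(x_i - r)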

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 22058 - Posted: 04.01.2016

By PAM BELLUCK When people make risky decisions, like doubling down in blackjack or investing in volatile stocks, what happens in the brain? Scientists have long tried to understand what makes some people risk-averse and others risk-taking. Answers could have implications for how to treat, curb or prevent destructively risky behavior, like pathological gambling or drug addiction. Now, a study by Dr. Karl Deisseroth, a prominent Stanford neuroscientist and psychiatrist, and his colleagues gives some clues. The study, published Wednesday in the journal Nature, reports that a specific type of neuron, or nerve cell, in a certain brain region helps galvanize whether or not a risky choice is made. The study was conducted in rats, but experts said it built on research suggesting the findings could be similar in humans. If so, they said, it could inform approaches to addiction, which involves some of the same neurons and brain areas, as well as treatments for Parkinson’s disease, because one class of Parkinson’s medications turns some patients into problem gamblers. In a series of experiments led by Kelly Zalocusky, a doctoral student, researchers found that a risk-averse rat made decisions based on whether its previous choice involved a loss (in this case, of food). Rats whose previous decision netted them less food were prompted to behave conservatively next time by signals from certain receptors in a brain region called the nucleus accumbens, the scientists discovered. These receptors, which are proteins attached to neurons, are part of the dopamine system; dopamine is a neurochemical important to emotion, movement and thinking. In risk-taking rats, however, those receptors sent a much fainter signal, so the rats kept making high-stakes choices even if they lost out. But by employing optogenetics, a technique that uses light to manipulate neurons, the scientists stimulated brain cells with those receptors, heightening the “loss” signal and turning risky rats into safer rats. © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 22025 - Posted: 03.24.2016

By Daniel Engber Nearly 20 years ago, psychologists Roy Baumeister and Dianne Tice, a married couple at Case Western Reserve University, devised a foundational experiment on self-control. “Chocolate chip cookies were baked in the room in a small oven,” they wrote in a paper that has been cited more than 3,000 times. “As a result, the laboratory was filled with the delicious aroma of fresh chocolate and baking.” Here’s how that experiment worked. Baumeister and Tice stacked their fresh-baked cookies on a plate, beside a bowl of red and white radishes, and brought in a parade of student volunteers. They told some of the students to hang out for a while unattended, eating only from the bowl of radishes, while another group ate only cookies. Afterward, each volunteer tried to solve a puzzle, one that was designed to be impossible to complete. Baumeister and Tice timed the students in the puzzle task, to see how long it took them to give up. They found that the ones who’d eaten chocolate chip cookies kept working on the puzzle for 19 minutes, on average—about as long as people in a control condition who hadn’t snacked at all. The group of kids who noshed on radishes flubbed the puzzle test. They lasted just eight minutes before they quit in frustration. The authors called this effect “ego depletion” and said it revealed a fundamental fact about the human mind: We all have a limited supply of willpower, and it decreases with overuse. © 2016 The Slate Group LLC.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21965 - Posted: 03.08.2016

Angus Chen We know we should put the cigarettes away or make use of that gym membership, but in the moment, we just don't do it. There is a cluster of neurons in our brain critical for motivation, though. What if you could hack them to motivate yourself? These neurons are located in the middle of the brain, in a region called the ventral tegmental area. A paper published Thursday in the journal Neuron suggests that we can activate the region with a little bit of training. The researchers stuck 73 people into an fMRI, a scanner that can detect what part of the brain is most active, and focused on that area associated with motivation. When the researchers said "motivate yourself and make this part of your brain light up," people couldn't really do it. "They weren't that reliable when we said, 'Go! Get psyched. Turn on your VTA,' " says Dr. Alison Adcock, a psychiatrist at Duke and senior author on the paper. That changed when the participants were allowed to watch a neurofeedback meter that displayed activity in their ventral tegmental area. When activity ramps up, the participants see the meter heat up while they're in the fMRI tube. "Your whole mind is allowed to speak to a specific part of your brain in a way you never imagined before. Then you get feedback that helps you discover how to turn that part of the brain up or down," says John Gabrieli, a neuroscientist at the Massachusetts Institute of Technology who was not involved with the work. © 2016 npr

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Link ID: 21954 - Posted: 03.05.2016

Monya Baker Is psychology facing a ‘replication crisis’? Last year, a crowdsourced effort that was able to validate fewer than half of 98 published findings [1] rang alarm bells about the reliability of psychology papers. Now a team of psychologists has reassessed the study and says that it provides no evidence for a crisis. “Our analysis completely invalidates the pessimistic conclusions that many have drawn from this landmark study,” says Daniel Gilbert, a psychologist at Harvard University in Cambridge, Massachusetts, and a co-author of the reanalysis, published on 2 March in Science [2]. But a response [3] in the same issue of Science counters that the reanalysis itself depends on selective assumptions. And others say that psychology still urgently needs to improve its research practices. In August 2015, a team of 270 researchers reported the largest ever single-study audit of the scientific literature. Led by Brian Nosek, executive director of the Center for Open Science in Charlottesville, Virginia, the Reproducibility Project attempted to replicate studies in 100 psychology papers. (It ended up with 100 replication attempts for 98 papers because of problems assigning teams to two papers.) According to one of several measures of reproducibility, just 36% could be confirmed; by another statistical measure, 47% could [1]. Either way, the results looked worryingly feeble. Not so fast, says Gilbert. © 2016 Nature Publishing Group

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21953 - Posted: 03.05.2016

By David Z. Hambrick We all make stupid mistakes from time to time. History is replete with examples. Legend has it that the Trojans accepted the Greeks’ “gift” of a huge wooden horse, which turned out to be hollow and filled with a crack team of Greek commandos. The Tower of Pisa started to lean even before construction was finished—and is not even the world’s farthest-leaning tower. NASA taped over the original recordings of the moon landing, and operatives for Richard Nixon’s re-election committee were caught breaking into a Watergate office, setting in motion the greatest political scandal in U.S. history. More recently, the French government spent $15 billion on a fleet of new trains, only to discover that they were too wide for some 1,300 station platforms. We readily recognize these incidents as stupid mistakes—epic blunders. On a more mundane level, we invest in get-rich-quick schemes, drive too fast, and make posts on social media that we later regret. But what, exactly, drives our perception of these actions as stupid mistakes, as opposed to bad luck? Their seeming mindlessness? The severity of the consequences? The responsibility of the people involved? Science can help us answer these questions. In a study just published in the journal Intelligence, using search terms such as “stupid thing to do”, Balazs Aczel and his colleagues compiled a collection of stories describing stupid mistakes from sources such as The Huffington Post and TMZ. One story described a thief who broke into a house and stole a TV and later returned for the remote; another described burglars who intended to steal cell phones but instead stole GPS tracking devices that were turned on and gave police their exact location. The researchers then had a sample of university students rate each story on the responsibility of the people involved, the influence of the situation, the seriousness of the consequences, and other factors. © 2016 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21928 - Posted: 02.24.2016

Alison Abbott More than 50 years after a controversial psychologist shocked the world with studies that revealed people’s willingness to harm others when ordered to, a team of cognitive scientists has carried out an updated version of the iconic ‘Milgram experiments’. Their findings may offer some explanation for Stanley Milgram's uncomfortable revelations: when following commands, they say, people genuinely feel less responsibility for their actions — whether they are told to do something evil or benign. “If others can replicate this, then it is giving us a big message,” says neuroethicist Walter Sinnott-Armstrong of Duke University in Durham, North Carolina, who was not involved in the work. “It may be the beginning of an insight into why people can harm others if coerced: they don’t see it as their own action.” The study may feed into a long-running legal debate about the balance of personal responsibility between someone acting under instruction and their instructor, says Patrick Haggard, a cognitive neuroscientist at University College London, who led the work, published on 18 February in Current Biology [1]. Milgram’s original experiments were motivated by the trial of Nazi Adolf Eichmann, who famously argued that he was ‘just following orders’ when he sent Jews to their deaths. The new findings don’t legitimize harmful actions, Haggard emphasizes, but they do suggest that the ‘only obeying orders’ excuse betrays a deeper truth about how a person feels when acting under command. © 2016 Nature Publishing Group

Related chapters from BP7e: Chapter 15: Emotions, Aggression, and Stress; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 11: Emotions, Aggression, and Stress; Chapter 14: Attention and Consciousness
Link ID: 21915 - Posted: 02.19.2016

By Jordana Cepelewicz Seasonal variations play a major role in the animal kingdom—in reproduction, food availability, hibernation, even fur color. Whether this seasonality has such a significant influence on humans, however, is an open question. Its best-known association is with mood—that is, feeling down during the colder months and up in the summer—and, in extreme cases, seasonal depression, a phenomenon known as seasonal affective disorder (SAD). A new study published in this week’s Proceedings of the National Academy of Sciences seeks to delve deeper into how human biology has adapted not only to day/night cycles (circadian rhythms) but to yearly seasonal patterns as well. Scientists have previously found seasonal variation in the levels and concentrations of certain compounds associated with mood (including dopamine and serotonin), conception and even mortality. Now for the first time, using functional MRI, “it’s [been] conclusively shown that cognition and the brain’s means of cognition are seasonal,” says neuroscientist Gilles Vandewalle of the University of Liège in Belgium, the study’s lead researcher. These findings come at a time when some scientists are disputing the links between seasonality and mental health. Originally aiming to investigate the impact of sleep and sleep deprivation on brain function, Vandewalle and his fellow researchers placed 28 participants on a controlled sleep/wake schedule for three weeks before bringing them into the laboratory, where they stayed for 4.5 days. During this time they underwent a cycle of sleep deprivation and recovery in the absence of seasonal cues such as natural light, time information and social interaction. Vandewalle’s team repeated the entire procedure with the same subjects several times throughout the course of nearly a year and a half. © 2016 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 10: Biological Rhythms and Sleep
Link ID: 21882 - Posted: 02.10.2016