Chapter 15. Emotions, Aggression, and Stress
By Tanya Lewis WASHINGTON — From the stroke of a mother's hand to the embrace of a lover, sensations of gentle touch activate a specialized set of nerves in humans. The brain is widely believed to contain a "map" of the body for sensing touch. But humans may also have an emotional body map that corresponds to feelings of gentle touch, according to new research presented here Sunday (Nov. 16) at the 44th annual meeting of the Society for Neuroscience. For humans and all social species, touch plays a fundamental role in the formation and maintenance of social bonds, study researcher Susannah Walker, a behavioral neuroscientist at Liverpool John Moores University in the United Kingdom, said in a news conference. "Indeed, a lack of touch can have a detrimental effect on both our physical health and our psychological well-being," Walker said. In a clinical setting, physical contact with premature infants has been shown to boost growth, decrease stress and aid brain development. But not much research has focused on the basis of these effects in the nervous system, Walker said. The human body has a number of different kinds of nerves for perceiving touch. Thicker nerves surrounded by a fatty layer of insulation (called myelin) identify touch and temperature and rapidly send those signals to the brain, whereas thinner nerves that lack this insulation send sensory information more slowly.
By Bethany Brookshire WASHINGTON – Moldy houses are hard on the lungs, and new results in mice suggest that they could also be bad for the brain. Inhaling mold spores made mice anxious and forgetful, researchers reported November 15 at the annual meeting of the Society for Neuroscience. Cheryl Harding, a psychologist at the City University of New York, and colleagues dripped low doses of spores from the toxic mold Stachybotrys into mouse noses three times per week. After three weeks, the mice didn’t look sick. But they had trouble remembering a fearful place. The mice were also more anxious than normal counterparts. The anxiety and memory deficits went along with decreases in new brain cells in the hippocampus — a part of the brain that plays a role in memory — compared with control mice. Harding and colleagues also found that the behaviors were linked to increased inflammatory proteins in the hippocampus. Exposure to mold’s toxins and structural proteins may trigger an immune response in the brain. The findings, Harding says, may help explain some of the conditions that people living in moldy buildings complain about, such as anxiety and cognitive problems. C. Harding et al. Mold inhalation, brain inflammation, and behavioral dysfunction. Society for Neuroscience Meeting, Washington, DC, November 15, 2014. © Society for Science & the Public 2000 - 2014.
By John Bohannon If you had the choice between hurting yourself or someone else in exchange for money, how altruistic do you think you’d be? In one infamous experiment, people were quite willing to deliver painful shocks to anonymous victims when asked by a scientist. But a new study that forced people into the dilemma of choosing between pain and profit finds that participants cared more about other people’s well-being than their own. It is hailed as the first hard evidence of altruism for the young field of behavioral economics. Human behavior toward others is hard to predict. On the one hand, we stand out in the animal world for our altruism, often making significant sacrifices to help out a stranger in need. And all but the most antisocial people experience psychological distress at witnessing, let alone causing, pain in others. Yet study after study in the field of behavioral economics has demonstrated that we tend to value our own needs and desires above those of others. For example, researchers have found that just thinking about money makes people behave more selfishly. To try to reconcile the angels and devils of our nature, a team led by Molly Crockett, a psychologist at the University of Oxford in the United Kingdom, combined the classic psychological and economics tools for probing altruism: pain and money. Everyone has their own pain threshold, so the first task was a pain calibration. Researchers administered electric shocks with electrodes attached to the wrists of 160 subjects, starting at an almost imperceptible level and amping up until the subject described the pain as intolerable. (For most people, that threshold for pain is similar to holding your wrist under a stream of 50°C water.) © 2014 American Association for the Advancement of Science.
By Greg Miller [Photo: This robot causes people to experience the illusory sensation of someone standing behind them. © Alain Herzog/EPFL] People who’ve stared death in the face and lived to tell about it—mountain climbers who’ve made a harrowing descent, say, or survivors of the World Trade Center attacks—sometimes report that just when their situation seemed impossible, a ghostly presence appeared. People with schizophrenia and certain types of neurological damage sometimes report similar experiences, which scientists call, aptly, “feeling of presence.” Now a team of neuroscientists says it has identified a set of brain regions that seems to be involved in generating this illusion. Better yet, they’ve built a robot that can cause ordinary people to experience it in the lab. The team was led by Olaf Blanke, a neurologist and neuroscientist at the Swiss Federal Institute of Technology in Lausanne. Blanke has a long-standing interest in creepy illusions of bodily perception. Studying these bizarre phenomena, he says, could point to clues about the biology of mental illness and the mechanisms of human consciousness. In 2006, for example, Blanke and colleagues published a paper in Nature that had one of the best titles you’ll ever see in a scientific journal: “Induction of an illusory shadow person.” In that study, they stimulated the brain of a young woman who was awaiting brain surgery for severe epilepsy. Surgeons had implanted electrodes on the surface of her brain to monitor her seizures, and when the researchers passed a mild current through the electrodes, stimulating a small region at the intersection of the temporal and parietal lobes of her brain, she experienced what she described as a shadowy presence lurking nearby, mimicking her own posture. [Figure: Colored areas indicate regions of overlap in the lesions of neurological patients who experienced feeling of presence illusions.] © 2014 Condé Nast.
by Helen Thomson Scared of the dark? Terrified of heights? Spiders make you scream? For the first time, a person's lifelong phobia has been completely abolished overnight. Unfortunately, it required removing a tiny bit of the man's brain, so for now, most people will have to find another way to dispel their fears. The phobia was abolished by accident. A 44-year-old business man started having seizures out of the blue. Brain scans showed he had an abnormality in his left amygdala – an area in the temporal lobe involved in emotional reactions, among other things. Further tests showed the cause was sarcoidosis, a rare condition that causes damage to the lungs, skin and, occasionally, the brain. Doctors decided it was necessary to remove the man's damaged left amygdala. The surgery went well, but soon after the man noticed a strange turn of events. Not only did he have a peculiar "stomach-lurching" aversion to music – which was particularly noticeable when he heard the song accompanying a certain TV advert – but he also discovered he was no longer afraid of spiders. While his aversion to music waned over time, his arachnophobia never returned. Before the surgery he would throw tennis balls at spiders, or use hairspray to immobilise them before vacuuming them up. Now he is able to touch and observe the little critters at close distance and says he actually finds them fascinating. He hasn't noticed any changes to other kinds of fears or anxieties. For example, he is equally as anxious about public speaking now as he was prior to surgery. © Copyright Reed Business Information Ltd.
by Dan Jones The way your brain reacts to a single disgusting image can be used to predict whether you lean to the left or the right politically. A number of studies have probed the emotions of people along the political spectrum, and found that disgust in particular is tightly linked to political orientation. People who are highly sensitive to disgusting images – of bodily waste, gore or animal remains – are more likely to sit on the political right and show concern for what they see as bodily and spiritual purity, so tend to oppose abortion and gay marriage, for example. A team led by Read Montague, a neuroscientist at Virginia Tech in Roanoke, recruited 83 volunteers and performed fMRI brain scans on them as they looked at a series of 80 images that were either pleasant, disgusting, threatening or neutral. Participants then rated the images for their emotional impact and completed a series of questionnaires that assessed whether they were liberal, moderate or conservative. The brain-imaging results were then fed to a learning algorithm which compared the whole-brain responses of liberals and conservatives when looking at disgusting images versus neutral ones. For both political groups, the algorithm was able to pick out distinct patterns of brain activity triggered by the disgusting images. And even though liberals and conservatives consciously reported similar emotional reactions to the images, the specific brain regions involved and their patterns of activation differed consistently between the two groups – so much so that they represented a neural signature of political leaning, the team concludes. © Copyright Reed Business Information Ltd
By Jenna Bilbrey Your starbase is almost complete. All you need is a few more tons of ore. You could take the afternoon to mine it from an asteroid field, but you’ve heard of a Ska’ari who trades ore for cheap. So you message your alliance, use your connections to set up a meeting, and hop in your spacecraft. It’s good to have friends, even if they are virtual. An online science fiction game may not seem like the ideal place to study human behavior, but physicist Stefan Thurner has shown that the way people act in the virtual world isn’t so different from how they act in the real one. Thurner studies all sorts of complex systems at the Medical University of Vienna, so when one of his doctoral students just happened to create one of the most popular free browser-based games in Europe, Thurner suggested using the game, called Pardus, to study the spontaneous organization of people in a closed society. For almost three-and-a-half years, they monitored the interactions of the roughly 7000 players active at any one time within the game’s virtual world. Unlike in real life, Pardus players’ moves are tracked and their interactions are recorded automatically by the game. “We have information about everything,” Thurner says. “We know who is where at what point in time, … who exchanges things or money with whom, who is friends with whom, … who hates someone else, who collaborates with whom in entrepreneurial activities, who is in a criminal gang with whom, etc. Even though the society is artificial, it’s a human society.” © 2014 American Association for the Advancement of Science.
By Melissa Hogenboom Science reporter, BBC News A genetic analysis of almost 900 offenders in Finland has revealed two genes associated with violent crime. Those with the genes were 13 times more likely to have a history of repeated violent behaviour. The authors of the study, published in the journal Molecular Psychiatry, said at least 4-10% of all violent crime in Finland could be attributed to individuals with these genotypes. But they stressed the genes could not be used to screen criminals. Many more genes may be involved in violent behaviour and environmental factors are also known to have a fundamental role. Even if an individual has a "high-risk combination" of these genes the majority will never commit a crime, the lead author of the work Jari Tiihonen of the Karolinska Institutet in Sweden said. "Committing a severe, violent crime is extremely rare in the general population. So even though the relative risk would be increased, the absolute risk is very low," he told the BBC. The study, which involved analysis of almost 900 criminals, is the first to have looked at the genetic make-up of so many violent criminals in this way.
Warrior gene
Each criminal was given a profile based on their offences, categorising them into violent or non-violent. The association between genes and previous behaviour was strongest for the 78 who fitted the "extremely violent offender" profile. This group had committed a total of 1,154 murders, manslaughters, attempted homicides or batteries. A replication group of 114 criminals had all committed at least one murder. BBC © 2014
By GABRIELE OETTINGEN MANY people think that the key to success is to cultivate and doggedly maintain an optimistic outlook. This belief in the power of positive thinking, expressed with varying degrees of sophistication, informs everything from affirmative pop anthems like Katy Perry’s “Roar” to the Mayo Clinic’s suggestion that you may be able to improve your health by eliminating “negative self-talk.” But the truth is that positive thinking often hinders us. More than two decades ago, I conducted a study in which I presented women enrolled in a weight-reduction program with several short, open-ended scenarios about future events — and asked them to imagine how they would fare in each one. Some of these scenarios asked the women to imagine that they had successfully completed the program; others asked them to imagine situations in which they were tempted to cheat on their diets. I then asked the women to rate how positive or negative their resulting thoughts and images were. A year later, I checked in on these women. The results were striking: The more positively women had imagined themselves in these scenarios, the fewer pounds they had lost. My colleagues and I have since performed many follow-up studies, observing a range of people, including children and adults; residents of different countries (the United States and Germany); and people with various kinds of wishes — college students wanting a date, hip-replacement patients hoping to get back on their feet, graduate students looking for a job, schoolchildren wishing to get good grades. In each of these studies, the results have been clear: Fantasizing about happy outcomes — about smoothly attaining your wishes — didn’t help. Indeed, it hindered people from realizing their dreams. © 2014 The New York Times Company
By J. PEDER ZANE Striking it rich is the American dream, a magnetic myth that has drawn millions to this nation. And yet, a countervailing message has always percolated through the culture: Money can’t buy happiness. From Jay Gatsby and Charles Foster Kane to Tony Soprano and Walter White, the woefully wealthy are among the seminal figures of literature, film and television. A thriving industry of gossipy, star-studded magazines and websites combines these two ideas, extolling the lifestyles of the rich and famous while exposing the sadness of celebrity. All of which raises the question: Is the golden road paved with misery? Yes, in a lot of cases, according to a growing body of research exploring the connection between wealth and happiness. Studies in behavioral economics, cognitive psychology and neuroscience are providing new insights into how a changing American economy and the wiring of the human brain can make life on easy street feel like a slog. Make no mistake, it is better to be rich than poor — psychologically as well as materially. Levels of depression, anxiety and stress diminish as incomes rise. What has puzzled researchers is that the psychological benefits of wealth seem to stop accruing once people reach an income of about $75,000 a year. “The question is, What are the factors that dampen the rewards of income?” said Scott Schieman, a professor of sociology at the University of Toronto. “Why doesn’t earning even more money — beyond a certain level — make us feel even happier and more satisfied?” The main culprit, he said, is the growing demands of work. For millenniums, leisure was wealth’s bedfellow. The rich were different because they worked less. The tables began to turn in America during the 1960s, when inherited privilege gave way to educational credentials and advancement became more closely tied to merit. © 2014 The New York Times Company
James Hamblin People whose faces are perceived to look more "competent" are more likely to be CEOs of large, successful companies. Having a face that people deem "dominant" is a predictor of rank advancement in the military. People are more likely to invest money with people who look "trustworthy." These sorts of findings go on and on in recent studies that claim people can accurately guess a variety of personality traits and behavioral tendencies from portraits alone. The findings seem to elucidate either canny human intuition or absurd, misguided bias. There has been a recent boom in research on how people attribute social characteristics to others based on the appearance of faces—independent of cues about age, gender, race, or ethnicity. (At least, as independent as possible.) The results seem to offer some intriguing insight, claiming that people are generally pretty good at predicting who is, for example, trustworthy, competent, introverted or extroverted, based entirely on facial structure. There is strong agreement across studies as to what facial attributes mean what to people, as illustrated in renderings throughout this article. But it's, predictably, not at all so simple. Christopher Olivola, an assistant professor at Carnegie Mellon University, makes the case against face-ism today, in the journal Trends in Cognitive Sciences. In light of many recent articles touting people's judgmental abilities, Olivola and Princeton University's Friederike Funk and Alexander Todorov say that a careful look at the data really doesn't support these claims. And "instead of applauding our ability to make inferences about social characteristics from facial appearances," Olivola said, "the focus should be on the dangers."
BY Bethany Brookshire Stress is our coping response. Whether emotional or physical, stress is how organisms react to upheaval in their lives. And in many cases, that response requires tradeoffs. An animal will make it through now, but may come out with fewer fat stores or a shorter life span. But a new study shows that under certain conditions, developmental stress in male zebra finches might have a positive effect, in the form of more offspring to carry on his genes. Ondi Crino, a biologist now at Macquarie University in Sydney, examined how stress during development might affect reproductive success in male zebra finches. She purchased 10 male and 10 female zebra finches from pet shops near the University of Montana. The birds were allowed to pair off and nest. When the first batch of chicks was 12 days old, Crino fed half of the male offspring peanut oil, and half peanut oil with the hormone corticosterone mixed in. Both humans and finches produce stress-related hormones. Humans produce cortisol, while finches produce corticosterone. These two hormones increase during times of stress and cause many of the negative effects we associate with worry and pressure. So administering corticosterone is one method of “stressing” an animal without changing anything else in its environment. The dose was in the range of what a young bird might experience in the midst of a natural upheaval such as a cold snap or famine. After 16 days of the peanut oil supplement, the young male birds receiving corticosterone were smaller than their relaxed counterparts. They also had a larger spike in their own corticosterone levels when they were stressed. But over time, the chicks that received corticosterone appeared to grow out of their stressful upbringing. By adulthood they were the same size as controls, and they did not show frazzled feathers or pale colors that might indicate a rough chickhood. © Society for Science & the Public 2000 - 2014
By Jane E. Brody In the 1997 film “As Good As It Gets,” Jack Nicholson portrays Melvin Udall, a middle-aged man with obsessive-compulsive disorder who avoids stepping on cracks, locks doors and flips light switches exactly five times, and washes his hands repeatedly, each time tossing out the new bar of soap he used. He brings wrapped plastic utensils to the diner where he eats breakfast at the same table every day. Though the film is billed as a romantic comedy, Melvin’s disorder is nothing to laugh about. O.C.D. is often socially, emotionally and vocationally crippling. It can even be fatal. Four years ago, John C. Kelly, 24, killed himself in Irvington, N.Y., after a long battle with a severe form of obsessive-compulsive disorder. Mr. Kelly was a devoted baseball player, and now friends hold an annual softball tournament to raise money for the foundation established in his honor to increase awareness of the disorder. Obsessive thoughts and compulsive behaviors occur in almost every life from time to time. I have a fair share of compulsive patterns: seasonings arranged in strict alphabetical order; kitchen equipment always put back the same way in the same place; two large freezers packed with foods just in case I need them. I hold onto a huge collection of plastic containers, neatly stacked with their covers, and my closets bulge with clothes and shoes I haven’t worn in years, and probably never will again — yet cannot bring myself to give away. But these common habits fall far short of the distressing obsessions and compulsions that are the hallmarks of O.C.D.: intrusive, disturbing thoughts or fears that cannot be ignored and compel the sufferer to engage in ritualistic, irrational behaviors to relieve the resulting anxiety.
Daniel Cressey Mirrors are often used to elicit aggression in animal behavioural studies, with the assumption being that creatures unable to recognize themselves will react as if encountering a rival. But research suggests that such work may simply reflect what scientists expect to see, and not actual aggression. For most people, looking in a mirror does not trigger a bout of snarling hostility at the face staring back. But many animals do seem to react aggressively to their mirror image, and for years mirrors have been used to trigger such responses for behavioural research on species ranging from birds to fish. “There’s been a very long history of using a mirror as it’s just so handy,” says Robert Elwood, an animal-behaviour researcher at Queen’s University in Belfast, UK. Using a mirror radically simplifies aggression experiments, cutting down the number of animals required and providing the animal being observed with an ‘opponent’ perfectly matched in terms of size and weight. But in a study just published in Animal Behaviour, Elwood and his team add to evidence that many mirror studies are flawed. The researchers looked at how convict cichlid fish (Amatitlania nigrofasciata) reacted both to mirrors and to real fish of their own species. This species prefers to display its right side in aggression displays, which means that two fish end up alongside each other in a head-to-tail configuration. It is impossible for a fish to achieve this with its own reflection, but Elwood reasoned that fish faced with a mirror would attempt it, and flip from side to side as they tried to present an aggressive display. On the other hand, if the reflection did not trigger an aggressive reaction, the fish would not display such behaviour as much or as frequently. © 2014 Nature Publishing Group.
By MOISES VELASQUEZ-MANOFF WHEN Andre H. Lagrange, a neurologist at Vanderbilt University in Nashville, saw the ominous white spots on the patient’s brain scan, he considered infection or lymphoma, a type of cancer. But tests ruled out both. Meanwhile, anti-epilepsy drugs failed to halt the man’s seizures. Stumped, Dr. Lagrange turned to something the mother of the 30-year-old man kept repeating. The fits coincided, she insisted, with spells of constipation and diarrhea. That, along with an odd rash, prompted Dr. Lagrange to think beyond the brain. Antibody tests, followed by an intestinal biopsy, indicated celiac disease, an autoimmune disorder of the gut triggered by the gluten proteins in wheat and other grains. Once on a gluten-free diet, the man’s seizures stopped; those brain lesions gradually disappeared. He made a “nearly complete recovery,” Dr. Lagrange told me. I began encountering case descriptions like this some years ago as I researched autoimmune disease. The first few seemed like random noise in an already nebulous field. But as I amassed more — describing seizures, hallucinations, psychotic breaks and even, in one published case, what looked like regressive autism, all ultimately associated with celiac disease — they began to seem less like anomalies, and more like a frontier in celiac research. They tended to follow a similar plot. What looked like neurological or psychiatric symptoms appeared suddenly. The physician ran through a diagnostic checklist without success. Drugs directed at the brain failed. Some clue suggestive of celiac disease was observed. The diagnosis was made. And the patient recovered on a gluten-free diet. The cases highlighted, in an unusually concrete fashion, the so-called gut-brain axis. The supposed link between the intestinal tract and the central nervous system is much discussed in science journals, often in the context of the microbial community inhabiting the gut. 
But it’s unclear how, really, we can leverage the link to improve health. © 2014 The New York Times Company
by Mallory Locklear Do you have an annoying friend who loves bungee jumping or hang-gliding, and is always blathering on about how it never scares them? Rather than being a macho front, their bravado may have a biological basis. Research from Stony Brook University in New York shows that not all risk-takers are cut from the same cloth. Some actually seem to feel no fear – or at least their bodies and brains don't respond to danger in the usual way. The study is the first to attempt to tease apart the differences in the risk-taking population. In order to ensure every participant was a card-carrying risk-taker, the team, led by Lilianne Mujica-Parodi, recruited 30 first-time skydivers. "Most studies on sensation-seeking compare people who take risks and people who don't. We were interested in something more subtle – those who take risks adaptively and those who do so maladaptively." In other words, do all risk-takers process potential danger in the same way or do some ignore the risks more than others? To find out, the researchers got their participants to complete several personality questionnaires, including one that asked them to rank how well statements such as, "The greater the risk the more fun the activity," described them. Next, the team used fMRI to observe whether the participants' corticolimbic brain circuit – which is involved in risk assessment – was well-regulated. A well-regulated circuit is one that reacts to a threat and then returns to a normal state afterwards. © Copyright Reed Business Information Ltd
By ALINA TUGEND MANY workers now feel as if they’re doing the job of three people. They are on call 24 hours a day. They rush their children from tests to tournaments to tutoring. The stress is draining, both mentally and physically. At least that is the standard story about stress. It turns out, though, that many of the common beliefs about stress don’t necessarily give the complete picture. MISCONCEPTION NO. 1 Stress is usually caused by having too much work. While being overworked can be overwhelming, research increasingly shows that being underworked can be just as challenging. In essence, boredom is stressful. “We tend to think of stress in the original engineering way, that too much pressure or too much weight on a bridge causes it to collapse,” said Paul E. Spector, a professor of psychology at the University of South Florida. “It’s more complicated than that.” Professor Spector and others say too little to do — or underload, as he calls it — can cause many of the physical discomforts we associate with being overloaded, like muscle tension, stomachaches and headaches. A study published this year in the journal Experimental Brain Research found that measurements of people’s heart rates, hormonal levels and other factors while watching a boring movie — men hanging laundry — showed greater signs of stress than those watching a sad movie. “We tend to think of boredom as someone lazy, as a couch potato,” said James Danckert, a professor of neuroscience at the University of Waterloo in Ontario, Canada, and a co-author of the paper. “It’s actually when someone is motivated to engage with their environment and all attempts to do so fail. It’s aggressively dissatisfying.” © 2014 The New York Times Company
By Daisy Yuhas Do we live in a holographic universe? How green is your coffee? And could drinking too much water actually kill you? Before you click those links you might consider how your knowledge-hungry brain is preparing for the answers. A new study from the University of California, Davis, suggests that when our curiosity is piqued, changes in the brain ready us to learn not only about the subject at hand, but incidental information, too. Neuroscientist Charan Ranganath and his fellow researchers asked 19 participants to review more than 100 questions, rating each in terms of how curious they were about the answer. Next, each subject revisited 112 of the questions—half of which strongly intrigued them whereas the rest they found uninteresting—while the researchers scanned their brain activity using functional magnetic resonance imaging (fMRI). During the scanning session participants would view a question then wait 14 seconds and view a photograph of a face totally unrelated to the trivia before seeing the answer. Afterward the researchers tested participants to see how well they could recall and retain both the trivia answers and the faces they had seen. Ranganath and his colleagues discovered that greater interest in a question would predict not only better memory for the answer but also for the unrelated face that had preceded it. A follow-up test one day later found the same results—people could better remember a face if it had been preceded by an intriguing question. Somehow curiosity could prepare the brain for learning and long-term memory more broadly. The findings are somewhat reminiscent of the work of U.C. Irvine neuroscientist James McGaugh, who has found that emotional arousal can bolster certain memories. But, as the researchers reveal in the October 2 Neuron, curiosity involves very different pathways. © 2014 Scientific American
Helen Thomson You'll have heard of Pavlov's dogs, conditioned to expect food at the sound of a bell. You might not have heard that a scarier experiment – arguably one of psychology's most unethical – was once performed on a baby. In it, a 9-month-old, at first unfazed by the presence of animals, was conditioned to feel fear at the sight of a rat. The infant was presented with the animal as someone struck a metal pole with a hammer above his head. This was repeated until he cried at merely the sight of any furry object – animate or inanimate. The "Little Albert" experiment, performed in 1919 by John Watson of Johns Hopkins University Hospital in Baltimore, Maryland, was the first to show that a human could be classically conditioned. The fate of Albert B has intrigued researchers ever since. Hall Beck at the Appalachian State University in Boone, North Carolina, has been one of the most tenacious researchers on the case. Watson's papers stated that Albert B was the son of a wet nurse who worked at the hospital. Beck spent seven years exploring potential candidates and used facial analysis to conclude in 2009 that Little Albert was Douglas Merritte, son of hospital employee Arvilla. Douglas was born on the same day as Albert and several other points tallied with Watson's notes. Tragically, medical records showed that Douglas had severe neurological problems and died at an early age of hydrocephalus, or water on the brain. According to his records, this seems to have resulted in vision problems, so much so that at times he was considered blind. © Copyright Reed Business Information Ltd.
By Fredrick Kunkle Here’s something to worry about: A recent study suggests that middle-age women whose personalities tend toward the neurotic run a higher risk of developing Alzheimer’s disease later in life. The study by researchers at the University of Gothenburg in Sweden followed a group of women in their 40s, whose disposition made them prone to anxiety, moodiness and psychological distress, to see how many developed dementia over the next 38 years. In line with other research, the study suggested that women who were the most easily upset by stress — as determined by a commonly used personality test — were two times more likely to develop Alzheimer’s disease than women who were least prone to neuroticism. In other words, personality really is — in some ways — destiny. “Most Alzheimer’s research has been devoted to factors such as education, heart and blood risk factors, head trauma, family history and genetics,” study author Lena Johansson said in a written statement. “Personality may influence the individual’s risk for dementia through its effect on behavior, lifestyle or reactions to stress.” The researchers cautioned that the results cannot be extrapolated to men because they were not included in the study and that further work is needed to determine possible causes for the link. The study, which appeared Wednesday in the American Academy of Neurology’s journal, Neurology, examined 800 women whose average age in 1968 was 46 years to see whether neuroticism — which involves being easily distressed and subject to excessive worry, jealousy or moodiness — might have a bearing on the risk of dementia.