Chapter 14. Attention and Higher Cognition




By Pragya Agarwal If you have seen the documentary Free Solo, you will be familiar with Alex Honnold. He ascends without protective equipment of any kind in treacherous landscapes where, above about 15 meters, any slip is generally lethal. Even just watching him pressed against the rock with barely any handholds makes me nauseous. In a functional magnetic resonance imaging (fMRI) test with Honnold, neurobiologist Jane Joseph found near zero activation in his amygdala. This is a highly unusual brain reaction and may explain why Alex feels no threat in free solo climbs that others wouldn’t dare attempt. But it also shows how our amygdala activates in that split second to warn us, and why it plays an important role in our unconscious biases. Having spent many years researching unconscious bias for my book, I have realized that it remains difficult to pinpoint because it is hidden and often stands in complete contrast to our professed beliefs. Neuroimaging research is beginning to give us more insight into the formation of our unconscious biases. Recent fMRI studies demonstrate that people use different areas of the brain when reasoning about familiar and unfamiliar situations. The neural zones that respond to stereotypes include primarily the amygdala, the prefrontal cortex, the posterior cingulate and the anterior temporal cortex; these are described as all “lighting up like a Christmas tree” when stereotypes are activated (that is, certain parts of the brain become more activated than others during certain tasks). When we meet someone new, we are not merely focusing on our verbal interaction. © 2020 Scientific American

Keyword: Attention; Brain imaging
Link ID: 27184 - Posted: 04.13.2020

Anne Trafton | MIT News Office Imagine you are meeting a friend for dinner at a new restaurant. You may try dishes you haven’t had before, and your surroundings will be completely new to you. However, your brain knows that you have had similar experiences — perusing a menu, ordering appetizers, and splurging on dessert are all things that you have probably done when dining out. MIT neuroscientists have now identified populations of cells that encode each of these distinctive segments of an overall experience. These chunks of memory, stored in the hippocampus, are activated whenever a similar type of experience takes place, and are distinct from the neural code that stores detailed memories of a specific location. The researchers believe that this kind of “event code,” which they discovered in a study of mice, may help the brain interpret novel situations and learn new information by using the same cells to represent similar experiences. “When you encounter something new, there are some really new and notable stimuli, but you already know quite a bit about that particular experience, because it’s a similar kind of experience to what you have already had before,” says Susumu Tonegawa, a professor of biology and neuroscience at the RIKEN-MIT Laboratory of Neural Circuit Genetics at MIT’s Picower Institute for Learning and Memory. Tonegawa is the senior author of the study, which appears today in Nature Neuroscience. Chen Sun, an MIT graduate student, is the lead author of the paper. New York University graduate student Wannan Yang and Picower Institute technical associate Jared Martin are also authors of the paper.

Keyword: Learning & Memory; Attention
Link ID: 27174 - Posted: 04.07.2020

Oliver Wainwright Some whisper gently into the microphone, while tapping their nails along the spine of a book. Others take a bar of soap and slice it methodically into tiny cubes, letting the pieces clatter into a plastic tray. There are those who dress up as doctors and pretend to perform a cranial nerve exam, and the ones who eat food as noisily as they can, recording every crunch and slurp in 3D stereo sound. To an outsider, the world of ASMR videos can be a baffling, kooky place. In a fast-growing corner of the internet, millions of people are watching each other tap, rattle, stroke and whisper their way through hours of homemade videos, with the aim of being lulled to sleep, or in the hope of experiencing “the tingles” – AKA, the autonomous sensory meridian response. “It feels like a rush of champagne bubbles at the top of your head,” says curator James Taylor-Foster. “There’s a mild sense of euphoria and a feeling of deep calm.” Taylor-Foster has spent many hours trawling the weirdest depths of YouTube in preparation for a new exhibition, Weird Sensation Feels Good, at ArkDes, Sweden’s national centre for architecture and design, on what he sees as one of the most important creative movements to emerge from the internet. (Though the museum has been closed due to the coronavirus pandemic, the show will be available to view online.) It will be the first major exhibition about ASMR, a term that was coined a decade ago when cybersecurity expert Jennifer Allen was looking for a word to describe the warm effervescence she felt in response to certain triggers. She had tried searching the internet for things like “tingling head and spine” or “brain orgasm”. In 2009, she hit upon a post on a health message board titled WEIRD SENSATION FEELS GOOD. © 2020 Guardian News & Media Limited

Keyword: Hearing; Attention
Link ID: 27169 - Posted: 04.04.2020

Stephanie Preston The media is replete with COVID-19 stories about people clearing supermarket shelves – and the backlash against them. Have people gone mad? How can one individual be overfilling his own cart, while shaming others who are doing the same? As a behavioral neuroscientist who has studied hoarding behavior for 25 years, I can tell you that this is all normal and expected. People are acting the way evolution has wired them. The word “hoarding” might bring to mind relatives or neighbors whose houses are overfilled with junk. A small percentage of people do suffer from what psychologists call “hoarding disorder,” keeping excessive goods to the point of distress and impairment. But hoarding is actually a totally normal and adaptive behavior that kicks in any time there is an uneven supply of resources. Everyone hoards, even during the best of times, without even thinking about it. People like to have beans in the pantry, money in savings and chocolates hidden from the children. These are all hoards. Most Americans have had so much, for so long. People forget that, not so long ago, survival often depended on working tirelessly all year to fill root cellars so a family could last through a long, cold winter – and still many died. Similarly, squirrels work all fall to hide nuts to eat for the rest of the year. Kangaroo rats in the desert hide seeds the few times it rains and then remember where they put them to dig them back up later. A Clark’s nutcracker can hoard over 10,000 pine seeds per fall – and even remember where it put them. © 2010–2020, The Conversation US, Inc.

Keyword: Obesity; Attention
Link ID: 27149 - Posted: 03.30.2020

By Douglas Starr When Jennifer Eberhardt appeared on The Daily Show with Trevor Noah in April 2019, she had a hard time keeping a straight face. But some of the laughs were painful. Discussing unconscious racial bias, which she has studied for years, the Stanford University psychologist mentioned the “other-race effect,” in which people have trouble recognizing faces of other racial groups. Criminals have learned to exploit the effect, she told Noah. In Oakland, California, a gang of black teenagers caused a mini–crime wave of purse snatchings among middle-aged women in Chinatown. When police asked the teens why they targeted that neighborhood, they said the Asian women, when faced with a lineup, “couldn’t tell the brothers apart.” “That is one of the most horrible, fantastic stories ever!” said Noah, a black South African. But it was true. Eberhardt has written that the phrase “they all look alike,” long the province of the bigot, “is actually a function of biology and exposure.” There’s no doubt plenty of overt bigotry exists, Eberhardt says, but she has found that most of us also harbor bias without knowing it. It stems from our brain’s tendency to categorize things—a useful function in a world of infinite stimuli, but one that can lead to discrimination, baseless assumptions, and worse, particularly in times of hurry or stress. Over the decades, Eberhardt and her Stanford team have explored the roots and ramifications of unconscious bias, from the level of the neuron to that of society. In cleverly designed experiments, she has shown how social conditions can interact with the workings of our brain to determine our responses to other people, especially in the context of race. Eberhardt’s studies are “strong methodologically and also super real-world relevant,” says Dolly Chugh of New York University’s Stern School of Business, a psychologist who studies decision-making. © 2020 American Association for the Advancement of Science.

Keyword: Attention; Emotions
Link ID: 27145 - Posted: 03.27.2020

By Scott Barry Kaufman
Who are you and how did you become interested in free will?
I am an Assistant Professor of Philosophy at Iona College, where I also serve as a faculty member for the Iona Neuroscience program. I previously worked in the Scientific and Philosophical Studies of Mind program at Franklin and Marshall College and held earlier appointments as a Lecturer at King’s College London and the University of Alabama. My recent and forthcoming publications focus on issues of autonomy in philosophical accounts of free will, as well as how free will intersects with neuroscience and psychiatry. One of the main questions I investigate is what neuroscience can tell us about meaningful agency (see here for my recent review of the topic, part of an extended review of research on agency, freedom, and responsibility for the John Templeton Foundation). I became interested in free will via an interdisciplinary route. As an undergraduate at Grinnell College, I majored in psychology with a strong emphasis on experimental and clinical psychology. During my senior year at Grinnell I realized that I was fascinated by the theoretical issues operating in the background of the psychological studies that we read and conducted, especially how the mind is related to the brain, prospects for the scientific study of consciousness, and how humans as agents fit into a natural picture of the world. So I followed these interests to the philosophy of psychology and eventually found my way to the perfect fusion of these topics: the neuroscience of free will.
What is free will?
Free will seems to be a familiar feature of our everyday lives — most of us believe that, at least at times, what we do is up to us to some extent: for instance, that I freely decided to take my job, or that I am acting freely when I decide to go for a run this afternoon. Free will is not just that I move about in the world to achieve a goal, but that I exercise meaningful control over what I decide to do. My decisions and actions are up to me in the sense that they are mine — a product of my values, desires, beliefs, and intentions. I decided to take this job because I valued the institution’s mission or believed that the job would be enriching or a good fit for me. Correspondingly, it seems to me that at least at times I could have decided on, and done, something other than what I did. I decided to go for a run this afternoon, but no one made me and I wasn’t subject to any compulsion; I could have gone for a coffee instead, or so it seems to me. Philosophers take these starting points and work to construct plausible accounts of free will. Broadly speaking, there is a lot of disagreement as to the right view of free will, but most philosophers believe that a person has free will if they have the ability to act freely, and that this kind of control is linked to whether it would be appropriate to hold that person responsible (e.g., to blame or praise them) for what they do. For instance, we don’t typically hold people responsible for what they do if they were acting under severe threat or inner compulsion. © 2020 Scientific American

Keyword: Consciousness
Link ID: 27128 - Posted: 03.17.2020

As we get older, we become more easily distracted, but it isn't always a disadvantage, according to researchers. Tarek Amer, a psychology postdoctoral research fellow at Columbia University, says that although our ability to focus our attention on specific things worsens as we get older, our ability to take in broad swaths of information remains strong. So in general, older adults are able to retain information that a more focused person could not. For the last few years, Amer's research has focused mainly on cognitive control, a loose term that describes one's ability to focus attention. His work at the University of Toronto, where he received his PhD in 2018, looked specifically at older adults aged 60 to 80. Amer joined Spark host Nora Young to discuss his research and how it could be implemented in practical ways.
What happens to our ability to concentrate as we get older?
There's a lot of research that shows this ability tends to decline or is reduced with age. Essentially, what we see is that relative to younger adults, older adults have a harder time focusing on one thing while ignoring distractions. The distraction can come from the external world, but it can also be internally based, such as our own thoughts, which are usually not related to the task at hand. With respect to mind wandering specifically, the literature is ... mixed. [The] typical finding is that older adults tend to, at least in lab-based tasks, mind wander less.
So I know that you've been looking, in your own research, at concentration and memory formation. What exactly are you studying?
One of the things I was interested in is whether this [decline in the ability to concentrate] could be associated with any benefits in old age. For example, one thing that we showed is that when older and younger adults perform a task that includes both task-relevant as well as task-irrelevant information, older adults are actually processing both types of information. So if we give them a memory task at the end that tests memory for the irrelevant information, we see that older adults actually outperform younger adults. ©2020 CBC/Radio-Canada.

Keyword: Attention; Alzheimers
Link ID: 27116 - Posted: 03.14.2020

By Susana Martinez-Conde Parents tend to be just a bit biased about their children’s looks (not me though—my kids are objectively beautiful), but as it turns out, this type of self-deception is not as benign as one might think. According to recent research, many parents appear to suffer from a sort of denial concerning their kids’ weights, which poses a considerable obstacle to remediating childhood obesity by way of promoting healthy eating habits at home. The latest such study was published last month in the American Journal of Human Biology and was conducted by a team of scientists at the University of Coimbra in Portugal. Daniela Rodrigues and her collaborators, Aristides Machado-Rodrigues and Cristina Padez, recruited hundreds of parents and children for their research. All the participating children were between 6 and 10 years old and attended elementary school in Portugal. A total of 834 parents completed questionnaires that included a variety of questions, such as whether they thought that their children’s weight was a bit too little, a bit too much, way too much, or just fine. In turn, the team collected the weights and heights of the 793 participating children at their respective schools. The results were in line with the researchers’ predictions, but remarkable nonetheless. Of the 33% of parents who misperceived their children’s weight, 93% underestimated it. Moreover, parents who underestimated their kids’ weights were 10 to 20 times more likely to have an obese child. Several factors were associated with parental weight underestimation, including a higher BMI (body mass index) for the mothers, younger ages for the children, lower household income (for girls) and urban living (for boys). However, such associations did not explain why parents underestimated their children’s weights to begin with. © 2020 Scientific American
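The two percentages quoted above combine in a straightforward way. A back-of-the-envelope sketch in Python (illustrative arithmetic only, not the paper's analysis):

```python
# Figures quoted in the article: 33% of parents misperceived their
# child's weight, and 93% of those misperceptions were underestimates.
misperceived = 0.33
underestimated_given_misperceived = 0.93

# Share of ALL parents who underestimated their child's weight.
share_underestimating = misperceived * underestimated_given_misperceived
print(f"~{share_underestimating:.0%} of all parents underestimated")  # ~31%
```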

Keyword: Obesity; Attention
Link ID: 27106 - Posted: 03.09.2020

Dori Grijseels In 2016, three neuroscientists wrote a commentary article arguing that, to truly understand the brain, neuroscience needed to change. From that paper, the International Brain Laboratory (IBL) was born. The IBL, now a collaboration between 22 labs across the world, is unique in biology. It is modeled on physics collaborations, like the ATLAS experiment at CERN, where thousands of scientists work together on a common problem, sharing data and resources along the way. This was a response to the main criticism that the paper’s authors, Zachary Mainen, Michael Häusser and Alexandre Pouget, had of existing neuroscience collaborations: labs came together to discuss generalities, but all the experiments were done separately. They wanted to create a collaboration in which scientists worked together throughout the process, even though their labs might be distributed all over the globe. The IBL decided to focus on one brain function only: decision-making. Decision-making engages the whole brain, since it requires using both input from the senses and information about previous experiences. If someone is thinking about bringing a sweater when they go out, they will use their senses to determine whether it looks and feels cold outside, but they might also remember that, yesterday, they were cold without a sweater. For its first experiment, published in preprint form, seven of the IBL’s 22 labs tested 101 mice on their decision-making ability. The mice saw a black-and-white grating either to their right or to their left. They then had to twist a little Lego wheel to move the grating to the middle. The mice gradually learned the task, rewarded with sugary water whenever they performed it correctly. It is easy for them to decide which way to twist the wheel if the grating has a high contrast, because it stands out against the background of their visual field. However, the mice were also presented with lower-contrast gratings not easily distinguishable from the background, which made the decision of which way to turn the wheel more difficult. In some cases, the grating was even indistinguishable from the background. Across all seven labs, which were spread across three countries, the mice completed this task three million times. © 2017 – 2019 Massive Science Inc.
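To make the task structure concrete, here is a minimal simulation of an IBL-style trial, assuming a standard logistic psychometric model; the sensitivity value and contrast levels are invented for illustration and are not fitted to the IBL data.

```python
import math
import random

def p_choose_right(signed_contrast: float, sensitivity: float = 8.0) -> float:
    """Logistic psychometric function: P(mouse turns the wheel to report 'right').

    signed_contrast is negative for left gratings, positive for right;
    at zero the grating is indistinguishable from the background.
    """
    return 1.0 / (1.0 + math.exp(-sensitivity * signed_contrast))

def simulate_trial(signed_contrast: float) -> bool:
    """Run one trial; return True if the simulated mouse responds correctly."""
    chose_right = random.random() < p_choose_right(signed_contrast)
    return chose_right == (signed_contrast > 0)

# Accuracy is near-perfect at high contrast and falls to chance at zero,
# where there is no correct answer to recover.
for c in (-1.0, -0.25, -0.06, 0.0, 0.06, 0.25, 1.0):
    accuracy = sum(simulate_trial(c) for _ in range(10_000)) / 10_000
    print(f"contrast {c:+.2f}: accuracy ~ {accuracy:.2f}")
```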

Keyword: Attention; Learning & Memory
Link ID: 27102 - Posted: 03.07.2020

By Liz Langley It might be time to reconsider what it means to call someone a “rat.” Previous research has shown the much-maligned rodents assist comrades in need, as well as remember individual rats that have helped them—and return the favor. Now, a new study builds on this evidence of empathy, revealing that domestic rats will avoid harming other rats. In the study, published March 5 in the journal Current Biology, rats were trained to pull levers to get a tasty sugar pellet. If the lever delivered a mild shock to a neighbor, several of the rats stopped pulling that lever and switched to another. Harm aversion, as it's known, is a well-known human trait regulated by a part of the brain called the anterior cingulate cortex (ACC). Further experiments showed the ACC controls this behavior in rats, too. This is the first time scientists have found the ACC is necessary for harm aversion in a non-human species. This likeness between human and rat brains is “super-exciting for two reasons,” says study co-author Christian Keysers, of the Netherlands Institute for Neuroscience. For one, it suggests that preventing harm to others is rooted deep in mammals' evolutionary history. (See what a rat looks like when it’s happy.) What’s more, the finding could have a real impact on people suffering from psychiatric disorders such as psychopathy and sociopathy, whose anterior cingulate cortexes are impaired. “We currently have no effective drugs to reduce violence in antisocial populations,” Keysers says, and figuring out how to increase such patients’ aversion to hurting others could be a powerful tool.

Keyword: Attention; Emotions
Link ID: 27101 - Posted: 03.07.2020

By Cindi May Music makes life better in so many ways. It elevates mood, reduces stress and eases pain. Music is heart-healthy, because it can lower blood pressure, reduce heart rate and decrease stress hormones in the blood. It also connects us with others and enhances social bonds. Music can even improve workout endurance and increase our enjoyment of challenging activities. The fact that music can make a difficult task more tolerable may be why students often choose to listen to it while doing their homework or studying for exams. But is listening to music the smart choice for students who want to optimize their learning? A new study by Manuel Gonzalez of Baruch College and John Aiello of Rutgers University suggests that for some students, listening to music is indeed a wise strategy, but for others, it is not. The effect of music on cognitive functioning appears not to be “one-size-fits-all” but to instead depend, in part, on your personality—specifically, on your need for external stimulation. People with a high requirement for such stimulation tend to get bored easily and to seek out external input. Those individuals often do worse, paradoxically, when listening to music while engaging in a mental task. People with a low need for external stimulation, on the other hand, tend to improve their mental performance with music. But other factors play a role as well. Gonzalez and Aiello took a fairly sophisticated approach to understanding the influence of music on intellectual performance, assessing not only listener personality but also manipulating the difficulty of the task and the complexity of the music. Whether students experience a perk or a penalty from music depends on the interplay of the personality of the learner, the mental task, and the music. © 2020 Scientific American
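The article reports the pattern but not a formal model; the toy linear model below, with invented coefficients, merely illustrates the kind of crossover interaction described: music helps listeners with a low need for external stimulation and penalizes those with a high need.

```python
def predicted_performance(need_for_stimulation: float, music_on: bool) -> float:
    """Toy model: performance = main effect of music + music x personality interaction.

    need_for_stimulation is standardized (0 = average, +1 = one SD above average).
    Coefficients are invented for illustration, not estimated from the study.
    """
    music = 1.0 if music_on else 0.0
    main_effect = 0.2 * music                          # small average benefit of music
    interaction = -0.5 * music * need_for_stimulation  # benefit reverses for high-need listeners
    return main_effect + interaction

for need in (-1.0, 0.0, 1.0):
    with_music = predicted_performance(need, music_on=True)
    in_silence = predicted_performance(need, music_on=False)
    print(f"need={need:+.0f}: music {with_music:+.1f} vs. silence {in_silence:+.1f}")
```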

Keyword: Learning & Memory; Attention
Link ID: 27093 - Posted: 03.05.2020

By Virginia Morell Whether it’s calculating your risk of catching the new coronavirus or gauging the chance of rain on your upcoming beach vacation, you use a mix of statistical, physical, and social information to make a decision. So do New Zealand parrots known as keas, scientists report today. It’s the first time this cognitive ability has been demonstrated outside of apes, and it may have implications for understanding how intelligence evolved. “It’s a neat study,” says Karl Berg, an ornithologist and parrot expert at the University of Texas Rio Grande Valley, Brownsville, who was not involved with this research. Keas already had a reputation in New Zealand—and it wasn’t a great one. The olive-brown, crow-size birds can wield their curved beaks like knives—and did so on early settlers’ sheep, slicing through wool and muscle to reach the fat along their spines. These days, they’re notorious for slashing through backpacks for food and ripping windshield wipers off cars. To see whether keas’ intelligence extended beyond being mischievous, Amalia Bastos, a doctoral candidate in comparative psychology at the University of Auckland, and colleagues turned to six captive keas at a wildlife reserve near Christchurch, New Zealand. The researchers taught the birds that a black token always led to a tasty food pellet, whereas an orange one never did. When the scientists placed two transparent jars containing a mix of tokens next to the keas and removed a token with a closed hand, the birds were more likely to pick hands dipped into jars that contained more black than orange tokens, even when the ratio was as close as 63 to 57. That experiment, combined with other tests, provides “conclusive evidence” that keas are capable of “true statistical inference,” the scientists report in today’s issue of Nature Communications. © 2020 American Association for the Advancement of Science
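The statistical inference credited to the keas reduces to comparing the probability of drawing a rewarding black token from each jar. A minimal sketch (jar A uses the 63:57 ratio quoted above; jar B is a hypothetical mirror-image jar added for contrast):

```python
def p_black(black: int, orange: int) -> float:
    """Probability that a token sampled at random from a jar is black."""
    return black / (black + orange)

jar_a = {"black": 63, "orange": 57}  # ratio quoted in the article
jar_b = {"black": 57, "orange": 63}  # hypothetical mirror-image jar

pa = p_black(**jar_a)
pb = p_black(**jar_b)

# A statistically rational chooser prefers the hand dipped into the jar
# with the higher proportion of black tokens, however slim the margin.
print(f"P(black | jar A) = {pa:.3f}")  # 0.525
print(f"P(black | jar B) = {pb:.3f}")  # 0.475
print("prefer:", "jar A" if pa > pb else "jar B")
```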

Keyword: Evolution; Attention
Link ID: 27092 - Posted: 03.04.2020

Differences associated with learning difficulties are found less in specific areas of the brain and more in the connections between them, experts say. After scanning 479 children's brains, Cambridge University researchers found that the brains were organised around multiple "hubs". Children with no difficulties - or very specific ones, such as poor listening skills - had well-connected hubs. But those with widespread and severe difficulties - 14-30% of all children - were found to have poor connections. It was recently suggested schools were failing to spot ADHD and autism, which could be contributing to a rise in exclusions. Dr Duncan Astle told BBC News: "We have spent decades searching for the brain areas for different types of developmental difficulty such as ADHD and dyslexia. Our findings show that something which is far more important is the way a child's brain is organised - in particular, the role that highly connected 'hub' regions play. This has not been shown before and its implications for our scientific understanding of developmental difficulties are big. How do these hubs emerge over developmental time? What environmental and genetic factors can influence this emergence? Another key finding is that the diagnostic labels children had been given were not closely related to their cognitive difficulties - for example, two children with ADHD [attention deficit hyperactivity disorder] could be very different from each other. This has been well known in practice for a long time but poorly documented in the scientific literature." © 2020 BBC

Keyword: ADHD; Dyslexia
Link ID: 27080 - Posted: 02.28.2020

Jordana Cepelewicz Decisions, decisions. All of us are constantly faced with conscious and unconscious choices. Not just about what to wear, what to eat or how to spend a weekend, but about which hand to use when picking up a pencil, or whether to shift our weight in a chair. To make even trivial decisions, our brains sift through a pile of “what ifs” and weigh the hypotheticals. Even for choices that seem automatic — jumping out of the way of a speeding car, for instance — the brain can very quickly extrapolate from past experiences to make predictions and guide behavior. In a paper published last month in Cell, a team of researchers in California peered into the brains of rats on the cusp of making a decision and watched their neurons rapidly play out the competing choices available to them. The mechanism they described might underlie not just decision-making, but also animals’ ability to envision more abstract possibilities — something akin to imagination. The group, led by the neuroscientist Loren Frank of the University of California, San Francisco, investigated the activity of cells in the hippocampus, the seahorse-shaped brain region known to play crucial roles both in navigation and in the storage and retrieval of memories. They gave extra attention to neurons called place cells, nicknamed “the brain’s GPS” because they mentally map an animal’s location as it moves through space. Place cells have been shown to fire very rapidly in particular sequences as an animal moves through its environment. The activity corresponds to a sweep in position from just behind the animal to just ahead of it. (Studies have demonstrated that these forward sweeps also contain information about the locations of goals or rewards.) These patterns of neural activity, called theta cycles, repeat roughly eight times per second in rats and represent a constantly updated virtual trajectory for the animals. All Rights Reserved © 2020

Keyword: Attention; Learning & Memory
Link ID: 27070 - Posted: 02.25.2020

By Abby Sher The rules were simple. Whenever Madonna sang, we strutted our stuff up and down the matted blue carpet. If the music stopped, we struck a pose in front of the full-length mirror. “Your face is crooked!” my friend Diana shrieked. “Your legs are 10 feet long!” I yelled back. It wasn’t an insult; it was true. The mirror in my bedroom was old and warped, like in a fun house. We spent hours in front of it, jutting out our hips and crossing our eyes, laughing at how ugly we looked. How round and pointy, long and short we could be, all at the same time. I don’t know exactly when it became painful for me to look at my reflection. Maybe when I was told to cover the mirrors in our house for my father’s funeral (a Jewish tradition). I was 11 at the time and couldn’t understand how these pale lips and string bean legs of mine were here, while my dad was forever gone. So I kept staring at my body in that glass, feeling a new kind of grief and confusion rip through me. A few weeks later, I started junior high, where looks were everything. I used a mirror so I could run turquoise eyeliner across my lids or zero in on a blooming pimple. But I got more and more frustrated by what I saw. My splotchy skin and bushy eyebrows felt untamable; my arms too long. By high school, I grew out my frizzy bangs to hide my face and wore baggy overalls with a tiny cowbell around my neck, as if I were lost in the fields and needed to find my way home. It wasn’t until after college that I dove headlong into an eating disorder. There was no definitive moment when I said, I’m going to try starving myself today. Instead it was a gradual whittling away at my body. I became obsessed with shrinking myself down to a size 0, spending hours at the gym until I was dizzy and frantic, fueling myself on coffee and sugarless gum. © 2020 The New York Times Company

Keyword: Anorexia & Bulimia; Attention
Link ID: 27068 - Posted: 02.25.2020

By Sara Reardon To many people’s eyes, artist Mark Rothko’s enormous paintings are little more than swaths of color. Yet a Rothko can fetch nearly $100 million. Meanwhile, Pablo Picasso’s warped faces fascinate some viewers and terrify others. Why do our perceptions of beauty differ so widely? The answer may lie in our brain networks. Researchers have now developed an algorithm that can predict art preferences by analyzing how a person’s brain breaks down visual information and decides whether a painting is “good.” The findings show for the first time how intrinsic features of a painting combine with human judgment to give art value in our minds. Most people—including researchers—consider art preferences to be all over the map, says Anjan Chatterjee, a neurologist and cognitive neuroscientist at the University of Pennsylvania who was not involved in the study. Many preferences are rooted in biology—sugary foods, for instance, help us survive. And people tend to share similar standards of beauty when it comes to human faces and landscapes. But when it comes to art, “There are relatively arbitrary things we seem to care about and value,” Chatterjee says. To figure out how the brain forms value judgments about art, computational neuroscientist Kiyohito Iigaya and his colleagues at the California Institute of Technology first asked more than 1,300 volunteers on the crowdsourcing website Amazon Mechanical Turk to rate a selection of 825 paintings from four Western genres: impressionism, cubism, abstract art, and color field painting. Volunteers were all over the age of 18, but researchers didn’t specify their familiarity with art or their ethnic or national origin. © 2020 American Association for the Advancement of Science
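The article doesn't spell out the model, so the sketch below is only a guess at the general shape of the approach: regress ratings on per-painting visual features. The data and feature dimensions here are synthetic, and the Caltech study's actual features and estimator may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the dataset: 825 paintings, each described by a
# handful of hypothetical visual features (hue, saturation, contrast, ...).
n_paintings, n_features = 825, 8
X = rng.normal(size=(n_paintings, n_features))
true_w = rng.normal(size=n_features)
ratings = X @ true_w + rng.normal(scale=0.5, size=n_paintings)

# Ridge regression, closed form: w = (X'X + lam*I)^(-1) X'y
lam = 1.0
w_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ ratings)

predicted = X @ w_hat
r = np.corrcoef(predicted, ratings)[0, 1]
print(f"fit between predicted and observed ratings: r = {r:.2f}")
```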

Keyword: Vision; Attention
Link ID: 27062 - Posted: 02.21.2020

By Richard Klasco, M.D. A. The theory of the “sugar high” has been debunked, yet the myth persists. The notion that sugar might make children behave badly first appeared in the medical literature in 1922. But the idea did not capture the public’s imagination until Dr. Ben Feingold’s best-selling book, “Why Your Child Is Hyperactive,” was published in 1975. In his book, Dr. Feingold describes the case of a boy who might well be “patient zero” for the putative connection between sugar and hyperactivity: [The mother’s] fair-haired, wiry son loved soft drinks, candy and cake — not exactly abnormal for any healthy child. He also seemed to go completely wild after birthday parties and during family gatherings around holidays. In the mid-’70s, stimulant drugs such as Ritalin and amphetamine were becoming popular for the treatment of attention deficit hyperactivity disorder. For parents who were concerned about drug side effects, the possibility of controlling hyperactivity by eliminating sugar proved to be an enticing, almost irresistible, prospect. Some studies supported the theory. They suggested that high sugar diets caused spikes in insulin secretion, which triggered adrenaline production and hyperactivity. But the data were weak and were soon questioned by other scientists. An extraordinarily rigorous study settled the question in 1994. Writing in the New England Journal of Medicine, a group of scientists tested normal preschoolers and children whose parents described them as being sensitive to sugar. Neither the parents, the children nor the research staff knew which of the children were getting sugary foods and which were getting a diet sweetened with aspartame and other artificial sweeteners. Urine was tested to verify compliance with the diets. Nine different measures of cognitive and behavioral performance were assessed, with measurements taken at five-second intervals. © 2020 The New York Times Company

Keyword: ADHD; Obesity
Link ID: 27060 - Posted: 02.21.2020

Maternal obesity may increase a child’s risk for attention-deficit hyperactivity disorder (ADHD), according to an analysis by researchers from the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), part of the National Institutes of Health. The researchers found that mothers — but not fathers — who were overweight or obese before pregnancy were more likely to report that their children had been diagnosed with ADHD or had symptoms of hyperactivity, inattentiveness or impulsiveness at ages 7 to 8 years old. Their study appears in The Journal of Pediatrics. The study team analyzed the NICHD Upstate KIDS Study, which recruited mothers of young infants and followed the children through age 8 years. In this analysis of nearly 2,000 children, the study team found that women who were obese before pregnancy were approximately twice as likely to report that their child had ADHD or symptoms of hyperactivity, inattention or impulsiveness, compared with women who were of normal weight before pregnancy. The authors suggest that, if their findings are confirmed by additional studies, healthcare providers may want to screen children of obese mothers for ADHD so that they can be offered earlier interventions. The authors also note that healthcare providers could use evidence-based strategies to counsel women considering pregnancy on diet and lifestyle. Resources for plus-size pregnant women and their healthcare providers are available as part of NICHD’s Pregnancy for Every Body initiative.

Keyword: ADHD; Development of the Brain
Link ID: 27055 - Posted: 02.20.2020

By Laura Sanders SEATTLE — Live bits of brain look like any other piece of meat — pinkish, solid chunks of neural tissue. But unlike other kinds of tissue or organs donated for research, they hold the memories, thoughts and feelings of a person. “It is identified with who we are,” Karen Rommelfanger, a neuroethicist at Emory University in Atlanta, said February 13 in a news conference at the annual meeting of the American Association for the Advancement of Science. That uniqueness raises a whole new set of ethical quandaries when it comes to experimenting with living brain tissue, she explained. Such donations are crucial to emerging research aimed at teasing out answers to what makes us human. For instance, researchers at the Seattle-based Allen Institute for Brain Science conduct experiments on live brain tissue to get clues about how the cells in the human brain operate (SN: 8/7/19). These precious samples, normally discarded as medical waste, are donated by patients undergoing brain surgery and raced to the lab while the nerve cells are still viable. Other experiments rely on systems that are less sophisticated than a human brain, such as brain tissue from other animals and organoids. These clumps of neural tissue, grown from human stem cells, are still a long way from mimicking the complexities of the human brain (SN: 10/24/19). But with major advances, these systems might one day be capable of much more advanced behavior, which might ultimately lead to awareness, a conundrum that raises ethical issues. © Society for Science & the Public 2000–2020

Keyword: Consciousness; Emotions
Link ID: 27047 - Posted: 02.18.2020

By Tam Hunt Strangely, modern science was long dominated by the idea that to be scientific means to remove consciousness from our explanations, in order to be “objective.” This was the rationale behind behaviorism, a now-dead theory of psychology that took this trend to a perverse extreme. Behaviorists like John Watson and B.F. Skinner scrupulously avoided any discussion of what their human or animal subjects thought, intended or wanted, and focused instead entirely on behavior. They thought that because thoughts in other people’s heads, or in animals, are impossible to know with certainty, we should simply ignore them in our theories. We can only be truly scientific, they asserted, if we focus solely on what can be directly observed and measured: behavior. Erwin Schrödinger, one of the key architects of quantum mechanics in the early part of the 20th century, labeled this approach the “principle of objectivation” in his philosophical 1958 book Mind and Matter and expressed it clearly: “By [the principle of objectivation] I mean … a certain simplification which we adopt in order to master the infinitely intricate problem of nature. Without being aware of it and without being rigorously systematic about it, we exclude the Subject of Cognizance from the domain of nature that we endeavor to understand. We step with our own person back into the part of an onlooker who does not belong to the world, which by this very procedure becomes an objective world.” Schrödinger did, however, identify both the problem and the solution. He recognized that “objectivation” is just a simplification, a temporary step in the progress of science toward understanding nature. © 2020 Scientific American

Keyword: Consciousness
Link ID: 27044 - Posted: 02.18.2020