Most Recent Links

Follow us on Facebook or subscribe to our mailing list to receive news updates.


Links 3141 - 3160 of 29522

By Bradley Berman The day is approaching when commuters stuck in soul-crushing traffic will be freed from the drudgery of driving. Companies are investing billions to devise sensors and algorithms so motorists can turn their attention to where they like it these days: their phones. But before the great promise of multitasking on the road can be realized, we need to overcome an age-old problem: motion sickness. “The autonomous-vehicle community understands this is a real problem it has to deal with,” said Monica Jones, a transportation researcher at the University of Michigan. “That motivates me to be very systematic.” So starting in 2017, Ms. Jones led a series of studies in which more than 150 people were strapped into the front seat of a 2007 Honda Accord. They were wired with sensors and sent on a ride that included roughly 50 left-hand turns and other maneuvers. Each subject was tossed along the same twisty route for a second time but also asked to complete a set of 13 simple cognitive and visual tasks on an iPad Mini. About 11 percent of the riders got nauseated or, for other reasons, asked that the car be stopped. Four percent vomited. Ms. Jones takes no joy in documenting her subjects’ getting dizzy, hyperventilating or losing their lunch. She feels their pain. Ms. Jones, a chronic sufferer of motion sickness, has experienced those discomforts in car back seats all her life. “I don’t remember not experiencing it,” she said. “As I’m getting older, it’s getting worse.” It’s also getting worse for the legions of commuters hailing Ubers or taxis and hopping in, barely lifting their gaze from a screen in the process. © 2020 The New York Times Company

Keyword: Miscellaneous
Link ID: 26976 - Posted: 01.21.2020

Edward Bullmore Unlikely as it may seem, #inflammation has become a hashtag. It seems to be everywhere suddenly, up to all sorts of tricks. Rather than simply being on our side, fighting infections and healing wounds, it turns out to have a dark side as well: the role it plays in causing us harm. It’s now clear that inflammation is part of the problem in many, if not all, diseases of the body. And targeting immune or inflammatory causes of disease has led to a series of breakthroughs, from new treatments for rheumatoid arthritis and other auto-immune diseases in the 1990s, through to the advent of immunotherapy for some cancers in the 2010s. Even more pervasively, low-grade inflammation, detectable only by blood tests, is increasingly considered to be part of the reason why common life experiences such as poverty, stress, obesity or ageing are bad for public health. The brain is rapidly emerging as one of the new frontiers for inflammation. Doctors like myself, who went to medical school in the 20th century, were taught to think that there was an impermeable barrier between the brain and the immune system. In the 21st century, however, it has become clear that they are deeply interconnected and talk to each other all the time. Medical minds are now opening up to the idea that inflammation could be as widely and deeply implicated in brain and mind disorders as it is in bodily disorders. Advances in treatment of multiple sclerosis have shown the way. Many of the new medicines for MS were designed and proven to protect patients from brain damage caused by their own immune systems. The reasonably well-informed hope – and I emphasise those words at this stage – is that targeting brain inflammation could lead to breakthroughs in prevention and treatment of depression, dementia and psychosis on a par with the proven impact of immunological medicines for arthritis, cancer and MS.
Indeed, a drug originally licensed for multiple sclerosis is already being tried as a possible immune treatment for schizophrenia. © 2020 Guardian News & Media Limited

Keyword: Alzheimers; Neuroimmunology
Link ID: 26975 - Posted: 01.21.2020

By Tina Hesman Saey Some hairy cells in the nose may trigger sneezing and allergies to dust mites, mold and other substances, new work with mice suggests. When exposed to allergens, these “brush cells” make chemicals that lead to inflammation, researchers report January 17 in Science Immunology. Only immune cells previously were thought to make such inflammatory chemicals — fatty compounds known as lipids. The findings may provide new clues about how people develop allergies. Brush cells are shaped like teardrops topped by tufts of hairlike projections. In people, mice and other animals, these cells are also found in the linings of the trachea and the intestines, where they are known as tuft cells (SN: 4/13/18). However, brush cells are far more common in the nose than in other tissues, and may help the body identify when pathogens or noxious chemicals have been inhaled, says Lora Bankova, an allergist and immunologist at Brigham and Women’s Hospital in Boston. Bankova and her colleagues discovered that, when exposed to certain molds or dust mite proteins, brush cells in mice’s noses churn out inflammation-producing lipids, called cysteinyl leukotrienes. The cells also made the lipids when encountering ATP, a chemical used by cells for energy that also signals when nearby cells are damaged, as in an infection. Mice exposed to allergens or ATP developed swelling of their nasal tissues. But mice that lacked brush cells suffered much less inflammation. Such inflammation may lead to allergies in some cases. The researchers haven’t yet confirmed that brush cells in human noses respond to allergens in the same way as these cells do in mice. © Society for Science & the Public 2000–2020

Keyword: Chemical Senses (Smell & Taste); Neuroimmunology
Link ID: 26974 - Posted: 01.21.2020

By Harry Guinness The world isn’t made for night owls. You struggle into work in the dark hours before 10 a.m. — or your morning coffee — and you’re greeted by some chipper person who has already been to the gym and is six items into his to-do list. I used to fantasize about fitting punishments for such morning people, but in the last two years I’ve seen the (morning) light, and I’ve become one of them. If you love staying up late but hate crawling through your mornings in a haze, here’s how you can do it too. After a long, draining day you finally get home, settle down in front of the TV and throw on whatever season you’re currently bingeing. Heaven. But then, when a reasonable bedtime rolls around, you don’t want to stop. It has been a hard day, aren’t you entitled to just one more episode? So you push play, trade a bit of sleep for more Netflix time and continue the cycle that keeps you tired all the time. Dr. Alex Dimitriu, founder of the Menlo Park Psychiatry and Sleep Medicine clinic, explained it like this: “Long days leave us tired and exhausted, but the reality is, our days would be less hard, and less exhausting, if we weren’t so tired through them. The trouble with being a night owl is that your sleep gets clipped in the morning hours, where most of the precious REM or dream sleep occurs. Instead of sleeping seven or eight hours per night, most night owls get forced to sleep five or six — with a hard start time in the morning.” Dr. Dimitriu can’t stress enough just how important REM sleep is. It’s “the key to our emotional and creative energy” and comparable to “self-therapy,” he said, adding that it “balances us out in more ways than I can describe” and that without enough of it, our memory and moods take a hit. If you have the freedom to wake up when you like, then things are different, but if that extra Netflix episode is forcing you to cut your sleep short, then you should try to do something about it. © 2020 The New York Times Company

Keyword: Biological Rhythms; Sleep
Link ID: 26973 - Posted: 01.21.2020

By Aaron E. Carroll Childhood obesity is a major public health problem, and has been for some time. Almost 20 percent of American children are affected by obesity, as well as about 40 percent of adults. Over all, this costs the United States around $150 billion in health care spending each year. Pediatricians like me, and many other health professionals, know it’s a problem, and yet we’ve been relatively unsuccessful in tackling it. About six years ago, some reports seemed to show that rates had stabilized in children and even decreased in those ages 2 to 5. Later studies showed this trend to be an illusion. If anything, things have gotten worse. Efforts to help can backfire. People on diets often gain weight. Although individual studies have pointed to potential interventions and solutions, these have not yet translated into actual improvements. Part of the problem may be flawed research. A recent paper in Pediatric Obesity provided a guide on how to do better. Its suggestions fall into five general themes. 1) When things look better, it’s critical to ask “compared to what?” In short, you need a control group. Over time, changes in behaviors or measurements often follow a pattern known as regression toward the mean. Outliers (in this case those who are more overweight) tend to move toward the average. Thus, interventions might look as if they’re working when they’re not. Control groups — participants who don’t receive the intervention — can help ensure that we’re seeing real effectiveness. Even then, things can get tricky. In a randomized controlled trial, it’s important to keep the comparisons directly between the intervention and control groups. A common mistake is comparing each group after the intervention with the same group before the intervention. In other words, people could compare a dieting group to itself, before and after, and compare the control group to itself, before and after, to see if the intervention appeared to work. © 2020 The New York Times Company
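The regression-toward-the-mean pitfall described above can be illustrated with a small simulation (hypothetical numbers, not data from the Pediatric Obesity paper): children enrolled because their baseline measurement is an outlier drift back toward the average even with no intervention at all, so an uncontrolled before/after comparison looks like success.

```python
import random

random.seed(1)

# Each child's measured score = stable true value + measurement noise.
# (Purely illustrative values; no real units intended.)
def measure(true_value):
    return true_value + random.gauss(0, 0.5)

true_scores = [random.gauss(0, 1) for _ in range(10_000)]
baseline = [measure(t) for t in true_scores]

# Enroll the apparent outliers: anyone whose baseline measurement is high.
enrolled = [t for t, b in zip(true_scores, baseline) if b > 1.5]
before = [b for b in baseline if b > 1.5]

# Re-measure the enrolled group later, with NO intervention whatsoever.
after = [measure(t) for t in enrolled]

mean_before = sum(before) / len(before)
mean_after = sum(after) / len(after)

# The group looks "improved" even though nothing was done,
# which is why comparison against an untreated control group matters.
print(f"mean at enrollment: {mean_before:.2f}")
print(f"mean at follow-up:  {mean_after:.2f}")
```

Running this shows the follow-up mean falling below the enrollment mean purely because selection on a noisy measurement picks up children whose noise happened to push them high; the fresh measurement has no such bias.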

Keyword: Obesity
Link ID: 26972 - Posted: 01.20.2020

By Donna Jackson Nakazawa More than a decade ago, I was diagnosed with a string of autoimmune diseases, one after another, including a bone marrow disorder, thyroiditis, and then Guillain-Barré syndrome, which left me paralyzed while raising two young children. I recovered from Guillain-Barré only to relapse, becoming paralyzed again. My immune system was repeatedly and mistakenly attacking my body, causing the nerves in my arms and legs, and those I needed in order to swallow, to stop communicating with my brain, leaving me confined to — and raising my children from — bed. As I slowly began to recover and learn to walk again, I noticed that along with residual physical losses I had experienced shifts in my mood and clarity of mind. Although I’d always been an optimistic person, I felt a bleak unshakable dread, which didn’t feel like the “old me.” I also noticed cognitive glitches. Names, words, and facts were hard to bring to mind. I can still recall cutting up slices of watermelon, putting them in a bowl, and staring down at them thinking, “What is this again?” I knew the word but couldn’t remember it. I covered my lapse by bringing the bowl to the table and waiting for my children to call out, “Yay! Watermelon!” And I thought, “Yes. Of course. Watermelon.” As a science journalist whose niche spans neuroscience, immunology, and human emotion, I knew at the time that it didn’t make scientific sense that inflammation in the body could be connected to — much less cause — illness in the brain. At that time, scientific dogma held that the brain was the only organ in the body not ruled by the immune system. The brain was considered to be “immune privileged.” © 2020 STAT

Keyword: Glia; Alzheimers
Link ID: 26971 - Posted: 01.20.2020

Jennifer Rankin in Brussels A pioneering Belgian neurologist has been awarded €1m to fund further work in helping diagnose the most severe brain injuries, as he seeks to battle “the silent epidemic” and help people written off as “vegetative” who, it is believed, will never recover. Steven Laureys, head of the coma science group at Liège University hospital, plans to use the £850,000 award – larger than the Nobel prize – to improve the diagnosis of coma survivors labelled as being in a “persistent vegetative state”. That is “a horrible term” he says, although still one widely used by the general public and many clinicians. Laureys, who has spent more than two decades exploring the boundaries of human consciousness, prefers the term “unresponsive wakefulness” to describe people who are unconscious but show signs of being awake, such as opening their eyes or moving. These patients are often wrongly described as being in a coma, a condition that only lasts a few weeks, in which people are completely unresponsive. “The old view was to consider consciousness, which was one of the biggest mysteries for science to solve, as all or nothing,” he told the Guardian, shortly after he was awarded the Generet prize by Belgium’s King Baudouin Foundation this week. He said that a third of patients he treats at the Liège coma centre had been wrongly diagnosed as being in a vegetative state, despite signs of consciousness. As a young doctor in the 1990s he was frustrated by the questions that torture the families of coma survivors: can their loved ones see or hear them? Can they feel anything, including pain? Laureys and his 30-strong team of engineers and clinicians have shown that some of those with a “vegetative state” diagnosis are minimally conscious, showing signs of awareness such as responding to commands with their eyes. © 2020 Guardian News & Media Limited

Keyword: Consciousness
Link ID: 26970 - Posted: 01.20.2020

Hannah Devlin Science correspondent The death in 2002 of the former England and West Bromwich Albion striker Jeff Astle from degenerative brain disease placed the spotlight firmly on the possibility of a link between heading footballs and the risk of dementia. The coroner at the inquest ruled that Astle, 59, died from an “industrial disease” brought on by the repeated trauma of headers, and a later examination of Astle’s brain appeared to bear out this conclusion. At that time there was sparse scientific data on the issue, but since then the balance of evidence has steadily tipped further in favour of a link. It has been shown that even single episodes of concussion can have lifelong consequences. A 2016 study based on health records of more than 100,000 people in Sweden found that after a single diagnosed concussion people were more likely to have mental health problems and less likely to graduate from high school and college. Other research has shown that people in prison or homeless are more likely to have had a past experience of concussion. In 2017, researchers from University College London examined postmortem the brains of six former footballers who had developed dementia. They found signs of brain injury called chronic traumatic encephalopathy (CTE) in four cases. Last year a study by a team at Glasgow University found that former professional footballers were three and a half times more likely to die from dementia and other serious neurological diseases. The study was the largest ever, based on the health records of 7,676 ex-players and 23,000 members of the public, and was possibly the trigger for the Scottish FA’s plan to follow US soccer in banning heading the ball for young players. © 2020 Guardian News & Media Limited

Keyword: Brain Injury/Concussion
Link ID: 26969 - Posted: 01.17.2020

By Betsy Mason Despite weighing less than half an ounce, mountain chickadees are able to survive harsh winters complete with subzero temperatures, howling winds and heavy snowfall. How do they do it? By spending the fall hiding as many as 80,000 individual seeds, which they then retrieve — by memory — during the winter. Their astounding ability to keep track of that many locations puts their memory among the most impressive in the animal kingdom. It also makes chickadees an intriguing subject for animal behavior researchers. Cognitive ecologist Vladimir Pravosudov of the University of Nevada, Reno, has dedicated his career to studying this tough little bird’s amazing memory. Writing in 2013 on the cognitive ecology of food caching in the Annual Review of Ecology, Evolution, and Systematics, he and coauthor Timothy Roth argued that answers to big questions about the evolution of cognition may lie in the brains of these little birds. In July, at a meeting of the Animal Behavior Society in Chicago, Pravosudov presented his group’s latest research on the wild chickadees that live in the Sierra Nevada mountains. He and his graduate students were able to show for the first time that an individual bird’s spatial memory has a direct impact on its survival. The team did this by building an experimental contraption that uses radio-frequency identification (RFID) technology and electronic leg bands to test individual birds’ memory in the wild and then track their longevity. The researchers found that the birds with the best memory were most likely to survive the winter.

What are some of the big ideas driving your work on chickadees?

If some species are smart, or not smart, the question is: Why? Cognitive ecologists like me are specifically trying to figure out which ecological factors may have shaped the evolution of these differences in cognition. In other words, the idea is to understand the ecological and evolutionary reasons for variation in cognition.
© 2020 Annual Reviews, Inc

Keyword: Learning & Memory
Link ID: 26968 - Posted: 01.17.2020

By Karen Weintraub Alzheimer's disease has long been characterized by the buildup of two distinct proteins in the brain: first beta-amyloid, which accumulates in clumps, or plaques, and then tau, which forms toxic tangles that lead to cell death. But how beta-amyloid leads to the devastation of tau has never been precisely clear. Now a new study at the University of Alabama at Birmingham appears to describe that missing mechanism. The study details a cascade of events. Buildup of beta-amyloid activates a receptor that responds to a brain chemical called norepinephrine, which is commonly known for mobilizing the brain and body for action. Activation of this receptor by both beta-amyloid and norepinephrine boosts the activity of an enzyme that activates tau and increases the vulnerability of brain cells to it, according to the study, published in Science Translational Medicine. Essentially, beta-amyloid hijacks the norepinephrine pathway to trigger a toxic buildup of tau, says Qin Wang, the study’s senior author and a professor of neuropharmacology in the department of cell, developmental and integrative biology at the University of Alabama at Birmingham. “We really show that this norepinephrine is a missing piece of this whole Alzheimer’s disease puzzle,” she says. This cascade explains why so many previous Alzheimer’s treatments have failed, Wang says. Most of the drugs developed in recent decades have targeted the elimination of beta-amyloid. But the new research suggests that norepinephrine amplifies the damage wrought by that protein. © 2020 Scientific American

Keyword: Alzheimers
Link ID: 26967 - Posted: 01.17.2020

Ashley Yeager About four years ago, pathologist Matthew Anderson was examining slices of postmortem brain tissue from an individual with autism under a microscope when he noticed something extremely odd: T cells swarming around a narrow space between blood vessels and neural tissue. The cells were somehow getting through the blood-brain barrier, a wall of cells that separates circulating blood from extracellular fluid, neurons, and other cell types in the central nervous system, explains Anderson, who works at Beth Israel Deaconess Medical Center in Boston. “I just have seen so many brains that I know that this is not normal.” He soon identified more T-cell swarms, called lymphocytic cuffs, in a few other postmortem brains of people who had been diagnosed with autism. Not long after that, he started to detect another oddity in the brain tissue—tiny bubbles, or blebs. “I’d never seen them in any other brain tissue that I’ve looked at for many, many different diseases,” he says. Anderson began to wonder whether the neurological features he was observing were specific to autism. To test the idea, he and his colleagues examined postmortem brain tissue samples from 25 people with autism spectrum disorder (ASD) and 30 developmentally normal controls. While the lymphocytic cuffs only sporadically turned up in the brains of neurotypical individuals, the cuffs were abundant in a majority of the brains from individuals who had had ASD. Those same samples also had blebs that appeared in the same spots as the cuffs. Staining the brain tissue revealed that the cuffs were filled with an array of different types of T cells, while the blebs contained fragments of astrocytes, non-neuronal cells that support the physical structure of the brain and help to maintain the blood-brain barrier. © 1986–2020 The Scientist

Keyword: Autism; Neuroimmunology
Link ID: 26966 - Posted: 01.17.2020

Merrit Kennedy Smoking can be an easy habit to pick up and a hard one to quit. Here's the good news — there are decades of research on how to drop the habit. And we heard from hundreds of former smokers about how they did it. If you've tried to quit before, and it didn't work out, don't let that discourage you from trying again. It's common for quitting to take multiple attempts. "It's not a one time event. It is a process," says Gary Tedeschi, the clinical director at the California Smokers' Helpline. "And if I could say nothing else, I would say never ever stop trying to quit." We heard about a wide range of methods that helped people quit—and the truth is that no one method will work for everyone. But it's clear that having a roadmap for how you want to quit is going to boost your chances of succeeding. 1. You need a plan "A lot of smokers, when they are thinking about quitting, they sort of dive in without a plan," says Yvonne Prutzman, a scientist from the National Cancer Institute's Tobacco Control Research Branch. "And maybe the plan is to rely on willpower — but that makes it a lot harder for them," she adds. Your plan might be pretty personal. People quit many different ways — and reach the conclusion that they need to quit for very different reasons. For example, for Stacey Moore from Georgia, a serious health scare prompted her decision. "Just a couple of weeks ago I woke up with what I thought was a cancerous lump in my throat," she tells NPR. "It turned out to just be tonsillitis but it scared me enough that I knew I just had to stop, I just can't play this roulette game anymore." Others, like Greg Moulton from South Carolina, spent months or even years preparing to quit, slowly reducing the amount of nicotine they were taking in every day. "You slay the beast slowly and let it bleed to death on its own," he says. © 2020 npr

Keyword: Drug Abuse
Link ID: 26965 - Posted: 01.17.2020

Hannah Devlin Science correspondent A groundbreaking brain-scanning technique has uncovered evidence that suggests schizophrenia is linked to a loss of connections between brain cells. Scientists had previously suspected a breakdown in the connections between neurons played a role in the condition, based on postmortem studies. The latest research, the first to find evidence for this in the brains of living people, could pave the way for new and better treatment. Prof Oliver Howes from the MRC London Institute of Medical Sciences, Imperial College London and King’s College London, who led the study, said: “Our current treatments for schizophrenia target only one aspect of the disease: the psychotic symptoms. “But the debilitating cognitive symptoms, such as loss of abilities to plan and remember, often cause much more long-term disability and there’s no treatment for them at the moment.” Howes believes the loss of connections, known as synapses, between brain cells, could be responsible for this broader array of symptoms. The study, published in Nature Communications, focused on measuring a protein found in synapses called SV2A, which has been shown to be a good marker of the overall density of connections in the brain. They used a tracer that binds to the protein and which emits a signal that can be picked up by a PET brain scan, which provided an indirect measure of the density of connections. The team scanned 18 adults with schizophrenia and compared them with 18 people without the condition. They found that levels of SV2A were significantly lower in the front of the brain – the region involved in planning – in people with schizophrenia. © 2020 Guardian News & Media Limited

Keyword: Schizophrenia
Link ID: 26964 - Posted: 01.15.2020

By Laura Sanders A parasite common in cats can eliminate infected mice’s fear of felines — a brain hijack that leads to a potentially fatal attraction. But this cat-related boldness (SN: 9/18/13) isn’t the whole story. Once in the brain, the single-celled parasite Toxoplasma gondii makes mice reckless in all sorts of dangerous scenarios, researchers write January 14 in Cell Reports. Infected mice spent more time in areas that were out in the open, exposed places that uninfected mice usually avoid. Infected mice also prodded an experimenter’s hand inside a cage — an intrusion that drove uninfected mice to the other side of the cage. T. gondii–infected mice were even unfazed by an anesthetized rat, a mouse predator, the researchers from the University of Geneva and colleagues found. And infected mice spent more time than uninfected mice exploring the scents of foxes and relatively harmless guinea pigs. The extent of mice’s infections, measured by the load of parasite cysts in the brain, seemed to track with the behavior changes, the researchers report.

Toxoplasma gondii, tweaked to glow green, was isolated from the brain of an infected mouse. (Image: Pierre-Mehdi Hammoudi, Damien Jacot)

The parasite needs to get into the guts of cats to sexually reproduce. Other animals can become infected by ingesting T. gondii through direct or indirect contact with cat feces. The parasite can then spread throughout the body and ultimately form cysts in the brain. People can become infected with T. gondii, though usually not as severely as mice. Some studies have hinted, however, at links between the parasite and human behaviors such as inattention and suicide, as well as mental disorders such as schizophrenia. © Society for Science & the Public 2000–2020

Keyword: Emotions
Link ID: 26963 - Posted: 01.15.2020

By Tom Siegfried Long before Apple watches, grandfather clocks or even sundials, nature provided living things with a way to tell time. Life evolved on a rotating world that delivered alternating light and darkness on a 24-hour cycle. Over time, cellular chemistry tuned itself to that rhythm. Today, circadian rhythms — governed by a master timekeeper in the brain — guide sleeping schedules and mealtimes and influence everything from diet to depression to the risk of cancer. While an Apple watch can monitor a few vital functions such as your heart rate, your body’s natural clock controls or affects nearly all of them. Lately, research by Takahashi and others has suggested strategies for manipulating the body’s clock to correct circadian-controlled chemistry when it goes awry. Such circadian interventions could lead to relief for shift workers, antidotes for jet lag, and novel treatments for mood disorders and obesity, not to mention the prospect of counteracting aging. Prime weapons for the assault on clock-related maladies, Takahashi believes, can be recruited from an arsenal of small molecules, including some existing medical drugs. “Researchers are increasingly interested in developing small molecules to target the circadian system directly for therapeutic gains,” Takahashi and coauthors Zheng Chen and Seung-Hee Yoo wrote in the 2018 Annual Review of Pharmacology and Toxicology. In sophisticated life-forms (such as mammals), central control of the body’s clock resides in a small cluster of nerve cells within the brain’s hypothalamus. That cluster, called the suprachiasmatic nucleus — SCN for short — is tuned to the day-night signal by light transmitted via the eyes and the optic nerve. But the SCN does not do the job alone. It’s the master clock, for sure, but satellite timekeepers operate in all kinds of cells and body tissues. © 2020 Annual Reviews, Inc

Keyword: Biological Rhythms
Link ID: 26962 - Posted: 01.15.2020

Matthew Schafer and Daniela Schiller How do animals, from rats to humans, intuit shortcuts when moving from one place to another? Scientists have discovered mental maps in the brain that help animals picture the best routes from an internalized model of their environments. Physical space is not all that is tracked by the brain's mapmaking capacities. Cognitive models of the environment may be vital to mental processes, including memory, imagination, making inferences and engaging in abstract reasoning. Most intriguing is the emerging evidence that maps may be involved in tracking the dynamics of social relationships: how distant or close individuals are to one another and where they reside within group hierarchies. We are often told that there are no shortcuts in life. But the brain—even the brain of a rat—is wired in a way that completely ignores this kind of advice. The organ, in fact, epitomizes a shortcut-finding machine. The first indication that the brain has a knack for finding alternative routes was described in 1948 by Edward Tolman of the University of California, Berkeley. Tolman performed a curious experiment in which a hungry rat ran across an unpainted circular table into a dark, narrow corridor. The rat turned left, then right, and then took another right and scurried to the far end of a well-lit narrow strip, where, finally, a cup of food awaited. There were no choices to be made. The rat had to follow the one available winding path, and so it did, time and time again, for four days. On the fifth day, as the rat once again ran straight across the table into the corridor, it hit a wall—the path was blocked. The animal went back to the table and started looking for alternatives. Overnight, the circular table had turned into a sunburst arena. Instead of one track, there were now 18 radial paths to explore, all branching off from the sides of the table. 
After venturing out a few inches on a few different paths, the rat finally chose to run all the way down path number six, the one leading directly to the food. © 2020 Scientific American,

Keyword: Attention
Link ID: 26961 - Posted: 01.15.2020

By Eryn Brown On March 30, 1981, 25-year-old John W. Hinckley Jr. shot President Ronald Reagan and three other people. The following year, he went on trial for his crimes. Defense attorneys argued that Hinckley was insane, and they pointed to a trove of evidence to back their claim. Their client had a history of behavioral problems. He was obsessed with the actress Jodie Foster, and devised a plan to assassinate a president to impress her. He hounded Jimmy Carter. Then he targeted Reagan.

In a controversial courtroom twist, Hinckley’s defense team also introduced scientific evidence: a computerized axial tomography (CAT) scan that suggested their client had a “shrunken,” or atrophied, brain. Initially, the judge didn’t want to allow it. The scan didn’t prove that Hinckley had schizophrenia, experts said — but this sort of brain atrophy was more common among schizophrenics than among the general population. It helped convince the jury to find Hinckley not guilty by reason of insanity.

Nearly 40 years later, the neuroscience that influenced Hinckley’s trial has advanced by leaps and bounds — particularly because of improvements in magnetic resonance imaging (MRI) and the invention of functional magnetic resonance imaging (fMRI), which lets scientists look at blood flow and oxygenation in the brain without harming it. Today neuroscientists can see what happens in the brain when a subject recognizes a loved one, experiences failure, or feels pain.

Despite this explosion in neuroscience knowledge, and notwithstanding Hinckley’s successful defense, “neurolaw” hasn’t had a tremendous impact on the courts — yet. But it is coming. Attorneys working civil cases introduce brain imaging ever more routinely to argue that a client has or has not been injured. Criminal attorneys, too, sometimes argue that a brain condition mitigates a client’s responsibility. Lawyers and judges are participating in continuing education programs to learn about brain anatomy and what MRIs and EEGs and all those other brain tests actually show.

Keyword: Brain imaging; Aggression
Link ID: 26960 - Posted: 01.15.2020

By Gareth Cook One of science’s most challenging problems is a question that can be stated easily: Where does consciousness come from? In his new book Galileo’s Error: Foundations for a New Science of Consciousness, philosopher Philip Goff considers a radical perspective: What if consciousness is not something special that the brain does but is instead a quality inherent to all matter? It is a theory known as “panpsychism,” and Goff guides readers through the history of the idea, answers common objections (such as “That’s just crazy!”) and explains why he believes panpsychism represents the best path forward. He answered questions from Mind Matters editor Gareth Cook.

Can you explain, in simple terms, what you mean by panpsychism?

In our standard view of things, consciousness exists only in the brains of highly evolved organisms, and hence consciousness exists only in a tiny part of the universe and only in very recent history. According to panpsychism, in contrast, consciousness pervades the universe and is a fundamental feature of it. This doesn’t mean that literally everything is conscious. The basic commitment is that the fundamental constituents of reality—perhaps electrons and quarks—have incredibly simple forms of experience. And the very complex experience of the human or animal brain is somehow derived from the experience of the brain’s most basic parts.

It might be important to clarify what I mean by “consciousness,” as that word is actually quite ambiguous. Some people use it to mean something quite sophisticated, such as self-awareness or the capacity to reflect on one’s own existence. This is something we might be reluctant to ascribe to many nonhuman animals, never mind fundamental particles. But when I use the word consciousness, I simply mean experience: pleasure, pain, visual or auditory experience, et cetera. © 2020 Scientific American,

Keyword: Consciousness
Link ID: 26959 - Posted: 01.15.2020

By Joseph Stern, M.D. The bullet hole in the teenager’s forehead was so small, it belied the damage already done to his brain. The injury was fatal. We knew this the moment he arrived in the emergency room. Days later, his body was being kept alive in the intensive care unit despite an exam showing that he was brain-dead and no blood was flowing to his brain. Eventually, all his organs failed and his heart stopped beating. But the nurses continued to care for the boy and his family, knowing he was already dead but trying to help the family members with the agonizing process of accepting his death.

This scenario occurs all too frequently in the neurosurgical I.C.U. Doctors often delay the withdrawal of life-sustaining supports such as ventilators and IV drips, and nurses continue these treatments — adhering to protocols, yet feeling internal conflict. A lack of consensus or communication among doctors, nurses and families often makes these situations more difficult for all involved.

Brain death is stark and final. When the patient’s brain function has ceased, bodily death inevitably follows, no matter what we do. Continued interventions, painful as they may be, are necessarily of limited duration. We can keep a brain-dead patient’s body alive for a few days at the most before his heart stops for good.

Trickier and much more common is the middle ground of a neurologically devastating injury without brain death. Here, decisions can be more difficult, and electing to continue or to withdraw treatment much more problematic. Inconsistent communication and support between medical staff members and families plays a role.

A new field, neuropalliative care, seeks to focus “on outcomes important to patients and families” and “to guide and support patients and families through complex choices involving immense uncertainty and intensely important outcomes of mind and body.” © 2020 The New York Times Company

Keyword: Consciousness
Link ID: 26958 - Posted: 01.14.2020

By Brooke N. Dulka Glutamate, arguably the most important chemical in your nervous system, is older than the brain itself. From a single-cell bacterium, to mushrooms and plants, to you—every living thing on this planet relies on this tiny molecule for cellular communication. It is absolutely critical for everything we do.

“The function of most, if not all, of the trillions of cells in the brain are regulated by glutamate,” neuroscientist David Baker explains to me. On November 1, 2019, neuroscientists gathered at the Harley-Davidson Museum in Milwaukee, WI to share their science. The chrome-laden motorcycle in the corner of the room was hard to ignore, but it was the presentation of Baker, a professor at Marquette University, that really caught my attention. Baker has dedicated his career to understanding how glutamate can treat disorders of the brain. Specifically, his hopes for targeting glutamate lie in a mechanism called system xc-.

Glutamate is often called the “major excitatory neurotransmitter” within the brain. It is the brain’s “go” signal. Baker notes that glutamate receptors are found in every kind of brain cell, which means it is doing more than regulating the activity of neurons; it is regulating the brain’s support cells too. Glutamate is that widespread and important!

But being almost everywhere increases the chances that something, somewhere, could go wrong. Thus, most disorders of the brain involve some degree of glutamate dysfunction. This includes disorders such as schizophrenia, depression, obsessive-compulsive disorder, Alzheimer’s disease and more. While one might think that this awareness provides neuroscientists with critical insights into treating disorders of the brain, actually the opposite has occurred. In fact, most psychiatric drugs weren’t even discovered through systematic drug development, as one might expect. More often than not, the drugs we commonly use today were serendipitous findings or accidental discoveries.

Baker notes that almost none of the most commonly prescribed drugs for psychiatric disorders target glutamate. Given the importance of glutamate to nearly every brain function, there is a genuine, and well-reasoned, concern among both neuroscientists and psychiatrists that glutamatergic therapeutics will produce widespread impairments in the brain. © 2020 Scientific American

Keyword: Schizophrenia
Link ID: 26957 - Posted: 01.14.2020