Most Recent Links
Researchers have discovered a technique for directly reprogramming skin cells into the light-sensing rod photoreceptors used for vision. The lab-made rods enabled blind mice to detect light after the cells were transplanted into the animals’ eyes. The work, funded by the National Eye Institute (NEI), was published April 15 in Nature. The NEI is part of the National Institutes of Health. Until now, researchers have replaced dying photoreceptors in animal models by creating stem cells from skin or blood cells, programming those stem cells to become photoreceptors, and then transplanting them into the back of the eye. In the new study, scientists show that it is possible to skip the stem-cell intermediary and directly reprogram skin cells into photoreceptors for transplantation into the retina. “This is the first study to show that direct, chemical reprogramming can produce retinal-like cells, which gives us a new and faster strategy for developing therapies for age-related macular degeneration and other retinal disorders caused by the loss of photoreceptors,” said Anand Swaroop, Ph.D., senior investigator in the NEI Neurobiology, Neurodegeneration, and Repair Laboratory, which characterized the reprogrammed rod photoreceptor cells by gene expression analysis. “Of immediate benefit will be the ability to quickly develop disease models so we can study mechanisms of disease. The new strategy will also help us design better cell replacement approaches,” he said. Scientists have studied induced pluripotent stem (iPS) cells with intense interest over the past decade. iPS cells are developed in a lab from adult cells, rather than fetal tissue, and can be used to make nearly any type of replacement cell or tissue. But iPS cell reprogramming protocols can take six months before cells or tissues are ready for transplantation. By contrast, the direct reprogramming described in the current study coaxed skin cells into functional photoreceptors ready for transplantation in only 10 days. The researchers demonstrated their technique in mouse eyes, using both mouse- and human-derived skin cells.
Keyword: Vision; Stem Cells
Link ID: 27196 - Posted: 04.16.2020
Gregory Berns, M.D., Ph.D. There is no official census for dogs and cats, but in 2016, the American Veterinary Medical Association estimated that 59 percent of households in the United States had a pet. Although the numbers of dogs and cats remain debatable, dogs continue to gain in popularity, with 38 percent of households having at least one. Families with children are even more likely to have a dog (55 percent). With all due respect to cats, dogs have insinuated themselves into human society, forming deep emotional bonds with us and compelling us to feed and shelter them. Worldwide, the dog population is approaching one billion, the majority free-ranging. Even though many people are convinced they know what their dog is thinking, little is actually known about what is going on in dogs’ heads. This may be surprising because the field of experimental psychology had its birth with Pavlov and his salivating dogs. But as dogs gained traction as household pets, in many cases achieving the status of family members, their use as research subjects fell out of favor. In large part, this was a result of the Animal Welfare Act of 1966, which set standards for the treatment of animals in research and put an end to the practice of stealing pets for experimentation. How strange it is, then, that these creatures, whose nearest relatives are wolves, live with us and even share our beds, yet we know almost nothing about what they’re thinking. In the last decade or so, however, the situation has begun to change, and we are in the midst of a renaissance of canine cognitive science. Research labs have sprung up around the world, and dogs participate not as involuntary subjects, but as partners in scientific discovery. This new research is beginning to shed light on what it’s like to be a dog and the nature of the dog-human bond. © 2020 The Dana Foundation.
Keyword: Brain imaging; Evolution
Link ID: 27195 - Posted: 04.16.2020
Abby Olena Base editors, which convert one nucleotide to another without a double-strand DNA break, have the potential to treat diseases caused by mutant genes. One drawback, though, is that the DNA that encodes CRISPR base editors is long—too long to fit in the adeno-associated viruses (AAVs) most commonly used for gene therapy. In a study published in Molecular Therapy on January 13, researchers split the DNA encoding a base editor into two AAV vectors and injected them into a mouse model of inherited amyotrophic lateral sclerosis (ALS). The strategy disabled the disease-causing gene, improving the animals’ symptoms and prolonging their lives. “We’d like to be able to make gene editing tools that can fit inside an AAV vector. Unfortunately, some of the tools are so big that they can’t fit inside, so in this study, they were able to come up with a solution to that by using a split protein,” says David Segal, a biochemist at the University of California, Davis, who was not involved in the work. “It’s not the first time that that system has been used, but it’s the first time it’s been applied to this kind of base editor.” Pablo Perez-Pinera, a bioengineer at the University of Illinois at Urbana-Champaign, and colleagues developed a strategy to split the base editor into two chunks. In a study published in 2019, they generated two different AAV vectors, each containing a portion of the coding DNA for an adenine base editor. They also included sequences encoding so-called inteins—short peptides that, when expressed within proteins, stick together and cleave themselves out, a bit like introns in RNA. The researchers built the inteins into the vectors such that the inteins produced by the two vectors dimerized, bringing the two base editor halves together, and then excised themselves, leaving behind a full-length, functional base editor. © 1986–2020 The Scientist
Keyword: ALS-Lou Gehrig's Disease; Genes & Behavior
Link ID: 27194 - Posted: 04.15.2020
By Kenneth S. Kosik No fundamental obstacle prevents us from developing an effective treatment for Alzheimer's disease. Other troubles of human nature, such as violence, greed and intolerance, have a bewildering variety of daunting causes and uncertainties. But Alzheimer's, at its core, is a problem of cell biology whose solution should be well within our reach. There is a fairly good chance that the scientific community might already have an unrecognized treatment stored away in a laboratory freezer among numerous vials of chemicals. And major insights may now reside, waiting to be noticed, in big databases or registries of clinical records, neuropsychological profiles, brain-imaging studies, biological markers in blood and spinal fluid, genomes, protein analyses, neuron recordings, or animal and cell culture models. But we have missed those clues because for decades we have spent too much time chasing every glossy new finding in Alzheimer's research and too little time thinking deeply about the underlying biology of this ailment. Instead our work has been driven by a number of assumptions. Among those assumptions has been the central and dominant role of the protein fragment called beta-amyloid. A large amount of data supports the idea that beta-amyloid plays an important part in the disease. We have developed drugs that can reduce concentrations of the protein fragments in people with Alzheimer's, yet by and large they have not stopped patients' cognitive decline in any meaningful way. It now seems simplistic to conclude that eliminating or inhibiting beta-amyloid will cure or treat those suffering from the disease, especially without far deeper and more comprehensive knowledge of how it develops and progresses [see “The Amyloid Drug Struggle”]. We have not been barking up a completely wrong research tree, but our zeal has led us to ignore other trees and even the roots of this particular one. © 2020 Scientific American
Keyword: Alzheimers
Link ID: 27193 - Posted: 04.15.2020
According to a recent analysis of data from two major eye disease studies, adherence to the Mediterranean diet – high in vegetables, whole grains, fish, and olive oil – correlates with higher cognitive function. Dietary factors also seem to play a role in slowing cognitive decline. Researchers at the National Eye Institute (NEI), part of the National Institutes of Health, led the analysis of data from the Age-Related Eye Disease Study (AREDS) and AREDS2. They published their results today in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association. “We do not always pay attention to our diets. We need to explore how nutrition affects the brain and the eye,” said Emily Chew, M.D., director of the NEI Division of Epidemiology and Clinical Applications and lead author of the studies. The researchers examined the effects of nine components of the Mediterranean diet on cognition. The diet emphasizes consumption of whole fruits, vegetables, whole grains, nuts, legumes, fish, and olive oil, as well as reduced consumption of red meat and alcohol. Over the course of years, AREDS and AREDS2 assessed the effect of vitamins on age-related macular degeneration (AMD), which damages the light-sensitive retina. AREDS included about 4,000 participants with and without AMD, and AREDS2 included about 4,000 participants with AMD. The researchers assessed AREDS and AREDS2 participants for diet at the start of the studies. The AREDS study tested participants’ cognitive function at five years, while AREDS2 tested cognitive function in participants at baseline and again two, four, and 10 years later. The researchers evaluated cognitive function with standardized tests based on the Modified Mini-Mental State Examination, along with other tests. They assessed diet with a questionnaire that asked participants their average consumption of each Mediterranean diet component over the previous year.
Keyword: Alzheimers; Obesity
Link ID: 27192 - Posted: 04.15.2020
By Gary Stix Consumer genetic tests can sometimes result in a terrible surprise appearing in the same report that divulges whether one has a cilantro aversion or wet or dry earwax. Test takers may receive the devastating news that they have a version of a gene—apolipoprotein E epsilon 4 (APOE e4)—that greatly increases their chances of getting Alzheimer’s disease. The shock can be so great that some will seek solace in a support group to help them adjust to the possibility that they could run into cognitive problems beginning in their 50s or 60s. One thing that makes the information so difficult to absorb is that there is no certainty about it. A person with one copy of the APOE e4 gene is more than three times as likely to wind up with Alzheimer’s (one copy can be inherited from each parent). Two copies increase the risk by 10 times or more. APOE e4 may also reduce the age of the disease’s onset by up to a decade. Still, not everyone who is an APOE e4 carrier will ultimately receive a diagnosis for Alzheimer’s, the most common form of dementia. Given the ambiguities, scientists have long wondered whether other genes might counterbalance APOE e4's effects. A new paper may have found a candidate for just such a gene. An analysis across multiple studies—with results from more than 20,000 individuals—found that APOE e4 carriers between the ages of 60 and 80 who also had a particular variant of a gene called klotho (named for Clotho, one of the Greek Fates, who spins the thread of life) were 30 percent less likely to receive an Alzheimer's diagnosis than carriers without it. People in their late 70s with a single copy of the klotho variant were also less apt to experience the initial cognitive losses (mild cognitive impairments) that often precede an Alzheimer’s diagnosis. Study participants with the relevant variant also had reduced signs of the hallmark clumps of beta-amyloid protein that turn up in the brain before symptoms arise. © 2020 Scientific American
Keyword: Alzheimers; Genes & Behavior
Link ID: 27191 - Posted: 04.15.2020
Peter Rhys-Evans For the past 150 years, scientists and laypeople alike have accepted a “savanna” scenario of human evolution. The theory, primarily based on fossil evidence, suggests that because our ancestral ape family members were living in the trees of East African forests, and because we humans live on terra firma, our primate ancestors simply came down from the trees onto the grasslands and stood upright to see farther over the vegetation, increasing their efficiency as hunter-gatherers. In the late 19th century, anthropologists only had a few Neanderthal fossils to study, and science had very little knowledge of genetics and evolutionary changes. So this savanna theory of human evolution became ingrained in anthropological dogma and has remained the established explanation of early hominin evolution following the genetic split from our primate cousins 6 million to 7 million years ago. But in 1960, a different twist on human evolution emerged. That year, marine biologist Sir Alister Hardy wrote an article in New Scientist suggesting a possible aquatic phase in our evolution, noting Homo sapiens’s differences from other primates and similarities to other aquatic and semi-aquatic mammals. In 1967, zoologist Desmond Morris published The Naked Ape, which explored different theories about why modern humans lost their fur. Morris mentioned Hardy’s “aquatic ape” hypothesis as an “ingenious” theory that sufficiently explained “why we are so nimble in the water today and why our closest living relatives, the chimpanzees, are so helpless and quickly drown.” © 1986–2020 The Scientist
Keyword: Evolution
Link ID: 27190 - Posted: 04.15.2020
Our ability to study networks within the nervous system has been limited by the tools available to observe large volumes of cells at once. An ultra-fast, 3D imaging technique called SCAPE microscopy, developed through the National Institutes of Health (NIH)’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative, allows a greater volume of tissue to be viewed in a way that is much less damaging to delicate networks of living cells. In a study published in Science, researchers used SCAPE to watch for the first time how the mouse olfactory epithelium — the part of the nervous system that directly perceives smells — reacted in real time to complex odors. They found that those nerve cells may play a larger and more complex role in interpreting smells than was previously understood. “This is an elegant demonstration of the power of BRAIN Initiative technologies to provide new insights into how the brain decodes information to produce sensations, thoughts, and actions,” said Edmund Talley, Ph.D., program director, National Institute of Neurological Disorders and Stroke (NINDS), a part of NIH. The SCAPE microscope was developed in the laboratory of Elizabeth M.C. Hillman, Ph.D., professor of biomedical engineering and radiology and principal investigator at Columbia’s Zuckerman Institute in New York City. “SCAPE microscopy has been incredibly enabling for studies where large volumes need to be observed at once and in real time,” said Dr. Hillman. “Because the cells and tissues can be left intact and visualized at high speeds in three dimensions, we are able to explore many new questions that could not be studied previously.”
Keyword: Brain imaging; Chemical Senses (Smell & Taste)
Link ID: 27189 - Posted: 04.14.2020
by Alla Katsnelson Several regions in the outer layer of the brain are thicker in children and young adults with autism than in their typical peers, a new study finds. The differences are greatest in girls, in children aged 8 to 10 years, and in those with a low intelligence quotient (IQ)1. During typical development, the brain’s outer layer, called the cerebral cortex, thickens until about age 2 and then grows gradually thinner into adolescence as the brain matures. The new study, one of the largest to investigate cortical thickness in autism, aligns with others that indicate this trajectory differs in people with the condition. The findings suggest that brain structure does not change in a uniform way in autism, but instead varies with factors such as age, gender and IQ, says lead researcher Mallar Chakravarty, assistant professor of psychiatry at McGill University in Montreal, Canada. These variations could help explain the inconsistent findings about cortical thickness and autism seen in earlier studies that did not consider such factors, says Christine Wu Nordahl, associate professor of psychiatry and behavioral sciences at the University of California, Davis MIND Institute, who was not involved in the work. “I think this is the type of study we need to be doing as a field, more and more,” she says. The researchers began with unprocessed magnetic resonance imaging (MRI) brain scans of 3,145 participants from previous studies conducted at multiple institutions. © 2020 Simons Foundation
Keyword: Autism; Brain imaging
Link ID: 27188 - Posted: 04.14.2020
By Simon Makin Our recollection of events is usually not like a replay of digital video from a security camera—a passive observation that faithfully reconstructs the spatial and sensory details of everything that happened. More often memory segments what we experience into a string of discrete, connected events. For instance, you might remember that you went for a walk before lunch at a given time last week without recalling the soda bottle strewn on the sidewalk, the crow cawing in the oak tree in your yard or the chicken salad sandwich you ate upon your return. Your mind designates a mental basket for “walk” and a subsequent bin for “lunch” that, once accessed, make many of these finer details available. This arrangement raises the question of how the brain performs such categorization. A new study by neuroscientist Susumu Tonegawa of the Massachusetts Institute of Technology and his colleagues claims to have discovered the neural processing that makes this organization of memory into discrete units possible. The work has implications for understanding how humans generalize knowledge, and it could aid efforts to develop AI systems that learn faster. A brain region called the hippocampus is critical for memory formation and also seems to be involved in navigation. Neurons in the hippocampus called “place” cells selectively respond to being in specific locations, forming a cognitive map of the environment. Such spatial information is clearly important for “episodic” (autobiographical rather than factual) memory. But so, too, are other aspects of experience, such as changing sensory input. There is evidence that neurons in the hippocampus encode sensory changes by altering the frequency at which they fire, a phenomenon termed “rate remapping.” According to research by neuroscientist Loren Frank of the University of California, San Francisco, and his colleagues, such changes may also encode information about where an animal has been and where it is going, enabling rate remapping to represent trajectories of travel. © 2020 Scientific American
Keyword: Learning & Memory
Link ID: 27187 - Posted: 04.14.2020
Rebecca Schiller When behavioural scientist Dr Pragya Agarwal moved from Delhi to York more than 20 years ago, her first priority was to blend in. As a single parent, a woman of colour and an academic, she worked hard to “water down” the things that made her different from those around her. Yet the more she tried to fit in, the more Agarwal began to ask herself why humans appear programmed to create “in groups” and distrust those on the outside. “Unconscious bias has become a buzzword in recent years,” explains Agarwal. “We are all biased and, though some biases can be harmless, many aren’t.” These are the issues she unravels in her book Sway: Unravelling Unconscious Bias, and she confronts some uncomfortable truths along the way. Agarwal argues that humans aren’t naturally rational creatures, and with our brains constantly bombarded with information, we rely on cognitive short cuts: patterns of learned thinking based on what has worked for us in the past, the messages we receive from others and our evolutionary programming. “Cognitive short cuts evolved to help us survive,” she says. “The problem is that we still have these responses and they don’t work well in the modern world.” In our tribal past, the consequences of wrongly assuming that an outsider was peaceful or free from disease could be so damaging that being overcautious became a human evolutionary strategy. The result is the tendency to generalise: speedily assigning those around us to groups based on race, academic status, social class or gender and ignoring details that contradict our existing beliefs. Once we’ve placed a person in a box, Agarwal suggests we are more inclined to choose the dehumanising and dangerous approach of treating them according to the stereotypes we associate with that box rather than as an individual. It’s an experience the author has had herself. © 2020 Guardian News & Media Limited
Keyword: Attention
Link ID: 27186 - Posted: 04.14.2020
By Alexandra Horowitz Recently, in communities under quarantine or stay-at-home orders, residents have looked out their windows to find wild animals that usually stay on the fringes of the city or emerge only at night suddenly appearing in daylight in the middle of the street. The reason is us: Human activity disturbs animals. Even our presence — simply observing, as bird-watchers, or field biologists, or nature-loving hikers — changes their behavior. The ecologist Carl Safina (author of “Beyond Words” and “Song for the Blue Ocean”) is no agnostic observer. He sees humans as destroying the world for nonhuman animals, to say nothing of destroying the animals themselves, and would like us to stop, please. The question for him, and for anyone with this conviction, is: Short of quarantining the human race, what’s the best way to do this? Fifty years ago, the biologist Roger Payne first eavesdropped on a humpback whale community and heard whale song. He spread the word about their ethereal, beautiful forms of communication, and the world looked at whales differently. Since that time, whaling has sharply declined. Today, many advocates for animals appeal to species’ cognitive abilities to argue for their better treatment. They’re so smart or humanlike, the argument goes, we should be treating them better. Such is the vestige of the scala naturae that has awarded all lives a certain value — with humans on top, of course. © 2020 The New York Times Company
Keyword: Evolution
Link ID: 27185 - Posted: 04.14.2020
By Pragya Agarwal If you have seen the documentary Free Solo, you will be familiar with Alex Honnold. He ascends without protective equipment of any kind in treacherous landscapes where, above about 15 meters, any slip is generally lethal. Even just watching him pressed against the rock with barely any handholds makes me nauseous. In a functional magnetic resonance imaging (fMRI) test with Honnold, neurobiologist Jane Joseph found there was near-zero activation in his amygdala. This is a highly unusual brain reaction and may explain why Alex feels no threat in free solo climbs that others wouldn’t dare attempt. But this also shows how our amygdala activates in that split second to warn us, and why it plays an important role in our unconscious biases. Having spent many years researching unconscious bias for my book, I have realized that it remains problematic to pinpoint, as it is hidden and often in complete contrast to our expressed beliefs. Neuroimaging research is beginning to give us more insight into the formation of our unconscious biases. Recent fMRI neuroscience studies demonstrate that people use different areas of the brain when reasoning about familiar and unfamiliar situations. The neural zones that respond to stereotypes primarily include the amygdala, the prefrontal cortex, the posterior cingulate and the anterior temporal cortex; these are described as all “lighting up like a Christmas tree” when stereotypes are activated (certain parts of the brain become more activated than others during certain tasks). When we meet someone new, we are not merely focusing on our verbal interaction. © 2020 Scientific American
Keyword: Attention; Brain imaging
Link ID: 27184 - Posted: 04.13.2020
By Andrew Solomon For nearly 30 years — most of my adult life — I have struggled with depression and anxiety. While I’ve never felt alone in such commonplace afflictions — the family secret everyone shares — I now find I have more fellow sufferers than I could have ever imagined. Within weeks, the familiar symptoms of mental illness have become universal reality. A new poll from the Kaiser Family Foundation found nearly half of respondents said their mental health was being harmed by the coronavirus pandemic. Nearly everyone I know has been thrust in varying degrees into grief, panic, hopelessness and paralyzing fear. If you say, “I’m so terrified I can barely sleep,” people may reply, “What sensible person isn’t?” But that response can cause us to lose sight of the dangerous secondary crisis unfolding alongside the more obvious one: an escalation in both short-term and long-term clinical mental illness that may endure for decades after the pandemic recedes. When everyone else is experiencing depression and anxiety, real, clinical mental illness can get erased. While both the federal and local governments (some alarmingly slower than others) have responded to the spread of the coronavirus in critical ways, acknowledgment of the mental illness vulnerabilities has been cursory. Gov. Andrew Cuomo, who has so far enlisted more than 8,000 mental health providers to help New Yorkers in distress, is a fortunate exception. The Chinese government moved psychologists and psychiatrists to Wuhan during the first stage of self-quarantine. No comparable measures have been initiated by our federal government. The unequal treatment of the two kinds of health — physical over mental — is consonant with our society’s ongoing disregard for psychological stability. Insurance does not offer real parity of coverage, and treatment for mood disorders is generally deemed a luxury. But we are in a dual crisis of physical and mental health, and those facing psychiatric challenges deserve both acknowledgment and treatment. © 2020 The New York Times Company
Keyword: Depression; Stress
Link ID: 27183 - Posted: 04.13.2020
By Austin Frakt OxyContin, and the aggressive, misleading way that Purdue Pharma marketed it, might have been even more damaging than was previously understood. Recent research shows how the company focused its marketing in states with lighter prescription regulation — to devastating effect. Also, a new version of OxyContin introduced a decade ago — which was meant to reduce harm — had unintended consequences. Besides contributing to heroin overdoses, it led to hepatitis C and other infections. Careful studies are only now starting to reveal the extent of the damage. OxyContin is an opioid painkiller that Purdue Pharma first brought to the U.S. market in 1996. Its chief innovation was its 12-hour timed release of oxycodone. This made it ripe for abuse, since by crushing or dissolving OxyContin pills, abusers of the drug could ingest the entire dose at once. Several studies have pointed to Purdue’s aggressive marketing of OxyContin as a significant contributor to the opioid epidemic. The marketing took various forms, including calling and visiting doctors; paying them for meals and travel; providing gifts; and funding pain treatment groups that urged liberalization of opioid prescribing. Some of the company’s marketing messages minimized the potential for OxyContin to lead to addiction, for which it paid over $600 million in fines in 2007. A National Bureau of Economic Research working paper published last fall sheds light on Purdue’s role. The researchers, economists from the University of Pennsylvania, the University of Notre Dame and the RAND Corporation, looked at variations in prescribing regulations that led Purdue to market OxyContin more aggressively in some states than in others. © 2020 The New York Times Company
Keyword: Drug Abuse
Link ID: 27182 - Posted: 04.13.2020
By Jennifer Couzin-Frankel In college in the 1990s, Alix Timko wondered why she and her friends didn’t have eating disorders. “We were all in our late teens, early 20s, all vaguely dissatisfied with how we looked,” says Timko, now a psychologist at Children’s Hospital of Philadelphia. Her crowd of friends matched the profile she had seen in TV dramas—overachievers who exercised regularly and whose eating was erratic, hours of fasting followed by “a huge pizza.” “My friends and I should have had eating disorders,” she says. “And we didn’t.” It was an early clue that her understanding of eating disorders was off the mark, especially for the direst diagnosis of all: anorexia nervosa. Anorexia is estimated to affect just under 1% of the U.S. population, and many more cases may go undiagnosed. The illness manifests as self-starvation and weight loss so extreme that it can send the body into a state resembling hibernation. Although the disorder also affects boys and men, those who have it are most often female, and about 10% of those affected die. That’s the highest mortality rate of any psychiatric condition after substance abuse, on par with that of childhood leukemia. With current treatments, about half of adolescents recover, and another 20% to 30% are helped. As a young adult, Timko shared the prevailing view of the disease: that it develops when girls, motivated by a culture that worships thinness, exert extreme willpower to stop themselves from eating. Often, the idea went, the behavior arises in reaction to parents who are unloving, controlling, or worse. But when Timko began to treat teens with anorexia and their families, that narrative crumbled—and so did her certainties about who is at risk. Many of those young people “don’t have body dissatisfaction, they weren’t on a diet, it’s not about control,” she found. “Their mom and dad are fabulous and would move heaven and Earth to get them better.” © 2020 American Association for the Advancement of Science
Keyword: Anorexia & Bulimia
Link ID: 27181 - Posted: 04.10.2020
By Jan Hoffman Anxious times — like a pandemic — can lead to unhealthy but self-soothing habits, whether it’s reaching for a bag of potato chips, more chocolate or another glass of wine. But some stress-reducing behaviors are alarming to medical experts right now — namely vaping and smoking of tobacco or marijuana. Because the coronavirus attacks the lungs, this is exactly the moment, they say, when people should be tapering — or better yet, stopping — their use of such products, not escalating them. “Quitting during this pandemic could not only save your life, but by preventing the need for your treatment in a hospital, you might also save someone else’s life,” said Dr. Jonathan Winickoff, director of pediatric research at the Tobacco Research and Treatment Center at Massachusetts General Hospital. On Thursday, Dr. Winickoff joined the Massachusetts attorney general, Maura Healey, to issue an advisory alerting the public and particularly young people that smoking and vaping can also exacerbate the risks of spreading Covid-19. “You bring this device or cigarette to your mouth to inhale and you do so repeatedly,” explained Dr. Winickoff, who is also a professor at Harvard Medical School. “You touch the cartridge. You put it next to your face. You are spreading whatever is in your hand into your body. At the same time, many of my patients who smoke or vape have increased coughing or expectorating. And that’s a recipe for increased spread.” Studies already amply show that cigarette smoking weakens the immune system and compromises lung function. Research into the health effects of vaping is limited because the devices are relatively new, but studies suggest that e-cigarettes may cause inflammation in the airways and lungs. © 2020 The New York Times Company
Keyword: Drug Abuse
Link ID: 27180 - Posted: 04.10.2020
by Michael Marshall Some people with autism have an unusually large head: This fact has been known since autism was first described in the 1940s. But debate about this finding has raged ever since. How many people with autism have a large head? What causes the enlargement? And does it have any bearing on outcome? Here is what researchers do and do not know about head size in autism. What proportion of people with autism have a large head? When Leo Kanner first described 11 children with autism in a 1943 paper, he noted many unusual features. “Five had relatively large heads,” he reported, and he said no more on the matter. But the sample size was small. Many other scientists noted the same link over the following decades. A 1999 review estimated that 20 percent of people with autism have statistically large head size, or ‘macrocephaly’1. In 2011, the Autism Phenome Project refined this estimate to 15 percent of autistic boys2. The team followed boys with autism from their diagnosis throughout childhood. They focused on whether head size is disproportionate to the rest of the body, rather than simply large. The researchers call this ‘disproportionate megalencephaly’ and say it marks a distinct subgroup of autistic people. “We’ve defined a big-brain form of autism,” says lead investigator David Amaral, distinguished professor of psychiatry and behavioral sciences at the University of California, Davis MIND Institute. No one contests the 15 percent figure, but scientists differ in their interpretation of the finding. “It only applies to a small proportion of children with autism,” says Katarzyna Chawarska, Emily Fraser Beede Professor of Child Psychiatry at Yale University. © 2020 Simons Foundation
Keyword: Autism; Brain imaging
Link ID: 27179 - Posted: 04.10.2020
By Lydia Denworth It is lunchtime on a Sunday in January. At a long table inside a delicatessen in midtown Manhattan, a group of young people sit together over sandwiches and salads. Most of them have their phones out. One boy wears headphones around his neck. But there is less conversation than you might expect from a typical group of friends: One of the boys seems to talk only to himself, and a girl looks anxious and occasionally flaps her hands. The young people in this group are all on the spectrum. They met through a program organized by the nonprofit Actionplay, in which young people with autism or other disabilities work together to write and stage a musical. Each Sunday, the members refine characters and the script, block scenes and compose songs—and then some of them head across the street to have lunch together. “You meet other people just like you,” says Lexi Spindel, 15. The members share a group text in which they call themselves the Wrecking Crew. A few months ago, six of the girls went to see the movie “Frozen II” together. And Lexi and Actionplay veteran Adelaide DeSole, 21, spent a long afternoon at the Spindels’ apartment over the holiday season. The two young women played games and watched “SpongeBob SquarePants” and “Kung Fu Panda” on television. “That was the first time my daughter had a friend over,” says Lexi’s father, Jay Spindel. “That never happened before Actionplay.” © 2020 Simons Foundation
Keyword: Autism
Link ID: 27178 - Posted: 04.10.2020
Amy Schleunes The brains of Australopithecus afarensis, a hominin species that lived in eastern Africa more than 3 million years ago, were organized in a manner similar to those of apes, report the authors of a study published on April 1 in Science Advances, but the endocasts also indicate a prolonged period of brain growth like that found in modern humans. “The fact that protracted brain growth emerged in hominins as early as 3.3 Ma ago could suggest that it characterized all of subsequent hominin evolutionary history,” the authors write in the paper, though brain development patterns in hominins may not have followed a linear trajectory in the evolutionary process that led to modern humans. Whatever the evolutionary pattern, they say, the extended brain growth period in A. afarensis “provided a basis for subsequent evolution of the brain and social behavior in hominins and was likely critical for the evolution of a long period of childhood learning.” P. Gunz et al., “Australopithecus afarensis endocasts suggest ape-like brain organization and prolonged brain growth,” Science Advances, doi:10.1126/sciadv.aaz4729, 2020. © 1986–2020 The Scientist
Keyword: Evolution
Link ID: 27177 - Posted: 04.10.2020