Most Recent Links




Christof Koch A future where the thinking capabilities of computers approach our own is quickly coming into view. We feel ever more powerful machine-learning (ML) algorithms breathing down our necks. Rapid progress in coming decades will bring about machines with human-level intelligence capable of speech and reasoning, with a myriad of contributions to economics, politics and, inevitably, warcraft. The birth of true artificial intelligence will profoundly affect humankind’s future, including whether it has one. The following quotes provide a case in point: “From the time the last great artificial intelligence breakthrough was reached in the late 1940s, scientists around the world have looked for ways of harnessing this ‘artificial intelligence’ to improve technology beyond what even the most sophisticated of today’s artificial intelligence programs can achieve.” “Even now, research is ongoing to better understand what the new AI programs will be able to do, while remaining within the bounds of today’s intelligence. Most AI programs currently programmed have been limited primarily to making simple decisions or performing simple operations on relatively small amounts of data.” These two paragraphs were written by GPT-2, a language bot I tried last summer. Developed by OpenAI, a San Francisco–based institute that promotes beneficial AI, GPT-2 is an ML algorithm with a seemingly idiotic task: presented with some arbitrary starter text, it must predict the next word. The network isn’t taught to “understand” prose in any human sense. Instead, during its training phase, it adjusts the internal connections in its simulated neural networks to best anticipate the next word, the word after that, and so on. Trained on eight million Web pages, its innards contain more than a billion connections that emulate synapses, the connecting points between neurons.
When I entered the first few sentences of the article you are reading, the algorithm spewed out two paragraphs that sounded like a freshman’s effort to recall the gist of an introductory lecture on machine learning during which she was daydreaming. The output contains all the right words and phrases—not bad, really! Primed with the same text a second time, the algorithm comes up with something different. © 2019 Scientific American
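The training objective described above—given the preceding text, predict the next word—can be sketched with a toy model. This is only an illustration with a made-up miniature corpus, not GPT-2 itself (which uses a neural network with over a billion parameters rather than word counts), but the prediction task is the same in spirit:

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count, for each word in a tiny corpus,
# which word most often follows it. GPT-2 learns the same kind of
# conditional prediction, but with a deep network over huge text.
corpus = "the cat sat on the mat and the cat slept".split()

following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the next word most often seen after `word` in training."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" most often here
```

Applied repeatedly—feed the prediction back in as the new context—this is also how such a model generates text one word at a time.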

Keyword: Consciousness; Robotics
Link ID: 26894 - Posted: 12.12.2019

Thomas R. Sawallis and Louis-Jean Boë Sound doesn’t fossilize. Language doesn’t either. Even when writing systems have developed, they’ve represented full-fledged and functional languages. Rather than preserving the first baby steps toward language, they’re fully formed, made up of words, sentences and grammar carried from one person to another by speech sounds, like any of the perhaps 6,000 languages spoken today. So if you believe, as we linguists do, that language is the foundational distinction between humans and other intelligent animals, how can we study its emergence in our ancestors? Happily, researchers do know a lot about language – words, sentences and grammar – and speech – the vocal sounds that carry language to the next person’s ear – in living people. So we should be able to compare language with less complex animal communication. And that’s what we and our colleagues have spent decades investigating: How do apes and monkeys use their mouth and throat to produce the vowel sounds in speech? Spoken language in humans is an intricately woven string of syllables with consonants appended to the syllables’ core vowels, so mastering vowels was a key to speech emergence. We believe that our multidisciplinary findings push back the date for that crucial step in language evolution by as much as 27 million years. The sounds of speech Say “but.” Now say “bet,” “bat,” “bought,” “boot.” The words all begin and end the same. It’s the differences among the vowel sounds that keep them distinct in speech. © 2010–2019, The Conversation US, Inc.

Keyword: Language; Evolution
Link ID: 26893 - Posted: 12.12.2019

By Rachel E. Gross In the 1960s, manufacturers of the new birth-control pill imagined their ideal user as feminine, maternal and forgetful. She wanted discretion. She was married. And she wanted visible proof that her monthly cycle was normal and that she wasn’t pregnant. In 2019, the user of the pill is perceived as an altogether different person. She’s unwed, probably would prefer to skip her period and is more forthright about when it’s that time of the month. As such, many birth-control brands now come in brightly colored rectangular packs that make no effort to be concealed. But one part of the equation remains: the week of placebo pills, in which hormones are abruptly withdrawn and a woman experiences what looks and feels a lot like her regular period — blood, cramps and all — but isn’t. Physicians have widely described this pseudoperiod as medically unnecessary. So why do millions still endure it? That’s largely the legacy of two men: John Rock and David Wagner. First there’s Rock, a Harvard fertility expert and a developer of the pill. There’s a longstanding myth that Rock, a Catholic, designed the pill in the 1950s with the church in mind and included a week of hormonal withdrawal — and therefore bleeding — to make his invention seem more natural. In fact, the thought never crossed his mind, the Rutgers University historian Margaret Marsh says. Instead, it was Gregory (Goody) Pincus, the other developer of the pill, who suggested that the pill be given as a 20-days-on, 5-days-off regimen. Pincus wanted to provide women in his trials with reassurance that they weren’t pregnant, and to know himself that the pill was working as a contraceptive. Rock agreed. After the F.D.A. approved the pill in 1960, however, those few days of light bleeding took on a new significance. Anticipating the church’s opposition, Rock became not just a researcher but also an advocate. 
In his 1963 book “The Time Has Come: A Catholic Doctor’s Proposals to End the Battle Over Birth Control,” he argued that the pill was merely a scientific extension of the church-sanctioned “rhythm method.” It “completely mimics” the body’s own hormones, he wrote, to extend the “safe period” in which a woman could have intercourse and not become pregnant. © 2019 The New York Times Company

Keyword: Hormones & Behavior; Sexual Behavior
Link ID: 26892 - Posted: 12.12.2019

By Diana Kwon MDMA, or ecstasy, once had the reputation of exclusively being an illicit party drug popular at raves and dance clubs. That view has changed in recent years. The substance, known for its ability to produce feelings of euphoria and affection for others, has developed a new identity as a promising therapeutic tool. Researchers are currently investigating MDMA-assisted therapy as a potential treatment for post-traumatic stress disorder in late-stage clinical trials. The drug’s capacity to enhance sociability has also led to studies investigating its benefits for other conditions, such as social anxiety in individuals with autism spectrum disorder. Despite the promise of its therapeutic benefits, concern persists among some scientists that MDMA could be abused because its pleasurable effects can make it addictive. “By no means [does the drug] have the addictive liability of methamphetamine or certain opioids,” says Robert Malenka, a professor of psychiatry and behavioral sciences at Stanford University. “But it does have abuse potential.” A new study by Malenka and his team suggests it may be possible to circumvent this risk. The findings, published today in Science Translational Medicine, reveal that MDMA’s sociability-enhancing abilities and its pleasurable properties are controlled by distinct pathways in the brain—at least in mice. That insight opens the possibility of developing a safer version of the drug. Previous research by Malenka’s group and others had revealed that MDMA stimulated the release of both serotonin and dopamine in the brain. The existing evidence suggested the drug’s effects on sociability were linked to serotonin and its addictive potential to dopamine, but the extent to which these pathways were distinct was unknown. “Separating out the prosocial from the addictive effects has tremendous implications for drug development,” says Boris Heifets, an anesthesiologist at Stanford and lead author of the latest study. 
A key question is, “Can we make something with the same kind of prosocial effect that maybe isn’t as prone to abuse?” © 2019 Scientific American

Keyword: Drug Abuse; Depression
Link ID: 26891 - Posted: 12.12.2019

By Nicholas Bakalar Sleeping a lot may increase the risk for stroke, a new study has found. Chinese researchers followed 31,750 men and women whose average age was 62 for an average of six years, using physical examinations and self-reported data on sleep. They found that compared with sleeping (or being in bed trying to sleep) seven to eight hours a night, sleeping nine or more hours increased the relative risk for stroke by 23 percent. Sleeping less than six hours a night had no effect on stroke incidence. The study, in Neurology, also found that midday napping for more than 90 minutes a day was associated with a 25 percent increased risk for stroke compared with napping 30 minutes or less. And people who both slept more than nine hours and napped more than 90 minutes were 85 percent more likely to have a stroke. The study controlled for smoking, drinking, exercise, family history of stroke, body mass index and other health and behavioral characteristics. The reason for the association is unclear, but long sleep duration is associated with increased inflammation, unfavorable lipid profiles and increased waist circumference, factors known to increase cardiovascular risk. © 2019 The New York Times Company
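The figures quoted in the study are relative risks: the incidence of stroke in an exposure group divided by the incidence in the reference group, so a 23 percent increase corresponds to a ratio of 1.23. A minimal sketch, using illustrative counts that are not from the study itself:

```python
# Relative risk (RR) = incidence in exposed group / incidence in reference group.
# The numbers below are hypothetical, chosen only to show how a
# "23 percent increased risk" (RR = 1.23) arises from raw counts.
def relative_risk(events_exposed, n_exposed, events_ref, n_ref):
    return (events_exposed / n_exposed) / (events_ref / n_ref)

rr = relative_risk(123, 10_000, 100, 10_000)
print(round(rr, 2))  # 1.23, i.e. a 23% higher risk than the reference group
```

Note that a relative risk describes the association after the listed adjustments; as the article says, it does not by itself establish that long sleep causes stroke.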

Keyword: Stroke; Sleep
Link ID: 26890 - Posted: 12.12.2019

By Sharon Begley, STAT Even allowing for the fact that these were lilliputian brains, they were not behaving at all according to plan. From the first days of the tiny lab-grown organs’ development, primitive “progenitor cells” romped out of their birthplaces in the deep interior and quickly turned into neurons and glia, specialized cells that do the brain’s heavy lifting, from thinking and feeling and moving to boring old neurological housekeeping. But the cells were jumping the gun. In healthy developing human brains, progenitor cells spend a good chunk of prenatal existence simply reproducing, vastly increasing their numbers and postponing becoming other brain cells. The impatient progenitor cells, however, were in cerebral organoids—minuscule 3-D versions of the brain—created from the cells of people with Huntington’s disease in hopes of mimicking the patients’ actual brain development decades earlier. It was new evidence that, in their understanding of this devastating genetic illness, scientists know only half the story: In addition to being a neurodegenerative disease, it is also neurodevelopmental, starting in the womb. These recent findings and other research are spurring a radical rethinking of Huntington’s, with implications for the age when any potential cure is likely to be most effective. “It’s not conclusive, but there is suggestive evidence that neurodevelopment is altered in Huntington’s disease,” said neurobiologist Mahmoud Pouladi of the National University of Singapore, who led the organoid work. If so, then if scientists discover a way to repair the mutant gene or remove the aberrant molecules it makes, “the earlier you intervene the better it should be.” In contrast, today’s most-watched clinical trials in Huntington’s include only adults, often middle-aged ones, reflecting the belief that most mutation carriers can reach their 30s or beyond cerebrally unscathed. 
In fact, doctors and advocacy groups strongly discourage genetic testing for Huntington’s in anyone under 18, presuming there’s nothing to be gained. According to the genetic-testing guidelines from the Huntington’s Disease Society of America, “Predictive testing of minors currently has no medical benefits and the possibility for psychosocial harm and lowered self-esteem is high.” © 2019 Scientific American

Keyword: Huntingtons
Link ID: 26889 - Posted: 12.11.2019

New results from the largest long-term study of brain development and children's health raise provocative questions about obesity and brain function. Does excess body weight somehow reduce brain regions that regulate planning and impulse control? Is obesity a result of that brain difference? Or are eating habits, lifestyle, family circumstances and genetics to blame? Previous studies in children and adults have had conflicting results. The new research doesn't settle the matter, and outside experts cautioned that misinterpreting it could unfairly perpetuate weight stigma. But an editorial published with the study Monday in JAMA Pediatrics called it an important addition to mounting evidence of a link between weight, brain structure and mental function. If follow-up research confirms the findings, it could lead to new ways to prevent obesity that target improved brain function. "We don't know which direction these relationships go nor do they suggest that people with obesity are not as smart as people at a healthy weight," said Dr. Eliana Perrin, a Duke University pediatrics professor who co-wrote the editorial. The federally funded study involved 3,190 U.S. children aged 9 and 10. They had height and weight measurements, MRI brain scans and computer-based tests of mental function including memory, language, reasoning and impulse control. Nearly 1,000 kids — almost 1 in 3 — were overweight or obese, similar to national statistics. ©2019 CBC/Radio-Canada

Keyword: Obesity
Link ID: 26888 - Posted: 12.11.2019

By Eva Frederick Many human grandmothers love to spoil their grandchildren with attention and treats, and for good reason: Studies have shown that having a living grandmother increases a child’s chance of survival. Now, new research shows the same may be true for killer whales. By providing young animals with some freshly caught salmon now and then—or perhaps with knowledge on where to find it—grannies increase their grand-offspring’s chance of survival. The new study is the first direct evidence in nonhuman animals of the “grandmother hypothesis.” The idea posits that females of some species live long after they stop reproducing to provide extra care for their grandchildren. “It’s very cool that these long-lived cetaceans have what looks like a postfertile life stage,” says Kristen Hawkes, an anthropologist at the University of Utah in Salt Lake City who has dedicated much of her career to studying the grandmother effect; she was not involved in the new study. Women usually go through menopause between ages 45 and 55, even though they may live to age 80, 90, or older. Studies in modern-day hunter-gatherer communities as well as in populations in Finland and Canada show that older women can help increase the number of children their daughters have, and boost the survival rates of their grandchildren. Dan Franks, a computer scientist and biologist at the University of York in the United Kingdom, wanted to know whether this grandmother effect occurs in other species as well. © 2019 American Association for the Advancement of Science

Keyword: Sexual Behavior; Evolution
Link ID: 26887 - Posted: 12.10.2019

By Nayef Al-Rodhan Facebook recently announced it had acquired CTRL-Labs, a U.S. start-up working on wearable tech that allows people to control digital devices with their brain. The social media company is only the latest in a long string of firms investing in what has come to be termed “neurotechnology.” Earlier this year Neuralink, a company backed by Elon Musk, announced that it hopes to begin human trials for computerized brain implants. These projects may seem like science fiction, but this drive to get more out of our brains is nothing new—from tea, caffeine and nicotine, to amphetamines and the narcolepsy drug Modafinil, drugs have long been used as rudimentary attempts at cognitive enhancement. And in our tech-driven world, the drive to cognitively enhance is stronger than ever—and is leading us to explore new and untested methods. In today’s hypercompetitive world, everyone is looking for an edge. Improving memory, focus or just the ability to work longer hours are all key to getting ahead, and a drug exists to improve each of them. In 2017, 30 percent of Americans said they had used “smart drug” supplements, known as nootropics, at least once that year, even if studies repeatedly demonstrate that they have a negligible effect on intellect. For some, however, nootropics are not enough, and so they turn to medical-grade stimulants. The most famous of these is Adderall, which boosts focus and productivity far more than commercial nootropics. A well-established black market thrives on university campuses and in financial centers, supplying these drugs to people desperate to gain a competitive edge. © 2019 Scientific American

Keyword: Learning & Memory; Drug Abuse
Link ID: 26886 - Posted: 12.10.2019

By Jonah Engel Bromwich Pete Frates, a former college baseball player whose participation in the social media phenomenon known as the Ice Bucket Challenge helped raise more than $100 million toward fighting amyotrophic lateral sclerosis, commonly known as A.L.S. or Lou Gehrig’s disease, died on Monday at his home in Beverly, Mass. He was 34. His death was announced in a statement by Boston College, his alma mater. Quoting his family, it said he died “after a heroic battle with A.L.S.” Mr. Frates learned he had the disease in 2012. A.L.S. attacks the body’s nerve cells and leads to full paralysis. Patients are typically expected to live for two to five years from the time of diagnosis. Mr. Frates did not create the Ice Bucket Challenge, in which participants dumped buckets of ice water over their heads while pledging to donate money to fight A.L.S. But a Facebook video in July 2014 showing him doing his version of the challenge — in which he bobbed his head to Vanilla Ice’s song “Ice Ice Baby” — prompted a surge in participation that summer, to where it became a viral sensation. LeBron James, Bill Gates, Oprah Winfrey and other celebrities stepped forward to be drenched, and millions of others followed suit. Mr. Frates became one of the most visible supporters of the effort, and in August 2014 he completed the challenge again (this time with ice water) at Fenway Park, along with members of the Boston Red Sox organization. The videos were inescapable for anyone on Facebook, and the A.L.S. Association, a Washington-based nonprofit that works to fight the disease, received more than $115 million. In 2015 the organization released an infographic showing how those funds were being spent. About $77 million, or 67 percent, of the money was used for research that ultimately identified the NEK1 gene, which contributes to the disease. The finding gave scientists guidance in developing treatment drugs. © 2019 The New York Times Company

Keyword: ALS-Lou Gehrig's Disease
Link ID: 26885 - Posted: 12.10.2019

By Andrea Petersen Anne Firmender, 74, was working with her psychologist to come up with a list of her positive attributes. “I cook for others,” said Ms. Firmender. “It’s giving,” encouraged the psychologist, Dimitris Kiosses. “Good kids,” continued Ms. Firmender, who has four grown children and four grandchildren. “And great mother,” added Dr. Kiosses. Ms. Firmender smiled. Dr. Kiosses typed up the list and handed a printout to Ms. Firmender to take home. “When you’re feeling down and hard on yourself, you can remind yourself of your strengths,” he told her. Ms. Firmender, who has a history of mental health problems, was in therapy for depression. But she also has mild cognitive impairment and can have trouble remembering what day it is. So Dr. Kiosses was treating her with a novel approach called Problem Adaptation Therapy, or PATH. The therapy, developed at Weill Cornell Medicine in New York City and White Plains, N.Y., focuses on solving tangible problems that fuel feelings of sadness and hopelessness. It incorporates tools, like checklists, calendars, signs and videos, to make it accessible for people with memory issues. A caregiver is often involved. The approach is one of several new psychotherapies to treat anxiety and depression in people with cognitive impairments, including early to moderate dementia. Another, the Peaceful Mind program, developed by researchers at Baylor College of Medicine and elsewhere for patients with anxiety and dementia, simplifies traditional cognitive behavioral therapy and focuses on scheduling pleasurable activities and skills, like deep breathing. Therapy sessions are short and take place in patients’ homes. A program designed by researchers at University College London gives cards to patients to take home to remind them of key strategies. One that says “Stop and Think” prompts them to pause when they have panicky and unhelpful thoughts to help keep those thoughts from spiraling and creating more anxiety. 
© 2019 The New York Times Company

Keyword: Alzheimers; Depression
Link ID: 26884 - Posted: 12.09.2019

Andrew Anthony Katrina Karkazis, a senior research fellow at Yale University, is a cultural anthropologist working at the intersection of science, technology, gender studies and bioethics. With Rebecca Jordan-Young, a sociomedical scientist, she has written Testosterone: An Unauthorised Biography. It is a critique of both popular and scientific understandings of the hormone, and how they have been used to explain, or even defend, inequalities of power. You suggest that testosterone is understood as an exclusively male hormone, even though it’s also found in women. But surely no scientist believes this. No, what we’re saying is that the hormone has a century-long biography and identity that continues to be that of a male sex hormone. That language is used by authoritative sources in the US like the National Library of Medicine, but also in many media articles. It’s an argument that has to do with how the hormone is understood, which then shapes the kinds of research questions that get asked, what kinds of research get done or not done. There’s actually almost no research on the relationship between testosterone and aggression in women. That is a consequence of the framing of the hormone as having to do with men, masculinity, behaviours understood and framed as masculine. It’s the idea that because men generally have more testosterone, somehow that makes it more relevant in men. But the truth is we know very little about it. You write that testosterone’s authorised biography is about libido, aggression and masculinity. Does this mean that testosterone is not about these things? I think that it’s still very widely understood as the driver of all things masculine. When people think about testosterone, aggression is one of the first things that comes to mind. But when you look at the evidence, there’s not good evidence at all. 
In fact, it’s very weak regarding the relationship between endogenous testosterone [ie testosterone that originates within an organism] and aggression. So it’s an artefact of the ideology of testosterone that we continue to believe that it drives aggression, because aggression has been framed as a masculine behaviour and testosterone has been framed as a masculine hormone. © 2019 Guardian News & Media Limited

Keyword: Hormones & Behavior; Sexual Behavior
Link ID: 26883 - Posted: 12.09.2019

By Aaron E. Carroll and Austin Frakt Both of us have sleep apnea, and both of us receive treatment that makes a world of difference. It could make a big difference in your life, too. Sleep apnea is quite common, with estimates that it affects up to 17 percent of men 50 to 70, and 10 percent of men 30 to 49. But there’s a problem. In the American health system, we often make it hard for people to get care, and the same is true here. Obstructive sleep apnea is when the upper airway collapses during sleep, leading to periods of, well, not breathing. About 24 million Americans have sleep apnea and don’t know it, research suggests, and many who do know don’t get treatment. The consequences can be severe. It’s a leading cause of vehicle accidents, as apnea-afflicted drivers fall asleep behind the wheel. Snoring and sleep apnea are on the same spectrum and are associated with Type 2 diabetes in adults. Treatment is associated with improvements in insulin resistance. Having sleep apnea, and not treating it, increases the risk of postoperative cardiovascular surgery complications. Treating sleep apnea improves sleep duration and quality. People who sleep better are much happier and healthier in general. Reducing snoring also helps partners sleep better. How hard is it to get used to a mask? We were treated with continuous positive airway pressure (CPAP). It’s intrusive, though not nearly as much as we had feared. Each night we strap on masks connected to CPAP machines. The modern machines are silent. And we both use masks that cover only our nostrils, though others need full face masks. The air that the machines push through the masks keeps our airways open. It takes some getting used to, but we adapted within a week. This isn’t to say that it’s not a big deal for many people — it can be. But it’s not as scary as many fear. © 2019 The New York Times Company

Keyword: Sleep
Link ID: 26882 - Posted: 12.09.2019

By Steve Taylor In the second half of the 19th century, scientific discoveries—in particular, Darwin’s theory of evolution—meant that Christian beliefs were no longer feasible as a way of explaining the world. The authority of the Bible as an explanatory text was fatally damaged. The new findings of science could be utilized to provide an alternative conceptual system to make sense of the world—a system that insisted that nothing existed apart from basic particles of matter, and that all phenomena could be explained in terms of the organization and the interaction of these particles. One of the most fervent of late 19th century materialists, T.H. Huxley, described human beings as “conscious automata” with no free will. As he explained in 1874, “Volitions do not enter into the chain of causation…. The feeling that we call volition is not the cause of a voluntary act, but the symbol of that state of the brain which is the immediate cause." This was a very early formulation of an idea that has become commonplace amongst modern scientists and philosophers who hold similar materialist views: that free will is an illusion. According to Daniel Wegner, for instance, “The experience of willing an act arises from interpreting one’s thought as the cause of the act.” In other words, our sense of making choices or decisions is just an awareness of what the brain has already decided for us. When we become aware of the brain’s actions, we think about them and falsely conclude that our intentions have caused them. You could compare it to a king who believes he is making all his own decisions, but is constantly being manipulated by his advisors and officials, who whisper in his ear and plant ideas in his head. © 2019 Scientific American

Keyword: Consciousness
Link ID: 26881 - Posted: 12.07.2019

By Carolyn Gramling Exceptionally preserved skulls of a mammal that lived alongside the dinosaurs may be offering scientists a glimpse into the evolution of the middle ear. The separation of the three tiny middle ear bones — known popularly as the hammer, anvil and stirrup — from the jaw is a defining characteristic of mammals. The evolutionary shift of those tiny bones, which started out as joints in ancient reptilian jaws and ultimately split from the jaw completely, gave mammals greater sensitivity to sound, particularly at higher frequencies (SN: 3/20/07). But finding well-preserved skulls from ancient mammals that can help reveal the timing of this separation is a challenge. Now, scientists have six specimens — four nearly complete skeletons and two fragmented specimens — of a newly described, shrew-sized critter dubbed Origolestes lii that lived about 123 million years ago. O. lii was part of the Jehol Biota, an ecosystem of ancient wetlands-dwellers that thrived between 133 million and 120 million years ago in what’s now northeastern China. The skulls on the nearly complete skeletons were so well-preserved that they were able to be examined in 3-D, say paleontologist Fangyuan Mao of the Chinese Academy of Sciences in Beijing and colleagues. That analysis suggests that O. lii’s middle ear bones were fully separated from its jaw, the team reports online December 5 in Science. Fossils from an older, extinct line of mammals have shown separated middle ear bones, but this newfound species would be the first of a more recent lineage to exhibit this evolutionary advance. © Society for Science & the Public 2000–2019

Keyword: Hearing; Evolution
Link ID: 26880 - Posted: 12.07.2019

By Laura Sanders Call it a comeback — maybe. After being shelved earlier this year for lackluster preliminary results, a drug designed to slow Alzheimer’s progression is showing new signs of life. A more in-depth look at the data from two clinical trials suggests that patients on the biggest doses of the drug, called aducanumab, may indeed benefit, the company reported December 5. People who took the highest amounts of the drug declined about 30 percent less, as measured by a commonly used Alzheimer’s scale, than people who took a placebo, Samantha Haeberlein of the biotechnology company Biogen reported at the Clinical Trials on Alzheimer’s Disease meeting in San Diego. With these encouraging results in hand, Biogen, based in Cambridge, Mass., plans to seek drug approval from the U.S. Food and Drug Administration in early 2020. The results are “exhilarating, not just to the scientific community but our patients as well,” Sharon Cohen, a behavioral neurologist at the Toronto Memory Program, said during a panel discussion at the meeting. Cohen participated in the clinical trials and has received funding from Biogen. The presentation marks “an important moment for the Alzheimer’s field,” says Rebecca Edelmayer, director of scientific engagement for the Alzheimer’s Association in Chicago. Alzheimer’s disease slowly kills cells in the brain, gradually erasing people’s abilities to remember, navigate and think clearly. Current Alzheimer’s medicines can hold off symptoms temporarily, but don’t fight the underlying brain destruction. A treatment that could actually slow or even stop the damage would have a “huge impact for patients and their caregivers,” she says. © Society for Science & the Public 2000–2019

Keyword: Alzheimers
Link ID: 26879 - Posted: 12.06.2019

By Kelly Servick When Samantha Budd Haeberlein, Biogen’s head of clinical development, took the stage in San Diego, California, before a room full of Alzheimer’s disease researchers and physicians this morning, she knew she had some explaining to do. In October, the pharmaceutical company, based in Cambridge, Massachusetts, unexpectedly revived an experimental Alzheimer’s drug that it had declared a failure 7 months earlier. Ever since, scientists and industry analysts have been hungry for more detail about two large clinical trials meant to prove that Biogen’s drug, an antibody called aducanumab, slows down cognitive decline in the early stages of disease. At the Clinical Trials on Alzheimer’s Disease congress today, Budd Haeberlein tried to clarify what has emboldened the company to apply to the U.S. Food and Drug Administration (FDA) for market approval for aducanumab early next year. After analyzing more patient data than were available at the time of a discouraging preliminary analysis, she explained, the company found evidence that the higher of two tested doses led to 22% less cognitive decline after 78 weeks than a placebo in one trial. However, the other trial failed to show any benefit, leaving some researchers with a grim outlook on the drug. “I surely don’t think that it should be given market approval on the basis of these data,” says Robert Howard, a psychiatrist at University College London who has run clinical trials of potential Alzheimer’s treatments. More positive results from a subset of patients that weren’t preselected at the trial’s launch are not convincing, he says. “[Biogen has] broken all the rules, really, about how you analyze data and report it.” © 2019 American Association for the Advancement of Science.

Keyword: Alzheimers
Link ID: 26878 - Posted: 12.06.2019

By Anisha Kalidindi The room is pitch black. Every light, from the power button on the computer to the box controlling the microscope, is covered with electrical tape. I feel a gush of air as the high-powered AC kicks on, offsetting the heat emitted from the microscope’s lasers. I take my mouse out of its cage and get ready to image its brain. I’m wearing a red headlamp so I can see, but it is still quite dim. I peer closely at my lab notebook and note the two positions: –1, +2. I recite them repeatedly in a hushed tone, so I don’t forget; it is 1 A.M., after all. I hook the mouse up to the stage of the microscope and then use my handy toothpick to make sure its head position is correct. While there are many unsung heroes of science—veterinarians, lab technicians, graduate students (I might be a bit biased with this one!)—these aren’t the ones I’m talking about. I’m talking about a toothpick that played a significant role in my research project. I am lucky enough to have access to a cutting-edge microscope and several other pieces of expensive equipment in my lab. But you can also find things you might never guess were used in science: red-light headlamps, black electrical tape, and toothpicks. Using the microscope, I can take a picture of a mouse’s living, working brain through a literal window: a piece of glass that replaces a small piece of the animal’s skull. To image the mouse, we affix a plastic bar on the front of its head and then secure the bar to a head-mounting device on the stage under the microscope lens. Using this mount, we can precisely position the head up and down and right to left. This is where our problem starts. © 2019 Scientific American

Keyword: Learning & Memory
Link ID: 26877 - Posted: 12.06.2019

Richard Harris Scientists know that if they transfuse blood from a young mouse to an old one, then they can stave off or even reverse some signs of aging. But they don't know what in the blood is responsible for this remarkable effect. Researchers now report that they've identified hundreds of proteins in human blood that wax and wane in surprising ways as we age. The findings could provide important clues about which substances in the blood can slow aging. The scientists studied nearly 3,000 proteins in blood plasma drawn from more than 4,000 people ranging in age from 18 to 95. The project focused on proteins that change in both men and women. "When we went into this, we assumed you aged gradually, so we would see these changes taking place relatively steadily as individuals get older," said Tony Wyss-Coray, a professor of neurology at Stanford University. Instead, Wyss-Coray and his colleagues report in Nature Medicine on Thursday that these proteins change in three distinct waves, the first of which happens "very surprisingly" during our 30s, peaking around age 34. "Then we found a second wave around 60, and then we found a third one, the most prominent one, really around 80 years of age," Wyss-Coray said. (An earlier version of their paper is freely available on the bioRxiv preprint server.) This observation raises a host of questions about the biology of aging. What age-related transition is occurring in our 30s? And what do the changes in the blood actually mean? "Most of the proteins in the blood are actually from other tissue sources," Wyss-Coray said. "So we can start to ask where ... these proteins come from and if they change with age." For example, in proteins traced back to the liver, "that would tell us that the liver is aging." © 2019 npr

Keyword: Alzheimers
Link ID: 26876 - Posted: 12.06.2019

By Viorica Marian Psycholinguistics is a field at the intersection of psychology and linguistics, and one of its recent discoveries is that the languages we speak influence our eye movements. For example, English speakers who hear candle often look at a candy because the two words share their first syllable. Research with speakers of different languages revealed that bilingual speakers not only look at words that share sounds in one language but also at words that share sounds across their two languages. When Russian-English bilinguals hear the English word marker, they also look at a stamp, because the Russian word for stamp is marka. Even more stunning, speakers of different languages differ in their patterns of eye movements when no language is used at all. In a simple visual search task in which people had to find a previously seen object among other objects, their eyes moved differently depending on what languages they knew. For example, when looking for a clock, English speakers also looked at a cloud. Spanish speakers, on the other hand, when looking for the same clock, looked at a present, because the Spanish names for clock and present—reloj and regalo—overlap at their onset. The story doesn’t end there. Not only do the words we hear activate other, similar-sounding words—and not only do we look at objects whose names share sounds or letters even when no language is heard—but the translations of those names in other languages become activated as well in speakers of more than one language. For example, when Spanish-English bilinguals hear the word duck in English, they also look at a shovel, because the translations of duck and shovel—pato and pala, respectively—overlap in Spanish. © 2019 Scientific American

Keyword: Language; Attention
Link ID: 26875 - Posted: 12.06.2019