Most Recent Links
Lindsay Grace The World Health Organization’s description of “gaming disorder” as an “addictive behavior disorder” includes a vague description of how much digital gaming is too much. The WHO warns that “people who partake in gaming should be alert to the amount of time they spend on gaming activities.” At what point does a leisure activity turn into an addiction? Games researchers are no strangers to complaints about the dangers of too much game playing. Video games have been blamed for causing aggression, unemployment and even the vitamin D deficiency called rickets. Games have also, of course, been championed for improving surgical skills, encouraging pro-social behavior, aiding in cancer treatment and helping develop new AIDS medications. New forms of popular media are often targets of public concern, going back to dime-store novels, comic books and jazz, all the way through rock ’n’ roll and rap. But those fears eventually wane, and society embraces work like “Maus,” the first graphic novel to be a National Book Award finalist, and rapper Kendrick Lamar, who won a Pulitzer Prize earlier this year. Digital video games can be exceptionally enticing and engaging. Regarding the risk of addiction, it is worth analyzing the WHO’s warnings about excessive gaming in the wider context of leisure. As part of the Games for Change conference, colleagues who study psychology, serious games and youth advocacy will join me in discussing the myths of games, media and technology addiction. © 2010–2018, The Conversation US, Inc.
Keyword: Drug Abuse
Link ID: 25149 - Posted: 06.28.2018
Shawna Williams The US Food and Drug Administration today (June 25) approved for the first time a marijuana-derived drug, Epidiolex, for the treatment of two rare forms of epilepsy. The drug contains cannabidiol, or CBD, and does not make users high; clinical trials show it reduces the rate of seizures in patients with Dravet or Lennox-Gastaut syndromes. “In my practice, I often see patients with these highly treatment-resistant epilepsies who have tried and failed existing therapies and are asking about CBD,” says Orrin Devinsky of NYU Langone Health, a lead investigator in the trials, in a statement released by the company that makes Epidiolex. “I am delighted that my physician colleagues and I will now have the option of a prescription cannabidiol that has undergone the rigor of controlled trials and been approved by the FDA to treat both children and adults.” Both Dravet and Lennox-Gastaut are relatively severe forms of epilepsy that can be fatal, STAT News notes. While there are other drugs approved to treat Lennox-Gastaut, there had previously been none for Dravet. Some parents have used unapproved CBD oils to treat their children. In a statement released today, the FDA notes that it “has taken recent actions against companies distributing unapproved CBD products. . . . We’ll continue to take action when we see the illegal marketing of CBD-containing products with unproven medical claims.” © 1986 - 2018 The Scientist Magazine®
Keyword: Epilepsy; Drug Abuse
Link ID: 25148 - Posted: 06.27.2018
By David Grimm As soon as the big yellow school bus pulls into the parking lot of the Oregon National Primate Research Center (ONPRC) here, it’s clear that many of the high school students on board don’t know what they’ve signed up for. They know that science happens somewhere on this wooded, 70-hectare campus west of Portland—and that they may get to see monkeys—but everything else is a mystery. “Are we going to go into some giant underground lair?” asks a lanky sophomore in a hoodie, imagining that the center is set up like a video game or Jurassic Park. Diana Gordon is here to disabuse him of both notions. As the education and outreach coordinator of the country’s largest primate research center, she spends her days guiding students, Rotary clubs, and even wedding parties through the facility. Here, visitors see monkeys in their habitats and meet scientists—all while learning, Gordon hopes, that the animals are well-treated and the research is critical for human health. “If we don’t speak up, there’s only one side being heard,” she says. “The side that wants to shut us down.” That side has been racking up victories recently. In the past 6 months, animal activist groups have won bipartisan support in Congress to scuttle monkey and dog studies at top U.S. research facilities; they have also helped pass two state bills that compel researchers to adopt out lab animals at the end of experiments. The public itself seems to be turning against animal research: A Gallup poll released last year revealed that only 51% of U.S. adults find such studies morally acceptable, down from 65% in 2001. © 2018 American Association for the Advancement of Science
Keyword: Animal Rights
Link ID: 25147 - Posted: 06.27.2018
Sukanya Charuchandra At times a predominant layer of the developing brain, the subplate disappears in the adult human brain—or so researchers believed. In findings published in Cell Stem Cell on June 21, scientists propose that neurons from the human subplate, which underlies the tissue that will become the cortex, relocate into the cortex. The researchers found high levels of a protein, known to help cells migrate into the cortex, in stem cell–derived subplate neurons. These relocated subplate cells may be associated with neurological diseases. “A lot of the genes associated with autism are first expressed in the subplate,” M. Zeeshan Ozair, a coauthor on the paper, says in a statement. “And if subplate neurons don’t die but instead become part of the cortex, they will carry those mutations with them.” M.Z. Ozair et al., “hPSC modeling reveals that fate selection of cortical deep projection neurons occurs in the subplate,” Cell Stem Cell, doi:10.1016/j.stem.2018.05.024, 2018. © 1986 - 2018 The Scientist Magazine
Keyword: Development of the Brain
Link ID: 25146 - Posted: 06.27.2018
By Nicola Twilley On a foggy February morning in Oxford, England, I arrived at the John Radcliffe Hospital, a shiplike nineteen-seventies complex moored on a hill east of the city center, for the express purpose of being hurt. I had an appointment with a scientist named Irene Tracey, a brisk woman in her early fifties who directs Oxford University’s Nuffield Department of Clinical Neurosciences and has become known as the Queen of Pain. “We might have a problem with you being a ginger,” she warned when we met. Redheads typically perceive pain differently from those with other hair colors; many also flinch at the use of the G-word. “I’m sorry, a lovely auburn,” she quickly said, while a doctoral student used a ruler and a purple Sharpie to draw the outline of a one-inch square on my right shin. Wearing thick rubber gloves, the student squeezed a dollop of pale-orange cream into the center of the square and delicately spread it to the edges, as if frosting a cake. The cream contained capsaicin, the chemical responsible for the burn of chili peppers. “We love capsaicin,” Tracey said. “It does two really nice things: it ramps up gradually to become quite intense, and it activates receptors in your skin that we know a lot about.” Thus anointed, I signed my disclaimer forms and was strapped into the scanning bed of a magnetic-resonance-imaging (MRI) machine. The machine was a 7-Tesla MRI, of which there are fewer than a hundred in the world. The magnetic field it generates (teslas are a unit of magnetic strength) is more than four times as powerful as that of the average hospital MRI machine, resulting in images of much greater detail. As the cryogenic units responsible for cooling the machine’s superconducting magnet clicked on and off in a syncopated rhythm, the imaging technician warned me that, once he slid me inside, I might feel dizzy, see flashing lights, or experience a metallic taste in my mouth. “I always feel like I’m turning a corner,” Tracey said. 
She explained that the magnetic field would instantly pull the proton in each of the octillions of hydrogen atoms in my body into alignment. Then she vanished into a control room, where a bank of screens would allow her to watch my brain as it experienced pain. © 2018 Condé Nast.
Keyword: Pain & Touch; Brain imaging
Link ID: 25145 - Posted: 06.26.2018
By Ann Gibbons Being smart is a double-edged sword. Intelligent people appear to live longer, but many of the genes behind brilliance can also lead to autism, anxiety, and depression, according to two new massive genetic studies. The work also is one of the first to identify the specific cell types and genetic pathways tied to intelligence and mental health, potentially paving the way for new ways to improve education, or therapies to treat neurotic behavior. The studies provide some of the first “hard evidence of the many genes and pathways” that work together in complex ways to build smart brains and keep them in balance, says geneticist Peter Visscher of the Queensland Brain Institute at The University of Queensland in Brisbane, Australia, who was not involved in the work. Researchers have long known that people often inherit intelligence and some personality disorders from their parents. (Environmental factors such as education and stress also profoundly shape intelligence and mental health.) But geneticists have had trouble identifying more than a handful of genes associated with intelligence. Last year, researchers used new statistical methods that can detect strong associations between genes and specific traits to analyze health and genetic records in huge data sets. This led to the discovery of 52 genes linked to intelligence in 80,000 people. © 2018 American Association for the Advancement of Science
Keyword: Genes & Behavior; Schizophrenia
Link ID: 25144 - Posted: 06.26.2018
By Lisa Feldman Barrett Jasanoff’s big message in “The Biological Mind” is that you are not your brain. Or rather, you are not merely your brain — your body and the broader circumstances of your life also make you who you are. Jasanoff reminds us that the brain is not some mystical machine — it’s a gooey, bloody tangle of cells, dripping with chemicals. But we mythologize brains, creating false boundaries that divorce them from bodies and the outside world, blinding us to the biological nature of the mind. These divisions, Jasanoff contends, are why neuroscience has failed to make a real difference in anyone’s life. Unfortunately, the book’s own divisions between body and brain, and between nature and nurture, reinforce the very dualisms that Jasanoff indicts. He gives examples of the ways our bodies and the world around us affect our thoughts, feelings and actions, but not how body and world become biologically embedded to constitute a mind. Missing is a discussion of how the workings of your body necessarily and irrevocably shape your brain’s structure and function, and vice versa. The artificial boundary between brain and world also goes largely unmentioned. In real life, the experiences we have from infancy onward impact the brain’s wiring. For example, childhood poverty and adversity fundamentally alter brain development, leaving an indelible mark that increases people’s risk of illness in adulthood. This is fascinating and profound stuff, but it mostly goes unexamined in Jasanoff’s book. Still, “The Biological Mind” is chock-full of fun facts that entertain. And best of all, it makes you think. I found myself debating with Jasanoff in my head as I read — surely a sign of a worthy book. © 2018 The New York Times Company
Keyword: Learning & Memory
Link ID: 25143 - Posted: 06.26.2018
By Elizabeth Gamillo Why does a wild rabbit flee when a person approaches it, but a domestic rabbit sticks around for a treat? A new study finds that domestication may have triggered changes in the brains of these—and perhaps other—animals that have helped them adapt to their new, human-dominated environment. The new study provides “specific and new insights” into the ongoing debate over the physiological factors shaping domestication and evolution, says Marcelo Sánchez-Villagra, a professor of paleobiology at the University of Zurich in Switzerland who was not involved with the work. The leader of the research team, animal geneticist Leif Andersson of Uppsala University in Sweden and Texas A&M University in College Station, thinks the process of domestication has led to changes in brain structure that allow the rabbit to be less nervous around humans. To find out, he and colleagues took MRI scans of the brains of eight wild and eight domestic rabbits and compared the results. The team found that the amygdala, a region of the brain that processes fear and anxiety, is 10% smaller in domesticated rabbits than in wild rabbits. Meanwhile, the medial prefrontal cortex, which controls responses to aggressive behavior and fear, is 11% larger in domesticated rabbits. The researchers also found that the brains of domesticated rabbits are less able to process information related to fight-or-flight responses because they have less white matter than their wild cousins do. White matter handles information processing. When a wild rabbit is in danger, more white matter is needed for faster reflexes and for learning what to be afraid of. © 2018 American Association for the Advancement of Science.
Keyword: Evolution; Emotions
Link ID: 25142 - Posted: 06.26.2018
Erika Engelhaupt The first scientific experiment on hormones took an approach that sounds unscientific: lopping off roosters’ testicles. It was 1848, and Dr. Arnold Berthold castrated two of his backyard roosters. The cocks’ red combs faded and shrank, and the birds stopped chasing hens. Then things got really weird. The doctor castrated two more roosters and implanted a testicle from each into the other’s abdomen. As Randi Hutter Epstein writes in a new book, each rooster “had nothing between his drumsticks but a lone testicle in his gut — yet he turned back into a full-fledged hen-chaser, red comb and all.” It was the first glimpse that certain body parts must produce internal secretions, as hormones were first known, and that these substances — and not just nerves — were important to the body’s control systems. Today, we know that hormones are chemical messengers shaping everything from sex and development to sleep, stress, mood, metabolism and behavior. Yet few of us know much about these powerful substances coursing through our bodies. That ignorance makes Aroused — titled for the Greek meaning of the word hormone — an invaluable guide. Epstein, a medical writer and M.D., tells the history of hormone research from that first rooster experiment, but cleverly moves back and forth through time, avoiding any hint of dry recitation. She explores the scientists who discovered and deciphered the effects of important hormones, as well as the personal stories of how people’s lives have been profoundly changed by these chemicals. © Society for Science & the Public 2000 - 2018
Keyword: Hormones & Behavior
Link ID: 25141 - Posted: 06.26.2018
By Seth Mnookin In February 1981, a British psychiatrist named Lorna Wing published an academic paper highlighting a 1944 clinical account of “autistic psychopathy” by a recently deceased Austrian physician named Hans Asperger. It wasn’t an obvious piece of work to single out: As Wing acknowledged, Asperger’s study had received almost no attention from English-language researchers in the decades since publication. That was about to change. Wing argued that the disorder that Asperger had described was a unique syndrome, distinct from autism, and should be considered as one of “a wider group of conditions which have, in common, impairment of development of social interaction, communication and imagination.” Wing, whose daughter had been diagnosed with autism in the 1950s, understood from her own experience that this was a disorder with multiple gradations, which affected people across the full spectrum of intellectual abilities. But this was a radical notion: At the time, one of the dominant paradigms for understanding autism was that the condition was caused by “refrigerator mothers” — emotionally frigid women who were not warm enough to nurture developing children. It’s impossible to know why Wing chose to ground her report in Asperger’s rather flimsy research — his paper, after all, had referenced just four patients — rather than relying solely on her own, significantly more impressive work. (It is worth pointing out that then, as now, virtually all eponymous psychiatric conditions were named after men.) Whatever her motivation, Wing’s efforts were successful: “Asperger’s syndrome,” the term she proposed, soon entered the clinical vernacular. By the 1990s, it was recognized around the world as an accepted diagnosis — and autism was no longer viewed as a singular condition. © 2018 The New York Times Company
Keyword: Autism
Link ID: 25140 - Posted: 06.26.2018
By Marcus Woo Some laughs are genuine reactions to hilarity. Others are more contrived—fake, even. But, according to a new study, people can usually tell real laughs from fake ones, regardless of cultural differences. In the first cross-cultural experiment of its kind, researchers asked 884 people from 21 different cultures in six regions around the world, from Peru to South Korea, to listen to recordings of real, spontaneous laughter, and fake, “volitional” laughter recorded from college-aged U.S. women. On average, nearly two-thirds of listeners in each culture could tell the difference, the team reports in a study accepted for publication in Psychological Science. Genuine chuckles were typically higher pitched and louder, analysis of the sound files revealed. Similar characteristics are seen in cries of pain and anguish, the researchers say, suggesting that laughing is a more emotional and primal response that emerged early in human evolution. A fake laugh, however, is a deliberate response that likely evolved later with speech, the team says. © 2018 American Association for the Advancement of Science.
Keyword: Emotions
Link ID: 25139 - Posted: 06.26.2018
By Austin Frakt One of the lighter moments along my journey to receiving a sleep apnea diagnosis was learning that “heroic snoring” is a clinical term. It sounds more like an oddball super power — snores that can be clearly heard through walls. Many of us have such a snorer in our lives, and some endure the disruption it causes nightly. We hardly need research to appreciate the difficulties this poses. Yet some studies on it have been done, and they document that snoring can lead to marital disruption, and that snorers’ bed partners can experience insomnia, headaches and daytime fatigue. But heroic (and less-than-heroic) snoring can also be a sign of an even deeper problem: obstructive sleep apnea, which is marked by a collapse of the upper airway leading to shallow breathing or breathing cessation that causes decreases in blood oxygen. Sleep apnea can be downright deadly, and not just for those who have it. It’s associated with a greater risk of depression, heart attacks, strokes and other cardiovascular conditions, as well as insulin resistance. As I learned, there’s no reason to meekly accept sleep apnea: There are many treatment options that can control it. The stakes are not small. In the last five years, crashes involving an Amtrak train in South Carolina, a Long Island Rail Road train, a New Jersey Transit train and a Metro-North train in the Bronx have resulted in multiple deaths, hundreds of injuries and tens of millions of dollars in property damage. Undiagnosed or untreated sleep apnea was blamed in each case. And these are far from the only sleep apnea-related accidents involving trains, buses, tractor-trailers and automobiles. Up to 30 percent of motor vehicle crashes are caused by sleepy drivers. Drivers with sleep apnea are nearly five times more likely to be involved in a motor vehicle accident than other drivers. One study found that 20 percent of American truck drivers admit to falling asleep at traffic lights. © 2018 The New York Times Company
Keyword: Sleep
Link ID: 25138 - Posted: 06.25.2018
Jennifer Ouellette Gerardo Ortiz remembers well the time in 2010 when he first heard his Indiana University colleague John Beggs talk about the hotly debated “critical brain” hypothesis, an attempt at a grand unified theory of how the brain works. Ortiz was intrigued by the notion that the brain might stay balanced at the “critical point” between two phases, like the freezing point where water turns into ice. A condensed matter physicist, Ortiz had studied critical phenomena in many different systems. He also had a brother with schizophrenia and a colleague who suffered from epilepsy, which gave him a personal interest in how the brain works, or doesn’t. Ortiz promptly identified one of the knottier problems with the hypothesis: It’s very difficult to maintain a perfect tipping point in a messy biological system like the brain. The puzzle compelled him to join forces with Beggs to investigate further. Ortiz’s criticism has beleaguered the theory ever since the late Danish physicist Per Bak proposed it in 1992. Bak suggested that the brain exhibits “self-organized criticality,” tuning to its critical point automatically. Its exquisitely ordered complexity and thinking ability arise spontaneously, he contended, from the disordered electrical activity of neurons. Bak’s canonical example of a self-organized critical system is a simple sandpile. If you drop individual grains of sand on top of a sandpile one by one, each grain has a chance of causing an avalanche. Bak and colleagues showed that those avalanches will follow a “power law,” with smaller avalanches occurring proportionally more frequently than larger ones. So if there are 100 small avalanches in which 10 grains slide down the side of the sandpile during a given period, there will be 10 larger avalanches involving 100 grains in the same period, and just one large avalanche involving 1,000 grains. 
When a huge avalanche collapses the whole pile, the base widens, and the sand begins to pile up again until it returns to its critical point, where, again, avalanches of any size may occur. The sandpile is incredibly complex, with millions or billions of tiny elements, yet it maintains an overall stability. All Rights Reserved © 2018
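Bak’s numerical example above amounts to a power law with exponent 1: the frequency of avalanches is inversely proportional to their size. A minimal sketch in Python, assuming that exponent and normalizing to the 100 ten-grain avalanches in the example (both choices are purely illustrative; real sandpile models exhibit a range of exponents):

```python
# Size-frequency power law from the sandpile example above:
# frequency is proportional to 1 / size, normalized so that
# 10-grain avalanches occur 100 times per period. The exponent
# and normalization are illustrative assumptions, not fitted values.

def avalanche_count(size, ref_count=100, ref_size=10):
    """Expected number of avalanches of a given grain count per period."""
    return ref_count * ref_size / size

for size in (10, 100, 1000):
    print(f"{size:>5} grains: {avalanche_count(size):.0f} avalanches")
```

Run as written, this reproduces the article’s figures: 100 avalanches of 10 grains, 10 of 100 grains, and one of 1,000 grains in a given period.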
Keyword: Development of the Brain; Evolution
Link ID: 25137 - Posted: 06.25.2018
Paul Biegler Mind-reading machines are now real, prising open yet another Pandora’s box for ethicists. As usual, there are promises of benefit and warnings of grave peril. The bright side was front and centre at the Society for Neuroscience annual meeting in Washington DC in November 2017. It was part of a research presentation led by Omid Sani from the University of Southern California. Sani and his colleagues studied six people with epilepsy who had electrodes inserted into their brains to measure detailed electrical patterns. It is a common technique to help neurosurgeons find where seizures start. The study asked patients, who can be alert during the procedure, to report their mood during scanning. That allowed the researchers to link the patients’ moods with their brainwave readings. Using sophisticated algorithms, the team claimed to predict patients’ feelings from their brainwaves alone. That could drive a big shift in the treatment of mental illness, say researchers. Deep brain stimulation (DBS), where electrodes implanted in the brain give circuits a regular zap, has been successful in Parkinson’s disease. It is also being trialled in depression, but the results, according to a 2017 report in Lancet Psychiatry, are patchy. Sani and colleagues suggest their discovery could bump up that success rate. A portable brain decoder may be available within a generation.
Keyword: Brain imaging
Link ID: 25136 - Posted: 06.25.2018
By Gretchen Reynolds The question of whether young children should use their heads on the soccer field has been a contentious one in recent years. In 2015, U.S. Youth Soccer, the organization that oversees most of the country’s leagues for children and teenagers, announced a ban on heading in games and practices by participants younger than 11, citing concerns that the play might contribute to concussions. In response, some soccer authorities pointed out that young players would be late to learn an essential soccer skill and that concussions from heading are rare in that age group regardless. Now a study presented last month at the annual convention of the American College of Sports Medicine may help quell doubts about the current regulations, which went into effect in 2016. According to studies of experienced adult soccer players, heading can generate impact forces almost equivalent to those of a helmet-to-helmet football tackle. But less attention has been directed at heading by young players and the attendant cognitive effects, if any. Last year, however, researchers in Puerto Rico gained permission to work with 30 boys and girls there, ages 9 to 11, who played in a local youth league. (Children this age are allowed to head in Puerto Rico.) The youngsters took a series of cognitive tests and were then outfitted with a specialized headband that recorded head movements and related impacts while they played. Most of the children wound up heading the ball at least once over the course of three games. Data from the headbands indicates their brains were subjected to acceleration forces ranging from 16 to 60 Gs. In adult players, 60 Gs during heading would be considered forceful enough to cause a concussion, although none of the children in the study received a concussion diagnosis. Most of the impacts were what researchers call “subconcussive,” or below the 60 G threshold. © 2018 The New York Times Company
Keyword: Brain Injury/Concussion; Development of the Brain
Link ID: 25135 - Posted: 06.25.2018
Veronique Greenwood The question most of genetics tries to answer is how genes connect to the traits we see. One person has red hair, another blonde hair; one dies at age 30 of Huntington’s disease, another lives to celebrate a 102nd birthday. Knowing what in the vast expanse of the genetic code is behind traits can fuel better treatments and information about future risks and illuminate how biology and evolution work. For some traits, the connection to certain genes is clear: Mutations of a single gene are behind sickle cell anemia, for instance, and mutations in another are behind cystic fibrosis. But unfortunately for those who like things simple, these conditions are the exceptions. The roots of many traits, from how tall you are to your susceptibility to schizophrenia, are far more tangled. In fact, they may be so complex that almost the entire genome may be involved in some way, an idea formalized in a theory put forward last year. Starting about 15 years ago, geneticists began to collect DNA from thousands of people who shared traits, to look for clues to each trait’s cause in commonalities between their genomes, a kind of analysis called a genome-wide association study (GWAS). What they found, first, was that you need an enormous number of people to get statistically significant results — one recent GWAS seeking correlations between genetics and insomnia, for instance, included more than a million people. Second, in study after study, even the most significant genetic connections turned out to have surprisingly small effects. The conclusion, sometimes called the polygenic hypothesis, was that multiple loci, or positions in the genome, were likely to be involved in every trait, with each contributing just a small part. (A single large gene can contain several loci, each representing a distinct part of the DNA where mutations make a detectable difference.) All Rights Reserved © 2018
Keyword: Development of the Brain; Genes & Behavior
Link ID: 25134 - Posted: 06.25.2018
By Neuroskeptic Do scientists have a responsibility to make their work accessible to the public? “Public Engagement”, broadly speaking, means scientists communicating about science to non-scientists. Blogs are a form of public engagement, as are (non-academic) books. Holding public talks or giving interviews would also count as such. Recently, it has become fashionable to say that it is important for scientists to engage the public, and that this engagement should be encouraged. I agree completely: we do need to encourage it, and we need to overcome the old-fashioned view that it is somehow discreditable or unprofessional for scientists to fraternize with laypeople. However, some advocates of engagement go further than I’d like. It is sometimes said that every researcher actually has a responsibility to engage the public about the work that they do. Speaking about my own experience in neuroscience in the UK, this view is certainly in the air if not explicitly stated, and I think most researchers would agree. Public engagement and ‘broader impact’ sections now appear as mandatory sections of many grant applications, for instance. In my view, making public engagement a duty for all scientists is wrong. Quite simply, scientists are not trained to do public engagement, and it isn’t what they signed up to do when they chose that career. Some scientists (like me) want to do it anyway, and they should be encouraged (if I say so myself), but many don’t want to. Cajoling the latter into doing engagement is futile. A half-baked public engagement exercise helps no-one.
Keyword: Miscellaneous
Link ID: 25133 - Posted: 06.25.2018
By Sarah DeWeerdt The Research on Autism and Development (RAD) Laboratory is located in a Tetris-like maze of brown wooden buildings, not far from the main campus of the University of California, San Diego. The lab itself is a nondescript warren of small beige rooms. But everything else about it is extraordinary. The first clue is a T-shirt one of the lab’s young interns wears on this sunny day in April, featuring the RAD Lab’s motto: “We play mind games.” One of the newer recruits, 20-year-old Naseem Baramki-Azar, sports a “Super Mario Bros.” shirt. A half-dozen other lab members huddle around computer screens displaying none of the usual fare of charts or spreadsheets: Instead, they’re hard at work making cartoon moles pop out of molehills, or fat spaceships careen toward the top of a computer screen. The lab’s director, Jeanne Townsend, and associate director, Leanne Chukoskie, periodically poke their heads in to check on the progress. The two women, a generation apart, are a study in contrasts. Townsend is reserved, with dark-framed square glasses; Chukoskie is a fast-talker with a California blond ponytail. But they finish each other’s sentences when they talk about their quest: to develop video games that can help children with autism. The project has stretched the two neuroscientists in unfamiliar directions. “I find myself doing a lot of computer science these days,” Chukoskie says. They are also fledgling entrepreneurs. Last year, they launched a startup, BrainLeap Technologies, also based in San Diego. That step, Chukoskie says, filled her with a mix of unenthusiastic “eh” and dread-filled “ugh.” Despite their discomfort, these two scientists are part of a growing cadre braving video-game development in search of novel therapies for autism. © 2018 American Association for the Advancement of Science
Keyword: Autism
Link ID: 25132 - Posted: 06.23.2018
By Katie Herzog On Wednesday, Vox published an article entitled "How a Pseudopenis-packing Hyena Smashes the Patriarchy’s Assumptions: Lessons from Female Spotted Hyenas for the #MeToo Era." The piece, by Katherine J. Wu, a graduate student in microbiology and immunobiology, broadly explores how the spotted hyena could be used as a model for humankind. The bottom line: Humans get it wrong; hyenas get it right. "Unlike most other mammals," Wu writes, "spotted hyenas (Crocuta crocuta) live in matriarchal societies led by alpha females. In these clans throughout sub-Saharan Africa, females do the majority of the hunting, dictate the social structure, and raise cubs as single mothers. Even the highest-ranking male in the group is subservient to the most junior female. Accordingly, male spotted hyenas have evolved to be comparatively diminutive, weighing about 12 percent less than females—a feature uncommon even among matrilines." Sounds great. Unfortunately, it's not exactly true, according to Oliver Höner, a research scientist at the Leibniz Institute for Zoo and Wildlife and the co-founder of the Spotted Hyena Project, a research project based in Tanzania. A tweet by the Hyena Project was featured in Wu's article (much to Höner's chagrin), and when I saw him getting salty about Wu's work on Twitter, I reached out to ask what she got wrong. There was plenty in that paragraph alone, Höner says. © Index Newspapers LLC
Keyword: Sexual Behavior
Link ID: 25131 - Posted: 06.23.2018
Richard Harris One of the enduring mysteries of biology is why so much of the DNA in our chromosomes appears to be simply junk. In fact, about half of the human genome consists of repetitive bits of DNA that cut and paste themselves randomly into our chromosomes, with no obvious purpose. A study published Thursday finds that some of these snippets may actually play a vital role in the development of embryos. The noted biologist Barbara McClintock, who died in 1992, discovered these odd bits of DNA decades ago in corn, and dubbed them "jumping genes." (She won a Nobel prize for that finding in 1983.) McClintock's discovery stimulated generations of scientists to seek to understand this bizarre phenomenon. Some biologists have considered these weird bits of DNA parasites, since they essentially hop around our chromosomes and infect them, sometimes disrupting genes and leaving illness in their wake. But Miguel Ramalho-Santos, a biologist at the University of California, San Francisco, doesn't like that narrative. "It seemed like a waste of this real estate in our genome — and in our cells — to have these elements and not have them there for any particular purpose," Ramalho-Santos says. "So we just asked a very simple question: Could they be doing something that's actually beneficial?" He and his colleagues focused on a jumping gene called LINE-1; all told, copies of it make up a whopping 20 percent of our entire DNA. Ramalho-Santos' lab studies embryos, so the team wondered whether LINE-1 played any role in prompting a single fertilized egg to develop into an embryo. © 2018 npr
Keyword: Development of the Brain
Link ID: 25130 - Posted: 06.23.2018



