Most Recent Links
By Laura Sanders Ketamine, a drug that has shown promise in quickly easing depression, doesn’t actually do the job itself. Instead, depression relief comes from one of the drug’s breakdown products, a new study in mice suggests. The results, published May 4 in Nature, identify a potential depression-fighting drug that works quickly but without ketamine’s serious side effects or potential for abuse. The discovery “could be a major turning point,” says neuroscientist Roberto Malinow of the University of California, San Diego. “I’m sure that drug companies will look at this very closely.” Depression is a pernicious problem with few good treatments. Traditional antidepressants don’t work for everyone, and when the drugs do work, they can take weeks to kick in. Ketamine, developed in the 1960s as a sedative for people and now used commonly by veterinarians to knock out animals, can ease depression in minutes, not weeks, small studies show. But the new study suggests that a metabolite of ketamine — not the drug itself — fights depression. Inside the body, ketamine gets converted into a slew of related molecules. One of these breakdown molecules, a chemical called (2R,6R)-hydroxynorketamine, is behind the benefits, neuropharmacologist Todd Gould of the University of Maryland School of Medicine in Baltimore and colleagues found. On its own, a single dose of (2R,6R)-HNK reduced signs of depression in mice, restoring their drive to search for a hidden platform in water, to try to escape a shock and to choose sweet water over plain. A type of ketamine that couldn’t be broken down easily into HNKs didn’t ease signs of depression in mice. Finding that a breakdown product, and not ketamine itself, was behind the results was a big surprise, Gould says. © Society for Science & the Public 2000 - 2016
By Ann Gibbons We may not be raring to go on a Monday morning, but humans are the Energizer Bunnies of the primate world. That’s the conclusion of a new study that, for the first time, measures precisely how many calories humans and apes burn each day. Compared with chimpanzees and other apes, our revved-up internal engines burn calories 27% faster, according to a paper in Nature this week. This higher metabolic rate equips us to quickly fuel energy-hungry brain cells, sustaining our bigger brains. And lest we run out of gas when food is short, the study also found that humans are fatter than other primates, giving us energy stores to draw on in lean times. “The brilliant thing here is showing for the first time that we do have a higher metabolic rate, and we do use more energy,” says paleoanthropologist Leslie Aiello, president of the Wenner-Gren Foundation for Anthropological Research in New York City. “Humans during evolution have become more and more hypermetabolic,” says biological anthropologist Carel van Schaik of the University of Zurich in Switzerland. “We turned up the thermostat.” For decades, researchers assumed that “there weren’t any differences in the rate at which different species burned calories,” says biological anthropologist Herman Pontzer of Hunter College in New York City, lead author of the new study. Comparing humans and other primates, they saw little difference in basal metabolic rate, which reflects the total calories used by our organs while we are at rest. © 2016 American Association for the Advancement of Science
By Marta Zaraska Scientists and laypeople alike have historically attributed political beliefs to upbringing and surroundings, yet recent research shows that our political inclinations have a large genetic component. The largest recent study of political beliefs, published in 2014 in Behavior Genetics, looked at a sample of more than 12,000 twin pairs from five countries, including the U.S. Some were identical and some fraternal; all were raised together. The study reveals that the development of political attitudes depends, on average, about 60 percent on the environment in which we grow up and live and 40 percent on our genes. “We inherit some part of how we process information, how we see the world and how we perceive threats—and these are expressed in a modern society as political attitudes,” explains Peter Hatemi, a genetic epidemiologist at the University of Sydney and lead author of the study. The genes involved in such complex traits are difficult to pinpoint because each tends to be involved in a huge number of bodily and cognitive processes and plays only a minuscule role in shaping our political attitudes. Yet a study published in 2015 in the Proceedings of the Royal Society B managed to do just that, showing that genes encoding certain receptors for the neurotransmitter dopamine are associated with where we fall on the liberal-conservative axis. Among women who were highly liberal, 62 percent were carriers of certain receptor genotypes that have previously been associated with such traits as extroversion and novelty seeking. Meanwhile, among highly conservative women, the proportion was only 37.5 percent. © 2016 Scientific American
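Twin studies like the one above apportion a trait's variance by comparing identical (MZ) and fraternal (DZ) twin pairs. A minimal sketch of the classic Falconer decomposition into additive genetic (A), shared-environment (C), and unique-environment (E) components follows; the twin correlations used here are illustrative values chosen to produce a 40/60 genes-versus-environment split like the one the article reports, not the study's actual figures.

```python
def ace_from_twins(r_mz, r_dz):
    """Falconer's method: estimate A/C/E variance shares from
    identical (MZ) and fraternal (DZ) twin correlations.
    MZ twins share ~100% of their genes, DZ twins ~50%, so the
    gap between the two correlations reflects genetic influence."""
    a2 = 2 * (r_mz - r_dz)   # additive genetic share ("heritability")
    c2 = 2 * r_dz - r_mz     # environment shared by both twins
    e2 = 1 - r_mz            # environment unique to each twin
    return a2, c2, e2

# Illustrative (invented) correlations: MZ pairs correlate 0.6 on a
# political-attitude measure, DZ pairs 0.4.
a2, c2, e2 = ace_from_twins(r_mz=0.6, r_dz=0.4)
print(round(a2, 2), round(c2 + e2, 2))  # genes vs. total environment
```

With these inputs the genetic share comes out to 0.4 and the combined environmental share to 0.6, matching the proportions quoted in the article.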
By Jessica Lahey Before she became a neuroscientist, Mary Helen Immordino-Yang was a seventh-grade science teacher at a school outside Boston. One year, during a period of significant racial and ethnic tension at the school, she struggled to engage her students in a unit on human evolution. After days of apathy and outright resistance to Ms. Immordino-Yang’s teaching, a student finally asked the question that altered her teaching — and her career path — forever: “Why are early hominids always shown with dark skin?” With that question, one that connected the abstract concepts of human evolution and the very concrete, personal experiences of racial tension in the school, her students’ resistance gave way to interest. As she explained the connection between the effects of equatorial sunlight, melanin and skin color and went on to explain how evolutionary change and geography result in various human characteristics, interest blossomed into engagement, and something magical happened: Her students began to learn. Dr. Immordino-Yang’s eyes light up as she recounts this story in her office at the Brain and Creativity Institute at the University of Southern California. Now an associate professor of education, psychology and neuroscience, she understands the reason behind her students’ shift from apathy to engagement and, finally, to deep, meaningful learning. Her students learned because they became emotionally engaged in material that had personal relevance to them. Emotion is essential to learning, Dr. Immordino-Yang said, and should not be underestimated or misunderstood as a trend, or as merely the “E” in “SEL,” or social-emotional learning. Emotion is where learning begins, or, as is often the case, where it ends. Put simply, “It is literally neurobiologically impossible to think deeply about things that you don’t care about,” she said. © 2016 The New York Times Company
By Virginia Morell After defeating other males in boxing matches and winning a territorial roost—and a bevy of females—a male Seba’s short-tailed bat (Carollia perspicillata, pictured) might think his battles for reproductive rights are over. But the defeated males of this neotropical species have a trick up their sleeve: clandestine matings with willing females. The tactic works, and now researchers know why. Scientists studied bats in a captive colony in Switzerland, removing alpha males from their harems for 3 days, and examining their sperm—as well as that of their rivals. A previous study showed that the sneaky males have faster, longer-lived sperm, which gives them a leg up on the alpha male. Researchers had suspected this was because the sneakers produced this supersperm to compete. But the new study finds that after the 3 days of abstinence, the alpha male’s sperm is as agile and vigorous as that of his rivals. Thus, the team reports today in the Journal of Experimental Biology, the sneaky males aren’t generating special sperm—they just mate less, so their sperm is in better shape when it comes time to race to the egg. © 2016 American Association for the Advancement of Science.
By John Horgan I had to ask Anthony Bossis about bad trips. Bossis, a psychologist at New York University, belongs to an intrepid cadre of scientists reviving research into psychedelics’ therapeutic potential. I say “reviving” because research on psychedelics thrived in the 1950s and 1960s before being crushed by a wave of anti-psychedelic hostility and legislation. Psychedelics such as LSD, psilocybin and mescaline are still illegal in the U.S. But over the past two decades, researchers have gradually gained permission from federal and other authorities to carry out experiments with the drugs. Together with physicians Stephen Ross and Jeffrey Guss, Bossis has tested the potential of psilocybin, the primary active ingredient of “magic mushrooms,” to alleviate anxiety and depression in cancer patients. Journalist Michael Pollan described the work of Bossis and others in The New Yorker last year. Pollan said researchers at NYU and Johns Hopkins had overseen 500 psilocybin sessions and observed “no serious adverse effects.” Many subjects underwent mystical experiences, which consist of “feelings of unity, sacredness, ineffability, peace and joy,” as well as the conviction that you have discovered “an objective truth about reality.” Pollan’s report was so upbeat that I felt obliged to push back a bit, pointing out that not all psychedelic experiences, or mystical ones, are consoling. In The Varieties of Religious Experience, William James emphasized that some mystics have “melancholic” or “diabolical” visions, in which ultimate reality appears terrifyingly alien and uncaring. Taking psychedelics in a supervised research setting doesn’t entirely eliminate the risk of a bad trip. That lesson emerged from a study in the early 1990s by psychiatrist Rick Strassman, who injected dimethyltryptamine (DMT) into human volunteers. © 2016 Scientific American
By Gretchen Reynolds Young rats prone to obesity are much less likely to fulfill that unhappy destiny if they run during adolescence than if they do not, according to a provocative new animal study of exercise and weight. They also were metabolically healthier, and had different gut microbes, than rats that kept the weight off by cutting back on food, the study found. The experiment was done in rodents, not people, but it does raise interesting questions about just what role exercise may play in keeping obesity at bay. For some time, many scientists, dieting gurus and I have been pointing out that exercise by itself tends to be ineffective for weight loss. Study after study has found that if overweight people start working out but do not also reduce their caloric intake, they shed little if any poundage and may gain weight. The problem, most scientists agree, is that exercise increases appetite, especially in people who are overweight, and also can cause compensatory inactivity, meaning that people move less overall on days when they exercise. Consequently, they wind up burning fewer daily calories, while also eating more. You do the math. But those discouraging studies involved weight loss. There has been much less examination of whether exercise might help to prevent weight gain in the first place and, if it does, how it compares to calorie restriction for that purpose. So for the new study, which was published last week in Medicine & Science in Sports & Exercise, researchers at the University of Missouri in Columbia and other schools first gathered rats from a strain that has an inborn tendency to become obese, starting in adolescence. (Adolescence is also when many young people begin to add weight.) © 2016 The New York Times Company
Link ID: 22178 - Posted: 05.04.2016
By Helen Briggs BBC News The Labrador retriever, known as one of the greediest breeds of dog, is hard-wired to overeat, research suggests. The dog is more likely to become obese than other breeds partly because of its genes, scientists at Cambridge University say. The gene affected is thought to be important in controlling how the brain recognises hunger and the feeling of being full after eating. The research could help in the understanding of human obesity. "About a quarter of pet Labradors carry this gene [difference]," lead researcher Dr Eleanor Raffan told the BBC. "Although obesity is the consequence of eating more than you need and more than you burn off in exercise, actually there's some real hard-wired biology behind our drive to eat," she added.
Lifestyle factors
Canine obesity mirrors the human obesity epidemic, with lifestyle factors such as lack of exercise and high-calorie food both implicated - as well as genetics. Between 34% and 59% of dogs in rich countries are now overweight. The Labrador has the highest levels of obesity and has been shown to be more obsessed with food than other breeds. Researchers screened more than 300 Labradors kept as pets or assistance dogs for known obesity genes in the study, published in the journal Cell Metabolism. The international team found that a change in a gene known as POMC was strongly linked with weight, obesity and appetite in Labradors and Flat-Coated retrievers. In both breeds, for each copy of the gene carried, the dog was on average 2kg heavier. Other breeds of dog - from the Shih Tzu to the Great Dane - were also screened, but the genetic difference was not found. However, the variation was more common in Labradors working as assistance dogs, which the researchers say might be because these dogs are easier to train by rewarding with food. © 2016 BBC.
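The reported effect is additive: each extra copy of the POMC variant adds roughly 2kg of body mass on average. A minimal sketch of how such a per-copy effect is estimated, as the least-squares slope of body mass on copy number (0, 1 or 2), follows; the genotype and weight values are invented for illustration and are not the study's data.

```python
# Hypothetical dogs: POMC-variant copies carried (0, 1 or 2) and
# body mass in kg. Values are made up purely for illustration.
copies = [0, 0, 1, 1, 2, 2]
mass_kg = [30.0, 31.0, 32.0, 33.0, 34.0, 35.0]

def per_copy_effect(x, y):
    """Least-squares slope of y on x: the average change in body
    mass associated with each extra copy of the variant."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

print(per_copy_effect(copies, mass_kg))  # ~2 kg per copy for this toy data
```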
By Sarah Kaplan The ancient Greeks spoke of a mythological society composed entirely of warrior women. The medieval traveler John Mandeville wrote of a place whose female rulers "never would suffer man to dwell amongst them." "Paradise Island," home of Wonder Woman, was a feminist utopia where no one with a Y chromosome was allowed. Sadly, those places only exist in fiction. But something like them does exist in the real world. It's in a wetland in rural Ohio. And it's full of salamanders. "They’re pretty incredible," said Robert Denton, a biologist at Ohio State who studies an unusual group of salamander species that literally don't need men. These creatures – all female – reproduce by cloning themselves. To keep their gene pool diverse, they sometimes "steal" sperm left behind on trees and leaves by male salamanders of other species and incorporate that DNA into their offspring. Most sexually reproducing organisms have two sets of chromosomes to make up their genome – one from each parent. But one of these strange salamanders can have between two and five times that much genetic material lying in wait within her cells. It's as if they have multiple genomes to fall back on, and that's made them incredibly successful. "Polyploid" salamanders have been around some 6 million years, Denton said — far longer than most other animal species that reproduce asexually. Since a lack of diversity means having a smaller arsenal of genetic variation to fall back on when living conditions change, these groups usually go extinct relatively quickly. © 1996-2016 The Washington Post
By Susan Milius There’s nothing like a guy doing all the child care to win female favor, even among giant water bugs. Thumbnail-sized Appasus water bugs have become an exemplar species for studying paternal care. After mating, females lay eggs on a male’s back and leave him to swim around for weeks tending his glued-on load. For an A. major water bug, lab tests show an egg burden can have the sweet side of attracting more females, researchers in Japan report May 4 in Royal Society Open Science. Given a choice of two males, females strongly favored, and laid more eggs on, the one already hauling around 10 eggs rather than the male that researchers had scraped eggless. Females still favored a well-egged male even when researchers offered two males that a female had already considered, but with their egg-carrying roles switched from the previous encounter. That formerly spurned suitor this time triumphed. A similar preference, though not as clear-cut, showed up in the slightly smaller and lighter A. japonicus giant water bug. “We conclude that sexual selection plays an important role in the maintenance of elaborate paternal care,” says study coauthor Shin-ya Ohba of Nagasaki University. © Society for Science & the Public 2000 - 2016
By Emily Benson Baby birds are sometimes known to shove their siblings out of the nest to gain their parents’ undivided attention, but barn owl chicks appear to be more altruistic. Scientists recorded the hissing calls of hungry and full barn owl nestlings (Tyto alba, pictured), then played the sounds back to single chicks settled in nests stocked with mice. The young owls that heard the squawks of their hungry kin delayed eating each rodent by an average of half an hour; those that heard cries indicating their invisible nest-mate was full ate the mice more quickly. The findings suggest that barn owl chicks give hungrier siblings a chance to eat first even when the nest is full of food, the researchers will report in an upcoming issue of Behavioral Ecology and Sociobiology. So is it true altruism? Maybe not. Nestlings may share food in exchange for help with grooming or to get the first crack at a later meal, the team says, suggesting a possible ulterior motive. © 2016 American Association for the Advancement of Science
Link ID: 22174 - Posted: 05.04.2016
By Laura Sanders Iron, says aging expert Naftali Raz, is like the Force. It can be good or bad, depending on the context. When that context is the human brain, though, scientists wrangle over whether iron is a dark force for evil or a bright source of support. Some iron is absolutely essential for the brain. On that, scientists agree. But recent studies suggest to some researchers that too much iron, and the chemical reactions that ensue, can be dangerous or deadly, especially to nerve cells in the vulnerable brain area that deteriorates with Parkinson’s disease. Yet other work raises the possibility that those cells die because of lack of iron, rather than too much. “There are a lot of surprises in this field,” says iron biologist Nancy Andrews of Duke University. The idea that too much iron is dangerous captivates many researchers, including analytical neurochemist Dominic Hare of the University of Technology Sydney. “All of life is a chemical reaction,” he says, “so the start of disease is a chemical reaction as well.” And as Raz points out, reactions involving iron are both life-sustaining and dangerous. “Iron is absolutely necessary for conducting the very fundamental business in every cell,” says Raz, of Wayne State University in Detroit. It helps produce energy-storing ATP molecules. And that’s a dirty job, throwing off dangerous free radicals that can cause cellular mayhem as energy is made. But those free radicals are not the most worrisome aspect of iron, Hare believes. “The reaction that is much more dangerous is the reaction you get when iron and dopamine come together,” he says. © Society for Science & the Public 2000 - 2016.
Link ID: 22173 - Posted: 05.03.2016
By Sarah Kaplan Scientists have known for a while that stereotypes warp our perceptions of things. Implicit biases — those unconscious assumptions that worm their way into our brains, without our full awareness and sometimes against our better judgment — can influence grading choices from teachers, split-second decisions by police officers and outcomes in online dating. We can't even see the world without filtering it through the lens of our assumptions, scientists say. In a study published Monday in the journal Nature Neuroscience, psychologists report that the neurons that respond to things such as sex, race and emotion are linked by stereotypes, distorting the way we perceive people's faces before that visual information even reaches our conscious brains. "The moment we actually glimpse another person ... [stereotypes] are biasing that processing in a way that conforms to our already existing expectations," said Jonathan Freeman, a psychology professor at New York University and one of the authors of the report. Responsibility lies in two far-flung regions of the brain: the orbital frontal cortex, which rests just above the eyes and is responsible for rapid visual predictions and categorizations, and the fusiform cortex, which sits in the back of the brain and is involved in recognizing faces. When Freeman and his co-author, Ryan Stolier, had 43 participants look at images of faces in a brain scanner, they noticed that neurons seemed to be firing in similar patterns in both parts of the brain, suggesting that information from each part was influencing the other.
By Jennifer Jolly Every January for the past decade, Jessica Irish of Saline, Mich., has made the same New Year’s Resolution: to “cut out late night snacking and lose 30 pounds.” Like millions of Americans, Ms. Irish, 31, usually makes it about two weeks. But this year is different. “I’ve already lost 18 pounds,” she said, “and maintained my diet more consistently than ever. Even more amazing — I rarely even think about snacking at night anymore.” Ms. Irish credits a new wearable device called Pavlok for doing what years of diets, weight-loss programs, expensive gyms and her own willpower could not. Whenever she takes a bite of the foods she wants to avoid, like chocolate or Cheez-Its, she uses the Pavlok to give herself a lightning-quick electric shock. “Every time I took a bite, I zapped myself,” she said. “I did it five times on the first night, two times on the second night, and by the third day I didn’t have any cravings anymore.” As the name suggests, the $199 Pavlok, worn on the wrist, uses the classic theory of Pavlovian conditioning to create a negative association with a specific action. Next time you smoke, bite your nails or eat junk food, one tap of the device or a smartphone app will deliver a shock. The zap lasts only a fraction of a second, though the severity of the shock is up to you. It can be set between 50 volts, which feels like a strong vibration, and 450 volts, which feels like getting stung by a bee with a stinger the size of an ice pick. (By comparison, a police Taser typically releases about 50,000 volts.) Other gadgets and apps dabble in behavioral change by way of aversion therapy, such as the $49 MotivAider that is worn like a pager, or the $99 RE-vibe wristband. Both can be set to vibrate at specific intervals as a reminder of a habit to break or a goal to reach. The $80 Lumo Lift posture coach is a wearable disk that vibrates when you slouch. 
The $150 Spire clip-on sensor tracks physical activity and state of mind by detecting users’ breathing patterns. If it detects you’re stressed or anxious, it vibrates or sends a notification to your smartphone to take a deep breath. © 2016 The New York Times Company
Keyword: Learning & Memory
Link ID: 22171 - Posted: 05.03.2016
By Julia Shaw In the last couple of years memory science has really upped its game. I generally write about social processes that can change our memories, but right now I can’t help but get excited that memory science is getting an incredible new toy to play with. A toy that I believe will revolutionise how we talk about, and deal with, memory. This not-so-new sounding, but totally-newly-applied, neuroscience toy is ultrasound. Ultrasound is also called sonography and is essentially a type of ‘medical sonar’. It has revolutionised medicine since the 1940s, giving us the ability to look into the body in a completely safe way (without leaving icky radiation behind, like X-rays). Beyond predicting whether your baby shower will be blue or pink, lesser known applications of ultrasound include the ability to essentially burn and destroy cells inside your body. As such, it has been successfully used to do surgery without making any cuts into the human body. This is a technique that has been used to remove cancerous cells while not affecting any of the surrounding tissue, and without any of the side-effects associated with other kinds of cancer treatment. This is referred to by scientist Yoav Medan as focused ultrasound. If you are unfamiliar with this, you need to watch this TED talk. Non-invasive procedures like this are the future of surgery. Non-invasive procedures are also the future of neuroscience. It is at this point that we find ourselves at the application of this astonishing science to memory research. © 2016 Scientific American
By Karen Weintraub The four members of Asperger’s Are Us decided a long time ago that their main goal would be to amuse themselves. But after nearly a decade of laughing and writing punch lines together, Asperger’s Are Us, which is probably the only comedy troupe made up of people on the autism spectrum, is on the cusp of comedic success. A documentary about the group premiered at the SXSW conference in Austin in March and was recently sold to Netflix. The troupe is also preparing for its first national tour this summer. Comedy might be a surprising choice for someone with Asperger’s syndrome, since stereotypically, people with autism are generally regarded as socially awkward loners. But the four men in the group bonded at summer camp 11 years ago, when one was a counselor and the other three were campers, and are clearly great friends. Talking recently via Skype, Noah Britton, the former counselor, settles giant black rabbit ears onto his head. Jack Hanke, another member of the troupe, dons his favorite sombrero – the black one he took with him to Oxford University during his recent junior year abroad – accessorized with a red sombrero on top. They slip into their usual banter when asked what they thought of the film, named for the group, which will be shown publicly for the first time on Friday at the Somerville Theater outside of Boston. “I liked the four weird guys in it,” Mr. Britton said. “It was better than ‘Jaws 2,’ but not as good as ‘Jaws 3,’” Mr. Hanke insisted. “I found it kind of annoying myself,” added Ethan Finlan, another member of the group. The fourth member, who changed his first name to New Michael to distinguish himself from his father, Michael Ingemi, didn’t want to join the call. © 2016 The New York Times Company
Link ID: 22169 - Posted: 05.03.2016
By Gina Kolata Danny Cahill stood, slightly dazed, in a blizzard of confetti as the audience screamed and his family ran on stage. He had won Season 8 of NBC’s reality television show “The Biggest Loser,” shedding more weight than anyone ever had on the program — an astonishing 239 pounds in seven months. When he got on the scale for all to see that evening, Dec. 8, 2009, he weighed just 191 pounds, down from 430. Dressed in a T-shirt and knee-length shorts, he was lean, athletic and as handsome as a model. “I’ve got my life back,” he declared. “I mean, I feel like a million bucks.” Mr. Cahill left the show’s stage in Hollywood and flew directly to New York to start a triumphal tour of the talk shows, chatting with Jay Leno, Regis Philbin and Joy Behar. As he heard from fans all over the world, his elation knew no bounds. But in the years since, more than 100 pounds have crept back onto his 5-foot-11 frame despite his best efforts. In fact, most of that season’s 16 contestants have regained much if not all the weight they lost so arduously. Some are even heavier now. Yet their experiences, while a bitter personal disappointment, have been a gift to science. A study of Season 8’s contestants has yielded surprising new discoveries about the physiology of obesity that help explain why so many people struggle unsuccessfully to keep off the weight they lose. Kevin Hall, a scientist at a federal research center who admits to a weakness for reality TV, had the idea to follow the “Biggest Loser” contestants for six years after that victorious night. The project was the first to measure what happened to people over as long as six years after they had lost large amounts of weight with intensive dieting and exercise. © 2016 The New York Times Company
Link ID: 22168 - Posted: 05.02.2016
By Patricia Neighmond Hoping to keep your mental edge as you get older? Look after your heart, a recent analysis suggests, and your brain will benefit, too. A research team led by Hannah Gardener, an epidemiologist at the University of Miami, analyzed a subset of data from the Northern Manhattan Study, a large, ongoing study of risk factors for stroke among whites, blacks and Hispanics living in the Washington Heights neighborhood of New York City. The scientists wanted to see how people in their 60s and 70s would do on repeated tests of memory and mental acuity six years later — and, specifically, what sort of subtle differences a heart-healthy lifestyle might make to the brain, beyond the prevention of strokes. Their findings appear in a recent issue of the Journal of the American Heart Association. In this particular study, the researchers started with more than a thousand people who'd had their cardiovascular health assessed using measures that the American Heart Association has dubbed Life's Simple 7. These seven factors known to benefit the heart and blood vessels include maintaining a normal body weight and good nutrition, not smoking, getting exercise regularly and keeping blood pressure, cholesterol and blood sugar levels under control. To measure thinking skills, Gardener's team used a variety of tests of memory, judgment, the ability to plan, mental quickness and other sorts of problem solving. The results were striking: Across all demographic groups, the people who had higher scores on the measures of cardiovascular health did better on the mental tests than those who scored low. © 2016 npr
Link ID: 22167 - Posted: 05.02.2016
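Life's Simple 7 is commonly turned into a composite by grading each of the seven factors 0 (poor), 1 (intermediate) or 2 (ideal) and summing to a 0-14 score, with higher-scoring groups compared against lower-scoring ones. The sketch below shows that composite in simplified form; the actual AHA definitions grade each metric against specific clinical cutoffs (e.g., blood pressure and cholesterol thresholds) that are omitted here.

```python
# The seven AHA metrics; each is graded 0 (poor), 1 (intermediate)
# or 2 (ideal), giving a composite score from 0 to 14.
SIMPLE7 = ("smoking", "body_mass_index", "physical_activity",
           "diet", "total_cholesterol", "blood_pressure", "blood_glucose")

def simple7_score(grades):
    """Sum per-metric grades into a composite cardiovascular health
    score; higher is healthier (maximum 14)."""
    missing = set(SIMPLE7) - set(grades)
    if missing or not all(g in (0, 1, 2) for g in grades.values()):
        raise ValueError("need a 0/1/2 grade for each of the 7 metrics")
    return sum(grades[m] for m in SIMPLE7)

# A hypothetical participant: never-smoker with ideal weight and
# glucose but only intermediate activity, diet, cholesterol and BP.
example = {"smoking": 2, "body_mass_index": 2, "physical_activity": 1,
           "diet": 1, "total_cholesterol": 1, "blood_pressure": 1,
           "blood_glucose": 2}
print(simple7_score(example))  # 10 out of a possible 14
```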
By Scott Barry Kaufman "Just because a diagnosis [of ADHD] can be made does not take away from the great traits we love about Calvin and his imaginary tiger friend, Hobbes. In fact, we actually love Calvin BECAUSE of his ADHD traits. Calvin’s imagination, creativity, energy, lack of attention, and view of the world are the gifts that Mr. Watterson gave to this character." -- The Dragonfly Forest In his 2004 book "Creativity is Forever", Gary Davis reviewed the creativity literature from 1961 to 2003 and identified 22 recurring personality traits of creative people. This included 16 "positive" traits (e.g., independent, risk-taking, high energy, curiosity, humor, artistic, emotional) and 6 "negative" traits (e.g., impulsive, hyperactive, argumentative). In her own review of the creativity literature, Bonnie Cramond found that many of these same traits overlap to a substantial degree with behavioral descriptions of Attention Deficit Hyperactivity Disorder (ADHD), including higher levels of spontaneous idea generation, mind wandering, daydreaming, sensation seeking, energy, and impulsivity. Research since then has supported the notion that people with ADHD characteristics are more likely to reach higher levels of creative thought and achievement than people without these characteristics, a finding replicated across numerous studies. Recent research by Darya Zabelina and colleagues has found that real-life creative achievement is associated with the ability to broaden attention and have a “leaky” mental filter, something in which people with ADHD excel. © 2016 Scientific American
Link ID: 22166 - Posted: 05.02.2016
Symptoms of depression that steadily increase over time in older age could indicate early signs of dementia, scientists have said. Other patterns of symptoms, such as chronic depression, appear not to be linked, a study found. Dutch researchers looked at different ways depression in older adults progressed over time and how this related to any risk. They concluded worsening depression may signal the condition is taking hold. The research, published in The Lancet Psychiatry, followed more than 3,000 adults aged 55 and over living in the Netherlands. All had depression but no symptoms of dementia at the start of the study. Dr M Arfan Ikram of the Erasmus University Medical Center in Rotterdam said depressive symptoms that gradually increase over time appear to be a better predictor of dementia later in life than other paths of depression. "There are a number of potential explanations, including that depression and dementia may both be symptoms of a common underlying cause, or that increasing depressive symptoms are on the starting end of a dementia continuum in older adults," he said. Only the group whose symptoms of depression increased over time were found to be at increased risk of dementia - about one in five people (55 out of 255) in this group developed dementia. Others who had symptoms that waxed and waned or stayed the same were not at increased risk. For example, in those who experienced low but stable levels of depression, around 10% went on to develop dementia. The exact nature of the link between depression and dementia risk remains unknown. © 2016 BBC