Most Recent Links
Philip Ball James Frazer’s classic anthropological study The Golden Bough contains a harrowing chapter on human sacrifice in rituals of crop fertility and harvest among historical cultures around the world. Frazer describes sacrificial victims being crushed under huge toppling stones, slow-roasted over fires and dismembered alive. Frazer’s methods of analysis wouldn't all pass muster among anthropologists today (his work was first published in 1890), but it is hard not to conclude from his descriptions that what industrialized societies today would regard as the most extreme psychopathy has in the past been seen as normal — and indeed sacred — behaviour. In almost all societies, killing within a tribe or clan has been strongly taboo; exemption is granted only to those with great authority. Anthropologists have suspected that ritual human sacrifice serves to cement power structures — that is, it signifies who sits at the top of the social hierarchy. The idea makes intuitive sense, but until now there has been no clear evidence to support it. In a study published in Nature, Joseph Watts, a specialist in cultural evolution at the University of Auckland in New Zealand, and his colleagues have analysed 93 traditional cultures in Austronesia (the region that loosely embraces the many small and island states in the Pacific and Indonesia) as they were before they were influenced by colonization and major world religions (generally in the late 19th and early 20th centuries). © 2016 Nature Publishing Group
Keyword: Aggression; Evolution
Link ID: 22070 - Posted: 04.05.2016
Feel like you haven’t slept in ages? If you’re one of the 5 per cent of the population who has severe insomnia – trouble sleeping for more than a month – then your brain’s white matter might be to blame. The cell bodies and synapses of our brain cells make up our brain’s grey matter, while bundles of their tails that connect one brain region to another make up the white matter. These nerve cell tails – axons – are cloaked in a fatty myelin sheath that helps transmit signals. Radiologist Shumei Li from Guangdong No. 2 Provincial People’s Hospital in Guangzhou, China, and her team scanned the brains of 30 healthy sleepers and 23 people with severe insomnia using diffusion tensor imaging MRI. This imaging technique lights up the white matter circuitry. They found that in the brains of the people with severe insomnia, the regions in the right hemisphere that support learning, memory, smell and emotion were less well connected than in healthy sleepers. They attribute this breakdown in circuitry to the loss of the myelin sheath in the white matter. A study in November suggested that smoking could be one cause for myelin loss. The team also found that the insomniacs had poorer connections in the white matter of the thalamus, a brain region that regulates consciousness, alertness and sleep. The study proposes a potential mechanism for insomnia but there could be other factors, says Max Wintermark, a radiologist at Stanford. He says it’s not possible to say whether the poor connections are the cause or the result of insomnia. © Copyright Reed Business Information Ltd.
Keyword: Sleep
Link ID: 22069 - Posted: 04.05.2016
Laura Sanders NEW YORK — Sometimes forgetting can be harder than remembering. When people forced themselves to forget a recently seen image, select brain activity was higher than when they tried to remember that image. Forgetting is often a passive process, one in which the memory slips out of the brain, Tracy Wang of the University of Texas at Austin said April 2 at the annual meeting of the Cognitive Neuroscience Society. But in some cases, forgetting can be deliberate. Twenty adults saw images of faces, scenes and objects while an fMRI scanner recorded their brains’ reactions to the images. If instructed to forget the preceding image, people were less likely to remember that image later. Researchers used the scan data to build a computer model that could infer how strongly the brain responds to each particular kind of image. In the ventral temporal cortex, a part of the brain above the ear, brain patterns elicited by a particular image were stronger when a participant was told to forget the sight than when instructed to remember it. Of course, everyone knows that it’s easy to forget something without even trying. But these results show that intentional forgetting isn’t a passive process — the brain has to actively work to wipe out a memory on purpose. Citations T.H. Wang et al. Forgetting is more work than remembering. Annual meeting of the Cognitive Neuroscience Society, New York City, April 2, 2016. © Society for Science & the Public 2000 - 2016
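The summary does not say what kind of computer model the researchers built, but the general approach, often called pattern decoding, can be illustrated with a simple classifier. The sketch below uses synthetic data and made-up dimensions, with scikit-learn's logistic regression standing in for whatever model the team actually used; the decoder's class probabilities serve as a graded readout of how strongly a face, scene or object is represented in a voxel pattern.

```python
# Illustrative sketch only: the conference summary does not describe the
# team's actual model, so this shows the general "pattern decoding" idea
# with synthetic data and made-up dimensions. scikit-learn's
# LogisticRegression stands in for whatever classifier was really used.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

n_trials, n_voxels = 300, 500            # pretend ventral temporal cortex voxels
categories = np.array(["face", "scene", "object"])
labels = rng.integers(0, 3, n_trials)    # which image category was shown per trial

# Synthetic voxel patterns: each category gets a weak spatial signature plus noise.
signatures = rng.normal(0.0, 1.0, (3, n_voxels))
patterns = signatures[labels] + rng.normal(0.0, 3.0, (n_trials, n_voxels))

# Fit a decoder that maps voxel patterns to image categories.
decoder = LogisticRegression(max_iter=1000).fit(patterns, labels)

# For a new trial, the class probabilities act as a graded measure of how
# strongly each category is represented in the activity pattern.
new_pattern = signatures[0] + rng.normal(0.0, 3.0, n_voxels)
for name, p in zip(categories, decoder.predict_proba(new_pattern.reshape(1, -1))[0]):
    print(f"{name}: {p:.2f}")
```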
Keyword: Learning & Memory
Link ID: 22068 - Posted: 04.05.2016
Mo Costandi This spectacular image – which took the best part of a year to create – shows the fine structure of a nerve terminal at high resolution, revealing, for the very first time, an intricate network of fine filaments that controls the movements of synaptic vesicles. The brain is soft and wet, with the consistency of a lump of jelly. Yet it is the most complex and highly organized structure that we know of, containing hundreds of billions of neurons and glial cells, and something on the order of one quadrillion synaptic connections, all of which are arranged in a very specific manner. This high degree of specificity extends down to the deepest levels of brain organization. Just beneath the membrane at the nerve terminal, synaptic vesicles store neurotransmitter molecules and await the arrival of a nervous impulse, whereupon they fuse with the membrane and release their contents into the synaptic cleft, the minuscule gap at the junction between nerve cells; the neurotransmitter molecules then diffuse across the cleft to bind to receptor proteins embedded in the surface of the partner cell. [3D model of a nerve terminal in atomic detail] The process of neurotransmitter release is tightly orchestrated. Ready vesicles are ‘docked’ in the ‘active zone’ lying beneath the cell membrane, and are depleted when they fuse with the membrane, only to be replenished from a reservoir of pre-prepared vesicles located further inside the cell. Spent vesicles are quickly pulled back out of the membrane, reformed, refilled with neurotransmitter molecules, and then returned to the reservoir, so that they can be shuttled into the active zone when needed. An individual nerve cell may use up hundreds, or perhaps thousands, of vesicles every second, and so this recycling process occurs continuously to maintain the signalling between nerve cells. © 2016 Guardian News and Media Limited
Keyword: Development of the Brain
Link ID: 22067 - Posted: 04.04.2016
Helen Shen Clamping an electrode to the brain cell of a living animal to record its electrical chatter is a task that demands finesse and patience. Known as ‘whole-cell patch-clamping’, it is reputedly the “finest art in neuroscience”, says neurobiologist Edward Boyden, and one that only a few dozen laboratories around the world specialize in. But researchers are trying to demystify this art by turning it into a streamlined, automated technique that any laboratory could attempt, using robotics and downloadable source code. “Patch-clamping provides a unique view into neural circuits, and it’s a very exciting technique but is really underused,” says neuroscientist Karel Svoboda at the Howard Hughes Medical Institute’s Janelia Research Campus in Ashburn, Virginia. “That’s why automation is a really, really exciting direction.” On 3 March, Boyden, at the Massachusetts Institute of Technology in Cambridge, and his colleagues published detailed instructions on how to assemble and operate an automated system for whole-cell patch-clamping, a concept that they first described in 2012. The guide represents the latest fruits of Boyden’s partnership with the laboratory of Craig Forest, a mechanical engineer at the Georgia Institute of Technology in Atlanta who specializes in robotic automation for research. © 2016 Nature Publishing Group
Keyword: Brain imaging
Link ID: 22066 - Posted: 04.04.2016
By DONALD G. McNEIL Jr The World Health Organization said on Thursday that there is “strong scientific consensus” that Zika virus is a cause of microcephaly, unusually small heads with brain damage in infants, as well as other neurological disorders. Yet a surge in microcephaly has been reported only in Brazil; a small increase was reported in French Polynesia, and a cluster of 32 cases is now under investigation in Colombia. For proof of the connection between infection with the virus and birth defects, scientists are waiting for the results of a large study of 5,000 pregnant women, most of them in Colombia. Women with past Zika infections will be compared with similar women without infections to see if they have more microcephalic children. The epidemic peaked in Colombia in early February, according to the W.H.O. Most of the women in the study are due to give birth in May and June. Virtually all public health agencies already believe the virus is to blame for these birth defects and are giving medical advice based on that assumption. Here are the lines of evidence they cite. As early as last August, hospitals in northeast Brazil realized that something unheard of was happening: Neonatal wards that normally saw one or two microcephalic babies a year were seeing five or more at the same time. Doctors learned from the mothers that many of them had had Zika symptoms months earlier. © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 22065 - Posted: 04.04.2016
The mystery is starting to untangle. It has long been known that twisted fibres of a protein called tau collect in the brain cells of people with Alzheimer’s, but their exact role in the disease is unclear. Now a study in mice has shown how tau interferes with the strengthening of connections between neurons – the key mechanism by which we form memories. In healthy cells, the tau protein helps to stabilise microtubules that act as rails for transporting materials around the cell. In people with Alzheimer’s, these proteins become toxic, but an important unanswered question is what forms of tau are toxic: the tangles may not be the whole story. In the new study, Li Gan and her colleagues at the Gladstone Institute of Neurological Disease in San Francisco found that the brains of those with Alzheimer’s have high levels of tau with a particular modification, called acetylated tau. They then looked at what acetylated tau does in a mouse model of Alzheimer’s, finding that it accumulates at synapses – the connections between neurons. When we form memories, synapses become strengthened through extra receptors inserted into the cell membranes, and this heightens their response. But acetylated tau depletes another protein called KIBRA, which is essential for this synapse-strengthening mechanism. “We’re excited because we think we now have a handle on the link between tau and memory,” says Gan. “We’re also cautious because we know this may not be the only link. It’s still early days in understanding the mechanism.” © Copyright Reed Business Information Ltd.
Keyword: Alzheimers; Learning & Memory
Link ID: 22064 - Posted: 04.04.2016
By Rachel Zelniker A long dark winter can be mentally and physically exhausting, but a recent study published in the journal Clinical Psychological Science challenges the idea that it's making people depressed. Seasonal affective disorder (SAD) is commonly believed to affect a significant portion of the population in the Northern Hemisphere during the darker winter months. As many as 35 per cent of Canadians complain of having the "winter blues," according to the Centre for Addiction and Mental Health. Another 10 to 15 per cent have a mild form of seasonal depression, while about two to five per cent of Canadians will have a severe, clinical form of SAD. The disorder is based on the theory that some depressions occur seasonally in response to reduced sunlight — but recent research says that theory may be unsubstantiated. "We conducted a study using data that looked at the relationship between depression in a fairly large sample of people distributed over several degrees latitude in the United States," said Steven G. LoBello, a psychology professor at Auburn University in Montgomery, Ala., and one of the study's authors. "We looked across the four seasons to see if there was an association with sunlight, and we simply didn't find a direct relationship with sunlight, the seasons, or latitude." LoBello's study does not look at populations north of the 49th parallel, but he is confident his findings hold. "We cite in our paper a paper by [Vidje Hansen] that looked at this problem in Norway, which is north of the Arctic Circle, and they experience the polar night." According to LoBello, that research "did not find any relationship between an increase in depression and the duration of the polar night." A "seasonal pattern" modifier for depression diagnoses was officially added to the Diagnostic and Statistical Manual of Mental Disorders (DSM) in 1987. ©2016 CBC/Radio-Canada.
Keyword: Depression; Biological Rhythms
Link ID: 22063 - Posted: 04.04.2016
By Emily Underwood More than 99% of clinical trials for Alzheimer’s drugs have failed, leading many to wonder whether pharmaceutical companies have gone after the wrong targets. Now, research in mice points to a potential new target: a developmental process gone awry, which causes some immune cells to feast on the connections between neurons. “It is beautiful new work,” which “brings into light what’s happening in the early stage of the disease,” says Jonathan Kipnis, a neuroscientist at the University of Virginia School of Medicine in Charlottesville. Most new Alzheimer’s drugs aim to eliminate β amyloid, a protein that forms telltale sticky plaques around neurons in people with the disease. Those with Alzheimer’s tend to have more of these deposits in their brains than do healthy people, yet more plaques don’t always mean more severe symptoms such as memory loss or poor attention, says Beth Stevens of Boston Children’s Hospital, who led the new work. What does track well with the cognitive decline seen in Alzheimer’s disease—at least in mice that carry genes that confer high risk for the condition in people—is a marked loss of synapses, particularly in brain regions key to memory, Stevens says. These junctions between nerve cells are where neurotransmitters are released to spark the brain’s electrical activity. Stevens has spent much of her career studying a normal immune mechanism that prunes weak or unnecessary synapses as the brain matures from the womb through adolescence, allowing more important connections to become stronger. In this process, a protein called C1q sets off a series of chemical reactions that ultimately mark a synapse for destruction. After a synapse has been “tagged,” immune cells called microglia—the brain’s trash disposal service—know to “eat” it, Stevens says. © 2016 American Association for the Advancement of Science
Keyword: Alzheimers; Neuroimmunology
Link ID: 22062 - Posted: 04.01.2016
Quirin Schiermeier & Alison Abbott The ability to study brain processes in real time is one of the goals of the Human Brain Project's newly released computing tools. Europe’s major brain-research project has unveiled a set of prototype computing tools and called on the global neuroscience community to start using them. The move marks the end of the 30-month ramp-up phase of the Human Brain Project (HBP), and the start of its operational phase. The release of the computing platforms — which include brain-simulation tools, visualization software and a pair of remotely accessible supercomputers to study brain processes in real time — could help to allay concerns about the €1-billion (US$1.1-billion) project’s benefits to the wider scientific community. “The new platforms open countless new possibilities to analyse the human brain,” said Katrin Amunts, a neuroscientist at the Jülich Research Centre in Germany and a member of the project’s board of directors, at a press conference on 30 March. “We are proud to offer the global brain community a chance to participate.” But it is not clear how the platforms — some freely accessible, others available only on the success of a peer-reviewed application — will resonate with brain researchers outside the project. “At this point, no one can say whether or not the research platforms will be a success,” says Andreas Herz, chair of computational neuroscience at the Ludwig Maximilian University of Munich in Germany. © 2016 Nature Publishing Group
Keyword: Brain imaging
Link ID: 22061 - Posted: 04.01.2016
By BENEDICT CAREY Some scientists studying the relationship between contact sports and memory or mood problems later in life argue that cumulative exposure to hits that cause a snap of the head — not an athlete’s number of concussions — is the most important risk factor. That possibility is particularly worrisome in football, in which frequent “subconcussive” blows are unavoidable. On Thursday, researchers based at Boston University reported the most rigorous evidence to date that overall exposure to contact in former high school and college football players could predict their likelihood of experiencing problems like depression, apathy or memory loss years later. The finding, appearing in The Journal of Neurotrauma, is not conclusive, the authors wrote. Such mental problems can stem from a variety of factors in any long life. Yet the paper represents researchers’ first attempt to precisely calculate cumulative lifetime exposure to contact in living players, experts said. Previous estimates had relied in part on former players’ memories of concussions, or number of years played. The new paper uses more objective measures, including data from helmet accelerometer studies, and provides a glimpse of where the debate over the risk of contact sports may next play out, the experts said. “They used a much more refined and quantitative approach to estimate exposure than I’ve seen in this area,” said John Meeker, a professor of environmental health sciences at the University of Michigan School of Public Health, who was not a part of the research team. But he added, “Their methods will have to be validated in much larger studies; this is very much a preliminary finding.” The study did not address the risk of chronic traumatic encephalopathy, or C.T.E., a degenerative scarring in the brain tied to head blows, which can be diagnosed only after death. © 2016 The New York Times Company
Keyword: Brain Injury/Concussion
Link ID: 22060 - Posted: 04.01.2016
Meghan Rosen Despite massive public health campaigns, the rise in worldwide obesity rates continues to hurtle along like a freight train on greased tracks. In 2014, more than 640 million men and women were obese (measured as a body mass index of 30 or higher). That’s up from 105 million in 1975, researchers estimate in the April 2 Lancet. The researchers analyzed four decades of height and weight data for more than 19 million adults, and then calculated global rates based on population data. On average, people worldwide are gaining about 1.5 kilograms per decade — roughly the weight of a half-gallon of ice cream. But the road isn’t entirely rocky. During the same time period, average life expectancy also jumped: from less than 59 years to more than 71 years, George Davey Smith points out in a comment accompanying the new study. Smith, an epidemiologist at the University of Bristol in England, boils the data down to a single, seemingly paradoxical sentence: “The world is at once fatter and healthier.” © Society for Science & the Public 2000 - 2016
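For readers unfamiliar with the threshold mentioned above: body mass index is weight in kilograms divided by the square of height in metres, and obesity means a BMI of 30 or more. The minimal sketch below shows that calculation along with the reported average gain of about 1.5 kg per decade; the example person and the projection are hypothetical, not figures from the study.

```python
# Body mass index, the measure behind the "30 or higher" obesity threshold
# cited above: weight in kilograms divided by height in metres squared.
# The example person and the projection are hypothetical, not study data.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

weight, height = 92.0, 1.75                       # a hypothetical adult
print(f"BMI: {bmi(weight, height):.1f}")          # about 30.0
print(f"Obese (BMI >= 30): {bmi(weight, height) >= 30}")

# The study's reported average worldwide gain is roughly 1.5 kg per decade.
for decades in range(1, 5):
    w = weight + 1.5 * decades
    print(f"after {decades * 10} years: {w:.1f} kg, BMI {bmi(w, height):.1f}")
```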
Keyword: Obesity
Link ID: 22059 - Posted: 04.01.2016
Noah Smith How do human beings behave in response to risk? That is one of the most fundamental unanswered questions of our time. A general theory of decision-making amid uncertainty would be the kind of scientific advance that comes only a few times a century. Risk is central to financial and insurance markets. It affects the consumption, saving and business investment that moves the global economy. Understanding human behavior in the face of risk would let us reduce accidents, retire more comfortably, get cheaper health insurance and maybe even avoid recessions. A number of our smartest scientists have tried to develop a general theory of risk behavior. John von Neumann, the pioneering mathematician and physicist, took a crack at it back in 1944, when he developed the theory of expected utility along with Oskar Morgenstern. According to this simple theory, people value a possible outcome by multiplying the probability that something happens by the amount they would like it to happen. This beautiful idea underlies much of modern economic theory, but unfortunately it doesn't work well in most situations. Alternative theories have been developed for specific applications. The psychologist Daniel Kahneman won a Nobel Prize for the creation of prospect theory, which says -- among other things -- that people measure outcomes relative to a reference point. That theory does a great job of explaining the behavior of subjects in certain lab experiments, and can help account for the actions of certain inexperienced consumers. But it is very difficult to apply generally, because the reference points are hard to predict in advance and may shift in unpredictable ways.
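Both theories mentioned above can be made concrete with a short sketch. Expected utility values a gamble as the probability-weighted sum of outcome utilities; prospect theory instead evaluates each outcome as a gain or loss relative to a reference point, with losses weighted more heavily. The parameter values used below (curvature 0.88 and loss aversion 2.25) are the commonly cited Tversky-Kahneman estimates rather than anything from this column, and the sketch leaves out prospect theory's probability-weighting function.

```python
# Sketch of the two ideas described above, not an implementation from the
# column. Expected utility values a gamble as the probability-weighted sum
# of outcome utilities. Prospect theory evaluates outcomes as gains or losses
# relative to a reference point, with losses weighted more heavily; the
# parameters 0.88 and 2.25 are the commonly cited Tversky-Kahneman estimates,
# used here only for illustration, and the probability-weighting function of
# full prospect theory is omitted for brevity.

def expected_utility(gamble, utility=lambda x: x):
    """gamble: list of (probability, outcome) pairs."""
    return sum(p * utility(x) for p, x in gamble)

def prospect_value(outcome, reference=0.0, alpha=0.88, loss_aversion=2.25):
    """Value of a single outcome relative to a reference point."""
    gain = outcome - reference
    if gain >= 0:
        return gain ** alpha
    return -loss_aversion * (-gain) ** alpha

def prospect_theory_value(gamble, reference=0.0):
    return sum(p * prospect_value(x, reference) for p, x in gamble)

# A coin flip: win $100 or lose $100 with equal probability.
coin_flip = [(0.5, 100.0), (0.5, -100.0)]
print(expected_utility(coin_flip))       # 0.0  -> indifferent under expected value
print(prospect_theory_value(coin_flip))  # < 0  -> rejected once losses loom larger
```

The contrast is the one the column draws: the elegant multiply-probability-by-value rule predicts indifference to the coin flip, while a reference-dependent, loss-averse valuation predicts the refusal that people typically show in experiments.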
Keyword: Attention; Emotions
Link ID: 22058 - Posted: 04.01.2016
By Nicholas Bakalar Stress in childhood may be linked to hardening of the arteries in adulthood, new research suggests. Finnish researchers studied 311 children 12 to 18 years old, scoring their levels of stress according to a variety of components, including the family’s economic circumstances, the emotional environment in the home, whether parents engaged in healthy behaviors, stressful events (such as divorce, moves or death of a family member) and parental concerns about the child’s social adjustment. Using these criteria, they calculated a stress score. When the members of the group were 40 to 46 years old, the researchers used computed tomography to measure coronary artery calcification, a marker of atherosclerosis and a risk factor for cardiovascular disease. The study, in JAMA Pediatrics, controlled for sex, cholesterol, body mass index and other factors, but still found that the higher the childhood stress score, the greater the risk for coronary artery calcification. The study is observational, and the data is based largely on parental reports, which can be biased. Still, its long follow-up time and careful control of other variables give it considerable strength. There are plausible mechanisms for the connection, including stress-induced increases in inflammation, which in animal models have been linked to a variety of ailments. “I think that economic conditions are important here,” said the lead author, Dr. Markus Juonala, a professor of internal medicine at the University of Turku in Finland. “Public health interventions should focus on how to intervene in better ways with people with higher stress and lower socioeconomic status.” © 2016 The New York Times Company
Keyword: Stress; Development of the Brain
Link ID: 22057 - Posted: 04.01.2016
By Elizabeth Pennisi The “brrreeet” you hear in the video above is not coming from this broadbill’s beak, but rather from its wings. Charles Darwin marveled at “instrumental music” of birds—from the rattled quills of peacocks to the wing-drumming of grouse and the wing “booming” of night-jars. But those percussive noises are no match for the definitive tones generated by the three Smithornis broadbills (S. rufolateralis, S. capensis, and S. sharpei) that live in remote forests in sub-Saharan Africa. One bird acoustics specialist was so intrigued in 1986 by a recording of this “song” that he vowed to hear it for himself. More than 2 years ago, he and his colleagues tracked two of these species down in the wild. Synchronized high-speed video and acoustic recordings revealed the downstroke of the wings produces the tones as the bird flies in a meter-wide oval from its perch and back again. At first the researchers thought the outermost flight feathers flutter to make the sounds, but studies of a wing and of the feathers themselves in a wind tunnel showed that the inner flight feathers are “singing” the most, the team reports today in the Journal of Experimental Biology. The tones may scale with the species’ body and feather size, with the bigger ones producing deeper tones, the researchers suggest. The wing tones seemed to have replaced vocal singing, they note, and are likely unique to this group of birds. Audible 100 meters away in dense forest, they represent yet another innovation for communicating with one’s peers. © 2016 American Association for the Advancement of Science
Keyword: Sexual Behavior; Animal Communication
Link ID: 22056 - Posted: 04.01.2016
Ewen Callaway Homo floresiensis, the mysterious and diminutive species found in Indonesia in 2003, is tens of thousands of years older than originally thought — and may have been driven to extinction by modern humans. After researchers discovered H. floresiensis, which they nicknamed the hobbit, in Liang Bua cave on the island of Flores, they concluded that its skeletal remains were as young as 11,000 years old. But later excavations that have dated more rock and sediment around the remains now suggest that hobbits were gone from the cave by 50,000 years ago, according to a study published in Nature on 30 March. That is around the time that modern humans moved through southeast Asia and Australia. “I can’t believe that it is purely coincidence, based on what else we know happens when modern humans enter a new area,” says Richard Roberts, a geochronologist at the University of Wollongong, Australia. He notes that Neanderthals vanished soon after early modern humans arrived in Europe from Africa. Roberts co-led the study with archaeologist colleague Thomas Sutikna (who also helped coordinate the 2003 dig), and Matthew Tocheri, a paleoanthropologist at Lakehead University in Thunder Bay, Canada. The first hobbit fossil, known as LB1, was found in 2003 beneath about 6 metres of dirt and rock. Its fragile bones were too precious for radiocarbon dating, so the team collected nearby charcoal, on the assumption that it had accrued at the same time as the bones. That charcoal was as young as 11,000 years old, researchers reported at the time. “Somehow these tiny people had survived on this island 30,000 years after modern humans arrived,” says Roberts. “We were scratching our heads. It couldn’t add up.” © 2016 Nature Publishing Group
Keyword: Evolution
Link ID: 22055 - Posted: 03.31.2016
By Jordana Cepelewicz The bacteria that inhabit our guts have become key players for neuroscientists. A growing body of research links them to a wide array of mental and neurological disorders—from anxiety and depression to schizophrenia and Alzheimer’s disease. Now a study in mice published this week in Nature Medicine suggests that striking the right microbial balance could cause changes in the immune system that significantly reduce brain damage after a stroke—the second leading cause of both death and disability for people around the globe. (Scientific American is part of Springer Nature.) Experts have known for some time that stroke severity is influenced by the presence of two types of cell, found abundantly within the intestine, that calibrate immune responses: Regulatory T cells have a beneficial anti-inflammatory effect, protecting an individual from stroke. But gamma delta T cells produce a cytokine that causes harmful inflammation after a stroke. A team of researchers at Weill Cornell Medical College and Memorial Sloan Kettering Cancer Center set about investigating whether they could tilt the balance of these cells in the favor of beneficial cells by tinkering with the body’s bacterial residents. To do so, they bred two colonies of mice: One group’s intestinal flora was resistant to antibiotics whereas the other’s gut bacteria was vulnerable to treatment. As a result, when given a combination of antibiotics over the course of two weeks, only the latter’s microbiota underwent change. The researchers then obstructed the cerebral arteries of the mice, inducing an ischemic stroke (the most common type). They found that subsequent brain damage was 60 percent smaller in the drug-susceptible mice than it was in the other group. © 2016 Scientific American,
Keyword: Stroke
Link ID: 22054 - Posted: 03.31.2016
By Ariana Eunjung Cha LAS VEGAS — Jamie Tyler was stressed. He had just endured a half-hour slog through airport security and needed some relief. Many travelers in this situation might have headed for the nearest bar or popped an aspirin. But Tyler grabbed a triangular piece of gadgetry from his bag and held it to his forehead. As he closed his eyes, the device zapped him with low-voltage electrical currents. Within minutes, Tyler said, he was feeling serene enough to face the crowds once again. This is no science fiction. The Harvard-trained neurobiologist was taking advantage of one of his own inventions, a device called Thync, which promises to help users activate their body's “natural state of energy or calm” — for a retail price of a mere $199. Americans’ obsession with wellness is fueling a new category of consumer electronics, one that goes far beyond the ubiquitous Fitbits and UP activity wristbands that only passively monitor users' physical activity. The latest wearable tech, to put it in the simplest terms, is about hacking your brain. These gadgets claim to be able to make you have more willpower, think more creatively and even jump higher. One day, their makers say, the technology may even succeed in delivering on the holy grail of emotions: happiness. There’s real, peer-reviewed science behind the theory driving these devices. It involves stimulating key regions of the brain — with currents or magnetic fields — to affect emotions and physical well-being.
Keyword: Emotions
Link ID: 22053 - Posted: 03.31.2016
By Matthew Hutson Earlier this month, a computer program called AlphaGo defeated a (human) world champion of the board game Go, years before most experts expected computers to rival the best flesh-and-bone players. But then last week, Microsoft was forced to silence its millennial-imitating chatbot Tay for blithely parroting Nazi propaganda and misogynistic attacks after just one day online, her failure a testimony to the often underestimated role of human sensibility in intelligent behavior. Why are we so compelled to pit human against machine, and why are we so bad at predicting the outcome? As the number of jobs susceptible to automation rises, and as Stephen Hawking, Elon Musk, and Bill Gates warn that artificial intelligence poses an existential threat to humanity, it’s natural to wonder how humans measure up to our future robot overlords. But even those tracking technology’s progress in taking on human skills have a hard time setting an accurate date for the uprising. That’s in part because one prediction strategy popular among both scientists and journalists—benchmarking the human brain with digital metrics such as bits, hertz, and million instructions per second, or MIPS—is severely misguided. And doing so could warp our expectations of what technology can do for us and to us. Since their development, digital computers have become a standard metaphor for the mind and brain. The comparison makes sense, in that brains and computers both transform input into output. Most human brains, like computers, can also manipulate abstract symbols. (Think arithmetic or language processing.) But like any metaphor, this one has limitations.
Keyword: Brain imaging; Robotics
Link ID: 22052 - Posted: 03.31.2016
By David Z. Hambrick Nearly a century after James Truslow Adams coined the phrase, the “American dream” has become a staple of presidential campaign speeches. Kicking off her 2016 campaign, Hillary Clinton told supporters that “we need to do a better job of getting our economy growing again and producing results and renewing the American dream.” Marco Rubio lamented that “too many Americans are starting to doubt” that it is still possible to achieve the American dream, and Ted Cruz asked his supporters to “imagine a legal immigration system that welcomes and celebrates those who come to achieve the American dream.” Donald Trump claimed that “the American dream is dead” and Bernie Sanders quipped that for many “the American dream has become a nightmare.” But the American dream is not just a pie-in-the-sky notion—it’s a scientifically testable proposition. The American dream, Adams wrote, “is not a dream of motor cars and high wages merely, but a dream of social order in which each man and each woman shall be able to attain to the fullest stature of which they are innately capable…regardless of the fortuitous circumstances of birth or position.” In the parlance of behavioral genetics—the scientific study of genetic influences on individual differences in behavior—Adams’ idea was that all Americans should have an equal opportunity to realize their genetic potential. A study just published in Psychological Science by psychologists Elliot Tucker-Drob and Timothy Bates reveals that this version of the American dream is in serious trouble. Tucker-Drob and Bates set out to evaluate evidence for the influence of genetic factors on IQ-type measures (aptitude and achievement) that predict success in school, work, and everyday life. Their specific question was how the contribution of genes to these measures would compare at low versus high levels of socioeconomic status (or SES), and whether the results would differ across countries. The results reveal, ironically, that the American dream is more of a reality for other countries than it is for America: genetic influences on IQ were uniform across levels of SES in Western Europe and Australia, but, in the United States, were much higher for the rich than for the poor. © 2016 Scientific American
Keyword: Genes & Behavior; Intelligence
Link ID: 22051 - Posted: 03.30.2016