Links for Keyword: Intelligence

Links 1 - 20 of 302

By JoAnna Klein A macaw named Poncho starred in movies like “102 Dalmatians,” “Dr. Dolittle” and “Ace Ventura: Pet Detective” before retiring in England. She recently celebrated her 90th birthday. Alex, an African grey parrot who lived to 31, knew colors, shapes and numbers, and communicated using basic expressions. He could do what toddlers only do after a certain stage of development — know when something is hidden from view. And they’re just two of the many parrots in the world who have surprised us with their intelligence, skills and longevity. “Nature does these experiments for us, and then we have to go and ask, how did this happen?” said Dr. Claudio Mello, a neuroscientist at Oregon Health and Science University. So he and a team of nearly two dozen scientists looked for clues in the genome of the blue-fronted Amazon parrot in Brazil, his home country. After comparing its genome with those of dozens of other birds, the researchers concluded that evolution may have made parrots something like the humans of the avian world. In some ways, the long-lived feathered friends are as genetically different from other birds as humans are from other primates. Their analysis, published Thursday in Current Biology, also highlights how two very different animals — parrots and humans — can wind up finding similar solutions to problems through evolution. A general rule of life span in birds and other animals is that the bigger or heavier you are, the longer you live. A small bird like a finch may live five to eight years, while bigger ones like eagles or cranes can live decades. The blue-fronted Amazon and some other parrots are even more exceptional, in that they can live up to 66 years — in some cases outliving their human companions. © 2018 The New York Times Company

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 25764 - Posted: 12.08.2018

By Carl Zimmer To demonstrate how smart an octopus can be, Piero Amodio points to a YouTube video. It shows an octopus pulling two halves of a coconut shell together to hide inside. Later the animal stacks the shells together like nesting bowls — and carts them away. “It suggests the octopus is carrying these tools around because it has some understanding they may be useful in the future,” said Mr. Amodio, a graduate student studying animal intelligence at the University of Cambridge in Britain. But his amazement is mixed with puzzlement. For decades, researchers have studied how certain animals evolved to be intelligent, among them apes, elephants, dolphins and even some birds, such as crows and parrots. But all the scientific theories fail when it comes to cephalopods, a group that includes octopuses, squid and cuttlefish. Despite feats of creativity, they lack some hallmarks of intelligence seen in other species. “It’s an apparent paradox that’s been largely overlooked in the past,” said Mr. Amodio. He and five other experts on animal intelligence explore this paradox in a paper published this month in the journal Trends in Ecology and Evolution. For scientists who study animal behavior, intelligence is not about acing a calculus test or taking a car apart and putting it back together. Intelligence comprises sophisticated cognitive skills that help an animal thrive. That may include the ability to come up with solutions to the problem of finding food, for example, or a knack for planning for some challenge in the future. Intelligent animals don’t rely on fixed responses to survive — they can invent new behaviors on the fly. © 2018 The New York Times Company

Related chapters from BN8e: Chapter 6: Evolution of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 25741 - Posted: 12.01.2018

Shawna Williams In 1987, political scientist James Flynn of the University of Otago in New Zealand documented a curious phenomenon: broad intelligence gains in multiple human populations over time. In all 14 countries where decades’ worth of average IQ scores for large swaths of the population were available, the scores showed upward swings—some of them dramatic. Children in Japan, for example, gained an average of 20 points on a test known as the Wechsler Intelligence Scale for Children between 1951 and 1975. In France, the average 18-year-old man performed 25 points better on a reasoning test in 1974 than did his 1949 counterpart. Flynn initially suspected the trend reflected faulty tests. Yet in the ensuing years, more data and analyses supported the idea that human intelligence was increasing over time. Proposed explanations for the phenomenon, now known as the Flynn effect, include increasing education, better nutrition, greater use of technology, and reduced lead exposure, to name but four. Beginning with people born in the 1970s, the trend has reversed in some Western European countries, deepening the mystery of what’s behind the generational fluctuations. But no consensus has emerged on the underlying cause of these trends. A fundamental challenge in understanding the Flynn effect is defining intelligence. At the dawn of the 20th century, English psychologist Charles Spearman first observed that people’s average performance on a variety of seemingly unrelated mental tasks—judging whether one weight is heavier than another, for example, or pushing a button quickly after a light comes on—predicts their average performance on a completely different set of tasks. Spearman proposed that a single measure of general intelligence, g, was responsible for that commonality. © 1986 - 2018 The Scientist
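Spearman's observation has a compact computational form: g is commonly estimated as the first factor, the largest-eigenvalue component, of the correlation matrix of scores on several tests. The following Python sketch is purely illustrative; the data are simulated and the 0.7 loading is an assumption, not anything from the article.

```python
# Minimal sketch of extracting a general factor from test scores.
# All data are simulated; the 0.7 loading is an arbitrary assumption.
import numpy as np

rng = np.random.default_rng(0)
n = 500                                   # simulated test-takers

g = rng.normal(size=n)                    # latent general ability
# Four "unrelated" task scores that each share the latent g plus noise.
scores = np.column_stack([0.7 * g + 0.7 * rng.normal(size=n) for _ in range(4)])

corr = np.corrcoef(scores, rowvar=False)  # pairwise correlations, all positive
eigvals, eigvecs = np.linalg.eigh(corr)   # eigh sorts eigenvalues ascending

loadings = eigvecs[:, -1]                 # first factor: top eigenvector
g_scores = scores @ loadings              # one composite "g score" per person
share = eigvals[-1] / eigvals.sum()       # variance the first factor explains
print(f"general factor explains {share:.0%} of test-score variance")
```

The all-positive correlation matrix is the "positive manifold" Spearman described; a single top eigenvector soaking up a large share of the variance is what makes one general score a useful summary.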

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 25714 - Posted: 11.24.2018

By Victoria Gill Science correspondent, BBC News Clever, tool-using crows have surprised scientists once again with remarkable problem-solving skills. In a task designed to test their tool-making prowess, New Caledonian crows spontaneously put together two short, combinable sticks to make a longer "fishing rod" to reach a piece of food. The findings are published in the journal Scientific Reports. Scientists say the demonstration is a "window into how another animal's mind works". How do you test a bird's tool-making skills? New Caledonian crows are known to spontaneously use tools in the wild. This task, designed by scientists at the Max Planck Institute for Ornithology in Seewiesen, Germany, and the University of Oxford, presented the birds with a novel problem that they could solve only by making a new tool. It involved a "puzzle box" containing food behind a door that left a narrow gap along the bottom. With the food deep inside the box and only short sticks - too short to reach the food - the crows were left to work out what to do. The sticks were designed to be combinable - one was hollow to allow the other to slot inside. And with no demonstration or help, four out of the eight crows inserted one stick into another and used the resulting longer tool to fish for and extract the food from the box. "They have never seen this compound tool, but somehow they can predict its properties," explained one of the lead researchers, Prof Alex Kacelnik. "So they can predict what something that does not yet exist would do if they made it. Then they can make it and they can use it. © 2018 BBC

Related chapters from BN8e: Chapter 6: Evolution of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 25615 - Posted: 10.25.2018

By Piercarlo Valdesolo Earlier this year, a research team led by Dr. Sven Karlsson published the largest scale study on the causes of human intelligence. They found an intriguing pattern of results: Focusing on arithmetic and linguistic tests, genetics predicted over 26% of people’s responses. Namely, individuals with a long allele of the 4-GTTLR gene got more right answers on the arithmetic, mental rotation, and semantic memory tasks than did individuals with the short version of the gene. In contrast, education explained only 4% of people’s responses. Describing the work, Karlsson wrote “We believe this is an interesting result! Our findings indicate that, contrary to certain previous assumptions, basic cognitive capabilities—mental rotation, math and language—really have a strong heritable component. Intelligence in adulthood seems to be predicted by genes early in life… things like education and effort play a small role once you take into account the role of genetics.” How did you react to the description above? Hopefully you haven’t already tweeted about it: it’s completely made up. A genetic basis for intelligence is a politically fraught scientific idea about which you had likely developed an opinion before reading about the fictitious Dr. Karlsson. You might think it obvious that genes play an important role in shaping all traits, including intelligence. Or you might think that genes play a trivial role in comparison to socialization and learning. The ease with which you accepted the brief synopsis of research above as true likely depends on these existing beliefs. If the findings are consistent with your beliefs, you might have quickly accepted them as true. If inconsistent, then you might have been tempted to either dismiss the findings out of hand, or perhaps dig deeper into the article to find some disqualifying error in method or analysis. These are reactions that psychologists have known about for decades: motivated reasoning, confirmation bias, selective attention. We are equipped with a range of psychological processes that inoculate us against information that bumps up against our worldviews and attract us to information consistent with our existing beliefs. © 2018 Scientific American

Related chapters from BN8e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 1: An Introduction to Brain and Behavior
Link ID: 25468 - Posted: 09.20.2018

By Karen Weintraub New Caledonian crows are known for their toolmaking, but Alex Taylor and his colleagues wanted to understand just how advanced they could be. Crows from New Caledonia, an island in the South Pacific, can break off pieces of a branch to form a hook, using it to pull a grub out of a log, for instance. Once, in captivity, when a New Caledonian male crow had taken all the available hooks, its mate Betty took a straight piece of wire and bent it to make one. “They are head and shoulders above almost every other avian subject” at toolmaking, said Irene Pepperberg, an avian cognition expert and research associate in Harvard University’s department of psychology. “These crows are just amazing.” Dr. Taylor, a researcher at the University of Auckland in New Zealand, and several European colleagues wondered how the crows, without an ability to talk and showing no evidence of mimicry, might learn such sophisticated toolmaking. Perhaps, the scientists hypothesized in a new paper published Thursday in Scientific Reports, they used “mental template matching,” where they formed an image in their heads of tools they’d seen used by others and then copied it. “Could they look at a tool and just based on mental image of the tool — can they recreate that tool design?” Dr. Taylor said. “That’s what we set out to test, and that’s what our results show.” In a series of steps, the researchers taught the birds to feed pieces of paper into a mock vending machine to earn food rewards. The scientists chose a task that was similar enough to something the animals do in the wild — while also brand new. The birds had never seen card stock before, but learned how to rip it into big or little shapes after being shown they would get a reward for the appropriate size. The template used to show the birds the right size of paper was not available to them when they made their “tools,” yet the crows were able to use their beaks to tear off bits of paper, which they sometimes held between their feet for leverage. © 2018 The New York Times Company

Related chapters from BN8e: Chapter 6: Evolution of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 25161 - Posted: 06.29.2018

Susan Milius A little brain can be surprisingly good at nothing. Honeybees are the first invertebrates to pass a test of recognizing where zero goes in numerical order, a new study finds. Even small children struggle with recognizing “nothing” as being less than one, says cognitive behavioral scientist Scarlett Howard of the Royal Melbourne Institute of Technology in Australia. But honeybees trained to fly to images of greater or fewer dots or whazzits tended to rank a blank image as less than one, Howard and colleagues report in the June 8 Science. Despite decades of discoveries, nonhuman animals still don’t get due credit outside specialist circles for intelligence, laments Lars Chittka of Queen Mary University of London, who has explored various mental capacities of bees. For the world at large, he emphasizes that the abilities described in the new paper are “remarkable.” Researchers recognize several levels of complexity in grasping zero. Most animals, or maybe all, can understand the simplest level — just recognizing that the absence of something differs from its presence, Howard says. Grasping the notion that absence could fit into a sequence of quantities, though, seems harder. Previously, only some primates such as chimps and vervet monkeys, plus an African gray parrot named Alex, have demonstrated this level of understanding of the concept of zero (SN: 12/10/16, p. 22). © Society for Science & the Public 2000 - 2018

Related chapters from BN8e: Chapter 1: Biological Psychology: Scope and Outlook; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 1: An Introduction to Brain and Behavior
Link ID: 25069 - Posted: 06.08.2018

By Ruth Williams The sun’s ultraviolet (UV) radiation is a major cause of skin cancer, but it offers some health benefits too, such as boosting production of essential vitamin D and improving mood. Today (May 17), a report in Cell adds enhanced learning and memory to UV’s unexpected benefits. Researchers have discovered that, in mice, exposure to UV light activates a molecular pathway that increases production of the brain chemical glutamate, heightening the animals’ ability to learn and remember. “The subject is of strong interest, because it provides additional support for the recently proposed theory of ultraviolet light’s regulation of the brain and central neuroendocrine system,” dermatologist Andrzej Slominski of the University of Alabama, who was not involved in the research, writes in an email to The Scientist. “It’s an interesting and timely paper investigating the skin-brain connection,” notes skin scientist Martin Steinhoff of University College Dublin’s Center for Biomedical Engineering who also did not participate in the research. “The authors make an interesting observation linking moderate UV exposure to . . . [production of] the molecule urocanic acid. They hypothesize that this molecule enters the brain, activates glutaminergic neurons through glutamate release, and that memory and learning are increased.” While the work is “fascinating, very meticulous, and extremely detailed,” says dermatologist David Fisher of Massachusetts General Hospital and Harvard Medical School, “it does not imply that UV is actually good for you. . . . Across the board, for humanity, UV really is dangerous.” © 1986-2018 The Scientist

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 7: Vision: From Eye to Brain
Link ID: 24993 - Posted: 05.18.2018

Mariarosaria Taddeo and Luciano Floridi. Cyberattacks are becoming more frequent, sophisticated and destructive. Each day in 2017, the United States suffered, on average, more than 4,000 ransomware attacks, which encrypt computer files until the owner pays to release them. In 2015, the daily average was just 1,000. In May last year, when the WannaCry virus crippled hundreds of IT systems across the UK National Health Service, more than 19,000 appointments were cancelled. A month later, the NotPetya ransomware cost pharmaceutical giant Merck, shipping firm Maersk and logistics company FedEx around US$300 million each. Global damages from cyberattacks totalled $5 billion in 2017 and may reach $6 trillion a year by 2021 (see go.nature.com/2gncsyg). Countries are partly behind this rise. They use cyberattacks both offensively and defensively. For example, North Korea has been linked to WannaCry, and Russia to NotPetya. As the threats escalate, so do defence tactics. Since 2012, the United States has used ‘active’ cyberdefence strategies, in which computer experts neutralize or distract viruses with decoy targets, or break into a hacker’s computer to delete data or destroy the system. In 2016, the United Kingdom announced a 5-year, £1.9-billion (US$2.7-billion) plan to combat cyber threats. NATO also began drafting principles for active cyberdefence, to be agreed by 2019. The United States and the United Kingdom are leading this initiative. Denmark, Germany, the Netherlands, Norway and Spain are also involved (see go.nature.com/2hebxnt). © 2018 Macmillan Publishers Limited

Related chapters from BN8e: Chapter 1: Biological Psychology: Scope and Outlook; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 1: An Introduction to Brain and Behavior; Chapter 13: Memory, Learning, and Development
Link ID: 24873 - Posted: 04.17.2018

By Tara Parker-Pope Today’s teenagers have been raised on cellphones and social media. Should we worry about them or just get out of their way? A recent wave of student protests around the country has provided a close-up view of Generation Z in action, and many adults have been surprised. While there has been much hand-wringing about this cohort, also called iGen or the Post-Millennials, the stereotype of a disengaged, entitled and social-media-addicted generation doesn’t match the poised, media-savvy and inclusive young people leading the protests and gracing magazine covers. There’s 18-year-old Emma González, whose shaved head, impassioned speeches and torn jeans have made her the iconic face of the #NeverAgain movement, which developed after the 17 shooting deaths in February at Marjory Stoneman Douglas High School in Parkland, Fla. Naomi Wadler, just 11, became an overnight sensation after confidently telling a national television audience she represented “African-American girls whose stories don’t make the front page of every national newspaper.” David Hogg, a high school senior at Stoneman Douglas, has weathered numerous personal attacks with the disciplined calm of a seasoned politician. Sure, these kids could be outliers. But plenty of adolescent researchers believe they are not. “I think we must contemplate that technology is having the exact opposite effect than we perceived,” said Julie Lythcott-Haims, the former dean of freshmen at Stanford University and author of “How to Raise an Adult.” “We see the negatives of not going outside, can’t look people in the eye, don’t have to go through the effort of making a phone call. There are ways we see the deficiencies that social media has offered, but there are obviously tremendous upsides and positives as well.” “I am fascinated by the phenomenon we are seeing in front of us, and I don’t think it’s unique to these six or seven kids who have been the face of the Parkland adolescent cohort,” says Lisa Damour, an adolescent psychologist and author of “Untangled: Guiding Teenage Girls Through the Seven Transitions Into Adulthood.” © 2018 The New York Times Company

Related chapters from BN8e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 24805 - Posted: 03.31.2018

By David Z. Hambrick, Madeline Marquardt There are advantages to being smart. People who do well on standardized tests of intelligence—IQ tests—tend to be more successful in the classroom and the workplace. Although the reasons are not fully understood, they also tend to live longer, healthier lives, and are less likely to experience negative life events such as bankruptcy. Now there’s some bad news for people in the right tail of the IQ bell curve. In a study just published in the journal Intelligence, Pitzer College researcher Ruth Karpinski and her colleagues emailed a survey with questions about psychological and physiological disorders to members of Mensa. A “high IQ society”, Mensa requires that its members have an IQ in the top two percent. For most intelligence tests, this corresponds to an IQ of about 132 or higher. (The average IQ of the general population is 100.) The survey of Mensa’s highly intelligent members found that they were more likely to suffer from a range of serious disorders. The survey covered mood disorders (depression, dysthymia, and bipolar), anxiety disorders (generalized, social, and obsessive-compulsive), attention-deficit hyperactivity disorder, and autism. It also covered environmental allergies, asthma, and autoimmune disorders. Respondents were asked to report whether they had ever been formally diagnosed with each disorder, or suspected they suffered from it. The survey had a return rate of nearly 75%, and Karpinski and colleagues compared the percentage of the 3,715 respondents who reported each disorder with the national average. © 2017 Scientific American
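For readers curious about the arithmetic, comparing a sample percentage with a national average is, in essence, a one-proportion z-test. Below is a hedged sketch of that comparison, not the paper's actual analysis; the counts in the example are invented for illustration.

```python
# One-proportion z-test: does a sample prevalence differ from a known
# national rate? Example numbers are hypothetical, not the study's data.
from math import sqrt
from scipy.stats import norm

def prevalence_z_test(reported: int, n: int, national_rate: float) -> float:
    """Two-sided p-value for a sample proportion vs. a population rate."""
    p_hat = reported / n
    se = sqrt(national_rate * (1 - national_rate) / n)  # SE under H0
    return 2 * norm.sf(abs((p_hat - national_rate) / se))

# Suppose 990 of 3,715 respondents reported an anxiety disorder, against
# a national prevalence of about 10% (both figures made up here).
print(f"p = {prevalence_z_test(990, 3715, 0.10):.2g}")
```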

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 1: An Introduction to Brain and Behavior
Link ID: 24397 - Posted: 12.06.2017

By Alexander P. Burgoyne, David Z. Hambrick More than 60 years ago, Francis Crick and James Watson discovered the double-helical structure of deoxyribonucleic acid—better known as DNA. Today, for the cost of a Netflix subscription, you can have your DNA sequenced to learn about your ancestry and proclivities. Yet, while it is an irrefutable fact that the transmission of DNA from parents to offspring is the biological basis for heredity, we still know relatively little about the specific genes that make us who we are. That is changing rapidly through genome-wide association studies—GWAS, for short. These studies search for differences in people’s genetic makeup—their “genotypes”—that correlate with differences in their observable traits—their “phenotypes.” In a GWAS recently published in Nature Genetics, a team of scientists from around the world analyzed the DNA sequences of 78,308 people for correlations with general intelligence, as measured by IQ tests. The major goal of the study was to identify single nucleotide polymorphisms—or SNPs—that correlate significantly with intelligence test scores. Found in most cells throughout the body, DNA is made up of four molecules called nucleotides, referred to by their organic bases: cytosine (C), thymine (T), adenine (A), and guanine (G). Within a cell, DNA is organized into structures called chromosomes­. Humans normally have 23 pairs of chromosomes, with one in each pair inherited from each parent. © 2017 Scientific American
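At its core, a GWAS of a quantitative trait like an IQ score is a very large batch of simple association tests, one per SNP, followed by a stringent multiple-testing correction. The Python sketch below is a toy illustration with simulated genotypes and phenotypes; it is not the published study's pipeline, which also controls for ancestry and other covariates and uses a genome-wide threshold near 5e-8.

```python
# Toy GWAS: everything here is simulated; real studies adjust for
# covariates (age, sex, ancestry) and use a ~5e-8 significance threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_people, n_snps = 2000, 5000

# Genotypes coded 0/1/2 = copies of the minor allele at each SNP.
genotypes = rng.binomial(2, 0.3, size=(n_people, n_snps))

# Simulated phenotype: ten causal SNPs with small effects, plus noise.
causal = rng.choice(n_snps, size=10, replace=False)
phenotype = 0.15 * genotypes[:, causal].sum(axis=1) + rng.normal(size=n_people)

# Core GWAS loop: one regression per SNP, then a Bonferroni cutoff.
p_values = np.array([
    stats.linregress(genotypes[:, j], phenotype).pvalue for j in range(n_snps)
])
hits = np.flatnonzero(p_values < 0.05 / n_snps)
print(f"{hits.size} of {n_snps} SNPs pass the corrected threshold")
```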

Related chapters from BN8e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 23986 - Posted: 08.23.2017

Susan Milius Ravens have passed what may be their toughest tests yet of powers that, at least on a good day, let people and other apes plan ahead. Lab-dwelling common ravens (Corvus corax) in Sweden at least matched the performance of nonhuman apes and young children in peculiar tests of advanced planning ability. The birds faced such challenges as selecting a rock useless at the moment but likely to be useful for working a puzzle box and getting food later. Ravens also reached apelike levels of self-control, picking a tool instead of a ho-hum treat when the tool would eventually allow them to get a fabulous bit of kibble 17 hours later, Mathias Osvath and Can Kabadayi of Lund University in Sweden report in the July 14 Science. “The insight we get from the experiment is that [ravens] can plan for the future outside behaviors observed in the wild,” Markus Böckle, of the University of Cambridge, said in an e-mail. Böckle, who has studied ravens, coauthored a commentary in the same issue of Science. In the wild, ravens cache some of their food, but that apparent foresight could be more of a specific adaptation that evolved with diet instead of as some broader power of planning. The Lund tests, based on experiments with apes, tried to challenge ravens in less natural ways. The researchers say the birds aren’t considered much of a tool-using species in nature, nor do they trade for food. “The study for the first time in any animal shows that future planning can be used in behaviors it was not originally selected for” in evolution, Böckle says. © Society for Science & the Public 2000 - 2017.

Related chapters from BN8e: Chapter 6: Evolution of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 23835 - Posted: 07.14.2017

People with higher IQs are less likely to die before the age of 79. That’s according to a study of over 65,000 people born in Scotland in 1936. Each of the people in the study took an intelligence test at the age of 11, and their health was then followed for 68 years, until the end of 2015. When Ian Deary, of the University of Edinburgh, UK, and his team analysed data from the study, they found that a higher test score in childhood was linked to a 28 per cent lower risk of death from respiratory disease, a 25 per cent reduced risk of coronary heart disease, and a 24 per cent lower risk of death from stroke. These people were also less likely to die from injuries, digestive diseases, and dementia – even when factors like socio-economic status were taken into account. Deary’s team say there are several theories for why more intelligent people live longer, such as people with higher IQs being more likely to look after their health and less likely to smoke. They also tend to do more exercise and seek medical attention when ill. “I’m hoping it means that if we can find out what smart people do and copy them, then we have a chance of a slightly longer and healthier life,” says Deary. But there’s evidence genetics is involved too. A recent study suggests that very rare genetic variants can play an important role in lowering intelligence, and that these may also be likely to impair a person’s health. Journal reference: British Medical Journal, DOI: 10.1136/bmj.j2708 © Copyright New Scientist Ltd.

Related chapters from BN8e: Chapter 1: Biological Psychology: Scope and Outlook; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 1: An Introduction to Brain and Behavior; Chapter 13: Memory, Learning, and Development
Link ID: 23786 - Posted: 06.29.2017

By Katie Langin No one likes a con artist. People avoid dealing with characters who have swindled them in the past, and—according to new research—birds avoid those people, too. Ravens, known for their intelligence and only slightly less for their love of cheese, were trained by researchers to trade a crust of bread for a morsel of cheese with human partners. When the birds then tried to broker a trade with “fair” and “unfair” partners—some completed the trade as expected, but others took the raven’s bread and kept (and ate) the cheese—the ravens avoided the tricksters in separate trials a month later. This suggests that ravens can not only differentiate between “fair” and “unfair” individuals but also retain that ability for at least a month, the researchers write this month in Animal Behaviour. Ravens have a complex social life involving friendships and rivalries. Their ability to recognize and punish dishonest individuals, even after a single encounter, may help explain how cooperation evolved in this group of birds. For people, though, the moral of the story is simple: Be nice to ravens. © 2017 American Association for the Advancement of Science.

Related chapters from BN8e: Chapter 6: Evolution of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 23709 - Posted: 06.06.2017

By David Z. Hambrick Physical similarities aside, we share a lot in common with our primate relatives. For example, as Jane Goodall famously documented, chimpanzees form lifelong bonds and show affection in much the same way as humans. Chimps can also solve novel problems, use objects as tools, and may possess “theory of mind”—an understanding that others may have different perspectives than oneself. They can even outperform humans in certain types of cognitive tasks. These commonalities may not seem all that surprising given what we now know from the field of comparative genomics: We share nearly all of our DNA with chimpanzees and other primates. However, social and cognitive complexity is not unique to our closest evolutionary cousins. In fact, it is abundant in species with which we would seem to have very little in common—like the spotted hyena. For more than three decades, the Michigan State University zoologist Kay Holekamp has studied the habits of the spotted hyena in Kenya’s Masai Mara National Reserve, once spending five years straight living in a tent among her oft-maligned subjects. One of the world’s longest-running studies of a wild mammal, this landmark project has revealed that spotted hyenas not only have social groups as complex as those of many primates, but are also capable of some of the same types of problem solving. This research sheds light on one of science’s greatest mysteries—how intelligence has evolved across the animal kingdom. According to the social brain hypothesis, intelligence has evolved to meet the demands of social life. The subject of many popular articles and books, this hypothesis posits that the complex information processing that goes along with coexisting with members of one’s own species—forming coalitions, settling disputes, trying to outwit each other, and so on—selects for larger brains and greater intelligence. By contrast, the cognitive buffer hypothesis holds that intelligence emerges as an adaption to dealing with novelty in the environment, in whatever form it presents itself. © 2017 Scientific American,

Related chapters from BN8e: Chapter 1: Biological Psychology: Scope and Outlook; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 1: An Introduction to Brain and Behavior; Chapter 13: Memory, Learning, and Development
Link ID: 23685 - Posted: 05.31.2017

Carl Zimmer In a significant advance in the study of mental ability, a team of European and American scientists announced on Monday that they had identified 52 genes linked to intelligence in nearly 80,000 people. These genes do not determine intelligence, however. Their combined influence is minuscule, the researchers said, suggesting that thousands more are likely to be involved and still await discovery. Just as important, intelligence is profoundly shaped by the environment. Still, the findings could make it possible to begin new experiments into the biological basis of reasoning and problem-solving, experts said. They could even help researchers determine which interventions would be most effective for children struggling to learn. “This represents an enormous success,” said Paige Harden, a psychologist at the University of Texas, who was not involved in the study. For over a century, psychologists have studied intelligence by asking people questions. Their exams have evolved into batteries of tests, each probing a different mental ability, such as verbal reasoning or memorization. In a typical test, the tasks might include imagining an object rotating, picking out a shape to complete a figure, and then pressing a button as fast as possible whenever a particular type of word appears. Each test-taker may get varying scores for different abilities. But over all, these scores tend to hang together — people who score low on one measure tend to score low on the others, and vice versa. Psychologists sometimes refer to this similarity as general intelligence. It’s still not clear what in the brain accounts for intelligence. Neuroscientists have compared the brains of people with high and low test scores for clues, and they’ve found a few. Brain size explains a small part of the variation, for example, although there are plenty of people with small brains who score higher than others with bigger brains. © 2017 The New York Times Company

Related chapters from BN8e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 23650 - Posted: 05.23.2017

By Ian Randall René Descartes began with doubt. “We cannot doubt of our existence while we doubt. … I think, therefore I am,” the 17th century philosopher and scientist famously wrote. Now, modern scientists are trying to figure out what made the genius’s mind tick by reconstructing his brain. Scientists have long wondered whether the brains of geniuses (especially the shapes on their surfaces) could hold clues about their owners’ outsized intelligences. But most brains studied to date—including Albert Einstein’s—were actual brains. Descartes’s had unfortunately decomposed by the time scientists wanted to study it. So with techniques normally used for studying prehistoric humans, researchers created a 3D image of Descartes’s brain by scanning the impression it left on the inside of his skull, which has been kept for almost 200 years now in the National Museum of Natural History in Paris. For the most part, his brain was surprisingly normal—its overall dimensions fell within regular ranges, compared with 102 other modern humans. But one part stood out: an unusual bulge in the frontal cortex, in an area which previous studies have suggested may process the meaning of words. That’s not to say this oddity is necessarily indicative of genius, the scientists report online in the Journal of the Neurological Sciences. Even Descartes might agree: “It is not enough to have a good mind,” he wrote. “The main thing is to use it well.” © 2017 American Association for the Advancement of Science

Related chapters from BN8e: Chapter 19: Language and Lateralization; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM:Chapter 15: Brain Asymmetry, Spatial Cognition, and Language; Chapter 1: An Introduction to Brain and Behavior
Link ID: 23579 - Posted: 05.06.2017

Ian Sample Science editor Tempting as it may be, it would be wrong to claim that with each generation humans are becoming more stupid. As scientists are often so keen to point out, it is a bit more complicated than that. A study from Iceland is the latest to raise the prospect of a downwards spiral into imbecility. The research from deCODE, a genetics firm in Reykjavik, finds that groups of genes that predispose people to spend more years in education became a little rarer in the country from 1910 to 1975. The scientists used a database of more than 100,000 Icelanders to see how dozens of gene variants that affect educational attainment appeared in the population over time. They found a shallow decline over the 65 year period, implying a downturn in the natural inclination to rack up qualifications. But the genes involved in education affected fertility too. Those who carried more “education genes” tended to have fewer children than others. This led the scientists to propose that the genes had become rarer in the population because, for all their qualifications, better educated people had contributed less than others to the Icelandic gene pool. Spending longer in education and the career opportunities that provides is not the sole reason that better educated people tend to start families later and have fewer children, the study suggests. Many people who carried lots of genes for prolonged education left the system early and yet still had fewer children than the others. “It isn’t the case that education, or the career opportunities it provides, prevents you from having more children,” said Kari Stefansson, who led the study. “If you are genetically predisposed to have a lot of education, you are also predisposed to have fewer children.” © 2017 Guardian News and Media Limited
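The quantity a study like this tracks is a polygenic score: for each person, a weighted sum of how many copies of each education-associated variant they carry, with the generational trend then read off a regression of score on birth year. The sketch below is schematic only, using simulated genotypes and assumed effect weights rather than deCODE's data or method.

```python
# Schematic polygenic-score trend; all genotypes and weights simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_people, n_variants = 10_000, 60

# Allele counts (0/1/2) at education-associated variants, per person.
genotypes = rng.binomial(2, 0.4, size=(n_people, n_variants))
weights = rng.normal(0.02, 0.01, size=n_variants)  # assumed effect sizes
birth_year = rng.integers(1910, 1976, size=n_people)

# Polygenic score = weighted sum of a person's trait-associated alleles.
pgs = genotypes @ weights

# The reported decline is essentially this score regressed on birth year.
trend = stats.linregress(birth_year, pgs)
print(f"slope: {trend.slope * 10:+.4f} per decade (p = {trend.pvalue:.2f})")
# Here the true slope is zero by construction; the Icelandic data showed
# a shallow negative slope across the 1910-1975 birth cohorts.
```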

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 23113 - Posted: 01.17.2017

By Peter Godfrey-Smith Around 2008, while snorkeling and scuba diving in my free time, I began watching the unusual group of animals known as cephalopods, the group that includes octopuses, cuttlefish and squid. The first ones I encountered were giant cuttlefish, large animals whose skin changes color so quickly and completely that swimming after them can be like following an aquatic, multi-armed television. Then I began watching octopuses. Despite being mollusks, like clams and oysters, these animals have very large brains and exhibit a curious, enigmatic intelligence. I followed them through the sea, and also began reading about them, and one of the first things I learned came as a shock: They have extremely short lives — just one or two years. I was already puzzled by the evolution of large brains in cephalopods, and this discovery made the questions more acute. What is the point of building a complex brain like that if your life is over in a year or two? Why invest in a process of learning about the world if there is no time to put that information to use? An octopus’s or cuttlefish’s life is rich in experience, but it is incredibly compressed. The particular puzzle of octopus life span opens up a more general one. Why do animals age? And why do they age so differently? A scruffy-looking fish that inhabits the same patch of sea as my cephalopods has relatives who live to 200 years of age. This seems extraordinarily unfair: A dull-looking fish lives for centuries while the cuttlefish, in their chromatic splendor, and the octopuses, in their inquisitive intelligence, are dead before they are 2? There are monkeys the size of a mouse that can live for 15 years, and hummingbirds that can live for over 10. Nautiluses (who are also cephalopods) can live for 20 years. A recent Nature paper reported that despite continuing medical advances, humans appear to have reached a rough plateau at around 115 years, though a few people will edge beyond it. The life spans of animals seem to lack all rhyme or reason. © 2016 The New York Times Company

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 22951 - Posted: 12.05.2016