A reader doesn’t quite buy the notion that cognitive performance peaks in one’s mid-20s:
Yeah, bite me. Gabriel Garcia Marquez was 40 when he published his most popular and most seminal work, One Hundred Years of Solitude. You hear that? One hundred years. What, are the next 76 going to be spent in obsolescence? His characters lived a hell of a lot longer than 24, and they were lively and smart to the very end. In the real world – Newton, Dickens, Springsteen. Sure, you can say “But they’re geniuses,” but billions of regular folk get sharper and better with age.
These kinds of studies just reinforce the concept of life as a rat race – a minute-by-minute, day-by-day competition to win at the game of life instead of lose. That’s not healthy. Life is amazing, as Carl Sagan said. It’s not a battle. There are no winners and losers. What makes true happiness? I would posit a contented life, free from anxiety and worry – including worry about studies that say you’re over the hill while you’re still young enough to be carded at restaurants.
A 27-year-old reader:
I’ve got to question the reasoning behind seeing a lag in “seeing and doing” between a 24-year-old and a 39-year-old. Can we not chalk that up to a penchant for slightly more forethought and planning as we age? No doubt there is a bit of pruning that goes on following adolescence. (I know because the Dish told me so.) Brains shrink. Synapses slow. In the context of judgment calls, I would argue the ability to plot strategy – chess vs. checkers – increases over time.
Mostly though, I know I’m on the downhill slide. Don’t rub it in.
But maybe he isn’t; another reader points to a recent article in New Scientist in which computational linguists Michael Ramscar and Harald Baayen argue that “our brains work better with age”:
Contrary to popular belief, neuronal loss does not play a significant role in age-related changes in brain structure. Rather, consistent with our findings, most of the changes that occur as healthy brains age are difficult to distinguish from those that occur as we learn. Thus, understanding the costs and benefits of learning is critical if we are to establish the facts of cognitive aging.
For example, memory experiments show that, as we age, we “encode” less contextual information, such as what we were wearing when we learned a new fact. This makes the fact harder to recall, and is seen as a sign of cognitive decline. Yet everything we know about the way our brains learn indicates that people must inevitably become insensitive to many background details as life experience grows. This is simply because detuning our attention to irrelevant information is integral to the process we call “learning”. …
We are not arguing that the functionality of our brains stays the same as we grow older, or that cognitive decline never happens, even in healthy aging. What we do know is the changes in performance seen on tests such as the PAL task are not evidence of cognitive or physiological decline in aging brains. Instead, they are evidence of continued learning and increased knowledge. This point is critical when it comes to older people’s beliefs about their cognitive abilities. People who believe their abilities can improve with work have been shown to learn far better than those who believe abilities are fixed.
The aforementioned reader adds, “I know I would rather have a 24-year-old as a fighter pilot, but a 50-year-old as an airline pilot.” He’s not the only one: a historically-minded reader notes that “in the Second World War, the Royal Air Force limited fighter combat to pilots who were younger than 26. Their estimate was that the reflexes began to slow enough to make it a fatal risk.” Update from a reader:
First of all: as a 28-year-old gamer-scientist who plays both StarCraft 1 (SC1) and StarCraft 2 (SC2) – and both since launch; I’ve been playing SC1 for just over 16 years now – this journal article is fucking infuriating. So much bad science, so much bad gaming reporting. To start with the science: the base data (Ref. 1 in the PLoS article, which you can find here) only looks at a few characteristics of what makes a “better” gamer in SC2. It boils down to: being a faster neurotic (higher APM) and using high-tech units / spellcasters more often and more wisely. The follow-up looks at how players across different ages compare on these characteristics and what their league level is.
Ok, here’s my scientist part: what is their control game? So far all I’ve seen is data showing that older SC2 players do worse than younger players according to their own definitions of a good SC2 player.
As a gamer: SC1 and SC2 are different games with almost the same base. Older players who played a lot of SC1 just do NOT play SC2 the same way, because we have over a decade of training to play it differently. SC1 rewards high APM, high macro, and high-tech / low-spell units. And if you look at the data in the cited PLoS article, it seems to say: hey, lower-skill players are all over the place (fig 3(a) in ref), but high-skill players are all concentrated in the young groups (fig 3(f) in ref).
As a scientist: do other games of similar skill/age profiles show the same age-histograms? Could it be that older people have less time to play these types of games and therefore make up less of the % of leagues?
As a gamer: on p. 6 they directly mention that older gamers do better than younger players on a skill that was highly rewarded in SC1. As an “older” SC1/SC2 gamer, this fits my first point – SC1 players who transitioned to SC2 just play differently. It’s a mechanics issue where the new rules are fighting directly against 10-plus years of trained play.
Finally: multiple reports recently have described a giant generational uptick in cognitive ability that might be directly related to the decline in lead exposure (see all the reporting by Kevin Drum at Mother Jones, though I have heard sociologists talking about this since at least 2005). You are looking at a report that compares people from ages 16-44 (those born between 1969 and 1997). So what’s the baseline cognitive ability for each cohort? Yes, there could be a cohort effect, but there is no control for it.