For just a moment, imagine being a child in the late 1990s, hunched over a Game Boy Color in the back seat of the family car, face approximately 30 centimetres (about 12 inches) from a screen the size of a playing card, squinting intently at a pixelated Charmander. Parents of the time insisted these ‘gosh darn video games’ were rotting their children’s brains. As it turns out, the truth is rather more interesting. The brain was not rotting; it was building what would later be confirmed by peer-reviewed neuroscience as a dedicated Pokémon brain region.
Researchers at Stanford University have confirmed that people who spent their childhoods catching, battling, and meticulously memorising hundreds of Pokémon characters developed a dedicated region of the brain for processing them. Perhaps most fascinating of all, this region sits in roughly the same anatomical location in every single participant tested. The visual cortex, it seems, did not merely level up during all those hours on the Game Boy: it evolved – just like those wondrous pocket monsters of the time.
Gotta Scan ’em All: The study proving the Pokémon Brain
The research, titled “Extensive childhood experience with Pokémon suggests eccentricity drives organisation of visual cortex”, was led by Jesse Gomez and Michael Barnett under the supervision of Professor Kalanit Grill-Spector, and published in the peer-reviewed journal Nature Human Behaviour in May 2019. It is, without question, one of the more entertaining pieces of neuroscience to have emerged from Stanford’s School of Medicine.
The team recruited two groups of adults: eleven who had begun playing the original Nintendo Pokémon games on the hand-held Game Boy device between the ages of five and eight, continued playing throughout childhood, and revisited the game at least once in adulthood; and eleven who had never played Pokémon at all and had little to no familiarity with the franchise. Gomez was among the experienced participants and noted that he began playing at around age six or seven and continued as Nintendo released new versions. Before entering the scanner, all participants completed a naming task in which they were shown forty randomly selected Pokémon sprites and asked to identify each from a choice of five options. The experienced group significantly outperformed the novices, which will surprise nobody who once knew the difference between Jolteon and Flareon on sight.
Both groups then underwent functional Magnetic Resonance Imaging (fMRI) – a technique that maps neural activity by measuring blood flow – whilst viewing images from eight different visual categories: faces, bodies, popular cartoon characters from television of the late 1990s and early 2000s, pseudowords (letter strings that resemble words but carry no meaning), Pokémon, animals, cars, and corridors. The results were, to put it in the franchise’s own parlance, super effective.
“It’s been an open question in the field why we have brain regions that respond to words and faces but not to, say, cars. It’s also been a mystery why they appear in the same place in everyone’s brain.”
– Jesse Gomez, lead author, Stanford press release

What the Pokémon Brain looks like
In experienced participants, viewing Pokémon characters consistently activated a specific area of the ventral temporal cortex (VTC) – the high-level visual processing region responsible for recognising things like words, faces, and objects. More specifically, the Pokémon brain region was centred on the occipitotemporal sulcus (OTS), a fold of cortex located at the base of the brain, just behind the ears. In novice participants, no such consistent activation was found; that same patch of cortex responded instead to animals, cartoons, and words.
What is particularly striking is the consistency. Every single experienced participant showed Pokémon-selective activation in the same anatomical structure – the OTS – with peaks at the posterior and middle extent of the sulcus (one of the grooves of the brain). The decoding accuracy for Pokémon in experienced participants reached approximately 81%, compared to around 45% in novices. To put it bluntly: a classifier trained on brain scan data alone could reliably identify when an experienced Pokémon player was looking at Pokémon. Novices were, in neurological terms, barely better than chance (like trying to catch a legendary with a basic Pokéball).
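For readers curious what “decoding accuracy” actually means, the general idea can be sketched with a toy example. Everything below is illustrative – synthetic data and a deliberately simple nearest-centroid classifier, not the study’s actual voxel responses or analysis pipeline: a classifier learns the average voxel activity pattern per image category from most scanning runs, then must guess which category evoked each pattern in a held-out run.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumed values): 8 stimulus categories, 20 "voxels", 6 scanning runs.
# Each category evokes a characteristic voxel pattern, corrupted by measurement noise.
n_categories, n_voxels, n_runs = 8, 20, 6
signatures = rng.normal(0, 1, (n_categories, n_voxels))   # true pattern per category
patterns = signatures[None, :, :] + rng.normal(0, 0.5, (n_runs, n_categories, n_voxels))

# Leave-one-run-out decoding: train centroids on 5 runs, classify the 6th.
correct = 0
for test_run in range(n_runs):
    train = np.delete(patterns, test_run, axis=0)          # (5, 8, 20)
    centroids = train.mean(axis=0)                         # mean pattern per category
    for cat in range(n_categories):
        # Predict the category whose centroid is nearest to the held-out pattern.
        dists = np.linalg.norm(centroids - patterns[test_run, cat], axis=1)
        correct += int(np.argmin(dists) == cat)

accuracy = correct / (n_runs * n_categories)
print(f"decoding accuracy: {accuracy:.0%}")  # well above the 12.5% chance level here
```

In this toy 8-way setup chance is 12.5%; the study’s own decoding scheme and chance level differ, but the principle – reading the viewed category off the spatial pattern of brain activity – is the same.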
Despite the results seemingly speaking for themselves, it is worth clarifying what the researchers are and are not claiming. The paper itself notes that the results should not be interpreted as the emergence of a new Pokémon functional module in the OTS on par with the face-selective cortex. Rather, the Pokémon brain region is an existing stretch of cortical tissue, repurposed through childhood experience, whose location is consistent and predictable across individuals. The researchers also confirmed that experienced participants showed no meaningful loss of processing ability for other categories such as animals, faces, or words. Basically, having a dedicated Pokédex in the brain has in no way displaced anyone’s ability to recognise, for example, an actual cat. The brain has simply found room for both, which may explain why so many of us can still correctly guess “who’s that Pokémon” on the first try.
Why the Pokémon Brain lives behind the ears
As fascinating as this research is on the surface, the more theoretically significant finding concerns where this Pokémon brain region sits and why it ended up there rather than somewhere else. This is where the study enters the realm of genuinely novel neuroscience.
Four competing theories had been proposed to explain how the visual cortex organises itself during development: eccentricity bias (where stimuli fall on the retina), rectilinearity (how straight versus curved the stimulus is), perceived animacy, and perceived real-world size. The researchers analysed Pokémon sprites against all four metrics and generated testable predictions for where each theory would expect Pokémon-selective activation to emerge.
Only one theory matched the data: eccentricity bias. On a Game Boy screen measuring roughly 2.5 × 2.5 centimetres (about 1 × 1 inch), Pokémon sprites were small and pixelated. The paper calculates that, when foveated upon at the assumed average holding distance of 32 centimetres (around 12.5 inches), the characters subtended approximately 2 degrees of visual angle – placing them squarely within the central visual field, which is processed by the fovea. The brain’s visual cortex is organised, in part, according to where on the retina stimuli typically fall: foveally viewed stimuli activate the lateral VTC; peripherally viewed stimuli activate more medial regions. Pokémon, being small and held close on an identically sized screen by children of similar arm length, consistently activated the lateral OTS, which is where the foveal-input regions reside. The other three theories – rectilinearity, animacy, and perceived size – each generated predictions that did not match the observed location of Pokémon brain activation.
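The visual-angle arithmetic is simple enough to check. A minimal sketch, using the standard formula for the angle subtended by an object of a given size at a given distance – the 32 cm holding distance is the paper’s stated figure, while the ~1.1 cm sprite height is an illustrative assumption (sprites occupy only part of the ~2.5 cm screen):

```python
import math

sprite_cm = 1.1      # assumed sprite height; the full screen is ~2.5 cm
distance_cm = 32.0   # the paper's assumed average child holding distance

# Visual angle subtended: 2 * atan(size / (2 * distance)), converted to degrees.
angle_deg = math.degrees(2 * math.atan(sprite_cm / (2 * distance_cm)))
print(f"{angle_deg:.2f} degrees")  # ~2 degrees, comfortably within central vision
```

Anything this small and this consistently centred lands on the fovea, run after run, year after year – exactly the input regime the eccentricity-bias theory says should carve out a lateral VTC representation.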
In other words, the reason every former Game Boy-wielding child has a Pokémon brain region in roughly the same anatomical location is not because Pokémon are inherently special to the brain. It is because every child held the same small device at roughly the same distance from their face, repeatedly for hours a day over a period of years, during a critical developmental window. Thus, Pokémon brain.
“What was unique about Pokémon is that there are hundreds of characters, and you have to know everything about them in order to play the game successfully. The game rewards you for individuating hundreds of these little, similar-looking characters”.
– Jesse Gomez, Nature Human Behaviour (2019)

Timing the Pokémon Brain Window
One of the study’s most important implications concerns the timing of experience. Prior research in macaque monkeys at Harvard Medical School had established that new category-selective regions emerge in the visual cortex only when exposure begins in juveniles. The effect is not seen when the same training is administered to adult animals. The Stanford data are consistent with this finding in humans: the Pokémon brain region was anatomically consistent and highly decodable only in participants whose experience began between the ages of five and eight. The paper is careful to note that these findings are consistent with the existence of a critical developmental period, rather than definitively establishing one. Even so, the implication is quite clear: adults beginning their Pokémon journey today are unlikely to develop the same neural specialisation. The window, it appears, has largely closed.
Most importantly, the results have genuine scientific implications well beyond the pocket monster franchise that makes these findings so novel. The research suggests that there is a critical period for sculpting dedicated representations in the human VTC – one extending to at least school age – and that what a child sees repeatedly and attentively during that window, whether faces, letters, or pixelated pocket monsters, shapes how their visual cortex is permanently organised. The researchers draw an explicit parallel to the development of reading: word-selective cortex emerges during a similar age range, also in the lateral OTS, and is also driven by foveal, central-vision processing. The Pokémon brain and the reading brain, it turns out, share a developmental neighbourhood. The importance of better understanding the information children consume daily (think social media) has thus never been clearer.
Important research caveats
One must always be critical when reviewing data from any research study, and the Pokémon brain study warrants a handful of important qualifications.
For a start, the sample size is modest: only eleven experienced participants and eleven novices. The consistency of the findings is compelling, and the statistical effects are large, but twenty-two participants do not constitute large-scale population research. The authors do note that no statistical methods were used to predetermine sample sizes, and that group sizes are comparable to those reported in prior published work in the field. However, replication with larger and more demographically diverse cohorts would strengthen the conclusions considerably.
It is also worth noting that participants were selected through self-reporting and were required to have continued playing, or revisited the game, in adulthood. This introduces a potential selection bias: the experienced group may represent particularly dedicated fans rather than the average childhood player. Whether more casual Pokémon engagement during childhood produces a similar – if less pronounced – Pokémon brain signature remains an open question the study does not address.
Finally, the study used 8-bit Pokémon sprites from the original Nintendo Game Boy game, with 144 images per category presented across six scanning runs. The researchers did note, however, that cartoon-format Pokémon characters – which are less pixelated – still activated the same region in a small number of subjects. This suggests the effect generalises somewhat beyond the 8-bit sprites; however, the degree to which it extends to later generations of the franchise, or to higher-resolution modern games, has not been systematically tested.

Embrace the Pokémon Brain
Gamers consistently have to endure the stereotype that ‘games rot the brain’. Yet unlike social media, which research more often links to negative outcomes, games are frequently associated with positive changes in brain activity, response times, and much more. That is not to say games are without fault, but in a world of so many technological offerings, research suggests they are among the least of anyone’s worries. The Pokémon brain research builds on these notions by offering a positive look at the impact games may have on the brain and, in so doing, has revealed some incredible new insight into the way the brain functions.
In this sense, it shows that consistent, intensive, rewarded visual experience during a critical developmental window reshapes how the human visual cortex allocates its resources – and the Game Boy, with its fixed screen size and near-universal adoption among children of a particular era, provided a remarkably well-controlled natural experiment to demonstrate it.
So the next time someone suggests childhood gaming was, or is, a waste of time, simply cite this peer-reviewed literature. Those hours spent training Pokémon on a Game Boy, repeated day after day over years of childhood, did anything but rot the brain. Instead, they gave it a unique brain region that has, years later, allowed us to better understand the way the brain functions. How useful it is to be able to identify Pokémon at a glance in adulthood is anyone’s guess, but at least it is one more skill in a gamer’s repertoire toward becoming “the very best, like no one ever was” – and I will take it!
Owner, founder and editor-in-chief at Vamers, Hans has a vested interest in geek culture and the interactive entertainment industry. With a Master’s degree in Communications and Ludology, he is well read and versed in matters relating to video games and communication media, among many other topics of interest.