TL;DR

  • The claim that “humans haven’t changed in 50,000 years” is empirically wrong: ancient DNA shows strong, recent selection on brain-related traits, including intelligence proxies, educational attainment, and psychiatric risk.
  • Polygenic time series in Europe and Eastern Eurasia reveal directional shifts in cognitive scores and mental-disorder risk throughout the Holocene, especially after the Neolithic.
  • New work on “recently evolved variants linked to brain and cognitive traits” finds enrichment in language areas and robust associations with intelligence and psychiatric phenotypes, implying ongoing brain-specific evolution.
  • The Y-chromosome bottleneck and sex-biased demography show that male lineages were radically culled and reshaped in the last 8–10k years, almost certainly with behavioral and social correlates.
  • Eve Theory of Consciousness (EToC) frames this as the transition from Golden Man (pre-self control system) to Eve (self-referential, psychiatric, narrative). The genetics now say: this wasn’t a one-off glitch 50kya; the self has been under continuous remodeling right into recorded history.

Companion articles: For deep dives into Akbari & Reich’s schizophrenia findings, see “Ancient DNA Shows Schizophrenia Risk Purged Over 10,000 Years” and “Holocene Minds on Hard Mode”. For understanding Golden Man as a control system, see “Golden Man as Control System”.


“Man is not a finished creation but the beginning of a process.”
— loosely adapted from Nietzsche, who would have loved ancient DNA


1. The 50,000-Year Myth#

You’ve heard the line:

“Biologically, we’re basically the same as people in the Ice Age. Evolution stopped 50,000 years ago; everything since is culture.”

It shows up in pop-evolutionary psychology, in blank-slate egalitarian rhetoric, and in a certain kind of conservative anthropology where the Upper Paleolithic is frozen as the canonical “environment of evolutionary adaptedness.”

The story, in compact form:

  • Step 1. “Modern behavior” arrives around 50–70kya: art, ornaments, complex tools.
  • Step 2. The brain is assumed to be “complete” at this point; all subsequent changes are said to be cultural.
  • Step 3. Therefore, any psychological difference across time or populations must be “just culture.”

It’s a reassuring myth because it makes everyone’s life easier:

  • Geneticists don’t have to talk about uncomfortable behavioral traits.
  • Social scientists can pretend the genotype is a constant.
  • Philosophers can treat “the human condition” as a single, timeless object.

The trouble is that the last fifteen years of ancient DNA have quietly set this story on fire.

David Reich’s group and others have shown that the genome has been under violent, ongoing selection for the last 10–15k years, including regions that strongly influence brain and behavior. The mind did not freeze at Lascaux; it has been moving.

The question is no longer whether human psychology has evolved recently, but how much and in which directions.


2. What Ancient DNA Actually Shows#

The key conceptual shift is simple:

Before: “We can’t see selection on psychological traits, so let’s assume there was none.”
After: “We can track polygenic scores for cognitive and psychiatric traits in ancient genomes across time and space.”

The second world is where we live now.

2.1 Akbari et al.: Pervasive Directional Selection#

Akbari et al. (2024) developed a time-series method for detecting selection in ancient DNA by looking for consistent allele-frequency trends across thousands of ancient West Eurasian genomes spanning ~14,000 years.
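
Their actual model is more sophisticated, but the core move is easy to picture: for each variant, ask whether its frequency in dated ancient genomes trends through time more consistently than drift and sampling noise would allow. A toy sketch of that logic, on simulated data (this is not their pipeline):

```python
# Toy sketch of a time-series selection scan (simulated data; NOT Akbari et al.'s method).
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(0)

# One SNP observed in 14 time bins spanning ~14,000 years, with a gentle rise in frequency.
ages = np.linspace(14000, 0, 14)                      # sample ages, years before present
true_freq = 0.25 + 0.25 * (14000 - ages) / 14000      # frequency climbs from 0.25 to 0.50
n_alleles = rng.integers(40, 200, size=ages.size)     # alleles sampled per bin
derived = rng.binomial(n_alleles, true_freq)          # observed derived-allele counts

# Regress observed frequency on elapsed time and test the slope.
slope, intercept, r, p, se = linregress(14000 - ages, derived / n_alleles)
print(f"frequency change per 1,000 years: {1000 * slope:+.4f} (p = {p:.2g})")

# A real scan does this genome-wide, models binomial sampling noise and ancestry
# turnover explicitly, and flags loci whose trends are too consistent to be drift.
```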

Their results:

  • 347 independent loci with >99% probability of selection—an order of magnitude more than previous scans.
  • Besides classic hits (lactase persistence, pigmentation, immune loci), they find selection on combinations of alleles that in present-day GWAS are associated with:
    • Lower risk of schizophrenia and bipolar disorder
    • Slower health decline
    • Higher cognitive performance (IQ tests, years of schooling, household income)

They are cautious about interpreting these as “direct” selection on, say, IQ or years of schooling—those are modern phenotypes—but the direction is not subtle: genome regions that today predict better cognitive outcomes and lower severe-psychiatric risk were systematically moved by selection in the Holocene.

That is not “no change in 50,000 years.” That is the genome being tuned in our grandparents’ grandparents’ grandparents’… deep past, well after the cave-painting era.

2.2 Kuijpers et al.: Polygenic Trajectories for Intelligence#

Kuijpers et al. (2022) took ~872 ancient European genomes from the Upper Paleolithic through the post-Neolithic and computed polygenic scores (PGS) for a range of traits: height, BMI, lipoproteins, cardiovascular risk, and general cognitive ability / intelligence.
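
A polygenic score is nothing exotic: count an individual's effect alleles at each GWAS SNP, weight by the SNP's estimated effect size, and sum. A minimal sketch with made-up arrays (the real pipeline also has to handle low-coverage ancient genotypes, LD pruning, and ancestry correction):

```python
# Minimal polygenic-score sketch (hypothetical inputs, not Kuijpers et al.'s pipeline).
import numpy as np

rng = np.random.default_rng(1)
n_individuals, n_snps = 872, 5000

# dosages[i, j]: count of the effect allele (0, 1, or 2) for individual i at SNP j;
# for ancient samples these are typically imputed or pseudo-haploid calls.
dosages = rng.integers(0, 3, size=(n_individuals, n_snps)).astype(float)
# betas[j]: per-allele effect size from a modern GWAS of the trait.
betas = rng.normal(0.0, 0.01, size=n_snps)

pgs = dosages @ betas                       # PGS_i = sum_j beta_j * dosage_ij
pgs_z = (pgs - pgs.mean()) / pgs.std()      # standardize for comparability
print(pgs_z[:5])
```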

Their findings:

  • After the Neolithic, European populations show rising polygenic scores for height and intelligence, plus lighter skin pigmentation, alongside less flattering shifts (e.g., higher coronary artery disease risk via lower HDL).
  • These trends persist after controlling for ancestry and drift, indicating directional selection.

In other words:

  • Upper Paleolithic Europeans are not identical to medieval or early modern Europeans in cognitive polygenic load.
  • There is a trend toward higher GCA-related polygenic scores across the Holocene in these datasets.

Again, this is not “no change.” It’s measurable, directional evolution of traits closely tied to what we call “intelligence.”

2.3 Piffer: Eastern Eurasia Joins the Party#

Davide Piffer (2025) did something analogous for Eastern Eurasian populations, computing PGS for traits including IQ, educational attainment, autism, and schizophrenia across 1,245 ancient genomes spanning the Holocene.
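
The trend test behind results like this is, at its core, a regression of individual polygenic scores on sample age; the interesting work goes into the covariates and the drift null. A bare-bones, simulated sketch (not his actual analysis):

```python
# Bare-bones PGS-through-time trend test (simulated data; not Piffer's actual analysis).
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(2)

ages_bp = rng.uniform(0, 10000, size=1245)                 # sample ages, years before present
pgs = -0.00003 * ages_bp + rng.normal(0, 1, size=1245)     # weak rise toward the present + noise

slope, intercept, r, p, se = linregress(ages_bp, pgs)
print(f"PGS change per 1,000 years toward the present: {-1000 * slope:+.3f} SD (p = {p:.2g})")

# A serious version adds ancestry components (hunter-gatherer / farmer / steppe),
# sample-quality covariates, and a drift null, e.g. matched control SNP sets or
# permuted GWAS effect sizes.
```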

Headline:

  • Significant temporal trends in PGS for cognitive and psychiatric traits, consistent with directional selection, not pure drift.

The details are messy (as they should be), but the empirical bottom line rhymes with Akbari and Kuijpers:

  • Intelligence- and education-related alleles have not been randomly drifting; they’ve been caught in long, slow selection gradients.
  • Schizophrenia and other psychiatric risks show signals consistent with being pruned back, at least in some lineages.

This is already devastating to the frozen-self myth. But we’re just getting warmed up.


3. The Brain Has Been Under Active Construction#

Ancient polygenic scores tell you direction—this or that trait is being favored. Libedinsky et al. (2025) go after a deeper question:

When did the specific variants that sculpt the human brain and cognition appear, and what are they doing?

3.1 Libedinsky et al.: Recently Evolved Brain / Cognition Variants#

Libedinsky and colleagues integrate:

  • Genome dating from the Human Genome Dating Project
  • Modern GWAS hits for traits like intelligence, cortical area, psychiatric disorders

They then trace when the variants associated with these traits appeared over the last ~5 million years.

Key findings:

  • Genes with recent evolutionary modifications (late-appearing or strongly frequency-shifted alleles) are enriched for roles in brain and cognitive traits, including:
    • Intelligence (P ≈ 1.7 × 10⁻⁶)
    • Cortical surface area (P ≈ 3.5 × 10⁻⁴)
    • Psychiatric phenotypes like schizophrenia and bipolar disorder
  • These genes show elevated expression in language-related cortical areas, which are quintessentially human.

Their conclusion is blunt: recently evolved genetic variants shaped the human brain, cognition and psychiatric traits.

This is the opposite of “brain finished at 50kya.” It is a time-resolved argument that the wiring and risk profile of the human brain has been incrementally hacked well into the Holocene.
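
For concreteness, "enrichment" here is the familiar overlap statistic: genes tagged by recently evolved variants intersect trait-associated gene sets more often than random draws would. A toy hypergeometric version with invented counts (not Libedinsky et al.'s numbers or method):

```python
# Toy gene-set enrichment test (invented counts, not Libedinsky et al.'s).
from scipy.stats import hypergeom

N = 18000   # background: all genes tested
K = 1200    # genes tagged by recently evolved variants
n = 900     # genes associated with intelligence in a modern GWAS
k = 110     # overlap between the two sets

p_enrich = hypergeom.sf(k - 1, N, K, n)   # P(overlap >= k) under random sampling
fold = (k / n) / (K / N)                  # observed vs. expected overlap rate
print(f"fold enrichment = {fold:.2f}, one-sided p = {p_enrich:.2g}")
```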

3.2 Psychiatric Genes as Evolution’s Collateral Damage#

One of the more uncomfortable patterns that keeps appearing:

  • Regions under selection for brain and cognition traits are enriched for variants associated with major psychiatric disorders.

That is:

  • The same genomic neighborhoods that give you more cortex, better test performance, or finer language are also the neighborhoods where schizophrenia, bipolar, autism risk cluster.

Add in the evidence for ancient viral sequences (endogenous retroviruses) being linked to psychiatric disorders, and you get an evolutionary story that is not clean: brains gain representational power, but in doing so expose new failure modes.

From an EToC perspective, this is gorgeous:

  • Eve—self-awareness, narrative, recursive guilt—is not a free upgrade.
  • It is built on tangled, recently cobbled molecular machinery in which every step toward richer cognition increases the space of ways a mind can break.

Evolution has continued, and it has been selecting on the very substrate of the self.


4. The Y-Chromosome Bottleneck: Male Minds Under the Guillotine#

If you want a single, rude data point against “no recent evolution,” the Y-chromosome bottleneck is it.

Karmin et al. (2015) used whole Y-chromosome sequencing to reconstruct effective male population sizes over time across regions. They (and follow-ups) found:

  • A massive, global drop in male effective population size (Ne) during the mid- to late Holocene (~8,300 BP in the Near East, ~5,000 BP in Europe, ~1,400 BP in Siberia, depending on region).
  • Female effective population sizes (mtDNA) did not show the same crash.

Interpretation:

  • This is not a human-species bottleneck. It’s a male-line bottleneck: many Y lineages were extinguished while female lineages kept diffusing.
  • The most conservative explanations lean on patrilineal social structure, extreme variance in male reproductive success, and possibly warfare and social stratification.

You do not get a multi-continent 5–10x culling of male lineages because nothing interesting was happening to male traits. At minimum, this implies that:

  • There were intense selection-like filters applied to male behavioral phenotypes (coalition-forming, aggression, discipline, status games).
  • Many alternative male “mind designs” were culled, repeatedly, in the last 10k years.

The exact balance of cultural vs biological selection on the Y is debated, but that’s beside the point: the male genealogical tree was ripped apart and rewired long after the supposed freeze date. That’s not a species with a finished self.
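
To see how reproductive skew alone can crater male lineage diversity, here is a toy simulation (invented parameters, not a reconstruction of any real population): hold population size constant, let a minority of men father most children while motherhood stays comparatively even, and count how many founding Y versus mtDNA lineages survive.

```python
# Toy model of a Y-chromosome bottleneck driven purely by skewed male reproduction.
import numpy as np

rng = np.random.default_rng(3)

def surviving_lineages(n=1000, generations=50, male_skew=2.0):
    """Count founding Y and mtDNA lineages left after `generations` at constant size n
    per sex. `male_skew` sets how unevenly fatherhood is distributed (0 = even)."""
    y = np.arange(n)    # each founding man is his own Y lineage
    mt = np.arange(n)   # each founding woman is her own mtDNA lineage
    for _ in range(generations):
        w = rng.lognormal(mean=0.0, sigma=male_skew, size=n)   # heavy-tailed paternity weights
        fathers = rng.choice(n, size=n, p=w / w.sum())
        mothers = rng.choice(n, size=n)                        # motherhood roughly even
        y, mt = y[fathers], mt[mothers]   # male line carries Y, female line carries mtDNA
    return np.unique(y).size, np.unique(mt).size

for skew in (0.0, 2.0):
    y_left, mt_left = surviving_lineages(male_skew=skew)
    print(f"skew={skew}: {y_left} Y and {mt_left} mtDNA founding lineages remain of 1000")
```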


5. A Short Table of Inconvenient Evidence#

Let’s put some of this into a single messy but honest table.

| Line of evidence | Time window (approx.) | Trait(s) affected | Direction / implication |
| --- | --- | --- | --- |
| Akbari et al., directional selection in aDNA | Last 14k years (West Eurasia) | Schizophrenia & bipolar risk; cognitive performance (IQ tests, education, income) | Selection on allele combos that today predict lower severe-psychiatric risk and higher cognitive performance |
| Kuijpers et al., PGS trajectories | Upper Paleolithic → post-Neolithic Europe | Intelligence, height, skin pigmentation | Increase in GCA-related scores and height, lighter skin after the Neolithic; not explained by drift |
| Piffer, Eastern Eurasia | Holocene | IQ, educational attainment, schizophrenia, and others | Significant temporal trends in PGS consistent with directional selection on cognitive and psychiatric traits |
| Libedinsky et al., recent brain variants | Last ~5M years, with emphasis on recent | Intelligence, cortical area, psychiatric traits | Recently evolved variants enriched in brain/language areas, associated with intelligence and psychiatric risk |
| Y-chromosome bottleneck | ~8,000–1,500 BP (region-dependent) | Male lineages (behavior, social roles) | Drastic reduction in male Ne; implies extreme variance and strong filters on male behavioral phenotypes |
| Viral insertions & psychiatry | Ancient insertions, modern effects | Schizophrenia, bipolar, depression | Ancient viral DNA sequences statistically linked to major psychiatric disorders, adding another evolutionary layer to mental-illness risk |

This is not exhaustive. It’s just what you get from glancing at the recent literature.

The shared refrain: late, ongoing, brain-touching evolution.


6. Why the “Finished Self” Myth Survived So Long#

Given all this, why does anyone still say “we haven’t changed in 50,000 years”?

A few reasons:

  1. Methodological convenience.
    For a long time, we literally couldn't see selection on polygenic, behaviorally relevant traits. The safe assumption for many modelers was "let's treat the genotype as constant across the Holocene and focus on culture." Reasonable then; lazy now.

  2. Ideological comfort.
    A static human nature is handy:

    • For egalitarians who fear that evolutionary change = justification for hierarchy.
    • For certain evolutionary psychologists who want a single, universal EEA story.
    • For theologians who like an unchanging "fallen but fixed" human condition.

  3. Confusing morphology with mind.
    Cranial capacity and gross anatomy do stabilize somewhat by ~100–50kya. People misread that as: “the mind stabilizes too,” ignoring that small regulatory changes and polygenic tweaks can radically alter cognition and risk profiles without touching skull shape.

  4. Data lag.
    Ancient DNA is young: Reich’s Who We Are and How We Got Here came out in 2018; the Akbari and Libedinsky papers are literally from 2024–2025. The myth is a fossil from the pre-aDNA era.

At this point, clinging to the “finished self” story requires ignoring a rising tide of direct evidence.


7. How This Interlocks with Eve Theory of Consciousness#

Eve Theory of Consciousness already argued, on cognitive and mythological grounds, that:

  • Golden Man – our pre-self ancestor – was a sophisticated control system with rich experience but no explicit, linguistically-mediated self-model.
  • Eve – the self-aware, guilt-prone, time-traveling subject – is a recent evolutionary overlay built via recursive language and selection on introspective social cognition.

The new genetic data harmonize with this in several ways:

7.1 The Self as a Moving Target#

The EToC claim is not just: “there was a moment of awakening 15kya, now we’re done.” It’s more like:

Once recursive self-modeling appears in some lineages, it becomes a target for selection itself.

Ancient DNA shows exactly this kind of dynamic environment:

  • Populations expanding, mixing, replacing each other.
  • New subsistence modes (farming, herding) changing selection pressures on planning, cooperation, time preference.
  • Selection favoring genotypes that (in modern settings) predict better cognitive performance and lower catastrophic psychiatric failure.

In EToC terms: once Eve is on the scene, we keep rebreeding her. Different cultures and ecologies sculpt different Eves, not clones of an Ice Age template.

7.2 Psychiatric Disorders as Eve’s Shadow#

EToC treats psychosis, bipolar disorder, intense mood disorders as failures or overshoots of the Eve transition: minds that can no longer keep their self-model stable.

The overlap between:

  • Brain regions under selection
  • Newly evolved variants linked to intelligence & cortical area
  • Psychiatric risk loci

is exactly what you’d expect if:

  • The same neural machinery that makes selfhood possible also makes psychosis and severe mood disorder possible.
  • Evolution has been pushing and pruning these circuits recently, not 50kya and done.

Eve is not a stable product; she is a living compromise between enough self to plan and love, and not so much self that you crack.

7.3 Sex, Warfare, and Divergent Minds#

The Y-chromosome bottleneck slots into EToC’s male-female asymmetry:

  • If female lineages are under consistent selection for social prediction, caregiving, and moral cognition (your Eve),
  • And male lineages are being culled according to coalitionary success, hierarchical violence, and risk-taking (your Bronze Age Cain),

you naturally get sex-differentiated trajectories in cognitive style, aggression, emotional regulation—again, over Holocene timescales.

No one sensible thinks you can read those trajectories directly off a SNP list. But the pattern of sex-biased demography plus recent brain evolution fits a story where the kind of self available to males and females has been shifting in nontrivial ways.


8. So What Did Change Since the Upper Paleolithic?#

Let’s be concrete. The claim is not that your mind is unrecognizable to a Cro-Magnon hunter. It’s that the distribution and failure modes of minds changed.

Plausible Holocene shifts include:

  • Average capacity for abstract and formal reasoning (schooling, law, complex technology)
    • Supported by upward trends in PGS for intelligence/educational attainment in multiple regions.
  • Risk of catastrophic psychosis and mood disorder
    • Selection appears to have pruned some of the worst schizophrenia/bipolar burdens in specific lineages, even as new risk architectures appear.
  • Sex differences in variance of behavioral phenotypes
    • Extreme variance in male reproductive success plus patrilineal social structures likely amplified certain male cognitive/behavioral styles and extinguished others.
  • Time preference, impulse control, coalitionary behavior
    • Farming, property, and complex states reward longer planning horizons and rule-following; groups that couldn't produce enough of these traits fared poorly. Polygenic trends for educational attainment are plausible proxies here.

All of this sits atop the same basic Homo sapiens hardware, but hardware is not destiny; small changes in regulatory networks and allele frequencies shift the statistical landscape of minds.
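
That last sentence hides some simple arithmetic: a polygenic trait's mean moves by roughly twice the sum of per-allele effects times the allele-frequency changes, and even a modest mean shift multiplies the tails. A back-of-the-envelope sketch with invented numbers:

```python
# Back-of-the-envelope: small allele-frequency shifts -> visible change in the trait tails.
# All numbers are invented for illustration.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)

n_snps = 1000
betas = np.abs(rng.normal(0.0, 0.01, size=n_snps))    # per-allele effects, in SD units
delta_p = rng.normal(0.01, 0.01, size=n_snps)         # small frequency shifts per allele

mean_shift = 2.0 * np.sum(betas * delta_p)            # diploid mean shift: 2 * sum(beta_i * dp_i)
print(f"trait mean shift ≈ {mean_shift:.2f} SD")

# How much does that move the far tail, if the distribution just slides over?
before = norm.sf(2.0)                 # share beyond +2 SD of the old mean, before
after = norm.sf(2.0 - mean_shift)     # share beyond the same absolute threshold, after
print(f"share beyond +2 SD: {before:.4f} -> {after:.4f} ({after / before:.2f}x)")
```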

You are still an ape running Golden Man’s control system. You are not the same ape as a 40kya Aurignacian hunter in anything but the loosest, vaguest sense.


9. Why This Matters (Beyond Nerd Satisfaction)#

This isn’t just a “well actually” to annoy people at conferences. It has teeth.

  1. It reframes “human nature.” Human nature is not a fixed point; it’s a trajectory. Treating it as frozen bakes in 20th-century ignorance. Once you accept that the brain’s genetic architecture has been moving, you can start asking how and why more intelligently.

  2. It undercuts simplistic cultural determinism. Culture matters enormously. So do differences in selection histories. The insistence that all psychological variation is “just culture” is as unscientific as crude genetic determinism, and ancient DNA removes the last excuse for it.

  3. It forces seriousness about psychiatric disorders. If the circuits that make modern selfhood possible are recent and fragile, then conditions like schizophrenia, bipolar disorder, severe depression are not random bugs—they are deep evolutionary tradeoffs. That should inform how we think about stigma, treatment, and prevention.

  4. It makes space for theories like EToC. A static-brain world has no room for an Eve-theory story where the self emerges in the Holocene. A dynamic-brain world does. You don’t have to accept EToC to see that its basic gamble—that consciousness has a history—fits much better with the genetic data than the 50kya-freeze fairy tale.

  5. It gives us a future tense. If our minds have changed in 10,000 years, they will change in the next 10,000, via both natural selection and deliberate genetic/technological intervention. We are not “the final form.” We are a mid-season patch.


FAQ#

Q1. Does this mean people from different regions have fundamentally different “selves”?
A. It means different populations have partly different selection histories on brain-related traits. That can shift statistical distributions without creating separate “species of self.” Variation within groups remains huge; but pretending there was no selection at all is now indefensible.

Q2. How solid are polygenic scores in ancient DNA?
A. They’re noisy and depend on modern GWAS, but multiple independent teams (Kuijpers, Akbari, Piffer, others) using different datasets and methods are converging on similar directional trends for key traits, which is exactly what you’d expect under genuine selection.

Q3. Could all these signals be just selection on non-cognitive traits that happen to share SNPs with cognitive ones?
A. Pleiotropy is absolutely in play—but that doesn’t rescue the “no change” claim. If the web is connected, pulling one part (e.g., disease resistance) still moves cognition and psychiatric risk along with it. The mind rides on that web.

Q4. Are we sure evolution didn’t mostly finish 200kya instead of 50kya?
A. No. The best current evidence says significant evolution continued well into the Holocene, particularly on traits tied to diet, immunity, pigmentation, height, cognition, and psychiatric risk. The exact timelines will be refined, but the direction of travel is clear: evolution never stopped.


Sources#

  1. Akbari, A. et al. “Pervasive findings of directional selection realize the promise of ancient DNA to elucidate human adaptation.” Preprint / in review, 2024.
  2. Kuijpers, Y. et al. “Evolutionary Trajectories of Complex Traits in European Populations of Modern Humans.” Frontiers in Genetics 13 (2022): 833190.
  3. Piffer, D. “Directional Selection and Evolution of Polygenic Traits in Eastern Eurasia: Insights from Ancient DNA.” Preprint, 2025.
  4. Libedinsky, I. et al. “The emergence of genetic variants linked to brain and cognitive traits in human evolution.” Cerebral Cortex 35(8) (2025): bhaf127.
  5. Karmin, M. et al. “A recent bottleneck of Y chromosome diversity coincides with a global change in culture.” Genome Research 25 (2015): 459–466.
  6. Guyon, L. et al. “Patrilineal segmentary systems provide a peaceful explanation for the post-Neolithic Y-chromosome bottleneck.” Nature Communications (2024).
  7. Heyer, E. et al. “Sex-specific demographic behaviours that shape human genetic variation.” Molecular Ecology 21 (2012): 597–612.
  8. Scientific Inquirer. “Ancient viral DNA in the human genome linked to major psychiatric disorders.” May 23, 2024.
  9. Reich, D. Who We Are and How We Got Here: Ancient DNA and the New Science of the Human Past. Pantheon, 2018. (See also his public talk on ancient DNA and human history.)
  10. Cutler, A. “Holocene Selection on Human Intelligence.” SnakeCult.net (2025).