TL;DR

  • Think of Golden Man (EToC’s pre-Eve human) as a high-level control system: perception–action loops regulating body, group, and environment, but without a reflective, narrating “I.”
  • Modern theories of animal consciousness, predictive processing, basal cognition, and bicameral minds converge on this picture: experience without introspective selfhood.
  • Golden Man shared a lot with other vertebrates: a unified world-model, affect, learning, maybe “what-it’s-like-ness,” but no capacity to treat their own mind as an object.
  • The step to Eve—the self-reflective, guilt-ridden, time-traveling narrator—is a phase transition: the controller starts modeling itself, and the system acquires a first-person story.
  • Understanding Golden Man isn’t antiquarian. It tells you what your nervous system is still doing underneath the voice in your head, and how close we remain to our last animal ancestor.

Companion articles: For how language creates the transition from Golden Man to Eve, see “The Snake and the Syntax Tree”. For scientific evidence supporting EToC, see “Ten Papers that Push EToC Toward Science”.


“The key concept was that of feedback… The effects of actions are fed back to guide future action.”
— Norbert Wiener, Cybernetics (1948)


1. Meet Golden Man#

In Eve Theory of Consciousness (EToC), Golden Man is not a poetic metaphor; it’s a proposed cognitive regime. Before Eve—before recursive self-awareness—there was a long stretch of Homo sapiens history in which brains were powerful, social, and verbal, but not yet in the business of narrating themselves.

You can picture Golden Man as:

  • Highly skilled at hunting, tracking, toolmaking, gossip, coalition-management.
  • Saturated in myth and ritual, with gods, ancestors, totems.
  • Experiencing a rich world of sights, sounds, pain, joy, fear.
  • But lacking the specific move: “I am having this experience; I might do otherwise; I am a thing I can think about.”

Julian Jaynes famously argued that ancient humans operated with a bicameral mind, guided by auditory hallucinations identified as divine commands rather than by introspective deliberation. EToC pushes something like this deeper into prehistory and ties it explicitly to selection on psychiatric and language-related traits.

The way in is cybernetics. Golden Man wasn’t “primitive” in the sense of stupid; he was non-reflective in the sense of being:

A hierarchy of control systems, not yet running a self-model.

To make that precise, we need some tools.


2. Control Systems, Not Persons#

Cybernetics and modern neuroscience both converge on the same basic picture: living things are control systems. They maintain variables (temperature, energy balance, social rank) in viable ranges using feedback and action.

At the simplest level, a thermostat:

  • Measures room temperature
  • Compares it to a set point
  • Turns heat on or off to reduce the error

Golden Man is the thermostat’s great-great-grandchild: a multi-layer, embodied control system with thousands of feedback loops.
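
To make the loop concrete, here is a minimal Python sketch of the thermostat’s cycle: measure, compare, act. Nothing here comes from EToC or Wiener; the function names and the crude physics are invented for illustration.

```python
def thermostat_step(temperature: float, set_point: float) -> bool:
    """One feedback cycle: measure, compare to the set point, act to shrink the error."""
    error = set_point - temperature
    return error > 0  # heat stays on while the room is colder than the set point


def simulate(hours: int = 8, set_point: float = 20.0) -> None:
    temperature = 15.0
    for hour in range(hours):
        heater_on = thermostat_step(temperature, set_point)
        temperature += 1.5 if heater_on else -0.5  # crude physics: heating warms, night cools
        print(f"hour {hour}: {temperature:.1f} C, heater {'on' if heater_on else 'off'}")


simulate()
```

Stack thousands of these, let higher loops set the set points of lower ones (hunger adjusts foraging effort, which adjusts gait), and you have the architecture this section describes; what is still missing is any loop that takes the stack itself as its object.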

2.1 Predictive Processing: Brains as Error-Minimizers#

Karl Friston’s predictive coding / free-energy framework treats the brain as a hierarchical generative model: it predicts sensory inputs and updates itself to reduce prediction error.

  • Higher cortical levels encode hypotheses about the causes of sensory data.
  • Lower levels send prediction errors back up the hierarchy when reality doesn’t match those expectations.
  • Action is chosen partly by minimizing expected future prediction error.

In this view, an organism is a kind of self-correcting hallucination that keeps itself intact by continuously explaining its inputs.
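
As a toy illustration, here is a one-level predictive-coding sketch; real cortical hierarchies are far richer, and the variable names (belief, prediction_error, learning_rate) are mine rather than Friston’s. The point is only the shape of the loop: predict, compare, update to shrink future error.

```python
import random

def predictive_loop(true_cause: float = 7.0, steps: int = 40, learning_rate: float = 0.2) -> float:
    """Toy one-level predictive coding: keep a guess about a hidden cause,
    predict the noisy signal it should produce, and nudge the guess by the error."""
    belief = 0.0  # the model's current hypothesis about the cause
    for _ in range(steps):
        sensation = true_cause + random.gauss(0.0, 1.0)  # what the world delivers
        prediction_error = sensation - belief            # mismatch passed "up" the hierarchy
        belief += learning_rate * prediction_error       # update to reduce future error
    return belief

print(predictive_loop())  # settles near 7.0: the model has "explained" its inputs
```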

Golden Man absolutely fits here. He has:

  • Rich generative models of his environment and social world
  • Learned policies for hunting, fleeing, courting, cooperating
  • Emotional systems that encode prediction error at affective levels (“this is worse than expected,” “better than expected”)

What he doesn’t have yet is an explicit model of his own generative model: no “I am the one predicting,” no “my anger is a state I can examine and doubt.” The prediction engine runs, but without a meta-console.

2.2 Basal Cognition and the Cognitive Light Cone#

Michael Levin’s work on basal cognition and “cognitive light cones” generalizes this: any agent has a spatiotemporal region it can sense, remember, and control—a kind of “awareness radius.”

  • A bacterium has a tiny light cone: local chemicals, minutes-long memory.
  • An octopus or crow has a much larger one: objects, others, tools, possibly years.
  • A human embedded in a state-scale society has an enormous one: decades of memory, plans, institutions, myths.
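
One rough way to picture a light cone (the field names and magnitudes below are illustrative guesses, not Levin’s formalism) is as a record of how far an agent reaches in space, how far back its past can matter, and how far ahead its goals extend.

```python
from dataclasses import dataclass

@dataclass
class CognitiveLightCone:
    """How far an agent can sense, remember, and act toward goals."""
    sense_radius_m: float      # spatial reach of perception and action
    memory_horizon_s: float    # how far back its past can still matter
    goal_horizon_s: float      # how far ahead its goals extend

bacterium  = CognitiveLightCone(1e-4, 6e2, 6e1)   # micrometres, minutes
crow       = CognitiveLightCone(1e4,  3e7, 3e6)   # territories, seasons, cached food
golden_man = CognitiveLightCone(1e5,  2e9, 3e7)   # ranges, a lifetime of lore, next year's hunt
# Eve's step is not a still-bigger cone but a new axis entirely:
# her own beliefs and feelings enter the cone as things to sense and steer.
```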

Golden Man’s cognitive light cone is already huge compared to other animals:

  • He tracks game migrations and seasons.
  • He maintains social reputations and alliances.
  • He orients to gods, totems, and ancestors in a mythic geography.

But the cone is mostly world- and group-oriented, not self-oriented. Eve’s arrival is when that same cone turns inward and time-extends to include “my life,” “my sins,” “my future,” “my soul.”

2.3 Consciousness Without Cortex, and With It#

Björn Merker argues that a unified conscious scene can be generated by subcortical systems (e.g., the upper brainstem) even without a cerebral cortex.

  • The brainstem integrates sensory input into a world-centered “here and now” used for action control.
  • Cortex elaborates contents, but basic phenomenal presence and orienting can exist without it.

Combine Merker with predictive processing and basal cognition and you get a plausible ladder:

  1. Sensitive animal – world-centric scene, no self-concept, but definite “what-it’s-like-ness.”
  2. Golden Man – cortex-enriched world model, social and linguistic, but still no explicit self-as-object.
  3. Eve – adds a persistent self-model that can be talked about and thought about.

Golden Man sits in the middle: more than “just” an animal, less than a self-narrating subject.


3. Golden Man Through the Lens of Other Theories#

Let’s triangulate Golden Man using several big theories of mind.

3.1 Ginsburg & Jablonka: Unlimited Associative Learning#

In The Evolution of the Sensitive Soul, Simona Ginsburg and Eva Jablonka argue that Unlimited Associative Learning (UAL) marks the origin of conscious experience: the capacity to form open-ended associations across multiple modalities and time-scales, with flexible credit assignment.

UAL implies:

  • A unified arena where stimuli, actions, and outcomes can be related.
  • Some form of global workspace where learning signals are integrated.
  • Goal-directed behavior that can flexibly recombine learned components.
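
A toy sketch of the credit-assignment idea, under the assumption that a Rescorla-Wagner-style update is an acceptable stand-in for illustration (it is far simpler than Ginsburg & Jablonka’s actual UAL criteria): whichever cues were present, from whatever modality, share the credit for the surprise.

```python
def update_associations(V: dict, cues: set, outcome: float, rate: float = 0.3) -> dict:
    """Rescorla-Wagner-style learning: credit for the surprise is shared among
    whichever cues, from any modality, were present on this trial."""
    predicted = sum(V.get(c, 0.0) for c in cues)  # joint prediction from all present cues
    surprise = outcome - predicted                # the error to be assigned
    for c in cues:
        V[c] = V.get(c, 0.0) + rate * surprise    # each present cue absorbs part of the credit
    return V

V = {}
for _ in range(30):
    V = update_associations(V, {"rustle (sound)", "stripes (sight)"}, outcome=1.0)  # predator present
    V = update_associations(V, {"rustle (sound)"}, outcome=0.0)                     # rustle alone: harmless
print(V)  # the visual cue ends up carrying most of the predictive weight
```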

By their lights, all vertebrates (and some arthropods / cephalopods) have this. Golden Man certainly does. So Golden Man is:

  • Conscious in the “sensitive soul” sense (there is something it’s like to be him).
  • Not yet self-conscious in the EToC sense (he cannot yet take his own mind as an object of thought).

EToC effectively says: UAL gives you a soul; language and selection eventually give you a self.

3.2 Jaynes: Bicameral Voices as Externalized Control#

Julian Jaynes’s bicameral mind is a special case of Golden Man, set in an early-historic context: he argued that Bronze Age humans experienced auditory hallucinations—commands from gods—as their primary guidance system, not internal deliberation.

Whether or not you buy his Mesopotamian dating, two ideas are incredibly useful:

  • Guidance as voice – the control system’s high-level outputs are represented as “heard” commands.
  • No introspective workspace – decisions are experienced as coming from outside, not from a unified “I.”

Golden Man is a Jaynesian mind generalized and pushed deeper into prehistory:

  • The underlying architecture is still control-based and externalized.
  • The gods, ancestors, or “feelings” deliver policy, not an inner deliberator.
  • Language is used for coordination and storytelling, not for internal metacommentary on mental states.

Eve appears when language turns inward and the voice channel gets reinterpreted as my thoughts instead of the god’s orders.

3.3 Predictive Self vs. Golden Selfless Control#

In modern predictive-processing theories, the self can be modeled as a hierarchical prior: a compact way to encode regularities about one’s own body and behavior (“I tend to do X,” “this is my face,” “these are my memories”). Theorists such as Hohwy, Seth, and Friston treat the self as an inference the system makes about the source of its experiences.
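
As a toy version of that inference (the prior and likelihood numbers are invented, not drawn from any published model), the question “was this movement mine or the wind’s?” can be posed as a small Bayesian update over whether the sensory outcome matched the motor prediction.

```python
def prob_self_caused(matched_prediction: bool, prior_self: float = 0.7) -> float:
    """Bayes' rule for agency: did 'I' cause this movement, or did the world?"""
    # Illustrative likelihoods: self-generated movements usually match the motor
    # prediction; externally caused ones usually do not.
    like_self  = 0.9 if matched_prediction else 0.1
    like_world = 0.2 if matched_prediction else 0.8
    return (like_self * prior_self) / (like_self * prior_self + like_world * (1 - prior_self))

print(prob_self_caused(True))   # ~0.91: the outcome matched my prediction, so it was "mine"
print(prob_self_caused(False))  # ~0.23: surprising movement, more likely the wind than me
```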

Golden Man has:

  • Somatic priors (“this is my limb, not that branch”).
  • Agency priors (“these movements are mine, not the wind’s”).
  • Group and role priors (“I am hunter / elder / initiate”).

But he lacks:

  • A reflective self-prior that aggregates across time and modality as an object of thought (“I, the one who…”).
  • A metacognitive layer that estimates the reliability of his own beliefs (“I might be wrong about what I think”).

He is a beautifully tuned controller rather than a self-theorist. The priors are there, but not yet woven into a concept of self.

3.4 Basal Cognition: A Continuum, Not a Cliff#

Basal cognition work emphasizes that there isn’t a magical point where cognition “switches on.” Instead, there’s a continuum from cells to animals to humans, with bigger and richer cognitive light cones and more complex control architectures.

Golden Man is important because he shows that:

  • You can crank the dial very far—language, myth, UAL, complex social life—without yet acquiring Eve-style introspective subjectivity.
  • The step to selfhood is not from nothing → everything; it’s from rich control → reflective modeling of the controller.

Our last animal ancestor is not some dimly conscious ape; it’s cognitively adjacent to Golden Man. The real rupture is between Golden Man and Eve.


4. What It Felt Like to Be Golden Man (Hypothetically)#

We can’t interview Golden Man, but we can extrapolate from:

  • Comparative cognition (apes, corvids, cetaceans).
  • Ethnography of highly mythic, low-literacy societies.
  • Jaynesian readings of early texts.

Here’s a sketch of plausible phenomenology:

4.1 Time: Thick Present, Thin Autobiography#

Golden Man’s present is thick:

  • He remembers seasons, migrations, ritual calendars.
  • He anticipates hunts, storms, raids.

But his autobiographical horizon is thin:

  • He does not rehearse his life as a coherent story.
  • “Who am I?” is answered in terms of role, kin, totem, not an inner essence.

There’s a flow of action and memory, but no single narrator stitching it into “my life.”

4.2 Agency: Gods, Spirits, and Social Fields#

Decisions feel imposed rather than chosen by an inner decider:

  • A strong impulse is “the god wants this,” “the ancestor is angry,” “the omen is bad.”
  • Collective emotions (panic, rage, joy) are experienced as a field rather than “my mood.”

Think of modern crowd phenomena—stampedes, riots, mass ecstasy—then imagine that as the default interpretive frame, not an exception. Golden Man is exquisitely sensitive to social and mythic control signals, but doesn’t perceive a separate “me” issuing commands.

4.3 Emotions: Raw, Regulating, but Unexplained#

Affects are intense and behaviorally functional:

  • Fear drives avoidance; anger drives attack; shame drives withdrawal.
  • Learned norms channel these affects into ritual responses and status negotiations.

What’s missing is the Evesque move: “I am anxious, and I’m anxious because I am the kind of person who…” Golden Man feels everything, but doesn’t place his feelings into a theory of his own psyche.

4.4 Inner Speech: Social, Not Solitary#

Language is used:

  • Externally—for coordination, ritual, storytelling.
  • Quasi-externally—for heard voices of gods, ancestors, spirits.

What’s rare or absent is silent, reflective monologue:

“Okay, yesterday I did X; maybe I should do Y; what does that say about me?”

Inner speech, if present, is probably sparse and situation-triggered, not chronic. The default state is more like sensorimotor flow plus occasional punctuations of verbal guidance, not our constant mental podcast.


5. Our Last Animal Ancestor: Continuity and Difference#

So how does Golden Man relate to nonhuman animals?

5.1 Continuity: Conscious Control Systems Everywhere#

If Ginsburg & Jablonka are right, many vertebrates and some invertebrates already have Unlimited Associative Learning, implying a minimal conscious workspace.

Combine that with Merker’s brainstem-centered scene and you get:

  • A crow who plans, caches food, deceives rivals.
  • A chimp with social alliances, proto-norms, tool traditions.
  • An octopus exploring, solving problems, maybe even playing.

Golden Man sits on this continuum. He is not qualitatively different in “having experience”; the difference is in:

  • The scope of his cognitive light cone (larger).
  • The stability of cultural scaffolding (myth, ritual, tools).
  • The grammatical complexity of communication.

5.2 Difference: Language, Myth, and the Proto-Self#

Where Golden Man diverges from other animals is the symbolic layer:

  • Language lets him compress and transmit huge amounts of information.
  • Myth provides a shared ontology for gods, ancestors, and social rules.
  • Ritual stabilizes norms and expectations across generations.

This symbolic layer reshapes the control problem:

  • The environment now includes stories as part of what must be tracked and regulated.
  • Social feedback loops become more intricate (reputation, taboo, law).

But until Eve, that whole symbolic storm still orbits around roles, not selves. Golden Man can be “hunter of the Bear Clan under Sky-God X,” but not yet “I, a conflicted subject who thinks about his thoughts.”


6. From Control System to Self-Model#

What has to happen for Golden Man to become Eve?

Different theories would answer differently, but they rhyme:

  1. Language has to loop inward.
    Jaynes argues consciousness is a learned, linguistic skill: using metaphor and narrative to model one’s own mental life.
    EToC says: at some point, likely mediated by women’s social and reproductive roles, language that was used to talk about gods and others was turned on the speaker’s own mind.

  2. The predictive model has to include itself.
    A predictive-processing system can, in principle, build models of its own modeling (meta-priors about reliability, bias, habit). When this becomes sufficiently rich and stable, you get a narrative self: a construct that explains why this organism behaves as it does (a minimal sketch of a controller that models the controller appears at the end of this section).

  3. The cognitive light cone has to extend over mental states.
    In Levin’s terms, the “cognitive light cone” is not just about space and time but which variables are in play. When your own beliefs, desires, and character traits enter that cone as manipulable objects, you’ve crossed into Eve.

  4. Culture has to reward introspection.
    Once some lineages gain this self-model, selection can act on it:

    • Better deceit detection (reading your own motives, then others’).
    • Better long-term planning (imagining future selves).
    • Stronger norm internalization (guilt, shame, confession).

Golden Man is the substrate on which these upgrades operate. You don’t add Eve to a frog; you add Eve to an already sophisticated human control system.
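
Item 2 above can be made concrete with a minimal sketch, assuming that “the controller starts modeling itself” can be caricatured as one loop whose inputs are statistics about another loop. The class names are illustrative and not part of EToC.

```python
from collections import deque

class GoldenManLayer:
    """Acts to shrink error against a goal; keeps no record of itself as an object."""
    def __init__(self, set_point: float):
        self.set_point = set_point

    def act(self, reading: float) -> float:
        return 0.5 * (self.set_point - reading)  # push the world toward the goal


class EveLayer:
    """Wraps the controller below it and models it: tracks its recent errors and
    can report a belief about its own reliability ('I keep getting this wrong')."""
    def __init__(self, controller: GoldenManLayer, window: int = 10):
        self.controller = controller
        self.recent_errors = deque(maxlen=window)

    def act(self, reading: float) -> float:
        self.recent_errors.append(abs(self.controller.set_point - reading))
        return self.controller.act(reading)

    def self_report(self) -> str:
        avg = sum(self.recent_errors) / max(len(self.recent_errors), 1)
        return f"Lately I have been missing my goal by about {avg:.2f}."


eve = EveLayer(GoldenManLayer(set_point=20.0))
for reading in [15.0, 17.0, 18.5, 19.2]:
    eve.act(reading)
print(eve.self_report())  # the new trick: a report about the system's own performance
```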


7. Golden Man vs. Eve: A Comparison Table#

| Dimension | Golden Man (control system) | Eve (narrative self) |
| --- | --- | --- |
| World-model | Rich, world- and group-centered scene | Same, plus explicit model of “me-in-the-world” |
| Time horizon | Seasons, rituals, short life episodes | Whole-life autobiography, imagined futures & counterfactuals |
| Guidance | Externalized: gods, omens, group mood | Internalized: conscience, plans, “my values” |
| Inner speech | Sparse, social/ritual, often heard as other | Chronic, self-directed monologue/dialogue |
| Emotions | Raw drives + culturally patterned responses | Emotions about emotions (shame about anger, pride, etc.) |
| Norms | Enforced via ritual, taboo, external sanction | Internalized moral law, guilt, self-judgment |
| Self-concept | Role- and kin-based (“hunter of X, child of Y”) | Psychological and existential (“I am the kind of person who…”) |
| Failure modes | Possession, crowd madness, simple impulsivity | Depression, anxiety, psychosis, identity crises |

The point is not that Eve is “better.” It’s that Eve is different: she adds a new failure mode—being haunted by herself—on top of Golden Man’s ancient animal vulnerabilities.


8. Why This Matters Now#

Understanding Golden Man isn’t just a historical curio or mythic device. It clarifies:

  • What your brain is still doing.
    Most of your daily life—driving, typing, social small talk—is handled by Golden Man: fast, automatic control loops that don’t need the narrator.

  • Why introspection hurts.
    You’re running an animal control system that evolved for action and immediate feedback. Slapping a reflective, language-based self on top produces beautiful art and science—and constant risk of anxiety, rumination, and psychosis.

  • Why animals feel familiar.
    When you look at a chimp, dog, or crow and feel an intuitive kinship, you may be recognizing Golden Man’s domain: a world of goals, feelings, and learning without the crushing weight of a self-narrative.

  • What EToC is really claiming.
    Not “humans used to be mindless zombies,” but “the kind of mind we now inhabit—recursive, self-obsessed, haunted by its own models—is a recent, fragile, selectable overlay on a very old control system.”

Golden Man is your ancestor, but also your basement. Eve lives upstairs, rearranging furniture and writing memoirs, but the old machinery is still humming below, regulating blood pressure, posture, politics, and panic.


FAQ#

Q1. Was Golden Man conscious or just a sophisticated zombie? A. Under frameworks like Ginsburg & Jablonka’s UAL and Merker’s brainstem consciousness, Golden Man almost certainly had genuine experience—sensations, feelings, unified scenes—but lacked a reflective, conceptual “I” that could become an object of thought.

Q2. How different was Golden Man from other animals? A. Qualitatively continuous but quantitatively extreme: same basic control architecture as other vertebrates, but with a much larger cognitive light cone and richer symbolic culture, yet still without Eve-style self-modeling.

Q3. Does this mean introspective consciousness is “unnatural”? A. Not unnatural—just late and brittle. It’s a specialized add-on that evolution discovered relatively recently, riding atop an ancient control system that still runs most of the show. EToC’s claim is that this add-on has its own selection history and pathologies.

Q4. Could some modern people be closer to Golden Man than others? A. Almost certainly in specific domains: variation in inner speech, mental imagery, and metacognition is huge. But even the “least introspective” modern humans live in cultures saturated with Eve-style self-concepts, so we’re all downstream of the same transition.


Sources#

  1. Jaynes, Julian. The Origin of Consciousness in the Breakdown of the Bicameral Mind. Houghton Mifflin, 1976.
  2. Friston, Karl. “A theory of cortical responses.” Philosophical Transactions of the Royal Society B 360 (2005): 815–836.
  3. Merker, Björn. “Consciousness without a cerebral cortex: A challenge for neuroscience and medicine.” Behavioral and Brain Sciences 30 (2007): 63–134.
  4. Ginsburg, Simona & Eva Jablonka. The Evolution of the Sensitive Soul: Learning and the Origins of Consciousness. MIT Press, 2019.
  5. Levin, Michael. “The Computational Boundary of a ‘Self’: Developmental Bioelectricity Drives Multicellularity and Scale-Free Cognition.” Frontiers in Psychology 10 (2019): 2688. https://www.frontiersin.org/articles/10.3389/fpsyg.2019.02688/full
  6. Levin, Michael. “Bioelectric networks: the cognitive glue enabling complex morphogenesis.” Animal Cognition (2023).
  7. Millidge, Beren et al. “Predictive coding: a theoretical and experimental review.” arXiv:2107.12979 (2021).