why/drift.mdx

Drift

Epistemic drift is a structural property, and it compounds with every actor you add.

Epistemic Drift Is Structural

Epistemic drift is a fundamental property of reasoning systems operating without explicit state.

Recent research formalizes this: identical inputs can yield incompatible internal representations and outputs, even under controlled conditions. This is ontological divergence. Models do not merely vary in expression. They vary in interpretation.

Three interacting properties make drift inevitable:

┌──────────────────────────────────────────────────────────────┐
│                 WHY DRIFT IS STRUCTURAL                      │
│                                                              │
│   Probabilistic        Identical inputs, different outputs.  │
│   Generation           This enables creativity, but means    │
│        │               no stable underlying belief state.    │
│        ▼                                                     │
│   Unstable Task        Models don't just vary in wording.    │
│   Representation       They infer different objects, apply   │
│        │               different thresholds, construct       │
│        ▼               different causal models.              │
│   No Persistent        Each interaction reconstructs from    │
│   Epistemic State      scratch. No model of what is          │
│                        believed, why, or what contradicts.   │
│                                                              │
│   These three interact: drift compounds over time,           │
│   across agents, and across context.                         │
└──────────────────────────────────────────────────────────────┘
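The third property, the absence of persistent epistemic state, is the easiest to make concrete. A minimal sketch (illustrative data structures only, not the Belief SDK's API; the `Belief` and `BeliefState` names and the toy conflict rule are invented here) shows what changes once claims are recorded explicitly: a new assertion can be checked against prior commitments instead of silently replacing them.

```python
# Illustrative sketch only (not the Belief SDK API): once claims are recorded
# as explicit state, a new assertion surfaces the prior beliefs it conflicts
# with instead of overwriting them.
from dataclasses import dataclass, field

@dataclass
class Belief:
    claim: str
    justification: str
    confidence: float

@dataclass
class BeliefState:
    beliefs: list[Belief] = field(default_factory=list)

    def assert_claim(self, claim: str, justification: str,
                     confidence: float) -> list[Belief]:
        """Record a claim; return any prior beliefs it conflicts with."""
        # Toy conflict rule: same topic word, different claim text.
        conflicts = [b for b in self.beliefs
                     if b.claim.split()[0] == claim.split()[0]
                     and b.claim != claim]
        self.beliefs.append(Belief(claim, justification, confidence))
        return conflicts

state = BeliefState()
state.assert_claim("market is growing", "Q1 report", 0.8)
conflicts = state.assert_claim("market is shrinking", "analyst call", 0.6)
print([b.claim for b in conflicts])  # the earlier claim surfaces, not vanishes
```

A stateless agent performs the equivalent of constructing a fresh `BeliefState` on every turn, which is why the conflict list is always empty and contradictions go unnoticed.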

Drift Compounds with Every Actor

This is a coordination problem that intensifies with every participant, human or machine.

Single agent, many turns

A single agent over 20 turns accumulates contradictions, carries stale assumptions, and loses track of what it believes. This is the simplest case.

Multiple agents, one goal

A research agent and an analyst agent work on the same question. Each produces plausible but incompatible outputs. Without shared epistemic state, their perspectives exist in silos, or worse, get concatenated into a report that contradicts itself across sections.

Humans and agents together

A product team has three humans and four agents collaborating over weeks on a market strategy. Each human brings different priors. Each agent processes different data. Decisions are made in meetings, Slack threads, and documents. Assumptions shift without anyone tracking what changed or why.

The drift is organizational. The team's shared understanding fragments because there is no shared epistemic substrate.

Swarms working on complex goals

Now scale this further. A swarm of agents working on deep science: drug discovery, climate modeling, materials research. Dozens of agents, each with partial views, each contributing hypotheses, each updating independently. How do they maintain coherence?

┌──────────────────────────────────────────────────────────────┐
│          DRIFT COMPOUNDS WITH ACTORS                         │
│                                                              │
│  1 agent, 20 turns:                                          │
│    drift ≈ stale assumptions + missed contradictions         │
│                                                              │
│  3 agents, shared goal:                                      │
│    drift ≈ above + conflicting interpretations               │
│           + no reconciliation mechanism                      │
│                                                              │
│  Humans + agents, weeks of work:                             │
│    drift ≈ above + shifting priors + decisions made          │
│           in channels no agent can see + organizational      │
│           fragmentation                                      │
│                                                              │
│  Agent swarm, complex science:                               │
│    drift ≈ above + combinatorial explosion of belief         │
│           interactions + no single participant holds         │
│           the full picture + competing hypotheses            │
│           across subdomains                                  │
│                                                              │
│  At every level, the problem is the same:                    │
│  more actors × more time × no shared state = more drift      │
└──────────────────────────────────────────────────────────────┘

Without belief infrastructure, each new actor multiplies the surface area for divergence. A swarm of 50 agents does not produce 50 independent perspectives; it produces 50 partially overlapping, partially contradictory views with no mechanism to reconcile them.
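The "multiplies the surface area" claim can be made concrete with back-of-envelope arithmetic: without a shared substrate, reconciliation is pairwise, so the number of view-pairs to reconcile grows quadratically with the number of actors.

```python
# Back-of-envelope: with no shared belief state, reconciliation is pairwise,
# so the divergence surface grows quadratically with the number of actors.
from math import comb

for n in (1, 3, 7, 50):
    print(f"{n:>2} actors -> {comb(n, 2):>4} pairwise views to reconcile")
# 50 actors already means 1225 pairwise comparisons; a shared substrate
# replaces all of them with 50 contributions to one belief state.
```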

Why Model-Level Solutions Are Not Enough

Prompting, temperature control, retrieval augmentation, and fine-tuning primarily operate on output behavior. They do not address the underlying issue: the absence of a stable, explicit, and evolving epistemic state.

Even as models improve:

  • Multi-agent systems will still diverge (spatial drift)
  • Long-horizon workflows will still accumulate hidden assumptions (temporal drift)
  • Human-agent teams will still fragment (organizational drift)
  • Swarms will still produce incompatible views (combinatorial drift)

Epistemic drift is a persistent property of how these systems operate.

Drift as Signal

Drift is often framed as a limitation. It is also an infrastructure opportunity.

When agents drift, they reveal something useful: ambiguity in the problem, gaps in knowledge, and competing plausible interpretations. Instead of suppressing drift, belief systems can capture these divergences, track competing hypotheses, and resolve them through evidence.

┌──────────────────────────────────────────────────────────────┐
│              DRIFT → SIGNAL → RESOLUTION                     │
│                                                              │
│   Agent A says $4.2B  ──┐                                    │
│                         ├──→ Contradiction detected          │
│   Agent B says $3.8B  ──┘    │                               │
│                              ▼                               │
│                         Investigation triggered              │
│                              │                               │
│                              ▼                               │
│                         Gartner includes adjacent segments   │
│                         Pure market is ~$3.1B                │
│                              │                               │
│                              ▼                               │
│                         Both claims updated with             │
│                         narrower scope + provenance          │
│                                                              │
│   The drift was not a failure.                               │
│   It was a signal that the question was underspecified.      │
└──────────────────────────────────────────────────────────────┘

A swarm of agents with differing opinions is precisely the condition under which belief infrastructure is most valuable. Each agent contributes partial evidence. The fusion engine reconciles. Contradictions drive investigation. The system converges through evidence, not consensus.
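The detection step in the diagram above can be sketched in a few lines. This is a hypothetical illustration, not the Belief SDK: the `Claim` structure, the relative-tolerance rule, and the resolved figures are invented for the example, and the flag means "investigate", never "average silently".

```python
# Minimal sketch of contradiction-as-signal (hypothetical structures, not the
# Belief SDK): two numeric claims about the same quantity disagree beyond a
# tolerance, which triggers investigation rather than silent averaging.
from dataclasses import dataclass

@dataclass
class Claim:
    source: str
    quantity: str
    value: float      # e.g. market size in $B
    scope: str        # what the number actually covers

def detect_contradiction(a: Claim, b: Claim, tolerance: float = 0.05) -> bool:
    """Flag two claims about the same quantity whose values differ by more
    than `tolerance` (relative). A flag means "investigate", not "discard"."""
    if a.quantity != b.quantity:
        return False
    return abs(a.value - b.value) / max(a.value, b.value) > tolerance

a = Claim("agent_A", "market_size", 4.2, "Gartner, incl. adjacent segments")
b = Claim("agent_B", "market_size", 3.8, "unspecified")

if detect_contradiction(a, b):
    # Investigation narrows the scope; both claims get updated provenance.
    resolved = Claim("analyst", "market_size", 3.1, "pure market only")
    print(f"contradiction -> resolved at ${resolved.value}B ({resolved.scope})")
```

Note that the resolution attaches scope and provenance to the number; the disagreement is preserved as the reason the narrower figure exists.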

The Solution Is Not Stability

The solution to drift is not forcing every actor into lockstep agreement. It is building systems that remain coherent despite disagreement.

Belief state infrastructure provides that layer:

  • Shared epistemic substrate. Every actor (human or agent) contributes to the same structured belief state.
  • Trust-weighted fusion. The system weights contributions proportionally to source reliability.
  • Contradiction as state. Disagreements are tracked and surfaced, not hidden or silently averaged.
  • Temporal decay. Old evidence loses weight, creating pressure to refresh.
  • The ledger. Every transition is recorded, so when drift happens, you can trace exactly when and why.
  • Information gain. The system can identify which gaps, if filled, would reduce the most uncertainty, directing attention toward what matters most.
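The mechanics behind two of these bullets, trust-weighted fusion and temporal decay, can be sketched in a few lines. The exponential half-life form and the specific numbers are assumptions for illustration; the SDK's actual fusion rule is not shown here.

```python
# Sketch of trust-weighted fusion with temporal decay (illustrative math,
# not the SDK's actual fusion rule): each contribution is weighted by
# source trust, discounted exponentially by evidence age.

def fuse(contributions, half_life_days=30.0):
    """contributions: list of (value, source_trust, age_days) tuples."""
    decay = lambda age: 0.5 ** (age / half_life_days)  # half-life decay
    weights = [trust * decay(age) for _, trust, age in contributions]
    total = sum(weights)
    return sum(w * v for w, (v, _, _) in zip(weights, contributions)) / total

# Fresh, trusted evidence dominates; stale evidence still contributes, weakly.
estimate = fuse([
    (3.1, 0.9, 2),    # recent analyst figure, high trust
    (4.2, 0.6, 120),  # four-month-old figure, medium trust
])
print(round(estimate, 2))  # lands near 3.1, not the midpoint
```

The decay term is what creates the pressure to refresh: the four-month-old figure has lost 15/16 of its weight, so the fused estimate tracks recent evidence unless new data arrives to revive the older view.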

Learn more

  • Example: Watch beliefs evolve across 5 turns.
  • Science: How swarms maintain coherence in research.