White Paper · 02 · November 2025

Temporal Memory

Why AI Systems Need to Forget

Nick Brandt & Leo Gestetner · AI Architecture · 14 min read

Abstract

Current AI systems treat all information as equally important forever. Your offhand mention of a restaurant three years ago has the same weight as your wedding anniversary. This paper presents a temporal memory architecture inspired by human cognition: facts have half-lives and decay naturally unless reinforced, while time-sensitive information stays relevant until an anchor point, then fades rapidly. The result is AI that feels more human and handles contradictions gracefully.

1. The Problem with Perfect Recall

Current AI systems treat all information as equally important forever. Your offhand mention of a restaurant three years ago has the same weight as your wedding anniversary.

This isn't how human memory works. And it shouldn't be how AI memory works.

Human memory has:

  • Decay — Unimportant details fade naturally
  • Anchoring — Some facts stay vivid until an event passes, then fade fast
  • Reinforcement — Repeated information strengthens
  • Importance weighting — Some facts matter more than others
  • Temporal context — When you learned something affects relevance

AI systems have: a vector database that stores everything forever.

2. The Decay Model

Inspired by human memory research, facts should have a half-life:

relevance = importance × (0.5^(days / half_life)) × reinforcement_multiplier

Decay Type | Half-Life | Use Case
Permanent  | —         | Core identity, key relationships
Slow       | 365 days  | Important professional knowledge
Medium     | 90 days   | General information
Fast       | 14 days   | Recent context, current projects
Ephemeral  | 3 days    | Momentary details, casual mentions

A fact starts at full relevance and decays over time unless reinforced.
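As a minimal sketch of the formula above (the function and table names are mine, and I additionally cap relevance at the initial importance so reinforcement cannot inflate a fact beyond its starting weight):

```python
# Decay-type half-lives in days; None marks permanent facts that never decay.
HALF_LIVES = {
    "permanent": None,
    "slow": 365,
    "medium": 90,
    "fast": 14,
    "ephemeral": 3,
}

def relevance(importance: float, days: float, decay_type: str,
              reinforcement_multiplier: float = 1.0) -> float:
    """relevance = importance × 0.5^(days / half_life) × reinforcement_multiplier."""
    half_life = HALF_LIVES[decay_type]
    if half_life is None:  # permanent facts keep their full importance
        return importance
    decayed = importance * 0.5 ** (days / half_life)
    return min(importance, decayed * reinforcement_multiplier)

relevance(1.0, 90, "medium")  # → 0.5: a "medium" fact halves in 90 days
```

A reinforced fact simply decays more slowly: with a multiplier of 2.0, `relevance(1.0, 90, "medium", 2.0)` is still 1.0 at the end of the first half-life.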

3. Anchored Decay

Not all facts decay gradually. Some have an anchor point — a date or milestone where relevance changes dramatically.

Fact                        | Anchor Point    | Behavior
"Concert on December 1st"   | Event date      | High relevance until Dec 1, drops sharply after
"Broke my arm, cast on"     | 6-week recovery | Critical while the cast is on, fades quickly once healed
"Project deadline March 15" | Deadline        | Increasingly relevant approaching the date, irrelevant after
"Taking antibiotics"        | 10-day course   | High during treatment, drops when the course ends

The Key Insight

These facts don't decay from day one. They maintain relevance until the anchor, then cliff-drop. A broken arm is critical while the cast is on — it affects what you can do, what help you need, every plan you make. Six weeks later when the cast comes off, it decays with a 7-day half-life.
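The plateau-then-cliff behavior can be sketched as follows (names and the post-anchor half-life parameter are mine; this models only the hold-then-drop case, not the pre-deadline ramp mentioned in the table):

```python
from datetime import date

def anchored_relevance(importance: float, today: date, anchor: date,
                       post_anchor_half_life: float = 7.0) -> float:
    """Hold full relevance until the anchor date, then decay exponentially."""
    days_past = (today - anchor).days
    if days_past <= 0:  # on or before the anchor: fully relevant
        return importance
    return importance * 0.5 ** (days_past / post_anchor_half_life)

# The broken-arm example: critical until the cast comes off, 7-day half-life after.
cast_off = date(2025, 11, 1)
anchored_relevance(1.0, date(2025, 10, 20), cast_off)  # → 1.0 (cast still on)
anchored_relevance(1.0, date(2025, 11, 8), cast_off)   # → 0.5 (one week later)
```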

4. The Reinforcement Mechanism

Decay alone would eventually forget everything. Reinforcement counterbalances it:

When a fact is mentioned again:

reinforcement_multiplier = base + (mention_count × 0.1), capped at 3.0

Mentions | Multiplier | Effect
1        | 1.0        | Normal decay
5        | 1.5        | 50% slower decay
10       | 2.0        | Twice as persistent
20+      | 3.0        | Effectively permanent

Things you talk about repeatedly become core knowledge. Things mentioned once fade away.
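One reading of the table above, sketched as code: base = 1.0, a 0.1 step per repeat mention, and a cap of 3.0. Treating the first mention as the unreinforced baseline is my assumption, and it lands each table row within one step:

```python
def reinforcement_multiplier(mention_count: int, base: float = 1.0,
                             step: float = 0.1, cap: float = 3.0) -> float:
    """Multiplier that slows decay as a fact is re-mentioned.

    The first mention is the unreinforced baseline; each repeat adds one
    step, capped so heavily repeated facts become effectively permanent.
    """
    repeats = max(0, mention_count - 1)  # the first mention does not reinforce
    return min(cap, base + step * repeats)

reinforcement_multiplier(1)   # → 1.0 (normal decay)
reinforcement_multiplier(30)  # → 3.0 (capped, effectively permanent)
```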

5. Importance Classification

Not all facts are created equal. At capture time, classify importance:

Level    | Description                     | Initial Half-Life
Critical | Life events, core relationships | Permanent
High     | Career, health, major decisions | Slow (365 days)
Medium   | General knowledge, preferences  | Medium (90 days)
Low      | Context, casual mentions        | Fast (14 days)
Trivial  | Background noise                | Ephemeral (3 days)

The AI learns that "I got married to Sarah" is Critical, while "I had coffee this morning" is Trivial.
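How the classification itself happens is out of scope here; the keyword matcher below is a hypothetical stand-in for whatever model performs it in a real system. What matters is the mapping from level to initial half-life:

```python
# Importance levels mapped to initial half-lives in days; None = permanent.
IMPORTANCE_HALF_LIVES = {
    "critical": None,  # life events, core relationships
    "high": 365,       # career, health, major decisions
    "medium": 90,      # general knowledge, preferences
    "low": 14,         # context, casual mentions
    "trivial": 3,      # background noise
}

def classify(fact: str) -> str:
    """Toy keyword classifier; a real system would use a trained model or LLM."""
    text = fact.lower()
    if any(k in text for k in ("married", "my wife", "my husband", "was born")):
        return "critical"
    if any(k in text for k in ("new job", "diagnosed", "promotion")):
        return "high"
    if any(k in text for k in ("coffee", "this morning")):
        return "trivial"
    return "medium"

classify("I got married to Sarah")     # → "critical"
classify("I had coffee this morning")  # → "trivial"
```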

6. Why This Matters

Without decay: an AI that knows you liked sushi in 2019 (100% relevance) and knows you're currently vegetarian (100% relevance) confidently recommends sushi restaurants.

With decay: "You mentioned liking sushi in 2019" (3% relevance) gives way to "You're currently vegetarian" (100% relevance). The AI understands your current preferences.

7. Conflict Resolution

Temporal memory elegantly handles contradictions:

Old fact: "John works at Google" (2 years old, decayed to 15%)

New fact: "John works at Microsoft" (just captured, 100%)

Without decay, you have a contradiction to resolve. With decay, the old fact naturally gives way to the new one. No explicit conflict resolution needed.
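The resolution falls out of arithmetic. Storing both facts at high importance with a slow (365-day) half-life, a sketch (the section's 15% figure implies a slightly shorter half-life; the numbers here are illustrative):

```python
def decayed(importance: float, days_old: float, half_life: float) -> float:
    """Current relevance of a fact after exponential decay."""
    return importance * 0.5 ** (days_old / half_life)

# Two contradictory facts about John's employer, both "slow" (365-day) facts.
google = decayed(1.0, 730, 365)    # 2 years old → 0.25
microsoft = decayed(1.0, 0, 365)   # just captured → 1.0

# The fresher fact wins with no explicit conflict-resolution logic.
current_employer = "Microsoft" if microsoft > google else "Google"
```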

8. Privacy by Forgetting

Temporal memory is also a privacy feature:

  • Natural data lifecycle — Sensitive information naturally decays over time
  • Casual disclosures protected — One-time mentions don't persist forever
  • No manual deletion — Users don't need to actively manage old data
  • Matches expectations — The system mirrors human assumptions about memory

"I told you that once, years ago" shouldn't be perfect recall.

9. The Philosophical Argument

Perfect memory isn't a feature — it's a bug.

Human memory evolved to forget because:

  • Storage is finite
  • Relevance changes over time
  • Old information can mislead
  • Forgetting enables growth

AI assistants that remember everything create an uncanny, uncomfortable experience. They know things you've forgotten you ever said.

Temporal memory creates AI that feels more human.

10. Conclusion

Temporal memory represents a fundamental shift in how AI systems handle knowledge over time. Instead of treating memory as a static database, it becomes a living system that naturally prioritizes recent and important information while letting the irrelevant fade away.

The result is AI that better understands context, handles contradictions gracefully, and respects the natural lifecycle of information.
