# Architecture Overview
## The problem with static memory

Most AI memory systems work like a filing cabinet: store a document, retrieve it later by similarity. This approach has two fundamental problems:
- No forgetting. Every memory has equal weight regardless of age. A user’s address from three years ago competes equally with their current address.
- No context. Memories exist in isolation. There’s no way for retrieving one fact to activate related facts.
Human memory solves both problems. It decays over time (forcing prioritization), strengthens with use (surfacing important information), and forms associative networks (enabling multi-hop reasoning).
## Cognitive Memory’s architecture

The system implements six biologically inspired mechanisms:
### 1. Ebbinghaus decay with floors

Every memory has a retention score R(m) that decays exponentially over time. The decay rate depends on the memory’s type, stability, and importance. Core memories have a retention floor of 0.60 — they dim but never vanish.
See Decay model and Decay floors.
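The decay-with-floor behavior can be sketched as follows. This is a minimal illustration, not the system’s actual implementation: the function name and signature are hypothetical, and the real decay rate also folds in type, stability, and importance.

```python
import math

def retention(days_elapsed: float, base_decay_days: float, floor: float = 0.0) -> float:
    """Ebbinghaus-style exponential retention with a lower floor.

    R = max(floor, exp(-t / S)), where S is the base decay constant in
    days. Hypothetical sketch of the behavior described above.
    """
    r = math.exp(-days_elapsed / base_decay_days)
    return max(floor, r)

# A core memory (floor 0.60, base 120 days) dims but never vanishes:
print(round(retention(30, 120, floor=0.60), 3))    # ~0.779 after a month
print(round(retention(1000, 120, floor=0.60), 3))  # clamped to 0.600
```

The `max()` is the entire floor mechanism: decay proceeds normally until it would cross the floor, after which the memory simply holds steady.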
### 2. Memory categorization

Memories are classified into four categories, each with different decay characteristics:
| Category | Base decay (days) | Floor | Description |
|---|---|---|---|
| Episodic | 45 | 0.02 | Events with time/place context |
| Semantic | 120 | 0.02 | Facts, preferences, relationships |
| Procedural | infinite | — | Skills, routines (never decay) |
| Core | 120 | 0.60 | Identity-defining information |
See Memory types.
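The table above can be encoded directly as configuration. This is a hypothetical encoding (the dict name and structure are illustrative); note that `math.inf` as the decay constant makes `exp(-t / inf) == 1.0`, which is exactly the "never decays" behavior of procedural memories.

```python
import math

# Hypothetical encoding of the category table: (base_decay_days, floor).
CATEGORY_PARAMS = {
    "episodic":   (45.0, 0.02),
    "semantic":   (120.0, 0.02),
    "procedural": (math.inf, 0.0),  # exp(-t/inf) == 1.0: never decays
    "core":       (120.0, 0.60),
}

def retention_for(category: str, days_elapsed: float) -> float:
    base, floor = CATEGORY_PARAMS[category]
    return max(floor, math.exp(-days_elapsed / base))

print(retention_for("procedural", 10_000))          # always 1.0
print(round(retention_for("episodic", 45), 3))      # e^-1 ≈ 0.368 after one base period
print(retention_for("core", 10_000))                # clamped to the 0.60 floor
```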
### 3. Retrieval-weighted scoring

Search results are scored by `sim(m, q) * R(m)^0.3` — cosine similarity modulated by retention. The exponent 0.3 softens the decay penalty so faded memories can still surface when highly relevant.
See Retrieval scoring.
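The scoring formula is simple enough to show directly. A quick sketch (function name is illustrative; the formula and exponent are from the text):

```python
def score(similarity: float, retention: float, alpha: float = 0.3) -> float:
    """sim(m, q) * R(m)^alpha; alpha = 0.3 softens the decay penalty."""
    return similarity * retention ** alpha

# A highly relevant but faded memory can outrank a fresh, weakly relevant one:
faded = score(0.90, 0.10)  # 0.90 * 0.10^0.3 ≈ 0.451
fresh = score(0.40, 0.95)  # 0.40 * 0.95^0.3 ≈ 0.394
print(faded > fresh)       # True
```

With `alpha = 1.0` the faded memory would score only `0.90 * 0.10 = 0.09` and lose badly; the sub-linear exponent is what keeps old-but-relevant material reachable.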
### 4. Two-tier retrieval boosting

When a memory is retrieved, its stability increases. Direct matches get a larger boost (0.1) than associatively-activated memories (0.03). Both are modulated by a spaced repetition factor: longer gaps between retrievals produce bigger boosts.
See Retrieval boosting.
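One way the two tiers and the spacing modulation could combine, as a sketch. The 0.1 and 0.03 boost sizes come from the text; the saturating spaced-repetition factor and its time constant `tau` are assumptions of this example, not the documented formula.

```python
import math

def boost_stability(stability: float, days_since_last: float,
                    direct: bool, tau: float = 30.0) -> float:
    """Apply a retrieval boost to a memory's stability.

    Tier sizes (0.1 direct, 0.03 associative) are from the doc; the
    spacing factor 1 - exp(-gap/tau) and tau=30 are illustrative
    assumptions that make longer gaps yield bigger boosts.
    """
    base = 0.10 if direct else 0.03
    spacing = 1.0 - math.exp(-days_since_last / tau)  # grows from 0 toward 1
    return stability + base * spacing

# A retrieval after a long gap strengthens more than an immediate repeat:
print(boost_stability(0.5, 90, direct=True) > boost_stability(0.5, 1, direct=True))   # True
print(boost_stability(0.5, 30, direct=True) > boost_stability(0.5, 30, direct=False)) # True
```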
### 5. Associative memory graph

Memories form bidirectional links through two mechanisms:
- Synaptic tagging at ingestion: memories encoded in the same session with semantic overlap are linked.
- Co-retrieval strengthening: memories retrieved together have their link weights increased.
Links decay exponentially with time constant 90 days.
See Associations.
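A minimal sketch of such a link store. The 90-day decay time constant is from the text; the class shape, weight values, and the co-retrieval increment (0.1 here) are illustrative assumptions. Using a `frozenset` of the two memory IDs as the key makes every link bidirectional by construction.

```python
import math
from collections import defaultdict

class AssociativeGraph:
    """Illustrative bidirectional link store with exponential link decay."""

    TAU_DAYS = 90.0  # link decay time constant, from the doc

    def __init__(self):
        self.weights = defaultdict(float)  # frozenset({a, b}) -> weight
        self.updated = {}                  # last-update day per link

    def _decayed(self, key, now_day):
        age = now_day - self.updated.get(key, now_day)
        return self.weights[key] * math.exp(-age / self.TAU_DAYS)

    def strengthen(self, a, b, now_day, delta=0.1):
        """Co-retrieval (or ingestion-time tagging) bumps the link weight."""
        key = frozenset((a, b))
        self.weights[key] = self._decayed(key, now_day) + delta
        self.updated[key] = now_day

    def weight(self, a, b, now_day):
        return self._decayed(frozenset((a, b)), now_day)

g = AssociativeGraph()
g.strengthen("m1", "m2", now_day=0)
g.strengthen("m1", "m2", now_day=0)  # retrieved together again: link strengthens
print(round(g.weight("m1", "m2", now_day=90), 3))  # 0.2 * e^-1 ≈ 0.074
```

Decay is applied lazily on read rather than by a background sweep, which keeps the sketch simple; a real store could do either.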
### 6. Tiered storage lifecycle

Memories move through three tiers:
- Hot: full embeddings, vector-searchable
- Cold: ID-accessible only, no vector search
- Stub: archived summary, minimal footprint
See Tiered storage.
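If tier placement were driven by retention, the lifecycle could look like the sketch below. The tier names are from the text; the thresholds are entirely made up for illustration — the doc does not state the actual migration criteria.

```python
# Hypothetical retention cutoffs; the doc names the tiers but not the rules.
HOT_TO_COLD_R = 0.15
COLD_TO_STUB_R = 0.05

def tier_for(retention: float) -> str:
    """Map a retention score to a storage tier (illustrative thresholds)."""
    if retention >= HOT_TO_COLD_R:
        return "hot"    # full embeddings, vector-searchable
    if retention >= COLD_TO_STUB_R:
        return "cold"   # ID-accessible only, no vector search
    return "stub"       # archived summary, minimal footprint

print(tier_for(0.80))  # hot
print(tier_for(0.10))  # cold
print(tier_for(0.01))  # stub
```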
## How it fits together

The full pipeline from ingestion to retrieval:

```text
Conversation text
        |
        v
LLM Extraction (narrator prompt)
        |
        v
Memory objects with category + importance
        |
        v
Embedding + conflict detection
        |
        v
Synaptic tagging (ingestion-time associations)
        |
        v
Storage in hot tier
        |
        |--- Search query arrives --->
                      |
                      v
        Embedding similarity search (hot store)
                      |
                      v
        R^alpha temporal scoring
                      |
                      v
        Associative expansion (linked memories)
                      |
                      v
        Direct + associative boosting
                      |
                      v
        Core promotion check
                      |
                      v
        Sorted results returned
```

Periodic maintenance (`tick()`) handles cold migration, TTL expiry, and consolidation in the background.
See Scoring pipeline for the complete retrieval flow, and Ingestion for the extraction pipeline.