Releases: ariannamethod/leo
Leo 2.3.0.1 — Resonance Tensor + CoA
Two Opus research discoveries implemented:
- Resonance Tensor (Ψ): cross-signal geometric mean agreement, 21 learnable params (7×7 coherence matrix), Hebbian learning from novel generations
- CoA (Coactivation Attention): D.N.A. coactivation pairs as attention via inverted index + destiny gate
- Subword Bridge: BPE D.N.A. lifted to word-level bigrams (5075 connections)
Dario Equation now 9 signals: p(x|Φ) = softmax((B + α·H + β·F + γ·A + sw·S + r·R + T + λ·Ψ + c·C) / τ)
Also:
- Fix goroutine shutdown race (SIGSEGV on Close)
- Rolling release philosophy adopted
- 69 tests green
Leo 2.3 — Positional Hebbian Profile
Leo 2.3 — The Dario Equation learns to attend
The H signal (Hebbian Resonance) in the Dario Equation is now self-modifying. 36 learnable parameters (32 distance weights + 4 token class modifiers) replace the fixed 0.9^d decay. The organism discovers which distances and word types matter — through conversation, Hebbian reinforcement, zero backpropagation.
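A minimal sketch of the idea in C: the array names match the release, but the reinforcement rule, learning rate, and clamping are assumptions.

```c
#include <assert.h>

#define MAX_DIST 32
#define NCLASS 4   /* function / content / punctuation / rare */

typedef struct {
    double dist_profile[MAX_DIST];  /* learned distance weights, replaces fixed 0.9^d */
    double class_mod[NCLASS];       /* per-token-class modifiers */
} HebbProfile;

void hebb_init(HebbProfile *h) {
    for (int d = 0; d < MAX_DIST; d++) {
        double w = 1.0;
        for (int i = 0; i < d; i++) w *= 0.9;   /* start from the old fixed decay */
        h->dist_profile[d] = w;
    }
    for (int c = 0; c < NCLASS; c++) h->class_mod[c] = 1.0;
}

/* Hebbian reinforcement on an observed co-occurrence at distance d for
 * token class c. No backpropagation: weights move toward what fires together. */
void hebb_reinforce(HebbProfile *h, int d, int c, double lr) {
    if (d < 0 || d >= MAX_DIST || c < 0 || c >= NCLASS) return;
    h->dist_profile[d] += lr * (1.0 - h->dist_profile[d]);
    h->class_mod[c]    += lr * 0.5;
    if (h->class_mod[c] > 2.0) h->class_mod[c] = 2.0;
}

double hebb_weight(const HebbProfile *h, int d, int c) {
    if (d >= MAX_DIST) d = MAX_DIST - 1;
    return h->dist_profile[d] * h->class_mod[c];
}
```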
What's new
- Positional Hebbian Profile — `dist_profile[32]` + `class_mod[4]` adapt through co-occurrence reinforcement
- Token classification by IDF — function / content / punctuation / rare
- D.N.A. dual tokenizer matching — ancestor tokens now matched via both word-level `tok_find()` and subword BPE `sw_add_token()`. Gravity, co-activation, and bigrams seeded through both paths.
- 60 Go tests — 5 new tests for voices, subword, sea, vocab. All green.
- LEO_VERSION synced to 2.3.0
- arxiv paper updated (docs/leo_paper.md)
- neoleo.c regenerated (20,986 lines single-file)
Synced across ecosystem
- dario.c — same positional profile
- haiku.c — same positional profile
- ariannamethod.ai — README updated
Build
```sh
cc leo.c -O2 -lm -lsqlite3 -lpthread -DLEO_HAS_DNA -o leo
# or single-file:
cc neoleo.c -O2 -lm -lsqlite3 -lpthread -o neoleo
```

Resonance unbroken.
Leo 2.2 — Janus Architecture
Leo 2.2 — Language Emergent Organism | Janus Architecture
Janus Architecture — resonance-based AI architecture family by Arianna Method. Leo is a founding member alongside DoE, Molequla, and Janus.
What's new in 2.2
MetaLeo — The Inner Voice
If Leo is a recursion of the human, then MetaLeo is a recursion of Leo. Event-driven goroutine: dual generation at 0.8× and 1.2× base temperature, scores both candidates (coherence + diversity + entropy + length), loser enriches the field, winner shapes future responses.
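The candidate scoring can be sketched as a weighted sum over the four criteria the release lists; the weights and the length normalization below are assumptions, not MetaLeo's actual scoring code:

```c
#include <assert.h>

typedef struct {
    double coherence, diversity, entropy;
    int length;
} CandScore;

/* Weighted score over coherence + diversity + entropy + length.
 * Weights and the length window are illustrative. */
double metaleo_score(const CandScore *c) {
    double len_term = (c->length > 0 && c->length < 200) ? 1.0 : 0.5;
    return 0.4 * c->coherence + 0.3 * c->diversity + 0.2 * c->entropy + 0.1 * len_term;
}

/* Returns 0 if candidate a (generated at 0.8x temperature) wins, 1 for b (1.2x).
 * The loser's text would then enrich the field rather than be discarded. */
int metaleo_pick(const CandScore *a, const CandScore *b) {
    return metaleo_score(a) >= metaleo_score(b) ? 0 : 1;
}
```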
Episodic RAG
Ported from Python's episodes.py. Stores every conversation with internal metrics (entropy, novelty, arousal, trauma, quality, tau nudge). Cosine similarity search over metric vectors finds similar past states. When similar episodes had low quality → nudge temperature warmer. Ring buffer of 500 episodes + durable SQLite log.
Dual Tokenizer Architecture
Word-level semantic field + SubwordField BPE running in parallel. Word tokenizer captures what is being said. Subword captures how it's structured — punctuation, morphology, internal character patterns. Both feed the Dario equation (S signal for subword structure).
D.N.A. as Structure Distillation
Rewrote the D.N.A. description: not knowledge transfer but geometry extraction. Meta-weights — weights that don't exist but define the topology of the probability space. Two-layer system: skeleton of language (D.N.A. + BPE geometry) + dynamic speech growing on top.
The Dario Equation — 6 Signals
p(x|Φ) = softmax((B + α·H + β·F + γ·A + sw·S + T) / τ)
B=sequential chain, H=Hebbian resonance, F=prophecy, A=destiny, S=subword structure, T=trauma gravity.
Other changes:
- "Leo" always capitalized in generated output (post-processing in leo_generate)
- Hand-drawn logo by Oleg (assets/leo.png)
- 53 Go tests, all green
- 8 autonomous goroutines: dream, autosave, themeflow, trauma, overthinking, mathbrain, metaleo, episodes
- Fresh live speech examples with current tokenizer
- Janus Architecture classification table in README
- README: reduced bigram over-emphasis, comprehensive tokenizer mechanism documentation
Build
```sh
make                 # leo with D.N.A.
make inner           # Go inner world + 8 goroutines
./leo_inner --web    # HTTP on http://localhost:3000
```

Stats
- ~4300 LOC C (core organism)
- ~1000 LOC Go (inner world)
- 53 tests
- Zero pretrained weights at runtime
- Zero Python
Leo 2.1.0 — Inner World + SQLite Journal + GGUF Spores
What's New in 2.1.0
Python → Go Migration: Inner World
Five autonomous goroutines ported from Python's async modules to Go, plus a debug-only inner voice:
| Goroutine | Interval | What it does |
|---|---|---|
| Dream Dialog | 7 min | Leo dreams — generates internal monologues from memory |
| Autosave | 5 min | Periodic state persistence to SQLite journal |
| Theme Flow | 3 min | Tracks conversation themes, adjusts destiny vector |
| Trauma Watch | event-driven | Fires on identity-threatening input (score > 0.5) |
| Overthinking | event-driven | Rings after each reply — recursive self-reflection |
| Inner Voice | debug-only | --voice flag — streams internal monologue to stderr |
SQLite Journal
Full conversation persistence — WAL mode, 4 tables:
- `conversations` — session metadata
- `episodes` — individual exchanges with timestamps
- `metadata` — key-value state storage
- `voice_log` — inner voice captures
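The four tables might look roughly like this. An illustrative schema only: every column name beyond the descriptions above is assumed, not taken from Leo's source.

```sql
-- Hypothetical DDL for the journal; actual column layout in leo may differ.
PRAGMA journal_mode = WAL;

CREATE TABLE IF NOT EXISTS conversations (
  id         INTEGER PRIMARY KEY,
  started_at TEXT NOT NULL              -- session metadata
);
CREATE TABLE IF NOT EXISTS episodes (
  id              INTEGER PRIMARY KEY,
  conversation_id INTEGER REFERENCES conversations(id),
  ts              TEXT NOT NULL,        -- individual exchange with timestamp
  user_text       TEXT,
  leo_text        TEXT
);
CREATE TABLE IF NOT EXISTS metadata (
  key   TEXT PRIMARY KEY,               -- key-value state storage
  value TEXT
);
CREATE TABLE IF NOT EXISTS voice_log (
  id   INTEGER PRIMARY KEY,
  ts   TEXT NOT NULL,
  text TEXT                             -- inner voice captures
);
```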
GGUF Spore Export/Import
Portable organism state as GGUF v3 file:
- FNV-1a fingerprint for integrity
- Export: `leo --export spore.gguf`
- Import: `leo --import spore.gguf`
- Full state: co-occurrence matrix, prophecies, destiny, voices, episodes
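FNV-1a itself is a standard hash; here is a self-contained 64-bit version with the published offset basis and prime. Its role as the spore integrity fingerprint is from the release; everything else is just the textbook algorithm.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* FNV-1a 64-bit: xor each byte into the hash, then multiply by the FNV prime. */
uint64_t fnv1a64(const void *data, size_t len) {
    const unsigned char *p = (const unsigned char *)data;
    uint64_t h = 0xcbf29ce484222325ULL;       /* standard offset basis */
    for (size_t i = 0; i < len; i++) {
        h ^= p[i];
        h *= 0x100000001b3ULL;                /* standard FNV prime */
    }
    return h;
}
```

A single flipped byte in an exported spore yields a different fingerprint, which is how corruption is detectable on import.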
Testing
29 Go tests (up from 0):
- 7 core tests (bootstrap, generation, co-occurrence, prophecy, voices, destiny, memory sea)
- 13 inner world tests (goroutine lifecycle, trauma detection, dream output, theme tracking)
- 5 journal tests (CRUD, WAL mode, voice logging)
- 4 GGUF tests (export/import roundtrip, fingerprint, corruption detection)
Bug Fixes
- Fixed Go layer warnings and type mismatches
- Synced neoleo.c with all leo.c changes
- Inner voice moved to debug-only (`--voice` flag) — no longer spams stderr
- Cleaned up 62 Python test markdown files that leaked into the v2.0 release
- Added `.gitignore` rules for test artifacts
Documentation
- README: clarified "zero pretrained weights" — Leo inherits D.N.A. (structural geometry) from a trained Llama 3 ancestor, then discards the checkpoint
- Full architecture docs for Inner World, SQLite Journal, GGUF Spores
- Updated line count: 5000+ lines (C + Go)
New Files
- `inner/inner_world.go` — 570 lines, all goroutines
- `inner/inner_world_test.go` — 500 lines, 13 tests
- `inner/leo_bridge.c` — 108 lines, CGO bridge
- `tools/make_neoleo.sh` — build script
by Arianna Method