The Problem: Information Scattered, Context Lost
Every day, critical information about my life flows through disconnected systems. My budget lives in YNAB, tasks in Todoist, time tracking in Toggl, and knowledge in Obsidian. Each conversation with AI starts fresh - explaining who Cherry is (my wife), that Hiide is 3 and Kaede is 1, what projects I'm working on.
The friction wasn't just annoying - it was costing me insights. Patterns in my gratitude practice going unnoticed. Important context requiring repetitive explanations. Family milestones getting lost in note chaos.
I needed something different. Not another productivity app, but an operating system for life - one where AI actually understands my context and handles the integration complexity so I can focus on living.
What LifeOS Actually Is
LifeOS is my personal operating system built on three principles:
- Natural Language First: Speak naturally, let automation handle the rest
- Everything Connected: Finances, family, knowledge, and time in one ecosystem
- AI That Remembers: Persistent context that evolves with my life
Here's what that looks like in practice:
Me: "I'm grateful for Cherry's patience with the kids today"
LifeOS:
→ Creates dated gratitude note in Obsidian
→ Auto-links to Cherry's person note
→ Updates relationship strength metrics
→ Marks Todoist gratitude task complete
→ Indexes for semantic search
→ Updates living memory context

One natural sentence, six integrated actions. No app-switching, no manual entry, no context loss.
```mermaid
sequenceDiagram
    participant U as User
    participant FM as family-manager
    participant GE as gratitude-entry
    participant VM as vault-manager
    participant TD as Todoist
    participant QD as Qdrant
    U->>FM: "Grateful for Cherry's patience"
    FM->>FM: Detect gratitude keywords
    FM->>GE: Invoke /gratitude
    GE->>VM: Create note with frontmatter
    VM->>VM: Generate filename, validate
    VM-->>GE: Note created
    GE->>TD: Mark task complete
    GE->>QD: Index for semantic search
    GE-->>FM: Workflow complete
    FM-->>U: "Gratitude logged for Cherry"
```

The Architecture: 9 Skills, Zero Cloud Dependencies
```mermaid
graph TB
    subgraph "User Input Layer"
        NL[Natural Language]
        SC[Slash Commands]
    end
    subgraph "Skill Layer"
        FM[family-manager]
        FIN[finance-manager]
        VM[vault-manager]
        GE[gratitude-entry]
        VIA[vault-insight-analyzer]
        SE[skill-evolution]
    end
    subgraph "Memory Layer"
        LM[living-memory]
        VN[vault-notes]
    end
    subgraph "Integration Layer"
        YNAB[(YNAB)]
        TD[(Todoist)]
        TG[(Toggl)]
        QD[(Qdrant)]
    end
    NL --> FM
    NL --> FIN
    SC --> GE
    SC --> VIA
    FM --> GE
    FM --> VM
    FIN --> YNAB
    GE --> VM
    GE --> TD
    VIA --> QD
    VIA --> SE
    SE --> LM
    VM --> VN
    QD --> VN
```

The Skill Ecosystem
LifeOS runs on 9 specialized AI skills that compose complex workflows:
| Skill | Purpose |
|---|---|
| joey-living-memory | Evolving personal context (static core + dynamic state) |
| family-manager | Orchestrates gratitude, milestones, family time |
| finance-manager | Budget integration via YNAB API |
| vault-manager | Obsidian note creation and validation |
| gratitude-entry | Daily gratitude workflow automation |
| vault-insight-analyzer | Pattern detection and compliance metrics |
| skill-evolution | Meta-skill that learns from every interaction |
| date-calculator | Accurate age/date computations (never guesses) |
| api-integrations | External service connections (YNAB, Todoist, etc.) |
The key insight: skills compose. When I mention gratitude, family-manager detects it, invokes /gratitude, which uses gratitude-entry for workflow and vault-manager for file operations. Each skill focuses on one domain, but together they handle complex life orchestration.
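When a message arrives, trigger detection routes it to the right skill. A minimal sketch of that dispatch idea - the registry, function names, and keyword sets here are hypothetical illustrations, not the actual implementation, which lives in Claude's skill-selection layer:

```python
# Hypothetical sketch of keyword-based skill dispatch.
SKILLS = {
    "family-manager": {"grateful", "gratitude", "milestone", "family"},
    "finance-manager": {"budget", "spent", "ynab", "money"},
    "vault-manager": {"note", "vault"},
}

def dispatch(message: str) -> list[str]:
    """Return the skills whose trigger keywords appear in the message."""
    words = set(message.lower().split())
    return [name for name, triggers in SKILLS.items() if words & triggers]

print(dispatch("I'm grateful for Cherry's patience with the kids today"))
# A gratitude mention routes to family-manager, which then composes
# gratitude-entry and vault-manager for the actual workflow.
```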
Living Memory: The Secret Sauce
Unlike static knowledge bases, LifeOS maintains living memory - context that actively evolves:
```
living-memory/
├── static-context.md      # Rarely-changing identity (birthdate, role)
├── evolving-context.md    # Current state (updated weekly)
├── relationship-graph.md  # Connection strength metrics
├── preferences.md         # Communication style, dislikes
├── recent-insights.md     # Last 30 days patterns
└── update-log.md          # Change history
```

Every Sunday at 2am, vault-insight-analyzer scans my Obsidian vault, detects patterns in my gratitude entries, time tracking, and daily notes, then updates evolving-context.md with fresh insights.
The result? Claude actually knows:
- Cherry mentioned 15 times in gratitude this month (relationship thriving)
- Thesis hours trending up (+2h vs last week)
- Household routine compliance dipped mid-month (needs attention)
This isn't generic productivity advice. It's my life data driving personalized insights.
The Integration Layer: External Services
LifeOS connects to external services through MCP tools. The finance-manager skill demonstrates how AI can orchestrate multiple APIs:
- Connects to YNAB for budget data
- Fetches live exchange rates via Frankfurter API
- Presents unified view without manual data entry
The key insight: AI skills can abstract away integration complexity. Instead of switching between apps, natural language queries surface the information you need.
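As a concrete illustration, the exchange-rate half of that workflow is a single request to Frankfurter's public /latest endpoint. The helper names below are illustrative (the skill's real code isn't shown); the URL parameters and response shape follow Frankfurter's documented API:

```python
# Illustrative sketch of the Frankfurter lookup behind finance-manager.
from urllib.parse import urlencode

FRANKFURTER = "https://api.frankfurter.app/latest"

def rate_url(base: str, target: str) -> str:
    """Build the request URL for a single currency pair."""
    return f"{FRANKFURTER}?{urlencode({'from': base, 'to': target})}"

def extract_rate(payload: dict, target: str) -> float:
    """Pull one rate out of a Frankfurter response."""
    return payload["rates"][target]

# A response looks roughly like this (values vary by day):
sample = {"amount": 1.0, "base": "USD", "date": "2026-01-12",
          "rates": {"NZD": 1.64}}
print(rate_url("USD", "NZD"))
print(extract_rate(sample, "NZD"))
```

No API key is needed, which is exactly why it slots in as a free integration.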
The Vector Search Layer: Local, Private, Powerful
Every note I write gets indexed into a local Qdrant vector database using sentence-transformers/all-MiniLM-L6-v2 embeddings.
Why local?
- Zero API costs (vs cloud embeddings)
- Complete privacy (no data leaves my machine)
- Offline capability
- Sub-100ms search latency
What it enables:
/search "cherry gratitude"
Results:
1. 2026-01-10 Gratitude for Cherry.md (0.92 similarity)
2. 2025-12-28 Gratitude for Cherry's patience.md (0.89)
3. 2025-11-15 Family milestone - Cherry's birthday.md (0.84)But it's not just keyword matching. Semantic search understands:
- "feeling thankful" finds gratitude entries
- "money worries" finds finance notes
- "kids growing up" finds milestone entries
The vault currently indexes 50+ notes, but the architecture handles thousands without performance degradation.
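Those similarity scores are cosine similarities between embedding vectors; Qdrant computes them at scale over the 384-dimension MiniLM vectors. A dependency-free sketch of the underlying math, with toy 3-dimension vectors standing in for real embeddings:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: dot product over the product of magnitudes."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "embeddings" -- real ones have 384 dimensions.
thankful = [0.9, 0.1, 0.0]
grateful = [0.8, 0.2, 0.1]
budget   = [0.1, 0.0, 0.9]

print(round(cosine(thankful, grateful), 2))  # high: semantically close
print(round(cosine(thankful, budget), 2))    # low: unrelated topics
```

Nearby vectors mean nearby meanings, which is why "feeling thankful" surfaces gratitude entries without sharing a single keyword.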
Slash Commands: Rapid Workflows
Beyond skills, LifeOS provides 8 slash commands for common workflows:
| Command | Purpose |
|---|---|
| /gratitude | Daily gratitude practice with auto-linking |
| /milestone | Log family milestones for kids |
| /encounter | Record person meetings |
| /search <query> | Semantic search across vault + memory |
| /daily | Daily routine check-in |
| /weekly | Weekly review with analytics |
| /what-do-you-know | Summary of living memory |
| /update-memory | Sync skills from conversation |
Example /weekly output:
```markdown
## Weekly Insights (Jan 6-12, 2026)

### Compliance Dashboard
- Gratitude Practice: 7 entries, 18 day streak
- Thesis Hours: 8h / 6h target (133%)
- Household Routines: 85% complete
- Daily Notes: 7/7 days logged

### Top Relationships This Week
1. Cherry (7 mentions)
2. Hiide (5 mentions)
3. Kaede (3 mentions)

### Observations
- Gratitude streaks strongest when capturing family moments
- Thursday productivity dip correlates with late dinner
```

The Self-Evolving System
The most powerful aspect of LifeOS is that it learns. The skill-evolution meta-skill watches for:
Explicit signals:
- "I prefer..." statements
- Template change requests
- Workflow friction complaints
Implicit signals:
- Patterns across 3+ uses
- Skill sections consistently unused
- Vault structure changes
When evolution triggers, it updates the relevant skills and logs to evolution-log.md:
```markdown
## 2025-10-31 - Workflow Improvement Discovery

**Trigger:** Pattern detected across 3+ uses
**Type:** Process optimization

**Discovery:**
- Gratitude entries consistently needed person linking
- Manual linking took 30+ seconds per entry
- Auto-detection could eliminate this friction

**Files Updated:**
1. gratitude-entry/SKILL.md - Added auto-link detection
2. family-manager/SKILL.md - Updated person matching
3. evolving-context.md - Workflow preferences updated
```

This creates a feedback loop where the AI system genuinely improves from use.
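The "pattern across 3+ uses" trigger can be pictured as a simple occurrence counter. This is a hypothetical sketch - the real skill-evolution logic lives in skill instructions, not user code:

```python
# Hypothetical sketch of the 3+ occurrence threshold behind evolution triggers.
from collections import Counter

THRESHOLD = 3
observations = Counter()

def observe(pattern: str) -> bool:
    """Record one occurrence; True means the pattern warrants a skill update."""
    observations[pattern] += 1
    return observations[pattern] >= THRESHOLD

for _ in range(3):
    triggered = observe("gratitude-needs-person-link")
print(triggered)  # True on the third observation
```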
Technical Implementation: Under the Hood
The MCP Architecture
LifeOS uses Claude's Model Context Protocol (MCP) for all integrations. Here's how the vector search MCP server is structured:
```python
# vector-mcp/server.py (simplified)
from mcp.server import Server
from qdrant_client import QdrantClient
from sentence_transformers import SentenceTransformer

app = Server("vector-mcp")
qdrant = QdrantClient(host="localhost", port=6333)
_model = None  # embedding model is loaded lazily, not at startup

def get_model():
    """Load the 384-dim MiniLM model on first use."""
    global _model
    if _model is None:
        _model = SentenceTransformer('all-MiniLM-L6-v2')
    return _model

@app.tool()
async def search(query: str, collection: str, limit: int = 5):
    """Semantic search across vault notes."""
    query_vector = get_model().encode(query).tolist()
    results = qdrant.search(
        collection_name=collection,
        query_vector=query_vector,
        limit=limit,
        with_payload=True,
    )
    return [{"path": r.payload["path"], "score": r.score} for r in results]
```

The key architectural decision: lazy-load the embedding model. sentence-transformers pulls several hundred megabytes into memory, and doing that at MCP server startup blocks Claude Code initialization. Deferring the load to the first search request avoids this.
Skill File Structure
Each skill follows a consistent pattern:
```markdown
# SKILL.md

## Purpose
One-sentence description of what this skill does.

## Triggers
- Keywords that activate this skill
- Commands that invoke it

## Workflow
1. Step one
2. Step two
3. Step three

## Integration Points
- Which MCPs it uses
- Which skills it composes

## Auto-Behaviors
- What happens automatically
- When updates trigger
```

This structure lets skills compose - family-manager can read gratitude-entry's workflow and invoke it correctly.
The Vault Insight Analysis
The vault-insight-analyzer skill runs weekly to detect patterns. Here's the core logic:
```bash
# Count gratitude entries by person
grep -r "\[\[.*\]\]" ~/Documents/brain/lifeos/40\ Time/Gratitudes/ | \
  grep -o "\[\[.*\]\]" | sort | uniq -c | sort -nr

# Output:
#   15 [[Hipolito, Cherry Jayne|Cherry]]
#    8 [[Hipolito, Hiide Illumi|Hiide]]
#    5 [[Hipolito, Kaede Miyuki|Kaede]]
```

This feeds directly into relationship-graph.md for connection strength tracking.
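The same tally can be computed portably in Python with a wikilink regex and collections.Counter - an equivalent sketch of the grep pipeline, operating on note text passed in as a string (the file-reading wrapper is omitted):

```python
import re
from collections import Counter

WIKILINK = re.compile(r"\[\[([^\]]+)\]\]")

def count_mentions(text: str) -> Counter:
    """Count [[wikilink]] targets in a blob of note text."""
    return Counter(WIKILINK.findall(text))

notes = """
Grateful for [[Hipolito, Cherry Jayne|Cherry]]'s patience today.
[[Hipolito, Hiide Illumi|Hiide]] said his first full sentence.
More thanks for [[Hipolito, Cherry Jayne|Cherry]].
"""
for target, n in count_mentions(notes).most_common():
    print(n, target)
```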
The Integration Stack
LifeOS connects to:
| Integration | Purpose |
|---|---|
| YNAB | Budget tracking and financial data |
| Todoist | Task tracking across life domains |
| Toggl | Time tracking for thesis, work, family |
| Obsidian | Knowledge vault (source of truth) |
| Qdrant | Local vector database |
| Frankfurter.app | Live exchange rates (free, no API key) |
All through MCP (Model Context Protocol) - Claude's native integration layer. No custom middleware, no API stitching, just clean tool definitions.
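For completeness, wiring a server like vector-mcp into Claude Code is a few lines of project-scoped MCP configuration. The command and path below are illustrative, not the actual config:

```json
{
  "mcpServers": {
    "vector-mcp": {
      "command": "python",
      "args": ["vector-mcp/server.py"]
    }
  }
}
```

Claude Code reads this, spawns the server, and exposes its tools - no glue code required.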
Results: What LifeOS Enables
After 6 months of daily use:
Quantitative:
- Gratitude streak: 100+ days (previously inconsistent)
- Thesis tracking: 6+ hours/week consistently hit
- Context restoration: 0 minutes (vs 5-10 min previously explaining who people are)
- Note creation: ~5 seconds vs 2+ minutes manual entry
Qualitative:
- Relationship patterns visible (who I'm grateful for most)
- Family milestones never lost
- AI conversations start with context, not explanation
- Reduced friction means habits actually stick
What I Learned Building This
Natural Language First is Non-Negotiable
The moment I require specific syntax, friction wins. Everything in LifeOS accepts natural speech.
Local > Cloud for Personal Data
My life data doesn't need to leave my machine. Local embeddings are good enough for personal knowledge management.
Skills Should Compose, Not Monolith
9 focused skills that call each other beats 1 massive skill that tries to do everything.
Evolution > Perfection
The system that learns from use beats the perfectly designed system that stays static.
Vault as Truth
When AI memory and vault content diverge, vault wins. Always.
Try It Yourself
LifeOS is built on publicly available tools:
- Claude Code - AI interface with MCP support
- Obsidian - Knowledge vault
- Qdrant - Vector database (Docker)
- sentence-transformers - Local embeddings
- YNAB/Todoist/Toggl - Life integrations
The secret isn't the tools - it's the skill architecture and living memory pattern. Any serious knowledge worker can build something similar.
The question isn't whether you need a personal operating system. It's whether you can afford not to have one as AI becomes central to how we work and live.
LifeOS represents 6 months of iteration on personal knowledge management. The system continues to evolve - that's kind of the point.
What's Next?
Currently exploring:
- Voice integration for hands-free logging
- Photo/video milestone memories
- Health data integration via Apple Health
- Investment tracking via KiwiSaver APIs
The beautiful thing about LifeOS: each new capability is just another skill. The architecture scales.
If you're building something similar, I'd love to hear about it. The personal OS space is still early, and there's much to learn from each other's approaches.