Add development plan with 13 milestone specifications
- docs/plan.md: Master roadmap with phases and priorities
- docs/milestones/01-13: Detailed specs for each feature
- Updated CLAUDE.md with plan references and build commands

Milestones cover:
- Phase 1: Temporal versioning, auto-capture, context injection, codebase indexing
- Phase 2: Daily journal, content ingestion, graph visualization, import/export
- Phase 3: Multi-graph, smart retrieval, TUI dashboard, browser extension, shell completions
docs/milestones/01-temporal-versioning.md (new file, 140 lines)
@@ -0,0 +1,140 @@
# Milestone 1: Temporal Versioning

## Overview

Track how knowledge evolves over time. Every node change creates a version, enabling "time travel" queries like "what did I know about authentication last month?"

## Motivation

- Facts change — what was true yesterday may not be true today
- Debugging requires knowing historical context
- AI agents need to reason about temporal relationships
- Prevents accidental knowledge loss from overwrites

## Features

### 1.1 Node Versioning

Every update to a node creates a new version instead of overwriting.

```sql
-- New table
CREATE TABLE node_versions (
  id TEXT PRIMARY KEY,
  node_id TEXT NOT NULL,
  version INTEGER NOT NULL,
  title TEXT NOT NULL,
  content TEXT,
  status TEXT,
  tags TEXT,
  metadata TEXT,
  valid_from INTEGER NOT NULL,
  valid_until INTEGER,  -- NULL = current version
  created_by TEXT,      -- 'user', 'auto-capture', 'merge', etc.
  FOREIGN KEY (node_id) REFERENCES nodes(id)
);

CREATE INDEX idx_versions_node ON node_versions(node_id, version);
CREATE INDEX idx_versions_time ON node_versions(valid_from, valid_until);
```

### 1.2 Time-Travel Queries

```bash
# Show node as it was on a specific date
cortex show abc123 --at "2024-01-15"

# Query the graph at a point in time
cortex query "authentication" --at "2024-01-15"

# Show version history
cortex history abc123
```

### 1.3 Diff & Compare

```bash
# Compare two versions
cortex diff abc123 --v1 3 --v2 5

# Compare node across time
cortex diff abc123 --from "2024-01-01" --to "2024-02-01"
```

### 1.4 MCP Tools

```typescript
// New MCP tools
memory_history   // Get version history for a node
memory_show_at   // Show node at specific timestamp
memory_diff      // Compare versions
```

## Implementation

### Database Changes

1. Add `node_versions` table
2. Add `version` column to `nodes` table (current version number)
3. Modify `updateNode()` to create version records
4. Add `valid_from`/`valid_until` to enable point-in-time queries

### API Changes

```typescript
// store.ts additions
export function getNodeHistory(id: string): NodeVersion[];
export function getNodeAtTime(id: string, timestamp: number): Node | null;
export function diffVersions(id: string, v1: number, v2: number): NodeDiff;
```
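Point-in-time lookup selects the version whose validity window contains the timestamp. A minimal sketch of the core check, shown here against an in-memory version list (the camelCase field names mirror the schema columns above; the real `getNodeAtTime` would run this as SQL):

```typescript
interface NodeVersion {
  nodeId: string;
  version: number;
  title: string;
  validFrom: number;          // inclusive, mirrors valid_from
  validUntil: number | null;  // exclusive; null = current version
}

// Mirrors: valid_from <= t AND (valid_until IS NULL OR valid_until > t)
function versionAtTime(
  versions: NodeVersion[],
  nodeId: string,
  timestamp: number
): NodeVersion | null {
  return versions.find(v =>
    v.nodeId === nodeId &&
    v.validFrom <= timestamp &&
    (v.validUntil === null || v.validUntil > timestamp)
  ) ?? null;
}
```

Closing the old version's `valid_until` and opening the new one's `valid_from` at the same timestamp keeps the windows non-overlapping, so at most one version matches any instant.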
### Migration Strategy

1. Create new tables
2. Migrate existing nodes as version 1
3. Set `valid_from` to `created_at` for existing nodes
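The three steps can run as an ordered list of statements inside one transaction; a sketch (statement text is illustrative and abridged relative to the full schema in 1.1):

```typescript
// Ordered migration statements, to run inside a single transaction.
const MIGRATION_STEPS: string[] = [
  // 1. Create the new table (abridged; full definition in 1.1)
  `CREATE TABLE IF NOT EXISTS node_versions (
     id TEXT PRIMARY KEY, node_id TEXT NOT NULL, version INTEGER NOT NULL,
     title TEXT NOT NULL, content TEXT, valid_from INTEGER NOT NULL,
     valid_until INTEGER, created_by TEXT)`,
  // 2. Current-version counter on the live row
  `ALTER TABLE nodes ADD COLUMN version INTEGER NOT NULL DEFAULT 1`,
  // 3. Seed existing nodes as version 1: valid_from copies created_at,
  //    valid_until stays NULL (still the current version)
  `INSERT INTO node_versions (id, node_id, version, title, content, valid_from, created_by)
   SELECT id || ':v1', id, 1, title, content, created_at, 'migration' FROM nodes`,
];

// Runner is storage-agnostic: pass in any statement executor.
function runMigration(exec: (sql: string) => void): void {
  for (const sql of MIGRATION_STEPS) exec(sql);
}
```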
## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex history <id>` | Show version history |
| `cortex show <id> --at <date>` | Show node at point in time |
| `cortex diff <id> [--v1 N] [--v2 M]` | Compare versions |
| `cortex restore <id> --version <N>` | Restore old version |

## Testing

- [ ] Create node, update 3 times, verify 3 versions exist
- [ ] Query at historical timestamp returns correct version
- [ ] Diff shows actual changes between versions
- [ ] Restore creates new version (not destructive)
- [ ] Migration preserves existing data

## Acceptance Criteria

- [ ] All node updates create version records
- [ ] `--at` flag works on show/query commands
- [ ] History command shows all versions with timestamps
- [ ] Diff output is human-readable
- [ ] MCP tools expose versioning functionality
- [ ] No performance regression on normal operations (<10% slower)

## Estimated Effort

- Database schema: 2 hours
- Store layer changes: 4 hours
- CLI commands: 3 hours
- MCP tools: 2 hours
- Testing: 2 hours
- **Total: ~13 hours**

## Dependencies

- None (can start immediately)

## References

- [Zep Temporal Knowledge Graph](https://arxiv.org/abs/2501.13956)
- [Datomic's immutable database model](https://www.datomic.com/)
- [Event sourcing patterns](https://martinfowler.com/eaaDev/EventSourcing.html)
docs/milestones/02-auto-capture.md (new file, 212 lines)
@@ -0,0 +1,212 @@
# Milestone 2: Auto-Capture System

## Overview

Automatically capture Claude Code conversations as memory nodes. Every significant interaction becomes searchable knowledge without manual effort.

## Motivation

- Manual memory creation is friction that prevents adoption
- Conversations contain valuable context that's lost after the session
- Supermemory's killer feature is invisible capture
- "Just works" experience is essential for daily use

## Features

### 2.1 Conversation Hooks

Integrate with Claude Code's hook system to capture interactions.

```json
// .claude/settings.json
{
  "hooks": {
    "post_response": {
      "command": "cortex capture-hook",
      "timeout": 5000
    }
  }
}
```

### 2.2 Smart Summarization

Don't store raw conversations — summarize and extract:

```typescript
interface CapturedMemory {
  summary: string;         // 1-2 sentence summary
  topics: string[];        // Extracted topics/tags
  decisions: string[];     // Any decisions made
  codeChanges: string[];   // Files modified
  relatedNodes: string[];  // Links to existing nodes
}
```

### 2.3 Deduplication

Prevent duplicate memories from similar conversations:

- Embedding similarity check before insert
- Merge into existing node if >0.90 similarity
- Link as "relates_to" if 0.75-0.90 similarity
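The threshold logic reduces to a pure decision over cosine similarity; a sketch using the thresholds from the bullets above (the `DedupAction` name is illustrative):

```typescript
type DedupAction = 'merge' | 'link' | 'create';

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// >0.90 -> merge into the existing node; 0.75-0.90 -> link as relates_to;
// otherwise create a fresh node.
function dedupAction(similarity: number): DedupAction {
  if (similarity > 0.90) return 'merge';
  if (similarity >= 0.75) return 'link';
  return 'create';
}
```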
### 2.4 Capture Modes

```bash
# Always capture (default when enabled)
cortex config set capture.mode always

# Only capture on explicit save
cortex config set capture.mode manual

# Capture decisions and code changes only
cortex config set capture.mode decisions

# Disable capture
cortex config set capture.mode off
```

### 2.5 MCP Notification

Add MCP tools for Claude to explicitly save memories:

```typescript
memory_capture   // Explicitly capture current context
memory_remember  // "Remember this for later" - user triggered
```

## Implementation

### Hook Handler

```typescript
// src/cli/commands/capture-hook.ts
export async function captureHook(input: HookInput): Promise<void> {
  const { conversation, files_changed, session_id } = input;

  // Skip if disabled or trivial interaction
  if (!shouldCapture(conversation)) return;

  // Summarize with Ollama
  const summary = await summarizeConversation(conversation);

  // Extract structured data
  const extracted = await extractMemory(conversation, summary);

  // Check for duplicates
  const existing = await findSimilar(extracted.embedding);

  if (existing && existing.similarity > 0.90) {
    // Merge into existing
    await mergeIntoNode(existing.id, extracted);
  } else {
    // Create new node
    await addNode({
      kind: 'memory',
      title: extracted.summary.slice(0, 100),
      content: extracted.fullSummary,
      tags: ['auto-capture', ...extracted.topics],
      metadata: {
        session_id,
        files_changed,
        captured_at: Date.now(),
        source: 'claude-code'
      }
    });
  }
}
```

### Summarization Prompts

```typescript
const SUMMARIZE_PROMPT = `
Summarize this Claude Code conversation in 1-2 sentences.
Focus on: what was accomplished, decisions made, problems solved.
Do NOT include greetings or meta-discussion.

Conversation:
{conversation}

Summary:`;

const EXTRACT_PROMPT = `
Extract from this conversation:
1. Main topics (as tags, lowercase, hyphenated)
2. Decisions made (if any)
3. Code files discussed or modified

Conversation:
{conversation}

Output as JSON:
{"topics": [], "decisions": [], "files": []}`;
```

### Configuration

```typescript
// src/core/config.ts
interface CaptureConfig {
  mode: 'always' | 'manual' | 'decisions' | 'off';
  minLength: number;          // Minimum conversation length to capture
  excludePatterns: string[];  // Regex patterns to skip
  autoTag: boolean;           // Auto-generate tags
  linkRelated: boolean;       // Auto-link to related nodes
}
```
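The `shouldCapture` guard used in the hook handler is not specified; a minimal sketch against a subset of the `CaptureConfig` fields (the exact filtering rules, especially for `decisions` mode, are assumptions):

```typescript
interface CaptureGuardConfig {
  mode: 'always' | 'manual' | 'decisions' | 'off';
  minLength: number;          // Minimum conversation length to capture
  excludePatterns: string[];  // Regex patterns to skip
}

function shouldCapture(conversation: string, config: CaptureGuardConfig): boolean {
  // 'off' never captures; 'manual' waits for an explicit save
  if (config.mode === 'off' || config.mode === 'manual') return false;
  // Skip trivial exchanges
  if (conversation.length < config.minLength) return false;
  // Skip anything matching an exclude pattern (e.g. secrets)
  // Note: 'decisions' mode would filter further after extraction.
  return !config.excludePatterns.some(p => new RegExp(p).test(conversation));
}
```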
## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex capture-hook` | Hook handler (called by Claude Code) |
| `cortex capture <text>` | Manually capture a thought |
| `cortex capture --session` | Capture current session summary |
| `cortex config set capture.mode <mode>` | Set capture mode |

## MCP Tools

| Tool | Description |
|------|-------------|
| `memory_capture` | Explicitly capture current context |
| `memory_remember` | User-triggered "remember this" |

## Testing

- [ ] Hook receives conversation data correctly
- [ ] Summarization produces coherent summaries
- [ ] Deduplication prevents duplicate nodes
- [ ] Tags are extracted accurately
- [ ] Related nodes are linked
- [ ] Config modes work as expected

## Acceptance Criteria

- [ ] Conversations auto-captured without user action
- [ ] Summaries are concise and useful
- [ ] No duplicate memories for repeated topics
- [ ] Can disable/configure capture behavior
- [ ] Works offline (queues for later if Ollama unavailable)
- [ ] <500ms latency (non-blocking)

## Estimated Effort

- Hook integration: 3 hours
- Summarization pipeline: 4 hours
- Deduplication logic: 3 hours
- Configuration system: 2 hours
- MCP tools: 2 hours
- Testing: 3 hours
- **Total: ~17 hours**

## Dependencies

- Ollama for summarization (graceful fallback if unavailable)
- Claude Code hooks system

## References

- [Claude-Supermemory](https://github.com/supermemoryai/claude-supermemory)
- [Claude Code Hooks](https://docs.anthropic.com/claude-code/hooks)
docs/milestones/03-context-injection.md (new file, 238 lines)
@@ -0,0 +1,238 @@
# Milestone 3: Context Injection

## Overview

Automatically inject relevant memories into Claude's context at session start. Claude begins every conversation already "knowing" relevant history.

## Motivation

- Eliminates "Claude doesn't remember" frustration
- No manual "let me search my notes" workflow
- Supermemory's most impactful feature
- Makes memory system invisible to users

## Features

### 3.1 Session Start Hook

```json
// .claude/settings.json
{
  "hooks": {
    "session_start": {
      "command": "cortex context-hook",
      "timeout": 3000
    }
  }
}
```

### 3.2 Context Selection

Intelligently select what to inject:

```typescript
interface ContextSelection {
  recentWork: Node[];        // Last 24-48 hours of activity
  projectContext: Node[];    // Indexed codebase info
  relevantMemories: Node[];  // Semantic match to current directory
  userPreferences: Node[];   // User's stated preferences
  openTasks: Node[];         // Incomplete tasks
}
```

### 3.3 Context Budget

Manage token usage with configurable limits:

```typescript
interface ContextConfig {
  maxTokens: number;        // Default: 4000
  maxNodes: number;         // Default: 20
  includeRecent: boolean;   // Include recent work
  includeProject: boolean;  // Include project index
  includeTasks: boolean;    // Include open tasks
  customQuery?: string;     // Additional filter query
}
```
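Enforcing the budget can be a greedy pass over score-ranked candidates; a sketch of a `selectWithinBudget` helper using a rough 4-characters-per-token estimate (the estimate and the `Candidate` shape are assumptions, not the final API):

```typescript
interface Candidate { title: string; content: string; score: number; }

// Crude token estimate: roughly 4 characters per token.
const estimateTokens = (n: { title: string; content: string }) =>
  Math.ceil((n.title.length + n.content.length) / 4);

function selectWithinBudget(
  candidates: Candidate[],
  maxTokens: number,
  maxNodes = 20
): Candidate[] {
  const ranked = [...candidates].sort((a, b) => b.score - a.score);
  const selected: Candidate[] = [];
  let used = 0;
  for (const c of ranked) {
    if (selected.length >= maxNodes) break;
    const cost = estimateTokens(c);
    if (used + cost > maxTokens) continue;  // skip, a cheaper candidate may still fit
    selected.push(c);
    used += cost;
  }
  return selected;
}
```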
### 3.4 Priority Ranking

Rank memories for inclusion:

1. **Project-specific** — Nodes tagged with current project
2. **Recent** — Accessed in last 48 hours
3. **Tasks** — Open todos/tasks
4. **High-relevance** — Semantic match to project files
5. **Decisions** — Past architectural decisions

### 3.5 MCP Resource

Expose context as an MCP resource:

```typescript
// MCP resource for context
{
  uri: "memory://context/current",
  name: "Current Context",
  description: "Relevant memories for this session"
}
```

## Implementation

### Context Hook

```typescript
// src/cli/commands/context-hook.ts
export async function contextHook(): Promise<string> {
  const cwd = process.cwd();
  const projectName = path.basename(cwd);

  // Gather context candidates
  const candidates = await gatherCandidates(cwd, projectName);

  // Rank and select within budget
  const selected = selectWithinBudget(candidates, config.maxTokens);

  // Format for Claude
  return formatContext(selected);
}

async function gatherCandidates(cwd: string, project: string): Promise<RankedNode[]> {
  const results: RankedNode[] = [];

  // Recent work (last 48h)
  const recent = await listNodes({
    limit: 10,
    // Custom: accessed in last 48h
  });
  results.push(...recent.map(n => ({ node: n, score: 0.8, reason: 'recent' })));

  // Project-tagged nodes
  const projectNodes = await listNodes({ tags: [project] });
  results.push(...projectNodes.map(n => ({ node: n, score: 0.9, reason: 'project' })));

  // Open tasks
  const tasks = await listNodes({ kind: 'task', status: 'todo' });
  results.push(...tasks.map(n => ({ node: n, score: 0.7, reason: 'task' })));

  // Semantic search on README/package.json
  const projectContext = await getProjectContext(cwd);
  if (projectContext) {
    const semantic = await query(projectContext, { limit: 10 });
    results.push(...semantic.map(n => ({ node: n.node, score: n.score, reason: 'semantic' })));
  }

  return dedupeAndRank(results);
}
```
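`dedupeAndRank` is referenced above but not defined; a sketch that keeps the best-scoring entry per node ID and sorts descending (assumes nodes carry a stable `id`):

```typescript
interface RankedNode { node: { id: string }; score: number; reason: string; }

function dedupeAndRank(results: RankedNode[]): RankedNode[] {
  const best = new Map<string, RankedNode>();
  for (const r of results) {
    const prev = best.get(r.node.id);
    // A node gathered via several paths keeps only its strongest reason/score
    if (!prev || r.score > prev.score) best.set(r.node.id, r);
  }
  return [...best.values()].sort((a, b) => b.score - a.score);
}
```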
### Context Formatting

```typescript
function formatContext(nodes: RankedNode[]): string {
  const sections: string[] = [];

  // Group by reason
  const byReason = groupBy(nodes, 'reason');

  if (byReason.project?.length) {
    sections.push(`## Project Context\n${formatNodes(byReason.project)}`);
  }

  if (byReason.recent?.length) {
    sections.push(`## Recent Work\n${formatNodes(byReason.recent)}`);
  }

  if (byReason.task?.length) {
    sections.push(`## Open Tasks\n${formatNodes(byReason.task)}`);
  }

  if (byReason.semantic?.length) {
    sections.push(`## Related Memories\n${formatNodes(byReason.semantic)}`);
  }

  return sections.join('\n\n');
}
```

### Configuration

```bash
# Configure context injection
cortex config set context.maxTokens 4000
cortex config set context.maxNodes 20
cortex config set context.includeRecent true
cortex config set context.includeTasks true
```

## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex context-hook` | Hook handler (outputs context to stdout) |
| `cortex context` | Preview what context would be injected |
| `cortex context --project <name>` | Show context for specific project |

## MCP Integration

```typescript
// MCP resource
server.resource({
  uri: 'memory://context/current',
  name: 'Session Context',
  mimeType: 'text/markdown',
  async read() {
    const context = await generateContext();
    return { text: context };
  }
});

// MCP tool
server.tool('memory_context', 'Get relevant context for current session', {
  project: z.string().optional(),
  maxTokens: z.number().optional()
}, async ({ project, maxTokens }) => {
  const context = await generateContext({ project, maxTokens });
  return { content: [{ type: 'text', text: context }] };
});
```

## Testing

- [ ] Hook outputs valid markdown
- [ ] Context respects token budget
- [ ] Project-specific nodes ranked highest
- [ ] Recent nodes included
- [ ] Open tasks included
- [ ] No duplicate nodes in output
- [ ] Performance <500ms

## Acceptance Criteria

- [ ] Context auto-injected at session start
- [ ] Relevant memories appear without user action
- [ ] Token budget respected
- [ ] Configurable what to include
- [ ] Works in any project directory
- [ ] Graceful when no relevant memories

## Estimated Effort

- Hook implementation: 3 hours
- Context gathering: 4 hours
- Ranking algorithm: 3 hours
- Configuration: 2 hours
- MCP integration: 2 hours
- Testing: 2 hours
- **Total: ~16 hours**

## Dependencies

- Milestone 4 (Codebase Indexing) enhances this but is not required

## References

- [Claude-Supermemory context injection](https://github.com/supermemoryai/claude-supermemory)
- [RAG best practices](https://www.anthropic.com/research/rag)
docs/milestones/04-codebase-indexing.md (new file, 260 lines)
@@ -0,0 +1,260 @@
# Milestone 4: Codebase Indexing

## Overview

Automatically scan and index project structure, creating component nodes for modules, services, and architectural patterns. Claude understands your codebase from day one.

## Motivation

- New projects require extensive explanation to Claude
- Architecture decisions are scattered across files
- Component relationships aren't captured anywhere
- Supermemory's `/index` command is highly valued

## Features

### 4.1 Project Scanner

```bash
# Index current project
cortex index .

# Index specific directory
cortex index ./src

# Re-index (update existing)
cortex index . --update

# Index with specific depth
cortex index . --depth 3
```

### 4.2 Auto-Detection

Detect project type and extract relevant info:

| Project Type | Detection | Extracts |
|--------------|-----------|----------|
| Node.js | `package.json` | Dependencies, scripts, name |
| Python | `pyproject.toml`, `setup.py` | Dependencies, entry points |
| Rust | `Cargo.toml` | Crates, features |
| Go | `go.mod` | Modules, dependencies |
| Generic | `README.md` | Description, setup |

### 4.3 Component Extraction

Create nodes for discovered components:

```typescript
interface IndexedComponent {
  kind: 'component';
  title: string;    // e.g., "UserService"
  content: string;  // Description + key exports
  tags: string[];   // ['backend', 'service', 'auth']
  metadata: {
    filePath: string;
    language: string;
    exports: string[];
    imports: string[];
    loc: number;
  };
}
```

### 4.4 Relationship Mapping

Auto-create edges based on imports/dependencies:

```typescript
// File A imports from File B
addEdge(componentA.id, componentB.id, 'depends_on');

// Directory contains files
addEdge(directoryNode.id, fileNode.id, 'contains');

// Module implements interface
addEdge(impl.id, iface.id, 'implements');
```

### 4.5 Architecture Summary

Generate a high-level architecture node:

```typescript
const architectureNode = {
  kind: 'component',
  title: `${projectName} Architecture`,
  content: `
## Overview
${projectDescription}

## Tech Stack
- Runtime: ${runtime}
- Framework: ${framework}
- Database: ${database}

## Key Components
${components.map(c => `- **${c.title}**: ${c.summary}`).join('\n')}

## Directory Structure
${directoryTree}
`,
  tags: ['architecture', 'index', projectName],
};
```

### 4.6 Incremental Updates

Track indexed files and only re-process changes:

```typescript
interface IndexState {
  projectPath: string;
  lastIndexed: number;
  fileHashes: Record<string, string>;  // path -> content hash
  nodeIds: Record<string, string>;     // path -> node ID
}
```
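The `fileHashes` map drives skip decisions during re-indexing; a sketch of the check (the hash function here is a toy stand-in for illustration; a real indexer would likely hash file bytes with SHA-256):

```typescript
interface SkipState {
  fileHashes: Record<string, string>;  // path -> content hash
}

// Toy rolling hash, stand-in for a real digest.
function contentHash(content: string): string {
  let h = 0;
  for (let i = 0; i < content.length; i++) h = (h * 31 + content.charCodeAt(i)) | 0;
  return h.toString(16);
}

// Skip files whose content hash matches the previous indexing run.
function shouldSkip(path: string, content: string, state: SkipState): boolean {
  return state.fileHashes[path] === contentHash(content);
}

// Record a file's hash after it has been (re)indexed.
function markIndexed(path: string, content: string, state: SkipState): void {
  state.fileHashes[path] = contentHash(content);
}
```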
## Implementation

### Scanner Architecture

```typescript
// src/core/indexer/index.ts
export async function indexProject(root: string, options: IndexOptions): Promise<IndexResult> {
  // Detect project type
  const projectType = await detectProjectType(root);

  // Load existing index state
  const state = await loadIndexState(root);

  // Scan files
  const files = await scanFiles(root, {
    ignore: [...DEFAULT_IGNORE, ...options.ignore],
    maxDepth: options.depth,
  });

  // Process each file
  const components: IndexedComponent[] = [];
  for (const file of files) {
    if (shouldSkip(file, state)) continue;

    const component = await extractComponent(file, projectType);
    if (component) {
      components.push(component);
    }
  }

  // Create/update nodes
  const nodes = await upsertComponents(components, state);

  // Map relationships
  const edges = await mapRelationships(nodes, files);

  // Generate architecture summary
  await generateArchitectureSummary(root, projectType, nodes);

  // Save state
  await saveIndexState(root, state);

  return { indexed: nodes.length, relationships: edges.length };
}
```

### Language Parsers

```typescript
// src/core/indexer/parsers/typescript.ts
export async function parseTypeScript(file: string): Promise<ParsedFile> {
  // Use TypeScript compiler API or tree-sitter
  const content = await fs.readFile(file, 'utf8');
  const ast = ts.createSourceFile(file, content, ts.ScriptTarget.Latest);

  return {
    exports: extractExports(ast),
    imports: extractImports(ast),
    classes: extractClasses(ast),
    functions: extractFunctions(ast),
    interfaces: extractInterfaces(ast),
  };
}

// Parsers for: JavaScript, Python, Rust, Go, etc.
```

### Ignore Patterns

```typescript
const DEFAULT_IGNORE = [
  'node_modules',
  '.git',
  'dist',
  'build',
  '__pycache__',
  '.env*',
  '*.min.js',
  '*.map',
  'coverage',
  '.next',
  'target',  // Rust
  'vendor',  // Go
];
```

## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex index [path]` | Index project at path |
| `cortex index --update` | Update existing index |
| `cortex index --dry-run` | Preview what would be indexed |
| `cortex index --depth <n>` | Limit directory depth |
| `cortex index --lang <lang>` | Only index specific language |

## MCP Tools

```typescript
memory_index       // Index current project
memory_reindex     // Force re-index
memory_components  // List indexed components
```

## Testing

- [ ] Detects Node.js, Python, Rust, Go projects
- [ ] Creates component nodes for modules
- [ ] Maps import relationships correctly
- [ ] Respects .gitignore patterns
- [ ] Incremental update only processes changes
- [ ] Architecture summary is accurate
- [ ] Performance: <30s for 10k file project

## Acceptance Criteria

- [ ] `cortex index .` creates meaningful component nodes
- [ ] Relationships reflect actual code dependencies
- [ ] Architecture summary provides useful overview
- [ ] Incremental updates are fast
- [ ] Works with monorepos
- [ ] MCP tool enables Claude to trigger indexing

## Estimated Effort

- Project detection: 2 hours
- File scanner: 3 hours
- TypeScript parser: 4 hours
- Python parser: 3 hours
- Relationship mapping: 4 hours
- Architecture summary: 3 hours
- Incremental updates: 3 hours
- Testing: 3 hours
- **Total: ~25 hours**

## Dependencies

- None (enhances Milestone 3 but independent)

## References

- [tree-sitter](https://tree-sitter.github.io/tree-sitter/) for parsing
- [Sourcebot architecture](https://github.com/sourcebot-dev/sourcebot)
docs/milestones/05-daily-journal.md (new file, 249 lines)
@@ -0,0 +1,249 @@
# Milestone 5: Daily Journal

## Overview

Quick-capture system for daily notes, thoughts, and todos. Every day gets its own node, making it easy to track what happened when.

## Motivation

- Logseq's daily notes are beloved by users
- Low-friction capture encourages consistent use
- Chronological organization aids recall
- Natural integration with auto-capture

## Features

### 5.1 Daily Note Creation

```bash
# Open/create today's journal
cortex journal

# Add to today's journal
cortex journal "Fixed the auth bug"
cortex j "Quick thought"  # Alias

# View specific day
cortex journal --date 2024-01-15
cortex journal --yesterday
```

### 5.2 Quick Capture

Append to today's journal without opening it:

```bash
# Quick capture
cortex capture "Remember to update docs"
cortex c "TODO: refactor auth"  # Alias

# Capture with tags
cortex capture "API rate limiting idea" --tags api,performance

# Capture from clipboard
cortex capture --clipboard
```

### 5.3 Journal Structure

```typescript
interface JournalNode {
  kind: 'memory';
  title: string;    // "Journal: 2024-01-15"
  content: string;  // Markdown with entries
  tags: ['journal', 'daily', '2024-01', '2024'];
  metadata: {
    date: string;   // "2024-01-15"
    entries: JournalEntry[];
  };
}

interface JournalEntry {
  time: string;     // "14:30"
  text: string;
  tags?: string[];
}
```

### 5.4 Journal Linking

Auto-link journals to related nodes:

- Mentioned node IDs → `relates_to` edge
- Mentioned tags → find and link relevant nodes
- Mentioned files → link to indexed components
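A sketch of the mention extraction behind these links (the `#tag` convention and the file-extension list are assumptions, not a settled syntax; the real parser may differ):

```typescript
interface Mentions { tags: string[]; files: string[]; }

// Extract #tags and file-like paths from a journal entry's text.
function extractMentions(text: string): Mentions {
  const tags = [...text.matchAll(/#([a-z0-9-]+)/gi)].map(m => m[1].toLowerCase());
  const files = [...text.matchAll(/\b[\w./-]+\.(?:ts|js|py|rs|go|md)\b/g)].map(m => m[0]);
  return { tags, files };
}
```

Each extracted tag or file would then be resolved to existing nodes and connected with a `relates_to` edge.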
### 5.5 End-of-Day Summary

Optional AI-generated summary:

```bash
# Generate summary of today
cortex journal --summarize

# Auto-summarize at end of day (cron/scheduled task)
cortex journal --summarize --auto
```

### 5.6 Journal Search

```bash
# Search within journals
cortex journal --search "auth bug"

# List recent journals
cortex journal --list
cortex journal --list --month 2024-01
```

## Implementation

### Journal Service

```typescript
// src/core/journal.ts
export async function getOrCreateJournal(date?: string): Promise<Node> {
  const targetDate = date || formatDate(new Date());
  const title = `Journal: ${targetDate}`;

  // Find existing
  const existing = await findNodeByTitle(title);
  if (existing) return existing;

  // Create new
  return addNode({
    kind: 'memory',
    title,
    content: `# ${targetDate}\n\n`,
    tags: ['journal', 'daily', ...dateTags(targetDate)],
    metadata: { date: targetDate, entries: [] },
  });
}

export async function appendToJournal(text: string, options?: CaptureOptions): Promise<void> {
  const journal = await getOrCreateJournal();
  const time = formatTime(new Date());

  const entry: JournalEntry = {
    time,
    text,
    tags: options?.tags,
  };

  // Append to content
  const newContent = `${journal.content}\n- **${time}** ${text}`;

  // Update metadata
  const entries = [...(journal.metadata.entries || []), entry];

  await updateNode(journal.id, {
    content: newContent,
    metadata: { ...journal.metadata, entries },
  });

  // Auto-link mentioned nodes
  await linkMentionedNodes(journal.id, text);
}
```
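The `dateTags` and `formatDate` helpers used above are not defined; a minimal sketch matching the tag shape shown in 5.3:

```typescript
// "2024-01-15" -> ["2024-01", "2024"], matching the JournalNode tags in 5.3
function dateTags(date: string): string[] {
  const [year, month] = date.split('-');
  return [`${year}-${month}`, year];
}

// YYYY-MM-DD in UTC; a real implementation may prefer local time
function formatDate(d: Date): string {
  return d.toISOString().slice(0, 10);
}
```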
### Daily Summary Generator
|
||||
|
||||
```typescript
|
||||
async function generateDailySummary(date: string): Promise<string> {
|
||||
const journal = await getOrCreateJournal(date);
|
||||
if (!journal.metadata.entries?.length) return '';
|
||||
|
||||
const prompt = `
|
||||
Summarize this day's journal entries in 2-3 sentences.
|
||||
Focus on: accomplishments, decisions, blockers.
|
||||
|
||||
Entries:
|
||||
${journal.metadata.entries.map(e => `- ${e.time}: ${e.text}`).join('\n')}
|
||||
|
||||
Summary:`;
|
||||
|
||||
return generate(prompt);
|
||||
}
|
||||
```

## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex journal` | Open today's journal |
| `cortex journal <text>` | Add entry to today |
| `cortex journal --date <date>` | View/edit specific day |
| `cortex journal --yesterday` | Yesterday's journal |
| `cortex journal --list` | List recent journals |
| `cortex journal --search <query>` | Search journals |
| `cortex journal --summarize` | Generate summary |
| `cortex capture <text>` | Quick capture alias |
| `cortex c <text>` | Even shorter alias |

## MCP Tools

```typescript
memory_journal       // Get/create today's journal
memory_capture       // Quick capture to journal
memory_journal_list  // List recent journals
```

## Integration

### With Auto-Capture (Milestone 2)

Auto-captured conversations link to that day's journal:

```typescript
// In auto-capture hook
const journal = await getOrCreateJournal();
addEdge(capturedNode.id, journal.id, 'relates_to', { reason: 'same-day' });
```

### With Context Injection (Milestone 3)

Recent journal entries are included in the injected context:

```typescript
// In context gathering
const recentJournals = await getRecentJournals(3); // Last 3 days
candidates.push(...recentJournals.map(j => ({ node: j, score: 0.6, reason: 'journal' })));
```

## Testing

- [ ] Creates journal node for today
- [ ] Appends entries with timestamps
- [ ] `--date` flag accesses the correct day
- [ ] Search finds entries across journals
- [ ] Summary generation works
- [ ] Journal linked to mentioned nodes

## Acceptance Criteria

- [ ] `cortex capture "text"` takes <100ms
- [ ] Journals organized by date
- [ ] Can search across all journals
- [ ] Summary captures key points
- [ ] Integrates with auto-capture
- [ ] MCP tools work correctly

## Estimated Effort

- Journal CRUD: 3 hours
- Quick capture: 2 hours
- Search/list: 2 hours
- Summary generation: 2 hours
- Linking: 2 hours
- CLI polish: 2 hours
- Testing: 2 hours
- **Total: ~15 hours**

## Dependencies

- None (independent milestone)

## References

- [Logseq daily notes](https://docs.logseq.com/#/page/daily%20notes)
- [Obsidian daily notes plugin](https://help.obsidian.md/Plugins/Daily+notes)
264
docs/milestones/06-content-ingestion.md
Normal file
@@ -0,0 +1,264 @@
# Milestone 6: URL & Content Ingestion

## Overview

Ingest content from URLs, PDFs, and documents into the knowledge graph. Automatically chunk, summarize, and link it to existing knowledge.

## Motivation

- Knowledge exists outside the codebase (docs, articles, specs)
- Manual copy-paste is tedious and loses structure
- Supermemory's multi-source ingestion is a key feature
- Research and documentation should be first-class

## Features

### 6.1 URL Ingestion

```bash
# Ingest a webpage
cortex ingest https://docs.example.com/api

# Ingest with custom title
cortex ingest https://... --title "API Documentation"

# Ingest and tag
cortex ingest https://... --tags docs,api,reference
```

### 6.2 PDF Ingestion

```bash
# Ingest a PDF
cortex ingest ./spec.pdf

# Ingest specific pages
cortex ingest ./spec.pdf --pages 1-10

# Ingest with chunking strategy
cortex ingest ./spec.pdf --chunk-size 1000
```

### 6.3 Markdown/Text Ingestion

```bash
# Ingest markdown file
cortex ingest ./notes.md

# Ingest from stdin
cat notes.txt | cortex ingest --stdin

# Ingest clipboard
cortex ingest --clipboard
```

### 6.4 Smart Chunking

Large documents are split intelligently:

```typescript
interface ChunkStrategy {
  maxTokens: number;   // Max tokens per chunk
  overlap: number;     // Overlap between chunks
  splitOn: 'paragraph' | 'sentence' | 'heading' | 'page';
  preserveStructure: boolean;
}
```
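
As a sketch of the `'paragraph'` strategy above (token counts approximated by whitespace-separated words, an assumption since the spec does not fix a tokenizer; overlap omitted for brevity):

```typescript
// Minimal paragraph-based chunker: greedily packs paragraphs until the
// approximate token budget is exceeded, then starts a new chunk.
export function chunkByParagraph(text: string, maxTokens: number): string[] {
  const paragraphs = text.split(/\n{2,}/).filter(p => p.trim().length > 0);
  const chunks: string[] = [];
  let current: string[] = [];
  let count = 0;

  for (const para of paragraphs) {
    const tokens = para.split(/\s+/).length; // crude token estimate
    if (count + tokens > maxTokens && current.length > 0) {
      chunks.push(current.join('\n\n'));
      current = [];
      count = 0;
    }
    current.push(para);
    count += tokens;
  }
  if (current.length > 0) chunks.push(current.join('\n\n'));
  return chunks;
}
```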

### 6.5 Entity Extraction

Extract and link entities:

```typescript
interface ExtractedEntities {
  people: string[];
  organizations: string[];
  technologies: string[];
  concepts: string[];
}

// Auto-link to existing nodes with matching titles/tags
```

### 6.6 Source Tracking

Track where content came from:

```typescript
metadata: {
  source: {
    type: 'url' | 'pdf' | 'file' | 'clipboard';
    url?: string;
    filePath?: string;
    ingestedAt: number;
    checksum: string; // For deduplication
  }
}
```
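
The `checksum` field above can be computed over normalized content so that re-fetched pages with cosmetic differences still deduplicate; a sketch using Node's built-in crypto module (the whitespace-normalization rule is an assumption):

```typescript
import { createHash } from 'node:crypto';

// Collapse whitespace before hashing so re-fetches that differ only in
// trailing spaces or CRLF line endings produce the same checksum.
export function contentChecksum(content: string): string {
  const normalized = content.replace(/\s+/g, ' ').trim();
  return createHash('sha256').update(normalized).digest('hex');
}
```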

## Implementation

### Ingestion Pipeline

```typescript
// src/core/ingest/index.ts
export async function ingest(source: string, options: IngestOptions): Promise<IngestResult> {
  // Detect source type
  const sourceType = detectSourceType(source);

  // Fetch/read content
  const rawContent = await fetchContent(source, sourceType);

  // Convert to markdown
  const markdown = await convertToMarkdown(rawContent, sourceType);

  // Chunk if needed
  const chunks = chunkContent(markdown, options.chunkStrategy);

  // Create nodes
  const nodes: Node[] = [];

  if (chunks.length === 1) {
    // Single node
    const node = await createIngestNode(chunks[0], source, options);
    nodes.push(node);
  } else {
    // Parent + children
    const parent = await createParentNode(source, chunks, options);
    nodes.push(parent);

    for (const chunk of chunks) {
      const child = await createChunkNode(chunk, parent.id, options);
      nodes.push(child);
      addEdge(parent.id, child.id, 'contains');
    }
  }

  // Extract and link entities
  for (const node of nodes) {
    await extractAndLinkEntities(node);
  }

  // Find and link related nodes
  for (const node of nodes) {
    await linkRelatedNodes(node);
  }

  return { nodes: nodes.length, source: sourceType };
}
```
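
`detectSourceType` is not specified; a plausible sketch dispatches on URL scheme and file extension (the type names here are assumptions mirroring the source metadata in 6.6):

```typescript
type SourceType = 'url' | 'pdf' | 'markdown' | 'file' | 'stdin';

// Dispatch on scheme first, then extension; anything unrecognized
// falls back to the generic 'file' type.
export function detectSourceType(source: string): SourceType {
  if (/^https?:\/\//i.test(source)) return 'url';
  if (source === '-') return 'stdin';
  const ext = source.toLowerCase().split('.').pop() || '';
  if (ext === 'pdf') return 'pdf';
  if (ext === 'md' || ext === 'markdown') return 'markdown';
  return 'file';
}
```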

### URL Fetcher

```typescript
// src/core/ingest/fetchers/url.ts
export async function fetchUrl(url: string): Promise<FetchedContent> {
  const response = await fetch(url);
  const html = await response.text();

  // Use Readability to extract the main content
  const doc = new JSDOM(html);
  const reader = new Readability(doc.window.document);
  const article = reader.parse();

  return {
    title: article?.title || url,
    content: article?.textContent || '',
    html: article?.content || html,
  };
}
```

### PDF Parser

```typescript
// src/core/ingest/fetchers/pdf.ts
export async function parsePdf(filePath: string, options?: PdfOptions): Promise<ParsedPdf> {
  // Use pdf-parse or pdfjs-dist
  const dataBuffer = fs.readFileSync(filePath);
  const data = await pdfParse(dataBuffer);

  return {
    text: data.text,
    pages: data.numpages,
    metadata: data.info,
  };
}
```

### Markdown Converter

```typescript
// src/core/ingest/convert.ts
export async function convertToMarkdown(content: FetchedContent, type: SourceType): Promise<string> {
  switch (type) {
    case 'url':
      return turndown.turndown(content.html);
    case 'pdf':
      return content.text; // Already plain text
    case 'markdown':
      return content.content;
    default:
      return content.content;
  }
}
```

## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex ingest <source>` | Ingest URL, file, or path |
| `cortex ingest --clipboard` | Ingest from clipboard |
| `cortex ingest --stdin` | Ingest from stdin |
| `cortex ingest --title <title>` | Override title |
| `cortex ingest --tags <tags>` | Add tags |
| `cortex ingest --chunk-size <n>` | Set chunk size |
| `cortex ingest --no-link` | Skip auto-linking |

## MCP Tools

```typescript
memory_ingest // Ingest URL or content
memory_clip   // Quick clip from URL
```

## Testing

- [ ] URL ingestion extracts main content
- [ ] PDF parsing handles multi-page docs
- [ ] Chunking preserves context
- [ ] Entities extracted and linked
- [ ] Duplicate content detected
- [ ] Source metadata preserved

## Acceptance Criteria

- [ ] URLs ingested with readable extraction
- [ ] PDFs parsed into searchable text
- [ ] Large docs chunked intelligently
- [ ] Related nodes auto-linked
- [ ] Source tracked for reference
- [ ] Deduplication prevents duplicates

## Estimated Effort

- URL fetcher + Readability: 4 hours
- PDF parser: 4 hours
- Chunking strategy: 3 hours
- Entity extraction: 4 hours
- Auto-linking: 3 hours
- CLI commands: 2 hours
- Testing: 3 hours
- **Total: ~23 hours**

## Dependencies

- `@mozilla/readability` for URL content extraction
- `pdf-parse` or `pdfjs-dist` for PDFs
- `turndown` for HTML→Markdown

## References

- [Mozilla Readability](https://github.com/mozilla/readability)
- [LangChain document loaders](https://js.langchain.com/docs/modules/data_connection/document_loaders/)
271
docs/milestones/07-graph-visualization.md
Normal file
@@ -0,0 +1,271 @@
# Milestone 7: Graph Visualization

## Overview

Export the knowledge graph as interactive HTML visualizations, SVG images, and Mermaid diagrams. See and understand your knowledge structure.

## Motivation

- Visual understanding of knowledge structure
- Identify orphaned or disconnected nodes
- Documentation and sharing
- Obsidian-style graph view is beloved

## Features

### 7.1 Interactive HTML Export

```bash
# Export full graph as interactive HTML
cortex export --format html --output graph.html

# Export subgraph from a root node
cortex export abc123 --format html --output component-graph.html

# Export with filters
cortex export --kind component --format html
```

### 7.2 SVG/PNG Export

```bash
# Static SVG export
cortex export --format svg --output graph.svg

# PNG with custom dimensions
cortex export --format png --width 1920 --height 1080

# Export specific depth from root
cortex export abc123 --format svg --depth 3
```

### 7.3 Mermaid Diagram Export

```bash
# Export as Mermaid syntax
cortex export --format mermaid

# Output:
# ```mermaid
# graph TD
#   A[Component: Auth] --> B[Component: UserService]
#   A --> C[Decision: Use JWT]
# ```
```

### 7.4 Live Server

```bash
# Start live visualization server
cortex viz

# Opens browser with interactive graph
# Auto-refreshes on database changes
```

### 7.5 Customization

```bash
# Custom color scheme
cortex export --format html --theme dark

# Filter by tags
cortex export --format html --tags auth,security

# Layout options
cortex export --format html --layout force|tree|radial
```

## Implementation

### HTML Exporter (D3.js)

```typescript
// src/core/export/html.ts
export async function exportHtml(options: ExportOptions): Promise<string> {
  const { nodes, edges } = await getGraphData(options);

  const graphData = {
    nodes: nodes.map(n => ({
      id: n.id,
      label: n.title,
      kind: n.kind,
      tags: n.tags,
      group: kindToGroup(n.kind),
    })),
    links: edges.map(e => ({
      source: e.fromId,
      target: e.toId,
      type: e.type,
    })),
  };

  return generateHtmlTemplate(graphData, options.theme);
}

function generateHtmlTemplate(data: GraphData, theme: string): string {
  return `
<!DOCTYPE html>
<html>
<head>
  <title>Cortex Knowledge Graph</title>
  <script src="https://d3js.org/d3.v7.min.js"></script>
  <style>${getStyles(theme)}</style>
</head>
<body>
  <div id="graph"></div>
  <div id="sidebar"></div>
  <script>
    const data = ${JSON.stringify(data)};
    ${getD3Script()}
  </script>
</body>
</html>`;
}
```

### D3 Force Graph

```typescript
const D3_SCRIPT = `
const width = window.innerWidth;
const height = window.innerHeight;

const svg = d3.select("#graph")
  .append("svg")
  .attr("width", width)
  .attr("height", height);

const simulation = d3.forceSimulation(data.nodes)
  .force("link", d3.forceLink(data.links).id(d => d.id).distance(100))
  .force("charge", d3.forceManyBody().strength(-300))
  .force("center", d3.forceCenter(width / 2, height / 2));

// ... link and node rendering
// ... drag behavior
// ... zoom behavior
// ... click to show details
`;
```

### Mermaid Exporter

```typescript
// src/core/export/mermaid.ts
export async function exportMermaid(options: ExportOptions): Promise<string> {
  const { nodes, edges } = await getGraphData(options);

  const lines = ['graph TD'];

  // Define nodes
  for (const node of nodes) {
    const shape = kindToShape(node.kind);
    lines.push(`  ${shortId(node.id)}${shape.open}"${escape(node.title)}"${shape.close}`);
  }

  // Define edges
  for (const edge of edges) {
    const arrow = typeToArrow(edge.type);
    lines.push(`  ${shortId(edge.fromId)} ${arrow} ${shortId(edge.toId)}`);
  }

  return lines.join('\n');
}

function kindToShape(kind: string): { open: string; close: string } {
  switch (kind) {
    case 'component': return { open: '[', close: ']' };
    case 'decision':  return { open: '{', close: '}' };
    case 'task':      return { open: '([', close: '])' };
    default:          return { open: '(', close: ')' };
  }
}
```
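
The `shortId` and `escape` helpers are left undefined above; minimal sketches (the truncation length and escaping rule are assumptions — Mermaid accepts HTML entity codes such as `#quot;` inside node labels):

```typescript
// Mermaid node ids should be short and alphanumeric; strip punctuation
// from the node id and prefix a letter so ids never start with a digit.
function shortId(id: string): string {
  return 'n' + id.replace(/[^a-zA-Z0-9]/g, '').slice(0, 8);
}

// Escape double quotes in titles using Mermaid's entity-code syntax.
function escape(title: string): string {
  return title.replace(/"/g, '#quot;');
}
```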

### Live Visualization Server

```typescript
// src/server/viz.ts
export function startVizServer(port: number): void {
  const app = express();

  app.get('/', async (req, res) => {
    const html = await exportHtml({ theme: 'dark', layout: 'force' });
    res.send(html);
  });

  app.get('/api/graph', async (req, res) => {
    const data = await getGraphData({});
    res.json(data);
  });

  // Start the HTTP server first so the WebSocket server can attach to it
  const server = app.listen(port);

  // WebSocket for live updates
  const wss = new WebSocketServer({ server });
  watchDatabase((change) => {
    wss.clients.forEach(client => {
      client.send(JSON.stringify({ type: 'update', data: change }));
    });
  });

  open(`http://localhost:${port}`);
}
```

## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex export --format html` | Export interactive HTML |
| `cortex export --format svg` | Export static SVG |
| `cortex export --format png` | Export PNG image |
| `cortex export --format mermaid` | Export Mermaid syntax |
| `cortex export <id> --depth <n>` | Export subgraph |
| `cortex viz` | Start live visualization server |
| `cortex viz --port <n>` | Custom port |

## MCP Tools

```typescript
memory_export    // Export graph in specified format
memory_visualize // Get visualization URL
```

## Testing

- [ ] HTML export renders correctly in browsers
- [ ] SVG export produces valid SVG
- [ ] Mermaid syntax is valid
- [ ] Subgraph export respects depth
- [ ] Live server updates on changes
- [ ] Large graphs (1000+ nodes) perform well

## Acceptance Criteria

- [ ] Interactive HTML with zoom, pan, search
- [ ] Click node to see details
- [ ] Color-coded by node kind
- [ ] Edge types visually distinct
- [ ] Mermaid compatible with GitHub/docs
- [ ] Live server auto-refreshes

## Estimated Effort

- HTML + D3 template: 6 hours
- SVG export: 3 hours
- Mermaid export: 2 hours
- Live server: 4 hours
- Theming: 2 hours
- Testing: 3 hours
- **Total: ~20 hours**

## Dependencies

- D3.js (bundled in HTML output)
- Optional: Puppeteer for PNG export

## References

- [D3.js Force Graph](https://observablehq.com/@d3/force-directed-graph)
- [Mermaid.js](https://mermaid.js.org/)
- [Obsidian Graph View](https://help.obsidian.md/Plugins/Graph+view)
284
docs/milestones/08-import-export.md
Normal file
@@ -0,0 +1,284 @@
# Milestone 8: Import/Export

## Overview

Import from and export to popular formats: Obsidian vaults, Markdown folders, JSON-LD, and more. Never be locked into Cortex.

## Motivation

- Users have existing knowledge in other tools
- Portability prevents lock-in
- Enables backup and migration
- Standard formats enable interoperability

## Features

### 8.1 Obsidian Import

```bash
# Import Obsidian vault
cortex import obsidian ~/vaults/notes

# Import with tag mapping
cortex import obsidian ~/vaults/notes --map-tags

# Preview without importing
cortex import obsidian ~/vaults/notes --dry-run
```

Handles:
- Wikilinks `[[Page Name]]` → edges
- Tags `#tag` → node tags
- YAML frontmatter → metadata
- Folder structure → hierarchy

### 8.2 Markdown Folder Import

```bash
# Import markdown folder
cortex import markdown ./docs

# Respect folder hierarchy
cortex import markdown ./docs --hierarchy

# Set kind for all imported nodes
cortex import markdown ./docs --kind memory
```

### 8.3 Markdown Export

```bash
# Export to markdown folder
cortex export-md ./output

# Export with frontmatter
cortex export-md ./output --frontmatter

# Export specific nodes
cortex export-md ./output --kind component
```

### 8.4 JSON-LD Export

```bash
# Export as JSON-LD (linked data)
cortex export --format jsonld --output knowledge.jsonld

# Compatible with knowledge graph standards
```

### 8.5 Full Backup/Restore

```bash
# Backup entire database
cortex backup ./backup-2024-01-15.cortex

# Restore from backup
cortex restore ./backup-2024-01-15.cortex

# Auto-backup (cron-friendly)
cortex backup --auto --keep 7  # Keep last 7 backups
```

### 8.6 Notion Import (Future)

```bash
# Import from Notion export
cortex import notion ./notion-export
```

## Implementation

### Obsidian Importer

```typescript
// src/core/import/obsidian.ts
export async function importObsidian(vaultPath: string, options: ImportOptions): Promise<ImportResult> {
  const files = await glob(`${vaultPath}/**/*.md`);
  const nodes: Map<string, Node> = new Map();
  const pendingLinks: Array<{ from: string; to: string; type: EdgeType }> = [];

  // First pass: create nodes
  for (const file of files) {
    const content = await fs.readFile(file, 'utf-8');
    const { frontmatter, body } = parseFrontmatter(content);

    const title = path.basename(file, '.md');
    const relativePath = path.relative(vaultPath, file);

    const node = await addNode({
      kind: frontmatter.kind || 'memory',
      title,
      content: body,
      tags: extractTags(content, frontmatter.tags),
      status: frontmatter.status,
      metadata: {
        ...frontmatter,
        importedFrom: 'obsidian',
        originalPath: relativePath,
      },
    });

    nodes.set(title, node);
  }

  // Second pass: create links
  for (const file of files) {
    const content = await fs.readFile(file, 'utf-8');
    const title = path.basename(file, '.md');
    const sourceNode = nodes.get(title);

    // Find wikilinks [[Target]]
    const wikilinks = content.matchAll(/\[\[([^\]]+)\]\]/g);
    for (const match of wikilinks) {
      const targetTitle = match[1].split('|')[0]; // Handle aliases [[Page|Alias]]
      const targetNode = nodes.get(targetTitle);

      if (sourceNode && targetNode) {
        addEdge(sourceNode.id, targetNode.id, 'relates_to');
      }
    }
  }

  // Third pass: folder hierarchy
  if (options.hierarchy) {
    await createFolderHierarchy(vaultPath, nodes);
  }

  return { imported: nodes.size };
}
```
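
The `extractTags` helper above merges inline `#tags` with frontmatter tags; a sketch (the tag regex is an assumption — Obsidian also allows nested tags like `#a/b`, which this pattern accepts):

```typescript
// Collect inline #tags from the body and merge them with frontmatter tags,
// deduplicating while preserving first-seen order.
export function extractTags(content: string, frontmatterTags?: string[]): string[] {
  const inline = [...content.matchAll(/(^|\s)#([\w/-]+)/g)].map(m => m[2]);
  return [...new Set([...(frontmatterTags || []), ...inline])];
}
```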

### Markdown Exporter

```typescript
// src/core/export/markdown.ts
export async function exportMarkdown(outputDir: string, options: ExportOptions): Promise<void> {
  const nodes = await listNodes(options.filter);

  await fs.mkdir(outputDir, { recursive: true });

  for (const node of nodes) {
    const filename = sanitizeFilename(node.title) + '.md';
    const filepath = path.join(outputDir, filename);

    let content = '';

    // Add frontmatter
    if (options.frontmatter) {
      content += '---\n';
      content += `id: ${node.id}\n`;
      content += `kind: ${node.kind}\n`;
      content += `tags: [${node.tags.join(', ')}]\n`;
      if (node.status) content += `status: ${node.status}\n`;
      content += `created: ${new Date(node.createdAt).toISOString()}\n`;
      content += '---\n\n';
    }

    // Add title
    content += `# ${node.title}\n\n`;

    // Add content
    content += node.content || '';

    // Add linked nodes as wikilinks
    const connections = getConnections(node.id);
    if (connections.outgoing.length > 0) {
      content += '\n\n## Related\n\n';
      for (const conn of connections.outgoing) {
        content += `- [[${conn.node.title}]] (${conn.type})\n`;
      }
    }

    await fs.writeFile(filepath, content);
  }
}
```
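
`sanitizeFilename` is assumed above; a sketch replacing characters that are illegal on common filesystems (the replacement character is a choice, not part of the spec):

```typescript
// Replace path separators and other characters that Windows/macOS/Linux
// reject in filenames, then strip trailing dots and spaces.
export function sanitizeFilename(title: string): string {
  return title
    .replace(/[/\\:*?"<>|]/g, '-')
    .replace(/\s+/g, ' ')
    .replace(/[. ]+$/, '')
    .trim();
}
```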

### JSON-LD Exporter

```typescript
// src/core/export/jsonld.ts
export async function exportJsonLd(options: ExportOptions): Promise<object> {
  const nodes = await listNodes(options.filter);
  const edges = await listEdges();

  return {
    '@context': {
      '@vocab': 'https://schema.org/',
      'cortex': 'https://cortex.memory/',
      'relates_to': 'cortex:relatesTo',
      'contains': 'cortex:contains',
      'depends_on': 'cortex:dependsOn',
    },
    '@graph': nodes.map(node => ({
      '@id': `cortex:node/${node.id}`,
      '@type': kindToSchemaType(node.kind),
      'name': node.title,
      'description': node.content,
      'keywords': node.tags,
      'dateCreated': new Date(node.createdAt).toISOString(),
      ...edgesToRelations(node.id, edges),
    })),
  };
}
```
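
`kindToSchemaType` maps node kinds onto schema.org types; a sketch, where the specific type choices are assumptions (`CreativeWork` is a safe fallback for anything unmapped):

```typescript
// Map Cortex node kinds to rough schema.org equivalents for the @type field.
// The mapping below is illustrative, not prescribed by the spec.
function kindToSchemaType(kind: string): string {
  switch (kind) {
    case 'component': return 'SoftwareSourceCode';
    case 'task':      return 'Action';
    default:          return 'CreativeWork';
  }
}
```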

## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex import obsidian <path>` | Import Obsidian vault |
| `cortex import markdown <path>` | Import markdown folder |
| `cortex import notion <path>` | Import Notion export |
| `cortex export-md <path>` | Export to markdown |
| `cortex export --format jsonld` | Export as JSON-LD |
| `cortex backup <path>` | Backup database |
| `cortex restore <path>` | Restore from backup |

## MCP Tools

```typescript
memory_import // Import from file/path
memory_export // Export to format
memory_backup // Create backup
```

## Testing

- [ ] Obsidian wikilinks become edges
- [ ] Tags extracted correctly
- [ ] Frontmatter preserved
- [ ] Folder hierarchy respected
- [ ] Markdown export produces valid files
- [ ] JSON-LD validates against schema
- [ ] Backup/restore preserves all data

## Acceptance Criteria

- [ ] Import Obsidian vault with links intact
- [ ] Export to Obsidian-compatible markdown
- [ ] Round-trip preserves data
- [ ] JSON-LD compatible with knowledge graph tools
- [ ] Backup is single file, easily portable
- [ ] Large vaults (10k+ files) import in <5 minutes

## Estimated Effort

- Obsidian importer: 6 hours
- Markdown importer: 3 hours
- Markdown exporter: 3 hours
- JSON-LD exporter: 3 hours
- Backup/restore: 2 hours
- Testing: 4 hours
- **Total: ~21 hours**

## Dependencies

- `gray-matter` for frontmatter parsing

## References

- [Obsidian file format](https://help.obsidian.md/Files+and+folders/Accepted+file+formats)
- [JSON-LD specification](https://json-ld.org/)
- [Schema.org](https://schema.org/)
294
docs/milestones/09-multi-graph.md
Normal file
@@ -0,0 +1,294 @@
# Milestone 9: Multi-Graph Support

## Overview

Support multiple separate knowledge graphs, one per project or context. Switch between them seamlessly.

## Motivation

- Different projects have different knowledge domains
- Prevents cross-contamination of contexts
- Enables project-specific memory without noise
- Clean separation of concerns

## Features

### 9.1 Graph Management

```bash
# List all graphs
cortex graphs

# Create new graph
cortex graphs create work
cortex graphs create personal

# Switch active graph
cortex use work
cortex use personal

# Delete graph
cortex graphs delete old-project
```

### 9.2 Automatic Project Detection

```bash
# Auto-detect based on .cortex file or git remote
cd ~/projects/myapp
cortex query "auth"  # Automatically uses 'myapp' graph

# Explicit override
cortex query "auth" --graph personal
```

### 9.3 Cross-Graph Search

```bash
# Search across all graphs
cortex query "auth" --all-graphs

# Search specific graphs
cortex query "auth" --graphs work,personal
```

### 9.4 Graph Linking

```bash
# Link nodes across graphs
cortex link abc123 --to def456 --graph work

# Reference external graph nodes
cortex show work:abc123  # graph:nodeId syntax
```

### 9.5 Graph Configuration

```bash
# Set default graph
cortex config set default-graph work

# Project-specific graph mapping
# In .cortex.json or .cortex file:
{
  "graph": "myapp"
}
```

## Implementation

### Storage Structure

```
~/.cortex/
├── graphs/
│   ├── default/
│   │   └── cortex.db
│   ├── work/
│   │   └── cortex.db
│   └── personal/
│       └── cortex.db
├── config.json
└── graph-links.db  # Cross-graph references
```

### Graph Manager

```typescript
// src/core/graphs.ts
export interface GraphInfo {
  name: string;
  path: string;
  nodeCount: number;
  lastAccessed: number;
  createdAt: number;
}

export function listGraphs(): GraphInfo[] {
  const graphsDir = path.join(getConfigDir(), 'graphs');
  const dirs = fs.readdirSync(graphsDir);

  return dirs.map(name => ({
    name,
    path: path.join(graphsDir, name),
    ...getGraphStats(path.join(graphsDir, name, 'cortex.db')),
  }));
}

export function createGraph(name: string): void {
  const graphPath = path.join(getConfigDir(), 'graphs', name);
  fs.mkdirSync(graphPath, { recursive: true });
  initializeDatabase(path.join(graphPath, 'cortex.db'));
}

export function useGraph(name: string): void {
  const graphPath = path.join(getConfigDir(), 'graphs', name);
  if (!fs.existsSync(graphPath)) {
    throw new Error(`Graph '${name}' does not exist`);
  }
  setActiveGraph(name);
}

export function getActiveGraph(): string {
  // Check project-local .cortex file
  const localConfig = findLocalConfig();
  if (localConfig?.graph) return localConfig.graph;

  // Check git remote for auto-detection
  const gitRemote = getGitRemote();
  if (gitRemote) {
    const projectName = extractProjectName(gitRemote);
    if (graphExists(projectName)) return projectName;
  }

  // Fall back to default
  return getConfig().defaultGraph || 'default';
}
```
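
`extractProjectName` above turns a git remote URL into a graph name; a sketch handling both SSH and HTTPS remotes (the convention that the repo name minus `.git` is the graph name is an assumption):

```typescript
// 'git@github.com:user/myapp.git' and 'https://github.com/user/myapp.git'
// both yield 'myapp'.
export function extractProjectName(remote: string): string {
  const last = remote.split('/').pop() || remote;
  // SSH remotes without a path slash keep the 'host:repo' form
  const afterColon = last.includes(':') ? last.split(':').pop()! : last;
  return afterColon.replace(/\.git$/, '');
}
```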

### Database Connection Manager

```typescript
// src/core/db.ts
let activeDb: Database | null = null;
let activeGraph: string = 'default';

export function getDb(): Database {
  const currentGraph = getActiveGraph();

  if (activeDb && activeGraph === currentGraph) {
    return activeDb;
  }

  // Close previous connection
  if (activeDb) {
    activeDb.close();
  }

  // Open new connection
  const dbPath = path.join(getConfigDir(), 'graphs', currentGraph, 'cortex.db');
  activeDb = new Database(dbPath);
  activeGraph = currentGraph;

  return activeDb;
}
```

### Project Detection

```typescript
// src/core/graphs.ts
function findLocalConfig(): LocalConfig | null {
  // Walk up directory tree looking for .cortex or .cortex.json
  let dir = process.cwd();
  while (dir !== path.dirname(dir)) {
    const configPath = path.join(dir, '.cortex');
    if (fs.existsSync(configPath)) {
      return JSON.parse(fs.readFileSync(configPath, 'utf-8'));
    }
    const jsonPath = path.join(dir, '.cortex.json');
    if (fs.existsSync(jsonPath)) {
      return JSON.parse(fs.readFileSync(jsonPath, 'utf-8'));
    }
    dir = path.dirname(dir);
  }
  return null;
}
```

### Cross-Graph References

```typescript
// Parse graph:nodeId syntax
function parseNodeRef(ref: string): { graph?: string; nodeId: string } {
  const sep = ref.indexOf(':');
  if (sep !== -1) {
    // Split on the first ':' only, so node IDs containing ':' survive
    return { graph: ref.slice(0, sep), nodeId: ref.slice(sep + 1) };
  }
  return { nodeId: ref };
}

// Get node from any graph
async function getNodeFromAnyGraph(ref: string): Promise<Node | null> {
  const { graph, nodeId } = parseNodeRef(ref);
  if (graph) {
    const originalGraph = getActiveGraph();
    useGraph(graph);
    try {
      return await getNode(nodeId);
    } finally {
      // Always restore the previous graph, even if the lookup throws
      useGraph(originalGraph);
    }
  }
  return getNode(nodeId);
}
```

## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex graphs` | List all graphs |
| `cortex graphs create <name>` | Create new graph |
| `cortex graphs delete <name>` | Delete graph |
| `cortex use <name>` | Switch active graph |
| `cortex <cmd> --graph <name>` | Run command on specific graph |
| `cortex query --all-graphs` | Search all graphs |

## MCP Tools

```typescript
memory_graphs    // List available graphs
memory_use_graph // Switch active graph
memory_query     // Add optional 'graph' parameter
```

## Project Config File

```json
// .cortex or .cortex.json in project root
{
  "graph": "myproject",
  "autoCapture": true,
  "contextInjection": {
    "maxTokens": 4000,
    "includeTasks": true
  }
}
```

## Testing

- [ ] Create/list/delete graphs
- [ ] Switch between graphs
- [ ] Auto-detect from .cortex file
- [ ] Auto-detect from git remote
- [ ] Cross-graph search works
- [ ] Cross-graph references resolve
- [ ] Concurrent access handles graph switching

## Acceptance Criteria

- [ ] Each project can have an isolated graph
- [ ] Auto-detection based on directory
- [ ] Cross-graph search available
- [ ] Graph switching is fast (<100ms)
- [ ] No data leakage between graphs
- [ ] MCP tools respect active graph

## Estimated Effort

- Storage structure: 2 hours
- Graph manager: 4 hours
- DB connection manager: 2 hours
- Project detection: 3 hours
- Cross-graph search: 3 hours
- CLI commands: 2 hours
- Testing: 3 hours
- **Total: ~19 hours**

## Dependencies

- None (can implement independently)

## References

- [Git worktrees](https://git-scm.com/docs/git-worktree) for multi-project inspiration
299
docs/milestones/10-smart-retrieval.md
Normal file
@@ -0,0 +1,299 @@

# Milestone 10: Smart Retrieval

## Overview

Context-aware, git-integrated search that understands what you're working on and retrieves the most relevant memories.

## Motivation

- Current search requires explicit queries
- Relevance should consider current context
- Git changes indicate what's important right now
- Reduce cognitive load of "what should I search for?"

## Features

### 10.1 Context-Aware Search

```bash
# Search based on current context (files, git status)
cortex smart-search
cortex ss # Alias

# Combine with explicit query
cortex smart-search "authentication"
```

### 10.2 Git-Integrated Relevance

```typescript
interface GitContext {
  branch: string;
  recentCommits: string[];   // Last 5 commit messages
  modifiedFiles: string[];   // Uncommitted changes
  stagedFiles: string[];
  recentlyTouched: string[]; // Files in recent commits
}
```

### 10.3 File-Based Relevance

```typescript
// Boost nodes related to currently open/modified files
function getFileContext(): FileContext {
  return {
    cwd: process.cwd(),
    modifiedFiles: getGitModified(),
    recentFiles: getRecentlyAccessed(),
    projectType: detectProjectType(),
  };
}
```

### 10.4 Time-Aware Boosting

```typescript
interface TimeBoost {
  lastHour: 1.5; // Accessed in last hour
  lastDay: 1.3;  // Accessed today
  lastWeek: 1.1; // Accessed this week
  older: 1.0;    // No boost
}
```

### 10.5 Related Node Expansion

```bash
# Find nodes and expand to related
cortex query "auth" --expand

# Returns auth nodes + nodes they link to
```

### 10.6 "What Should I Know?" Command

```bash
# Proactive: what's relevant right now?
cortex context
cortex what # Alias

# Returns:
# - Memories related to current git branch
# - Decisions about modified files
# - Open tasks for this project
# - Recent work in this area
```

## Implementation

### Smart Search Pipeline

```typescript
// src/core/search/smart.ts
export async function smartSearch(explicitQuery?: string): Promise<SearchResult[]> {
  // Gather context signals
  const gitContext = await getGitContext();
  const fileContext = await getFileContext();
  const projectContext = await getProjectContext();

  // Build implicit query from context
  const implicitQuery = buildImplicitQuery(gitContext, fileContext, projectContext);

  // Combine with explicit query
  const combinedQuery = explicitQuery
    ? `${explicitQuery} ${implicitQuery}`
    : implicitQuery;

  // Run hybrid search
  const results = await query(combinedQuery, { limit: 50 });

  // Re-rank based on context signals
  const reranked = rerankResults(results, {
    gitContext,
    fileContext,
    timeBoosts: TIME_BOOSTS,
  });

  return reranked.slice(0, 20);
}

function buildImplicitQuery(git: GitContext, file: FileContext, project: ProjectContext): string {
  const parts: string[] = [];

  // Add branch name (often contains feature/ticket info)
  if (git.branch && git.branch !== 'main' && git.branch !== 'master') {
    parts.push(git.branch.replace(/[-_\/]/g, ' '));
  }

  // Add recent commit messages
  parts.push(...git.recentCommits.slice(0, 3));

  // Add modified file names (without extension)
  parts.push(...file.modifiedFiles.map(f => path.basename(f, path.extname(f))));

  // Add project name
  parts.push(project.name);

  return parts.join(' ');
}
```

### Git Context Extractor

```typescript
// src/core/search/git-context.ts
export async function getGitContext(): Promise<GitContext> {
  try {
    const branch = execSync('git branch --show-current', { encoding: 'utf-8' }).trim();
    const recentCommits = execSync('git log --oneline -5', { encoding: 'utf-8' })
      .trim()
      .split('\n')
      .map(line => line.split(' ').slice(1).join(' '));
    const modifiedFiles = execSync('git diff --name-only', { encoding: 'utf-8' })
      .trim()
      .split('\n')
      .filter(Boolean);
    const stagedFiles = execSync('git diff --staged --name-only', { encoding: 'utf-8' })
      .trim()
      .split('\n')
      .filter(Boolean);

    return { branch, recentCommits, modifiedFiles, stagedFiles };
  } catch {
    return { branch: '', recentCommits: [], modifiedFiles: [], stagedFiles: [] };
  }
}
```

### Re-ranking Algorithm

```typescript
function rerankResults(
  results: SearchResult[],
  context: RerankContext
): SearchResult[] {
  return results
    .map(result => {
      let boost = 1.0;

      // Time boost
      const age = Date.now() - result.node.lastAccessedAt;
      if (age < 60 * 60 * 1000) boost *= 1.5; // Last hour
      else if (age < 24 * 60 * 60 * 1000) boost *= 1.3; // Last day
      else if (age < 7 * 24 * 60 * 60 * 1000) boost *= 1.1; // Last week

      // File relevance boost
      const nodeFiles = result.node.metadata?.files || [];
      const overlap = nodeFiles.filter(f =>
        context.gitContext.modifiedFiles.includes(f)
      ).length;
      if (overlap > 0) boost *= 1.0 + (0.2 * overlap);

      // Branch relevance boost
      if (result.node.tags.includes(context.gitContext.branch)) {
        boost *= 1.3;
      }

      return { ...result, score: result.score * boost };
    })
    .sort((a, b) => b.score - a.score);
}
```

### "What Should I Know?" Command

```typescript
// src/cli/commands/what.ts
export async function whatCommand(): Promise<void> {
  const context = await gatherFullContext();

  console.log(chalk.bold('📚 What you should know:\n'));

  // Related to current branch
  if (context.branchRelated.length > 0) {
    console.log(chalk.cyan('Branch: ' + context.gitContext.branch));
    for (const node of context.branchRelated.slice(0, 3)) {
      console.log(`  • ${node.title}`);
    }
    console.log();
  }

  // Related to modified files
  if (context.fileRelated.length > 0) {
    console.log(chalk.cyan('Related to changes:'));
    for (const node of context.fileRelated.slice(0, 3)) {
      console.log(`  • ${node.title}`);
    }
    console.log();
  }

  // Open tasks
  if (context.tasks.length > 0) {
    console.log(chalk.cyan('Open tasks:'));
    for (const task of context.tasks.slice(0, 3)) {
      console.log(`  • ${task.title}`);
    }
    console.log();
  }

  // Recent decisions
  if (context.decisions.length > 0) {
    console.log(chalk.cyan('Recent decisions:'));
    for (const decision of context.decisions.slice(0, 3)) {
      console.log(`  • ${decision.title}`);
    }
  }
}
```

## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex smart-search [query]` | Context-aware search |
| `cortex ss [query]` | Alias |
| `cortex what` | What should I know right now? |
| `cortex query --expand` | Expand to related nodes |

## MCP Tools

```typescript
memory_smart_search // Context-aware search
memory_what         // Proactive context retrieval
```

## Testing

- [ ] Git context extraction works
- [ ] Modified files boost relevant nodes
- [ ] Branch name improves relevance
- [ ] Time boosting works correctly
- [ ] "What" command returns useful results
- [ ] Works when not in git repo (graceful fallback)

## Acceptance Criteria

- [ ] Smart search outperforms basic search in relevance
- [ ] Git integration provides useful signals
- [ ] "What" command surfaces actionable info
- [ ] No performance regression
- [ ] Works in non-git directories

## Estimated Effort

- Git context extraction: 2 hours
- Smart search pipeline: 4 hours
- Re-ranking algorithm: 3 hours
- "What" command: 3 hours
- MCP tools: 2 hours
- Testing: 3 hours
- **Total: ~17 hours**

## Dependencies

- Git (optional, graceful fallback)

## References

- [Semantic search best practices](https://www.anthropic.com/research/rag)
- [Learning to rank](https://en.wikipedia.org/wiki/Learning_to_rank)
267
docs/milestones/11-tui-dashboard.md
Normal file
@@ -0,0 +1,267 @@

# Milestone 11: TUI Dashboard

## Overview

Interactive terminal user interface for browsing, searching, and managing the knowledge graph without leaving the terminal.

## Motivation

- CLI commands require knowing what to type
- Visual browsing aids discovery
- Power users prefer staying in terminal
- Real-time feedback on graph state

## Features

### 11.1 Main Dashboard

```bash
# Launch interactive TUI
cortex tui
cortex ui # Alias
```

Layout:
```
┌─────────────────────────────────────────────────────────────────┐
│ CORTEX Graph: work│
├────────────────────┬────────────────────────────────────────────┤
│ 📚 Recent (5) │ 🔍 Search: ________________________ │
│ ├─ Auth flow... │ │
│ ├─ API design... │ ┌─────────────────────────────────────┐ │
│ └─ DB schema... │ │ Auth flow design │ │
│ │ │ ───────────────────── │ │
│ 📋 Tasks (3) │ │ OAuth2 PKCE flow for SPA │ │
│ ├─ Fix login bug │ │ │ │
│ └─ Update docs │ │ Tags: auth, security, oauth │ │
│ │ │ Status: active │ │
│ 🔗 Components (8) │ │ │ │
│ ├─ UserService │ │ ── Connections ── │ │
│ └─ AuthService │ │ → implements: OAuth2 spec │ │
│ │ │ → contains: Token refresh logic │ │
│ 🎯 Decisions (4) │ └─────────────────────────────────────┘ │
│ └─ Use JWT... │ │
├────────────────────┴────────────────────────────────────────────┤
│ [/]Search [n]New [e]Edit [d]Delete [l]Link [q]Quit │
└─────────────────────────────────────────────────────────────────┘
```

### 11.2 Navigation

- Arrow keys / vim keys (hjkl) to navigate
- Enter to select/expand
- Tab to switch panels
- `/` to search
- `?` for help

### 11.3 Quick Actions

| Key | Action |
|-----|--------|
| `n` | New node |
| `e` | Edit selected |
| `d` | Delete selected |
| `l` | Link to another node |
| `t` | Add tags |
| `s` | Change status |
| `/` | Search |
| `g` | Graph view (ASCII) |
| `q` | Quit |

### 11.4 Search Mode

Real-time search as you type:
```
┌─────────────────────────────────────────────────────────────────┐
│ 🔍 Search: auth_ │
├─────────────────────────────────────────────────────────────────┤
│ 1. [component] AuthService (0.95) │
│ 2. [decision] Use JWT tokens (0.87) │
│ 3. [memory] Auth flow design (0.82) │
│ 4. [task] Fix auth bug (0.75) │
└─────────────────────────────────────────────────────────────────┘
```

### 11.5 Graph View Mode

ASCII visualization of connections:
```
┌─────────────────────────────────────────────────────────────────┐
│ Graph View: AuthService │
├─────────────────────────────────────────────────────────────────┤
│ │
│ ┌──────────────┐ │
│ depends_on │ UserService │ │
│ ┌───────►│ │ │
│ │ └──────────────┘ │
│ ┌────────┴─────┐ │
│ │ AuthService │ │
│ │ │──────────┐ │
│ └──────────────┘ │ implements │
│ │ ▼ │
│ │ ┌──────────────┐ │
│ contains│ │ OAuth2 spec │ │
│ ▼ └──────────────┘ │
│ ┌──────────────┐ │
│ │Token refresh │ │
│ └──────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────┘
```

### 11.6 Edit Mode

Inline editing with validation:
```
┌─────────────────────────────────────────────────────────────────┐
│ Edit: AuthService │
├─────────────────────────────────────────────────────────────────┤
│ Title: [AuthService ] │
│ Kind: [component ▼] │
│ Status: [active ▼] │
│ Tags: [backend, auth, service ] │
│ │
│ Content: │
│ ┌─────────────────────────────────────────────────────────┐ │
│ │ Handles all authentication operations including: │ │
│ │ - Login/logout │ │
│ │ - Token management │ │
│ │ - Session handling │ │
│ └─────────────────────────────────────────────────────────┘ │
│ │
│ [Tab]Next field [Ctrl+S]Save [Esc]Cancel │
└─────────────────────────────────────────────────────────────────┘
```

## Implementation

### Technology Choice

Use **Ink** (React for CLI) or **blessed-contrib**:

```typescript
// Using Ink
import React, { useState, useEffect } from 'react';
import { render, Box, Text, useInput } from 'ink';

const Dashboard: React.FC = () => {
  const [nodes, setNodes] = useState<Node[]>([]);
  const [selected, setSelected] = useState(0);
  const [mode, setMode] = useState<'browse' | 'search' | 'edit'>('browse');

  useInput((input, key) => {
    if (key.upArrow || input === 'k') setSelected(s => Math.max(0, s - 1));
    if (key.downArrow || input === 'j') setSelected(s => Math.min(nodes.length - 1, s + 1));
    if (input === '/') setMode('search');
    if (input === 'q') process.exit(0);
  });

  return (
    <Box flexDirection="column">
      <Header graph={activeGraph} />
      <Box flexDirection="row">
        <Sidebar nodes={nodes} selected={selected} />
        <DetailPanel node={nodes[selected]} />
      </Box>
      <StatusBar mode={mode} />
    </Box>
  );
};
```

### Component Structure

```
src/tui/
├── index.tsx            # Entry point
├── components/
│   ├── Dashboard.tsx    # Main layout
│   ├── Sidebar.tsx      # Node list
│   ├── DetailPanel.tsx  # Node details
│   ├── SearchBox.tsx    # Search input
│   ├── GraphView.tsx    # ASCII graph
│   ├── EditForm.tsx     # Edit mode
│   └── StatusBar.tsx    # Bottom bar
├── hooks/
│   ├── useNodes.ts      # Node data fetching
│   ├── useSearch.ts     # Search state
│   └── useKeyboard.ts   # Key handling
└── utils/
    └── ascii-graph.ts   # ASCII graph renderer
```

### ASCII Graph Renderer

```typescript
// src/tui/utils/ascii-graph.ts
export function renderAsciiGraph(rootId: string, depth: number = 3): string[] {
  const lines: string[] = [];
  const node = getNode(rootId);
  if (!node) return lines;

  const connections = getConnections(rootId);

  // Border width = title length + 2 (one space of padding each side)
  lines.push(`┌${'─'.repeat(node.title.length + 2)}┐`);
  lines.push(`│ ${node.title} │`);
  lines.push(`└${'─'.repeat(node.title.length + 2)}┘`);

  for (const conn of connections.outgoing) {
    lines.push(`   │ ${conn.type}`);
    lines.push(`   ▼`);
    lines.push(`┌${'─'.repeat(conn.node.title.length + 2)}┐`);
    lines.push(`│ ${conn.node.title} │`);
    lines.push(`└${'─'.repeat(conn.node.title.length + 2)}┘`);
  }

  return lines;
}
```

## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex tui` | Launch interactive TUI |
| `cortex ui` | Alias |

## Testing

- [ ] Navigation works smoothly
- [ ] Search updates in real-time
- [ ] Edit mode saves correctly
- [ ] Graph view renders correctly
- [ ] Handles large node lists
- [ ] Vim keybindings work
- [ ] Color scheme readable

## Acceptance Criteria

- [ ] Can browse all nodes visually
- [ ] Search is responsive (<100ms)
- [ ] Edit/create nodes without leaving TUI
- [ ] Graph view shows connections
- [ ] Keyboard-only navigation
- [ ] Works in standard terminals

## Estimated Effort

- Ink/blessed setup: 3 hours
- Sidebar component: 3 hours
- Detail panel: 3 hours
- Search component: 3 hours
- Edit form: 4 hours
- Graph view: 4 hours
- Polish/testing: 4 hours
- **Total: ~24 hours**

## Dependencies

- `ink` or `blessed-contrib`
- `react` (for Ink)

## References

- [Ink - React for CLI](https://github.com/vadimdemedes/ink)
- [blessed-contrib](https://github.com/yaronn/blessed-contrib)
- [Lazygit TUI](https://github.com/jesseduffield/lazygit) for inspiration
339
docs/milestones/12-browser-extension.md
Normal file
@@ -0,0 +1,339 @@

# Milestone 12: Browser Extension

## Overview

Browser extension to save content from any webpage directly to Cortex. One-click capture for research and documentation.

## Motivation

- Web research is a major knowledge source
- Manual copy-paste is friction
- Supermemory's browser extension is popular
- Capture context while browsing

## Features

### 12.1 Quick Save

Right-click any page → "Save to Cortex"

### 12.2 Selection Save

Select text → Right-click → "Save selection to Cortex"

### 12.3 Popup Interface

```
┌─────────────────────────────────────────┐
│ 🧠 Save to Cortex │
├─────────────────────────────────────────┤
│ Title: [API Documentation ] │
│ Kind: [memory ▼] │
│ Tags: [docs, api, reference ] │
│ │
│ ☑ Include page content │
│ ☐ Include selection only │
│ ☐ Include screenshot │
│ │
│ [Save] [Cancel] │
└─────────────────────────────────────────┘
```

### 12.4 Auto-Extract

Automatically extract:
- Page title
- Main content (via Readability)
- Meta description
- Author/date if available
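
As one concrete case, the meta description can be pulled from raw HTML. In the extension this would be a `document.querySelector('meta[name="description"]')` lookup; the string-based sketch below illustrates the same extraction and assumes the common `name`-before-`content` attribute order:

```typescript
// Pull the meta description out of raw HTML with a forgiving regex.
// Illustrative only: a DOM query is the robust way to do this in-page.
function extractMetaDescription(html: string): string {
  const match = html.match(
    /<meta\s+name=["']description["']\s+content=["']([^"']*)["']/i
  );
  return match ? match[1] : '';
}
```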

### 12.5 Tag Suggestions

Suggest tags based on:
- Page content analysis
- Existing tags in Cortex
- URL domain
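
A simple heuristic that covers all three signals: intersect the page's words with tags already in the graph, then add the URL's domain. A sketch with illustrative names; a real implementation could additionally weight by term frequency:

```typescript
// Suggest tags by intersecting page words with tags that already exist
// in the graph, plus the page's domain (with the www. prefix stripped).
function suggestTags(
  pageText: string,
  url: string,
  existingTags: string[]
): string[] {
  const words = new Set(pageText.toLowerCase().split(/\W+/));
  const fromContent = existingTags.filter(tag => words.has(tag.toLowerCase()));
  const domain = new URL(url).hostname.replace(/^www\./, '');
  return [...new Set([...fromContent, domain])];
}
```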

### 12.6 Local Communication

Extension communicates with local Cortex server:
- Native messaging (preferred)
- Local HTTP API fallback

## Implementation

### Extension Structure

```
extension/
├── manifest.json
├── popup/
│   ├── popup.html
│   ├── popup.js
│   └── popup.css
├── content/
│   └── content.js       # Injected into pages
├── background/
│   └── background.js    # Service worker
├── icons/
│   ├── icon-16.png
│   ├── icon-48.png
│   └── icon-128.png
└── native/
    └── cortex-native.js # Native messaging host
```

### Manifest (v3)

```json
{
  "manifest_version": 3,
  "name": "Cortex - Save to Memory",
  "version": "1.0.0",
  "description": "Save web content to your Cortex knowledge graph",
  "permissions": [
    "activeTab",
    "scripting",
    "contextMenus",
    "storage",
    "nativeMessaging"
  ],
  "host_permissions": [
    "http://localhost:3100/*"
  ],
  "action": {
    "default_popup": "popup/popup.html",
    "default_icon": {
      "16": "icons/icon-16.png",
      "48": "icons/icon-48.png",
      "128": "icons/icon-128.png"
    }
  },
  "background": {
    "service_worker": "background/background.js"
  },
  "content_scripts": [
    {
      "matches": ["<all_urls>"],
      "js": ["content/content.js"]
    }
  ]
}
```

### Content Script

```javascript
// content/content.js
// Extract page content using Readability
function extractContent() {
  const clone = document.cloneNode(true);
  const reader = new Readability(clone);
  const article = reader.parse();

  return {
    title: article?.title || document.title,
    content: article?.textContent || '',
    excerpt: article?.excerpt || '',
    url: window.location.href,
    selection: window.getSelection()?.toString() || '',
  };
}

// Listen for messages from popup/background
chrome.runtime.onMessage.addListener((request, sender, sendResponse) => {
  if (request.action === 'extract') {
    sendResponse(extractContent());
  }
});
```

### Background Script

```javascript
// background/background.js
// Context menu
chrome.runtime.onInstalled.addListener(() => {
  chrome.contextMenus.create({
    id: 'save-to-cortex',
    title: 'Save to Cortex',
    contexts: ['page', 'selection'],
  });
});

chrome.contextMenus.onClicked.addListener(async (info, tab) => {
  if (info.menuItemId === 'save-to-cortex') {
    // Get content from page
    const [{ result }] = await chrome.scripting.executeScript({
      target: { tabId: tab.id },
      func: () => window.__cortexExtract(),
    });

    // Save via native messaging or HTTP
    await saveToCortex(result);
  }
});

async function saveToCortex(content) {
  try {
    // Try native messaging first
    const response = await chrome.runtime.sendNativeMessage('cortex', {
      action: 'save',
      data: content,
    });
    return response;
  } catch {
    // Fall back to HTTP API
    const response = await fetch('http://localhost:3100/api/nodes', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        kind: 'memory',
        title: content.title,
        content: content.content,
        tags: ['web-clip'],
        metadata: {
          source: { type: 'url', url: content.url },
        },
      }),
    });
    return response.json();
  }
}
```

### Popup UI

```html
<!-- popup/popup.html -->
<!DOCTYPE html>
<html>
<head>
  <link rel="stylesheet" href="popup.css">
</head>
<body>
  <div class="container">
    <h1>🧠 Save to Cortex</h1>

    <div class="field">
      <label>Title</label>
      <input type="text" id="title" />
    </div>

    <div class="field">
      <label>Kind</label>
      <select id="kind">
        <option value="memory">Memory</option>
        <option value="component">Component</option>
        <option value="decision">Decision</option>
        <option value="task">Task</option>
      </select>
    </div>

    <div class="field">
      <label>Tags</label>
      <input type="text" id="tags" placeholder="comma, separated" />
    </div>

    <div class="field">
      <label>
        <input type="checkbox" id="includeContent" checked />
        Include page content
      </label>
    </div>

    <div class="actions">
      <button id="save">Save</button>
      <button id="cancel">Cancel</button>
    </div>
  </div>

  <script src="popup.js"></script>
</body>
</html>
```

### Native Messaging Host

```javascript
#!/usr/bin/env node
// native/cortex-native.js
const { addNode } = require('../dist/core/store');

process.stdin.on('readable', () => {
  // Read message length (4 bytes)
  const lengthBuf = process.stdin.read(4);
  if (!lengthBuf) return;

  const length = lengthBuf.readUInt32LE(0);
  const messageBuf = process.stdin.read(length);
  const message = JSON.parse(messageBuf.toString());

  handleMessage(message).then(response => {
    const responseBuf = Buffer.from(JSON.stringify(response));
    const header = Buffer.alloc(4);
    header.writeUInt32LE(responseBuf.length, 0);
    process.stdout.write(header);
    process.stdout.write(responseBuf);
  });
});

async function handleMessage(message) {
  if (message.action === 'save') {
    const node = await addNode({
      kind: message.data.kind || 'memory',
      title: message.data.title,
      content: message.data.content,
      tags: message.data.tags || ['web-clip'],
      metadata: { source: message.data.source },
    });
    return { success: true, nodeId: node.id };
  }
  return { success: false, error: 'Unknown action' };
}
```

## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex extension install` | Install native messaging host |
| `cortex extension status` | Check extension connectivity |

## Testing

- [ ] Extension installs in Chrome/Edge
- [ ] Context menu appears
- [ ] Popup opens and pre-fills data
- [ ] Save creates node in Cortex
- [ ] Selection save captures selected text
- [ ] Native messaging works
- [ ] HTTP fallback works

## Acceptance Criteria

- [ ] One-click save from any webpage
- [ ] Content extracted cleanly
- [ ] Tags can be added before save
- [ ] Works without Cortex server running (queues)
- [ ] Firefox + Chrome support

## Estimated Effort

- Extension scaffold: 2 hours
- Content extraction: 3 hours
- Popup UI: 4 hours
- Background worker: 3 hours
- Native messaging: 4 hours
- Testing/polish: 4 hours
- **Total: ~20 hours**

## Dependencies

- `@mozilla/readability` for content extraction
- Chrome/Edge extension APIs

## References

- [Chrome Extension Manifest V3](https://developer.chrome.com/docs/extensions/mv3/intro/)
- [Native Messaging](https://developer.chrome.com/docs/extensions/mv3/nativeMessaging/)
- [Readability](https://github.com/mozilla/readability)
396
docs/milestones/13-shell-completions.md
Normal file
@@ -0,0 +1,396 @@
|
||||
# Milestone 13: Shell Completions

## Overview

Auto-completion for the Cortex CLI in Bash, Zsh, Fish, and PowerShell. Tab-complete commands, node IDs, tags, and options.

## Motivation

- Tab completion is expected in modern CLIs
- Reduces typing and errors
- Helps discoverability of commands
- Professional polish

## Features

### 13.1 Command Completion

```bash
cortex <TAB>
# add children decay export graph index journal link list query remove serve show update
```

### 13.2 Option Completion

```bash
cortex add --<TAB>
# --title --content --tags --status --section --help
```

### 13.3 Node ID Completion

```bash
cortex show <TAB>
# abc123 def456 ghi789 (recent node IDs)

cortex show abc<TAB>
# abc123 abc456 (matching prefixes)
```

### 13.4 Tag Completion

```bash
cortex list --tags <TAB>
# auth backend security api docs (existing tags)
```

### 13.5 Kind/Status Completion

```bash
cortex add <TAB>
# memory component task decision

cortex list --status <TAB>
# active todo done deprecated
```

### 13.6 Graph Completion

```bash
cortex use <TAB>
# default work personal myproject
```

## Implementation

### Completion Generator

```typescript
// src/cli/completions/index.ts
export function generateCompletions(shell: 'bash' | 'zsh' | 'fish' | 'powershell'): string {
  switch (shell) {
    case 'bash':
      return generateBashCompletions();
    case 'zsh':
      return generateZshCompletions();
    case 'fish':
      return generateFishCompletions();
    case 'powershell':
      return generatePowerShellCompletions();
  }
}
```

### Bash Completions

```bash
# Generated by cortex completions bash
_cortex_completions() {
  local cur prev words cword
  _init_completion || return

  local commands="add children decay export graph index journal link list query remove serve show update use graphs"
  local kinds="memory component task decision"
  local statuses="active todo doing done deprecated"

  case "${prev}" in
    cortex)
      COMPREPLY=($(compgen -W "${commands}" -- "${cur}"))
      return 0
      ;;
    add)
      COMPREPLY=($(compgen -W "${kinds}" -- "${cur}"))
      return 0
      ;;
    --kind)
      COMPREPLY=($(compgen -W "${kinds}" -- "${cur}"))
      return 0
      ;;
    --status)
      COMPREPLY=($(compgen -W "${statuses}" -- "${cur}"))
      return 0
      ;;
    --tags)
      # Dynamic: fetch existing tags
      local tags=$(cortex --get-tags 2>/dev/null)
      COMPREPLY=($(compgen -W "${tags}" -- "${cur}"))
      return 0
      ;;
    show|update|remove|link|children)
      # Dynamic: fetch node IDs
      local nodes=$(cortex --get-nodes "${cur}" 2>/dev/null)
      COMPREPLY=($(compgen -W "${nodes}" -- "${cur}"))
      return 0
      ;;
    use)
      # Dynamic: fetch graphs
      local graphs=$(cortex --get-graphs 2>/dev/null)
      COMPREPLY=($(compgen -W "${graphs}" -- "${cur}"))
      return 0
      ;;
  esac

  # Options
  if [[ "${cur}" == -* ]]; then
    local opts="--help --version --title --content --tags --status --kind --limit --format"
    COMPREPLY=($(compgen -W "${opts}" -- "${cur}"))
    return 0
  fi
}

complete -F _cortex_completions cortex
```

### Zsh Completions

```zsh
#compdef cortex

_cortex() {
  local -a commands
  commands=(
    'add:Add a node to the knowledge graph'
    'query:Search the knowledge graph'
    'show:Show a node and its connections'
    'list:List nodes'
    'update:Update a node'
    'remove:Remove a node'
    'link:Create a link between nodes'
    'graph:Visualize the knowledge graph'
    'children:List child nodes'
    'journal:Daily journal'
    'index:Index project'
    'serve:Start web server'
    'decay:Mark old nodes as stale'
    'use:Switch active graph'
    'graphs:Manage graphs'
  )

  local -a kinds=(memory component task decision)
  local -a statuses=(active todo doing done deprecated)

  _arguments -C \
    '1:command:->command' \
    '*::arg:->args'

  case "$state" in
    command)
      _describe -t commands 'cortex commands' commands
      ;;
    args)
      case "$words[1]" in
        add)
          _arguments \
            '1:kind:(memory component task decision)' \
            '--title[Node title]:title:' \
            '--content[Node content]:content:' \
            '--tags[Tags]:tags:->tags' \
            '--status[Status]:status:(active todo doing done deprecated)'
          ;;
        show|update|remove|children)
          _arguments '1:node:->nodes'
          ;;
        link)
          _arguments \
            '1:from:->nodes' \
            '2:to:->nodes' \
            '--type[Edge type]:type:(depends_on contains implements blocked_by subtask_of relates_to supersedes about)'
          ;;
        list)
          _arguments \
            '--kind[Filter by kind]:kind:(memory component task decision)' \
            '--status[Filter by status]:status:(active todo doing done deprecated)' \
            '--tags[Filter by tags]:tags:->tags'
          ;;
        use)
          _arguments '1:graph:->graphs'
          ;;
      esac
      ;;
  esac

  case "$state" in
    nodes)
      local -a nodes
      nodes=(${(f)"$(cortex --get-nodes 2>/dev/null)"})
      _describe -t nodes 'nodes' nodes
      ;;
    tags)
      local -a tags
      tags=(${(f)"$(cortex --get-tags 2>/dev/null)"})
      _describe -t tags 'tags' tags
      ;;
    graphs)
      local -a graphs
      graphs=(${(f)"$(cortex --get-graphs 2>/dev/null)"})
      _describe -t graphs 'graphs' graphs
      ;;
  esac
}

_cortex
```

### Fish Completions

```fish
# cortex.fish
complete -c cortex -f

# Commands
complete -c cortex -n __fish_use_subcommand -a add -d 'Add a node'
complete -c cortex -n __fish_use_subcommand -a query -d 'Search'
complete -c cortex -n __fish_use_subcommand -a show -d 'Show node'
complete -c cortex -n __fish_use_subcommand -a list -d 'List nodes'
complete -c cortex -n __fish_use_subcommand -a update -d 'Update node'
complete -c cortex -n __fish_use_subcommand -a remove -d 'Remove node'
complete -c cortex -n __fish_use_subcommand -a link -d 'Link nodes'
complete -c cortex -n __fish_use_subcommand -a graph -d 'Visualize graph'

# Kinds
complete -c cortex -n '__fish_seen_subcommand_from add' -a 'memory component task decision'

# Dynamic node completion
complete -c cortex -n '__fish_seen_subcommand_from show update remove' -a '(cortex --get-nodes)'

# Tags
complete -c cortex -n '__fish_seen_subcommand_from list' -l tags -a '(cortex --get-tags)'
```

### PowerShell Completions

```powershell
# cortex.ps1
Register-ArgumentCompleter -Native -CommandName cortex -ScriptBlock {
    param($wordToComplete, $commandAst, $cursorPosition)

    $commands = @('add', 'query', 'show', 'list', 'update', 'remove', 'link', 'graph', 'children', 'journal', 'index', 'serve', 'decay', 'use', 'graphs')
    $kinds = @('memory', 'component', 'task', 'decision')
    $statuses = @('active', 'todo', 'doing', 'done', 'deprecated')

    $elements = $commandAst.CommandElements
    # Guard: only the bare `cortex` is present while the first word is typed
    $command = if ($elements.Count -gt 1) { $elements[1].Value }

    if ($elements.Count -eq 2) {
        $commands | Where-Object { $_ -like "$wordToComplete*" } | ForEach-Object {
            [System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
        }
    }
    elseif ($command -eq 'add' -and $elements.Count -eq 3) {
        $kinds | Where-Object { $_ -like "$wordToComplete*" } | ForEach-Object {
            [System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
        }
    }
    elseif ($command -in @('show', 'update', 'remove')) {
        $nodes = cortex --get-nodes $wordToComplete 2>$null
        $nodes -split "`n" | Where-Object { $_ -like "$wordToComplete*" } | ForEach-Object {
            [System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
        }
    }
}
```

### CLI Support Commands

```typescript
// Hidden commands for completion scripts
program
  .command('--get-nodes [prefix]', { hidden: true })
  .action(async (prefix) => {
    const nodes = await listNodes({ limit: 20 });
    const filtered = prefix
      ? nodes.filter(n => n.id.startsWith(prefix))
      : nodes;
    console.log(filtered.map(n => `${n.id.slice(0, 8)}\t${n.title}`).join('\n'));
  });

program
  .command('--get-tags', { hidden: true })
  .action(async () => {
    const tags = await getAllTags();
    console.log(tags.join('\n'));
  });

program
  .command('--get-graphs', { hidden: true })
  .action(async () => {
    const graphs = listGraphs();
    console.log(graphs.map(g => g.name).join('\n'));
  });
```

## CLI Commands

| Command | Description |
|---------|-------------|
| `cortex completions bash` | Output bash completions |
| `cortex completions zsh` | Output zsh completions |
| `cortex completions fish` | Output fish completions |
| `cortex completions powershell` | Output PowerShell completions |
| `cortex completions install` | Auto-install for current shell |

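The `install` subcommand can pick its target by mapping `$SHELL` to a conventional completion path. A minimal sketch under assumptions — the paths below are illustrative defaults, and a real implementation would honor `$XDG_CONFIG_HOME` and confirm before writing:

```typescript
// Hypothetical shell detection for `cortex completions install`.
import * as os from 'os';
import * as path from 'path';

type Shell = 'bash' | 'zsh' | 'fish' | 'powershell';

function detectShell(shellEnv: string): Shell {
  const name = path.basename(shellEnv);
  if (name === 'zsh') return 'zsh';
  if (name === 'fish') return 'fish';
  if (name === 'pwsh' || name === 'powershell') return 'powershell';
  return 'bash'; // sensible default for unknown shells
}

function completionTarget(shell: Shell, home: string): string {
  switch (shell) {
    case 'bash':
      return path.join(home, '.bash_completion.d', 'cortex');
    case 'zsh':
      return path.join(home, '.zsh', 'completions', '_cortex');
    case 'fish':
      return path.join(home, '.config', 'fish', 'completions', 'cortex.fish');
    case 'powershell':
      return path.join(home, 'cortex.completions.ps1');
  }
}

const shell = detectShell(process.env.SHELL ?? '/bin/bash');
console.log(completionTarget(shell, os.homedir()));
```

Writing `generateCompletions(shell)` output to that path (and, for zsh, reminding the user to extend `fpath`) covers the per-shell instructions below.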
## Installation Instructions

### Bash
```bash
cortex completions bash > /etc/bash_completion.d/cortex
# or
cortex completions bash >> ~/.bashrc
```

### Zsh
```bash
cortex completions zsh > ~/.zsh/completions/_cortex
# Add to ~/.zshrc: fpath=(~/.zsh/completions $fpath)
```

### Fish
```bash
cortex completions fish > ~/.config/fish/completions/cortex.fish
```

### PowerShell
```powershell
cortex completions powershell >> $PROFILE
```

## Testing

- [ ] Bash completions work
- [ ] Zsh completions work
- [ ] Fish completions work
- [ ] PowerShell completions work
- [ ] Dynamic node ID completion
- [ ] Dynamic tag completion
- [ ] Install command works

## Acceptance Criteria

- [ ] All four shells supported
- [ ] Commands and options complete
- [ ] Node IDs complete dynamically
- [ ] Tags complete dynamically
- [ ] Easy installation command
- [ ] No errors on missing cortex binary

## Estimated Effort

- Bash completions: 2 hours
- Zsh completions: 3 hours
- Fish completions: 2 hours
- PowerShell completions: 2 hours
- CLI support commands: 2 hours
- Install command: 1 hour
- Testing: 2 hours
- **Total: ~14 hours**

## Dependencies

- None

## References

- [Bash Programmable Completion](https://www.gnu.org/software/bash/manual/html_node/Programmable-Completion.html)
- [Zsh Completion System](https://zsh.sourceforge.io/Doc/Release/Completion-System.html)
- [Fish Completions](https://fishshell.com/docs/current/completions.html)
- [PowerShell Tab Completion](https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/register-argumentcompleter)