15 Commits

Author SHA1 Message Date
d8addcfcb7 Fix migration to handle locked directories gracefully 2026-02-03 11:51:58 +01:00
f21426fc43 Add browser extension and shell completions (Milestones 12-13)
M12: Browser Extension
- Chrome/Edge Manifest V3 extension
- Popup UI for saving pages
- Context menu integration (save page/selection/link)
- Background service worker
- Content script for extraction

M13: Shell Completions
- Bash, Zsh, Fish, PowerShell completions
- Dynamic node ID completion
- Dynamic tag completion
- Dynamic graph completion
- Auto-install command (--install)
2026-02-03 11:35:41 +01:00
b1c62c5da9 Add interactive TUI dashboard (Milestone 11)
- Lightweight TUI using Node.js readline (no extra deps)
- Browse nodes with vim-style navigation (j/k/arrows)
- Real-time search with / key
- Detail view with connections
- Filter by kind (1-4 keys)
- CLI: cortex tui, cortex ui
2026-02-03 11:30:44 +01:00
f891f37bde Add smart retrieval with git context (Milestone 10)
- Git context extraction: branch, commits, modified files
- Smart search with context-based re-ranking
- Time boosting for recently accessed nodes
- File relevance boosting for modified files
- Branch keyword matching
- CLI: smart-search, ss, what, now commands
- MCP tools: memory_smart_search, memory_what
2026-02-03 11:28:39 +01:00
aea3e93ff7 Add multi-graph support (Milestone 9)
- Graph manager: create, list, delete, use graphs
- Automatic project detection via .cortex.json or git remote
- Graph switching in db.ts connection manager
- Cross-graph search with --all-graphs flag
- CLI: graphs, use, init commands
- MCP tools: memory_graphs, memory_use_graph, memory_create_graph, memory_delete_graph
- Database migration from .memory to ~/.cortex/graphs
2026-02-03 11:25:44 +01:00
45998c73d0 Add import/export and backup system (Milestone 8)
- Obsidian vault importer with wikilink → edge conversion
- Markdown folder importer with frontmatter parsing
- Markdown exporter with wikilinks and frontmatter
- JSON-LD linked data exporter
- Database backup/restore functionality
- CLI: import, backup, restore-backup, list-backups
- MCP tools: memory_import, memory_backup, memory_export_markdown, memory_export_jsonld
2026-02-03 11:21:42 +01:00
3a334d2941 Add graph visualization exports (Milestone 7)
- Add interactive HTML export with D3.js force-directed graph
- Add SVG export with simple force-directed layout
- Add Mermaid diagram export for documentation
- Support subgraph export from root node with depth
- Add node filtering by kind and tags
- Add light/dark theme support
- Add CLI commands: export, viz
- Add MCP tool: memory_export
2026-02-03 11:08:19 +01:00
c65a5bb03a Add URL and content ingestion (Milestone 6)
- Add URL fetching with HTML-to-text extraction
- Add basic PDF text extraction
- Add smart content chunking with overlap
- Add deduplication via content checksums
- Add auto-linking to semantically related nodes
- Add CLI commands: ingest, clip
- Add MCP tools: memory_ingest, memory_clip
2026-02-03 11:00:28 +01:00
67b1e3b481 Add daily journal system (Milestone 5)
- Add journal service with date-based organization
- Add quick capture with timestamps and tags
- Add auto-linking for mentioned nodes and hashtags
- Add AI-powered daily summary generation
- Add journal search across all entries
- Add CLI commands: journal, j (alias), c (quick capture)
- Add MCP tools: memory_journal, memory_journal_list, memory_journal_summarize
2026-02-03 10:57:23 +01:00
056a02d936 Add codebase indexing system (Milestone 4)
- Add project type detection (Node.js, Python, Rust, Go, generic)
- Add file scanner with ignore patterns and hash tracking
- Add TypeScript/JavaScript parser (exports, imports, classes, functions)
- Add Python parser (imports, classes, functions, __all__)
- Add relationship mapper for import dependencies
- Add architecture summary generation with tech stack detection
- Add incremental update support via file hash comparison
- Add CLI command: cortex index [path] [--update] [--dry-run]
- Add MCP tools: memory_index, memory_components
2026-02-03 10:53:26 +01:00
9490cd1db4 Merge feature/context-injection into main 2026-02-03 10:07:32 +01:00
1cad7d6cb9 Merge feature/auto-capture into main 2026-02-03 10:06:48 +01:00
53ac83756f Add temporal versioning core and summary generation
- Add node_versions table with migrations for tracking history
- Add version tracking to addNode/updateNode in store
- Add getNodeHistory, getNodeAtTime, diffVersions, restoreVersion
- Add NodeVersion, HistoricalNode, NodeDiff types
- Add summary generation with caching for memory_summary
2026-02-03 10:05:26 +01:00
7a4dc07038 Add auto-capture system (Milestone 2)
- Add capture configuration system with modes: always, manual, decisions, off
- Add Ollama-based conversation summarization and extraction
- Add deduplication via embedding similarity (merge >0.90, link 0.75-0.90)
- Add CLI commands: capture, capture-hook, config
- Add MCP tools: memory_capture, memory_remember, memory_capture_config
- Include summary.ts (previously uncommitted)
2026-02-03 09:59:49 +01:00
761c7a247c Add temporal versioning for node history tracking (Milestone 1)
Enable time-travel queries and history viewing by creating immutable
version records on every node update. Includes database schema changes,
store functions, MCP tools, and CLI commands for viewing history,
comparing versions, and restoring to previous states.
2026-02-03 09:58:16 +01:00
62 changed files with 8614 additions and 53 deletions

extension/README.md Normal file

@@ -0,0 +1,75 @@
# Cortex Browser Extension
Save web content directly to your Cortex knowledge graph.
## Features
- One-click save from any webpage
- Right-click context menu integration
- Save selected text only
- Auto-extract page title and content
- Tag suggestions based on URL
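The tag suggestion is a simple hostname match. A minimal sketch (the `suggestTags` helper name is hypothetical; the extension inlines this logic in `popup.js`):

```javascript
// Suggest tags for a saved page by matching well-known hostnames.
// Every page gets the base 'web-clip' tag; known hosts add one more.
function suggestTags(pageUrl) {
  const tags = ['web-clip'];
  const { hostname } = new URL(pageUrl);
  if (hostname.includes('github')) tags.push('github');
  if (hostname.includes('docs')) tags.push('documentation');
  if (hostname.includes('stackoverflow')) tags.push('stackoverflow');
  return tags;
}

console.log(suggestTags('https://github.com/example/repo'));
// → [ 'web-clip', 'github' ]
```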
## Installation
### Chrome / Edge
1. Open `chrome://extensions` (or `edge://extensions`)
2. Enable "Developer mode" in the top right
3. Click "Load unpacked"
4. Select this `extension` folder
### Firefox
1. Open `about:debugging`
2. Click "This Firefox"
3. Click "Load Temporary Add-on"
4. Select `manifest.json` from this folder
## Usage
### Popup
1. Click the Cortex icon in your browser toolbar
2. Edit the title and tags as needed
3. Choose whether to include page content
4. Click "Save"
### Context Menu
- **Save page**: Right-click anywhere on a page → "Save page to Cortex"
- **Save selection**: Select text, right-click → "Save selection to Cortex"
- **Save link**: Right-click a link → "Save link to Cortex"
## Requirements
The Cortex server must be running for the extension to work:
```bash
cortex serve
```
By default, the extension connects to `http://localhost:3100/api`.
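Saving is a plain JSON POST to the `/nodes` endpoint. A request body looks roughly like this (field names taken from the extension source; the values are illustrative):

```json
{
  "kind": "memory",
  "title": "Example page",
  "content": "Extracted page text…",
  "tags": ["web-clip"],
  "metadata": {
    "source": { "type": "url", "url": "https://example.com", "savedAt": 1700000000000 }
  }
}
```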
## Development
The extension uses Manifest V3 and consists of:
- `manifest.json` - Extension configuration
- `popup/` - Popup UI (HTML, CSS, JS)
- `background/` - Service worker for context menus
- `content/` - Content script for page extraction
- `icons/` - Extension icons
## Icon Generation
To generate PNG icons from the SVG:
```bash
# Using ImageMagick
convert icons/icon.svg -resize 16x16 icons/icon-16.png
convert icons/icon.svg -resize 48x48 icons/icon-48.png
convert icons/icon.svg -resize 128x128 icons/icon-128.png
```
Or use an online SVG to PNG converter.

extension/background/background.js Normal file

@@ -0,0 +1,136 @@
// Background service worker for Cortex browser extension
const CORTEX_API = 'http://localhost:3100/api';
// Create context menu on install
chrome.runtime.onInstalled.addListener(() => {
// Save page context menu
chrome.contextMenus.create({
id: 'cortex-save-page',
title: 'Save page to Cortex',
contexts: ['page'],
});
// Save selection context menu
chrome.contextMenus.create({
id: 'cortex-save-selection',
title: 'Save selection to Cortex',
contexts: ['selection'],
});
// Save link context menu
chrome.contextMenus.create({
id: 'cortex-save-link',
title: 'Save link to Cortex',
contexts: ['link'],
});
});
// Handle context menu clicks
chrome.contextMenus.onClicked.addListener(async (info, tab) => {
try {
let content = '';
let title = tab.title;
if (info.menuItemId === 'cortex-save-selection') {
content = info.selectionText || '';
title = `Selection from: ${tab.title}`;
} else if (info.menuItemId === 'cortex-save-link') {
content = `Link: ${info.linkUrl}`;
title = info.linkUrl;
} else {
// Get full page content
const [{ result }] = await chrome.scripting.executeScript({
target: { tabId: tab.id },
func: extractPageContent,
});
content = result?.content || '';
title = result?.title || tab.title;
}
// Save to Cortex
await saveToCortex({
title,
content,
url: tab.url,
kind: 'memory',
tags: ['web-clip'],
});
// Show notification (if supported)
if (chrome.notifications) {
chrome.notifications.create({
type: 'basic',
iconUrl: 'icons/icon-48.png',
title: 'Saved to Cortex',
message: title.slice(0, 50),
});
}
} catch (err) {
console.error('Context menu action failed:', err);
}
});
// Function to extract page content (injected into page)
function extractPageContent() {
// Simple content extraction
// A full implementation would use Readability
const title = document.title;
// Try to get main content
const article = document.querySelector('article');
const main = document.querySelector('main');
const content = article?.textContent || main?.textContent || document.body.textContent || '';
// Clean up content (collapse runs of spaces/tabs without eating newlines, then squeeze newline runs)
const cleanContent = content
.replace(/[^\S\n]+/g, ' ')
.replace(/\n+/g, '\n')
.trim()
.slice(0, 50000); // Limit content size
return {
title,
content: cleanContent,
url: window.location.href,
};
}
// Save to Cortex API
async function saveToCortex(data) {
const response = await fetch(`${CORTEX_API}/nodes`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
kind: data.kind || 'memory',
title: data.title,
content: data.content,
tags: data.tags || ['web-clip'],
metadata: {
source: {
type: 'url',
url: data.url,
savedAt: Date.now(),
},
},
}),
});
if (!response.ok) {
throw new Error(`HTTP ${response.status}`);
}
return response.json();
}
// Listen for messages from content script or popup
chrome.runtime.onMessage.addListener((request, sender, sendResponse) => {
if (request.action === 'save') {
saveToCortex(request.data)
.then(node => sendResponse({ success: true, nodeId: node.id }))
.catch(err => sendResponse({ success: false, error: err.message }));
return true; // Keep channel open for async response
}
});

extension/content/content.js Normal file

@@ -0,0 +1,80 @@
// Content script for Cortex browser extension
// Injected into web pages to extract content
// Extract page content
function extractContent() {
const title = document.title;
// Get selected text
const selection = window.getSelection()?.toString() || '';
// Try to find main content using common selectors
let content = '';
const contentSelectors = [
'article',
'[role="main"]',
'main',
'.post-content',
'.article-content',
'.entry-content',
'.content',
'#content',
];
for (const selector of contentSelectors) {
const el = document.querySelector(selector);
if (el && el.textContent) {
content = el.textContent;
break;
}
}
// Fall back to body content
if (!content) {
content = document.body.textContent || '';
}
// Clean up content (collapse runs of spaces/tabs without eating newlines, then cap blank lines)
content = content
.replace(/[^\S\n]+/g, ' ')
.replace(/\n{3,}/g, '\n\n')
.trim()
.slice(0, 100000); // Limit to 100k chars
// Get meta description
const metaDesc = document.querySelector('meta[name="description"]')?.content || '';
// Get author
const author = document.querySelector('meta[name="author"]')?.content ||
document.querySelector('[rel="author"]')?.textContent || '';
// Get published date
const datePublished = document.querySelector('meta[property="article:published_time"]')?.content ||
document.querySelector('time[datetime]')?.getAttribute('datetime') || '';
return {
title,
content,
selection,
excerpt: metaDesc || content.slice(0, 200),
url: window.location.href,
author,
datePublished,
};
}
// Listen for messages from popup or background
chrome.runtime.onMessage.addListener((request, sender, sendResponse) => {
if (request.action === 'extract') {
try {
const data = extractContent();
sendResponse(data);
} catch (err) {
sendResponse({ error: err.message });
}
}
return true; // Harmless here: the response above is sent synchronously
});
// Make extract function available globally for background script injection
window.__cortexExtract = extractContent;

extension/icons/icon.svg Normal file

@@ -0,0 +1,23 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 128 128">
<defs>
<linearGradient id="grad" x1="0%" y1="0%" x2="100%" y2="100%">
<stop offset="0%" style="stop-color:#4fc3f7;stop-opacity:1" />
<stop offset="100%" style="stop-color:#29b6f6;stop-opacity:1" />
</linearGradient>
</defs>
<circle cx="64" cy="64" r="56" fill="url(#grad)"/>
<g fill="#fff">
<!-- Brain/memory icon representation -->
<circle cx="64" cy="48" r="8"/>
<circle cx="44" cy="68" r="6"/>
<circle cx="84" cy="68" r="6"/>
<circle cx="54" cy="88" r="5"/>
<circle cx="74" cy="88" r="5"/>
<!-- Connections -->
<path d="M64 56 L44 62" stroke="#fff" stroke-width="2" fill="none"/>
<path d="M64 56 L84 62" stroke="#fff" stroke-width="2" fill="none"/>
<path d="M44 74 L54 83" stroke="#fff" stroke-width="2" fill="none"/>
<path d="M84 74 L74 83" stroke="#fff" stroke-width="2" fill="none"/>
<path d="M54 88 L74 88" stroke="#fff" stroke-width="2" fill="none"/>
</g>
</svg>


extension/manifest.json Normal file

@@ -0,0 +1,36 @@
{
"manifest_version": 3,
"name": "Cortex - Save to Memory",
"version": "1.0.0",
"description": "Save web content to your Cortex knowledge graph",
"permissions": [
"activeTab",
"contextMenus",
"scripting",
"storage"
],
"host_permissions": [
"http://localhost:3100/*"
],
"action": {
"default_popup": "popup/popup.html",
"default_icon": {
"16": "icons/icon-16.png",
"48": "icons/icon-48.png",
"128": "icons/icon-128.png"
}
},
"background": {
"service_worker": "background/background.js"
},
"content_scripts": [
{
"matches": ["<all_urls>"],
"js": ["content/content.js"]
}
],
"icons": {
"16": "icons/icon-16.png",
"48": "icons/icon-48.png",
"128": "icons/icon-128.png"
}
}

extension/popup/popup.css Normal file

@@ -0,0 +1,136 @@
* {
box-sizing: border-box;
margin: 0;
padding: 0;
}
body {
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
font-size: 14px;
width: 320px;
padding: 16px;
background: #1a1a2e;
color: #eee;
}
.container {
display: flex;
flex-direction: column;
gap: 12px;
}
h1 {
font-size: 18px;
font-weight: 600;
color: #4fc3f7;
margin-bottom: 8px;
}
.status {
padding: 8px;
border-radius: 4px;
font-size: 12px;
display: none;
}
.status.success {
display: block;
background: #1b5e20;
color: #a5d6a7;
}
.status.error {
display: block;
background: #b71c1c;
color: #ef9a9a;
}
.field {
display: flex;
flex-direction: column;
gap: 4px;
}
.field.checkbox {
flex-direction: row;
align-items: center;
}
.field.checkbox label {
display: flex;
align-items: center;
gap: 8px;
cursor: pointer;
}
label {
font-size: 12px;
color: #aaa;
font-weight: 500;
}
input[type="text"],
select {
padding: 8px 12px;
border: 1px solid #333;
border-radius: 4px;
background: #0f0f23;
color: #eee;
font-size: 14px;
}
input[type="text"]:focus,
select:focus {
outline: none;
border-color: #4fc3f7;
}
select {
cursor: pointer;
}
input[type="checkbox"] {
width: 16px;
height: 16px;
cursor: pointer;
}
.actions {
display: flex;
gap: 8px;
margin-top: 8px;
}
button {
flex: 1;
padding: 10px 16px;
border: none;
border-radius: 4px;
font-size: 14px;
font-weight: 500;
cursor: pointer;
transition: background 0.2s;
}
button.primary {
background: #4fc3f7;
color: #000;
}
button.primary:hover {
background: #29b6f6;
}
button:not(.primary) {
background: #333;
color: #eee;
}
button:not(.primary):hover {
background: #444;
}
button:disabled {
opacity: 0.5;
cursor: not-allowed;
}

extension/popup/popup.html Normal file

@@ -0,0 +1,55 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<link rel="stylesheet" href="popup.css">
</head>
<body>
<div class="container">
<h1>Save to Cortex</h1>
<div id="status" class="status"></div>
<div class="field">
<label for="title">Title</label>
<input type="text" id="title" placeholder="Page title">
</div>
<div class="field">
<label for="kind">Kind</label>
<select id="kind">
<option value="memory">Memory</option>
<option value="component">Component</option>
<option value="decision">Decision</option>
<option value="task">Task</option>
</select>
</div>
<div class="field">
<label for="tags">Tags</label>
<input type="text" id="tags" placeholder="comma, separated, tags">
</div>
<div class="field checkbox">
<label>
<input type="checkbox" id="includeContent" checked>
Include page content
</label>
</div>
<div class="field checkbox">
<label>
<input type="checkbox" id="selectionOnly">
Selection only
</label>
</div>
<div class="actions">
<button id="save" class="primary">Save</button>
<button id="cancel">Cancel</button>
</div>
</div>
<script src="popup.js"></script>
</body>
</html>

extension/popup/popup.js Normal file

@@ -0,0 +1,136 @@
// Popup script for Cortex browser extension
const CORTEX_API = 'http://localhost:3100/api';
// DOM elements
const titleInput = document.getElementById('title');
const kindSelect = document.getElementById('kind');
const tagsInput = document.getElementById('tags');
const includeContent = document.getElementById('includeContent');
const selectionOnly = document.getElementById('selectionOnly');
const saveButton = document.getElementById('save');
const cancelButton = document.getElementById('cancel');
const statusDiv = document.getElementById('status');
let pageData = null;
// Initialize popup
async function init() {
try {
// Get current tab
const [tab] = await chrome.tabs.query({ active: true, currentWindow: true });
// Request content from content script
const response = await chrome.tabs.sendMessage(tab.id, { action: 'extract' });
pageData = response;
// Pre-fill form
titleInput.value = response.title || tab.title || '';
// Suggest tags based on URL
const url = new URL(tab.url);
const suggestedTags = ['web-clip'];
if (url.hostname.includes('github')) suggestedTags.push('github');
if (url.hostname.includes('docs')) suggestedTags.push('documentation');
if (url.hostname.includes('stackoverflow')) suggestedTags.push('stackoverflow');
tagsInput.value = suggestedTags.join(', ');
// Enable "Selection only" only when there is selected text
selectionOnly.disabled = !response.selection;
} catch (err) {
showStatus('Could not extract page content', 'error');
console.error('Init error:', err);
}
}
// Save to Cortex
async function save() {
saveButton.disabled = true;
saveButton.textContent = 'Saving...';
try {
const content = selectionOnly.checked
? pageData?.selection
: (includeContent.checked ? pageData?.content : '');
const tags = tagsInput.value
.split(',')
.map(t => t.trim())
.filter(t => t.length > 0);
const [tab] = await chrome.tabs.query({ active: true, currentWindow: true });
const response = await fetch(`${CORTEX_API}/nodes`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
kind: kindSelect.value,
title: titleInput.value,
content: content,
tags: tags,
metadata: {
source: {
type: 'url',
url: tab.url,
savedAt: Date.now(),
},
},
}),
});
if (!response.ok) {
throw new Error(`HTTP ${response.status}`);
}
const node = await response.json();
showStatus(`Saved! ID: ${node.id.slice(0, 8)}`, 'success');
// Close popup after brief delay
setTimeout(() => window.close(), 1500);
} catch (err) {
showStatus(`Failed to save: ${err.message}`, 'error');
console.error('Save error:', err);
// Check if Cortex server is running
if (err.message.includes('fetch')) {
showStatus('Is Cortex server running? (cortex serve)', 'error');
}
} finally {
saveButton.disabled = false;
saveButton.textContent = 'Save';
}
}
// Show status message
function showStatus(message, type) {
statusDiv.textContent = message;
statusDiv.className = `status ${type}`;
}
// Event listeners
saveButton.addEventListener('click', save);
cancelButton.addEventListener('click', () => window.close());
// Keep the checkboxes consistent: unchecking "Include page content" also clears "Selection only"
includeContent.addEventListener('change', () => {
if (!includeContent.checked) {
selectionOnly.checked = false;
}
});
selectionOnly.addEventListener('change', () => {
if (selectionOnly.checked) {
includeContent.checked = true;
}
});
// Initialize on load
init();

src/cli/commands/backup.ts Normal file

@@ -0,0 +1,68 @@
import { Command } from 'commander';
import chalk from 'chalk';
import { createBackup, restoreBackup, listBackups } from '../../core/backup';
export const backupCommand = new Command('backup')
.description('Create a backup of the database')
.argument('<path>', 'Output file path')
.action(async (outputPath: string) => {
try {
console.log(chalk.cyan('Creating backup...'));
const result = await createBackup(outputPath);
console.log(chalk.green(`✓ Backup created: ${result.path}`));
console.log(chalk.dim(` Size: ${(result.size / 1024).toFixed(1)} KB`));
console.log(chalk.dim(` Nodes: ${result.nodes}`));
console.log(chalk.dim(` Edges: ${result.edges}`));
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
export const restoreDbCommand = new Command('restore-backup')
.description('Restore database from a backup')
.argument('<path>', 'Backup file path')
.option('-y, --yes', 'Skip confirmation')
.action(async (backupPath: string, opts) => {
try {
if (!opts.yes) {
console.log(chalk.yellow('Warning: This will replace your current database.'));
console.log(chalk.yellow('A backup of the current database will be created first.'));
console.log(chalk.dim('Use --yes to skip this warning.'));
// In a real CLI we'd prompt for confirmation, but for simplicity we proceed
}
console.log(chalk.cyan('Restoring backup...'));
const result = await restoreBackup(backupPath);
console.log(chalk.green('✓ Database restored'));
console.log(chalk.dim(` Nodes: ${result.nodes}`));
console.log(chalk.dim(` Edges: ${result.edges}`));
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
export const listBackupsCommand = new Command('list-backups')
.description('List backups in a directory')
.argument('[directory]', 'Directory to list', '.')
.action((directory: string) => {
try {
const backups = listBackups(directory);
if (backups.length === 0) {
console.log(chalk.yellow('No backup files found.'));
return;
}
console.log(chalk.cyan(`Found ${backups.length} backup(s):\n`));
for (const backup of backups) {
const sizeKb = (backup.size / 1024).toFixed(1);
const date = backup.modified.toISOString().replace('T', ' ').slice(0, 19);
console.log(` ${chalk.bold(backup.name)}`);
console.log(chalk.dim(` Size: ${sizeKb} KB | Modified: ${date}`));
}
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});

src/cli/commands/capture.ts Normal file

@@ -0,0 +1,176 @@
import { Command } from 'commander';
import chalk from 'chalk';
import { captureConversation, captureText, getCaptureConfig, setCaptureConfig, CaptureMode } from '../../core/capture';
const VALID_MODES: CaptureMode[] = ['always', 'manual', 'decisions', 'off'];
export const captureCommand = new Command('capture')
.description('Capture text or conversation as a memory node')
.argument('[text]', 'Text to capture (or pipe via stdin)')
.option('--tags <tags>', 'Comma-separated tags')
.option('--source <source>', 'Source identifier', 'manual')
.action(async (text: string | undefined, opts) => {
let content = text;
// Read from stdin if no text provided
if (!content) {
content = await readStdin();
}
if (!content || content.trim().length === 0) {
console.error(chalk.red('No text provided. Pass text as argument or pipe via stdin.'));
process.exit(1);
}
const tags = opts.tags ? opts.tags.split(',').map((t: string) => t.trim()) : [];
const result = await captureText(content, { tags, source: opts.source });
if (!result.captured) {
console.log(chalk.yellow(`⚠ Not captured: ${result.reason}`));
return;
}
console.log(chalk.green(`✓ Memory ${result.action}`));
if (result.node) {
console.log(` ID: ${chalk.cyan(result.node.id)}`);
console.log(` Title: ${result.node.title}`);
if (result.node.tags.length) console.log(` Tags: ${result.node.tags.join(', ')}`);
}
if (result.reason) {
console.log(` ${chalk.dim(result.reason)}`);
}
});
export const captureHookCommand = new Command('capture-hook')
.description('Hook handler for Claude Code auto-capture (receives JSON on stdin)')
.option('--session <id>', 'Session ID')
.action(async (opts) => {
const input = await readStdin();
if (!input) {
// Silent exit - hook may be called with empty input
process.exit(0);
}
let data: { conversation?: string; files_changed?: string[]; session_id?: string };
try {
data = JSON.parse(input);
} catch {
// Not JSON, treat as plain conversation text
data = { conversation: input };
}
const conversation = data.conversation;
if (!conversation) {
process.exit(0);
}
const result = await captureConversation({
conversation,
sessionId: data.session_id || opts.session,
filesChanged: data.files_changed,
source: 'claude-code',
});
// Output result as JSON for hook system
console.log(JSON.stringify({
captured: result.captured,
action: result.action,
nodeId: result.node?.id,
reason: result.reason,
}));
});
export const configCommand = new Command('config')
.description('Manage capture configuration')
.argument('<action>', 'Action: get, set, or list')
.argument('[key]', 'Config key (for get/set)')
.argument('[value]', 'Config value (for set)')
.action(async (action: string, key?: string, value?: string) => {
const config = getCaptureConfig();
if (action === 'list') {
console.log(chalk.bold('Capture Configuration:'));
console.log(` mode: ${chalk.cyan(config.mode)}`);
console.log(` minLength: ${config.minLength}`);
console.log(` excludePatterns: ${config.excludePatterns.length ? config.excludePatterns.join(', ') : chalk.dim('(none)')}`);
console.log(` autoTag: ${config.autoTag}`);
console.log(` linkRelated: ${config.linkRelated}`);
console.log(` similarityThreshold: ${config.similarityThreshold}`);
console.log(` mergeThreshold: ${config.mergeThreshold}`);
return;
}
if (action === 'get') {
if (!key) {
console.error(chalk.red('Key required for get'));
process.exit(1);
}
if (!(key in config)) {
console.error(chalk.red(`Unknown config key: ${key}`));
process.exit(1);
}
const val = config[key as keyof typeof config];
console.log(Array.isArray(val) ? val.join(', ') : String(val));
return;
}
if (action === 'set') {
if (!key || value === undefined) {
console.error(chalk.red('Key and value required for set'));
process.exit(1);
}
let parsedValue: any = value;
// Parse value based on key type
if (key === 'mode') {
if (!VALID_MODES.includes(value as CaptureMode)) {
console.error(chalk.red(`Invalid mode. Must be one of: ${VALID_MODES.join(', ')}`));
process.exit(1);
}
parsedValue = value;
} else if (key === 'minLength' || key === 'similarityThreshold' || key === 'mergeThreshold') {
parsedValue = parseFloat(value);
if (isNaN(parsedValue)) {
console.error(chalk.red('Value must be a number'));
process.exit(1);
}
} else if (key === 'autoTag' || key === 'linkRelated') {
parsedValue = value === 'true' || value === '1';
} else if (key === 'excludePatterns') {
parsedValue = value.split(',').map(s => s.trim()).filter(Boolean);
} else {
console.error(chalk.red(`Unknown config key: ${key}`));
process.exit(1);
}
setCaptureConfig({ [key]: parsedValue });
console.log(chalk.green(`✓ Set ${key} = ${JSON.stringify(parsedValue)}`));
return;
}
console.error(chalk.red('Invalid action. Use: get, set, or list'));
process.exit(1);
});
async function readStdin(): Promise<string> {
return new Promise((resolve) => {
// Check if stdin has data (non-TTY mode)
if (process.stdin.isTTY) {
resolve('');
return;
}
let data = '';
process.stdin.setEncoding('utf8');
// Give up after 100ms if nothing has arrived; cancel the timer once data flows
const timer = setTimeout(() => {
if (!data) resolve('');
}, 100);
process.stdin.on('data', (chunk) => { clearTimeout(timer); data += chunk; });
process.stdin.on('end', () => { clearTimeout(timer); resolve(data.trim()); });
});
}

src/cli/commands/completions.ts Normal file

@@ -0,0 +1,169 @@
import { Command } from 'commander';
import chalk from 'chalk';
import * as fs from 'fs';
import * as path from 'path';
import * as os from 'os';
import { generateCompletions, Shell } from '../completions';
import { listNodes } from '../../core/store';
import { listGraphs } from '../../core/graphs';
import { getDb } from '../../core/db';
export const completionsCommand = new Command('completions')
.description('Generate shell completions')
.argument('[shell]', 'Shell type: bash, zsh, fish, powershell')
.option('--install', 'Install completions to appropriate location')
.action(async (shell: string | undefined, opts) => {
const validShells: Shell[] = ['bash', 'zsh', 'fish', 'powershell'];
if (!shell) {
console.log(chalk.cyan('Available shell completions:\n'));
console.log(' cortex completions bash');
console.log(' cortex completions zsh');
console.log(' cortex completions fish');
console.log(' cortex completions powershell');
console.log(chalk.dim('\nAdd --install to auto-install for your shell'));
return;
}
const shellType = shell.toLowerCase() as Shell;
if (!validShells.includes(shellType)) {
console.error(chalk.red(`Invalid shell: ${shell}`));
console.log(chalk.dim(`Valid shells: ${validShells.join(', ')}`));
process.exit(1);
}
const completionScript = generateCompletions(shellType);
if (opts.install) {
installCompletions(shellType, completionScript);
} else {
console.log(completionScript);
}
});
function installCompletions(shell: Shell, script: string): void {
const home = os.homedir();
switch (shell) {
case 'bash': {
const bashrc = path.join(home, '.bashrc');
const completionMarker = '# Cortex CLI completions';
// Check if already installed
if (fs.existsSync(bashrc)) {
const content = fs.readFileSync(bashrc, 'utf-8');
if (content.includes(completionMarker)) {
console.log(chalk.yellow('Completions already installed in ~/.bashrc'));
return;
}
}
fs.appendFileSync(bashrc, `\n${completionMarker}\n${script}\n`);
console.log(chalk.green('✓ Installed bash completions to ~/.bashrc'));
console.log(chalk.dim(' Run: source ~/.bashrc'));
break;
}
case 'zsh': {
const zshCompletions = path.join(home, '.zsh', 'completions');
const targetFile = path.join(zshCompletions, '_cortex');
fs.mkdirSync(zshCompletions, { recursive: true });
fs.writeFileSync(targetFile, script);
console.log(chalk.green(`✓ Installed zsh completions to ${targetFile}`));
console.log(chalk.dim(' Add to ~/.zshrc: fpath=(~/.zsh/completions $fpath)'));
console.log(chalk.dim(' Then run: autoload -Uz compinit && compinit'));
break;
}
case 'fish': {
const fishCompletions = path.join(home, '.config', 'fish', 'completions');
const targetFile = path.join(fishCompletions, 'cortex.fish');
fs.mkdirSync(fishCompletions, { recursive: true });
fs.writeFileSync(targetFile, script);
console.log(chalk.green(`✓ Installed fish completions to ${targetFile}`));
console.log(chalk.dim(' Completions will be loaded automatically'));
break;
}
case 'powershell': {
// PowerShell profile varies by platform
const profilePaths = [
path.join(home, 'Documents', 'WindowsPowerShell', 'Microsoft.PowerShell_profile.ps1'),
path.join(home, 'Documents', 'PowerShell', 'Microsoft.PowerShell_profile.ps1'),
path.join(home, '.config', 'powershell', 'Microsoft.PowerShell_profile.ps1'),
];
let profilePath = profilePaths.find(p => fs.existsSync(path.dirname(p)));
if (!profilePath) {
profilePath = profilePaths[0];
fs.mkdirSync(path.dirname(profilePath), { recursive: true });
}
const marker = '# Cortex CLI completions';
if (fs.existsSync(profilePath)) {
const content = fs.readFileSync(profilePath, 'utf-8');
if (content.includes(marker)) {
console.log(chalk.yellow(`Completions already installed in ${profilePath}`));
return;
}
}
fs.appendFileSync(profilePath, `\n${marker}\n${script}\n`);
console.log(chalk.green(`✓ Installed PowerShell completions to ${profilePath}`));
console.log(chalk.dim(' Restart PowerShell to load completions'));
break;
}
}
}
// Hidden helper commands for dynamic completions
export const getNodesCommand = new Command('--get-nodes')
.argument('[prefix]', 'Node ID prefix to filter')
.description('Helper for shell completions')
.action(async (prefix?: string) => {
try {
const nodes = listNodes({ limit: 30, includeStale: false });
const filtered = prefix
? nodes.filter(n => n.id.startsWith(prefix) || n.title.toLowerCase().includes(prefix.toLowerCase()))
: nodes;
// Output tab-separated: id<TAB>title for completion with description
for (const node of filtered.slice(0, 20)) {
console.log(`${node.id.slice(0, 8)}\t${node.title.slice(0, 40)}`);
}
} catch {
// Silently fail for completion scripts
}
});
export const getTagsCommand = new Command('--get-tags')
.description('Helper for shell completions')
.action(async () => {
try {
const db = getDb();
const tags = db.prepare('SELECT DISTINCT tag FROM node_tags ORDER BY tag').all() as { tag: string }[];
for (const { tag } of tags.slice(0, 50)) {
console.log(tag);
}
} catch {
// Silently fail
}
});
export const getGraphsCommand = new Command('--get-graphs')
.description('Helper for shell completions')
.action(async () => {
try {
const graphs = listGraphs();
for (const graph of graphs) {
console.log(graph.name);
}
} catch {
// Silently fail
}
});
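The `--get-nodes` helper prints tab-separated `id<TAB>title` pairs so shells can offer descriptions alongside IDs. As a minimal sketch (not part of the codebase; `parseCompletionLines` is a hypothetical name), a completion wrapper could parse that output like this:

```typescript
// Hypothetical sketch: parse the tab-separated "id<TAB>title" lines
// emitted by `cortex --get-nodes` into structured entries, the way a
// completion wrapper might before offering suggestions.
function parseCompletionLines(output: string): Array<{ id: string; title: string }> {
  return output
    .split('\n')
    .filter(line => line.includes('\t'))
    .map(line => {
      const [id, ...rest] = line.split('\t');
      // Titles may themselves contain tabs; rejoin the remainder.
      return { id, title: rest.join('\t') };
    });
}
```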

src/cli/commands/diff.ts

@@ -0,0 +1,90 @@
import { Command } from 'commander';
import chalk from 'chalk';
import { findNodeByPrefix, diffVersions, getNodeAtTime, getNodeHistory } from '../../core/store';
export const diffCommand = new Command('diff')
.argument('<id>', 'Node ID (or prefix)')
.option('--v1 <n>', 'First version number')
.option('--v2 <n>', 'Second version number')
.option('--from <date>', 'Start date (ISO format or timestamp)')
.option('--to <date>', 'End date (ISO format or timestamp)')
.option('--format <fmt>', 'Output format: text or json', 'text')
.description('Compare two versions of a node')
.action(async (idRaw: string, opts) => {
const node = findNodeByPrefix(idRaw);
if (!node) {
console.error(chalk.red(`Node not found: ${idRaw}`));
process.exit(1);
}
let v1: number;
let v2: number;
if (opts.v1 && opts.v2) {
v1 = parseInt(opts.v1, 10);
v2 = parseInt(opts.v2, 10);
if (isNaN(v1) || isNaN(v2)) {
console.error(chalk.red('Version numbers must be integers.'));
process.exit(1);
}
} else if (opts.from && opts.to) {
// Parse dates
const fromTs = isNaN(Number(opts.from)) ? Date.parse(opts.from) : Number(opts.from);
const toTs = isNaN(Number(opts.to)) ? Date.parse(opts.to) : Number(opts.to);
if (isNaN(fromTs) || isNaN(toTs)) {
console.error(chalk.red('Invalid date format. Use ISO format (e.g., 2024-01-01) or timestamp.'));
process.exit(1);
}
const fromNode = getNodeAtTime(node.id, fromTs);
const toNode = getNodeAtTime(node.id, toTs);
if (!fromNode || !toNode) {
console.error(chalk.red('Could not find versions for the specified dates.'));
process.exit(1);
}
v1 = fromNode.version;
v2 = toNode.version;
} else {
// Default: compare latest two versions
const history = getNodeHistory(node.id);
if (history.length < 2) {
console.log(chalk.yellow('Not enough versions to compare.'));
return;
}
v1 = history[1].version; // Second latest
v2 = history[0].version; // Latest
}
const diff = diffVersions(node.id, v1, v2);
if (!diff) {
console.error(chalk.red('One or both versions not found.'));
process.exit(1);
}
if (opts.format === 'json') {
console.log(JSON.stringify(diff, null, 2));
return;
}
console.log(chalk.bold.cyan(`[${node.kind}] ${node.title}`));
console.log(`ID: ${node.id}`);
console.log(`Comparing: ${chalk.cyan(`v${v1}`)} -> ${chalk.cyan(`v${v2}`)}`);
console.log('');
if (diff.changes.length === 0) {
console.log(chalk.green('No changes between versions.'));
return;
}
for (const change of diff.changes) {
console.log(chalk.bold(`${change.field}:`));
const oldStr = typeof change.old === 'string' ? change.old : JSON.stringify(change.old);
const newStr = typeof change.new === 'string' ? change.new : JSON.stringify(change.new);
console.log(chalk.red(` - ${oldStr}`));
console.log(chalk.green(` + ${newStr}`));
console.log('');
}
console.log(chalk.dim(`${diff.changes.length} field(s) changed`));
});
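`diffVersions` itself lives in core/store and is not shown in this commit. As a minimal sketch under that assumption, field-level diffing between two version snapshots could look like the following; the `Change` shape mirrors what the command prints (field, old, new):

```typescript
// Hypothetical sketch of field-level diffing between two node versions.
// Values are compared via JSON serialization so arrays/objects (e.g. tags,
// metadata) are handled uniformly.
type Change = { field: string; old: unknown; new: unknown };

function diffFields(a: Record<string, unknown>, b: Record<string, unknown>): Change[] {
  const changes: Change[] = [];
  for (const field of Object.keys({ ...a, ...b })) {
    if (JSON.stringify(a[field]) !== JSON.stringify(b[field])) {
      changes.push({ field, old: a[field], new: b[field] });
    }
  }
  return changes;
}
```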

src/cli/commands/export.ts

@@ -0,0 +1,120 @@
import { Command } from 'commander';
import * as fs from 'fs';
import chalk from 'chalk';
import { exportGraph, ExportFormat, exportMarkdown, exportJsonLd } from '../../core/export';
export const exportCommand = new Command('export')
.description('Export the knowledge graph (html, svg, mermaid, markdown, jsonld)')
.argument('[rootId]', 'Root node ID for subgraph export')
.option('-f, --format <format>', 'Output format: html, svg, mermaid, markdown, jsonld', 'html')
.option('-o, --output <file>', 'Output file/directory path')
.option('-d, --depth <n>', 'Depth for subgraph export', '3')
.option('-k, --kind <kind>', 'Filter by node kind')
.option('-t, --tags <tags>', 'Filter by tags (comma-separated)')
.option('--theme <theme>', 'Theme: light or dark', 'dark')
.option('--width <n>', 'Width for SVG', '800')
.option('--height <n>', 'Height for SVG', '600')
.option('--direction <dir>', 'Mermaid direction: TD, LR, BT, RL', 'TD')
.option('--title <title>', 'Title for HTML export')
.option('--no-frontmatter', 'Skip frontmatter in markdown export')
.option('--no-wikilinks', 'Skip wikilinks in markdown export')
.action(async (rootId: string | undefined, opts) => {
try {
const format = opts.format.toLowerCase() as ExportFormat;
const validFormats = ['html', 'svg', 'mermaid', 'markdown', 'jsonld'];
if (!validFormats.includes(format)) {
console.error(chalk.red(`Invalid format: ${format}. Use: ${validFormats.join(', ')}`));
process.exit(1);
}
console.log(chalk.cyan(`Exporting graph as ${format}...`));
// Handle markdown export (outputs to directory)
if (format === 'markdown') {
const outputDir = opts.output || './exported-markdown';
const result = await exportMarkdown(outputDir, {
kind: opts.kind,
tags: opts.tags?.split(',').map((t: string) => t.trim()),
frontmatter: opts.frontmatter !== false,
wikilinks: opts.wikilinks !== false,
});
console.log(chalk.green(`✓ Exported ${result.exported} files to ${outputDir}`));
return;
}
// Handle jsonld export
if (format === 'jsonld') {
const content = await exportJsonLd({
kind: opts.kind,
tags: opts.tags?.split(',').map((t: string) => t.trim()),
pretty: true,
});
if (opts.output) {
fs.writeFileSync(opts.output, content);
console.log(chalk.green(`✓ Exported to ${opts.output}`));
const stats = fs.statSync(opts.output);
console.log(chalk.dim(` Size: ${(stats.size / 1024).toFixed(1)} KB`));
} else {
console.log(content);
}
return;
}
// Handle graph exports (html, svg, mermaid)
const content = await exportGraph({
format,
rootId,
depth: parseInt(opts.depth),
kind: opts.kind,
tags: opts.tags?.split(',').map((t: string) => t.trim()),
theme: opts.theme,
width: parseInt(opts.width),
height: parseInt(opts.height),
direction: opts.direction,
title: opts.title,
});
if (opts.output) {
fs.writeFileSync(opts.output, content);
console.log(chalk.green(`✓ Exported to ${opts.output}`));
// Show file size
const stats = fs.statSync(opts.output);
console.log(chalk.dim(` Size: ${(stats.size / 1024).toFixed(1)} KB`));
} else {
// Output to stdout
console.log(content);
}
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
// Shorthand for HTML visualization
export const vizCommand = new Command('viz')
.description('Export interactive HTML visualization')
.argument('[rootId]', 'Root node ID for subgraph')
.option('-o, --output <file>', 'Output file', 'graph.html')
.option('--theme <theme>', 'Theme: light or dark', 'dark')
.option('-d, --depth <n>', 'Depth for subgraph', '3')
.action(async (rootId: string | undefined, opts) => {
try {
console.log(chalk.cyan('Generating visualization...'));
const content = await exportGraph({
format: 'html',
rootId,
depth: parseInt(opts.depth),
theme: opts.theme,
});
fs.writeFileSync(opts.output, content);
console.log(chalk.green(`✓ Created ${opts.output}`));
console.log(chalk.dim(` Open in browser to view interactive graph`));
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
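The actual mermaid renderer behind `exportGraph({ format: 'mermaid' })` is in core/export and not shown here. As a minimal sketch under simplified node/edge shapes (an assumption, not the real types), a Mermaid flowchart emitter with a configurable direction looks like this:

```typescript
// Hypothetical sketch: render nodes and edges as a Mermaid flowchart.
// The node/edge shapes are simplified stand-ins for the real graph types.
type GraphNode = { id: string; title: string };
type GraphEdge = { from: string; to: string };

function toMermaid(nodes: GraphNode[], edges: GraphEdge[], direction = 'TD'): string {
  const lines = [`graph ${direction}`]; // TD, LR, BT, or RL
  for (const n of nodes) lines.push(`  ${n.id}["${n.title}"]`);
  for (const e of edges) lines.push(`  ${e.from} --> ${e.to}`);
  return lines.join('\n');
}
```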

src/cli/commands/graphs.ts

@@ -0,0 +1,156 @@
import { Command } from 'commander';
import chalk from 'chalk';
import {
listGraphs,
createGraph,
deleteGraph,
useGraph,
getActiveGraph,
initProject,
graphExists,
} from '../../core/graphs';
export const graphsCommand = new Command('graphs')
.description('Manage knowledge graphs')
.action(() => {
try {
const graphs = listGraphs();
const active = getActiveGraph();
if (graphs.length === 0) {
console.log(chalk.yellow('No graphs found. Create one with: cortex graphs create <name>'));
return;
}
console.log(chalk.cyan('Knowledge Graphs:\n'));
for (const graph of graphs) {
const isActive = graph.name === active;
const prefix = isActive ? chalk.green('→ ') : ' ';
const name = isActive ? chalk.bold.green(graph.name) : graph.name;
const lastAccessed = new Date(graph.lastAccessed).toLocaleDateString();
const sizeKb = (graph.size / 1024).toFixed(1);
console.log(`${prefix}${name}`);
console.log(chalk.dim(` Nodes: ${graph.nodeCount} | Edges: ${graph.edgeCount} | Size: ${sizeKb} KB`));
console.log(chalk.dim(` Last accessed: ${lastAccessed}`));
}
console.log(chalk.dim(`\n Active graph: ${active}`));
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
// Subcommand: create
graphsCommand
.command('create <name>')
.description('Create a new graph')
.action((name: string) => {
try {
const graph = createGraph(name);
console.log(chalk.green(`✓ Created graph '${name}'`));
console.log(chalk.dim(` Path: ${graph.path}`));
console.log(chalk.dim(`\nSwitch to it with: cortex use ${name}`));
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
// Subcommand: delete
graphsCommand
.command('delete <name>')
.description('Delete a graph')
.option('-y, --yes', 'Skip confirmation')
.action((name: string, opts) => {
try {
if (!graphExists(name)) {
console.error(chalk.red(`Graph '${name}' does not exist`));
process.exit(1);
}
if (!opts.yes) {
console.log(chalk.yellow(`Warning: this will permanently delete the '${name}' graph and all its data.`));
console.log(chalk.dim(`Re-run with --yes to confirm: cortex graphs delete ${name} --yes`));
return;
}
deleteGraph(name);
console.log(chalk.green(`✓ Deleted graph '${name}'`));
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
// Subcommand: info
graphsCommand
.command('info [name]')
.description('Show detailed info about a graph')
.action((name?: string) => {
try {
const graphName = name || getActiveGraph();
const graphs = listGraphs();
const graph = graphs.find(g => g.name === graphName);
if (!graph) {
console.error(chalk.red(`Graph '${graphName}' not found`));
process.exit(1);
}
console.log(chalk.cyan(`Graph: ${chalk.bold(graph.name)}\n`));
console.log(` Path: ${graph.path}`);
console.log(` Nodes: ${graph.nodeCount}`);
console.log(` Edges: ${graph.edgeCount}`);
console.log(` Size: ${(graph.size / 1024).toFixed(1)} KB`);
console.log(` Created: ${new Date(graph.createdAt).toISOString()}`);
console.log(` Last accessed: ${new Date(graph.lastAccessed).toISOString()}`);
if (graph.name === getActiveGraph()) {
console.log(chalk.green(`\n ✓ Currently active`));
}
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
// Use command (switch active graph)
export const useCommand = new Command('use')
.description('Switch to a different graph')
.argument('<name>', 'Graph name to switch to')
.action((name: string) => {
try {
if (!graphExists(name)) {
console.error(chalk.red(`Graph '${name}' does not exist`));
console.log(chalk.dim(`Create it with: cortex graphs create ${name}`));
process.exit(1);
}
useGraph(name);
console.log(chalk.green(`✓ Switched to graph '${name}'`));
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
// Init command (initialize project with graph)
export const initCommand = new Command('init')
.description('Initialize a graph for the current project')
.argument('[name]', 'Graph name (default: from git remote or directory name)')
.action((name?: string) => {
try {
const graphName = initProject(name);
console.log(chalk.green(`✓ Initialized project with graph '${graphName}'`));
console.log(chalk.dim(' Created .cortex.json'));
console.log(chalk.dim(`\nThis project will now use the '${graphName}' graph automatically.`));
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
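`initProject` and the automatic project detection live in core/graphs and are not shown in this diff. A minimal sketch of the `.cortex.json` lookup, assuming the file stores a `graph` field (the key name is a guess) and that the directory name is the fallback:

```typescript
// Hypothetical sketch of project graph detection: prefer a .cortex.json
// with a "graph" field (assumed key name), fall back to the directory name.
import * as fs from 'fs';
import * as path from 'path';

function detectGraphName(cwd: string): string {
  const cfg = path.join(cwd, '.cortex.json');
  if (fs.existsSync(cfg)) {
    const parsed = JSON.parse(fs.readFileSync(cfg, 'utf-8'));
    if (typeof parsed.graph === 'string') return parsed.graph;
  }
  return path.basename(cwd);
}
```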


@@ -0,0 +1,48 @@
import { Command } from 'commander';
import chalk from 'chalk';
import { findNodeByPrefix, getNodeHistory } from '../../core/store';
export const historyCommand = new Command('history')
.argument('<id>', 'Node ID (or prefix)')
.option('--format <fmt>', 'Output format: text or json', 'text')
.description('Show version history for a node')
.action(async (idRaw: string, opts) => {
const node = findNodeByPrefix(idRaw);
if (!node) {
console.error(chalk.red(`Node not found: ${idRaw}`));
process.exit(1);
}
const history = getNodeHistory(node.id);
if (history.length === 0) {
console.log(chalk.yellow('No version history found.'));
return;
}
if (opts.format === 'json') {
console.log(JSON.stringify({ nodeId: node.id, title: node.title, versions: history }, null, 2));
return;
}
console.log(chalk.bold.cyan(`[${node.kind}] ${node.title}`));
console.log(`ID: ${node.id}`);
console.log(chalk.bold('\nVersion History:'));
console.log('');
for (const v of history) {
const validFrom = new Date(v.validFrom).toLocaleString();
const validUntil = v.validUntil ? new Date(v.validUntil).toLocaleString() : chalk.green('current');
const createdBy = chalk.dim(`(${v.createdBy})`);
console.log(` ${chalk.cyan(`v${v.version}`)} ${createdBy}`);
console.log(` ${chalk.dim('From:')} ${validFrom}`);
console.log(` ${chalk.dim('Until:')} ${validUntil}`);
console.log(` ${chalk.dim('Title:')} ${v.title}`);
if (v.status) console.log(` ${chalk.dim('Status:')} ${v.status}`);
if (v.tags.length) console.log(` ${chalk.dim('Tags:')} ${v.tags.join(', ')}`);
console.log('');
}
console.log(chalk.dim(`${history.length} version(s)`));
});


@@ -0,0 +1,66 @@
import { Command } from 'commander';
import chalk from 'chalk';
import { importObsidian } from '../../core/import/obsidian';
import { importMarkdown } from '../../core/import/markdown';
export const importCommand = new Command('import')
.description('Import data from external sources')
.argument('<source>', 'Source type: obsidian, markdown')
.argument('<path>', 'Path to import from')
.option('-t, --tags <tags>', 'Additional tags (comma-separated)')
.option('-k, --kind <kind>', 'Node kind (default: memory)')
.option('--hierarchy', 'Create folder hierarchy (obsidian only)')
.option('--dry-run', 'Preview import without making changes')
.action(async (source: string, inputPath: string, opts) => {
try {
const tags = opts.tags?.split(',').map((t: string) => t.trim());
switch (source.toLowerCase()) {
case 'obsidian': {
console.log(chalk.cyan(`Importing Obsidian vault from ${inputPath}...`));
const result = await importObsidian(inputPath, {
kind: opts.kind,
hierarchy: opts.hierarchy,
dryRun: opts.dryRun,
});
if (opts.dryRun) {
console.log(chalk.yellow(`Dry run: would import ${result.imported} notes`));
console.log(chalk.dim(` Would create ${result.edges} edges from wikilinks`));
} else {
console.log(chalk.green(`✓ Imported ${result.imported} notes`));
if (result.edges > 0) {
console.log(chalk.dim(` Created ${result.edges} edges from wikilinks`));
}
if (result.skipped > 0) {
console.log(chalk.yellow(` Skipped ${result.skipped} files (already exist)`));
}
}
break;
}
case 'markdown':
case 'md': {
console.log(chalk.cyan(`Importing markdown files from ${inputPath}...`));
const result = await importMarkdown(inputPath, {
kind: opts.kind as any,
tags,
dryRun: opts.dryRun,
});
if (opts.dryRun) {
console.log(chalk.yellow(`Dry run: would import ${result.imported} files`));
} else {
console.log(chalk.green(`✓ Imported ${result.imported} files`));
}
break;
}
default:
console.error(chalk.red(`Unknown source type: ${source}`));
console.log(chalk.dim('Supported: obsidian, markdown'));
process.exit(1);
}
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
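The wikilink-to-edge conversion happens inside `importObsidian`, which is not shown here. Its first step has to be wikilink extraction; a minimal sketch (the regex handles `[[Target]]`, `[[Target|alias]]`, and `[[Target#heading]]`, which is an assumption about the supported forms):

```typescript
// Hypothetical sketch of wikilink extraction, the basis of the
// "wikilink -> edge" conversion the Obsidian importer performs.
// Captures the target note name, dropping |alias and #heading suffixes.
function extractWikilinks(markdown: string): string[] {
  const links: string[] = [];
  const re = /\[\[([^\]|#]+)(?:[#|][^\]]*)?\]\]/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(markdown)) !== null) {
    links.push(m[1].trim());
  }
  return links;
}
```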


@@ -0,0 +1,61 @@
import { Command } from 'commander';
import chalk from 'chalk';
import { indexProject } from '../../core/indexer';
export const indexCommand = new Command('index')
.description('Index a codebase to create component nodes')
.argument('[path]', 'Path to index', '.')
.option('--update', 'Only update changed files (incremental)')
.option('--dry-run', 'Preview what would be indexed without making changes')
.option('--depth <n>', 'Maximum directory depth', '10')
.option('--lang <language>', 'Only index specific language (ts, js, py)')
.option('--ignore <patterns>', 'Additional ignore patterns (comma-separated)')
.action(async (inputPath: string, opts) => {
const startTime = Date.now();
if (opts.dryRun) {
console.log(chalk.yellow('Dry run mode - no changes will be made\n'));
}
console.log(chalk.cyan(`Indexing ${inputPath}...`));
try {
const result = await indexProject(inputPath, {
update: opts.update,
dryRun: opts.dryRun,
maxDepth: parseInt(opts.depth),
language: opts.lang,
ignore: opts.ignore?.split(',').map((s: string) => s.trim()),
});
const elapsed = ((Date.now() - startTime) / 1000).toFixed(2);
console.log();
console.log(chalk.green(`✓ Indexed ${result.projectName} (${result.projectType})`));
console.log();
console.log(` Files scanned: ${result.files.length}`);
console.log(` Components created: ${result.componentsCreated}`);
console.log(` Components updated: ${result.componentsUpdated}`);
console.log(` Components removed: ${result.componentsRemoved}`);
console.log(` Relationships: ${result.relationshipsCreated}`);
if (result.architectureNodeId) {
console.log(` Architecture node: ${result.architectureNodeId.slice(0, 8)}`);
}
console.log();
console.log(chalk.dim(`Completed in ${elapsed}s`));
if (opts.dryRun && result.files.length > 0) {
console.log();
console.log(chalk.yellow('Files that would be indexed:'));
for (const file of result.files.slice(0, 20)) {
console.log(` ${file}`);
}
if (result.files.length > 20) {
console.log(chalk.dim(` ... and ${result.files.length - 20} more`));
}
}
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
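The `--update` flag implies the indexer can tell changed files from unchanged ones. One common approach (an assumption here, not necessarily what `indexProject` does) is to compare a content hash against the one stored on the component node:

```typescript
// Hypothetical sketch of an incremental-indexing check: re-index a file
// only when its content hash differs from the previously stored hash.
import { createHash } from 'crypto';

function needsReindex(content: string, storedHash?: string): { changed: boolean; hash: string } {
  const hash = createHash('sha256').update(content).digest('hex');
  return { changed: hash !== storedHash, hash };
}
```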


@@ -0,0 +1,95 @@
import { Command } from 'commander';
import chalk from 'chalk';
import { ingest } from '../../core/ingest';
export const ingestCommand = new Command('ingest')
.description('Ingest content from URLs, files, or stdin into the knowledge graph')
.argument('[source]', 'URL or file path to ingest')
.option('-t, --title <title>', 'Override title')
.option('--tags <tags>', 'Tags to apply (comma-separated)')
.option('--stdin', 'Read content from stdin')
.option('--chunk-size <n>', 'Max tokens per chunk (default: 1000)')
.option('--no-link', 'Skip auto-linking to related nodes')
.action(async (source: string | undefined, opts) => {
try {
if (!source && !opts.stdin) {
console.error(chalk.red('Error: Provide a source URL/file or use --stdin'));
process.exit(1);
}
if (opts.stdin) {
console.log(chalk.cyan('Reading from stdin... (Ctrl+D to end)'));
} else {
console.log(chalk.cyan(`Ingesting: ${source}`));
}
const result = await ingest(source || '', {
title: opts.title,
tags: opts.tags?.split(',').map((t: string) => t.trim()),
stdin: opts.stdin,
noLink: !opts.link,
chunkStrategy: opts.chunkSize ? {
maxTokens: parseInt(opts.chunkSize),
} : undefined,
});
if (!result.success) {
console.log(chalk.yellow('Content already exists (duplicate checksum)'));
return;
}
console.log();
console.log(chalk.green(`✓ Ingested: ${result.title}`));
console.log();
console.log(` Type: ${result.sourceType}`);
console.log(` Nodes: ${result.nodeCount}`);
if (result.parentId) {
console.log(` Parent: ${result.parentId.slice(0, 8)}`);
}
for (const node of result.nodes.slice(0, 5)) {
console.log(chalk.dim(` - ${node.id.slice(0, 8)} ${node.title}`));
}
if (result.nodes.length > 5) {
console.log(chalk.dim(` ... and ${result.nodes.length - 5} more`));
}
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
// Alias for quick URL clipping
export const clipCommand = new Command('clip')
.description('Quick clip a URL (alias for ingest)')
.argument('<url>', 'URL to clip')
.option('-t, --title <title>', 'Override title')
.option('--tags <tags>', 'Tags to apply (comma-separated)')
.action(async (url: string, opts) => {
try {
if (!url.startsWith('http://') && !url.startsWith('https://')) {
console.error(chalk.red('Error: clip expects a URL'));
process.exit(1);
}
console.log(chalk.cyan(`Clipping: ${url}`));
const result = await ingest(url, {
title: opts.title,
tags: opts.tags?.split(',').map((t: string) => t.trim()),
});
if (!result.success) {
console.log(chalk.yellow('Already clipped (duplicate)'));
return;
}
console.log(chalk.green(`${result.title}`));
console.log(chalk.dim(` ${result.nodes[0].id.slice(0, 8)}`));
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
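`--chunk-size` feeds a `chunkStrategy.maxTokens` value into `ingest`, whose chunker is not shown. As a rough sketch, with whitespace-delimited words standing in for whatever tokenizer the real implementation uses:

```typescript
// Hypothetical sketch of a chunking strategy: split text into chunks of
// at most maxTokens whitespace-delimited tokens. A real tokenizer would
// count subword tokens, not words; this is a simplification.
function chunkText(text: string, maxTokens: number): string[] {
  const words = text.split(/\s+/).filter(Boolean);
  const chunks: string[] = [];
  for (let i = 0; i < words.length; i += maxTokens) {
    chunks.push(words.slice(i, i + maxTokens).join(' '));
  }
  return chunks;
}
```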

src/cli/commands/journal.ts

@@ -0,0 +1,160 @@
import { Command } from 'commander';
import chalk from 'chalk';
import {
getOrCreateJournal,
appendToJournal,
listJournals,
searchJournals,
generateJournalSummary,
getJournalByDate,
formatDate,
getDateFromOffset,
JournalMetadata,
} from '../../core/journal';
export const journalCommand = new Command('journal')
.description('Daily journal - quick capture and daily notes')
.argument('[text...]', 'Text to add to today\'s journal')
.option('-d, --date <date>', 'Specific date (YYYY-MM-DD)')
.option('--yesterday', 'Yesterday\'s journal')
.option('-l, --list', 'List recent journals')
.option('--month <month>', 'Filter by month (YYYY-MM)')
.option('-s, --search <query>', 'Search journals')
.option('--summarize', 'Generate AI summary')
.option('-t, --tags <tags>', 'Tags for the entry (comma-separated)')
.action(async (textParts: string[], opts) => {
try {
// Determine target date
let targetDate: string | undefined;
if (opts.date) {
targetDate = opts.date;
} else if (opts.yesterday) {
targetDate = getDateFromOffset('yesterday');
}
// List journals
if (opts.list) {
const journals = listJournals({ month: opts.month, limit: 20 });
if (journals.length === 0) {
console.log(chalk.yellow('No journals found.'));
return;
}
console.log(chalk.cyan('Recent Journals:\n'));
for (const j of journals) {
const meta = j.metadata as JournalMetadata;
const entryCount = meta.entries?.length || 0;
const summaryIndicator = meta.summary ? ' [summarized]' : '';
console.log(` ${chalk.white(meta.date)} ${chalk.dim(`${entryCount} entries`)}${chalk.green(summaryIndicator)}`);
}
return;
}
// Search journals
if (opts.search) {
const results = await searchJournals(opts.search);
if (results.length === 0) {
console.log(chalk.yellow('No matching journals found.'));
return;
}
console.log(chalk.cyan(`Found ${results.length} journals:\n`));
for (const j of results) {
const meta = j.metadata as JournalMetadata;
console.log(chalk.white(`${meta.date}:`));
// Show matching entries
const entries = meta.entries?.filter(e =>
e.text.toLowerCase().includes(opts.search.toLowerCase())
) || [];
for (const e of entries.slice(0, 3)) {
console.log(chalk.dim(` ${e.time}`) + ` ${e.text}`);
}
if (entries.length > 3) {
console.log(chalk.dim(` ... and ${entries.length - 3} more`));
}
console.log();
}
return;
}
// Generate summary
if (opts.summarize) {
console.log(chalk.cyan('Generating summary...'));
const summary = await generateJournalSummary(targetDate);
if (!summary) {
console.log(chalk.yellow('No entries to summarize.'));
return;
}
console.log();
console.log(chalk.green('Summary:'));
console.log(summary);
return;
}
// Add entry or view journal
const text = textParts.join(' ').trim();
if (text) {
// Add entry
const tags = opts.tags?.split(',').map((t: string) => t.trim());
const { journal, entry } = await appendToJournal(text, { tags, date: targetDate });
const meta = journal.metadata as JournalMetadata;
console.log(chalk.green(`✓ Added to ${meta.date}`));
console.log(chalk.dim(` ${entry.time}`) + ` ${text}`);
} else {
// View journal
const journal = await getOrCreateJournal(targetDate);
const meta = journal.metadata as JournalMetadata;
console.log(chalk.cyan(`Journal: ${meta.date}`));
console.log(chalk.dim(`ID: ${journal.id.slice(0, 8)}`));
console.log();
if (!meta.entries?.length) {
console.log(chalk.dim('No entries yet. Add one with: cortex journal "your text"'));
} else {
for (const entry of meta.entries) {
const tagStr = entry.tags?.length ? chalk.blue(` [${entry.tags.join(', ')}]`) : '';
console.log(chalk.dim(`${entry.time}`) + tagStr + ` ${entry.text}`);
}
}
if (meta.summary) {
console.log();
console.log(chalk.green('Summary:'));
console.log(meta.summary);
}
}
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
// Alias: cortex j
export const journalAliasCommand = new Command('j')
.description('Alias for journal command')
.argument('[text...]', 'Text to add')
.option('-t, --tags <tags>', 'Tags (comma-separated)')
.action(async (textParts: string[], opts) => {
const text = textParts.join(' ').trim();
if (!text) {
const journal = await getOrCreateJournal();
const meta = journal.metadata as JournalMetadata;
console.log(chalk.cyan(`Journal: ${meta.date} (${meta.entries?.length || 0} entries)`));
return;
}
const tags = opts.tags?.split(',').map((t: string) => t.trim());
const { journal, entry } = await appendToJournal(text, { tags });
const meta = journal.metadata as JournalMetadata;
console.log(chalk.green(`${meta.date} ${entry.time}`) + ` ${text}`);
});
// Quick capture: cortex c
export const quickCaptureCommand = new Command('c')
.description('Quick capture to today\'s journal')
.argument('<text...>', 'Text to capture')
.option('-t, --tags <tags>', 'Tags (comma-separated)')
.action(async (textParts: string[], opts) => {
const text = textParts.join(' ').trim();
const tags = opts.tags?.split(',').map((t: string) => t.trim());
const { entry } = await appendToJournal(text, { tags });
console.log(chalk.green(`${entry.time}`) + ` ${text}`);
});
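`getDateFromOffset` is imported from core/journal but not shown in this diff. A minimal UTC-based sketch of resolving an offset like `'yesterday'` to the `YYYY-MM-DD` keys the journal uses (UTC is an assumption; the real code may use local time):

```typescript
// Hypothetical sketch: resolve a named offset to a YYYY-MM-DD date string.
// Uses UTC arithmetic so the result is deterministic across timezones.
function dateFromOffset(offset: 'today' | 'yesterday', now = new Date()): string {
  const d = new Date(now);
  if (offset === 'yesterday') d.setUTCDate(d.getUTCDate() - 1);
  return d.toISOString().slice(0, 10);
}
```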


@@ -2,14 +2,72 @@ import { Command } from 'commander';
import chalk from 'chalk';
import { query } from '../../core/store';
import { NodeKind } from '../../types';
import { useGraph, getActiveGraph, listGraphs, graphExists } from '../../core/graphs';
import { closeDb, getDbForGraph } from '../../core/db';
export const queryCommand = new Command('query')
.argument('<text>', 'Natural language search query')
.option('--kind <kind>', 'Filter by node kind')
.option('--limit <n>', 'Max results', '10')
.option('--format <fmt>', 'Output format: text or json', 'text')
.option('--graph <name>', 'Query specific graph')
.option('--all-graphs', 'Search across all graphs')
.description('Search the knowledge graph')
.action(async (text: string, opts) => {
// Handle specific graph
if (opts.graph) {
if (!graphExists(opts.graph)) {
console.error(chalk.red(`Graph '${opts.graph}' does not exist`));
process.exit(1);
}
useGraph(opts.graph);
}
// Handle all-graphs search
if (opts.allGraphs) {
const originalGraph = getActiveGraph();
const allResults: Array<{ graph: string; node: any; score: number }> = [];
const graphs = listGraphs();
for (const graph of graphs) {
useGraph(graph.name);
closeDb(); // Force reconnect to the newly selected graph
const results = await query(text, {
kind: opts.kind as NodeKind | undefined,
limit: parseInt(opts.limit),
});
for (const r of results) {
allResults.push({ graph: graph.name, ...r });
}
}
// Restore the graph that was active before the cross-graph search
useGraph(originalGraph);
closeDb();
// Sort by score and limit
allResults.sort((a, b) => b.score - a.score);
const limited = allResults.slice(0, parseInt(opts.limit));
if (limited.length === 0) {
console.log(chalk.yellow('No results found across any graphs.'));
return;
}
if (opts.format === 'json') {
console.log(JSON.stringify(limited.map(r => ({
graph: r.graph,
...r.node,
score: r.score,
embedding: undefined,
})), null, 2));
return;
}
for (const r of limited) {
const n = r.node;
console.log(`${chalk.cyan(n.id.slice(0, 8))} [${chalk.blue(r.graph)}] [${chalk.magenta(n.kind)}] ${chalk.bold(n.title)} ${chalk.dim(`(${r.score.toFixed(3)})`)}`);
if (n.content) console.log(` ${chalk.dim(n.content.slice(0, 120))}`);
if (n.tags.length) console.log(` ${chalk.yellow(n.tags.join(', '))}`);
}
return;
}
const results = await query(text, {
kind: opts.kind as NodeKind | undefined,
limit: parseInt(opts.limit),


@@ -0,0 +1,65 @@
import { Command } from 'commander';
import chalk from 'chalk';
import { findNodeByPrefix, restoreVersion, getNodeHistory, getCurrentVersion } from '../../core/store';
export const restoreCommand = new Command('restore')
.argument('<id>', 'Node ID (or prefix)')
.option('-v, --to-version <n>', 'Version number to restore')
.option('--format <fmt>', 'Output format: text or json', 'text')
.description('Restore a node to a previous version (creates new version)')
.action(async (idRaw: string, opts) => {
const node = findNodeByPrefix(idRaw);
if (!node) {
console.error(chalk.red(`Node not found: ${idRaw}`));
process.exit(1);
}
if (!opts.toVersion) {
// Show available versions and ask user to specify
const history = getNodeHistory(node.id);
console.log(chalk.bold.cyan(`[${node.kind}] ${node.title}`));
console.log(`ID: ${node.id}`);
console.log(`Current version: ${chalk.cyan(`v${getCurrentVersion(node.id)}`)}`);
console.log('');
console.log(chalk.bold('Available versions:'));
for (const v of history) {
const validFrom = new Date(v.validFrom).toLocaleString();
const current = v.validUntil === null ? chalk.green(' (current)') : '';
console.log(` ${chalk.cyan(`v${v.version}`)} - ${validFrom} - ${v.title}${current}`);
}
console.log('');
console.log(chalk.yellow('Use --to-version <n> or -v <n> to restore to a specific version.'));
return;
}
const targetVersion = parseInt(opts.toVersion);
const currentVersion = getCurrentVersion(node.id);
if (targetVersion === currentVersion) {
console.log(chalk.yellow('Cannot restore to the current version.'));
return;
}
const restored = await restoreVersion(node.id, targetVersion, 'restore');
if (!restored) {
console.error(chalk.red(`Version ${targetVersion} not found.`));
process.exit(1);
}
if (opts.format === 'json') {
console.log(JSON.stringify({ message: `Restored to version ${targetVersion}`, node: { ...restored, embedding: undefined } }, null, 2));
return;
}
console.log(chalk.green(`Restored node to version ${targetVersion}`));
console.log(`New version: ${chalk.cyan(`v${getCurrentVersion(node.id)}`)}`);
console.log('');
console.log(chalk.bold.cyan(`[${restored.kind}] ${restored.title}`));
console.log(`ID: ${restored.id}`);
if (restored.status) console.log(`Status: ${restored.status}`);
if (restored.tags.length) console.log(`Tags: ${restored.tags.join(', ')}`);
if (restored.content) {
console.log('');
console.log(restored.content);
}
});


@@ -1,11 +1,12 @@
import { Command } from 'commander';
import chalk from 'chalk';
import { findNodeByPrefix } from '../../core/store';
import { findNodeByPrefix, getNodeAtTime, getCurrentVersion } from '../../core/store';
import { getConnections } from '../../core/graph';
export const showCommand = new Command('show')
.argument('<id>', 'Node ID (or prefix)')
.option('--format <fmt>', 'Output format: text or json', 'text')
.option('--at <timestamp>', 'Show node at a specific point in time (ISO date or timestamp)')
.description('Show a node and its connections')
.action(async (idRaw: string, opts) => {
const node = findNodeByPrefix(idRaw);
@@ -14,15 +15,56 @@ export const showCommand = new Command('show')
process.exit(1);
}
// If --at is specified, show historical state
if (opts.at) {
const ts = isNaN(Number(opts.at)) ? Date.parse(opts.at) : Number(opts.at);
if (isNaN(ts)) {
console.error(chalk.red('Invalid timestamp format. Use ISO date (e.g., 2024-01-01) or Unix timestamp.'));
process.exit(1);
}
const historical = getNodeAtTime(node.id, ts);
if (!historical) {
console.error(chalk.red('No version found for the specified time.'));
process.exit(1);
}
if (opts.format === 'json') {
console.log(JSON.stringify(historical, null, 2));
return;
}
console.log(chalk.dim(`Viewing historical state at: ${new Date(ts).toLocaleString()}`));
console.log('');
console.log(chalk.bold.cyan(`[${historical.kind}] ${historical.title}`));
console.log(`ID: ${historical.id}`);
console.log(`Version: v${historical.version}`);
if (historical.status) console.log(`Status: ${historical.status}`);
if (historical.tags.length) console.log(`Tags: ${historical.tags.join(', ')}`);
console.log(`Valid: ${new Date(historical.validFrom).toLocaleString()} - ${historical.validUntil ? new Date(historical.validUntil).toLocaleString() : 'current'}`);
if (historical.content) console.log(`\n${historical.content}`);
// Render structured sections
if (historical.metadata?.sections && Array.isArray(historical.metadata.sections)) {
for (const sec of historical.metadata.sections) {
console.log(`\n${chalk.bold(`-- ${sec.label} --`)}`);
if (sec.body) console.log(sec.body);
}
}
return;
}
const conns = getConnections(node.id);
const version = getCurrentVersion(node.id);
if (opts.format === 'json') {
console.log(JSON.stringify({ ...node, embedding: undefined, connections: conns }, null, 2));
console.log(JSON.stringify({ ...node, embedding: undefined, version, connections: conns }, null, 2));
return;
}
console.log(chalk.bold.cyan(`[${node.kind}] ${node.title}`));
console.log(`ID: ${node.id}`);
console.log(`Version: v${version}`);
if (node.status) console.log(`Status: ${node.status}`);
if (node.tags.length) console.log(`Tags: ${node.tags.join(', ')}`);
console.log(`Created: ${new Date(node.createdAt).toLocaleString()}`);
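The `--at` parser above accepts either a numeric Unix timestamp or an ISO date string, falling back to `Date.parse` for the latter. Pulled out of the command action, the same fallback looks like this (a sketch of the logic above, not part of the diff):

```typescript
// Parse "--at" input: purely numeric strings are treated as Unix
// timestamps (ms); anything else goes through Date.parse.
// NaN signals an invalid input, which the caller turns into an error.
function parseAtTimestamp(input: string): number {
  return isNaN(Number(input)) ? Date.parse(input) : Number(input);
}
```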

src/cli/commands/smart.ts Normal file

@@ -0,0 +1,166 @@
import { Command } from 'commander';
import chalk from 'chalk';
import { smartSearch, gatherWhatContext, formatWhatContext } from '../../core/search/smart';
import { NodeKind } from '../../types';
export const smartSearchCommand = new Command('smart-search')
.description('Context-aware search using git and file signals')
.argument('[query]', 'Optional explicit search query')
.option('--kind <kind>', 'Filter by node kind')
.option('--limit <n>', 'Max results', '10')
.option('--expand', 'Include related nodes')
.option('--format <fmt>', 'Output format: text or json', 'text')
.action(async (queryText: string | undefined, opts) => {
try {
const results = await smartSearch(queryText, {
kind: opts.kind as NodeKind | undefined,
limit: parseInt(opts.limit),
includeRelated: opts.expand,
});
if (results.length === 0) {
console.log(chalk.yellow('No relevant results found.'));
console.log(chalk.dim('Try adding more context or using a specific query.'));
return;
}
if (opts.format === 'json') {
console.log(JSON.stringify(results.map(r => ({
...r.node,
embedding: undefined,
score: r.score,
originalScore: r.originalScore,
boosts: r.boosts,
})), null, 2));
return;
}
console.log(chalk.cyan(`Found ${results.length} relevant results:\n`));
for (const r of results) {
const n = r.node;
const boostInfo = Object.entries(r.boosts)
.filter(([, v]) => v !== undefined)
.map(([k, v]) => `${k}:${(v as number).toFixed(2)}`)
.join(' ');
console.log(`${chalk.cyan(n.id.slice(0, 8))} [${chalk.magenta(n.kind)}] ${chalk.bold(n.title)}`);
console.log(chalk.dim(` Score: ${r.score.toFixed(3)} (base: ${r.originalScore.toFixed(3)}) ${boostInfo ? `[${boostInfo}]` : ''}`));
if (n.content) {
const preview = n.content.slice(0, 100).replace(/\n/g, ' ');
console.log(chalk.dim(` ${preview}${n.content.length > 100 ? '...' : ''}`));
}
if (n.tags.length) {
console.log(` ${chalk.yellow(n.tags.join(', '))}`);
}
console.log();
}
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
// Alias
export const ssCommand = new Command('ss')
.description('Alias for smart-search')
.argument('[query]', 'Optional search query')
.option('--kind <kind>', 'Filter by node kind')
.option('--limit <n>', 'Max results', '10')
.option('--expand', 'Include related nodes')
.option('--format <fmt>', 'Output format: text or json', 'text')
.action(async (queryText: string | undefined, opts) => {
const results = await smartSearch(queryText, {
kind: opts.kind as NodeKind | undefined,
limit: parseInt(opts.limit),
includeRelated: opts.expand,
});
if (results.length === 0) {
console.log(chalk.yellow('No relevant results found.'));
return;
}
if (opts.format === 'json') {
console.log(JSON.stringify(results.map(r => ({
...r.node,
embedding: undefined,
score: r.score,
})), null, 2));
return;
}
for (const r of results) {
const n = r.node;
console.log(`${chalk.cyan(n.id.slice(0, 8))} [${chalk.magenta(n.kind)}] ${chalk.bold(n.title)} ${chalk.dim(`(${r.score.toFixed(3)})`)}`);
}
});
export const whatCommand = new Command('what')
.description('What should I know right now? Shows relevant context.')
.option('--format <fmt>', 'Output format: text or json', 'text')
.action(async (opts) => {
try {
const context = await gatherWhatContext();
if (opts.format === 'json') {
console.log(JSON.stringify({
gitContext: {
branch: context.gitContext.branch,
modifiedFiles: context.gitContext.modifiedFiles.length,
stagedFiles: context.gitContext.stagedFiles.length,
isGitRepo: context.gitContext.isGitRepo,
},
projectName: context.fileContext.projectName,
branchRelated: context.branchRelated.map(n => ({ id: n.id, title: n.title, kind: n.kind })),
fileRelated: context.fileRelated.map(n => ({ id: n.id, title: n.title, kind: n.kind })),
tasks: context.tasks.map(t => ({ id: t.id, title: t.title, status: t.status })),
decisions: context.decisions.map(d => ({ id: d.id, title: d.title })),
recentMemories: context.recentMemories.map(m => ({ id: m.id, title: m.title })),
}, null, 2));
return;
}
const formatted = formatWhatContext(context);
if (!formatted.trim()) {
console.log(chalk.yellow('No relevant context found.'));
console.log(chalk.dim('Add some memories or open tasks to see context here.'));
return;
}
console.log(chalk.bold.cyan('\n📚 What you should know:\n'));
console.log(formatted);
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
// Alias of `what`; a `context` command already exists, so this one is named `now`
export const contextAwareCommand = new Command('now')
.description('Show current context (alias for what)')
.option('--format <fmt>', 'Output format: text or json', 'text')
.action(async (opts) => {
const context = await gatherWhatContext();
if (opts.format === 'json') {
console.log(JSON.stringify({
projectName: context.fileContext.projectName,
branch: context.gitContext.branch,
tasks: context.tasks.length,
decisions: context.decisions.length,
}, null, 2));
return;
}
const formatted = formatWhatContext(context);
if (!formatted.trim()) {
console.log(chalk.yellow('No relevant context found.'));
return;
}
console.log(chalk.bold.cyan('\n📚 Current context:\n'));
console.log(formatted);
});
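The exact boost combination lives in `core/search/smart`, but the output format above (a final `score`, an `originalScore`, and named per-signal `boosts`) suggests an additive re-rank along these lines. This is a hypothetical sketch of that shape, not the actual implementation:

```typescript
// Hypothetical boost record: each signal contributes an optional delta.
interface Boosts { time?: number; file?: number; branch?: number; }

// Assumed combination: final score = base score + sum of defined boosts.
function rerank(originalScore: number, boosts: Boosts): number {
  const total = Object.values(boosts)
    .filter((v): v is number => v !== undefined)
    .reduce((sum, v) => sum + v, 0);
  return originalScore + total;
}
```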

src/cli/commands/tui.ts Normal file

@@ -0,0 +1,38 @@
import { Command } from 'commander';
import chalk from 'chalk';
export const tuiCommand = new Command('tui')
.description('Launch interactive terminal user interface')
.action(async () => {
try {
// Check if running in a TTY
if (!process.stdin.isTTY) {
console.error(chalk.red('Error: TUI requires an interactive terminal'));
process.exit(1);
}
const { launchTui } = await import('../../tui');
await launchTui();
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});
// Alias
export const uiCommand = new Command('ui')
.description('Alias for tui')
.action(async () => {
try {
if (!process.stdin.isTTY) {
console.error(chalk.red('Error: UI requires an interactive terminal'));
process.exit(1);
}
const { launchTui } = await import('../../tui');
await launchTui();
} catch (err: any) {
console.error(chalk.red(`Error: ${err.message}`));
process.exit(1);
}
});


@@ -0,0 +1,397 @@
/**
* Shell completion generators for Cortex CLI
*/
export type Shell = 'bash' | 'zsh' | 'fish' | 'powershell';
const COMMANDS = [
'add', 'query', 'show', 'list', 'update', 'remove', 'link', 'graph',
'children', 'decay', 'serve', 'history', 'diff', 'restore', 'capture',
'context', 'config', 'index', 'journal', 'ingest', 'clip', 'export',
'viz', 'import', 'backup', 'restore-backup', 'list-backups', 'graphs',
'use', 'init', 'smart-search', 'ss', 'what', 'now', 'tui', 'ui',
];
const KINDS = ['memory', 'component', 'task', 'decision'];
const STATUSES = ['active', 'todo', 'in_progress', 'done', 'deprecated'];
const EDGE_TYPES = ['depends_on', 'contains', 'implements', 'blocked_by', 'subtask_of', 'relates_to', 'supersedes', 'about'];
export function generateCompletions(shell: Shell): string {
switch (shell) {
case 'bash':
return generateBashCompletions();
case 'zsh':
return generateZshCompletions();
case 'fish':
return generateFishCompletions();
case 'powershell':
return generatePowerShellCompletions();
}
}
function generateBashCompletions(): string {
return `# Cortex CLI completions for Bash
# Add to ~/.bashrc or /etc/bash_completion.d/cortex
_cortex_completions() {
local cur prev words cword
_init_completion -n : || return
local commands="${COMMANDS.join(' ')}"
local kinds="${KINDS.join(' ')}"
local statuses="${STATUSES.join(' ')}"
local edge_types="${EDGE_TYPES.join(' ')}"
case "\${prev}" in
cortex)
COMPREPLY=($(compgen -W "\${commands}" -- "\${cur}"))
return 0
;;
add)
COMPREPLY=($(compgen -W "\${kinds}" -- "\${cur}"))
return 0
;;
--kind|-k)
COMPREPLY=($(compgen -W "\${kinds}" -- "\${cur}"))
return 0
;;
--status|-s)
COMPREPLY=($(compgen -W "\${statuses}" -- "\${cur}"))
return 0
;;
--type)
COMPREPLY=($(compgen -W "\${edge_types}" -- "\${cur}"))
return 0
;;
--tags|-t)
local tags=$(cortex --get-tags 2>/dev/null)
COMPREPLY=($(compgen -W "\${tags}" -- "\${cur}"))
return 0
;;
--graph)
local graphs=$(cortex --get-graphs 2>/dev/null)
COMPREPLY=($(compgen -W "\${graphs}" -- "\${cur}"))
return 0
;;
show|update|remove|children|history|diff|restore)
local nodes=$(cortex --get-nodes "\${cur}" 2>/dev/null)
COMPREPLY=($(compgen -W "\${nodes}" -- "\${cur}"))
return 0
;;
link)
local nodes=$(cortex --get-nodes "\${cur}" 2>/dev/null)
COMPREPLY=($(compgen -W "\${nodes}" -- "\${cur}"))
return 0
;;
use)
local graphs=$(cortex --get-graphs 2>/dev/null)
COMPREPLY=($(compgen -W "\${graphs}" -- "\${cur}"))
return 0
;;
import)
COMPREPLY=($(compgen -W "obsidian markdown" -- "\${cur}"))
return 0
;;
export|--format|-f)
COMPREPLY=($(compgen -W "html svg mermaid markdown jsonld" -- "\${cur}"))
return 0
;;
esac
if [[ "\${cur}" == -* ]]; then
local opts="--help --version --kind --status --tags --limit --format --output --graph --all-graphs"
COMPREPLY=($(compgen -W "\${opts}" -- "\${cur}"))
return 0
fi
}
complete -F _cortex_completions cortex
`;
}
function generateZshCompletions(): string {
return `#compdef cortex
# Cortex CLI completions for Zsh
# Add to ~/.zsh/completions/_cortex
_cortex() {
local -a commands
commands=(
'add:Add a node to the knowledge graph'
'query:Search the knowledge graph'
'show:Show a node and its connections'
'list:List nodes'
'update:Update a node'
'remove:Remove a node'
'link:Create a link between nodes'
'graph:Display graph structure'
'children:List child nodes'
'decay:Mark old nodes as stale'
'serve:Start web server'
'history:View node version history'
'diff:Compare node versions'
'restore:Restore node to previous version'
'capture:Capture conversation as memory'
'context:Show context for session'
'config:Configure capture settings'
'index:Index project codebase'
'journal:Daily journal'
'ingest:Ingest content from URL'
'clip:Quick clip a URL'
'export:Export graph as HTML/SVG/Mermaid'
'viz:Export interactive visualization'
'import:Import from Obsidian/Markdown'
'backup:Create database backup'
'restore-backup:Restore from backup'
'list-backups:List backup files'
'graphs:Manage knowledge graphs'
'use:Switch active graph'
'init:Initialize project with graph'
'smart-search:Context-aware search'
'ss:Alias for smart-search'
'what:What should I know right now?'
'now:Show current context'
'tui:Launch interactive TUI'
'ui:Alias for tui'
)
local -a kinds=(memory component task decision)
local -a statuses=(active todo in_progress done deprecated)
local -a edge_types=(depends_on contains implements blocked_by subtask_of relates_to supersedes about)
_arguments -C \\
'1:command:->command' \\
'*::arg:->args'
case "$state" in
command)
_describe -t commands 'cortex commands' commands
;;
args)
case "$words[1]" in
add)
_arguments \\
'1:kind:(memory component task decision)' \\
'--title[Node title]:title:' \\
'--content[Node content]:content:' \\
'--tags[Tags]:tags:->tags' \\
'--status[Status]:status:(${STATUSES.join(' ')})'
;;
show|update|remove|children|history)
_arguments '1:node:->nodes'
;;
link)
_arguments \\
'1:from:->nodes' \\
'2:to:->nodes' \\
'--type[Edge type]:type:(${EDGE_TYPES.join(' ')})'
;;
list|query)
_arguments \\
'--kind[Filter by kind]:kind:(${KINDS.join(' ')})' \\
'--status[Filter by status]:status:(${STATUSES.join(' ')})' \\
'--tags[Filter by tags]:tags:->tags' \\
'--limit[Max results]:limit:' \\
'--graph[Query specific graph]:graph:->graphs' \\
'--all-graphs[Search all graphs]'
;;
use)
_arguments '1:graph:->graphs'
;;
import)
_arguments \\
'1:source:(obsidian markdown)' \\
'2:path:_files -/'
;;
export)
_arguments \\
'--format[Export format]:format:(html svg mermaid markdown jsonld)' \\
'--output[Output file]:file:_files' \\
'--kind[Filter by kind]:kind:(${KINDS.join(' ')})'
;;
esac
;;
esac
case "$state" in
nodes)
local -a nodes
nodes=(\${(f)"$(cortex --get-nodes 2>/dev/null)"})
_describe -t nodes 'nodes' nodes
;;
tags)
local -a tags
tags=(\${(f)"$(cortex --get-tags 2>/dev/null)"})
_describe -t tags 'tags' tags
;;
graphs)
local -a graphs
graphs=(\${(f)"$(cortex --get-graphs 2>/dev/null)"})
_describe -t graphs 'graphs' graphs
;;
esac
}
_cortex
`;
}
function generateFishCompletions(): string {
return `# Cortex CLI completions for Fish
# Add to ~/.config/fish/completions/cortex.fish
# Disable file completions for cortex
complete -c cortex -f
# Main commands
complete -c cortex -n __fish_use_subcommand -a add -d 'Add a node'
complete -c cortex -n __fish_use_subcommand -a query -d 'Search the graph'
complete -c cortex -n __fish_use_subcommand -a show -d 'Show a node'
complete -c cortex -n __fish_use_subcommand -a list -d 'List nodes'
complete -c cortex -n __fish_use_subcommand -a update -d 'Update a node'
complete -c cortex -n __fish_use_subcommand -a remove -d 'Remove a node'
complete -c cortex -n __fish_use_subcommand -a link -d 'Link nodes'
complete -c cortex -n __fish_use_subcommand -a graph -d 'Display graph'
complete -c cortex -n __fish_use_subcommand -a children -d 'List children'
complete -c cortex -n __fish_use_subcommand -a decay -d 'Mark stale nodes'
complete -c cortex -n __fish_use_subcommand -a serve -d 'Start server'
complete -c cortex -n __fish_use_subcommand -a history -d 'View history'
complete -c cortex -n __fish_use_subcommand -a diff -d 'Compare versions'
complete -c cortex -n __fish_use_subcommand -a restore -d 'Restore version'
complete -c cortex -n __fish_use_subcommand -a capture -d 'Capture memory'
complete -c cortex -n __fish_use_subcommand -a context -d 'Show context'
complete -c cortex -n __fish_use_subcommand -a config -d 'Configure'
complete -c cortex -n __fish_use_subcommand -a index -d 'Index project'
complete -c cortex -n __fish_use_subcommand -a journal -d 'Daily journal'
complete -c cortex -n __fish_use_subcommand -a ingest -d 'Ingest URL'
complete -c cortex -n __fish_use_subcommand -a clip -d 'Clip URL'
complete -c cortex -n __fish_use_subcommand -a export -d 'Export graph'
complete -c cortex -n __fish_use_subcommand -a viz -d 'Visualize'
complete -c cortex -n __fish_use_subcommand -a import -d 'Import data'
complete -c cortex -n __fish_use_subcommand -a backup -d 'Backup database'
complete -c cortex -n __fish_use_subcommand -a restore-backup -d 'Restore backup'
complete -c cortex -n __fish_use_subcommand -a list-backups -d 'List backups'
complete -c cortex -n __fish_use_subcommand -a graphs -d 'Manage graphs'
complete -c cortex -n __fish_use_subcommand -a use -d 'Switch graph'
complete -c cortex -n __fish_use_subcommand -a init -d 'Init project'
complete -c cortex -n __fish_use_subcommand -a smart-search -d 'Smart search'
complete -c cortex -n __fish_use_subcommand -a ss -d 'Smart search alias'
complete -c cortex -n __fish_use_subcommand -a what -d 'What should I know?'
complete -c cortex -n __fish_use_subcommand -a now -d 'Current context'
complete -c cortex -n __fish_use_subcommand -a tui -d 'Interactive TUI'
complete -c cortex -n __fish_use_subcommand -a ui -d 'TUI alias'
# Kind completions
complete -c cortex -n '__fish_seen_subcommand_from add' -a 'memory component task decision'
# Node ID completions
complete -c cortex -n '__fish_seen_subcommand_from show update remove children history' -a '(cortex --get-nodes 2>/dev/null)'
# Graph completions
complete -c cortex -n '__fish_seen_subcommand_from use' -a '(cortex --get-graphs 2>/dev/null)'
# Import source completions
complete -c cortex -n '__fish_seen_subcommand_from import' -a 'obsidian markdown'
# Export format completions
complete -c cortex -n '__fish_seen_subcommand_from export' -l format -a 'html svg mermaid markdown jsonld'
# Common options
complete -c cortex -l kind -s k -d 'Filter by kind' -a 'memory component task decision'
complete -c cortex -l status -s s -d 'Filter by status' -a 'active todo in_progress done deprecated'
complete -c cortex -l tags -s t -d 'Filter by tags' -a '(cortex --get-tags 2>/dev/null)'
complete -c cortex -l limit -s l -d 'Max results'
complete -c cortex -l graph -d 'Specific graph' -a '(cortex --get-graphs 2>/dev/null)'
complete -c cortex -l all-graphs -d 'Search all graphs'
complete -c cortex -l help -s h -d 'Show help'
`;
}
function generatePowerShellCompletions(): string {
return `# Cortex CLI completions for PowerShell
# Add to your $PROFILE
Register-ArgumentCompleter -Native -CommandName cortex -ScriptBlock {
param($wordToComplete, $commandAst, $cursorPosition)
$commands = @('add', 'query', 'show', 'list', 'update', 'remove', 'link', 'graph',
'children', 'decay', 'serve', 'history', 'diff', 'restore', 'capture',
'context', 'config', 'index', 'journal', 'ingest', 'clip', 'export',
'viz', 'import', 'backup', 'restore-backup', 'list-backups', 'graphs',
'use', 'init', 'smart-search', 'ss', 'what', 'now', 'tui', 'ui')
$kinds = @('memory', 'component', 'task', 'decision')
$statuses = @('active', 'todo', 'in_progress', 'done', 'deprecated')
$edgeTypes = @('depends_on', 'contains', 'implements', 'blocked_by', 'subtask_of', 'relates_to', 'supersedes', 'about')
$elements = $commandAst.CommandElements
$command = if ($elements.Count -gt 1) { $elements[1].Value } else { $null }
# Command completion
if ($elements.Count -le 2) {
$commands | Where-Object { $_ -like "$wordToComplete*" } | ForEach-Object {
[System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
}
return
}
# Sub-command argument completion
switch ($command) {
'add' {
if ($elements.Count -eq 3) {
$kinds | Where-Object { $_ -like "$wordToComplete*" } | ForEach-Object {
[System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
}
}
}
{ $_ -in @('show', 'update', 'remove', 'children', 'history') } {
$nodes = cortex --get-nodes $wordToComplete 2>$null
if ($nodes) {
$nodes -split "\\n" | Where-Object { $_ -like "$wordToComplete*" } | ForEach-Object {
[System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
}
}
}
'use' {
$graphs = cortex --get-graphs 2>$null
if ($graphs) {
$graphs -split "\\n" | Where-Object { $_ -like "$wordToComplete*" } | ForEach-Object {
[System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
}
}
}
'import' {
if ($elements.Count -eq 3) {
@('obsidian', 'markdown') | Where-Object { $_ -like "$wordToComplete*" } | ForEach-Object {
[System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
}
}
}
}
# Option completion
$prevElement = if ($elements.Count -gt 2) { $elements[-2].Value } else { $null }
switch ($prevElement) {
{ $_ -in @('--kind', '-k') } {
$kinds | Where-Object { $_ -like "$wordToComplete*" } | ForEach-Object {
[System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
}
}
{ $_ -in @('--status', '-s') } {
$statuses | Where-Object { $_ -like "$wordToComplete*" } | ForEach-Object {
[System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
}
}
'--type' {
$edgeTypes | Where-Object { $_ -like "$wordToComplete*" } | ForEach-Object {
[System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
}
}
{ $_ -in @('--format', '-f') } {
@('html', 'svg', 'mermaid', 'markdown', 'jsonld') | Where-Object { $_ -like "$wordToComplete*" } | ForEach-Object {
[System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
}
}
}
}
`;
}


@@ -11,9 +11,23 @@ import { graphCommand } from './commands/graph';
import { serveCommand } from './commands/serve';
import { decayCommand } from './commands/decay';
import { childrenCommand } from './commands/children';
import { historyCommand } from './commands/history';
import { diffCommand } from './commands/diff';
import { restoreCommand } from './commands/restore';
import { captureCommand, captureHookCommand, configCommand } from './commands/capture';
import { contextCommand, contextHookCommand } from './commands/context';
import { configCommand } from './commands/config';
import { indexCommand } from './commands/index-cmd';
import { journalCommand, journalAliasCommand, quickCaptureCommand } from './commands/journal';
import { ingestCommand, clipCommand } from './commands/ingest';
import { exportCommand, vizCommand } from './commands/export';
import { importCommand } from './commands/import';
import { backupCommand, restoreDbCommand, listBackupsCommand } from './commands/backup-cmd';
import { graphsCommand, useCommand, initCommand } from './commands/graphs';
import { smartSearchCommand, ssCommand, whatCommand, contextAwareCommand } from './commands/smart';
import { tuiCommand, uiCommand } from './commands/tui';
import { completionsCommand, getNodesCommand, getTagsCommand, getGraphsCommand } from './commands/completions';
import { closeDb } from '../core/db';
import { migrateOldDatabase } from '../core/db';
const program = new Command();
@@ -33,9 +47,42 @@ program.addCommand(graphCommand);
program.addCommand(serveCommand);
program.addCommand(decayCommand);
program.addCommand(childrenCommand);
program.addCommand(historyCommand);
program.addCommand(diffCommand);
program.addCommand(restoreCommand);
program.addCommand(captureCommand);
program.addCommand(captureHookCommand);
program.addCommand(contextCommand);
program.addCommand(contextHookCommand);
program.addCommand(configCommand);
program.addCommand(indexCommand);
program.addCommand(journalCommand);
program.addCommand(journalAliasCommand);
program.addCommand(quickCaptureCommand);
program.addCommand(ingestCommand);
program.addCommand(clipCommand);
program.addCommand(exportCommand);
program.addCommand(vizCommand);
program.addCommand(importCommand);
program.addCommand(backupCommand);
program.addCommand(restoreDbCommand);
program.addCommand(listBackupsCommand);
program.addCommand(graphsCommand);
program.addCommand(useCommand);
program.addCommand(initCommand);
program.addCommand(smartSearchCommand);
program.addCommand(ssCommand);
program.addCommand(whatCommand);
program.addCommand(contextAwareCommand);
program.addCommand(tuiCommand);
program.addCommand(uiCommand);
program.addCommand(completionsCommand);
program.addCommand(getNodesCommand);
program.addCommand(getTagsCommand);
program.addCommand(getGraphsCommand);
// Check for old database migration
migrateOldDatabase();
program.hook('postAction', () => {
closeDb();

src/core/backup.ts Normal file

@@ -0,0 +1,98 @@
import * as fs from 'fs';
import * as path from 'path';
import { getDb, closeDb, getMemoryDir } from './db';
export interface BackupOptions {
compress?: boolean;
}
export interface BackupResult {
path: string;
size: number;
nodes: number;
edges: number;
}
export async function createBackup(outputPath: string, options: BackupOptions = {}): Promise<BackupResult> {
const db = getDb();
const absPath = path.resolve(outputPath);
// Ensure output directory exists
const dir = path.dirname(absPath);
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, { recursive: true });
}
// Get counts
const nodeCount = (db.prepare('SELECT COUNT(*) as count FROM nodes').get() as any).count;
const edgeCount = (db.prepare('SELECT COUNT(*) as count FROM edges').get() as any).count;
// Use SQLite backup API via VACUUM INTO
db.exec(`VACUUM INTO '${absPath.replace(/'/g, "''")}'`);
const stats = fs.statSync(absPath);
return {
path: absPath,
size: stats.size,
nodes: nodeCount,
edges: edgeCount,
};
}
export async function restoreBackup(backupPath: string): Promise<{ nodes: number; edges: number }> {
const absBackupPath = path.resolve(backupPath);
if (!fs.existsSync(absBackupPath)) {
throw new Error(`Backup file does not exist: ${absBackupPath}`);
}
const dbPath = path.join(getMemoryDir(), 'cortex.db');
// Close current database
closeDb();
// Create backup of current database
const timestamp = new Date().toISOString().replace(/[:.]/g, '-');
const currentBackup = `${dbPath}.before-restore-${timestamp}`;
if (fs.existsSync(dbPath)) {
fs.copyFileSync(dbPath, currentBackup);
}
// Copy backup file to database path
fs.copyFileSync(absBackupPath, dbPath);
// Reopen database and get counts
const db = getDb();
const nodeCount = (db.prepare('SELECT COUNT(*) as count FROM nodes').get() as any).count;
const edgeCount = (db.prepare('SELECT COUNT(*) as count FROM edges').get() as any).count;
return {
nodes: nodeCount,
edges: edgeCount,
};
}
export function listBackups(directory: string): { name: string; size: number; modified: Date }[] {
const absDir = path.resolve(directory);
if (!fs.existsSync(absDir)) {
return [];
}
const files = fs.readdirSync(absDir);
const backups: { name: string; size: number; modified: Date }[] = [];
for (const file of files) {
if (file.endsWith('.cortex') || file.endsWith('.db') || file.endsWith('.sqlite')) {
const stats = fs.statSync(path.join(absDir, file));
backups.push({
name: file,
size: stats.size,
modified: stats.mtime,
});
}
}
return backups.sort((a, b) => b.modified.getTime() - a.modified.getTime());
}


@@ -0,0 +1,68 @@
import { getDb } from '../db';
export type CaptureMode = 'always' | 'manual' | 'decisions' | 'off';
export interface CaptureConfig {
mode: CaptureMode;
minLength: number;
excludePatterns: string[];
autoTag: boolean;
linkRelated: boolean;
similarityThreshold: number;
mergeThreshold: number;
}
const DEFAULT_CONFIG: CaptureConfig = {
mode: 'always',
minLength: 100,
excludePatterns: [],
autoTag: true,
linkRelated: true,
similarityThreshold: 0.75,
mergeThreshold: 0.90,
};
function ensureConfigTable(): void {
const db = getDb();
db.exec(`
CREATE TABLE IF NOT EXISTS system_config (
key TEXT PRIMARY KEY,
value TEXT NOT NULL,
updated_at INTEGER NOT NULL
)
`);
}
export function getCaptureConfig(): CaptureConfig {
ensureConfigTable();
const db = getDb();
const row = db.prepare('SELECT value FROM system_config WHERE key = ?').get('capture') as { value: string } | undefined;
if (!row) return DEFAULT_CONFIG;
try {
return { ...DEFAULT_CONFIG, ...JSON.parse(row.value) };
} catch {
return DEFAULT_CONFIG;
}
}
export function setCaptureConfig(updates: Partial<CaptureConfig>): CaptureConfig {
ensureConfigTable();
const db = getDb();
const current = getCaptureConfig();
const updated = { ...current, ...updates };
db.prepare(`
INSERT INTO system_config (key, value, updated_at) VALUES (?, ?, ?)
ON CONFLICT(key) DO UPDATE SET value = excluded.value, updated_at = excluded.updated_at
`).run('capture', JSON.stringify(updated), Date.now());
return updated;
}
export function getConfigValue<K extends keyof CaptureConfig>(key: K): CaptureConfig[K] {
return getCaptureConfig()[key];
}
export function setConfigValue<K extends keyof CaptureConfig>(key: K, value: CaptureConfig[K]): void {
setCaptureConfig({ [key]: value } as Partial<CaptureConfig>);
}
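`getCaptureConfig` overlays the stored JSON onto `DEFAULT_CONFIG`, so a partial row keeps its defaults for unset keys and a corrupt row degrades to the defaults entirely. That merge behaviour can be sketched standalone (trimmed config shape, defaults mirroring the ones above):

```typescript
// Trimmed-down config for illustration; the real CaptureConfig has more keys.
interface CaptureConfig { mode: string; minLength: number; autoTag: boolean; }

const DEFAULTS: CaptureConfig = { mode: 'always', minLength: 100, autoTag: true };

// Overlay a stored JSON payload onto the defaults; a missing or
// unparseable payload falls back to the defaults unchanged.
function loadConfig(stored: string | undefined): CaptureConfig {
  if (!stored) return DEFAULTS;
  try {
    return { ...DEFAULTS, ...JSON.parse(stored) };
  } catch {
    return DEFAULTS;
  }
}
```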

src/core/capture/dedupe.ts Normal file

@@ -0,0 +1,113 @@
import { listNodes, addEdge, updateNode } from '../store';
import { getEmbedding } from '../search/ollama';
import { cosineSimilarity } from '../search/vector';
import { Node } from '../../types';
import { getCaptureConfig } from './config';
export interface SimilarNode {
node: Node;
similarity: number;
}
export interface DedupeResult {
action: 'create' | 'merge' | 'link';
existingNode?: Node;
similarity?: number;
}
export async function findSimilarNodes(
text: string,
limit: number = 5
): Promise<SimilarNode[]> {
const embedding = await getEmbedding(text);
if (!embedding) return [];
const nodes = listNodes({ includeStale: false });
const withEmbeddings = nodes.filter(n => n.embedding && n.embedding.length > 0);
const scored: SimilarNode[] = [];
for (const node of withEmbeddings) {
const similarity = cosineSimilarity(embedding, node.embedding!);
if (similarity > 0.5) {
scored.push({ node, similarity });
}
}
return scored
.sort((a, b) => b.similarity - a.similarity)
.slice(0, limit);
}
export async function checkDuplicate(
summary: string,
content: string
): Promise<DedupeResult> {
const config = getCaptureConfig();
const textToCompare = `${summary} ${content}`;
const similar = await findSimilarNodes(textToCompare, 1);
if (similar.length === 0) {
return { action: 'create' };
}
const { node, similarity } = similar[0];
if (similarity >= config.mergeThreshold) {
return {
action: 'merge',
existingNode: node,
similarity,
};
}
if (similarity >= config.similarityThreshold) {
return {
action: 'link',
existingNode: node,
similarity,
};
}
return { action: 'create' };
}
export async function mergeIntoNode(
existingId: string,
newSummary: string,
newContent: string,
newTags: string[]
): Promise<Node | null> {
const existing = listNodes({ includeStale: false }).find(n => n.id === existingId);
if (!existing) return null;
// Append new content with timestamp
const timestamp = new Date().toISOString().slice(0, 10);
const mergedContent = existing.content
? `${existing.content}\n\n---\n[${timestamp}]\n${newContent}`
: newContent;
// Merge tags (dedupe)
const mergedTags = [...new Set([...existing.tags, ...newTags])];
// Update the existing node
return updateNode(existingId, {
content: mergedContent,
tags: mergedTags,
metadata: {
...existing.metadata,
lastMergedAt: Date.now(),
mergeCount: (existing.metadata.mergeCount || 0) + 1,
},
});
}
export async function linkRelatedNode(
newNodeId: string,
existingNodeId: string
): Promise<void> {
addEdge(newNodeId, existingNodeId, 'relates_to', {
reason: 'auto-capture-similarity',
linkedAt: Date.now(),
});
}
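The dedupe decision hinges on cosine similarity between embeddings crossing the two configured thresholds (0.90 to merge, 0.75 to link, otherwise create). A minimal sketch of both pieces — the real `cosineSimilarity` lives in `search/vector`, and `decide` mirrors the branch order in `checkDuplicate` above:

```typescript
// Cosine similarity: dot product over the product of vector magnitudes.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Threshold logic matching checkDuplicate: merge wins over link.
function decide(similarity: number, mergeT = 0.9, linkT = 0.75): 'merge' | 'link' | 'create' {
  if (similarity >= mergeT) return 'merge';
  if (similarity >= linkT) return 'link';
  return 'create';
}
```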

src/core/capture/index.ts Normal file

@@ -0,0 +1,192 @@
import { addNode } from '../store';
import { getCaptureConfig, CaptureConfig } from './config';
import { extractMemoryData, shouldCapture, ExtractedMemory } from './summarize';
import { checkDuplicate, mergeIntoNode, linkRelatedNode } from './dedupe';
import { Node } from '../../types';
export { getCaptureConfig, setCaptureConfig, CaptureMode, CaptureConfig } from './config';
export { extractMemoryData, shouldCapture, ExtractedMemory } from './summarize';
export { findSimilarNodes, checkDuplicate, mergeIntoNode } from './dedupe';
export interface CaptureInput {
conversation: string;
sessionId?: string;
filesChanged?: string[];
source?: string;
}
export interface CaptureResult {
captured: boolean;
action: 'created' | 'merged' | 'linked' | 'skipped';
node?: Node;
reason?: string;
}
export async function captureConversation(input: CaptureInput): Promise<CaptureResult> {
const config = getCaptureConfig();
// Check if capture is enabled
if (config.mode === 'off') {
return { captured: false, action: 'skipped', reason: 'capture disabled' };
}
// Check minimum length
if (!shouldCapture(input.conversation, config.minLength)) {
return { captured: false, action: 'skipped', reason: 'conversation too short or trivial' };
}
// Check exclude patterns
for (const pattern of config.excludePatterns) {
try {
if (new RegExp(pattern, 'i').test(input.conversation)) {
return { captured: false, action: 'skipped', reason: `matched exclude pattern: ${pattern}` };
}
} catch {
// Invalid regex, skip
}
}
// Extract memory data using Ollama
const extracted = await extractMemoryData(input.conversation);
if (!extracted) {
return { captured: false, action: 'skipped', reason: 'failed to extract memory data' };
}
// For "decisions" mode, only capture if decisions were found
if (config.mode === 'decisions' && extracted.decisions.length === 0) {
return { captured: false, action: 'skipped', reason: 'no decisions found (decisions mode)' };
}
// Build content
const contentParts: string[] = [extracted.summary];
if (extracted.decisions.length > 0) {
contentParts.push('\n## Decisions');
for (const d of extracted.decisions) {
contentParts.push(`- ${d}`);
}
}
if (extracted.filesDiscussed.length > 0 || input.filesChanged?.length) {
const files = [...new Set([...extracted.filesDiscussed, ...(input.filesChanged || [])])];
contentParts.push('\n## Files');
for (const f of files) {
contentParts.push(`- ${f}`);
}
}
const content = contentParts.join('\n');
// Check for duplicates
const dedupeResult = await checkDuplicate(extracted.summary, content);
// Build tags
const tags = ['auto-capture'];
if (config.autoTag && extracted.topics.length > 0) {
tags.push(...extracted.topics);
}
if (input.source) {
tags.push(`source:${input.source}`);
}
if (dedupeResult.action === 'merge' && dedupeResult.existingNode) {
// Merge into existing node
const merged = await mergeIntoNode(
dedupeResult.existingNode.id,
extracted.summary,
content,
tags
);
return {
captured: true,
action: 'merged',
node: merged || undefined,
reason: `merged with existing node (similarity: ${(dedupeResult.similarity! * 100).toFixed(1)}%)`,
};
}
// Create new node
const node = await addNode({
kind: 'memory',
title: extracted.summary.slice(0, 100),
content,
tags,
status: 'active',
metadata: {
sessionId: input.sessionId,
filesChanged: input.filesChanged,
source: input.source || 'claude-code',
capturedAt: Date.now(),
decisions: extracted.decisions,
},
});
// Link to related node if found
if (dedupeResult.action === 'link' && dedupeResult.existingNode && config.linkRelated) {
await linkRelatedNode(node.id, dedupeResult.existingNode.id);
return {
captured: true,
action: 'linked',
node,
reason: `linked to related node (similarity: ${(dedupeResult.similarity! * 100).toFixed(1)}%)`,
};
}
return {
captured: true,
action: 'created',
node,
};
}
export async function captureText(
text: string,
options: { tags?: string[]; source?: string } = {}
): Promise<CaptureResult> {
const config = getCaptureConfig();
if (config.mode === 'off') {
return { captured: false, action: 'skipped', reason: 'capture disabled' };
}
// Simple text capture - no summarization needed
const dedupeResult = await checkDuplicate(text, text);
const tags = ['manual-capture', ...(options.tags || [])];
if (options.source) {
tags.push(`source:${options.source}`);
}
if (dedupeResult.action === 'merge' && dedupeResult.existingNode) {
const merged = await mergeIntoNode(
dedupeResult.existingNode.id,
text.slice(0, 100),
text,
tags
);
return {
captured: true,
action: 'merged',
node: merged || undefined,
};
}
const node = await addNode({
kind: 'memory',
title: text.slice(0, 100),
content: text,
tags,
status: 'active',
metadata: {
source: options.source || 'manual',
capturedAt: Date.now(),
},
});
if (dedupeResult.action === 'link' && dedupeResult.existingNode && config.linkRelated) {
await linkRelatedNode(node.id, dedupeResult.existingNode.id);
return { captured: true, action: 'linked', node };
}
return { captured: true, action: 'created', node };
}


@@ -0,0 +1,160 @@
import { generate, isGenAvailable } from '../search/ollamaGen';
export interface ExtractedMemory {
summary: string;
topics: string[];
decisions: string[];
filesDiscussed: string[];
}
const SUMMARIZE_PROMPT = `Summarize this Claude Code conversation in 1-2 sentences.
Focus on: what was accomplished, decisions made, problems solved.
Do NOT include greetings or meta-discussion.
Conversation:
{conversation}
Summary:`;
const EXTRACT_PROMPT = `Extract from this conversation:
1. Main topics (as tags, lowercase, hyphenated, max 5)
2. Decisions made (if any, max 3)
3. Code files discussed or modified (if any)
Conversation:
{conversation}
Output as JSON only, no explanation:
{"topics": [], "decisions": [], "files": []}`;
export async function summarizeConversation(conversation: string): Promise<string | null> {
if (!(await isGenAvailable())) return null;
const prompt = SUMMARIZE_PROMPT.replace('{conversation}', conversation);
return generate(prompt);
}
export async function extractMemoryData(conversation: string): Promise<ExtractedMemory | null> {
const available = await isGenAvailable();
// Get summary
const summary = available
? await summarizeConversation(conversation)
: createFallbackSummary(conversation);
if (!summary) return null;
// Extract structured data
let topics: string[] = [];
let decisions: string[] = [];
let filesDiscussed: string[] = [];
if (available) {
const extractPrompt = EXTRACT_PROMPT.replace('{conversation}', conversation);
const extracted = await generate(extractPrompt);
if (extracted) {
try {
// Find JSON in response (handle cases where model adds explanation)
const jsonMatch = extracted.match(/\{[\s\S]*\}/);
if (jsonMatch) {
const data = JSON.parse(jsonMatch[0]);
topics = Array.isArray(data.topics) ? data.topics.slice(0, 5) : [];
decisions = Array.isArray(data.decisions) ? data.decisions.slice(0, 3) : [];
filesDiscussed = Array.isArray(data.files) ? data.files : [];
}
} catch {
// Fall back to basic extraction
topics = extractTopicsBasic(conversation);
filesDiscussed = extractFilesBasic(conversation);
}
}
} else {
// Basic extraction without AI
topics = extractTopicsBasic(conversation);
filesDiscussed = extractFilesBasic(conversation);
}
return {
summary,
topics: sanitizeTags(topics),
decisions,
filesDiscussed,
};
}
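The extraction step tolerates models that wrap their JSON in prose by grabbing the span from the first `{` to the last `}` before parsing. A minimal sketch of that recovery step (function name is illustrative):

```typescript
// Pull a JSON object out of a model response that may include surrounding
// prose. The greedy {...} match spans first '{' to last '}', so it assumes
// at most one top-level object in the response.
function extractJson(response: string): any | null {
  const jsonMatch = response.match(/\{[\s\S]*\}/);
  if (!jsonMatch) return null;
  try {
    return JSON.parse(jsonMatch[0]);
  } catch {
    return null;
  }
}
```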
function createFallbackSummary(conversation: string): string {
// Take first meaningful line as summary
const lines = conversation.split('\n').filter(l => l.trim().length > 20);
if (lines.length === 0) return 'Conversation captured';
const first = lines[0].trim();
return first.length > 150 ? first.slice(0, 147) + '...' : first;
}
function extractTopicsBasic(conversation: string): string[] {
const topics: string[] = [];
const lower = conversation.toLowerCase();
// Common programming topics
const keywords = [
'typescript', 'javascript', 'python', 'rust', 'go',
'react', 'vue', 'angular', 'node', 'express',
'database', 'sql', 'api', 'auth', 'authentication',
'bug', 'fix', 'error', 'refactor', 'test', 'deploy',
'git', 'docker', 'kubernetes', 'aws', 'cloud',
];
for (const kw of keywords) {
if (lower.includes(kw) && topics.length < 5) {
topics.push(kw);
}
}
return topics;
}
function extractFilesBasic(conversation: string): string[] {
const files: string[] = [];
// Match file paths
const filePatterns = [
/[\w\-\/]+\.(ts|js|tsx|jsx|py|rs|go|md|json|yaml|yml|toml|sql)/gi,
/src\/[\w\-\/]+/gi,
];
for (const pattern of filePatterns) {
const matches = conversation.match(pattern);
if (matches) {
for (const m of matches) {
if (!files.includes(m) && files.length < 10) {
files.push(m);
}
}
}
}
return files;
}
function sanitizeTags(tags: string[]): string[] {
return tags
.map(t => t.toLowerCase().trim().replace(/\s+/g, '-').replace(/[^a-z0-9\-]/g, ''))
.filter(t => t.length > 0 && t.length < 30);
}
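Tag sanitization normalizes free-form, model-produced topics into safe lowercase slugs. The per-tag rule, extracted as a standalone helper (name is illustrative):

```typescript
// Normalize one free-form topic string into a lowercase, hyphenated tag slug:
// collapse whitespace to hyphens, then strip anything outside [a-z0-9-].
function toTagSlug(tag: string): string {
  return tag
    .toLowerCase()
    .trim()
    .replace(/\s+/g, '-')
    .replace(/[^a-z0-9\-]/g, '');
}
```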
export function shouldCapture(conversation: string, minLength: number): boolean {
// Skip very short conversations
if (conversation.length < minLength) return false;
// Skip if mostly greetings/pleasantries
const lower = conversation.toLowerCase();
const greetings = ['hello', 'hi ', 'hey', 'thanks', 'thank you', 'goodbye', 'bye'];
const greetingCount = greetings.filter(g => lower.includes(g)).length;
// If the exchange is very short and contains several greetings, skip it
const words = conversation.split(/\s+/).length;
if (words < 20 && greetingCount > 2) return false;
return true;
}


@@ -1,6 +1,7 @@
import Database from 'better-sqlite3';
import path from 'path';
import fs from 'fs';
import { getActiveGraph, getGraphDbPath, graphExists, createGraph, getGraphsDir } from './graphs';
const SCHEMA = `
CREATE TABLE IF NOT EXISTS nodes (
@@ -32,6 +33,21 @@ CREATE TABLE IF NOT EXISTS node_tags (
PRIMARY KEY (node_id, tag)
);
CREATE TABLE IF NOT EXISTS node_versions (
id TEXT PRIMARY KEY,
node_id TEXT NOT NULL,
version INTEGER NOT NULL,
title TEXT NOT NULL,
content TEXT NOT NULL DEFAULT '',
status TEXT,
tags TEXT DEFAULT '[]',
metadata TEXT DEFAULT '{}',
valid_from INTEGER NOT NULL,
valid_until INTEGER,
created_by TEXT DEFAULT 'user',
FOREIGN KEY (node_id) REFERENCES nodes(id) ON DELETE CASCADE
);
CREATE INDEX IF NOT EXISTS idx_nodes_kind ON nodes(kind);
CREATE INDEX IF NOT EXISTS idx_nodes_status ON nodes(status);
CREATE INDEX IF NOT EXISTS idx_nodes_created ON nodes(created_at DESC);
@@ -40,26 +56,64 @@ CREATE INDEX IF NOT EXISTS idx_edges_from ON edges(from_id);
CREATE INDEX IF NOT EXISTS idx_edges_to ON edges(to_id);
CREATE INDEX IF NOT EXISTS idx_edges_type ON edges(type);
CREATE INDEX IF NOT EXISTS idx_tags_tag ON node_tags(tag);
CREATE INDEX IF NOT EXISTS idx_versions_node ON node_versions(node_id, version);
CREATE INDEX IF NOT EXISTS idx_versions_time ON node_versions(valid_from, valid_until);
CREATE UNIQUE INDEX IF NOT EXISTS idx_versions_node_version ON node_versions(node_id, version);
`;
let _db: Database.Database | null = null;
let _currentGraph: string | null = null;
/**
* Get the memory directory for backward compatibility
* Now returns the active graph's directory
*/
export function getMemoryDir(): string {
const activeGraph = getActiveGraph();
const graphDir = path.dirname(getGraphDbPath(activeGraph));
if (!fs.existsSync(graphDir)) {
fs.mkdirSync(graphDir, { recursive: true });
}
return graphDir;
}
/**
* Get the database connection, creating it if necessary
* Automatically handles graph switching
*/
export function getDb(): Database.Database {
const activeGraph = getActiveGraph();
// Check if we need to switch graphs
if (_db && _currentGraph === activeGraph) {
return _db;
}
// Close existing connection if switching graphs
if (_db && _currentGraph !== activeGraph) {
_db.close();
_db = null;
}
// Ensure graph exists
if (!graphExists(activeGraph)) {
createGraph(activeGraph);
}
const dbPath = getGraphDbPath(activeGraph);
const dir = path.dirname(dbPath);
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, { recursive: true });
}
_db = new Database(dbPath);
_db.pragma('journal_mode = WAL');
_db.pragma('foreign_keys = ON');
_db.exec(SCHEMA);
_currentGraph = activeGraph;
// Migration: add last_accessed_at column
const cols = _db.prepare("PRAGMA table_info(nodes)").all() as any[];
@@ -68,12 +122,113 @@ export function getDb(): Database.Database {
_db.exec('UPDATE nodes SET last_accessed_at = updated_at WHERE last_accessed_at IS NULL');
}
// Migration: add version column to nodes table
if (!cols.some((c: any) => c.name === 'version')) {
_db.exec('ALTER TABLE nodes ADD COLUMN version INTEGER DEFAULT 1');
_db.exec('UPDATE nodes SET version = 1 WHERE version IS NULL');
}
// Migration: backfill node_versions for existing nodes without versions
const existingWithoutVersion = _db.prepare(`
SELECT * FROM nodes WHERE id NOT IN (SELECT DISTINCT node_id FROM node_versions)
`).all() as any[];
if (existingWithoutVersion.length > 0) {
const insertVersion = _db.prepare(`
INSERT INTO node_versions (id, node_id, version, title, content, status, tags, metadata, valid_from, valid_until, created_by)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`);
const { randomUUID } = require('crypto');
for (const node of existingWithoutVersion) {
const versionId = randomUUID();
insertVersion.run(
versionId,
node.id,
1,
node.title,
node.content,
node.status,
node.tags,
node.metadata,
node.created_at,
null,
'migration'
);
}
}
return _db;
}
/**
* Close the database connection
*/
export function closeDb(): void {
if (_db) {
_db.close();
_db = null;
_currentGraph = null;
}
}
/**
* Get a database connection for a specific graph
* Does not change the active graph
*/
export function getDbForGraph(graphName: string): Database.Database {
if (!graphExists(graphName)) {
throw new Error(`Graph '${graphName}' does not exist`);
}
const dbPath = getGraphDbPath(graphName);
const db = new Database(dbPath);
db.pragma('journal_mode = WAL');
db.pragma('foreign_keys = ON');
return db;
}
/**
* Migrate existing .memory directory to new graphs system
*/
export function migrateOldDatabase(): boolean {
try {
const oldDir = path.join(process.cwd(), '.memory');
const oldDbPath = path.join(oldDir, 'cortex.db');
if (!fs.existsSync(oldDbPath)) {
return false;
}
// Check if we've already migrated
const defaultDbPath = getGraphDbPath('default');
if (fs.existsSync(defaultDbPath)) {
return false;
}
// Create default graph directory
const defaultGraphDir = path.dirname(defaultDbPath);
fs.mkdirSync(defaultGraphDir, { recursive: true });
// Copy old database to new location
fs.copyFileSync(oldDbPath, defaultDbPath);
// Try to rename old directory as backup (may fail if locked)
try {
const backupDir = `${oldDir}.migrated-${Date.now()}`;
fs.renameSync(oldDir, backupDir);
console.log(`Migrated .memory database to new multi-graph system`);
console.log(`Old database backed up to ${backupDir}`);
} catch {
// Directory might be locked, just leave it
console.log(`Migrated .memory database to new multi-graph system`);
console.log(`Note: Old .memory directory is in use and was not renamed`);
}
return true;
} catch {
// Migration failed, probably permissions or file lock
// Silently continue - the old database will still work
return false;
}
}

src/core/export/html.ts Normal file

@@ -0,0 +1,368 @@
import { listNodes } from '../store';
import { getDb } from '../db';
import { Node } from '../../types';
export interface HtmlExportOptions {
rootId?: string;
depth?: number;
kind?: string;
tags?: string[];
theme?: 'light' | 'dark';
layout?: 'force' | 'radial';
title?: string;
}
interface GraphNode {
id: string;
label: string;
kind: string;
tags: string[];
group: number;
}
interface GraphLink {
source: string;
target: string;
type: string;
}
interface GraphData {
nodes: GraphNode[];
links: GraphLink[];
}
const KIND_GROUPS: Record<string, number> = {
component: 1,
decision: 2,
task: 3,
memory: 4,
};
const KIND_COLORS: Record<string, { light: string; dark: string }> = {
component: { light: '#4CAF50', dark: '#81C784' },
decision: { light: '#2196F3', dark: '#64B5F6' },
task: { light: '#FF9800', dark: '#FFB74D' },
memory: { light: '#9C27B0', dark: '#BA68C8' },
};
export async function exportHtml(options: HtmlExportOptions = {}): Promise<string> {
const data = await getGraphData(options);
return generateHtmlTemplate(data, options);
}
async function getGraphData(options: HtmlExportOptions): Promise<GraphData> {
const db = getDb();
// Get nodes
let nodes: Node[];
if (options.rootId) {
nodes = getSubgraphNodes(options.rootId, options.depth || 3);
} else {
nodes = listNodes({
kind: options.kind as any,
tags: options.tags,
limit: 500,
includeStale: false,
});
}
// Get edges between these nodes
const nodeIds = new Set(nodes.map(n => n.id));
const edges = db.prepare(`
SELECT * FROM edges
WHERE from_id IN (${[...nodeIds].map(() => '?').join(',')})
AND to_id IN (${[...nodeIds].map(() => '?').join(',')})
`).all([...nodeIds, ...nodeIds]) as any[];
return {
nodes: nodes.map(n => ({
id: n.id,
label: n.title.length > 40 ? n.title.slice(0, 37) + '...' : n.title,
kind: n.kind,
tags: n.tags,
group: KIND_GROUPS[n.kind] || 4,
})),
links: edges.map(e => ({
source: e.from_id,
target: e.to_id,
type: e.type,
})),
};
}
function getSubgraphNodes(rootId: string, maxDepth: number): Node[] {
const db = getDb();
const visited = new Set<string>();
const nodes: Node[] = [];
function traverse(id: string, depth: number) {
if (depth > maxDepth || visited.has(id)) return;
visited.add(id);
const row = db.prepare('SELECT * FROM nodes WHERE id = ? AND is_stale = 0').get(id) as any;
if (!row) return;
nodes.push({
id: row.id,
kind: row.kind,
title: row.title,
content: row.content,
status: row.status,
tags: JSON.parse(row.tags || '[]'),
metadata: JSON.parse(row.metadata || '{}'),
embedding: null,
createdAt: row.created_at,
updatedAt: row.updated_at,
lastAccessedAt: row.last_accessed_at,
isStale: false,
});
// Get connected nodes
const edges = db.prepare('SELECT to_id FROM edges WHERE from_id = ?').all(id) as any[];
for (const edge of edges) {
traverse(edge.to_id, depth + 1);
}
const reverseEdges = db.prepare('SELECT from_id FROM edges WHERE to_id = ?').all(id) as any[];
for (const edge of reverseEdges) {
traverse(edge.from_id, depth + 1);
}
}
traverse(rootId, 0);
return nodes;
}
function generateHtmlTemplate(data: GraphData, options: HtmlExportOptions): string {
const theme = options.theme || 'dark';
const title = options.title || 'Cortex Knowledge Graph';
const isDark = theme === 'dark';
const styles = `
* { margin: 0; padding: 0; box-sizing: border-box; }
body {
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
background: ${isDark ? '#1a1a2e' : '#f5f5f5'};
color: ${isDark ? '#eee' : '#333'};
overflow: hidden;
}
#graph { width: 100vw; height: 100vh; }
#sidebar {
position: fixed;
top: 20px;
right: 20px;
width: 300px;
max-height: calc(100vh - 40px);
background: ${isDark ? '#16213e' : '#fff'};
border-radius: 8px;
padding: 16px;
box-shadow: 0 4px 20px rgba(0,0,0,0.3);
overflow-y: auto;
display: none;
}
#sidebar.active { display: block; }
#sidebar h3 { margin-bottom: 8px; font-size: 16px; }
#sidebar .kind {
display: inline-block;
padding: 2px 8px;
border-radius: 4px;
font-size: 12px;
margin-bottom: 8px;
}
#sidebar .content {
font-size: 13px;
line-height: 1.5;
white-space: pre-wrap;
max-height: 300px;
overflow-y: auto;
}
#sidebar .tags { margin-top: 8px; }
#sidebar .tag {
display: inline-block;
padding: 2px 6px;
background: ${isDark ? '#0f3460' : '#e0e0e0'};
border-radius: 3px;
font-size: 11px;
margin: 2px;
}
#legend {
position: fixed;
bottom: 20px;
left: 20px;
background: ${isDark ? '#16213e' : '#fff'};
border-radius: 8px;
padding: 12px;
font-size: 12px;
}
#legend div { display: flex; align-items: center; margin: 4px 0; }
#legend .dot { width: 12px; height: 12px; border-radius: 50%; margin-right: 8px; }
#controls {
position: fixed;
top: 20px;
left: 20px;
background: ${isDark ? '#16213e' : '#fff'};
border-radius: 8px;
padding: 12px;
}
#search {
padding: 8px;
border: 1px solid ${isDark ? '#333' : '#ddd'};
border-radius: 4px;
background: ${isDark ? '#1a1a2e' : '#fff'};
color: ${isDark ? '#eee' : '#333'};
width: 200px;
}
.node { cursor: pointer; }
.node:hover { filter: brightness(1.2); }
.link { stroke-opacity: 0.6; }
`;
const script = `
const data = ${JSON.stringify(data)};
const width = window.innerWidth;
const height = window.innerHeight;
const kindColors = ${JSON.stringify(isDark ?
Object.fromEntries(Object.entries(KIND_COLORS).map(([k, v]) => [k, v.dark])) :
Object.fromEntries(Object.entries(KIND_COLORS).map(([k, v]) => [k, v.light]))
)};
const svg = d3.select("#graph")
.append("svg")
.attr("width", width)
.attr("height", height);
const g = svg.append("g");
// Zoom behavior
const zoom = d3.zoom()
.scaleExtent([0.1, 4])
.on("zoom", (event) => g.attr("transform", event.transform));
svg.call(zoom);
// Arrow markers
svg.append("defs").selectAll("marker")
.data(["depends_on", "contains", "relates_to", "implements"])
.join("marker")
.attr("id", d => "arrow-" + d)
.attr("viewBox", "0 -5 10 10")
.attr("refX", 20)
.attr("refY", 0)
.attr("markerWidth", 6)
.attr("markerHeight", 6)
.attr("orient", "auto")
.append("path")
.attr("fill", "${isDark ? '#666' : '#999'}")
.attr("d", "M0,-5L10,0L0,5");
const simulation = d3.forceSimulation(data.nodes)
.force("link", d3.forceLink(data.links).id(d => d.id).distance(120))
.force("charge", d3.forceManyBody().strength(-400))
.force("center", d3.forceCenter(width / 2, height / 2))
.force("collision", d3.forceCollide().radius(40));
const link = g.append("g")
.selectAll("line")
.data(data.links)
.join("line")
.attr("class", "link")
.attr("stroke", "${isDark ? '#444' : '#ccc'}")
.attr("stroke-width", 1.5)
.attr("marker-end", d => "url(#arrow-" + d.type + ")");
const node = g.append("g")
.selectAll("g")
.data(data.nodes)
.join("g")
.attr("class", "node")
.call(d3.drag()
.on("start", dragstarted)
.on("drag", dragged)
.on("end", dragended));
node.append("circle")
.attr("r", 12)
.attr("fill", d => kindColors[d.kind] || "#888");
node.append("text")
.text(d => d.label)
.attr("x", 16)
.attr("y", 4)
.attr("font-size", "11px")
.attr("fill", "${isDark ? '#ccc' : '#333'}");
node.on("click", (event, d) => {
const sidebar = document.getElementById("sidebar");
sidebar.classList.add("active");
sidebar.innerHTML = \`
<h3>\${d.label}</h3>
<span class="kind" style="background:\${kindColors[d.kind]}">\${d.kind}</span>
<div class="tags">\${d.tags.map(t => '<span class="tag">' + t + '</span>').join('')}</div>
<p style="margin-top:12px;font-size:11px;color:${isDark ? '#888' : '#666'}">ID: \${d.id.slice(0,8)}</p>
\`;
});
simulation.on("tick", () => {
link
.attr("x1", d => d.source.x)
.attr("y1", d => d.source.y)
.attr("x2", d => d.target.x)
.attr("y2", d => d.target.y);
node.attr("transform", d => "translate(" + d.x + "," + d.y + ")");
});
function dragstarted(event) {
if (!event.active) simulation.alphaTarget(0.3).restart();
event.subject.fx = event.subject.x;
event.subject.fy = event.subject.y;
}
function dragged(event) {
event.subject.fx = event.x;
event.subject.fy = event.y;
}
function dragended(event) {
if (!event.active) simulation.alphaTarget(0);
event.subject.fx = null;
event.subject.fy = null;
}
// Search
document.getElementById("search").addEventListener("input", (e) => {
const query = e.target.value.toLowerCase();
node.attr("opacity", d =>
query === "" || d.label.toLowerCase().includes(query) ? 1 : 0.2
);
});
// Close sidebar on click outside
svg.on("click", () => {
document.getElementById("sidebar").classList.remove("active");
});
`;
return `<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>${title}</title>
<script src="https://d3js.org/d3.v7.min.js"></script>
<style>${styles}</style>
</head>
<body>
<div id="graph"></div>
<div id="sidebar"></div>
<div id="controls">
<input type="text" id="search" placeholder="Search nodes...">
</div>
<div id="legend">
<div><span class="dot" style="background:${KIND_COLORS.component[theme]}"></span>Component</div>
<div><span class="dot" style="background:${KIND_COLORS.decision[theme]}"></span>Decision</div>
<div><span class="dot" style="background:${KIND_COLORS.task[theme]}"></span>Task</div>
<div><span class="dot" style="background:${KIND_COLORS.memory[theme]}"></span>Memory</div>
</div>
<script>${script}</script>
</body>
</html>`;
}

src/core/export/index.ts Normal file

@@ -0,0 +1,36 @@
export { exportHtml, HtmlExportOptions } from './html';
export { exportMermaid, MermaidExportOptions } from './mermaid';
export { exportSvg, SvgExportOptions } from './svg';
export { exportMarkdown, MarkdownExportOptions } from './markdown';
export { exportJsonLd, JsonLdExportOptions } from './jsonld';
export type ExportFormat = 'html' | 'mermaid' | 'svg' | 'markdown' | 'jsonld';
export interface ExportOptions {
format: ExportFormat;
rootId?: string;
depth?: number;
kind?: string;
tags?: string[];
theme?: 'light' | 'dark';
width?: number;
height?: number;
direction?: 'TD' | 'LR' | 'BT' | 'RL';
title?: string;
}
export async function exportGraph(options: ExportOptions): Promise<string> {
switch (options.format) {
case 'html': {
const { exportHtml } = await import('./html');
return exportHtml(options);
}
case 'mermaid': {
const { exportMermaid } = await import('./mermaid');
return exportMermaid(options);
}
case 'svg': {
const { exportSvg } = await import('./svg');
return exportSvg(options);
}
case 'jsonld': {
const { exportJsonLd } = await import('./jsonld');
return exportJsonLd(options);
}
case 'markdown':
// exportMarkdown writes files and needs an output directory; call it directly
throw new Error('markdown export requires an output directory; use exportMarkdown()');
default:
throw new Error(`Unknown format: ${options.format}`);
}
}

src/core/export/jsonld.ts Normal file

@@ -0,0 +1,83 @@
import { listNodes } from '../store';
import { getDb } from '../db';
import { Node, NodeKind } from '../../types';
export interface JsonLdExportOptions {
kind?: NodeKind;
tags?: string[];
pretty?: boolean;
}
export async function exportJsonLd(options: JsonLdExportOptions = {}): Promise<string> {
const db = getDb();
// Get nodes
const nodes = listNodes({
kind: options.kind,
tags: options.tags,
limit: 10000,
includeStale: false,
});
// Get all edges
const edges = db.prepare('SELECT * FROM edges').all() as any[];
// Build edge map for quick lookup
const edgesBySource = new Map<string, any[]>();
for (const edge of edges) {
const existing = edgesBySource.get(edge.from_id) || [];
existing.push(edge);
edgesBySource.set(edge.from_id, existing);
}
const jsonLd = {
'@context': {
'@vocab': 'https://schema.org/',
'cortex': 'https://cortex.memory/',
'node': 'cortex:Node',
'relates_to': { '@id': 'cortex:relatesTo', '@type': '@id' },
'contains': { '@id': 'cortex:contains', '@type': '@id' },
'depends_on': { '@id': 'cortex:dependsOn', '@type': '@id' },
'implements': { '@id': 'cortex:implements', '@type': '@id' },
'blocked_by': { '@id': 'cortex:blockedBy', '@type': '@id' },
'subtask_of': { '@id': 'cortex:subtaskOf', '@type': '@id' },
},
'@graph': nodes.map(node => {
const nodeEdges = edgesBySource.get(node.id) || [];
const relations: Record<string, string[]> = {};
for (const edge of nodeEdges) {
const type = edge.type; // keep underscores so relation keys match the @context terms
if (!relations[type]) relations[type] = [];
relations[type].push(`cortex:node/${edge.to_id}`);
}
return {
'@id': `cortex:node/${node.id}`,
'@type': kindToSchemaType(node.kind),
'identifier': node.id,
'name': node.title,
'description': node.content,
'keywords': node.tags,
'dateCreated': new Date(node.createdAt).toISOString(),
'dateModified': new Date(node.updatedAt).toISOString(),
...(node.status && { 'cortex:status': node.status }),
...(Object.keys(relations).length > 0 && relations),
};
}),
};
return options.pretty !== false
? JSON.stringify(jsonLd, null, 2)
: JSON.stringify(jsonLd);
}
function kindToSchemaType(kind: string): string {
switch (kind) {
case 'component': return 'SoftwareSourceCode';
case 'decision': return 'ChooseAction';
case 'task': return 'Action';
case 'memory': return 'Thing';
default: return 'Thing';
}
}
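A node with one depends_on edge then serializes along these lines (values hypothetical; relation keys assumed to use the @context terms):

```json
{
  "@id": "cortex:node/3f2a9c1e",
  "@type": "ChooseAction",
  "identifier": "3f2a9c1e",
  "name": "Use JWT for session tokens",
  "description": "We chose stateless JWTs over server-side sessions.",
  "keywords": ["auth", "jwt"],
  "dateCreated": "2026-02-03T09:15:00.000Z",
  "dateModified": "2026-02-03T09:15:00.000Z",
  "cortex:status": "active",
  "depends_on": ["cortex:node/9b41d7aa"]
}
```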


@@ -0,0 +1,92 @@
import * as fs from 'fs';
import * as path from 'path';
import { listNodes, getNode } from '../store';
import { getConnections } from '../graph';
import { Node, NodeKind } from '../../types';
export interface MarkdownExportOptions {
kind?: NodeKind;
tags?: string[];
frontmatter?: boolean;
wikilinks?: boolean;
}
export async function exportMarkdown(outputDir: string, options: MarkdownExportOptions = {}): Promise<{ exported: number; files: string[] }> {
const absPath = path.resolve(outputDir);
// Create output directory
fs.mkdirSync(absPath, { recursive: true });
// Get nodes to export
const nodes = listNodes({
kind: options.kind,
tags: options.tags,
limit: 10000,
includeStale: false,
});
const files: string[] = [];
for (const node of nodes) {
const filename = sanitizeFilename(node.title) + '.md';
const filepath = path.join(absPath, filename);
const content = formatNodeAsMarkdown(node, options);
fs.writeFileSync(filepath, content);
files.push(filename);
}
return { exported: files.length, files };
}
function formatNodeAsMarkdown(node: Node, options: MarkdownExportOptions): string {
const lines: string[] = [];
// Frontmatter
if (options.frontmatter !== false) {
lines.push('---');
lines.push(`id: ${node.id}`);
lines.push(`kind: ${node.kind}`);
if (node.status) lines.push(`status: ${node.status}`);
if (node.tags.length) lines.push(`tags: [${node.tags.join(', ')}]`);
lines.push(`created: ${new Date(node.createdAt).toISOString()}`);
lines.push(`updated: ${new Date(node.updatedAt).toISOString()}`);
lines.push('---');
lines.push('');
}
// Title
lines.push(`# ${node.title}`);
lines.push('');
// Content
if (node.content) {
lines.push(node.content);
lines.push('');
}
// Related nodes as wikilinks
if (options.wikilinks !== false) {
const connections = getConnections(node.id);
const outgoing = connections.outgoing || [];
if (outgoing.length > 0) {
lines.push('## Related');
lines.push('');
for (const conn of outgoing) {
lines.push(`- [[${conn.node.title}]] (${conn.type})`);
}
lines.push('');
}
}
return lines.join('\n');
}
function sanitizeFilename(title: string): string {
return title
.replace(/[<>:"/\\|?*]/g, '-')
.replace(/\s+/g, ' ')
.trim()
.slice(0, 100);
}
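With frontmatter and wikilinks enabled (the defaults), each exported file comes out like this (node values are hypothetical):

```markdown
---
id: 3f2a9c1e-...
kind: decision
status: active
tags: [auth, jwt]
created: 2026-02-03T10:15:00.000Z
updated: 2026-02-03T10:15:00.000Z
---

# Use JWT for session tokens

We chose stateless JWTs over server-side sessions.

## Related

- [[auth service]] (implements)
```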

src/core/export/mermaid.ts Normal file

@@ -0,0 +1,133 @@
import { listNodes } from '../store';
import { getDb } from '../db';
import { Node } from '../../types';
export interface MermaidExportOptions {
rootId?: string;
depth?: number;
kind?: string;
tags?: string[];
direction?: 'TD' | 'LR' | 'BT' | 'RL';
}
export async function exportMermaid(options: MermaidExportOptions = {}): Promise<string> {
const db = getDb();
const direction = options.direction || 'TD';
// Get nodes
let nodes: Node[];
if (options.rootId) {
nodes = getSubgraphNodes(options.rootId, options.depth || 3, db);
} else {
nodes = listNodes({
kind: options.kind as any,
tags: options.tags,
limit: 100,
includeStale: false,
});
}
// Get edges between these nodes
const nodeIds = new Set(nodes.map(n => n.id));
const edges = db.prepare(`
SELECT * FROM edges
WHERE from_id IN (${[...nodeIds].map(() => '?').join(',')})
AND to_id IN (${[...nodeIds].map(() => '?').join(',')})
`).all([...nodeIds, ...nodeIds]) as any[];
const lines: string[] = [`graph ${direction}`];
// Generate short IDs for readability
const shortIds = new Map<string, string>();
nodes.forEach((n, i) => {
shortIds.set(n.id, `N${i}`);
});
// Define nodes with shapes based on kind
for (const node of nodes) {
const shortId = shortIds.get(node.id)!;
const label = escapeLabel(node.title);
const shape = kindToShape(node.kind);
lines.push(` ${shortId}${shape.open}"${label}"${shape.close}`);
}
// Add empty line before edges
lines.push('');
// Define edges
for (const edge of edges) {
const fromShort = shortIds.get(edge.from_id);
const toShort = shortIds.get(edge.to_id);
if (fromShort && toShort) {
const arrow = typeToArrow(edge.type);
lines.push(` ${fromShort} ${arrow} ${toShort}`);
}
}
return lines.join('\n');
}
function getSubgraphNodes(rootId: string, maxDepth: number, db: any): Node[] {
const visited = new Set<string>();
const nodes: Node[] = [];
function traverse(id: string, depth: number) {
if (depth > maxDepth || visited.has(id)) return;
visited.add(id);
const row = db.prepare('SELECT * FROM nodes WHERE id = ? AND is_stale = 0').get(id) as any;
if (!row) return;
nodes.push({
id: row.id,
kind: row.kind,
title: row.title,
content: row.content,
status: row.status,
tags: JSON.parse(row.tags || '[]'),
metadata: JSON.parse(row.metadata || '{}'),
embedding: null,
createdAt: row.created_at,
updatedAt: row.updated_at,
lastAccessedAt: row.last_accessed_at,
isStale: false,
});
const edges = db.prepare('SELECT to_id FROM edges WHERE from_id = ?').all(id) as any[];
for (const edge of edges) {
traverse(edge.to_id, depth + 1);
}
}
traverse(rootId, 0);
return nodes;
}
function kindToShape(kind: string): { open: string; close: string } {
switch (kind) {
case 'component': return { open: '[', close: ']' }; // Rectangle
case 'decision': return { open: '{', close: '}' }; // Diamond
case 'task': return { open: '([', close: '])' }; // Stadium
case 'memory': return { open: '(', close: ')' }; // Rounded
default: return { open: '[', close: ']' };
}
}
function typeToArrow(type: string): string {
switch (type) {
case 'contains': return '-->'; // Solid arrow
case 'depends_on': return '-.->'; // Dotted arrow
case 'implements': return '==>'; // Thick arrow
case 'relates_to': return '---'; // Line (no arrow)
default: return '-->';
}
}
function escapeLabel(text: string): string {
// Truncate and escape for Mermaid
const truncated = text.length > 30 ? text.slice(0, 27) + '...' : text;
return truncated
.replace(/"/g, "'")
.replace(/\n/g, ' ')
.replace(/[[\]{}()]/g, '');
}
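Taken together, the shape and arrow tables above determine the Mermaid lines that get emitted. A minimal standalone sketch (hypothetical node title and IDs, same mapping logic as the helpers above):

```typescript
// Reproduces the node/edge line emission above with hypothetical data.
function kindToShape(kind: string): { open: string; close: string } {
  switch (kind) {
    case 'component': return { open: '[', close: ']' };  // Rectangle
    case 'decision': return { open: '{', close: '}' };   // Diamond
    case 'task': return { open: '([', close: '])' };     // Stadium
    case 'memory': return { open: '(', close: ')' };     // Rounded
    default: return { open: '[', close: ']' };
  }
}
function typeToArrow(type: string): string {
  const arrows: Record<string, string> = {
    contains: '-->', depends_on: '-.->', implements: '==>', relates_to: '---',
  };
  return arrows[type] ?? '-->';
}
const shape = kindToShape('decision');
const nodeLine = `  N0${shape.open}"Use SQLite"${shape.close}`;
const edgeLine = `  N0 ${typeToArrow('depends_on')} N1`;
```

Rendered by Mermaid, `N0{"..."}` draws a diamond and `-.->` a dotted arrow, matching the comments above.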

src/core/export/svg.ts (new file, 200 lines)
import { listNodes } from '../store';
import { getDb } from '../db';
import { Node } from '../../types';
export interface SvgExportOptions {
rootId?: string;
depth?: number;
kind?: string;
tags?: string[];
width?: number;
height?: number;
}
const KIND_COLORS: Record<string, string> = {
component: '#4CAF50',
decision: '#2196F3',
task: '#FF9800',
memory: '#9C27B0',
};
export async function exportSvg(options: SvgExportOptions = {}): Promise<string> {
const db = getDb();
const width = options.width || 800;
const height = options.height || 600;
// Get nodes
let nodes: Node[];
if (options.rootId) {
nodes = getSubgraphNodes(options.rootId, options.depth || 3, db);
} else {
nodes = listNodes({
kind: options.kind as any,
tags: options.tags,
limit: 100,
includeStale: false,
});
}
// Get edges
const nodeIds = new Set(nodes.map(n => n.id));
// Guard against an empty node set: "IN ()" is invalid SQL
const edges = nodeIds.size === 0 ? [] : db.prepare(`
SELECT * FROM edges
WHERE from_id IN (${[...nodeIds].map(() => '?').join(',')})
AND to_id IN (${[...nodeIds].map(() => '?').join(',')})
`).all([...nodeIds, ...nodeIds]) as any[];
// Simple force-directed layout (basic version)
const positions = calculateLayout(nodes, edges, width, height);
// Generate SVG
const elements: string[] = [];
// Arrow marker definition
elements.push(`
<defs>
<marker id="arrowhead" markerWidth="10" markerHeight="7" refX="20" refY="3.5" orient="auto">
<polygon points="0 0, 10 3.5, 0 7" fill="#666"/>
</marker>
</defs>
`);
// Draw edges
for (const edge of edges) {
const from = positions.get(edge.from_id);
const to = positions.get(edge.to_id);
if (from && to) {
elements.push(`<line x1="${from.x}" y1="${from.y}" x2="${to.x}" y2="${to.y}" stroke="#666" stroke-width="1" marker-end="url(#arrowhead)"/>`);
}
}
// Draw nodes
for (const node of nodes) {
const pos = positions.get(node.id);
if (!pos) continue;
const color = KIND_COLORS[node.kind] || '#888';
const label = node.title.length > 20 ? node.title.slice(0, 17) + '...' : node.title;
elements.push(`
<g transform="translate(${pos.x}, ${pos.y})">
<circle r="15" fill="${color}"/>
<text x="20" y="5" font-size="11" font-family="sans-serif" fill="#333">${escapeXml(label)}</text>
</g>
`);
}
return `<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 ${width} ${height}" width="${width}" height="${height}">
<rect width="100%" height="100%" fill="#f5f5f5"/>
${elements.join('\n')}
</svg>`;
}
function getSubgraphNodes(rootId: string, maxDepth: number, db: any): Node[] {
const visited = new Set<string>();
const nodes: Node[] = [];
function traverse(id: string, depth: number) {
if (depth > maxDepth || visited.has(id)) return;
visited.add(id);
const row = db.prepare('SELECT * FROM nodes WHERE id = ? AND is_stale = 0').get(id) as any;
if (!row) return;
nodes.push({
id: row.id,
kind: row.kind,
title: row.title,
content: row.content,
status: row.status,
tags: JSON.parse(row.tags || '[]'),
metadata: JSON.parse(row.metadata || '{}'),
embedding: null,
createdAt: row.created_at,
updatedAt: row.updated_at,
lastAccessedAt: row.last_accessed_at,
isStale: false,
});
const edges = db.prepare('SELECT to_id FROM edges WHERE from_id = ?').all(id) as any[];
for (const edge of edges) {
traverse(edge.to_id, depth + 1);
}
}
traverse(rootId, 0);
return nodes;
}
function calculateLayout(nodes: Node[], edges: any[], width: number, height: number): Map<string, { x: number; y: number }> {
const positions = new Map<string, { x: number; y: number }>();
// Simple grid layout with some randomization
const cols = Math.ceil(Math.sqrt(nodes.length));
const cellWidth = (width - 100) / cols;
const cellHeight = (height - 100) / Math.ceil(nodes.length / cols);
nodes.forEach((node, i) => {
const col = i % cols;
const row = Math.floor(i / cols);
positions.set(node.id, {
x: 50 + col * cellWidth + cellWidth / 2 + (Math.random() - 0.5) * 30,
y: 50 + row * cellHeight + cellHeight / 2 + (Math.random() - 0.5) * 30,
});
});
// Simple force-directed adjustment (few iterations)
for (let iter = 0; iter < 50; iter++) {
// Repulsion between nodes
for (const n1 of nodes) {
const p1 = positions.get(n1.id)!;
for (const n2 of nodes) {
if (n1.id === n2.id) continue;
const p2 = positions.get(n2.id)!;
const dx = p1.x - p2.x;
const dy = p1.y - p2.y;
const dist = Math.sqrt(dx * dx + dy * dy) || 1;
if (dist < 100) {
const force = (100 - dist) / dist * 0.5;
p1.x += dx * force;
p1.y += dy * force;
}
}
}
// Attraction along edges
for (const edge of edges) {
const p1 = positions.get(edge.from_id);
const p2 = positions.get(edge.to_id);
if (p1 && p2) {
const dx = p2.x - p1.x;
const dy = p2.y - p1.y;
const dist = Math.sqrt(dx * dx + dy * dy) || 1;
if (dist > 100) {
const force = (dist - 100) / dist * 0.1;
p1.x += dx * force;
p1.y += dy * force;
p2.x -= dx * force;
p2.y -= dy * force;
}
}
}
// Keep in bounds
for (const [id, pos] of positions) {
pos.x = Math.max(50, Math.min(width - 50, pos.x));
pos.y = Math.max(50, Math.min(height - 50, pos.y));
}
}
return positions;
}
function escapeXml(text: string): string {
return text
.replace(/&/g, '&amp;')
.replace(/</g, '&lt;')
.replace(/>/g, '&gt;')
.replace(/"/g, '&quot;');
}
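One detail worth noting in escapeXml: the ampersand must be replaced first, otherwise the entities produced by the later replacements would themselves be re-escaped. A quick standalone check:

```typescript
// Same replacement order as escapeXml above; '&' first so '&lt;' is not turned into '&amp;lt;'.
function escapeXml(text: string): string {
  return text
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}
const escaped = escapeXml('<a href="x">Fish & Chips</a>');
```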

src/core/graphs.ts (new file, 432 lines)
import * as fs from 'fs';
import * as path from 'path';
import Database from 'better-sqlite3';
// Global state for active graph
let _activeGraph: string = 'default';
let _configDir: string | null = null;
export interface GraphInfo {
name: string;
path: string;
nodeCount: number;
edgeCount: number;
lastAccessed: number;
createdAt: number;
size: number;
}
export interface LocalConfig {
graph?: string;
autoCapture?: boolean;
contextInjection?: {
maxTokens?: number;
includeTasks?: boolean;
};
}
export interface GlobalConfig {
defaultGraph: string;
lastUsedGraph: string;
}
/**
* Get the Cortex configuration directory
*/
export function getConfigDir(): string {
if (_configDir) return _configDir;
// Check environment variable first
if (process.env.CORTEX_HOME) {
_configDir = process.env.CORTEX_HOME;
} else if (process.platform === 'win32') {
// Windows: use %APPDATA%\cortex
_configDir = path.join(process.env.APPDATA || path.join(require('os').homedir(), 'AppData', 'Roaming'), 'cortex');
} else {
// Unix: use ~/.cortex
_configDir = path.join(require('os').homedir(), '.cortex');
}
// Ensure directory exists
if (!fs.existsSync(_configDir)) {
fs.mkdirSync(_configDir, { recursive: true });
}
return _configDir;
}
/**
* Get the graphs directory
*/
export function getGraphsDir(): string {
const dir = path.join(getConfigDir(), 'graphs');
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, { recursive: true });
}
return dir;
}
/**
* Get path to a specific graph's database
*/
export function getGraphDbPath(graphName: string): string {
return path.join(getGraphsDir(), graphName, 'cortex.db');
}
/**
* Check if a graph exists
*/
export function graphExists(name: string): boolean {
return fs.existsSync(getGraphDbPath(name));
}
/**
* List all available graphs
*/
export function listGraphs(): GraphInfo[] {
const graphsDir = getGraphsDir();
if (!fs.existsSync(graphsDir)) {
return [];
}
const dirs = fs.readdirSync(graphsDir, { withFileTypes: true })
.filter(d => d.isDirectory())
.map(d => d.name);
const graphs: GraphInfo[] = [];
for (const name of dirs) {
const dbPath = getGraphDbPath(name);
if (!fs.existsSync(dbPath)) continue;
try {
const stats = fs.statSync(dbPath);
const db = new Database(dbPath, { readonly: true });
const nodeCount = (db.prepare('SELECT COUNT(*) as count FROM nodes').get() as any)?.count || 0;
const edgeCount = (db.prepare('SELECT COUNT(*) as count FROM edges').get() as any)?.count || 0;
db.close();
graphs.push({
name,
path: path.join(graphsDir, name),
nodeCount,
edgeCount,
lastAccessed: stats.mtimeMs,
createdAt: stats.birthtimeMs || stats.ctimeMs,
size: stats.size,
});
} catch {
// Skip corrupted databases
}
}
return graphs.sort((a, b) => b.lastAccessed - a.lastAccessed);
}
/**
* Create a new graph
*/
export function createGraph(name: string): GraphInfo {
if (!/^[a-z0-9_-]+$/i.test(name)) {
throw new Error('Graph name must be alphanumeric with dashes or underscores');
}
const graphDir = path.join(getGraphsDir(), name);
if (fs.existsSync(graphDir)) {
throw new Error(`Graph '${name}' already exists`);
}
fs.mkdirSync(graphDir, { recursive: true });
// Initialize empty database
const dbPath = path.join(graphDir, 'cortex.db');
const db = new Database(dbPath);
// Create schema
db.exec(`
CREATE TABLE IF NOT EXISTS nodes (
id TEXT PRIMARY KEY,
kind TEXT NOT NULL,
title TEXT NOT NULL,
content TEXT NOT NULL DEFAULT '',
status TEXT,
tags TEXT DEFAULT '[]',
metadata TEXT DEFAULT '{}',
embedding BLOB,
created_at INTEGER NOT NULL,
updated_at INTEGER NOT NULL,
last_accessed_at INTEGER,
version INTEGER DEFAULT 1,
is_stale INTEGER DEFAULT 0
);
CREATE TABLE IF NOT EXISTS edges (
id TEXT PRIMARY KEY,
from_id TEXT NOT NULL REFERENCES nodes(id) ON DELETE CASCADE,
to_id TEXT NOT NULL REFERENCES nodes(id) ON DELETE CASCADE,
type TEXT NOT NULL,
metadata TEXT DEFAULT '{}',
created_at INTEGER NOT NULL
);
CREATE TABLE IF NOT EXISTS node_tags (
node_id TEXT NOT NULL REFERENCES nodes(id) ON DELETE CASCADE,
tag TEXT NOT NULL,
PRIMARY KEY (node_id, tag)
);
CREATE TABLE IF NOT EXISTS node_versions (
id TEXT PRIMARY KEY,
node_id TEXT NOT NULL,
version INTEGER NOT NULL,
title TEXT NOT NULL,
content TEXT NOT NULL DEFAULT '',
status TEXT,
tags TEXT DEFAULT '[]',
metadata TEXT DEFAULT '{}',
valid_from INTEGER NOT NULL,
valid_until INTEGER,
created_by TEXT DEFAULT 'user',
FOREIGN KEY (node_id) REFERENCES nodes(id) ON DELETE CASCADE
);
CREATE INDEX IF NOT EXISTS idx_nodes_kind ON nodes(kind);
CREATE INDEX IF NOT EXISTS idx_nodes_status ON nodes(status);
CREATE INDEX IF NOT EXISTS idx_nodes_created ON nodes(created_at DESC);
CREATE INDEX IF NOT EXISTS idx_nodes_stale ON nodes(is_stale);
CREATE INDEX IF NOT EXISTS idx_edges_from ON edges(from_id);
CREATE INDEX IF NOT EXISTS idx_edges_to ON edges(to_id);
CREATE INDEX IF NOT EXISTS idx_edges_type ON edges(type);
CREATE INDEX IF NOT EXISTS idx_tags_tag ON node_tags(tag);
CREATE INDEX IF NOT EXISTS idx_versions_node ON node_versions(node_id, version);
CREATE INDEX IF NOT EXISTS idx_versions_time ON node_versions(valid_from, valid_until);
CREATE UNIQUE INDEX IF NOT EXISTS idx_versions_node_version ON node_versions(node_id, version);
`);
db.pragma('journal_mode = WAL');
db.pragma('foreign_keys = ON');
db.close();
const stats = fs.statSync(dbPath);
return {
name,
path: graphDir,
nodeCount: 0,
edgeCount: 0,
lastAccessed: stats.mtimeMs,
createdAt: stats.birthtimeMs || stats.ctimeMs,
size: stats.size,
};
}
/**
* Delete a graph
*/
export function deleteGraph(name: string): void {
if (name === 'default') {
throw new Error("Cannot delete the 'default' graph");
}
const graphDir = path.join(getGraphsDir(), name);
if (!fs.existsSync(graphDir)) {
throw new Error(`Graph '${name}' does not exist`);
}
// Recursively delete directory
fs.rmSync(graphDir, { recursive: true });
// Reset to default if this was the active graph
if (_activeGraph === name) {
_activeGraph = 'default';
}
}
/**
* Set the active graph for the current session
*/
export function useGraph(name: string): void {
if (!graphExists(name)) {
throw new Error(`Graph '${name}' does not exist`);
}
_activeGraph = name;
// Update last used in global config
updateGlobalConfig({ lastUsedGraph: name });
}
/**
* Get the currently active graph name
*/
export function getActiveGraph(): string {
// Check project-local .cortex file first
const localConfig = findLocalConfig();
if (localConfig?.graph && graphExists(localConfig.graph)) {
return localConfig.graph;
}
// Check if explicitly set via useGraph()
if (_activeGraph !== 'default' && graphExists(_activeGraph)) {
return _activeGraph;
}
// Try git remote detection
const gitProject = detectGitProject();
if (gitProject && graphExists(gitProject)) {
return gitProject;
}
// Fall back to global config's last used or default
const config = getGlobalConfig();
if (config.lastUsedGraph && graphExists(config.lastUsedGraph)) {
return config.lastUsedGraph;
}
// Ensure default graph exists
if (!graphExists('default')) {
createGraph('default');
}
return 'default';
}
/**
* Find local .cortex or .cortex.json config
*/
export function findLocalConfig(): LocalConfig | null {
let dir = process.cwd();
const root = path.parse(dir).root;
while (dir !== root) {
// Check .cortex.json
const jsonPath = path.join(dir, '.cortex.json');
if (fs.existsSync(jsonPath)) {
try {
return JSON.parse(fs.readFileSync(jsonPath, 'utf-8'));
} catch {
// Invalid JSON
}
}
// Check .cortex (JSON or just graph name)
const cortexPath = path.join(dir, '.cortex');
if (fs.existsSync(cortexPath)) {
try {
const content = fs.readFileSync(cortexPath, 'utf-8').trim();
// Try JSON first
if (content.startsWith('{')) {
return JSON.parse(content);
}
// Otherwise treat as graph name
return { graph: content };
} catch {
// Invalid config
}
}
dir = path.dirname(dir);
}
return null;
}
/**
* Detect project name from git remote
*/
export function detectGitProject(): string | null {
try {
const { execSync } = require('child_process');
const remote = execSync('git remote get-url origin', {
encoding: 'utf-8',
stdio: ['pipe', 'pipe', 'pipe'],
}).trim();
// Extract project name from various URL formats
// git@github.com:user/repo.git
// https://github.com/user/repo.git
// ssh://git@host/user/repo.git
const match = remote.match(/[/:]([^/]+?)(?:\.git)?$/);
if (match) {
return match[1].toLowerCase().replace(/[^a-z0-9_-]/g, '-');
}
} catch {
// Not a git repo or no remote
}
return null;
}
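The capture group grabs the last path segment, the optional `.git` suffix is dropped, and the sanitize step maps anything outside `[a-z0-9_-]` to `-`. A standalone sketch of the same match, checked against the URL formats listed in the comments (no git subprocess involved):

```typescript
// Same regex and sanitization as detectGitProject above, applied to a given remote URL.
function projectFromRemote(remote: string): string | null {
  const match = remote.match(/[/:]([^/]+?)(?:\.git)?$/);
  return match ? match[1].toLowerCase().replace(/[^a-z0-9_-]/g, '-') : null;
}
const fromSsh = projectFromRemote('git@github.com:user/repo.git');
const fromHttps = projectFromRemote('https://github.com/user/Repo.git');
const fromPlain = projectFromRemote('ssh://git@host/user/repo');
```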
/**
* Get global config
*/
export function getGlobalConfig(): GlobalConfig {
const configPath = path.join(getConfigDir(), 'config.json');
if (fs.existsSync(configPath)) {
try {
return JSON.parse(fs.readFileSync(configPath, 'utf-8'));
} catch {
// Invalid config
}
}
return {
defaultGraph: 'default',
lastUsedGraph: 'default',
};
}
/**
* Update global config
*/
export function updateGlobalConfig(updates: Partial<GlobalConfig>): void {
const config = { ...getGlobalConfig(), ...updates };
const configPath = path.join(getConfigDir(), 'config.json');
fs.writeFileSync(configPath, JSON.stringify(config, null, 2));
}
/**
* Initialize a project with a .cortex.json file
*/
export function initProject(graphName?: string): string {
const name = graphName || detectGitProject() || path.basename(process.cwd()).toLowerCase();
const configPath = path.join(process.cwd(), '.cortex.json');
if (fs.existsSync(configPath)) {
throw new Error('.cortex.json already exists');
}
// Create graph if needed
if (!graphExists(name)) {
createGraph(name);
}
const config: LocalConfig = {
graph: name,
autoCapture: true,
contextInjection: {
maxTokens: 4000,
includeTasks: true,
},
};
fs.writeFileSync(configPath, JSON.stringify(config, null, 2));
return name;
}
/**
* Parse a node reference that may include graph prefix
* Format: "graph:nodeId" or just "nodeId"
*/
export function parseNodeRef(ref: string): { graph?: string; nodeId: string } {
const colonIndex = ref.indexOf(':');
if (colonIndex > 0 && colonIndex < ref.length - 1) {
return {
graph: ref.slice(0, colonIndex),
nodeId: ref.slice(colonIndex + 1),
};
}
return { nodeId: ref };
}
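A ref like `work:abc123` addresses a node in another graph, while a bare ID stays in the active graph; a leading or trailing colon is treated as part of the node ID. Exercising parseNodeRef standalone:

```typescript
// parseNodeRef as defined above, shown in isolation.
function parseNodeRef(ref: string): { graph?: string; nodeId: string } {
  const colonIndex = ref.indexOf(':');
  if (colonIndex > 0 && colonIndex < ref.length - 1) {
    return { graph: ref.slice(0, colonIndex), nodeId: ref.slice(colonIndex + 1) };
  }
  return { nodeId: ref };
}
const qualified = parseNodeRef('work:n_42');
const bare = parseNodeRef('n_42');
const leadingColon = parseNodeRef(':n_42');
```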

src/core/import/index.ts (new file, 2 lines)
export { importObsidian, type ObsidianImportOptions, type ImportResult } from './obsidian';
export { importMarkdown, type MarkdownImportOptions, type MarkdownImportResult } from './markdown';

src/core/import/markdown.ts (new file, 93 lines)
import * as fs from 'fs';
import * as path from 'path';
import { addNode } from '../store';
import { NodeKind } from '../../types';
export interface MarkdownImportOptions {
kind?: NodeKind;
tags?: string[];
dryRun?: boolean;
}
export interface MarkdownImportResult {
imported: number;
files: string[];
}
export async function importMarkdown(folderPath: string, options: MarkdownImportOptions = {}): Promise<MarkdownImportResult> {
const absPath = path.resolve(folderPath);
if (!fs.existsSync(absPath)) {
throw new Error(`Folder does not exist: ${absPath}`);
}
const files = findMarkdownFiles(absPath);
const defaultKind = options.kind || 'memory';
const defaultTags = options.tags || [];
if (options.dryRun) {
return {
imported: files.length,
files: files.map(f => path.relative(absPath, f)),
};
}
let imported = 0;
const importedFiles: string[] = [];
for (const file of files) {
const content = fs.readFileSync(file, 'utf-8');
const relativePath = path.relative(absPath, file);
const title = path.basename(file, '.md');
// Extract title from first H1 if present
const h1Match = content.match(/^#\s+(.+)$/m);
const nodeTitle = h1Match ? h1Match[1] : title;
// Remove the H1 from content if it was used as title
const nodeContent = h1Match ? content.replace(/^#\s+.+$\n*/m, '') : content;
await addNode({
kind: defaultKind,
title: nodeTitle,
content: nodeContent.trim(),
tags: ['imported', 'markdown', ...defaultTags],
metadata: {
importedFrom: 'markdown',
originalPath: relativePath,
importedAt: Date.now(),
},
});
imported++;
importedFiles.push(relativePath);
}
return {
imported,
files: importedFiles,
};
}
function findMarkdownFiles(dir: string): string[] {
const files: string[] = [];
function walk(currentDir: string) {
const entries = fs.readdirSync(currentDir, { withFileTypes: true });
for (const entry of entries) {
const fullPath = path.join(currentDir, entry.name);
if (entry.name.startsWith('.')) continue;
if (entry.isDirectory()) {
walk(fullPath);
} else if (entry.isFile() && entry.name.endsWith('.md')) {
files.push(fullPath);
}
}
}
walk(dir);
return files;
}
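The title-extraction step above lifts the first H1 into the node title and strips it from the body. A standalone sketch (both regexes carry the multiline flag so an H1 that is not the very first line is still handled):

```typescript
// First-H1-as-title extraction, mirroring importMarkdown above.
const raw = '# Release Notes\n\nShipped the markdown importer.';
const h1Match = raw.match(/^#\s+(.+)$/m);
const nodeTitle = h1Match ? h1Match[1] : 'untitled';
const nodeContent = (h1Match ? raw.replace(/^#\s+.+$\n*/m, '') : raw).trim();
```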

src/core/import/obsidian.ts (new file, 263 lines)
import * as fs from 'fs';
import * as path from 'path';
import { addNode, addEdge, listNodes } from '../store';
import { Node } from '../../types';
export interface ObsidianImportOptions {
mapTags?: boolean;
hierarchy?: boolean;
dryRun?: boolean;
kind?: string;
}
export interface ImportResult {
imported: number;
skipped: number;
edges: number;
files: string[];
}
interface ParsedFile {
title: string;
content: string;
frontmatter: Record<string, any>;
tags: string[];
wikilinks: string[];
relativePath: string;
}
export async function importObsidian(vaultPath: string, options: ObsidianImportOptions = {}): Promise<ImportResult> {
const absPath = path.resolve(vaultPath);
if (!fs.existsSync(absPath)) {
throw new Error(`Vault path does not exist: ${absPath}`);
}
// Find all markdown files
const files = findMarkdownFiles(absPath);
const parsed: ParsedFile[] = [];
// Parse all files
for (const file of files) {
const content = fs.readFileSync(file, 'utf-8');
const relativePath = path.relative(absPath, file);
const title = path.basename(file, '.md');
const { frontmatter, body } = parseFrontmatter(content);
const tags = extractTags(content, frontmatter.tags);
const wikilinks = extractWikilinks(content);
parsed.push({
title,
content: body,
frontmatter,
tags,
wikilinks,
relativePath,
});
}
if (options.dryRun) {
return {
imported: parsed.length,
skipped: 0,
edges: parsed.reduce((sum, p) => sum + p.wikilinks.length, 0),
files: parsed.map(p => p.relativePath),
};
}
// Create nodes; query previously imported nodes once instead of once per file
const existingImports = listNodes({ limit: 10000 }).filter(n => n.tags.includes('obsidian'));
const nodeMap = new Map<string, Node>();
let imported = 0;
let skipped = 0;
for (const file of parsed) {
// Skip files already imported under the same title
const existing = existingImports.find(n => n.title === file.title);
if (existing) {
nodeMap.set(file.title.toLowerCase(), existing);
skipped++;
continue;
}
const node = await addNode({
kind: (options.kind || file.frontmatter.kind || 'memory') as any,
title: file.title,
content: file.content,
tags: ['obsidian', 'imported', ...file.tags],
status: file.frontmatter.status,
metadata: {
...file.frontmatter,
importedFrom: 'obsidian',
originalPath: file.relativePath,
importedAt: Date.now(),
},
});
nodeMap.set(file.title.toLowerCase(), node);
imported++;
}
// Create edges from wikilinks
let edgeCount = 0;
for (const file of parsed) {
const sourceNode = nodeMap.get(file.title.toLowerCase());
if (!sourceNode) continue;
for (const link of file.wikilinks) {
const targetTitle = link.split('|')[0].toLowerCase(); // Handle [[Page|Alias]]
const targetNode = nodeMap.get(targetTitle);
if (targetNode && targetNode.id !== sourceNode.id) {
try {
addEdge(sourceNode.id, targetNode.id, 'relates_to', { reason: 'wikilink' });
edgeCount++;
} catch {
// Edge might already exist
}
}
}
}
// Create folder hierarchy if requested
if (options.hierarchy) {
await createFolderHierarchy(parsed, nodeMap);
}
return {
imported,
skipped,
edges: edgeCount,
files: parsed.map(p => p.relativePath),
};
}
function findMarkdownFiles(dir: string): string[] {
const files: string[] = [];
function walk(currentDir: string) {
const entries = fs.readdirSync(currentDir, { withFileTypes: true });
for (const entry of entries) {
const fullPath = path.join(currentDir, entry.name);
// Skip hidden files/folders and common non-content folders
if (entry.name.startsWith('.') || entry.name === 'node_modules') continue;
if (entry.isDirectory()) {
walk(fullPath);
} else if (entry.isFile() && entry.name.endsWith('.md')) {
files.push(fullPath);
}
}
}
walk(dir);
return files;
}
function parseFrontmatter(content: string): { frontmatter: Record<string, any>; body: string } {
const match = content.match(/^---\n([\s\S]*?)\n---\n([\s\S]*)$/);
if (!match) {
return { frontmatter: {}, body: content };
}
const frontmatterStr = match[1];
const body = match[2];
// Simple YAML parsing (handles basic key: value and arrays)
const frontmatter: Record<string, any> = {};
const lines = frontmatterStr.split('\n');
for (const line of lines) {
const colonIndex = line.indexOf(':');
if (colonIndex === -1) continue;
const key = line.slice(0, colonIndex).trim();
let value: any = line.slice(colonIndex + 1).trim();
// Handle arrays [a, b, c]
if (value.startsWith('[') && value.endsWith(']')) {
value = value.slice(1, -1).split(',').map((s: string) => s.trim().replace(/^["']|["']$/g, ''));
}
// Handle quoted strings
else if ((value.startsWith('"') && value.endsWith('"')) || (value.startsWith("'") && value.endsWith("'"))) {
value = value.slice(1, -1);
}
// Handle booleans
else if (value === 'true') value = true;
else if (value === 'false') value = false;
if (key) frontmatter[key] = value;
}
return { frontmatter, body };
}
function extractTags(content: string, frontmatterTags?: string[]): string[] {
const tags = new Set<string>();
// Add frontmatter tags
if (Array.isArray(frontmatterTags)) {
frontmatterTags.forEach(t => tags.add(t));
}
// Extract #hashtags from content
const hashtagMatches = content.match(/#[\w-]+/g);
if (hashtagMatches) {
hashtagMatches.forEach(tag => tags.add(tag.slice(1))); // Remove #
}
return [...tags];
}
function extractWikilinks(content: string): string[] {
const links: string[] = [];
const matches = content.matchAll(/\[\[([^\]]+)\]\]/g);
for (const match of matches) {
links.push(match[1]);
}
return links;
}
async function createFolderHierarchy(parsed: ParsedFile[], nodeMap: Map<string, Node>): Promise<void> {
const folders = new Map<string, Node>();
for (const file of parsed) {
const dir = path.dirname(file.relativePath);
if (dir === '.') continue;
const parts = dir.split(path.sep);
let currentPath = '';
for (const part of parts) {
currentPath = currentPath ? `${currentPath}/${part}` : part;
if (!folders.has(currentPath)) {
// Create folder node
const folderNode = await addNode({
kind: 'component',
title: `Folder: ${part}`,
content: `Imported folder from Obsidian vault`,
tags: ['obsidian', 'folder'],
metadata: { folderPath: currentPath },
});
folders.set(currentPath, folderNode);
}
}
// Link file to its parent folder
const parentFolder = folders.get(dir);
const fileNode = nodeMap.get(file.title.toLowerCase());
if (parentFolder && fileNode) {
try {
addEdge(parentFolder.id, fileNode.id, 'contains');
} catch { /* Edge exists */ }
}
}
}
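The frontmatter parser above deliberately handles only a flat subset of YAML: scalar `key: value` pairs, inline `[a, b]` arrays, quoted strings, and booleans; nested maps and block lists are not supported. The value-coercion step in isolation:

```typescript
// Value coercion from parseFrontmatter above: inline arrays, quoted strings, booleans.
function parseValue(raw: string): any {
  let value: any = raw.trim();
  if (value.startsWith('[') && value.endsWith(']')) {
    return value.slice(1, -1).split(',').map((s: string) => s.trim().replace(/^["']|["']$/g, ''));
  }
  if ((value.startsWith('"') && value.endsWith('"')) || (value.startsWith("'") && value.endsWith("'"))) {
    return value.slice(1, -1);
  }
  if (value === 'true') return true;
  if (value === 'false') return false;
  return value;
}
const tags = parseValue('[notes, "daily log"]');
const quoted = parseValue('"2024-01-01"');
const flag = parseValue('true');
```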

architecture.ts (new file, 79 lines)
import * as path from 'path';
import { Node } from '../../types';
import { ProjectInfo } from './detector';
import { getDirectoryTree } from './scanner';
export interface ArchitectureSummary {
projectName: string;
projectType: string;
description: string;
techStack: string[];
keyComponents: { name: string; path: string; exports: number }[];
directoryStructure: string;
}
export function generateArchitectureSummary(
projectRoot: string,
projectInfo: ProjectInfo,
components: Node[]
): ArchitectureSummary {
// Determine tech stack from dependencies
const techStack: string[] = [projectInfo.type];
const deps = [...projectInfo.dependencies, ...(projectInfo.devDependencies || [])];
// Detect frameworks/libraries
if (deps.includes('react') || deps.includes('react-dom')) techStack.push('React');
if (deps.includes('vue')) techStack.push('Vue');
if (deps.includes('express')) techStack.push('Express');
if (deps.includes('fastify')) techStack.push('Fastify');
if (deps.includes('next')) techStack.push('Next.js');
if (deps.includes('typescript')) techStack.push('TypeScript');
if (deps.includes('prisma')) techStack.push('Prisma');
if (deps.includes('sequelize')) techStack.push('Sequelize');
if (deps.includes('mongoose')) techStack.push('MongoDB');
if (deps.includes('better-sqlite3') || deps.includes('sqlite3')) techStack.push('SQLite');
// Get key components (most exports or largest)
const keyComponents = components
.map(n => ({
name: n.title,
path: n.metadata?.filePath as string || '',
exports: (n.metadata?.exports as string[])?.length || 0,
}))
.sort((a, b) => b.exports - a.exports)
.slice(0, 10);
return {
projectName: projectInfo.name,
projectType: projectInfo.type,
description: projectInfo.description || `A ${projectInfo.type} project`,
techStack,
keyComponents,
directoryStructure: getDirectoryTree(projectRoot, 3),
};
}
export function formatArchitectureAsMarkdown(summary: ArchitectureSummary): string {
const sections: string[] = [];
sections.push(`# ${summary.projectName} Architecture\n`);
sections.push(`${summary.description}\n`);
sections.push(`## Tech Stack\n`);
sections.push(summary.techStack.map(t => `- ${t}`).join('\n') + '\n');
if (summary.keyComponents.length > 0) {
sections.push(`## Key Components\n`);
for (const comp of summary.keyComponents) {
sections.push(`- **${comp.name}** (${comp.path}) - ${comp.exports} exports`);
}
sections.push('');
}
sections.push(`## Directory Structure\n`);
sections.push('```');
sections.push(summary.directoryStructure);
sections.push('```');
return sections.join('\n');
}

detector.ts (new file, 145 lines)
import * as fs from 'fs';
import * as path from 'path';
export type ProjectType = 'nodejs' | 'python' | 'rust' | 'go' | 'generic';
export interface ProjectInfo {
type: ProjectType;
name: string;
description?: string;
dependencies: string[];
devDependencies?: string[];
scripts?: Record<string, string>;
entryPoints: string[];
}
export async function detectProjectType(root: string): Promise<ProjectInfo> {
const absRoot = path.resolve(root);
// Check for Node.js (package.json)
const packageJsonPath = path.join(absRoot, 'package.json');
if (fs.existsSync(packageJsonPath)) {
return parseNodeProject(absRoot, packageJsonPath);
}
// Check for Python
const pyprojectPath = path.join(absRoot, 'pyproject.toml');
const setupPyPath = path.join(absRoot, 'setup.py');
const requirementsPath = path.join(absRoot, 'requirements.txt');
if (fs.existsSync(pyprojectPath) || fs.existsSync(setupPyPath) || fs.existsSync(requirementsPath)) {
return parsePythonProject(absRoot);
}
// Check for Rust (Cargo.toml)
const cargoPath = path.join(absRoot, 'Cargo.toml');
if (fs.existsSync(cargoPath)) {
return parseRustProject(absRoot, cargoPath);
}
// Check for Go (go.mod)
const goModPath = path.join(absRoot, 'go.mod');
if (fs.existsSync(goModPath)) {
return parseGoProject(absRoot, goModPath);
}
return parseGenericProject(absRoot);
}
function parseNodeProject(root: string, packageJsonPath: string): ProjectInfo {
const content = fs.readFileSync(packageJsonPath, 'utf-8');
const pkg = JSON.parse(content);
const deps = Object.keys(pkg.dependencies || {});
const devDeps = Object.keys(pkg.devDependencies || {});
const entryPoints: string[] = [];
if (pkg.main) entryPoints.push(pkg.main);
if (pkg.bin) {
if (typeof pkg.bin === 'string') entryPoints.push(pkg.bin);
else entryPoints.push(...Object.values(pkg.bin) as string[]);
}
return {
type: 'nodejs',
name: pkg.name || path.basename(root),
description: pkg.description,
dependencies: deps,
devDependencies: devDeps,
scripts: pkg.scripts,
entryPoints,
};
}
function parsePythonProject(root: string): ProjectInfo {
const name = path.basename(root);
const deps: string[] = [];
const entryPoints: string[] = [];
const reqPath = path.join(root, 'requirements.txt');
if (fs.existsSync(reqPath)) {
const content = fs.readFileSync(reqPath, 'utf-8');
for (const line of content.split('\n')) {
const trimmed = line.trim();
if (trimmed && !trimmed.startsWith('#')) {
const pkg = trimmed.split(/[=<>!]/)[0].trim();
if (pkg) deps.push(pkg);
}
}
}
for (const entry of ['main.py', 'app.py', '__main__.py']) {
if (fs.existsSync(path.join(root, entry))) entryPoints.push(entry);
}
return { type: 'python', name, dependencies: deps, entryPoints };
}
function parseRustProject(root: string, cargoPath: string): ProjectInfo {
const content = fs.readFileSync(cargoPath, 'utf-8');
const nameMatch = content.match(/^name\s*=\s*["']([^"']+)["']/m);
const descMatch = content.match(/^description\s*=\s*["']([^"']+)["']/m);
const deps: string[] = [];
const depsSection = content.match(/\[dependencies\]([\s\S]*?)(?:\[|$)/);
if (depsSection) {
const depLines = depsSection[1].match(/^(\w[\w-]*)\s*=/gm);
if (depLines) deps.push(...depLines.map(d => d.replace(/\s*=.*/, '')));
}
const entryPoints: string[] = [];
if (fs.existsSync(path.join(root, 'src/main.rs'))) entryPoints.push('src/main.rs');
if (fs.existsSync(path.join(root, 'src/lib.rs'))) entryPoints.push('src/lib.rs');
return {
type: 'rust',
name: nameMatch?.[1] || path.basename(root),
description: descMatch?.[1],
dependencies: deps,
entryPoints,
};
}
function parseGoProject(root: string, goModPath: string): ProjectInfo {
const content = fs.readFileSync(goModPath, 'utf-8');
const moduleMatch = content.match(/^module\s+(\S+)/m);
const deps: string[] = [];
const requireMatch = content.match(/require\s*\(([\s\S]*?)\)/);
if (requireMatch) {
const reqLines = requireMatch[1].match(/^\s*(\S+)\s+v/gm);
if (reqLines) deps.push(...reqLines.map(d => d.trim().split(/\s/)[0]));
}
const entryPoints: string[] = [];
if (fs.existsSync(path.join(root, 'main.go'))) entryPoints.push('main.go');
return {
type: 'go',
name: moduleMatch?.[1]?.split('/').pop() || path.basename(root),
dependencies: deps,
entryPoints,
};
}
function parseGenericProject(root: string): ProjectInfo {
const name = path.basename(root);
let description: string | undefined;
for (const readme of ['README.md', 'README.txt', 'README']) {
const readmePath = path.join(root, readme);
if (fs.existsSync(readmePath)) {
const content = fs.readFileSync(readmePath, 'utf-8');
const firstPara = content.split(/\n\n/)[0].replace(/^#.*\n/, '').trim();
if (firstPara) description = firstPara.slice(0, 200);
break;
}
}
return { type: 'generic', name, description, dependencies: [], entryPoints: [] };
}
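The requirements.txt handling above keeps only the package name by splitting on the first version operator; note that it does not strip extras such as `pkg[extra]` or the tilde in the `~=` operator. The extraction step on its own:

```typescript
// requirements.txt name extraction from parsePythonProject above.
function parseRequirement(line: string): string | null {
  const trimmed = line.trim();
  if (!trimmed || trimmed.startsWith('#')) return null;
  const pkg = trimmed.split(/[=<>!]/)[0].trim();
  return pkg || null;
}
const pinned = parseRequirement('requests==2.31.0');
const ranged = parseRequirement('numpy>=1.24');
const comment = parseRequirement('# dev tools');
```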

index.ts (new file, 6 lines)
export { detectProjectType, type ProjectType, type ProjectInfo } from './detector';
export { scanFiles, getDirectoryTree, type ScanOptions, type ScannedFile } from './scanner';
export { mapRelationships, resolveImportPath, loadIndexState, saveIndexState, type ComponentNode, type MappedRelationship, type IndexState } from './mapper';
export { generateArchitectureSummary, formatArchitectureAsMarkdown, type ArchitectureSummary } from './architecture';
export { indexProject, type IndexOptions, type IndexResult } from './indexProject';
export * from './parsers';


@@ -0,0 +1,246 @@
import * as path from 'path';
import { addNode, addEdge, listNodes, updateNode, removeNode } from '../store';
import { detectProjectType, ProjectInfo } from './detector';
import { scanFiles, ScannedFile } from './scanner';
import { parseTypeScript, parsePython } from './parsers';
import { mapRelationships, loadIndexState, saveIndexState, IndexState, ComponentNode } from './mapper';
import { generateArchitectureSummary, formatArchitectureAsMarkdown } from './architecture';
import { Node } from '../../types';
export interface IndexOptions {
update?: boolean;
dryRun?: boolean;
maxDepth?: number;
language?: string;
ignore?: string[];
}
export interface IndexResult {
projectName: string;
projectType: string;
componentsCreated: number;
componentsUpdated: number;
componentsRemoved: number;
relationshipsCreated: number;
architectureNodeId: string | null;
files: string[];
}
export async function indexProject(root: string, options: IndexOptions = {}): Promise<IndexResult> {
const absRoot = path.resolve(root);
const projectInfo = await detectProjectType(absRoot);
const state = options.update ? loadIndexState(absRoot) : null;
// Filter extensions by language if specified
let extensions: string[] | undefined;
if (options.language) {
const langMap: Record<string, string[]> = {
ts: ['.ts', '.tsx'],
typescript: ['.ts', '.tsx'],
js: ['.js', '.jsx', '.mjs', '.cjs'],
javascript: ['.js', '.jsx', '.mjs', '.cjs'],
py: ['.py'],
python: ['.py'],
};
extensions = langMap[options.language.toLowerCase()];
}
// Scan files
const files = await scanFiles(absRoot, {
maxDepth: options.maxDepth,
ignore: options.ignore,
extensions,
});
if (options.dryRun) {
return {
projectName: projectInfo.name,
projectType: projectInfo.type,
componentsCreated: files.length,
componentsUpdated: 0,
componentsRemoved: 0,
relationshipsCreated: 0,
architectureNodeId: null,
files: files.map(f => f.relativePath),
};
}
// Track what we're creating
const newState: IndexState = {
projectPath: absRoot,
projectName: projectInfo.name,
lastIndexed: Date.now(),
fileHashes: {},
nodeIds: {},
};
const componentNodes: ComponentNode[] = [];
let created = 0;
let updated = 0;
// Process each file
for (const file of files) {
// Check if file changed (for incremental updates)
if (state && state.fileHashes[file.relativePath] === file.hash) {
// File unchanged, keep existing node
if (state.nodeIds[file.relativePath]) {
newState.fileHashes[file.relativePath] = file.hash;
newState.nodeIds[file.relativePath] = state.nodeIds[file.relativePath];
// Still need to add to componentNodes for relationship mapping
// Scan without a limit: `limit: 1` would return only one node and the id lookup would almost always miss
const existingNode = listNodes({}).find(n => n.id === state.nodeIds[file.relativePath]);
if (existingNode) {
componentNodes.push({
node: existingNode,
filePath: file.relativePath,
imports: (existingNode.metadata?.imports as string[]) || [],
});
}
}
continue;
}
// Parse file based on extension
let parsed: { exports: any[]; imports: any[]; classes: any[]; functions: any[]; loc: number } | null = null;
try {
if (['.ts', '.tsx', '.js', '.jsx', '.mjs', '.cjs'].includes(file.extension)) {
parsed = await parseTypeScript(file.path);
} else if (file.extension === '.py') {
parsed = await parsePython(file.path);
}
} catch {
// Skip files that fail to parse
continue;
}
if (!parsed) continue;
// Create component title from file path
const fileName = path.basename(file.relativePath, file.extension);
const dirName = path.dirname(file.relativePath);
const title = dirName === '.' ? fileName : `${dirName}/${fileName}`;
// Build content summary
const exportNames = parsed.exports.map((e: any) => e.name || e).slice(0, 20);
const classNames = parsed.classes.map((c: any) => c.name);
const funcNames = parsed.functions.map((f: any) => f.name).slice(0, 10);
const contentParts: string[] = [];
if (exportNames.length) contentParts.push(`Exports: ${exportNames.join(', ')}`);
if (classNames.length) contentParts.push(`Classes: ${classNames.join(', ')}`);
if (funcNames.length) contentParts.push(`Functions: ${funcNames.join(', ')}`);
contentParts.push(`Lines: ${parsed.loc}`);
const content = contentParts.join('\n');
const imports = parsed.imports.map((i: any) => i.source || i.module);
// Check if node exists
const existingNodeId = state?.nodeIds[file.relativePath];
let node: Node;
if (existingNodeId) {
// Update existing
const updatedNode = await updateNode(existingNodeId, {
content,
metadata: {
filePath: file.relativePath,
extension: file.extension,
exports: exportNames,
imports,
loc: parsed.loc,
indexedAt: Date.now(),
},
});
if (updatedNode) {
node = updatedNode;
updated++;
} else {
continue;
}
} else {
// Create new
node = await addNode({
kind: 'component',
title,
content,
tags: [projectInfo.name, file.extension.slice(1), 'indexed'],
metadata: {
filePath: file.relativePath,
extension: file.extension,
exports: exportNames,
imports,
loc: parsed.loc,
indexedAt: Date.now(),
},
});
created++;
}
newState.fileHashes[file.relativePath] = file.hash;
newState.nodeIds[file.relativePath] = node.id;
componentNodes.push({ node, filePath: file.relativePath, imports });
}
// Remove nodes for deleted files
let removed = 0;
if (state) {
for (const [filePath, nodeId] of Object.entries(state.nodeIds)) {
if (!newState.nodeIds[filePath]) {
removeNode(nodeId, true);
removed++;
}
}
}
// Map relationships
const relationships = mapRelationships(componentNodes, absRoot);
let relCreated = 0;
for (const rel of relationships) {
try {
addEdge(rel.fromId, rel.toId, rel.type);
relCreated++;
} catch {
// Edge might already exist
}
}
// Create/update architecture summary node
let archNodeId: string | null = null;
const archTitle = `${projectInfo.name} Architecture`;
const existingArch = listNodes({ kind: 'component', tags: [projectInfo.name, 'architecture'] });
const summary = generateArchitectureSummary(absRoot, projectInfo, componentNodes.map(c => c.node));
const archContent = formatArchitectureAsMarkdown(summary);
if (existingArch.length > 0) {
await updateNode(existingArch[0].id, { content: archContent });
archNodeId = existingArch[0].id;
} else {
const archNode = await addNode({
kind: 'component',
title: archTitle,
content: archContent,
tags: [projectInfo.name, 'architecture', 'indexed'],
metadata: {
projectType: projectInfo.type,
techStack: summary.techStack,
componentCount: componentNodes.length,
indexedAt: Date.now(),
},
});
archNodeId = archNode.id;
}
// Save state
saveIndexState(absRoot, newState);
return {
projectName: projectInfo.name,
projectType: projectInfo.type,
componentsCreated: created,
componentsUpdated: updated,
componentsRemoved: removed,
relationshipsCreated: relCreated,
architectureNodeId: archNodeId,
files: files.map(f => f.relativePath),
};
}
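The incremental-update path in `indexProject` boils down to a hash comparison against the saved state: only files whose content hash changed (or that are new) are re-parsed. A minimal sketch of that decision with placeholder hashes:

```typescript
// Sketch of the hash check in indexProject; the hashes are placeholders, not real digests.
interface IndexStateLite { fileHashes: Record<string, string> }

const previous: IndexStateLite = { fileHashes: { 'src/a.ts': 'aaa', 'src/b.ts': 'bbb' } };
const scanned = [
  { relativePath: 'src/a.ts', hash: 'aaa' }, // unchanged -> node reused
  { relativePath: 'src/b.ts', hash: 'ccc' }, // changed   -> node updated
  { relativePath: 'src/c.ts', hash: 'ddd' }, // new       -> node created
];

const toReparse = scanned.filter(f => previous.fileHashes[f.relativePath] !== f.hash);
console.log(toReparse.map(f => f.relativePath)); // [ 'src/b.ts', 'src/c.ts' ]
```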

109
src/core/indexer/mapper.ts Normal file

@@ -0,0 +1,109 @@
import * as path from 'path';
import * as fs from 'fs';
import { Node } from '../../types';
export interface ComponentNode {
node: Node;
filePath: string;
imports: string[];
}
export interface MappedRelationship {
fromId: string;
toId: string;
fromPath: string;
toPath: string;
type: 'depends_on' | 'contains';
}
export function mapRelationships(components: ComponentNode[], projectRoot: string): MappedRelationship[] {
const relationships: MappedRelationship[] = [];
const pathToComponent = new Map<string, ComponentNode>();
// Build lookup map
for (const comp of components) {
pathToComponent.set(comp.filePath, comp);
// Also map without extension for JS/TS imports
const withoutExt = comp.filePath.replace(/\.(ts|tsx|js|jsx|mjs|cjs)$/, '');
pathToComponent.set(withoutExt, comp);
}
// Map import relationships
for (const comp of components) {
for (const importPath of comp.imports) {
const resolved = resolveImportPath(importPath, comp.filePath, projectRoot);
if (resolved) {
const target = pathToComponent.get(resolved) || pathToComponent.get(resolved.replace(/\.(ts|tsx|js|jsx)$/, ''));
if (target && target.node.id !== comp.node.id) {
relationships.push({
fromId: comp.node.id,
toId: target.node.id,
fromPath: comp.filePath,
toPath: target.filePath,
type: 'depends_on',
});
}
}
}
}
return relationships;
}
export function resolveImportPath(importSource: string, fromFile: string, projectRoot: string): string | null {
// Skip external packages
if (!importSource.startsWith('.') && !importSource.startsWith('/')) {
return null;
}
const fromDir = path.dirname(fromFile);
let resolved: string;
if (importSource.startsWith('.')) {
resolved = path.resolve(fromDir, importSource);
} else {
resolved = path.resolve(projectRoot, importSource);
}
// Try with various extensions
const extensions = ['.ts', '.tsx', '.js', '.jsx', '.mjs', '/index.ts', '/index.js'];
for (const ext of extensions) {
const withExt = resolved + ext;
if (fs.existsSync(withExt)) {
return path.relative(projectRoot, withExt);
}
}
// Check if it exists as-is
if (fs.existsSync(resolved)) {
return path.relative(projectRoot, resolved);
}
return null;
}
export interface IndexState {
projectPath: string;
projectName: string;
lastIndexed: number;
fileHashes: Record<string, string>;
nodeIds: Record<string, string>;
}
const STATE_FILE = '.cortex-index.json';
export function loadIndexState(projectRoot: string): IndexState | null {
const statePath = path.join(projectRoot, STATE_FILE);
if (!fs.existsSync(statePath)) return null;
try {
const content = fs.readFileSync(statePath, 'utf-8');
return JSON.parse(content);
} catch {
return null;
}
}
export function saveIndexState(projectRoot: string, state: IndexState): void {
const statePath = path.join(projectRoot, STATE_FILE);
fs.writeFileSync(statePath, JSON.stringify(state, null, 2));
}
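`resolveImportPath` works by probing the filesystem with a list of candidate extensions. A self-contained sketch of that probing logic against a temporary directory (the helper below reimplements the core loop for illustration; it is not the exported function):

```typescript
import * as fs from 'fs';
import * as os from 'os';
import * as path from 'path';

// Build a throwaway project: src/helpers.ts and src/utils/index.ts
const root = fs.mkdtempSync(path.join(os.tmpdir(), 'cortex-'));
fs.mkdirSync(path.join(root, 'src/utils'), { recursive: true });
fs.writeFileSync(path.join(root, 'src/utils/index.ts'), 'export const x = 1;');
fs.writeFileSync(path.join(root, 'src/helpers.ts'), 'export const y = 2;');

function resolveImport(importSource: string, fromFile: string): string | null {
  if (!importSource.startsWith('.')) return null; // external package
  const abs = path.resolve(path.join(root, path.dirname(fromFile)), importSource);
  // Try each candidate suffix, mirroring the extension list in resolveImportPath
  for (const ext of ['.ts', '.tsx', '.js', '/index.ts', '/index.js']) {
    if (fs.existsSync(abs + ext)) return path.relative(root, abs + ext);
  }
  return fs.existsSync(abs) ? path.relative(root, abs) : null;
}

const direct = resolveImport('./helpers', 'src/app.ts');   // src/helpers.ts on POSIX
const viaIndex = resolveImport('./utils', 'src/app.ts');   // src/utils/index.ts on POSIX
const external = resolveImport('lodash', 'src/app.ts');    // null
console.log(direct, viaIndex, external);
fs.rmSync(root, { recursive: true, force: true });
```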


@@ -0,0 +1,2 @@
export { parseTypeScript, type ParsedTSFile, type ExportInfo, type ImportInfo, type ClassInfo, type FunctionInfo } from './typescript';
export { parsePython, type ParsedPyFile, type PyImport, type PyClass, type PyFunction } from './python';


@@ -0,0 +1,120 @@
import * as fs from 'fs';
export interface ParsedPyFile {
filePath: string;
language: 'python';
imports: PyImport[];
classes: PyClass[];
functions: PyFunction[];
exports: string[];
loc: number;
}
export interface PyImport {
module: string;
names: string[];
isRelative: boolean;
line: number;
}
export interface PyClass {
name: string;
bases: string[];
methods: string[];
decorators: string[];
line: number;
}
export interface PyFunction {
name: string;
isAsync: boolean;
decorators: string[];
line: number;
}
export async function parsePython(filePath: string): Promise<ParsedPyFile> {
const content = fs.readFileSync(filePath, 'utf-8');
const lines = content.split('\n');
const imports: PyImport[] = [];
const classes: PyClass[] = [];
const functions: PyFunction[] = [];
let exports: string[] = [];
let currentDecorators: string[] = [];
for (let i = 0; i < lines.length; i++) {
const line = lines[i];
const lineNum = i + 1;
// __all__ exports
const allMatch = line.match(/^__all__\s*=\s*\[([^\]]+)\]/);
if (allMatch) {
exports = allMatch[1].match(/['"](\w+)['"]/g)?.map(s => s.slice(1, -1)) || [];
}
// Decorators
const decoMatch = line.match(/^@(\w+)/);
if (decoMatch) {
currentDecorators.push(decoMatch[1]);
continue;
}
// Import statements
const importMatch = line.match(/^import\s+(\S+)/);
if (importMatch) {
imports.push({ module: importMatch[1], names: [importMatch[1]], isRelative: false, line: lineNum });
currentDecorators = [];
continue;
}
const fromMatch = line.match(/^from\s+(\S+)\s+import\s+(.+)/);
if (fromMatch) {
const module = fromMatch[1];
const names = fromMatch[2].split(',').map(n => n.trim().split(/\s+as\s+/)[0]).filter(Boolean);
imports.push({ module, names, isRelative: module.startsWith('.'), line: lineNum });
currentDecorators = [];
continue;
}
// Class definitions
const classMatch = line.match(/^class\s+(\w+)(?:\(([^)]*)\))?:/);
if (classMatch) {
const className = classMatch[1];
const bases = classMatch[2]?.split(',').map(b => b.trim()).filter(Boolean) || [];
const methods: string[] = [];
// Find methods
for (let j = i + 1; j < lines.length; j++) {
const methodLine = lines[j];
if (methodLine.match(/^\S/) && !methodLine.match(/^\s*#/) && !methodLine.match(/^\s*$/)) break;
const methodMatch = methodLine.match(/^\s+(?:async\s+)?def\s+(\w+)/);
if (methodMatch) methods.push(methodMatch[1]);
}
classes.push({ name: className, bases, methods, decorators: currentDecorators, line: lineNum });
currentDecorators = [];
continue;
}
// Function definitions (top-level only)
const funcMatch = line.match(/^(async\s+)?def\s+(\w+)/);
if (funcMatch) {
functions.push({
name: funcMatch[2],
isAsync: !!funcMatch[1],
decorators: currentDecorators,
line: lineNum,
});
currentDecorators = [];
continue;
}
// Reset decorators if we hit a non-decorator, non-function/class line
if (!line.match(/^\s*$/) && !line.match(/^\s*#/)) {
currentDecorators = [];
}
}
return { filePath, language: 'python', imports, classes, functions, exports, loc: lines.length };
}
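The Python import handling above is line-oriented regex matching. A minimal sketch of the `import` / `from ... import` branches applied to inline sample lines (the sample lines are hypothetical):

```typescript
// Sample Python source lines run through the same regexes parsePython uses.
const pyLines = [
  'from os.path import join, basename as base',
  'from .models import User',
  'import sys',
];

type ParsedImport = { module: string; names: string[]; isRelative: boolean };
const imports: ParsedImport[] = [];
for (const line of pyLines) {
  const fromMatch = line.match(/^from\s+(\S+)\s+import\s+(.+)/);
  if (fromMatch) {
    // `as` aliases are stripped back to the original name
    const names = fromMatch[2].split(',').map(n => n.trim().split(/\s+as\s+/)[0]).filter(Boolean);
    imports.push({ module: fromMatch[1], names, isRelative: fromMatch[1].startsWith('.') });
    continue;
  }
  const importMatch = line.match(/^import\s+(\S+)/);
  if (importMatch) imports.push({ module: importMatch[1], names: [importMatch[1]], isRelative: false });
}

console.log(imports);
```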


@@ -0,0 +1,132 @@
import * as fs from 'fs';
import * as path from 'path';
export interface ParsedTSFile {
filePath: string;
language: 'typescript' | 'javascript';
exports: ExportInfo[];
imports: ImportInfo[];
classes: ClassInfo[];
functions: FunctionInfo[];
loc: number;
}
export interface ExportInfo {
name: string;
kind: 'function' | 'class' | 'interface' | 'type' | 'const' | 'default' | 'enum';
line: number;
}
export interface ImportInfo {
source: string;
names: string[];
line: number;
}
export interface ClassInfo {
name: string;
extends?: string;
implements: string[];
methods: string[];
isExported: boolean;
line: number;
}
export interface FunctionInfo {
name: string;
isAsync: boolean;
isExported: boolean;
line: number;
}
export async function parseTypeScript(filePath: string): Promise<ParsedTSFile> {
const content = fs.readFileSync(filePath, 'utf-8');
const ext = path.extname(filePath).toLowerCase();
const language = ext === '.ts' || ext === '.tsx' ? 'typescript' : 'javascript';
return parseContent(filePath, content, language);
}
function parseContent(filePath: string, content: string, language: 'typescript' | 'javascript'): ParsedTSFile {
const lines = content.split('\n');
const exports: ExportInfo[] = [];
const imports: ImportInfo[] = [];
const classes: ClassInfo[] = [];
const functions: FunctionInfo[] = [];
// Parse imports
const importRegex = /^import\s+.*?\s+from\s+['"]([^'"]+)['"]/gm;
let match: RegExpExecArray | null;
while ((match = importRegex.exec(content)) !== null) {
const line = content.slice(0, match.index).split('\n').length;
const clause = match[0].replace(/^import\s+/, '').replace(/\s+from.*/, '');
const names: string[] = [];
const namedMatch = clause.match(/\{([^}]+)\}/);
if (namedMatch) names.push(...namedMatch[1].split(',').map(n => n.trim().split(/\s+as\s+/)[0]).filter(Boolean));
const defaultMatch = clause.match(/^(\w+)/);
if (defaultMatch && !clause.startsWith('{') && !clause.startsWith('*')) names.push(defaultMatch[1]);
if (clause.includes('*')) names.push('*');
imports.push({ source: match[1], names, line });
}
// Parse line by line
for (let i = 0; i < lines.length; i++) {
const line = lines[i];
const lineNum = i + 1;
// Export const/let
const constMatch = line.match(/^export\s+(const|let|var)\s+(\w+)/);
if (constMatch) exports.push({ name: constMatch[2], kind: 'const', line: lineNum });
// Export default
if (line.match(/^export\s+default\s+/)) {
const m = line.match(/^export\s+default\s+(?:class|function)?\s*(\w+)?/);
exports.push({ name: m?.[1] || 'default', kind: 'default', line: lineNum });
}
// Export type/interface
const typeMatch = line.match(/^export\s+(type|interface)\s+(\w+)/);
if (typeMatch) exports.push({ name: typeMatch[2], kind: typeMatch[1] as 'type' | 'interface', line: lineNum });
// Export enum
const enumMatch = line.match(/^export\s+enum\s+(\w+)/);
if (enumMatch) exports.push({ name: enumMatch[1], kind: 'enum', line: lineNum });
// Classes
const classMatch = line.match(/^(export\s+)?(abstract\s+)?class\s+(\w+)(?:\s+extends\s+(\w+))?(?:\s+implements\s+([^{]+))?/);
if (classMatch) {
const isExported = !!classMatch[1];
const className = classMatch[3];
const methods: string[] = [];
for (let j = i + 1; j < lines.length && j < i + 100; j++) {
if (lines[j].match(/^\s*\}/) && !lines[j].match(/^\s*\}\s*\)/)) break;
const methodMatch = lines[j].match(/^\s+(async\s+)?(\w+)\s*\(/);
if (methodMatch && methodMatch[2] !== 'constructor') methods.push(methodMatch[2]);
}
classes.push({
name: className,
extends: classMatch[4],
implements: classMatch[5]?.split(',').map(s => s.trim()).filter(Boolean) || [],
methods,
isExported,
line: lineNum,
});
if (isExported) exports.push({ name: className, kind: 'class', line: lineNum });
}
// Functions
const funcMatch = line.match(/^(export\s+)?(async\s+)?function\s+(\w+)/);
if (funcMatch) {
const isExported = !!funcMatch[1];
functions.push({ name: funcMatch[3], isAsync: !!funcMatch[2], isExported, line: lineNum });
if (isExported) exports.push({ name: funcMatch[3], kind: 'function', line: lineNum });
}
// Arrow function exports
const arrowMatch = line.match(/^export\s+(const|let)\s+(\w+)\s*=\s*(async\s+)?[\(]/);
if (arrowMatch) {
functions.push({ name: arrowMatch[2], isAsync: !!arrowMatch[3], isExported: true, line: lineNum });
}
}
return { filePath, language, exports, imports, classes, functions, loc: lines.length };
}
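The import-clause handling in `parseContent` strips the `import`/`from` keywords and then distinguishes named, default, and namespace imports. A self-contained sketch of that extraction, exercised on a few representative statements:

```typescript
// Mirrors the clause/name extraction logic from parseContent above.
function importNames(stmt: string): string[] {
  const clause = stmt.replace(/^import\s+/, '').replace(/\s+from.*/, '');
  const names: string[] = [];
  const namedMatch = clause.match(/\{([^}]+)\}/);
  if (namedMatch) names.push(...namedMatch[1].split(',').map(n => n.trim().split(/\s+as\s+/)[0]).filter(Boolean));
  const defaultMatch = clause.match(/^(\w+)/);
  if (defaultMatch && !clause.startsWith('{') && !clause.startsWith('*')) names.push(defaultMatch[1]);
  if (clause.includes('*')) names.push('*');
  return names;
}

console.log(importNames(`import fs from 'fs'`));                   // [ 'fs' ]
console.log(importNames(`import { join, sep as s } from 'path'`)); // [ 'join', 'sep' ]
console.log(importNames(`import * as crypto from 'crypto'`));      // [ '*' ]
```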

108
src/core/indexer/scanner.ts Normal file

@@ -0,0 +1,108 @@
import * as fs from 'fs';
import * as path from 'path';
import * as crypto from 'crypto';
export interface ScanOptions {
maxDepth?: number;
ignore?: string[];
extensions?: string[];
}
export interface ScannedFile {
path: string;
relativePath: string;
extension: string;
size: number;
hash: string;
}
const DEFAULT_IGNORE = [
'node_modules', '.git', 'dist', 'build', '__pycache__', '.pytest_cache',
'.mypy_cache', '.env', 'coverage', '.next', '.nuxt', 'target', 'vendor',
'.idea', '.vscode', '.DS_Store', 'Thumbs.db',
];
const CODE_EXTENSIONS = [
'.ts', '.tsx', '.js', '.jsx', '.mjs', '.cjs',
'.py', '.rs', '.go', '.java', '.kt', '.c', '.cpp', '.h', '.hpp',
'.cs', '.rb', '.php', '.swift', '.vue', '.svelte',
];
export async function scanFiles(root: string, options: ScanOptions = {}): Promise<ScannedFile[]> {
const absRoot = path.resolve(root);
const maxDepth = options.maxDepth ?? 10;
const ignorePatterns = [...DEFAULT_IGNORE, ...(options.ignore || [])];
const extensions = options.extensions || CODE_EXTENSIONS;
const files: ScannedFile[] = [];
function shouldIgnore(name: string): boolean {
for (const pattern of ignorePatterns) {
if (pattern.startsWith('*.')) {
if (name.endsWith(pattern.slice(1))) return true;
} else if (name === pattern) return true;
}
return false;
}
function walk(dir: string, depth: number): void {
if (depth > maxDepth) return;
let entries: fs.Dirent[];
try {
entries = fs.readdirSync(dir, { withFileTypes: true });
} catch { return; }
for (const entry of entries) {
if (shouldIgnore(entry.name)) continue;
const fullPath = path.join(dir, entry.name);
if (entry.isDirectory()) {
walk(fullPath, depth + 1);
} else if (entry.isFile()) {
const ext = path.extname(entry.name).toLowerCase();
if (extensions.includes(ext)) {
try {
const stat = fs.statSync(fullPath);
const content = fs.readFileSync(fullPath);
files.push({
path: fullPath,
relativePath: path.relative(absRoot, fullPath),
extension: ext,
size: stat.size,
hash: crypto.createHash('md5').update(content).digest('hex'),
});
} catch { /* skip */ }
}
}
}
}
walk(absRoot, 0);
return files;
}
export function getDirectoryTree(root: string, maxDepth: number = 3): string {
const absRoot = path.resolve(root);
const lines: string[] = [path.basename(absRoot) + '/'];
function walk(dir: string, prefix: string, depth: number): void {
if (depth > maxDepth) return;
let entries: fs.Dirent[];
try { entries = fs.readdirSync(dir, { withFileTypes: true }); } catch { return; }
const filtered = entries.filter(e => !DEFAULT_IGNORE.includes(e.name) && !e.name.startsWith('.'));
const dirs = filtered.filter(e => e.isDirectory()).sort((a, b) => a.name.localeCompare(b.name));
const files = filtered.filter(e => e.isFile()).sort((a, b) => a.name.localeCompare(b.name));
const sorted = [...dirs, ...files];
sorted.forEach((entry, i) => {
const isLast = i === sorted.length - 1;
lines.push(prefix + (isLast ? '└── ' : '├── ') + entry.name + (entry.isDirectory() ? '/' : ''));
if (entry.isDirectory()) {
walk(path.join(dir, entry.name), prefix + (isLast ? ' ' : '│ '), depth + 1);
}
});
}
walk(absRoot, '', 0);
return lines.join('\n');
}
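The scanner hashes each file's bytes with MD5 so later runs can detect changes without re-parsing. A minimal sketch of that step:

```typescript
import * as crypto from 'crypto';

// Same hashing call scanFiles uses for change detection.
function contentHash(content: string | Buffer): string {
  return crypto.createHash('md5').update(content).digest('hex');
}

const before = contentHash('export const x = 1;');
const after = contentHash('export const x = 2;');
console.log(before !== after); // true: the file would be re-indexed
```

MD5 is fine here because the hash is only a change detector, not a security boundary.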


@@ -0,0 +1,93 @@
export interface ChunkOptions {
maxTokens?: number;
overlap?: number;
splitOn?: 'paragraph' | 'sentence' | 'heading';
}
export interface Chunk {
index: number;
content: string;
tokenEstimate: number;
}
const DEFAULT_MAX_TOKENS = 1000;
const DEFAULT_OVERLAP = 100;
const CHARS_PER_TOKEN = 4; // Rough estimate
export function chunkContent(content: string, options: ChunkOptions = {}): Chunk[] {
const maxTokens = options.maxTokens || DEFAULT_MAX_TOKENS;
const overlap = options.overlap || DEFAULT_OVERLAP;
const splitOn = options.splitOn || 'paragraph';
const maxChars = maxTokens * CHARS_PER_TOKEN;
const overlapChars = overlap * CHARS_PER_TOKEN;
// If content is small enough, return as single chunk
if (content.length <= maxChars) {
return [{
index: 0,
content,
tokenEstimate: Math.ceil(content.length / CHARS_PER_TOKEN),
}];
}
// Split into segments based on strategy
const segments = splitContent(content, splitOn);
// Combine segments into chunks
const chunks: Chunk[] = [];
let currentChunk = '';
let chunkIndex = 0;
for (const segment of segments) {
if (currentChunk.length + segment.length > maxChars && currentChunk.length > 0) {
// Save current chunk
chunks.push({
index: chunkIndex++,
content: currentChunk.trim(),
tokenEstimate: Math.ceil(currentChunk.length / CHARS_PER_TOKEN),
});
// Start new chunk with overlap
if (overlapChars > 0 && currentChunk.length > overlapChars) {
currentChunk = currentChunk.slice(-overlapChars) + segment;
} else {
currentChunk = segment;
}
} else {
currentChunk += segment;
}
}
// Don't forget the last chunk
if (currentChunk.trim()) {
chunks.push({
index: chunkIndex,
content: currentChunk.trim(),
tokenEstimate: Math.ceil(currentChunk.length / CHARS_PER_TOKEN),
});
}
return chunks;
}
function splitContent(content: string, strategy: 'paragraph' | 'sentence' | 'heading'): string[] {
switch (strategy) {
case 'heading':
// Split on markdown headings
return content.split(/(?=^#{1,6}\s)/m).filter(s => s.trim());
case 'sentence':
// Split on sentence boundaries
return content.split(/(?<=[.!?])\s+/).filter(s => s.trim());
case 'paragraph':
default:
// Split on double newlines (paragraphs)
return content.split(/\n\n+/).filter(s => s.trim()).map(s => s + '\n\n');
}
}
export function estimateTokens(text: string): number {
return Math.ceil(text.length / CHARS_PER_TOKEN);
}
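The chunker's paragraph strategy splits on blank lines and then greedily packs segments until the character budget (tokens × 4 chars) is exceeded. A reduced sketch of that packing on a tiny inline document (overlap omitted for brevity):

```typescript
// Greedy paragraph packing, mirroring chunkContent with a ~12-token budget.
const CHARS_PER_TOKEN = 4;
const doc = 'First paragraph here.\n\nSecond paragraph here.\n\nThird paragraph here.';
const segments = doc.split(/\n\n+/).filter(s => s.trim()).map(s => s + '\n\n');

const maxChars = 12 * CHARS_PER_TOKEN; // 48 chars
const chunks: string[] = [];
let current = '';
for (const seg of segments) {
  if (current.length + seg.length > maxChars && current.length > 0) {
    chunks.push(current.trim()); // budget exceeded: flush the chunk
    current = seg;
  } else {
    current += seg;
  }
}
if (current.trim()) chunks.push(current.trim());

console.log(chunks.length); // 2: first two paragraphs fit together, the third spills over
```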

252
src/core/ingest/fetchers.ts Normal file

@@ -0,0 +1,252 @@
import * as fs from 'fs';
import * as path from 'path';
import * as https from 'https';
import * as http from 'http';
export type SourceType = 'url' | 'pdf' | 'markdown' | 'text' | 'html';
export interface FetchedContent {
title: string;
content: string;
sourceType: SourceType;
metadata: Record<string, any>;
}
export function detectSourceType(source: string): SourceType {
if (source.startsWith('http://') || source.startsWith('https://')) {
return 'url';
}
const ext = path.extname(source).toLowerCase();
switch (ext) {
case '.pdf': return 'pdf';
case '.md': return 'markdown';
case '.html': case '.htm': return 'html';
default: return 'text';
}
}
export async function fetchContent(source: string, sourceType: SourceType): Promise<FetchedContent> {
switch (sourceType) {
case 'url':
return fetchUrl(source);
case 'pdf':
return fetchPdf(source);
case 'markdown':
case 'text':
case 'html':
return fetchFile(source, sourceType);
default:
throw new Error(`Unsupported source type: ${sourceType}`);
}
}
async function fetchUrl(url: string): Promise<FetchedContent> {
const html = await httpGet(url);
// Extract title from HTML
const titleMatch = html.match(/<title[^>]*>([^<]+)<\/title>/i);
const title = titleMatch?.[1]?.trim() || new URL(url).hostname;
// Convert HTML to readable text
const content = htmlToText(html);
return {
title,
content,
sourceType: 'url',
metadata: {
url,
fetchedAt: Date.now(),
},
};
}
function httpGet(url: string, redirectCount = 0): Promise<string> {
return new Promise((resolve, reject) => {
const client = url.startsWith('https') ? https : http;
const req = client.get(url, {
headers: {
'User-Agent': 'Mozilla/5.0 (compatible; Cortex/1.0)',
'Accept': 'text/html,application/xhtml+xml,text/plain,*/*',
},
timeout: 30000,
}, (res) => {
// Follow redirects, capped so a redirect loop cannot recurse forever
if (res.statusCode && res.statusCode >= 300 && res.statusCode < 400 && res.headers.location) {
if (redirectCount >= 5) {
reject(new Error('Too many redirects'));
return;
}
const redirectUrl = new URL(res.headers.location, url).toString();
resolve(httpGet(redirectUrl, redirectCount + 1));
return;
}
if (res.statusCode && res.statusCode >= 400) {
reject(new Error(`HTTP ${res.statusCode}`));
return;
}
let data = '';
res.on('data', chunk => data += chunk);
res.on('end', () => resolve(data));
});
req.on('error', reject);
req.on('timeout', () => {
req.destroy();
reject(new Error('Request timeout'));
});
});
}
function htmlToText(html: string): string {
// Remove scripts, styles, and other non-content elements
let text = html
.replace(/<script[^>]*>[\s\S]*?<\/script>/gi, '')
.replace(/<style[^>]*>[\s\S]*?<\/style>/gi, '')
.replace(/<nav[^>]*>[\s\S]*?<\/nav>/gi, '')
.replace(/<header[^>]*>[\s\S]*?<\/header>/gi, '')
.replace(/<footer[^>]*>[\s\S]*?<\/footer>/gi, '')
.replace(/<aside[^>]*>[\s\S]*?<\/aside>/gi, '');
// Try to find main content
const mainMatch = text.match(/<main[^>]*>([\s\S]*?)<\/main>/i) ||
text.match(/<article[^>]*>([\s\S]*?)<\/article>/i) ||
text.match(/<div[^>]*class="[^"]*content[^"]*"[^>]*>([\s\S]*?)<\/div>/i);
if (mainMatch) {
text = mainMatch[1];
} else {
// Fall back to body
const bodyMatch = text.match(/<body[^>]*>([\s\S]*?)<\/body>/i);
if (bodyMatch) text = bodyMatch[1];
}
// Convert common elements to markdown-ish format
text = text
.replace(/<h1[^>]*>/gi, '\n# ')
.replace(/<\/h1>/gi, '\n')
.replace(/<h2[^>]*>/gi, '\n## ')
.replace(/<\/h2>/gi, '\n')
.replace(/<h3[^>]*>/gi, '\n### ')
.replace(/<\/h3>/gi, '\n')
.replace(/<h[456][^>]*>/gi, '\n#### ')
.replace(/<\/h[456]>/gi, '\n')
.replace(/<p[^>]*>/gi, '\n')
.replace(/<\/p>/gi, '\n')
.replace(/<br\s*\/?>/gi, '\n')
.replace(/<li[^>]*>/gi, '\n- ')
.replace(/<\/li>/gi, '')
.replace(/<[^>]+>/g, ' ') // Remove remaining tags
.replace(/&nbsp;/g, ' ')
.replace(/&amp;/g, '&')
.replace(/&lt;/g, '<')
.replace(/&gt;/g, '>')
.replace(/&quot;/g, '"')
.replace(/&#39;/g, "'")
.replace(/\n{3,}/g, '\n\n') // Collapse multiple newlines
.replace(/[ \t]+/g, ' ') // Collapse spaces
.trim();
return text;
}
async function fetchPdf(filePath: string): Promise<FetchedContent> {
// Basic PDF text extraction (very simple - just looks for text streams)
// For production, you'd want pdf-parse or pdfjs-dist
const buffer = fs.readFileSync(filePath);
const content = extractPdfText(buffer);
return {
title: path.basename(filePath, '.pdf'),
content: content || '[PDF content could not be extracted. Install pdf-parse for better support.]',
sourceType: 'pdf',
metadata: {
filePath,
size: buffer.length,
fetchedAt: Date.now(),
},
};
}
function extractPdfText(buffer: Buffer): string {
// Very basic PDF text extraction
// Looks for text between BT and ET markers, handles some encoding
const str = buffer.toString('binary');
const texts: string[] = [];
// Find text objects
const textRegex = /BT[\s\S]*?ET/g;
let match;
while ((match = textRegex.exec(str)) !== null) {
const block = match[0];
// Extract text from Tj and TJ operators
const tjMatches = block.match(/\(([^)]*)\)\s*Tj/g) || [];
for (const tj of tjMatches) {
const textMatch = tj.match(/\(([^)]*)\)/);
if (textMatch) {
texts.push(textMatch[1]);
}
}
}
// Also try to find raw text streams
const streamRegex = /stream\s*([\s\S]*?)\s*endstream/g;
while ((match = streamRegex.exec(str)) !== null) {
const decoded = match[1].replace(/[^\x20-\x7E\n\r\t]/g, ' ').trim();
if (decoded.length > 50 && /[a-zA-Z]{3,}/.test(decoded)) {
texts.push(decoded);
}
}
return texts.join('\n').replace(/\s+/g, ' ').trim();
}
async function fetchFile(filePath: string, sourceType: SourceType): Promise<FetchedContent> {
const content = fs.readFileSync(filePath, 'utf-8');
const title = path.basename(filePath, path.extname(filePath));
let processedContent = content;
if (sourceType === 'html') {
processedContent = htmlToText(content);
}
return {
title,
content: processedContent,
sourceType,
metadata: {
filePath,
fetchedAt: Date.now(),
},
};
}
export async function fetchFromClipboard(): Promise<FetchedContent> {
// This is platform-specific and won't work in all environments
// For standalone builds, we'd need native bindings
throw new Error('Clipboard support requires platform-specific implementation');
}
export async function fetchFromStdin(): Promise<FetchedContent> {
return new Promise((resolve, reject) => {
let data = '';
process.stdin.setEncoding('utf8');
process.stdin.on('data', chunk => data += chunk);
process.stdin.on('end', () => {
resolve({
title: 'Stdin Input',
content: data.trim(),
sourceType: 'text',
metadata: {
source: 'stdin',
fetchedAt: Date.now(),
},
});
});
process.stdin.on('error', reject);
// Reject if no input arrives; unref so the timer cannot keep the process alive after stdin ends
const timer = setTimeout(() => {
if (!data) {
reject(new Error('No input received from stdin'));
}
}, 5000);
timer.unref();
});
}
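`htmlToText` is a chain of regex replacements: headings and list items become markdown-ish prefixes, remaining tags are stripped, and entities are decoded. A reduced, self-contained sketch of that conversion on a tiny HTML fragment:

```typescript
// Subset of the htmlToText replacement chain, applied to inline sample HTML.
const sampleHtml = '<h1>Title</h1><p>Hello &amp; welcome.</p><li>item</li>';
const text = sampleHtml
  .replace(/<h1[^>]*>/gi, '\n# ')
  .replace(/<\/h1>/gi, '\n')
  .replace(/<p[^>]*>/gi, '\n')
  .replace(/<\/p>/gi, '\n')
  .replace(/<li[^>]*>/gi, '\n- ')
  .replace(/<[^>]+>/g, ' ')   // strip remaining tags
  .replace(/&amp;/g, '&')     // decode entities
  .replace(/\n{3,}/g, '\n\n') // collapse blank runs
  .replace(/[ \t]+/g, ' ')
  .trim();

console.log(text);
```

Regex-based conversion like this is intentionally lossy; it trades fidelity for zero dependencies, which is the same trade the fetcher above makes.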

162
src/core/ingest/index.ts Normal file

@@ -0,0 +1,162 @@
import * as crypto from 'crypto';
import { addNode, addEdge, listNodes, query } from '../store';
import { Node } from '../../types';
import { detectSourceType, fetchContent, fetchFromStdin, FetchedContent, SourceType } from './fetchers';
import { chunkContent, ChunkOptions, Chunk } from './chunker';
export interface IngestOptions {
title?: string;
tags?: string[];
chunkStrategy?: ChunkOptions;
noLink?: boolean;
stdin?: boolean;
}
export interface IngestResult {
success: boolean;
sourceType: SourceType;
title: string;
nodeCount: number;
nodes: { id: string; title: string }[];
parentId?: string;
}
export async function ingest(source: string, options: IngestOptions = {}): Promise<IngestResult> {
// Fetch content
let fetched: FetchedContent;
if (options.stdin) {
fetched = await fetchFromStdin();
} else {
const sourceType = detectSourceType(source);
fetched = await fetchContent(source, sourceType);
}
const title = options.title || fetched.title;
const tags = options.tags || [];
// Compute checksum for deduplication
const checksum = crypto.createHash('md5').update(fetched.content).digest('hex');
// Check for duplicates
const existingDupe = listNodes({ kind: 'memory', tags: ['ingested'] })
.find(n => (n.metadata as any)?.source?.checksum === checksum);
if (existingDupe) {
return {
success: false,
sourceType: fetched.sourceType,
title,
nodeCount: 0,
nodes: [],
};
}
// Chunk content if needed
const chunks = chunkContent(fetched.content, options.chunkStrategy);
const nodes: Node[] = [];
if (chunks.length === 1) {
// Single node
const node = await addNode({
kind: 'memory',
title,
content: chunks[0].content,
tags: ['ingested', fetched.sourceType, ...tags],
metadata: {
source: {
type: fetched.sourceType,
...fetched.metadata,
checksum,
},
tokenEstimate: chunks[0].tokenEstimate,
},
});
nodes.push(node);
} else {
// Create parent node with summary
const summaryContent = `Ingested content from ${fetched.sourceType} source.\n\n` +
`**Source:** ${fetched.metadata.url || fetched.metadata.filePath || 'stdin'}\n` +
`**Chunks:** ${chunks.length}\n` +
`**Total tokens:** ~${chunks.reduce((sum, c) => sum + c.tokenEstimate, 0)}\n\n` +
`## Preview\n\n${fetched.content.slice(0, 500)}...`;
const parentNode = await addNode({
kind: 'memory',
title,
content: summaryContent,
tags: ['ingested', fetched.sourceType, 'parent', ...tags],
metadata: {
source: {
type: fetched.sourceType,
...fetched.metadata,
checksum,
},
chunkCount: chunks.length,
},
});
nodes.push(parentNode);
// Create child nodes for each chunk
for (const chunk of chunks) {
const chunkNode = await addNode({
kind: 'memory',
title: `${title} (Part ${chunk.index + 1}/${chunks.length})`,
content: chunk.content,
tags: ['ingested', fetched.sourceType, 'chunk', ...tags],
metadata: {
parentId: parentNode.id,
chunkIndex: chunk.index,
tokenEstimate: chunk.tokenEstimate,
},
});
nodes.push(chunkNode);
addEdge(parentNode.id, chunkNode.id, 'contains');
}
}
// Auto-link to related nodes (if not disabled)
if (!options.noLink) {
for (const node of nodes) {
await linkRelatedNodes(node);
}
}
return {
success: true,
sourceType: fetched.sourceType,
title,
nodeCount: nodes.length,
nodes: nodes.map(n => ({ id: n.id, title: n.title })),
parentId: chunks.length > 1 ? nodes[0].id : undefined,
};
}
async function linkRelatedNodes(node: Node): Promise<void> {
// Search for related nodes based on content
const searchText = node.title + ' ' + node.content.slice(0, 500);
try {
const related = await query(searchText, { limit: 5 });
for (const result of related) {
// Don't link to self or siblings from same ingestion
if (result.node.id === node.id) continue;
if ((result.node.metadata as any)?.source?.checksum === (node.metadata as any)?.source?.checksum) continue;
// Only link if relevance is high enough
if (result.score > 0.5) {
try {
addEdge(node.id, result.node.id, 'relates_to', { reason: 'semantic-similarity', score: result.score });
} catch {
// Edge might already exist
}
}
}
} catch {
// Search might fail, that's okay
}
}
export { detectSourceType, fetchContent, fetchFromStdin, SourceType } from './fetchers';
export { chunkContent, estimateTokens, ChunkOptions, Chunk } from './chunker';
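The checksum-based deduplication in `ingest()` can be sketched in isolation. This is a minimal stand-in: `checksumOf` mirrors the `crypto.createHash('md5')` call above, while the in-memory `seen` map is a hypothetical substitute for the `listNodes` metadata lookup.

```typescript
import { createHash } from 'crypto';

// Stand-in for the node store: maps content checksum -> node id.
const seen = new Map<string, string>();

function checksumOf(content: string): string {
  // Same digest as ingest(): md5 hex of the raw fetched content.
  return createHash('md5').update(content).digest('hex');
}

// Returns the existing node id on a duplicate, or null after recording the new one.
function recordOrFindDuplicate(content: string, nodeId: string): string | null {
  const sum = checksumOf(content);
  const existing = seen.get(sum);
  if (existing) return existing;
  seen.set(sum, nodeId);
  return null;
}
```

Because the checksum is computed over the raw fetched content before chunking, re-ingesting the same URL or file is rejected even if the title or tags differ.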

src/core/journal.ts Normal file

@@ -0,0 +1,222 @@
import { addNode, updateNode, listNodes, getNode, addEdge } from './store';
import { Node } from '../types';
import { generate, isGenAvailable } from './search/ollamaGen';
export interface JournalEntry {
time: string;
text: string;
tags?: string[];
}
export interface JournalMetadata {
date: string;
entries: JournalEntry[];
summary?: string;
}
function formatDate(date: Date): string {
return date.toISOString().split('T')[0];
}
function formatTime(date: Date): string {
return date.toTimeString().slice(0, 5);
}
function dateTags(dateStr: string): string[] {
const [year, month] = dateStr.split('-');
return [year, `${year}-${month}`];
}
function getDateFromOffset(offset: string): string {
const now = new Date();
if (offset === 'yesterday') {
now.setDate(now.getDate() - 1);
} else if (offset === 'tomorrow') {
now.setDate(now.getDate() + 1);
} else if (offset.match(/^-\d+$/)) {
now.setDate(now.getDate() + parseInt(offset, 10));
}
return formatDate(now);
}
export async function getOrCreateJournal(date?: string): Promise<Node> {
const targetDate = date || formatDate(new Date());
const title = `Journal: ${targetDate}`;
// Find existing journal for this date
const existing = listNodes({ kind: 'memory', tags: ['journal'] })
.find(n => n.title === title);
if (existing) {
return existing;
}
// Create new journal
// Parse as local time so the weekday matches the local date (bare 'YYYY-MM-DD' parses as UTC)
const dayName = new Date(`${targetDate}T00:00:00`).toLocaleDateString('en-US', { weekday: 'long' });
return addNode({
kind: 'memory',
title,
content: `# ${dayName}, ${targetDate}\n\n`,
tags: ['journal', 'daily', ...dateTags(targetDate)],
metadata: {
date: targetDate,
entries: [],
},
});
}
export async function appendToJournal(
text: string,
options?: { tags?: string[]; date?: string }
): Promise<{ journal: Node; entry: JournalEntry }> {
const journal = await getOrCreateJournal(options?.date);
const time = formatTime(new Date());
const entry: JournalEntry = {
time,
text,
tags: options?.tags,
};
// Format entry line
const tagStr = options?.tags?.length ? ` \`${options.tags.join(', ')}\`` : '';
const entryLine = `- **${time}**${tagStr} ${text}`;
// Append to content
const newContent = journal.content.trimEnd() + '\n' + entryLine + '\n';
// Update metadata
const metadata = journal.metadata as JournalMetadata;
const entries = [...(metadata.entries || []), entry];
const updated = await updateNode(journal.id, {
content: newContent,
metadata: { ...metadata, entries },
});
// Auto-link mentioned nodes (if text contains node IDs or @mentions)
await linkMentionedNodes(journal.id, text);
return { journal: updated!, entry };
}
async function linkMentionedNodes(journalId: string, text: string): Promise<void> {
// Find UUID-like patterns (node IDs)
const uuidPattern = /\b([0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})\b/gi;
const matches = text.match(uuidPattern);
if (matches) {
for (const id of matches) {
const node = getNode(id);
if (node && node.id !== journalId) {
try {
addEdge(journalId, node.id, 'relates_to', { reason: 'mentioned' });
} catch {
// Edge might already exist
}
}
}
}
// Find #hashtag patterns
const hashtagPattern = /#(\w+)/g;
const hashtags = [...text.matchAll(hashtagPattern)].map(m => m[1]);
if (hashtags.length) {
// Find nodes with these tags
const related = listNodes({ tags: hashtags, limit: 5 });
for (const node of related) {
if (node.id !== journalId) {
try {
addEdge(journalId, node.id, 'relates_to', { reason: 'hashtag' });
} catch {
// Edge might already exist
}
}
}
}
}
export function listJournals(options?: {
limit?: number;
month?: string;
year?: string;
}): Node[] {
let tags = ['journal'];
if (options?.month) tags.push(options.month);
else if (options?.year) tags.push(options.year);
return listNodes({
kind: 'memory',
tags,
limit: options?.limit || 30,
}).sort((a, b) => {
// Sort by date descending
const dateA = (a.metadata as JournalMetadata)?.date || '';
const dateB = (b.metadata as JournalMetadata)?.date || '';
return dateB.localeCompare(dateA);
});
}
export async function searchJournals(query: string, limit: number = 10): Promise<Node[]> {
const journals = listNodes({ kind: 'memory', tags: ['journal'], limit: 100 });
const lowerQuery = query.toLowerCase();
return journals
.filter(j => j.content.toLowerCase().includes(lowerQuery) || j.title.toLowerCase().includes(lowerQuery))
.slice(0, limit);
}
export async function generateJournalSummary(date?: string): Promise<string | null> {
const journal = await getOrCreateJournal(date);
const metadata = journal.metadata as JournalMetadata;
if (!metadata.entries?.length) {
return null;
}
const available = await isGenAvailable();
if (!available) {
// Fallback: simple summary
const count = metadata.entries.length;
return `${count} entries recorded.`;
}
const entriesText = metadata.entries
.map(e => `- ${e.time}: ${e.text}`)
.join('\n');
const prompt = `Summarize this day's journal entries in 2-3 sentences. Focus on accomplishments, decisions, and key points.
Journal entries:
${entriesText}
Summary:`;
const summary = await generate(prompt);
if (summary) {
// Save summary to journal
await updateNode(journal.id, {
metadata: { ...metadata, summary },
});
}
return summary;
}
export function getJournalByDate(date: string): Node | null {
const title = `Journal: ${date}`;
return listNodes({ kind: 'memory', tags: ['journal'] })
.find(n => n.title === title) || null;
}
export function getTodayJournal(): Node | null {
return getJournalByDate(formatDate(new Date()));
}
export function getYesterdayJournal(): Node | null {
return getJournalByDate(getDateFromOffset('yesterday'));
}
export { formatDate, formatTime, getDateFromOffset };
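The mention extraction that `linkMentionedNodes` performs can be sketched on its own; the two regexes below are copied from the function above, and `extractMentions` is a hypothetical helper that returns both kinds of matches without touching the store.

```typescript
// Extraction logic mirroring linkMentionedNodes: UUID mentions and #hashtags.
const UUID_PATTERN = /\b([0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})\b/gi;
const HASHTAG_PATTERN = /#(\w+)/g;

function extractMentions(text: string): { ids: string[]; hashtags: string[] } {
  // match() with a /g regex returns every full match (the node IDs).
  const ids = text.match(UUID_PATTERN) ?? [];
  // matchAll() gives capture groups, so we keep only the tag word after '#'.
  const hashtags = [...text.matchAll(HASHTAG_PATTERN)].map(m => m[1]);
  return { ids, hashtags };
}
```

UUID matches become direct `relates_to` edges, while hashtags are resolved indirectly through `listNodes({ tags: ... })`.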


@@ -0,0 +1,175 @@
import { execSync } from 'child_process';
import * as path from 'path';
export interface GitContext {
branch: string;
recentCommits: string[];
modifiedFiles: string[];
stagedFiles: string[];
recentlyTouched: string[];
isGitRepo: boolean;
}
export interface FileContext {
cwd: string;
projectName: string;
modifiedFiles: string[];
stagedFiles: string[];
}
/**
* Extract git context from current working directory
*/
export function getGitContext(): GitContext {
try {
// Check if we're in a git repo
execSync('git rev-parse --is-inside-work-tree', {
encoding: 'utf-8',
stdio: ['pipe', 'pipe', 'pipe'],
});
const branch = execSync('git branch --show-current', {
encoding: 'utf-8',
stdio: ['pipe', 'pipe', 'pipe'],
}).trim();
// Recent commit messages
let recentCommits: string[] = [];
try {
const logOutput = execSync('git log --oneline -5', {
encoding: 'utf-8',
stdio: ['pipe', 'pipe', 'pipe'],
}).trim();
recentCommits = logOutput
.split('\n')
.filter(Boolean)
.map(line => {
// Remove the commit hash prefix
const parts = line.split(' ');
return parts.slice(1).join(' ');
});
} catch {
// No commits yet
}
// Modified (unstaged) files
let modifiedFiles: string[] = [];
try {
modifiedFiles = execSync('git diff --name-only', {
encoding: 'utf-8',
stdio: ['pipe', 'pipe', 'pipe'],
})
.trim()
.split('\n')
.filter(Boolean);
} catch { /* empty */ }
// Staged files
let stagedFiles: string[] = [];
try {
stagedFiles = execSync('git diff --staged --name-only', {
encoding: 'utf-8',
stdio: ['pipe', 'pipe', 'pipe'],
})
.trim()
.split('\n')
.filter(Boolean);
} catch { /* empty */ }
// Recently touched files (from recent commits)
let recentlyTouched: string[] = [];
try {
recentlyTouched = execSync('git diff --name-only HEAD~5..HEAD 2>/dev/null || git diff --name-only HEAD', {
encoding: 'utf-8',
stdio: ['pipe', 'pipe', 'pipe'],
})
.trim()
.split('\n')
.filter(Boolean);
} catch { /* empty */ }
return {
branch,
recentCommits,
modifiedFiles,
stagedFiles,
recentlyTouched,
isGitRepo: true,
};
} catch {
// Not a git repo
return {
branch: '',
recentCommits: [],
modifiedFiles: [],
stagedFiles: [],
recentlyTouched: [],
isGitRepo: false,
};
}
}
/**
* Get file-based context
*/
export function getFileContext(): FileContext {
const cwd = process.cwd();
const projectName = path.basename(cwd);
const gitContext = getGitContext();
return {
cwd,
projectName,
modifiedFiles: gitContext.modifiedFiles,
stagedFiles: gitContext.stagedFiles,
};
}
/**
* Extract meaningful keywords from git context
*/
export function extractGitKeywords(context: GitContext): string[] {
const keywords: string[] = [];
// Branch name (often contains feature/ticket info)
if (context.branch && context.branch !== 'main' && context.branch !== 'master') {
// Split by common delimiters and filter short parts
const parts = context.branch.split(/[-_\/]/).filter(p => p.length > 2);
keywords.push(...parts);
}
// Keywords from commit messages
for (const commit of context.recentCommits.slice(0, 3)) {
// Extract meaningful words (skip common verbs/prepositions)
const words = commit
.toLowerCase()
.replace(/[^\w\s]/g, ' ')
.split(/\s+/)
.filter(w => w.length > 3)
.filter(w => !STOP_WORDS.has(w));
keywords.push(...words.slice(0, 5));
}
// File names (without extension)
for (const file of [...context.modifiedFiles, ...context.stagedFiles].slice(0, 5)) {
const basename = path.basename(file, path.extname(file));
if (basename.length > 2) {
// Split camelCase and kebab-case
const parts = basename
.replace(/([a-z])([A-Z])/g, '$1 $2')
.toLowerCase()
.split(/[-_\s]+/)
.filter(p => p.length > 2);
keywords.push(...parts);
}
}
// Deduplicate and return
return [...new Set(keywords)];
}
const STOP_WORDS = new Set([
'the', 'and', 'for', 'with', 'this', 'that', 'from', 'have', 'been',
'added', 'updated', 'fixed', 'removed', 'changed', 'merge', 'commit',
'feat', 'fix', 'chore', 'docs', 'style', 'refactor', 'test', 'build',
]);
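The keyword-splitting rules used by `extractGitKeywords` can be shown in a reduced form. The two helpers below are hypothetical names, but the splitting logic (branch names on `-`, `_`, `/`; file basenames on camelCase and kebab-case; fragments longer than two characters) matches the code above.

```typescript
// Branch names like "feat/smart-retrieval" carry feature/ticket keywords.
function branchKeywords(branch: string): string[] {
  if (branch === 'main' || branch === 'master') return [];
  return branch.split(/[-_\/]/).filter(p => p.length > 2);
}

// File basenames are split on camelCase boundaries, then kebab/snake/space.
function fileKeywords(basename: string): string[] {
  return basename
    .replace(/([a-z])([A-Z])/g, '$1 $2') // "gitContext" -> "git Context"
    .toLowerCase()
    .split(/[-_\s]+/)
    .filter(p => p.length > 2);
}
```

The short-fragment filter (`length > 2`) drops noise like conventional-commit prefixes and single letters before the stop-word set is even consulted.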

src/core/search/smart.ts Normal file

@@ -0,0 +1,306 @@
import { query, listNodes } from '../store';
import { getGitContext, getFileContext, extractGitKeywords, GitContext, FileContext } from './git-context';
import { Node, NodeKind } from '../../types';
export interface SmartSearchOptions {
limit?: number;
kind?: NodeKind;
includeRelated?: boolean;
}
export interface SmartSearchResult {
node: Node;
score: number;
originalScore: number;
boosts: {
time?: number;
file?: number;
branch?: number;
project?: number;
};
reason?: string;
}
export interface WhatContext {
gitContext: GitContext;
fileContext: FileContext;
branchRelated: Node[];
fileRelated: Node[];
tasks: Node[];
decisions: Node[];
recentMemories: Node[];
}
// Time boost factors
const TIME_BOOSTS = {
lastHour: 1.5,
lastDay: 1.3,
lastWeek: 1.1,
older: 1.0,
};
/**
* Smart search that combines explicit query with context signals
*/
export async function smartSearch(
explicitQuery?: string,
options: SmartSearchOptions = {}
): Promise<SmartSearchResult[]> {
const { limit = 20, kind, includeRelated = false } = options;
// Gather context
const gitContext = getGitContext();
const fileContext = getFileContext();
// Build implicit query from context
const contextKeywords = extractGitKeywords(gitContext);
contextKeywords.push(fileContext.projectName);
// Combine queries
const searchQuery = explicitQuery
? `${explicitQuery} ${contextKeywords.slice(0, 5).join(' ')}`
: contextKeywords.join(' ');
if (!searchQuery.trim()) {
// No context, fall back to recent
const recent = listNodes({ limit, kind, includeStale: false });
return recent.map(node => ({
node,
score: 1.0,
originalScore: 1.0,
boosts: {},
}));
}
// Run hybrid search with higher limit for re-ranking
const results = await query(searchQuery, { limit: limit * 3, kind });
// Re-rank based on context signals
const reranked = rerankResults(results, gitContext, fileContext);
// Expand to related if requested
if (includeRelated && reranked.length > 0) {
// Implementation for expanding to related nodes could go here
// For now, we just return the reranked results
}
return reranked.slice(0, limit);
}
/**
* Re-rank search results based on context signals
*/
function rerankResults(
results: Array<{ node: Node; score: number }>,
gitContext: GitContext,
fileContext: FileContext
): SmartSearchResult[] {
const now = Date.now();
const HOUR = 60 * 60 * 1000;
const DAY = 24 * HOUR;
const WEEK = 7 * DAY;
return results
.map(result => {
const boosts: SmartSearchResult['boosts'] = {};
let totalBoost = 1.0;
// Time boost based on last access
const lastAccess = result.node.lastAccessedAt || result.node.updatedAt;
const age = now - lastAccess;
if (age < HOUR) {
boosts.time = TIME_BOOSTS.lastHour;
totalBoost *= TIME_BOOSTS.lastHour;
} else if (age < DAY) {
boosts.time = TIME_BOOSTS.lastDay;
totalBoost *= TIME_BOOSTS.lastDay;
} else if (age < WEEK) {
boosts.time = TIME_BOOSTS.lastWeek;
totalBoost *= TIME_BOOSTS.lastWeek;
}
// File relevance boost
const nodeFiles = (result.node.metadata?.files as string[]) || [];
const nodePath = result.node.metadata?.filePath as string | undefined;
const allFiles = [...nodeFiles];
if (nodePath) allFiles.push(nodePath);
const changedFiles = [...gitContext.modifiedFiles, ...gitContext.stagedFiles];
const fileOverlap = allFiles.filter(f =>
changedFiles.some(cf => f.includes(cf) || cf.includes(f))
).length;
if (fileOverlap > 0) {
boosts.file = 1.0 + (0.2 * fileOverlap);
totalBoost *= boosts.file;
}
// Branch relevance boost
const branchKeywords = gitContext.branch
.split(/[-_\/]/)
.filter(k => k.length > 2)
.map(k => k.toLowerCase());
const tagMatch = result.node.tags.some(tag =>
branchKeywords.some(bk => tag.toLowerCase().includes(bk))
);
const titleMatch = branchKeywords.some(bk =>
result.node.title.toLowerCase().includes(bk)
);
if (tagMatch || titleMatch) {
boosts.branch = 1.3;
totalBoost *= 1.3;
}
// Project name boost
if (result.node.tags.includes(fileContext.projectName.toLowerCase())) {
boosts.project = 1.2;
totalBoost *= 1.2;
}
return {
node: result.node,
score: result.score * totalBoost,
originalScore: result.score,
boosts,
};
})
.sort((a, b) => b.score - a.score);
}
/**
* Gather full context for "what should I know?" command
*/
export async function gatherWhatContext(): Promise<WhatContext> {
const gitContext = getGitContext();
const fileContext = getFileContext();
// Branch-related nodes
let branchRelated: Node[] = [];
if (gitContext.branch && gitContext.branch !== 'main' && gitContext.branch !== 'master') {
const branchKeywords = gitContext.branch.replace(/[-_\/]/g, ' ');
const results = await query(branchKeywords, { limit: 5 });
branchRelated = results.map(r => r.node);
}
// File-related nodes
let fileRelated: Node[] = [];
if (gitContext.modifiedFiles.length > 0 || gitContext.stagedFiles.length > 0) {
const fileNames = [...gitContext.modifiedFiles, ...gitContext.stagedFiles]
.slice(0, 5)
.map(f => f.replace(/\.[^.]+$/, '').replace(/[\/\\]/g, ' '));
const fileQuery = fileNames.join(' ');
if (fileQuery) {
const results = await query(fileQuery, { limit: 5 });
fileRelated = results.map(r => r.node);
}
}
// Open tasks
const tasks = listNodes({
kind: 'task',
status: 'todo',
limit: 10,
includeStale: false,
}).concat(
listNodes({
kind: 'task',
status: 'in_progress',
limit: 5,
includeStale: false,
})
);
// Recent decisions
const decisions = listNodes({
kind: 'decision',
limit: 5,
includeStale: false,
});
// Recent memories (by lastAccessedAt)
const recentMemories = listNodes({
kind: 'memory',
limit: 10,
includeStale: false,
}).sort((a, b) => (b.lastAccessedAt || b.updatedAt) - (a.lastAccessedAt || a.updatedAt))
.slice(0, 5);
return {
gitContext,
fileContext,
branchRelated,
fileRelated,
tasks,
decisions,
recentMemories,
};
}
/**
* Format "what" context for display
*/
export function formatWhatContext(context: WhatContext): string {
const lines: string[] = [];
// Git context summary
if (context.gitContext.isGitRepo) {
lines.push(`📁 Project: ${context.fileContext.projectName}`);
if (context.gitContext.branch) {
lines.push(`🌿 Branch: ${context.gitContext.branch}`);
}
if (context.gitContext.modifiedFiles.length > 0) {
lines.push(`📝 Modified: ${context.gitContext.modifiedFiles.length} files`);
}
lines.push('');
}
// Branch-related
if (context.branchRelated.length > 0) {
lines.push('📚 Related to current branch:');
for (const node of context.branchRelated.slice(0, 3)) {
lines.push(` • [${node.kind}] ${node.title}`);
}
lines.push('');
}
// File-related
if (context.fileRelated.length > 0) {
lines.push('🔗 Related to changes:');
for (const node of context.fileRelated.slice(0, 3)) {
lines.push(` • [${node.kind}] ${node.title}`);
}
lines.push('');
}
// Open tasks
if (context.tasks.length > 0) {
lines.push('✅ Open tasks:');
for (const task of context.tasks.slice(0, 5)) {
const status = task.status === 'in_progress' ? '🔄' : '⬜';
lines.push(` ${status} ${task.title}`);
}
lines.push('');
}
// Recent decisions
if (context.decisions.length > 0) {
lines.push('🎯 Recent decisions:');
for (const decision of context.decisions.slice(0, 3)) {
lines.push(`  • ${decision.title}`);
}
lines.push('');
}
// Recent memories
if (context.recentMemories.length > 0) {
lines.push('💭 Recently accessed:');
for (const memory of context.recentMemories.slice(0, 3)) {
lines.push(`  • ${memory.title}`);
}
}
return lines.join('\n');
}
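The multiplicative boosting in `rerankResults` reduces to a small amount of arithmetic; this sketch pulls the time-boost tiers out into standalone functions (`timeBoost` and `boostedScore` are hypothetical names) to show how recency compounds with the file/branch/project factors.

```typescript
const HOUR = 60 * 60 * 1000;
const DAY = 24 * HOUR;
const WEEK = 7 * DAY;

// Tiered recency factor, matching TIME_BOOSTS above.
function timeBoost(lastAccessedAt: number, now: number): number {
  const age = now - lastAccessedAt;
  if (age < HOUR) return 1.5;
  if (age < DAY) return 1.3;
  if (age < WEEK) return 1.1;
  return 1.0;
}

// Boosts are multiplicative: each matching signal scales the base score.
function boostedScore(base: number, factors: number[]): number {
  return factors.reduce((s, f) => s * f, base);
}
```

Because factors multiply rather than add, a node matching several context signals (recent, on-branch, overlapping files) climbs past a purely textual match quickly.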


@@ -1,6 +1,6 @@
import { randomUUID as uuid } from 'crypto';
import { getDb } from './db';
import { Node, Edge, AddNodeInput, UpdateNodeInput, ListOptions, QueryOptions, SearchResult, EdgeType } from '../types';
import { Node, Edge, AddNodeInput, UpdateNodeInput, ListOptions, QueryOptions, SearchResult, EdgeType, NodeVersion, HistoricalNode, NodeDiff } from '../types';
import { hybridSearch, deserializeEmbedding } from './search/index';
import { getEmbedding } from './search/ollama';
@@ -43,21 +43,44 @@ export async function addNode(input: AddNodeInput): Promise<Node> {
// Try to get embedding
const embedding = await getEmbedding(`${input.title} ${content}`);
const transaction = db.transaction(() => {
db.prepare(`
INSERT INTO nodes (id, kind, title, content, status, tags, metadata, embedding, created_at, updated_at, last_accessed_at, version)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(
id, input.kind, input.title, content, input.status ?? null,
JSON.stringify(tags), JSON.stringify(metadata),
embedding ? serializeEmbedding(embedding) : null,
now, now, now, 1
);
// Insert tags
const insertTag = db.prepare('INSERT OR IGNORE INTO node_tags (node_id, tag) VALUES (?, ?)');
for (const tag of tags) {
insertTag.run(id, tag);
}
// Create initial version record
const versionId = uuid();
db.prepare(`
INSERT INTO node_versions (id, node_id, version, title, content, status, tags, metadata, valid_from, valid_until, created_by)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(
versionId,
id,
1,
input.title,
content,
input.status ?? null,
JSON.stringify(tags),
JSON.stringify(metadata),
now,
null,
'user'
);
});
transaction();
notifyDirty();
return {
@@ -114,46 +137,88 @@ export function listNodes(options: ListOptions = {}): Node[] {
return nodes;
}
export async function updateNode(id: string, input: UpdateNodeInput, createdBy: string = 'user'): Promise<Node | null> {
const db = getDb();
// Get existing node without updating last_accessed_at
const existingRow = db.prepare('SELECT * FROM nodes WHERE id = ?').get(id) as any;
if (!existingRow) return null;
const existing = rowToNode(existingRow);
const now = Date.now();
// Get current version number
const currentVersion = existingRow.version ?? 1;
const newVersion = currentVersion + 1;
// Create version snapshot in a transaction
const transaction = db.transaction(() => {
// Close out the current version by setting valid_until
db.prepare(`
UPDATE node_versions SET valid_until = ? WHERE node_id = ? AND valid_until IS NULL
`).run(now, id);
// Insert new version record with the NEW state (after update)
const versionId = uuid();
const newTitle = input.title ?? existing.title;
const newContent = input.content ?? existing.content;
const newStatus = input.status !== undefined ? input.status : existing.status;
const newTags = input.tags ?? existing.tags;
const newMetadata = input.metadata !== undefined ? { ...existing.metadata, ...input.metadata } : existing.metadata;
db.prepare(`
INSERT INTO node_versions (id, node_id, version, title, content, status, tags, metadata, valid_from, valid_until, created_by)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(
versionId,
id,
newVersion,
newTitle,
newContent,
newStatus ?? null,
JSON.stringify(newTags),
JSON.stringify(newMetadata),
now,
null,
createdBy
);
// Build the update query
const sets: string[] = ['updated_at = ?', 'version = ?'];
const params: any[] = [now, newVersion];
if (input.title !== undefined) { sets.push('title = ?'); params.push(input.title); }
if (input.content !== undefined) { sets.push('content = ?'); params.push(input.content); }
if (input.status !== undefined) { sets.push('status = ?'); params.push(input.status); }
if (input.isStale !== undefined) { sets.push('is_stale = ?'); params.push(input.isStale ? 1 : 0); }
if (input.tags !== undefined) { sets.push('tags = ?'); params.push(JSON.stringify(input.tags)); }
if (input.metadata !== undefined) {
const merged = { ...existing.metadata, ...input.metadata };
sets.push('metadata = ?');
params.push(JSON.stringify(merged));
}
params.push(id);
db.prepare(`UPDATE nodes SET ${sets.join(', ')} WHERE id = ?`).run(...params);
// Update tags if changed
if (input.tags !== undefined) {
db.prepare('DELETE FROM node_tags WHERE node_id = ?').run(id);
const insertTag = db.prepare('INSERT OR IGNORE INTO node_tags (node_id, tag) VALUES (?, ?)');
for (const tag of input.tags) {
insertTag.run(id, tag);
}
}
});
transaction();
// Re-embed if title or content changed (outside transaction since it's async)
if (input.title !== undefined || input.content !== undefined) {
const newTitle = input.title ?? existing.title;
const newContent = input.content ?? existing.content;
const embedding = await getEmbedding(`${newTitle} ${newContent}`);
if (embedding) {
db.prepare('UPDATE nodes SET embedding = ? WHERE id = ?').run(serializeEmbedding(embedding), id);
}
}
@@ -195,3 +260,114 @@ export async function query(text: string, options: QueryOptions = {}): Promise<S
const nodes = listNodes({ kind: options.kind, tags: options.tags, includeStale: options.includeStale });
return hybridSearch(nodes, text, options);
}
// Version tracking functions
function rowToNodeVersion(row: any): NodeVersion {
return {
id: row.id,
nodeId: row.node_id,
version: row.version,
title: row.title,
content: row.content,
status: row.status ?? undefined,
tags: JSON.parse(row.tags || '[]'),
metadata: JSON.parse(row.metadata || '{}'),
validFrom: row.valid_from,
validUntil: row.valid_until ?? null,
createdBy: row.created_by,
};
}
function rowToHistoricalNode(row: any, nodeRow: any): HistoricalNode {
return {
id: nodeRow.id,
kind: nodeRow.kind,
title: row.title,
content: row.content,
status: row.status ?? undefined,
tags: JSON.parse(row.tags || '[]'),
metadata: JSON.parse(row.metadata || '{}'),
version: row.version,
validFrom: row.valid_from,
validUntil: row.valid_until ?? null,
};
}
export function getNodeHistory(id: string): NodeVersion[] {
const db = getDb();
const rows = db.prepare(`
SELECT * FROM node_versions WHERE node_id = ? ORDER BY version DESC
`).all(id) as any[];
return rows.map(rowToNodeVersion);
}
export function getNodeAtTime(id: string, timestamp: number): HistoricalNode | null {
const db = getDb();
const nodeRow = db.prepare('SELECT * FROM nodes WHERE id = ?').get(id) as any;
if (!nodeRow) return null;
const versionRow = db.prepare(`
SELECT * FROM node_versions
WHERE node_id = ? AND valid_from <= ? AND (valid_until IS NULL OR valid_until > ?)
ORDER BY version DESC LIMIT 1
`).get(id, timestamp, timestamp) as any;
if (!versionRow) return null;
return rowToHistoricalNode(versionRow, nodeRow);
}
export function getNodeVersion(id: string, version: number): HistoricalNode | null {
const db = getDb();
const nodeRow = db.prepare('SELECT * FROM nodes WHERE id = ?').get(id) as any;
if (!nodeRow) return null;
const versionRow = db.prepare(`
SELECT * FROM node_versions WHERE node_id = ? AND version = ?
`).get(id, version) as any;
if (!versionRow) return null;
return rowToHistoricalNode(versionRow, nodeRow);
}
export function diffVersions(id: string, v1: number, v2: number): NodeDiff | null {
const version1 = getNodeVersion(id, v1);
const version2 = getNodeVersion(id, v2);
if (!version1 || !version2) return null;
const changes: NodeDiff['changes'] = [];
const fieldsToCompare: (keyof HistoricalNode)[] = ['title', 'content', 'status', 'tags', 'metadata'];
for (const field of fieldsToCompare) {
const oldVal = version1[field];
const newVal = version2[field];
const oldStr = JSON.stringify(oldVal);
const newStr = JSON.stringify(newVal);
if (oldStr !== newStr) {
changes.push({ field, old: oldVal, new: newVal });
}
}
return { nodeId: id, v1, v2, changes };
}
export async function restoreVersion(id: string, version: number, createdBy: string = 'restore'): Promise<Node | null> {
const historical = getNodeVersion(id, version);
if (!historical) return null;
// updateNode will handle creating a new version
return updateNode(id, {
title: historical.title,
content: historical.content,
status: historical.status,
tags: historical.tags,
metadata: historical.metadata,
}, createdBy);
}
export function getCurrentVersion(id: string): number {
const db = getDb();
const row = db.prepare('SELECT version FROM nodes WHERE id = ?').get(id) as any;
return row?.version ?? 1;
}
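The temporal predicate behind `getNodeAtTime` can be sketched over an in-memory array instead of SQLite. `VersionRecord` and `versionAt` are hypothetical stand-ins, but the selection rule is the same as the SQL above: `valid_from <= ts AND (valid_until IS NULL OR valid_until > ts)`, preferring the highest version.

```typescript
interface VersionRecord {
  version: number;
  validFrom: number;         // inclusive
  validUntil: number | null; // exclusive; null marks the current version
  content: string;
}

// Same predicate as the SQL in getNodeAtTime, with ORDER BY version DESC LIMIT 1.
function versionAt(history: VersionRecord[], ts: number): VersionRecord | null {
  return history
    .filter(v => v.validFrom <= ts && (v.validUntil === null || v.validUntil > ts))
    .sort((a, b) => b.version - a.version)[0] ?? null;
}
```

Making `validUntil` exclusive means the instant a new version's `validFrom` begins is unambiguously owned by the new version, so the intervals tile without overlap.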


@@ -1,7 +1,7 @@
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import { z } from 'zod/v3';
import { query, listNodes, getNode, findNodeByPrefix, addNode, addEdge, removeNode, removeEdge, updateNode } from '../core/store';
import { query, listNodes, getNode, findNodeByPrefix, addNode, addEdge, removeNode, removeEdge, updateNode, getNodeHistory, getNodeAtTime, getNodeVersion, diffVersions, restoreVersion } from '../core/store';
import { getConnections, getEdgesByNode } from '../core/graph';
import { cosineSimilarity } from '../core/search/vector';
import { getDb } from '../core/db';
@@ -416,6 +416,69 @@ server.tool(
}
);
// --- memory_capture ---
import { captureConversation, captureText, getCaptureConfig, setCaptureConfig, CaptureMode } from '../core/capture';
server.tool(
'memory_capture',
'Capture a conversation or context as a memory node. Uses AI to summarize and extract key information.',
{
conversation: z.string().describe('The conversation or context to capture'),
sessionId: z.string().optional().describe('Session identifier'),
filesChanged: z.array(z.string()).optional().describe('List of files that were changed'),
source: z.string().optional().describe('Source identifier (default: claude-code)'),
},
async ({ conversation, sessionId, filesChanged, source }) => {
const result = await captureConversation({
conversation,
sessionId,
filesChanged,
source: source || 'claude-code',
});
return { content: [{ type: 'text' as const, text: serialize(result) }] };
}
);
server.tool(
'memory_remember',
'Remember a piece of text for later. Simpler than memory_capture - for quick notes and facts.',
{
text: z.string().describe('The text to remember'),
tags: z.array(z.string()).optional().describe('Tags to apply'),
},
async ({ text, tags }) => {
const result = await captureText(text, { tags, source: 'remember' });
return { content: [{ type: 'text' as const, text: serialize(result) }] };
}
);
server.tool(
'memory_capture_config',
'Get or set auto-capture configuration',
{
action: z.enum(['get', 'set']).describe('Action to perform'),
mode: z.enum(['always', 'manual', 'decisions', 'off']).optional().describe('Capture mode (for set)'),
minLength: z.number().optional().describe('Minimum conversation length (for set)'),
autoTag: z.boolean().optional().describe('Auto-generate tags (for set)'),
linkRelated: z.boolean().optional().describe('Auto-link related nodes (for set)'),
},
async ({ action, mode, minLength, autoTag, linkRelated }) => {
if (action === 'get') {
const config = getCaptureConfig();
return { content: [{ type: 'text' as const, text: serialize(config) }] };
}
const updates: Partial<{ mode: CaptureMode; minLength: number; autoTag: boolean; linkRelated: boolean }> = {};
if (mode !== undefined) updates.mode = mode;
if (minLength !== undefined) updates.minLength = minLength;
if (autoTag !== undefined) updates.autoTag = autoTag;
if (linkRelated !== undefined) updates.linkRelated = linkRelated;
const config = setCaptureConfig(updates);
return { content: [{ type: 'text' as const, text: serialize({ updated: true, config }) }] };
}
);
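The four `!== undefined` guards above implement partial updates: only fields the caller explicitly supplied overwrite the stored configuration. `setCaptureConfig` itself is not part of this hunk; a minimal sketch of the merge semantics those guards imply (names and defaults hypothetical):

```typescript
type CaptureMode = "always" | "manual" | "decisions" | "off";

interface CaptureConfig {
  mode: CaptureMode;
  minLength: number;
  autoTag: boolean;
  linkRelated: boolean;
}

// Hypothetical defaults for illustration only.
const defaults: CaptureConfig = {
  mode: "manual",
  minLength: 200,
  autoTag: true,
  linkRelated: true,
};

// Merge only the keys that were explicitly provided; `undefined`
// values never clobber the existing configuration.
function applyConfigUpdates(
  current: CaptureConfig,
  updates: Partial<CaptureConfig>
): CaptureConfig {
  const next = { ...current };
  for (const key of Object.keys(updates) as (keyof CaptureConfig)[]) {
    if (updates[key] !== undefined) {
      (next as any)[key] = updates[key];
    }
  }
  return next;
}
```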
// --- memory_prompt ---
import { interpretAndExecute } from '../core/prompt/interpreter';
@@ -431,6 +494,538 @@ server.tool(
}
);
// --- memory_history ---
server.tool(
'memory_history',
'Get version history for a node',
{
id: z.string().describe('Node ID or prefix'),
},
async ({ id }) => {
const node = getNode(id) ?? findNodeByPrefix(id);
if (!node) {
return { content: [{ type: 'text' as const, text: serialize({ error: 'Node not found' }) }], isError: true };
}
const history = getNodeHistory(node.id);
return { content: [{ type: 'text' as const, text: serialize({ nodeId: node.id, title: node.title, versions: history }) }] };
}
);
// --- memory_show_at ---
server.tool(
'memory_show_at',
'Show node at a specific point in time',
{
id: z.string().describe('Node ID or prefix'),
timestamp: z.union([z.number(), z.string()]).describe('Unix ms or ISO date string'),
},
async ({ id, timestamp }) => {
const node = getNode(id) ?? findNodeByPrefix(id);
if (!node) {
return { content: [{ type: 'text' as const, text: serialize({ error: 'Node not found' }) }], isError: true };
}
// Parse timestamp
let ts: number;
if (typeof timestamp === 'number') {
ts = timestamp;
} else {
const parsed = Date.parse(timestamp);
if (isNaN(parsed)) {
return { content: [{ type: 'text' as const, text: serialize({ error: 'Invalid timestamp format' }) }], isError: true };
}
ts = parsed;
}
const historical = getNodeAtTime(node.id, ts);
if (!historical) {
return { content: [{ type: 'text' as const, text: serialize({ error: 'No version found for the specified time' }) }], isError: true };
}
return { content: [{ type: 'text' as const, text: serialize(historical) }] };
}
);
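The inline timestamp handling above (unix-ms passthrough, `Date.parse` for strings) could be factored into a small helper; this is a sketch for illustration, not code from the diff:

```typescript
// Accepts unix milliseconds or any string Date.parse understands
// (ISO 8601 recommended); returns null when the input is unparseable.
function parseTimestamp(input: number | string): number | null {
  if (typeof input === "number") return input;
  const parsed = Date.parse(input);
  return Number.isNaN(parsed) ? null : parsed;
}
```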
// --- memory_diff ---
server.tool(
'memory_diff',
'Compare two versions of a node',
{
id: z.string().describe('Node ID or prefix'),
v1: z.number().describe('First version number'),
v2: z.number().describe('Second version number'),
},
async ({ id, v1, v2 }) => {
const node = getNode(id) ?? findNodeByPrefix(id);
if (!node) {
return { content: [{ type: 'text' as const, text: serialize({ error: 'Node not found' }) }], isError: true };
}
const diff = diffVersions(node.id, v1, v2);
if (!diff) {
return { content: [{ type: 'text' as const, text: serialize({ error: 'One or both versions not found' }) }], isError: true };
}
return { content: [{ type: 'text' as const, text: serialize(diff) }] };
}
);
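`diffVersions` is imported elsewhere in the file and not shown in this hunk. A minimal sketch of a field-level diff producing the `NodeDiff` shape declared later in this changeset (`changes: { field, old, new }[]`) — names hypothetical:

```typescript
interface VersionSnapshot {
  title: string;
  content: string;
  status?: string;
  tags: string[];
}

interface FieldChange {
  field: string;
  old: unknown;
  new: unknown;
}

// Compare two version snapshots field by field; JSON comparison is
// sufficient for these flat values and the tags array.
function diffSnapshots(a: VersionSnapshot, b: VersionSnapshot): FieldChange[] {
  const fields: (keyof VersionSnapshot)[] = ["title", "content", "status", "tags"];
  const changes: FieldChange[] = [];
  for (const field of fields) {
    if (JSON.stringify(a[field]) !== JSON.stringify(b[field])) {
      changes.push({ field, old: a[field], new: b[field] });
    }
  }
  return changes;
}
```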
// --- memory_restore ---
server.tool(
'memory_restore',
'Restore a node to a previous version (creates new version)',
{
id: z.string().describe('Node ID or prefix'),
version: z.number().describe('Version number to restore'),
},
async ({ id, version }) => {
const node = getNode(id) ?? findNodeByPrefix(id);
if (!node) {
return { content: [{ type: 'text' as const, text: serialize({ error: 'Node not found' }) }], isError: true };
}
const restored = await restoreVersion(node.id, version);
if (!restored) {
return { content: [{ type: 'text' as const, text: serialize({ error: 'Version not found' }) }], isError: true };
}
return { content: [{ type: 'text' as const, text: serialize({ message: `Restored to version ${version}`, node: restored }) }] };
}
);
// --- memory_journal ---
import { getOrCreateJournal, appendToJournal, listJournals, generateJournalSummary, JournalMetadata } from '../core/journal';
server.tool(
'memory_journal',
'Get or create today\'s journal, or add an entry to it',
{
text: z.string().optional().describe('Text to add to journal (if omitted, returns current journal)'),
date: z.string().optional().describe('Specific date (YYYY-MM-DD)'),
tags: z.array(z.string()).optional().describe('Tags for the entry'),
},
async ({ text, date, tags }) => {
if (text) {
const { journal, entry } = await appendToJournal(text, { tags, date });
const meta = journal.metadata as JournalMetadata;
return { content: [{ type: 'text' as const, text: serialize({ added: true, date: meta.date, entry }) }] };
}
const journal = await getOrCreateJournal(date);
return { content: [{ type: 'text' as const, text: serialize(journal) }] };
}
);
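Journals are keyed by `YYYY-MM-DD` date strings (the `date` parameter above, and the `month` filter on `memory_journal_list`). A sketch of the date-key derivation such a store might use — hypothetical, since `getOrCreateJournal` lives in `../core/journal` and is not shown:

```typescript
// Derive the YYYY-MM-DD key for a journal; defaults to today (UTC).
function journalDateKey(date?: string): string {
  if (date) {
    if (!/^\d{4}-\d{2}-\d{2}$/.test(date)) {
      throw new Error(`Expected YYYY-MM-DD, got: ${date}`);
    }
    return date;
  }
  return new Date().toISOString().slice(0, 10);
}
```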
server.tool(
'memory_journal_list',
'List recent journals',
{
limit: z.number().optional().describe('Max journals to return (default: 10)'),
month: z.string().optional().describe('Filter by month (YYYY-MM)'),
},
async ({ limit, month }) => {
const journals = listJournals({ limit: limit || 10, month });
return {
content: [{
type: 'text' as const,
text: serialize(journals.map(j => {
const meta = j.metadata as JournalMetadata;
return {
id: j.id,
date: meta.date,
entries: meta.entries?.length || 0,
hasSummary: !!meta.summary,
};
})),
}],
};
}
);
server.tool(
'memory_journal_summarize',
'Generate AI summary for a journal',
{
date: z.string().optional().describe('Date to summarize (default: today)'),
},
async ({ date }) => {
const summary = await generateJournalSummary(date);
return { content: [{ type: 'text' as const, text: serialize({ summary }) }] };
}
);
// --- memory_export ---
import { exportGraph, ExportFormat } from '../core/export';
server.tool(
'memory_export',
'Export the knowledge graph as HTML, SVG, or Mermaid diagram',
{
format: z.enum(['html', 'svg', 'mermaid']).describe('Export format'),
rootId: z.string().optional().describe('Root node ID for subgraph export'),
depth: z.number().optional().describe('Depth for subgraph (default: 3)'),
kind: z.string().optional().describe('Filter by node kind'),
tags: z.array(z.string()).optional().describe('Filter by tags'),
theme: z.enum(['light', 'dark']).optional().describe('Theme for HTML (default: dark)'),
},
async ({ format, rootId, depth, kind, tags, theme }) => {
const content = await exportGraph({
format: format as ExportFormat,
rootId,
depth,
kind,
tags,
theme,
});
return { content: [{ type: 'text' as const, text: content }] };
}
);
// --- memory_ingest ---
import { ingest } from '../core/ingest';
server.tool(
'memory_ingest',
'Ingest content from a URL or text into the knowledge graph',
{
source: z.string().describe('URL or raw text to ingest'),
title: z.string().optional().describe('Override title'),
tags: z.array(z.string()).optional().describe('Tags to apply'),
isUrl: z.boolean().optional().describe('Treat source as URL (auto-detected if not specified)'),
},
async ({ source, title, tags, isUrl }) => {
// If explicitly not a URL, or doesn't look like a URL, treat as raw text
const isSourceUrl = isUrl ?? (source.startsWith('http://') || source.startsWith('https://'));
if (!isSourceUrl) {
// Treat as raw text - create a simple memory node
const node = await addNode({
kind: 'memory',
title: title || 'Ingested Content',
content: source,
tags: ['ingested', 'text', ...(tags || [])],
metadata: { source: { type: 'text', ingestedAt: Date.now() } },
});
return { content: [{ type: 'text' as const, text: serialize({ success: true, nodeId: node.id, title: node.title }) }] };
}
const result = await ingest(source, { title, tags });
return { content: [{ type: 'text' as const, text: serialize(result) }] };
}
);
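The `isUrl ?? source.startsWith(...)` expression above auto-detects URLs by scheme prefix, with an explicit flag taking precedence. The same heuristic factored out (a sketch; the real handler inlines it):

```typescript
// Mirror the handler's heuristic: an explicit flag wins; otherwise
// treat http(s)-prefixed strings as URLs and everything else as raw text.
function isSourceUrl(source: string, explicit?: boolean): boolean {
  return explicit ?? (source.startsWith("http://") || source.startsWith("https://"));
}
```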
server.tool(
'memory_clip',
'Quick clip a URL into memory',
{
url: z.string().describe('URL to clip'),
title: z.string().optional().describe('Override title'),
tags: z.array(z.string()).optional().describe('Tags to apply'),
},
async ({ url, title, tags }) => {
if (!url.startsWith('http://') && !url.startsWith('https://')) {
return { content: [{ type: 'text' as const, text: serialize({ error: 'Invalid URL' }) }], isError: true };
}
const result = await ingest(url, { title, tags });
return { content: [{ type: 'text' as const, text: serialize(result) }] };
}
);
// --- memory_graphs ---
import { listGraphs, createGraph, deleteGraph, useGraph, getActiveGraph, graphExists, initProject } from '../core/graphs';
server.tool(
'memory_graphs',
'List available knowledge graphs',
{},
async () => {
const graphs = listGraphs();
const active = getActiveGraph();
return {
content: [{
type: 'text' as const,
text: serialize({
activeGraph: active,
graphs: graphs.map(g => ({
name: g.name,
isActive: g.name === active,
nodeCount: g.nodeCount,
edgeCount: g.edgeCount,
size: g.size,
lastAccessed: new Date(g.lastAccessed).toISOString(),
})),
}),
}],
};
}
);
server.tool(
'memory_use_graph',
'Switch to a different knowledge graph',
{
name: z.string().describe('Graph name to switch to'),
create: z.boolean().optional().describe('Create graph if it does not exist'),
},
async ({ name, create }) => {
if (!graphExists(name)) {
if (create) {
createGraph(name);
} else {
return {
content: [{ type: 'text' as const, text: serialize({ error: `Graph '${name}' does not exist` }) }],
isError: true,
};
}
}
useGraph(name);
return { content: [{ type: 'text' as const, text: serialize({ switched: true, graph: name }) }] };
}
);
server.tool(
'memory_create_graph',
'Create a new knowledge graph',
{
name: z.string().describe('Graph name (alphanumeric, dashes, underscores)'),
},
async ({ name }) => {
try {
const graph = createGraph(name);
return { content: [{ type: 'text' as const, text: serialize(graph) }] };
} catch (err: any) {
return { content: [{ type: 'text' as const, text: serialize({ error: err.message }) }], isError: true };
}
}
);
server.tool(
'memory_delete_graph',
'Delete a knowledge graph (cannot delete "default")',
{
name: z.string().describe('Graph name to delete'),
},
async ({ name }) => {
try {
deleteGraph(name);
return { content: [{ type: 'text' as const, text: serialize({ deleted: true, graph: name }) }] };
} catch (err: any) {
return { content: [{ type: 'text' as const, text: serialize({ error: err.message }) }], isError: true };
}
}
);
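`memory_use_graph` implements a create-if-missing switch over the functions imported from `../core/graphs`, which are not in this hunk. A minimal in-memory sketch of that control flow (class and validation rule hypothetical; the name constraint echoes the `memory_create_graph` description):

```typescript
class GraphRegistry {
  private graphs = new Set<string>(["default"]);
  private active = "default";

  exists(name: string): boolean {
    return this.graphs.has(name);
  }

  create(name: string): void {
    if (!/^[A-Za-z0-9_-]+$/.test(name)) {
      throw new Error("Graph name must be alphanumeric, dashes, underscores");
    }
    this.graphs.add(name);
  }

  // Same semantics as the tool: switch to the graph, optionally
  // creating it first; returns false if it is missing and create=false.
  use(name: string, create = false): boolean {
    if (!this.exists(name)) {
      if (!create) return false;
      this.create(name);
    }
    this.active = name;
    return true;
  }

  getActive(): string {
    return this.active;
  }
}
```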
// --- memory_import ---
import { importObsidian } from '../core/import/obsidian';
import { importMarkdown } from '../core/import/markdown';
server.tool(
'memory_import',
'Import data from Obsidian vault or markdown folder',
{
source: z.enum(['obsidian', 'markdown']).describe('Source type'),
path: z.string().describe('Path to import from'),
tags: z.array(z.string()).optional().describe('Additional tags (markdown only)'),
kind: z.string().optional().describe('Node kind (default: memory)'),
hierarchy: z.boolean().optional().describe('Create folder hierarchy (obsidian only)'),
dryRun: z.boolean().optional().describe('Preview without making changes'),
},
async ({ source, path, tags, kind, hierarchy, dryRun }) => {
try {
if (source === 'obsidian') {
const result = await importObsidian(path, { kind, hierarchy, dryRun });
return { content: [{ type: 'text' as const, text: serialize(result) }] };
} else {
const result = await importMarkdown(path, { kind: kind as any, tags, dryRun });
return { content: [{ type: 'text' as const, text: serialize(result) }] };
}
} catch (err: any) {
return { content: [{ type: 'text' as const, text: serialize({ error: err.message }) }], isError: true };
}
}
);
// --- memory_backup ---
import { createBackup, restoreBackup, listBackups } from '../core/backup';
server.tool(
'memory_backup',
'Manage database backups: create, restore, or list',
{
action: z.enum(['create', 'restore', 'list']).describe('Action to perform'),
path: z.string().describe('Path for backup file or directory'),
},
async ({ action, path }) => {
try {
switch (action) {
case 'create': {
const result = await createBackup(path);
return { content: [{ type: 'text' as const, text: serialize(result) }] };
}
case 'restore': {
const result = await restoreBackup(path);
return { content: [{ type: 'text' as const, text: serialize({ restored: true, ...result }) }] };
}
case 'list': {
const backups = listBackups(path);
return { content: [{ type: 'text' as const, text: serialize({ backups }) }] };
}
}
} catch (err: any) {
return { content: [{ type: 'text' as const, text: serialize({ error: err.message }) }], isError: true };
}
}
);
// --- memory_export_markdown ---
import { exportMarkdown as exportMd } from '../core/export/markdown';
import { exportJsonLd } from '../core/export/jsonld';
server.tool(
'memory_export_markdown',
'Export knowledge graph to markdown files',
{
outputDir: z.string().describe('Output directory'),
kind: z.string().optional().describe('Filter by node kind'),
tags: z.array(z.string()).optional().describe('Filter by tags'),
frontmatter: z.boolean().optional().describe('Include frontmatter (default: true)'),
wikilinks: z.boolean().optional().describe('Include wikilinks (default: true)'),
},
async ({ outputDir, kind, tags, frontmatter, wikilinks }) => {
try {
const result = await exportMd(outputDir, {
kind: kind as any,
tags,
frontmatter: frontmatter !== false,
wikilinks: wikilinks !== false,
});
return { content: [{ type: 'text' as const, text: serialize(result) }] };
} catch (err: any) {
return { content: [{ type: 'text' as const, text: serialize({ error: err.message }) }], isError: true };
}
}
);
server.tool(
'memory_export_jsonld',
'Export knowledge graph as JSON-LD linked data',
{
kind: z.string().optional().describe('Filter by node kind'),
tags: z.array(z.string()).optional().describe('Filter by tags'),
},
async ({ kind, tags }) => {
try {
const result = await exportJsonLd({
kind: kind as any,
tags,
pretty: true,
});
return { content: [{ type: 'text' as const, text: result }] };
} catch (err: any) {
return { content: [{ type: 'text' as const, text: serialize({ error: err.message }) }], isError: true };
}
}
);
// --- memory_smart_search ---
import { smartSearch, gatherWhatContext, formatWhatContext } from '../core/search/smart';
server.tool(
'memory_smart_search',
'Context-aware search that uses git and file signals for relevance boosting',
{
query: z.string().optional().describe('Optional explicit search query'),
kind: z.enum(['memory', 'component', 'task', 'decision']).optional().describe('Filter by kind'),
limit: z.number().optional().describe('Max results (default: 10)'),
},
async ({ query: searchQuery, kind, limit }) => {
const results = await smartSearch(searchQuery, {
kind: kind as NodeKind,
limit: limit || 10,
});
return {
content: [{
type: 'text' as const,
text: serialize({
count: results.length,
results: results.map(r => ({
id: r.node.id,
kind: r.node.kind,
title: r.node.title,
score: r.score,
originalScore: r.originalScore,
boosts: r.boosts,
tags: r.node.tags,
})),
}),
}],
};
}
);
server.tool(
'memory_what',
'Get relevant context for current work: branch-related nodes, file-related nodes, open tasks, recent decisions',
{},
async () => {
const context = await gatherWhatContext();
const formatted = formatWhatContext(context);
return {
content: [{
type: 'text' as const,
text: formatted || 'No relevant context found.',
}],
};
}
);
// --- memory_index ---
import { indexProject } from '../core/indexer';
server.tool(
'memory_index',
'Index a codebase to create component nodes. Scans files, extracts exports/imports, and maps relationships.',
{
path: z.string().optional().describe('Path to index (default: current directory)'),
update: z.boolean().optional().describe('Only update changed files (incremental)'),
language: z.string().optional().describe('Only index specific language (ts, js, py)'),
maxDepth: z.number().optional().describe('Maximum directory depth (default: 10)'),
},
async ({ path: inputPath, update, language, maxDepth }) => {
const result = await indexProject(inputPath || '.', {
update,
language,
maxDepth,
});
return { content: [{ type: 'text' as const, text: serialize(result) }] };
}
);
// --- memory_components ---
server.tool(
'memory_components',
'List indexed components for a project',
{
project: z.string().optional().describe('Project name to filter by'),
limit: z.number().optional().describe('Max results (default: 50)'),
},
async ({ project, limit }) => {
const tags = project ? [project, 'indexed'] : ['indexed'];
const components = listNodes({ kind: 'component' as NodeKind, tags, limit: limit || 50 });
return {
content: [{
type: 'text' as const,
text: serialize({
count: components.length,
components: components.map(c => ({
id: c.id,
title: c.title,
filePath: c.metadata?.filePath,
exports: (c.metadata?.exports as string[])?.length || 0,
loc: c.metadata?.loc,
})),
}),
}],
};
}
);
async function main() {
const transport = new StdioServerTransport();
await server.connect(transport);

@@ -4,6 +4,7 @@ import { getConnections, buildTree } from '../core/graph';
import { getDb } from '../core/db';
import { determineGroupingStrategy, groupResults } from './queryOrganizer';
import { getLastReport, markDirty, runMaintenance } from './heartbeat';
import { getCachedSummary, generateSummary } from '../core/summary';
import { gatherContext } from '../core/context';
const router = Router();
@@ -150,6 +151,20 @@ router.post('/query/organize', async (req: Request, res: Response) => {
}
});
// Summary — hierarchical pre-computed graph summary
router.get('/summary', async (req: Request, res: Response) => {
try {
const refresh = req.query.refresh === 'true';
let summary = refresh ? null : getCachedSummary();
if (!summary) {
summary = await generateSummary();
}
res.json(summary);
} catch (err: any) {
res.status(500).json({ error: err.message });
}
});
// Maintenance status
router.get('/maintenance/status', (_req: Request, res: Response) => {
const report = getLastReport();

src/tui/index.ts (new file, 392 lines)

@@ -0,0 +1,392 @@
import * as readline from 'readline';
import chalk from 'chalk';
import { listNodes, query, getNode } from '../core/store';
import { getConnections } from '../core/graph';
import { getActiveGraph } from '../core/graphs';
import { Node } from '../types';
interface TuiState {
mode: 'browse' | 'search' | 'detail' | 'help';
nodes: Node[];
selected: number;
searchQuery: string;
detailNode: Node | null;
scroll: number;
}
const ITEMS_PER_PAGE = 10;
let state: TuiState = {
mode: 'browse',
nodes: [],
selected: 0,
searchQuery: '',
detailNode: null,
scroll: 0,
};
/**
* Launch the TUI dashboard
*/
export async function launchTui(): Promise<void> {
// Load initial nodes
state.nodes = listNodes({ limit: 100, includeStale: false });
// Setup terminal
setupTerminal();
render();
// Setup input handling
const rl = readline.createInterface({
input: process.stdin,
output: process.stdout,
terminal: false,
});
// Enable raw mode for keystroke handling
if (process.stdin.isTTY) {
process.stdin.setRawMode(true);
}
process.stdin.on('data', handleKeypress);
// Keep process alive
await new Promise<void>((resolve) => {
process.on('SIGINT', () => {
cleanup();
resolve();
});
});
}
function setupTerminal(): void {
// Clear screen and hide cursor
process.stdout.write('\x1b[2J');
process.stdout.write('\x1b[?25l');
process.stdout.write('\x1b[H');
}
function cleanup(): void {
// Show cursor and clear screen
process.stdout.write('\x1b[?25h');
process.stdout.write('\x1b[2J');
process.stdout.write('\x1b[H');
}
async function handleKeypress(data: Buffer): Promise<void> {
const key = data.toString();
// Global keys
if (key === 'q' || key === '\x03') { // q or Ctrl+C
cleanup();
process.exit(0);
}
if (key === '?') {
state.mode = state.mode === 'help' ? 'browse' : 'help';
render();
return;
}
if (state.mode === 'search') {
await handleSearchInput(key);
} else if (state.mode === 'browse') {
await handleBrowseInput(key);
} else if (state.mode === 'detail') {
handleDetailInput(key);
} else if (state.mode === 'help') {
if (key === '\x1b' || key === '\r') { // Escape or Enter
state.mode = 'browse';
render();
}
}
}
async function handleBrowseInput(key: string): Promise<void> {
switch (key) {
case '\x1b[A': // Up arrow
case 'k':
if (state.selected > 0) {
state.selected--;
if (state.selected < state.scroll) {
state.scroll = state.selected;
}
}
break;
case '\x1b[B': // Down arrow
case 'j':
if (state.selected < state.nodes.length - 1) {
state.selected++;
if (state.selected >= state.scroll + ITEMS_PER_PAGE) {
state.scroll = state.selected - ITEMS_PER_PAGE + 1;
}
}
break;
case '\r': // Enter
if (state.nodes[state.selected]) {
state.detailNode = state.nodes[state.selected];
state.mode = 'detail';
}
break;
case '/':
state.mode = 'search';
state.searchQuery = '';
break;
case 'r':
// Refresh
state.nodes = listNodes({ limit: 100, includeStale: false });
state.selected = 0;
state.scroll = 0;
break;
case '1':
state.nodes = listNodes({ kind: 'memory', limit: 100, includeStale: false });
state.selected = 0;
state.scroll = 0;
break;
case '2':
state.nodes = listNodes({ kind: 'component', limit: 100, includeStale: false });
state.selected = 0;
state.scroll = 0;
break;
case '3':
state.nodes = listNodes({ kind: 'task', limit: 100, includeStale: false });
state.selected = 0;
state.scroll = 0;
break;
case '4':
state.nodes = listNodes({ kind: 'decision', limit: 100, includeStale: false });
state.selected = 0;
state.scroll = 0;
break;
}
render();
}
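The up/down cases above keep the selection inside a sliding window of `ITEMS_PER_PAGE` rows by dragging `scroll` along with `selected`. The same logic as a pure function (a sketch for illustration):

```typescript
interface ListView {
  selected: number;
  scroll: number;
}

const PAGE = 10; // mirrors ITEMS_PER_PAGE

// Move the selection by delta, clamping to [0, total) and dragging
// the scroll offset so the selected row stays inside the visible window.
function moveSelection(view: ListView, delta: number, total: number): ListView {
  const selected = Math.min(Math.max(view.selected + delta, 0), Math.max(total - 1, 0));
  let scroll = view.scroll;
  if (selected < scroll) scroll = selected;
  if (selected >= scroll + PAGE) scroll = selected - PAGE + 1;
  return { selected, scroll };
}
```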
async function handleSearchInput(key: string): Promise<void> {
if (key === '\x1b') { // Escape
state.mode = 'browse';
state.searchQuery = '';
state.nodes = listNodes({ limit: 100, includeStale: false });
state.selected = 0;
state.scroll = 0;
} else if (key === '\r') { // Enter
state.mode = 'browse';
} else if (key === '\x7f') { // Backspace
state.searchQuery = state.searchQuery.slice(0, -1);
if (state.searchQuery.length > 0) {
const results = await query(state.searchQuery, { limit: 50 });
state.nodes = results.map(r => r.node);
} else {
state.nodes = listNodes({ limit: 100, includeStale: false });
}
state.selected = 0;
state.scroll = 0;
} else if (key.length === 1 && key >= ' ') {
state.searchQuery += key;
if (state.searchQuery.length >= 2) {
const results = await query(state.searchQuery, { limit: 50 });
state.nodes = results.map(r => r.node);
}
state.selected = 0;
state.scroll = 0;
}
render();
}
function handleDetailInput(key: string): void {
if (key === '\x1b' || key === 'q' || key === '\r') { // Escape, q, or Enter
state.mode = 'browse';
state.detailNode = null;
} else if (key === '\x1b[A' || key === 'k') {
// Scroll up in detail view
// Could implement scrolling for long content
} else if (key === '\x1b[B' || key === 'j') {
// Scroll down in detail view
}
render();
}
function render(): void {
// Clear screen
process.stdout.write('\x1b[2J');
process.stdout.write('\x1b[H');
const width = process.stdout.columns || 80;
const height = process.stdout.rows || 24;
const activeGraph = getActiveGraph();
// Header
const header = ` CORTEX TUI `;
const graphInfo = ` Graph: ${activeGraph} `;
const headerLine = chalk.bgBlue.white(header) + ' '.repeat(Math.max(0, width - header.length - graphInfo.length)) + chalk.bgGreen.black(graphInfo);
console.log(headerLine);
console.log(chalk.dim('─'.repeat(width)));
if (state.mode === 'help') {
renderHelp(width, height);
return;
}
if (state.mode === 'detail' && state.detailNode) {
renderDetail(state.detailNode, width, height);
return;
}
// Search bar
if (state.mode === 'search') {
console.log(chalk.cyan('🔍 Search: ') + state.searchQuery + chalk.dim('_'));
} else {
console.log(chalk.dim('🔍 Press / to search'));
}
console.log();
// Filter info
console.log(chalk.dim(`Showing ${state.nodes.length} nodes [1]Memory [2]Component [3]Task [4]Decision`));
console.log();
// Node list
const visibleNodes = state.nodes.slice(state.scroll, state.scroll + ITEMS_PER_PAGE);
for (let i = 0; i < ITEMS_PER_PAGE; i++) {
const nodeIndex = state.scroll + i;
const node = visibleNodes[i];
if (!node) {
console.log();
continue;
}
const isSelected = nodeIndex === state.selected;
const prefix = isSelected ? chalk.cyan('▶ ') : ' ';
const kindColor = getKindColor(node.kind);
const title = node.title.slice(0, width - 30);
const id = node.id.slice(0, 8);
let line = `${prefix}${chalk.dim(id)} [${kindColor(node.kind.padEnd(9))}] ${isSelected ? chalk.bold(title) : title}`;
if (node.status) {
line += chalk.dim(` (${node.status})`);
}
console.log(line);
}
// Scrollbar indicator
if (state.nodes.length > ITEMS_PER_PAGE) {
const scrollPercent = Math.floor((state.selected / state.nodes.length) * 100);
console.log();
console.log(chalk.dim(` Showing ${state.scroll + 1}-${Math.min(state.scroll + ITEMS_PER_PAGE, state.nodes.length)} of ${state.nodes.length} (${scrollPercent}%)`));
}
// Footer
console.log();
console.log(chalk.dim('─'.repeat(width)));
console.log(chalk.dim('[↑↓/jk]Navigate [Enter]View [/]Search [r]Refresh [?]Help [q]Quit'));
}
function renderDetail(node: Node, width: number, height: number): void {
console.log(chalk.bold.cyan(`\n ${node.title}\n`));
console.log(chalk.dim(` ID: ${node.id}`));
console.log(` Kind: ${getKindColor(node.kind)(node.kind)}`);
if (node.status) {
console.log(` Status: ${node.status}`);
}
if (node.tags.length > 0) {
console.log(` Tags: ${chalk.yellow(node.tags.join(', '))}`);
}
console.log(` Created: ${new Date(node.createdAt).toLocaleString()}`);
console.log(` Updated: ${new Date(node.updatedAt).toLocaleString()}`);
console.log(chalk.dim('\n ─── Content ───\n'));
if (node.content) {
const contentLines = node.content.split('\n').slice(0, height - 20);
for (const line of contentLines) {
console.log(` ${line.slice(0, width - 4)}`);
}
if (node.content.split('\n').length > height - 20) {
console.log(chalk.dim(' ... (truncated)'));
}
} else {
console.log(chalk.dim(' (no content)'));
}
// Connections
const connections = getConnections(node.id);
if (connections.outgoing.length > 0 || connections.incoming.length > 0) {
console.log(chalk.dim('\n ─── Connections ───\n'));
if (connections.outgoing.length > 0) {
console.log(chalk.dim(' Outgoing:'));
for (const conn of connections.outgoing.slice(0, 5)) {
console.log(`    ${conn.type}: ${conn.node.title}`);
}
if (connections.outgoing.length > 5) {
console.log(chalk.dim(` ... and ${connections.outgoing.length - 5} more`));
}
}
if (connections.incoming.length > 0) {
console.log(chalk.dim(' Incoming:'));
for (const conn of connections.incoming.slice(0, 5)) {
console.log(`    ${conn.type}: ${conn.node.title}`);
}
if (connections.incoming.length > 5) {
console.log(chalk.dim(` ... and ${connections.incoming.length - 5} more`));
}
}
}
console.log(chalk.dim('\n' + '─'.repeat(width)));
console.log(chalk.dim('[Esc/q]Back'));
}
function renderHelp(width: number, height: number): void {
console.log(chalk.bold.cyan('\n Keyboard Shortcuts\n'));
const shortcuts = [
['↑ / k', 'Move selection up'],
['↓ / j', 'Move selection down'],
['Enter', 'View node details'],
['/', 'Start search'],
['Esc', 'Cancel / Go back'],
['r', 'Refresh list'],
['1', 'Show only memories'],
['2', 'Show only components'],
['3', 'Show only tasks'],
['4', 'Show only decisions'],
['?', 'Toggle help'],
['q', 'Quit'],
];
for (const [key, desc] of shortcuts) {
console.log(` ${chalk.cyan(key.padEnd(12))} ${desc}`);
}
console.log(chalk.dim('\n' + '─'.repeat(width)));
console.log(chalk.dim('[Enter/Esc]Close help'));
}
function getKindColor(kind: string): chalk.Chalk {
switch (kind) {
case 'memory': return chalk.blue;
case 'component': return chalk.green;
case 'task': return chalk.yellow;
case 'decision': return chalk.magenta;
default: return chalk.white;
}
}


@@ -71,3 +71,42 @@ export interface ListOptions {
limit?: number;
includeStale?: boolean;
}
// Version tracking types
export interface NodeVersion {
id: string;
nodeId: string;
version: number;
title: string;
content: string;
status?: string;
tags: string[];
metadata: Record<string, any>;
validFrom: number;
validUntil: number | null;
createdBy: string;
}
export interface HistoricalNode {
id: string;
kind: NodeKind;
title: string;
content: string;
status?: string;
tags: string[];
metadata: Record<string, any>;
version: number;
validFrom: number;
validUntil: number | null;
}
export interface NodeDiff {
nodeId: string;
v1: number;
v2: number;
changes: {
field: string;
old: any;
new: any;
}[];
}