0d592b8f2b
feat: optional LLM enhancement + noise filter for topic detection
- Add llm-enhance.ts: optional OpenAI-compatible LLM for deeper analysis
  - Supports any provider: Ollama, OpenAI, OpenRouter, vLLM, etc.
  - Batched calls (configurable batchSize, default 3 messages)
  - Cooldown + timeout + graceful degradation (falls back to regex)
  - JSON structured output: threads, decisions, closures, mood
- Add noise filter (isNoiseTopic):
  - Rejects short/blacklisted/pronoun-starting fragments
  - Fixes garbage threads like 'nichts gepostet habe'
- Improve patterns:
  - Topic regex: min 3 chars, max 40 (was 2-30)
  - Add 'let's talk/discuss/look at' and 'lass uns über/mal' triggers
  - German patterns handle optional articles (dem/die/das)
- Wire LLM into hooks:
  - Regex runs first (zero cost, always)
  - LLM batches and enhances on top (async, fire-and-forget)
  - ThreadTracker.applyLlmAnalysis() merges LLM findings
  - DecisionTracker.addDecision() for decisions detected directly by the LLM
- Config: new 'llm' section (disabled by default)
- 288 tests passing (18 new)
- Version 0.2.0
BREAKING: None. LLM is opt-in; regex behavior is unchanged.
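A minimal sketch of the kind of check isNoiseTopic performs, based only on the bullets above; the actual blacklist, pronoun list, and thresholds in the plugin are assumptions here:

```typescript
// Hypothetical reconstruction of the noise filter. The 3-40 character
// window matches the topic regex bounds stated in this commit; the
// blacklist and pronoun entries below are illustrative, not the real ones.
const BLACKLIST = new Set(["nichts gepostet habe", "keine ahnung"]); // assumed entries
const PRONOUN_STARTS = new Set([
  "ich", "du", "er", "sie", "es", "wir",   // German pronouns
  "i", "you", "he", "she", "it", "we",     // English pronouns
]);

function isNoiseTopic(topic: string): boolean {
  const t = topic.trim().toLowerCase();
  if (t.length < 3 || t.length > 40) return true; // outside the 3-40 char window
  if (BLACKLIST.has(t)) return true;              // known garbage fragments
  const firstWord = t.split(/\s+/)[0];
  if (PRONOUN_STARTS.has(firstWord)) return true; // pronoun-starting fragments
  return false;
}
```

Rejecting pronoun-starting fragments is what catches clauses like 'nichts gepostet habe' that the topic regex would otherwise promote to threads.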
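The new 'llm' config section might be shaped roughly like this; only batchSize, the cooldown/timeout behavior, and disabled-by-default are stated in the commit, so every field name and default below is an assumption:

```typescript
// Assumed shape of the opt-in 'llm' config section. Field names other
// than batchSize are illustrative guesses derived from the feature list.
interface LlmConfig {
  enabled: boolean;   // opt-in: false by default
  baseUrl: string;    // any OpenAI-compatible endpoint (Ollama, vLLM, ...)
  model: string;
  batchSize: number;  // messages per batched call, default 3
  timeoutMs: number;  // request timeout before falling back to regex-only
  cooldownMs: number; // minimum gap between LLM calls
}

const defaultLlmConfig: LlmConfig = {
  enabled: false,
  baseUrl: "http://localhost:11434/v1",
  model: "llama3",
  batchSize: 3,
  timeoutMs: 10_000,
  cooldownMs: 30_000,
};
```

Keeping the whole section optional preserves the promise above: with 'enabled: false' the plugin behaves exactly as the regex-only version.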
2026-02-17 14:04:43 +01:00

44c78eaf5a
docs: rich demo showcase in README + fix openclaw.id in package.json
- README: expanded demo section with collapsible output per feature
- README: shows real conversation, thread tracking, decisions, mood, snapshot, boot context
- package.json: added openclaw.id field (fixes plugin discovery on install)
- Bump v0.1.2
2026-02-17 12:45:57 +01:00

d41a13f914
feat: openclaw-cortex v0.1.0, conversation intelligence plugin
Thread tracking, decision extraction, boot context generation,
pre-compaction snapshots, structured narratives.
- 10 source files, 1983 LOC TypeScript
- 9 test files, 270 tests passing
- Zero runtime dependencies
- Cerberus approved + all findings fixed
- EN/DE pattern matching, atomic file writes
- Graceful degradation (read-only workspace, corrupt JSON)
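Atomic file writes and corrupt-JSON tolerance are usually done with a temp-file-plus-rename and a guarded parse; a sketch under that assumption (plain Node.js, not the plugin's actual code):

```typescript
import { writeFileSync, readFileSync, renameSync } from "node:fs";
import { join, dirname, basename } from "node:path";

// Write JSON atomically: write to a temp file in the same directory,
// then rename over the target. rename() is atomic on POSIX filesystems,
// so readers never observe a half-written file.
function writeJsonAtomic(path: string, data: unknown): void {
  const tmp = join(dirname(path), `.${basename(path)}.tmp`);
  writeFileSync(tmp, JSON.stringify(data, null, 2));
  renameSync(tmp, path);
}

// Read JSON with graceful degradation: a missing or corrupt file
// yields the fallback instead of crashing the plugin.
function readJsonSafe<T>(path: string, fallback: T): T {
  try {
    return JSON.parse(readFileSync(path, "utf8")) as T;
  } catch {
    return fallback;
  }
}
```

The same try/catch shape covers the read-only-workspace case: wrapping writeJsonAtomic in a guard lets the plugin keep running in memory when the disk is not writable.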
2026-02-17 12:16:49 +01:00