Local LLM-powered knowledge graph builder for Obsidian that auto-extracts concepts from markdown notes and creates wiki-style backlinks without cloud dependencies
stars: 28
forks: 4
This is a 28-star, brand-new project (<1 day old) that combines three well-established components: the Obsidian plugin architecture, Ollama local LLM inference, and standard NLP concept extraction. The privacy/local-first positioning is attractive but not novel: Obsidian already has extensive local-first workflows, and Ollama is commodity infrastructure for running local models. The core novelty is applying local LLM inference to auto-generate wiki structure, but this is a straightforward application of existing techniques (prompt-based entity extraction plus link injection): no custom model architecture, no novel algorithm, no specialized dataset. Given the repo's age and low adoption, the implementation appears to be at an early prototype stage.

Obsidian's plugin ecosystem is highly fragmented; this tool would compete with existing knowledge-graph plugins (Breadcrumbs, Graph Analysis, etc.) and Copilot-style plugins, differentiated only by its local-only constraint. Frontier labs (Anthropic, OpenAI) would view this as a niche application layer, not a platform threat. Frontier risk is medium only because they could trivially add a "local mode" or "Obsidian integration" as a feature flag to their APIs, making standalone local tools less compelling over time. Switching costs are near zero: users can fork their notes, try another tool, or prompt their LLM manually. Ecosystem lock-in is weak; the value lies in the Obsidian ecosystem itself, not in the plugin's technical depth.
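The extraction-plus-injection pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: `extractConcepts` is a hypothetical stand-in for a prompt sent to a local Ollama model (its name, prompt, and output shape are assumptions), while `injectLinks` shows the pure-text step of wrapping first occurrences of extracted concepts in wiki-style backlinks.

```typescript
// Hypothetical sketch of prompt-based entity extraction + link injection.
// In a real plugin the concepts would come from a local Ollama model;
// here extractConcepts is stubbed so the transform stays self-contained.

// Stand-in for a prompt like "List the key concepts in this note" sent to
// a local model (e.g. via Ollama's HTTP API on localhost:11434).
function extractConcepts(note: string): string[] {
  // Assumed output shape: a plain list of concept strings.
  return ["Zettelkasten", "spaced repetition"];
}

// Wrap the first unlinked occurrence of each concept in [[wiki links]].
function injectLinks(note: string, concepts: string[]): string {
  let out = note;
  for (const concept of concepts) {
    // Escape regex metacharacters in the concept text.
    const escaped = concept.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
    // Skip occurrences that are already inside [[...]].
    const pattern = new RegExp(`(?<!\\[\\[)\\b${escaped}\\b(?!\\]\\])`);
    out = out.replace(pattern, `[[${concept}]]`);
  }
  return out;
}

const note = "Zettelkasten builds on spaced repetition.";
console.log(injectLinks(note, extractConcepts(note)));
// → [[Zettelkasten]] builds on [[spaced repetition]].
```

The lookbehind/lookahead guards make the transform idempotent: re-running it over an already-linked note leaves existing `[[...]]` spans untouched, which matters for a plugin that re-scans notes on every edit.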
TECH STACK
INTEGRATION: obsidian_plugin
READINESS