Localcode packs every feature a serious developer needs.
- Free forever: Run any Ollama model. No API key. No cloud. Your code never leaves your machine.
- Auto-dispatch: Specialized agents, from AI engineers to security auditors, database optimizers to DevOps automators, each with deep domain expertise.
- NEXUS: The NEXUS pipeline coordinates multiple agents through 7 phases with quality gates, Dev↔QA loops, and parallel execution.
- Full toolset: Read, write, patch, delete, and move files; run shell commands; list directories; search and find; git operations.
- Safety first: Every file write, patch, or shell command asks first. Configurable per tool and per command pattern.
- Flexibility: Ollama, OpenAI, Anthropic, Groq. Switch providers mid-session. A budget guard falls back to local automatically when spending limits are hit.
- Extensible: Drop a .js file into ~/.localcode/plugins/ to add custom slash commands. Zero boilerplate.
- Automation: PreToolUse, PostToolUse, and Notification hooks run custom scripts before or after any tool call.
- Extensible: Connect Model Context Protocol (MCP) servers; both stdio and HTTP transports are supported.
- Persistent context: .localcode.md files hold global and project memory; both are loaded automatically at startup.
- Smart context: /index builds a full semantic index of your project for smart context retrieval.
- Customizable: Dark, Light, Monokai, and Nord themes. Switch with /theme.
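Connecting an MCP server usually means listing it in a client configuration file. The fragment below follows the convention common to MCP clients (a `command` entry for the stdio transport, a `url` entry for HTTP); the exact file name and schema Localcode expects are assumptions, and the `docs-search` endpoint is a placeholder:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
    },
    "docs-search": {
      "url": "https://example.com/mcp"
    }
  }
}
```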