A terminal AI coding assistant that runs 100% local with Ollama — or connect Claude, OpenAI, or Groq. 30+ slash commands, a plugin system, hooks, MCP servers, and Nyx, your ASCII co-pilot.
One npm package. No config files, no daemons, no cloud setup required.
npm install -g @localcode/cli
Node 18+ required. That's it.
Run localcode and pick a provider. Choose ollama for fully free, no-key-needed local AI.
Ask anything. Nyx reads files, writes patches, runs shell commands — with permission prompts before every write.
Run /commit to generate a conventional commit message. Nyx co-authors every one.
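A commit generated this way follows the Conventional Commits format and carries Nyx as a trailer. A sample of what the result might look like (the message wording comes from the model, and the trailer address shown is a placeholder, not necessarily what LocalCode emits):

```
feat(auth): refresh access token on 401 responses

Co-Authored-By: Nyx <nyx@localcode.example>
```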
Everything Claude Code and Codex have — plus the things they'll never give you.
Run qwen2.5-coder, deepseek-r1, llama3.2, or any Ollama model. No API key. No cloud. Your code stays on your machine.
Use /provider claude mid-session. Switch from local Ollama to GPT-4o to Claude without restarting. Keys stored per-provider.
Drop a .js file in ~/.localcode/plugins/ to add custom slash commands. Full access to the conversation context and tools.
PreToolUse, PostToolUse, and Notification hooks — run custom scripts before or after any tool call, just like Claude Code.
~/.nyx.md for global memory, .nyx.md in project root for project context. Both loaded automatically at startup.
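Both files are plain markdown. A minimal sketch of a project-level .nyx.md (the contents are entirely up to you; these bullets are illustrative):

```markdown
# Project context
- Monorepo: API lives in server/, React app in web/
- Run tests with `npm test` before proposing a commit
- Never write directly to main; branch from develop
```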
Connect Model Context Protocol servers — both stdio and HTTP transports supported. Give the AI access to your databases, APIs, and tools.
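As a sketch of what wiring up both transport types could look like, assuming a JSON config keyed by server name (the file location, schema, and server packages below are illustrative, not taken from LocalCode's docs):

```json
{
  "mcpServers": {
    "postgres": {
      "transport": "stdio",
      "command": "npx",
      "args": ["@modelcontextprotocol/server-postgres", "postgres://localhost/dev"]
    },
    "internal-api": {
      "transport": "http",
      "url": "http://localhost:8808/mcp"
    }
  }
}
```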
/image path/to/screenshot.png — works with Claude, GPT-4o, and Ollama llava. Describe UI, debug layouts, analyze diagrams.
/index builds a full semantic index of your project. Ask Nyx to find relevant files without passing your entire codebase as context.
Every file write, patch, or shell command asks first. Press y, n, or a (allow all). Full control, always.
Switch live with /provider <name>. API keys are stored per-provider and auto-loaded from env.
Type / to open the interactive picker. Every command has usage hints and tab completion.
Drop a .js file in ~/.localcode/plugins/ and it becomes a first-class slash command.
Name the file, export an object with name, description, and run, and Nyx hot-reloads it on the next command.
Plugins get the full conversation context, can ask the AI questions, and call any of LocalCode's built-in tools.
Package your plugin as an npm module. Anyone can install it with npm install -g localcode-plugin-deploy.
Plugins can register PreToolUse and PostToolUse hooks that fire automatically on every tool call.
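As a sketch of what a PreToolUse hook could do — the event shape ({ tool, input }) and the { allow, reason } return contract here are assumptions for illustration, not LocalCode's documented hook API — a gate that stops shell commands from touching dotenv files:

```javascript
// Sketch: a PreToolUse gate that runs before every tool call.
// Assumed event shape: { tool: string, input: string }.
function preToolUse(event) {
  // Block shell commands that read or write .env files before they execute.
  if (event.tool === 'shell' && /\.env\b/.test(event.input)) {
    return { allow: false, reason: 'Blocked: command touches a .env file' };
  }
  return { allow: true };
}
```

Returning allow: false would surface the reason in the permission prompt instead of running the command.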
Claude Code is great — but it locks you in. LocalCode gives you everything they have, plus the things that matter for serious developers.
Last updated March 2026. Subject to change as tools evolve.
Nyx is LocalCode's ASCII cat mascot. She lives in the header of every session, reacting in real time to what's happening — thinking when the model is working, lighting up green on success, turning red on errors.
She's not just decoration. Every git commit made through /commit carries her co-author signature — so she shows up as a real contributor on your GitHub repos.
Node 18 or later required. Ollama is optional — but recommended for free local AI.
Installing adds the localcode command globally. First run → setup wizard → pick provider → start coding.