@localcode/cli  ·  v3.1.0  ·  MIT open source

Code
offline.
Own everything.

A terminal AI coding assistant that runs 100% local with Ollama — or connect Claude, OpenAI, or Groq. 30+ slash commands, a plugin system, hooks, MCP servers, and Nyx, your ASCII co-pilot.

4 Providers
30+ Commands
100% Free w/ Ollama
MIT License
localcode — ~/projects/api
100% offline with Ollama
Multi-provider — Claude · OpenAI · Groq · Ollama
30+ slash commands
Plugin system — drop .js in ~/.localcode/plugins/
Hooks — PreToolUse · PostToolUse · Notification
Memory files — .nyx.md global + project
MCP servers — stdio + HTTP
4 built-in themes
Vision input — /image with llava · GPT-4o · Claude
TF-IDF codebase search with /index
Session history — browse and restore
MIT License — fork it, own it

Zero to first commit in 60 seconds.

One npm package. No config files, no daemons, no cloud setup required.

01

Install

npm install -g @localcode/cli
Node 18+ required. That's it.

02

Choose your provider

Run localcode and pick a provider. Choose ollama for fully free, no-key-needed local AI.

03

Chat & code

Ask anything. Nyx reads files, writes patches, runs shell commands — with permission prompts before every write.

04

Ship it

Run /commit to generate a conventional commit message. Nyx co-authors every one.

Built different.

Everything Claude Code and Codex have — plus the things they'll never give you.

100% local with Ollama

Run qwen2.5-coder, deepseek-r1, llama3.2, or any Ollama model. No API key. No cloud. Your code stays on your machine.

Free forever

Switch providers live

Use /provider claude mid-session. Switch from local Ollama to GPT-4o to Claude without restarting. Keys stored per-provider.

Multi-provider

Plugin system

Drop a .js file in ~/.localcode/plugins/ to add custom slash commands. Full access to the conversation context and tools.

Extensible

Hooks

PreToolUse, PostToolUse, and Notification hooks — run custom scripts before or after any tool call, just like Claude Code.

Automation

Memory files

~/.nyx.md for global memory, .nyx.md in project root for project context. Both loaded automatically at startup.

Persistent context
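Both memory files are plain markdown that Nyx loads into context at startup. A sketch of what a project-level .nyx.md might contain — the contents are illustrative, not a required schema:

```markdown
# Project: api

- Stack: Node 18, Express, PostgreSQL
- Run tests with `npm test` (Vitest)
- Commit style: conventional commits, scope = module name
- Never edit files under `migrations/` by hand
```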

MCP servers

Connect Model Context Protocol servers — both stdio and HTTP transports supported. Give the AI access to your databases, APIs, and tools.

Extensible

Vision input

/image path/to/screenshot.png — works with Claude, GPT-4o, and Ollama llava. Describe UI, debug layouts, analyze diagrams.

Multimodal

TF-IDF codebase search

/index builds a full semantic index of your project. Ask Nyx to find relevant files without passing your entire codebase as context.

Smart context

Permission prompts

Every file write, patch, or shell command asks first. Press y, n, or a (allow all). Full control, always.

Safety

Your model,
your choice.

Switch live with /provider <name>. API keys are stored per-provider and auto-loaded from env.

Ollama — qwen2.5-coder · llama3.2 · deepseek-r1 · mistral · gemma3 · llava. Free · fully local · no API key. Runs 100% on your machine; requires Ollama running locally.
Claude — claude-sonnet-4-5 · claude-opus-4-5 · claude-haiku-4-5. Key: ANTHROPIC_API_KEY (console.anthropic.com)
OpenAI — gpt-4o · gpt-4o-mini · o1-mini · o3-mini. Key: OPENAI_API_KEY (platform.openai.com/api-keys)
Groq — llama-3.3-70b · mixtral-8x7b · gemma2-9b. Free tier available. Keys: console.groq.com/keys

30+ slash commands.
All in the terminal.

Type / to open the interactive picker. Every command has usage hints and tab completion.

/clear — Clear conversation history. Checkpoints are preserved. (session)
/compact — Summarize and compress the conversation to save context window. (session)
/status — Show provider, model, key status, token count, and checkpoints. (session)
/checkpoint — Save a named snapshot. Usage: /checkpoint before-refactor (session)
/restore — List or restore a saved checkpoint by ID. (session)
/history — Browse and restore past sessions interactively. (session)
/theme — Switch UI theme: dark · nord · monokai · light (session)
/share — Export session as shareable markdown or HTML snippet. (session)
/ping — Test connection to the current provider. Shows latency. (session)
/exit — Save session state and quit the application. (session)
/theme
Four built-in themes: dark (default), nord, monokai, and light. Instant toggle, persisted across sessions.
/theme nord
Theme set to nord
/history
Browse your past sessions by date and topic. Select any session to restore its context — files, messages, and checkpoints.
/commit — Generate a conventional commit message from the staged diff. Nyx co-authors. (git)
/git — Run git commands with AI assistance. /git status, /git log, etc. (git)
/diff — Show all files modified by tool calls in this session. (git)
/review — AI code review of a file or selection with actionable feedback. (code)
/test — Generate unit tests. Auto-detects Jest, Vitest, Mocha, pytest, etc. (code)
/explain — Explain a file, function, or concept in plain English. (code)
/watch — Watch a file for changes and auto-explain diffs in real time. (code)
/commit — AI-generated commits
Stage your changes, run /commit, and Nyx reads the diff and writes a conventional commit message. She's automatically added as co-author.
/commit
Reading staged diff…
feat(auth): add refresh token rotation
Co-authored-by: Nyx <nyx@thealxlabs.ca>
/context — Inject a file or folder into the conversation. Usage: /context src/ (context)
/index — Build a TF-IDF search index of your codebase. Enables smart context retrieval. (context)
/web — Fetch a URL and inject the page content as context. (context)
/image — Send an image to the model. Works with Claude, GPT-4o, and Ollama llava. (context)
/template — Apply a saved prompt template. /template init, /template review, etc. (context)
/alias — Create custom command aliases. /alias deploy="run npm run deploy" (context)
/index — TF-IDF codebase search
Build a full semantic index of your project. Instead of injecting thousands of lines of context, Nyx finds the 5-10 most relevant files automatically.
/index
Indexed 847 files in 1.2s
TF-IDF index ready. Ask me anything.
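Under the hood, TF-IDF ranking scores each file by how often it contains the query's terms, discounted by how common those terms are across the whole project. A minimal sketch of the technique — not LocalCode's actual implementation:

```javascript
// Minimal TF-IDF ranking sketch. Each document is { path, text }.
function tokenize(text) {
  return text.toLowerCase().match(/[a-z0-9_]+/g) || [];
}

function buildIndex(docs) {
  const df = new Map(); // document frequency: how many docs contain each term
  const tfs = docs.map(({ path, text }) => {
    const tf = new Map(); // term frequency within this doc
    for (const t of tokenize(text)) tf.set(t, (tf.get(t) || 0) + 1);
    for (const t of tf.keys()) df.set(t, (df.get(t) || 0) + 1);
    return { path, tf };
  });
  const n = docs.length;
  // Rare terms get a higher inverse-document-frequency weight.
  return { tfs, idf: (t) => Math.log(1 + n / (1 + (df.get(t) || 0))) };
}

function search(index, query, k = 5) {
  const qTerms = tokenize(query);
  return index.tfs
    .map(({ path, tf }) => ({
      path,
      score: qTerms.reduce((s, t) => s + (tf.get(t) || 0) * index.idf(t), 0),
    }))
    .filter((r) => r.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, k);
}
```

The payoff is exactly what the demo above shows: instead of shipping the whole repo as context, only the top-k scoring files are injected.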
/provider — List or switch providers. Usage: /provider ollama (provider)
/model — Switch model for the active provider. Usage: /model qwen2.5-coder:7b (provider)
/apikey — Set or update the API key for the current provider. (provider)
/ping — Test provider connection and show latency. (provider)
/doctor — Run diagnostics: check Ollama, API keys, Node version, and config. (provider)
/init — Re-run the setup wizard. Pick provider, model, and configure keys. (provider)
/provider — live switching
Switch between any provider mid-session without losing context. Your API keys are stored per-provider and reloaded automatically.
/provider claude
Switched to claude-sonnet-4-5
/provider ollama
Switched to qwen2.5-coder:7b (local)
/allowall — Toggle permission prompts for all tool calls this session. (tools)
/template — Manage and apply saved prompt templates. (tools)
/alias — Create or list command aliases for common tasks. (tools)
/hook — Manage PreToolUse, PostToolUse, and Notification hooks. (tools)
/mcp — Connect or list MCP (Model Context Protocol) servers. (tools)
/memory — View or edit the .nyx.md memory files (global and project). (tools)
/hook — automation
Run custom scripts before or after any tool call. Wire up linters, formatters, notifications, or anything else you want Nyx to trigger automatically.
# ~/.localcode/hooks/post-write.sh
npm run lint -- --fix "$FILE"
git add "$FILE"

Extend Nyx with
your own commands.

Drop a .js file in ~/.localcode/plugins/ and it becomes a first-class slash command.

// ~/.localcode/plugins/deploy.js
module.exports = {
  name: 'deploy',
  description: 'Deploy to production',
  async run({ context, tools }) {
    // Full access to AI context + tools
    const review = await context.ask(
      'Check for security issues'
    );
    if (review.safe) {
      await tools.shell('npm run deploy');
    }
  }
};

Zero boilerplate

Name the file, export an object with name, description, and run. Nyx hot-reloads it on next command.

Full AI access

Plugins get the full conversation context, can ask the AI questions, and call any of LocalCode's built-in tools.

Share and publish

Package your plugin as an npm module. Anyone can install it with npm install -g localcode-plugin-deploy.

Hooks too

Plugins can register PreToolUse and PostToolUse hooks that fire automatically on every tool call.

The one they don't
want you to have.

Claude Code is great — but it locks you in. LocalCode gives you everything they have, plus the things that matter for serious developers.

Feature | LocalCode | Claude Code | Codex CLI
Fully local / offline mode | ✓ Free with Ollama | ✗ Always cloud | ✗ Always cloud
Multi-provider support | ✓ 4 providers | ✗ Claude only | ✗ OpenAI only
Switch provider live | ✓ /provider cmd | |
Open source (MIT) | ✓ Full source | ✗ Proprietary | ~ Partial
Plugin system | ✓ Drop .js files | |
Hooks (Pre/PostToolUse) | ✓ | ✓ |
Memory files (.nyx.md) | ✓ Global + project | ✓ CLAUDE.md |
MCP server support | ✓ stdio + HTTP | |
Vision / image input | ✓ /image cmd | |
TF-IDF codebase search | ✓ /index cmd | |
Session history / restore | ✓ /history cmd | |
Free to use | ✓ With Ollama | ✗ Subscription | ~ Free tier

Last updated March 2026. Subject to change as tools evolve.

 /\_/\
( ·.· )
 > ♥ <

Meet Nyx.

Nyx is LocalCode's ASCII cat mascot. She lives in the header of every session, reacting in real time to what's happening — thinking when the model is working, lighting up green on success, turning red on errors.

She's not just decoration. Every git commit made through /commit carries her co-author signature — so she shows up as a real contributor on your GitHub repos.

Co-authored-by: Nyx <nyx@thealxlabs.ca>
Mood states
idle · thinking · happy · error · waiting
Co-author email
nyx@thealxlabs.ca
Appears in
Header · Commits · Errors · Setup wizard
Created by
TheAlxLabs · Toronto

Ship it to your
terminal now.

Node 18 or later required. Ollama is optional — but recommended for free local AI.

Global install (recommended)
npm install -g @localcode/cli
Installs the localcode command globally.
No install needed
npx @localcode/cli
Runs directly via npx. No global install required.
Then launch:
localcode

First run → setup wizard → pick provider → start coding.

Node.js 18+ · macOS · Linux · WSL · MIT License · TypeScript source · 0 telemetry