Installation
The recommended path is the one-line bundle installer. It downloads a prebuilt runtime bundle, unpacks it under ~/.local/share/grclanker, and links the launcher into ~/.local/bin.
If you just want the shortest install-to-first-run path, use the Quick Start. This page is the full reference for installers, pinned versions, skills-only installs, and fallback paths.
0.0.1 is primarily tested on macOS and Linux. Windows support is best-effort right now and is not a priority for this first experimental release. If you want the least-friction path, use macOS or Linux.
One-line installer (recommended)
On macOS or Linux:
curl -fsSL https://grclanker.com/install | bash
On Windows PowerShell (best effort):
powershell -ExecutionPolicy Bypass -c "irm https://grclanker.com/install.ps1 | iex"
After install, run:
grclanker setup
That setup step is where you choose the local-first or hosted path. grclanker no longer expects you to guess how model configuration is supposed to work.
Configure grclanker for local-only use
The main thing people miss is that install and model setup are separate on purpose.
If you want grclanker to stay local-first instead of silently drifting to a hosted provider, do this immediately after install:
ollama serve
ollama pull gemma4
grclanker setup
Choose local-first when prompted.
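Before running setup, you can sanity-check that Ollama is actually serving. A minimal reachability probe, assuming the default endpoint from this page (the /v1/models path is Ollama's OpenAI-compatible model listing, used here only to confirm the server answers):

```shell
# Probe the local Ollama endpoint before 'grclanker setup'.
# --max-time keeps the check fast; -f makes HTTP errors count as failure.
if curl -fsS --max-time 2 http://localhost:11434/v1/models >/dev/null 2>&1; then
  echo "ollama reachable"
else
  echo "ollama not reachable: start it with 'ollama serve'"
fi
```

If this prints the "not reachable" message, fix that first; setup will stop on the same condition.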
The default documented local path in 0.0.1 is:
- endpoint: http://localhost:11434/v1
- provider kind: ollama
- model example: gemma4
If the endpoint is unreachable, setup stops and tells you exactly what to fix. It does not silently fall back to GPT or another hosted model.
What the installer does
- Detects the host OS and architecture.
- Downloads the matching release bundle.
- Installs the runtime under ~/.local/share/grclanker.
- Links grclanker into ~/.local/bin.
- Keeps runtime state under ~/.grclanker/agent.
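The launcher link is only useful if ~/.local/bin is on your PATH. A quick, portable check (a sketch; the directory name comes from the installer steps above):

```shell
# Check whether the launcher directory is already on PATH.
case ":$PATH:" in
  *":$HOME/.local/bin:"*)
    echo "~/.local/bin already on PATH" ;;
  *)
    echo 'add to your shell rc: export PATH="$HOME/.local/bin:$PATH"' ;;
esac
```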
If you previously installed grclanker through npm or bun, your shell may still resolve the old global binary first. Run which -a grclanker to see every match on your PATH, clear the shell's lookup cache with hash -r, or launch the standalone shim directly from ~/.local/bin/grclanker.
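The stale-binary check above as a copy-pasteable sequence:

```shell
# List every grclanker on PATH; the first match is the one your shell runs.
which -a grclanker || true   # '|| true' keeps going if nothing is installed yet
# Forget cached command locations in the current shell.
hash -r
# Or bypass PATH entirely and run the standalone shim.
# ~/.local/bin/grclanker --help
```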
Skills only
If you only want the Codex/CLAUDE-style skills and not the full terminal runtime:
User-scoped install:
curl -fsSL https://grclanker.com/install-skills | bash
Repo-local install:
curl -fsSL https://grclanker.com/install-skills | bash -s -- --repo
Windows PowerShell (best effort):
powershell -ExecutionPolicy Bypass -c "irm https://grclanker.com/install-skills.ps1 | iex"
These installers download only the skills/ tree. They do not install the runtime bundle, the terminal UI, or the state under ~/.grclanker/agent.
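To confirm a repo-local skills install landed, a quick check from the repo root (this assumes the skills land in a top-level skills/ directory, per the description above):

```shell
# Verify a repo-local skills install: the installer should have created ./skills/.
if [ -d skills ]; then
  find skills -type f | head    # list a few installed skill files
else
  echo "skills/ not present; rerun the installer with --repo from the repo root"
fi
```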
Pinned versions
Pin the experimental release explicitly when you want a specific bundle:
curl -fsSL https://grclanker.com/install | bash -s -- 0.0.1
Windows PowerShell (best effort):
powershell -ExecutionPolicy Bypass -c "& ([scriptblock]::Create((irm https://grclanker.com/install.ps1))) -Version 0.0.1"
Package-manager fallback
Use this path only when you already manage your own Node runtime and do not need the bundled installer:
npm install -g @grclanker/cli
bun install -g @grclanker/cli
The package-manager installs do not solve runtime Node requirements for you. The bundle installer does.
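Since this path assumes you manage Node yourself, confirm a runtime is actually on PATH before installing (a minimal check; the required Node version is not stated on this page):

```shell
# The npm/bun fallback needs an existing Node runtime; confirm one is present.
if command -v node >/dev/null 2>&1; then
  node --version
else
  echo "no node on PATH: use the bundle installer instead"
fi
```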
Verify the install
grclanker --help
grclanker setup
If the help output appears and setup starts, the install is healthy.
Troubleshooting
- If grclanker resolves to an older global install, run which -a grclanker and hash -r.
- If local-first setup fails, check that Ollama is actually serving on http://localhost:11434/v1.
- If gemma4 is missing, run ollama pull gemma4 and rerun grclanker setup.
- If you do not want local-first, rerun grclanker setup and choose hosted explicitly.
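The first two checks can be run in one go; a small diagnostic sketch, assuming the default endpoint and paths from this page (adjust if you changed them during setup):

```shell
# One-shot diagnostic: which grclanker wins on PATH, and is Ollama serving?
echo "grclanker binaries on PATH:"
which -a grclanker || echo "  none found"
echo "ollama endpoint:"
curl -fsS --max-time 2 http://localhost:11434/v1/models >/dev/null 2>&1 \
  && echo "  serving on :11434" || echo "  not reachable (run 'ollama serve')"
```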
If you are on Windows and something is weird, that is not surprising in 0.0.1. The recommended path for now is macOS or Linux.