Upgrading Ollama for Copilot Support: The Step You Probably Miss
Ollama recently added a built-in Copilot integration — but only from a certain version onwards. If your installation is older, you’ll get a confusing error before you even get started. Here’s exactly what happened when I upgraded, including the one gotcha that tripped me up along the way.
The Starting Point: An Unsupported Version
I wanted to launch Ollama’s new Copilot feature with the kimi-k2.5:cloud model. The command looked straightforward:
```
slomm@wsl01:~$ ollama launch copilot --model kimi-k2.5:cloud
Error: unknown integration: copilot
```

The error made it clear: my version of Ollama simply didn’t know what copilot was. A quick version check confirmed the problem:
```
slomm@wsl01:~$ ollama --version
ollama version is 0.17.7
```

Version 0.17.7 — too old for Copilot. Time to upgrade.
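If you’d rather not eyeball version strings, `sort -V` can do the comparison for you. A minimal sketch, with both versions hard-coded from this post for illustration; in practice you’d capture the first one from `ollama --version`:

```shell
# Compare the installed Ollama version against a known-good reference.
# Hard-coded here for illustration; in practice capture it with:
#   current=$(ollama --version | grep -oE '[0-9]+(\.[0-9]+)+')
current="0.17.7"
reference="0.21.0"

# sort -V orders version strings numerically; if the smaller of the two
# is `current` (and they differ), the installed version is older.
oldest=$(printf '%s\n%s\n' "$current" "$reference" | sort -V | head -n1)
if [ "$current" != "$reference" ] && [ "$oldest" = "$current" ]; then
  echo "upgrade needed ($current < $reference)"   # → upgrade needed (0.17.7 < 0.21.0)
else
  echo "up to date"
fi
```

Note that plain string comparison would get this wrong (`"0.9.0" > "0.17.7"` lexically), which is why `sort -V` is worth the extra line.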
Running the Upgrade
Ollama’s official upgrade method is a one-liner install script. The first attempt had a small typo — a stray period after sh — which Linux helpfully caught:
```
slomm@wsl01:~$ curl -fsSL https://ollama.com/install.sh | sh.
Command 'sh.' not found, did you mean:
  command 'shc' from deb shc (4.0.3-1)
  command 'sh' from deb dash (0.5.12-6ubuntu1)
```

Easy fix — remove the dot. The corrected command ran successfully:
```
slomm@wsl01:~$ curl -fsSL https://ollama.com/install.sh | sh
>>> Cleaning up old version at /usr/local/lib/ollama
>>> Installing ollama to /usr/local
>>> Downloading ollama-linux-amd64.tar.zst
######################################################################## 100.0%
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
>>> Nvidia GPU detected.
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
```

The installer handled everything: removed the old version, downloaded the new binary, configured user groups, set up the systemd service, and detected the Nvidia GPU. Clean and automatic.
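The installer’s last lines name the local API endpoint. If you want to confirm the service is actually answering, a quick probe works; this is a sketch using the default address the installer reported, and it degrades to a placeholder instead of hanging or failing when nothing is listening:

```shell
# Probe the local Ollama API reported by the installer (127.0.0.1:11434).
# --max-time keeps curl from hanging; the fallback handles the case where
# no server is up (or curl itself is missing) without aborting the script.
resp=$(curl -s --max-time 2 http://127.0.0.1:11434/ || echo "unreachable")
echo "API says: $resp"
```

On my setup a running daemon answers the root endpoint with a short status line, so anything other than `unreachable` here means the service is alive.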
The Gotcha: The Service Needs a Restart
Even though the installer said it was done, there’s an important step that’s easy to miss. The running Ollama service was still the old version in memory. Without an explicit restart, the client and service versions would be out of sync, causing warnings and unexpected behaviour.
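You can make the mismatch visible: the CLI binary on disk is already new, while the still-running daemon reports its own (old) version over the API. A sketch, assuming the default endpoint and Ollama’s `/api/version` route; both lookups fall back gracefully when the CLI or server is absent:

```shell
# Client version: parsed from the freshly installed binary on disk.
# Server version: reported by the still-running daemon over the API.
client=$(ollama --version 2>/dev/null | grep -oE '[0-9]+(\.[0-9]+)+' | head -n1)
server=$(curl -s --max-time 2 http://127.0.0.1:11434/api/version || echo "")

echo "client: ${client:-not installed}"
echo "server: ${server:-not reachable}"
```

If the two versions differ, the daemon is still the pre-upgrade binary and needs the restart below.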
The fix is a single command:
```
slomm@wsl01:~$ sudo systemctl restart ollama
```

After that, the version check confirmed the upgrade had fully taken effect:
```
slomm@wsl01:~$ ollama --version
ollama version is 0.21.0
```

From 0.17.7 to 0.21.0 — and now, with the service properly restarted, the Copilot integration was available.
Launching Copilot
With the upgrade complete and the service restarted, the original command worked as expected:
```
slomm@wsl01:~$ ollama launch copilot --model kimi-k2.5:cloud
```

No error. Copilot up and running with kimi-k2.5:cloud.
Step-by-Step Summary
If you’re in the same situation, here’s the full process condensed:
1. Check your current version: `ollama --version`
2. Run the upgrade: `curl -fsSL https://ollama.com/install.sh | sh`
3. Restart the system service: `sudo systemctl restart ollama`
4. Verify the new version: `ollama --version`
5. Launch Copilot: `ollama launch copilot --model <your-model>`
The service restart in step 3 is the critical one. The installer does not automatically unload the old running daemon — you have to do that manually. Skip it and you’ll get version mismatch warnings that can be confusing to debug.
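The whole sequence can be condensed into one defensive script. This is a sketch and a dry run only: it decides whether an upgrade is needed and prints the commands it would run rather than executing them, and the 0.21.0 threshold is an assumption based on the version that worked for me:

```shell
#!/bin/sh
# Dry-run sketch of the upgrade flow. It only PRINTS the commands to run;
# execute them yourself once reviewed. MIN is an assumption based on the
# version that worked in this post, not a documented minimum.
MIN="0.21.0"

# Detect the installed version, if any.
if command -v ollama >/dev/null 2>&1; then
  current=$(ollama --version | grep -oE '[0-9]+(\.[0-9]+)+' | head -n1)
else
  current="none"
fi

plan=""
if [ "$current" = "none" ] || \
   [ "$(printf '%s\n%s\n' "$current" "$MIN" | sort -V | head -n1)" != "$MIN" ]; then
  plan="curl -fsSL https://ollama.com/install.sh | sh
sudo systemctl restart ollama
ollama --version"
fi

if [ -z "$plan" ]; then
  echo "ollama $current is recent enough; nothing to do"
else
  echo "would run:"
  printf '%s\n' "$plan"
fi
```

The `sort -V` test mirrors the manual version check from earlier: if the smaller of `current` and `MIN` is not `MIN`, the installed version is older and all three commands (including the easy-to-forget restart) go into the plan.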
A Note on Shell Environment
Depending on your setup, you may also need to reload your shell environment after the upgrade — particularly if Ollama’s path or any environment variables changed. You can do this by either opening a new terminal session or running:
```
source ~/.bashrc
```

or, if you use Zsh:

```
source ~/.zshrc
```

Tested on Ubuntu Linux with an Nvidia GPU. The install script detects your hardware automatically — AMD and CPU-only setups are also supported. See https://techno.slomka.biz/?post=1121