# Running Ollama with NVIDIA GPU inside WSL (Ubuntu) – Step-by-Step Guide
Running large language models locally with GPU acceleration inside WSL2 is not only possible; it is surprisingly efficient once properly configured. This guide walks through a working setup using Ubuntu, NVIDIA GPU passthrough, and Ollama.

## 🧩 Target Setup

- Windows host with WSL2 enabled
- Ubuntu running inside WSL2
- NVIDIA GPU with a WSL-capable driver
- Ollama for local model inference

## 1. Prepare Windows Host

**Check your Windows version.** Ensure you are on a version of Windows that supports WSL2 GPU passthrough.

**Enable WSL2.** WSL2 is required; GPU passthrough does not work under WSL1.

**Install an NVIDIA driver with WSL support.** On your Windows machine, install a current NVIDIA driver that supports WSL CUDA. Verify it by running `nvidia-smi` in a Windows terminal. If this fails, stop here: GPU passthrough will not work.

## 2. Prepare Ubuntu (WSL)

Start WSL, then update the packages (`sudo apt update && sudo apt upgrade -y`).

## 3. Verify GPU inside WSL

Run `nvidia-smi` inside Ubuntu. Expected: the same GPU table you see on the Windows host, listing your GPU model and driver version.

## 4. (Optional) Install CUDA Toolkit

Install the CUDA toolkit, then verify the installation (e.g. with `nvcc --version`).

## 5. Install Ollama

⚠️ **Required dependency for Ollama.** Before…
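The verification and install flow above can be sketched as a small script run inside the Ubuntu (WSL) shell. This is a sketch, not taken verbatim from this guide: it assumes the Windows NVIDIA driver is already installed, and it uses the standard upstream Ollama install URL (`https://ollama.com/install.sh`).

```shell
#!/usr/bin/env sh
# Sketch of the setup flow inside Ubuntu on WSL2.
# Assumption: the Windows-side NVIDIA driver (with WSL CUDA support)
# is already installed; it is what exposes nvidia-smi inside WSL2.

set -e

# Step 3: verify the GPU is visible inside WSL.
if ! nvidia-smi > /dev/null 2>&1; then
    echo "GPU not visible inside WSL - fix the Windows driver first." >&2
    exit 1
fi

# Step 4 (optional): check whether the CUDA toolkit is present.
if command -v nvcc > /dev/null 2>&1; then
    nvcc --version
else
    echo "CUDA toolkit not installed (optional for Ollama)."
fi

# Step 5: install Ollama via the official install script, then confirm.
curl -fsSL https://ollama.com/install.sh | sh
ollama --version
```

If `nvidia-smi` fails at the first step, there is no point continuing: Ollama will silently fall back to CPU-only inference without a working passthrough.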