Install Ollama under Win11 & WSL - CUDA Installation guide
From a CMD prompt, verify that WSL2 is installed
wsl --list --verbose
or
wsl -l -v
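If WSL2 is set up, the output lists each distribution with 2 in the VERSION column, roughly like this (distribution names and state will differ on your machine):
  NAME      STATE           VERSION
* Ubuntu    Running         2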
Git clone the CUDA samples - I used d:\LLM\Ollama as the location, so I can find the samples easily
d: && cd d:\LLM\Ollama
git clone --recursive -j6 https://github.com/NVIDIA/cuda-samples.git
CUDA 12.3 was previously installed on Win11, but not under WSL. So, check whether the Linux instance recognizes the GPU.
wsl --user root -d ubuntu
nvidia-smi
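nvidia-smi inside WSL talks to the Windows NVIDIA driver, so it should already list the GPU at this point, before any Linux-side CUDA install. If it is missing, a quick sanity check (this path is standard on current WSL2 GPU setups, but treat it as an assumption) is to confirm the driver libraries are passed through:
ls /usr/lib/wsl/lib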
Then, install CUDA for WSL
wget https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/7fa2af80.pub
sudo apt-key add 7fa2af80.pub
sudo sh -c 'echo "deb https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/ /" > /etc/apt/sources.list.d/cuda.list'
sudo apt-get update
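The commands above only register NVIDIA's WSL repository; the toolkit itself still has to be installed. A typical command for CUDA 12.3 from this repo is below - the metapackage name is my assumption and depends on the toolkit version you want, so adjust accordingly:
sudo apt-get -y install cuda-toolkit-12-3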
nvidia-smi
Check CUDA Toolkit Installation
nvcc --version
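If apt reports the toolkit as installed but nvcc is not found, the binaries usually live under /usr/local/cuda/bin, which is not on the PATH by default; adding it for the current shell is enough for this check (assuming the default install location):
export PATH=/usr/local/cuda/bin:$PATH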
cd /mnt/d/llm/ollama/cuda-samples
make 1_Utilities/deviceQuery
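Depending on the cuda-samples revision you cloned, the top-level make target can differ. If the command above does not build anything, a fallback that works with the Makefile-based layout of the repo (assumption: your checkout still ships per-sample Makefiles) is to build from the sample's own directory:
cd Samples/1_Utilities/deviceQuery && make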
Then, from /mnt/d/llm/ollama/cuda-samples/samples/1_Utilities/deviceQuery, execute
./deviceQuery
… deviceQuery, CUDA Driver = CUDART, CUDA Driver Version = 12.3, CUDA Runtime Version = 12.3, NumDevs = 1, Result = PASS
Result = PASS indicates that CUDA is installed correctly.
cd /mnt/c/Users/user
curl https://ollama.ai/install.sh | sh
ollama run mistral
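To confirm Ollama is running the model on the GPU rather than the CPU, watch the GPU from a second WSL terminal while a prompt is being answered; the ollama process should show memory allocated on the GPU:
watch -n 1 nvidia-smi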
Source: https://gist.github.com/nekiee13/c8ec43bce5fd75d20e38b31a613fd83d