Run Ollama using podman with amdgpu on Ubuntu 24.04
I had some free time this afternoon, so I decided to try running Ollama with podman. Ollama provides a Docker image on Docker Hub at https://hub.docker.com/r/ollama/ollama, so I launched a container following the instructions on the Docker Hub page.
```
podman run -d -v ollama:/root/.ollama -p 11434:11434 \
  --name ollama ollama/ollama
```

My laptop (a ThinkPad T14 Gen 3 AMD) has an AMD GPU, so I reran Ollama with GPU support.
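The Ollama Docker Hub page documents an AMD/ROCm variant that needs the GPU device nodes passed into the container and the `rocm` image tag. As a rough sketch (not necessarily my exact invocation), a GPU-enabled relaunch looks like this:

```
# Remove the CPU-only container first, then relaunch with the AMD GPU
# device nodes (/dev/kfd and /dev/dri) passed through and the ROCm image.
podman stop ollama && podman rm ollama
podman run -d --device /dev/kfd --device /dev/dri \
  -v ollama:/root/.ollama -p 11434:11434 \
  --name ollama ollama/ollama:rocm
```

After the container starts, `podman logs ollama` should show whether the GPU was detected.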