This project provides a simple, self-contained Docker Compose setup for running Ollama and Open WebUI on a machine with an AMD GPU.
It is specifically configured to provide the Ollama container with the necessary access to the host's ROCm drivers and GPU hardware.
- Docker and Docker Compose: Must be installed on your system.
- AMD GPU: A supported AMD graphics card.
- ROCm Drivers: The appropriate ROCm user-space drivers must be installed on the host system. This setup was tested on Fedora 43 with the `rocm-hip`, `rocm-opencl`, and `rocm-smi-lib` packages installed.
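Before starting the containers, it can be worth confirming that the kernel driver has created the GPU device nodes this setup passes through, and that the `video` and `render` groups exist on the host. A minimal sanity check (the numeric group IDs vary between distributions; adjust `group_add` in `docker-compose.yml` if yours differ from 39 and 105):

```bash
# The ROCm/amdgpu device nodes that get passed into the container
ls -l /dev/kfd /dev/dri

# The host groups referenced by group_add, with their numeric IDs
getent group video render
```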
- Clone the Repository:

  ```bash
  git clone <repository-url>
  cd ollama-rocm-webui-docker
  ```
- Start the Services: Run the following command from within the project directory:

  ```bash
  docker compose up -d
  ```

  This will pull the official Ollama (ROCm version) and Open WebUI images and start both services in the background.
- Access the WebUI: Open your web browser and navigate to http://localhost:3000. The Open WebUI should be running and connected to the backend Ollama service, and you can start pulling models and chatting. The commands below show a quick health check and a CLI alternative for pulling models.
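Once the stack is up, a few optional commands help confirm everything is working, and models can be pulled from the command line instead of through the WebUI. The model name below is just an example; any model from the Ollama library works:

```bash
# Both services should be listed as running
docker compose ps

# Ollama's startup log should report whether a ROCm GPU was detected
docker compose logs ollama

# Pull and list models without going through the WebUI
docker compose exec ollama ollama pull llama3.2
docker compose exec ollama ollama list
```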
This project consists of two main services defined in `docker-compose.yml`; the commands after this list show how to verify the wiring from inside the containers:
- `ollama`: Runs the official `ollama/ollama:rocm` image.
  - `devices`: Passes the AMD GPU devices (`/dev/kfd` and `/dev/dri`) into the container.
  - `group_add`: Adds the container user to the `video` (39) and `render` (105) groups on the host, granting the necessary permissions.
  - `volumes`: Persists the Ollama models and data to a local `./ollama` directory.
- `open-webui`: Runs the official `ghcr.io/open-webui/open-webui` image.
  - `ports`: Exposes the web interface on port `3000` of the host machine.
  - `environment`: Sets `OLLAMA_BASE_URL` to point to the `ollama` service, allowing the two containers to communicate.
  - `volumes`: Persists the Open WebUI data and configuration to a local `./open-webui` directory.
  - `depends_on`: Ensures that the `ollama` service is started before the `open-webui` service.
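To confirm this wiring from inside the running containers, the following one-off checks should work, assuming the standard utilities (`ls`, `id`, `env`) are present in both images:

```bash
# The GPU device nodes should be visible inside the ollama container
docker compose exec ollama ls -l /dev/kfd /dev/dri

# The supplementary group IDs from group_add (39 and 105) should appear here
docker compose exec ollama id

# OLLAMA_BASE_URL should point at the ollama service
docker compose exec open-webui env | grep OLLAMA
```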
- Stop the services:

  ```bash
  docker compose down
  ```
- View logs:

  ```bash
  # View logs for both services
  docker compose logs -f

  # View logs for a specific service
  docker compose logs -f ollama
  docker compose logs -f open-webui
  ```
Model data and configuration are stored locally in the `ollama` and `open-webui` directories within this project. These directories are created automatically the first time you run `docker compose up`. Because they are listed in the `.gitignore` file, they will not be committed to your Git repository.
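Since everything lives in those two bind-mounted directories, backing up the whole setup amounts to archiving them while the services are stopped. A minimal sketch (the archive name is arbitrary):

```bash
docker compose down

# May require sudo, since the containers typically write these files as root
tar czf ollama-webui-backup.tar.gz ollama open-webui

docker compose up -d
```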