Ollama and Open WebUI with Docker for AMD ROCm GPUs

This project provides a simple, self-contained docker-compose setup to run Ollama and Open WebUI on a machine with an AMD GPU.

It is specifically configured to provide the Ollama container with the necessary access to the host's ROCm drivers and GPU hardware.

Prerequisites

  • Docker and Docker Compose: Must be installed on your system.
  • AMD GPU: A supported AMD graphics card.
  • ROCm Drivers: The appropriate ROCm user-space drivers must be installed on the host system. This setup was tested on Fedora 43 with the rocm-hip, rocm-opencl, and rocm-smi-lib packages installed.
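
Before starting the containers, it can be worth confirming the host-side ROCm setup. A minimal sketch, assuming the rocm-smi command is available from the rocm-smi-lib package mentioned above:

    # Check that the ROCm tooling can see the GPU
    rocm-smi

    # Check that the device nodes the containers will need exist on the host
    ls -l /dev/kfd /dev/dri

    # Check which groups own them (typically video and render)
    getent group video render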

Getting Started

  1. Clone the Repository:

    git clone <repository-url>
    cd ollama-rocm-webui-docker

  2. Start the Services: Run the following command from within the project directory:

    docker compose up -d

    This will pull the official Ollama (ROCm version) and Open WebUI images and start both services in the background.

  3. Access the WebUI: Open your web browser and navigate to: http://localhost:3000

    The Open WebUI should be running and connected to the backend Ollama service. You can start pulling models and chatting.
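
Models can also be pulled from the command line using the Ollama CLI inside the running container. A quick sketch, assuming the service name ollama from docker-compose.yml and using llama3.2 purely as an example model name:

    # Pull a model via the Ollama CLI inside the container
    docker compose exec ollama ollama pull llama3.2

    # List the models that are now available
    docker compose exec ollama ollama list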

How It Works

This project consists of two main services defined in docker-compose.yml:

  • ollama: Runs the official ollama/ollama:rocm image.

    • devices: Passes the AMD GPU devices (/dev/kfd and /dev/dri) into the container.
    • group_add: Adds the container's user to the host's video (GID 39) and render (GID 105) groups, which grants access to the GPU device nodes. These GIDs may differ on other distributions.
    • volumes: Persists the Ollama models and data to a local ./ollama directory.
  • open-webui: Runs the official ghcr.io/open-webui/open-webui image.

    • ports: Exposes the web interface on port 3000 of the host machine.
    • environment: Sets OLLAMA_BASE_URL to point to the ollama service, allowing the two containers to communicate.
    • volumes: Persists the Open WebUI data and configuration to a local ./open-webui directory.
    • depends_on: Ensures that the ollama service is started before the open-webui service.
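
To verify that the device pass-through and group settings took effect, you can inspect the running ollama container. A rough sketch, assuming the image ships a shell and basic coreutils (the exact log wording may differ):

    # Confirm the GPU device nodes are visible inside the container
    docker compose exec ollama ls -l /dev/kfd /dev/dri

    # Look for ROCm/GPU detection messages in the Ollama logs
    docker compose logs ollama | grep -iE 'rocm|gpu|amdgpu'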

Managing the Services

  • Stop the services:

    docker compose down
  • View logs:

    # View logs for both services
    docker compose logs -f

    # View logs for a specific service
    docker compose logs -f ollama
    docker compose logs -f open-webui
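  • Update to newer images: Pull fresh images and recreate the containers; data in the bind-mounted directories is kept (standard docker compose workflow):

    docker compose pull
    docker compose up -d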

Data Persistence

Model data and configuration are stored locally in the ollama and open-webui directories within this project. These directories are created automatically when you first run docker compose up.

Because these directories are specified in the .gitignore file, they will not be committed to your Git repository.
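
Since everything lives in these two directories, a simple backup is just an archive of them. A sketch (stop the services first so nothing writes to the data while it is being copied):

    # Stop the services, archive the data, then start them again
    docker compose down
    tar czf ollama-webui-backup.tar.gz ollama open-webui
    docker compose up -d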
