A minimal demo that adds a Node proxy and a static front‑end to an already‑running OpenWebUI instance.
The proxy validates a shared secret, rate‑limits requests, and streams the OpenWebUI response back to the browser.
- backend – Express proxy (`backend/Dockerfile`) → image `openwebui-proxy`
- frontend – Nginx serving the widget (`frontend/Dockerfile`) → image `openwebui-frontend`
- OpenWebUI – must be running separately (e.g. `docker run -p 3000:8080 ghcr.io/open-webui/open-webui:latest`)
- Docker installed and daemon running.
- An OpenWebUI container reachable at `http://localhost:3000/v1` (or another host/port you know).
- You want the demo reachable on host ports 3001 (proxy) and 3002 (frontend).
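The prerequisites above can be sketched as a quick pre-flight script. The port numbers are this demo's defaults; the `/health` path on OpenWebUI is an assumption — adjust the URL if your instance differs:

```shell
# Pre-flight check using the default ports assumed by this demo.
PROXY_PORT=3001
FRONTEND_PORT=3002
OPENWEBUI_URL="http://localhost:3000"

# Is the Docker daemon up?
if docker info >/dev/null 2>&1; then
  echo "Docker daemon: running"
else
  echo "Docker daemon: NOT running (start Docker first)"
fi

# Is OpenWebUI reachable? (The /health path is an assumption.)
if curl -fsS --max-time 2 "$OPENWEBUI_URL/health" >/dev/null 2>&1; then
  echo "OpenWebUI: reachable at $OPENWEBUI_URL"
else
  echo "OpenWebUI: not reachable at $OPENWEBUI_URL"
fi
```

Both checks only print a status line, so the script is safe to run repeatedly.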
Save the following as `backend/.env` (next to the Dockerfile). The `PROXY_SHARED_SECRET` value must match the one you’ll pass to the front‑end container.
```env
# Base URL of the OpenWebUI API (no trailing slash)
OPENWEBUI_BASE_URL=http://host.docker.internal:3000/v1

# Optional API key for OpenWebUI (leave empty if not required)
OPENWEBUI_API_KEY=

# Default model used when the front‑end does not specify one
DEFAULT_MODEL=gpt-4o-mini

# Shared secret – must match the value passed to the front‑end
PROXY_SHARED_SECRET=demo-proxy-secret-123
```

```shell
# From the repository root (where the `oui-demo` folder lives)
cd oui-demo

# Build the proxy image
docker build -t openwebui-proxy ./backend

# Build the static front‑end image
docker build -t openwebui-frontend ./frontend
```
```shell
# 1️⃣ Proxy (exposes port 3001)
docker run -d \
  --name chat-proxy \
  --restart unless-stopped \
  -p 3001:3001 \
  --env-file "$(pwd)/backend/.env" \
  openwebui-proxy

# 2️⃣ Front‑end (exposes port 3002)
docker run -d \
  --name chat-frontend \
  --restart unless-stopped \
  -p 3002:80 \
  -e PROXY_URL=http://host.docker.internal:3001/api/chat \
  -e PROXY_SHARED_SECRET=demo-proxy-secret-123 \
  openwebui-frontend
```

Note: `PROXY_SHARED_SECRET` must be identical to the value in `backend/.env`.
```shell
# Health‑check the proxy (should return {"status":"ok"})
curl http://localhost:3001/health

# Verify the front‑end is serving static files (HTTP 200)
curl -I http://localhost:3002 | head -n 1
```

Open a browser and navigate to http://localhost:3002.
You’ll see a page with a floating chat button in the lower‑right corner. Clicking it sends a request through the front‑end → proxy → OpenWebUI and streams the response back.
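For debugging, you can bypass the widget and call the proxy directly. The `/api/chat` path comes from the front‑end's `PROXY_URL` above; the `X-Proxy-Secret` header name and the JSON body shape are illustrative assumptions about how this demo forwards the secret and messages, not a documented API:

```shell
# Hypothetical direct call to the proxy. Header name and body shape
# are assumptions for illustration - check backend/ for the real contract.
PROXY_URL="http://localhost:3001"
SECRET="demo-proxy-secret-123"

if curl -fsS --max-time 2 "$PROXY_URL/health" >/dev/null 2>&1; then
  # -N disables output buffering so streamed tokens appear as they arrive
  curl -N "$PROXY_URL/api/chat" \
    -H "Content-Type: application/json" \
    -H "X-Proxy-Secret: $SECRET" \
    -d '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"Hello"}]}'
else
  echo "proxy not reachable at $PROXY_URL - is chat-proxy running?"
fi
```

Requests without the secret header should be rejected, which is an easy way to confirm the shared‑secret check is active.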
```shell
docker rm -f chat-proxy chat-frontend

# Images stay cached; remove with `docker rmi` if you wish.
```

⚙️ Further considerations (optional)
- Shared secret – protects the proxy from random traffic. Keep it secret.
- OpenWebUI address – if OpenWebUI runs in another Docker network, replace `host.docker.internal` with the container name (e.g., `http://openwebui:8080/v1`).
- API key – set `OPENWEBUI_API_KEY` only when your OpenWebUI instance requires authentication.
- Production hardening – add TLS, proper auth (JWT/OAuth), and Docker health‑checks.
- UI tweaks – the demo disables input while streaming; you may add a spinner or cancel button for a smoother experience.
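If you'd rather run all three services on one Docker network, the manual `docker run` steps could be expressed as a Compose file. This is a sketch under assumptions – the service names and network wiring are illustrative, and no such file ships with the demo:

```yaml
# docker-compose.yml – illustrative sketch, not part of the demo repo
services:
  openwebui:
    image: ghcr.io/open-webui/open-webui:latest
    ports:
      - "3000:8080"

  proxy:
    image: openwebui-proxy
    env_file: ./backend/.env
    environment:
      # On a shared Compose network the container name replaces
      # host.docker.internal
      OPENWEBUI_BASE_URL: http://openwebui:8080/v1
    ports:
      - "3001:3001"

  frontend:
    image: openwebui-frontend
    environment:
      PROXY_URL: http://host.docker.internal:3001/api/chat
      PROXY_SHARED_SECRET: demo-proxy-secret-123
    ports:
      - "3002:80"
```

With such a file, `docker compose up -d` would start all three services on a single default network, and `docker compose down` replaces the manual cleanup step.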
This demo code is provided under the MIT License. Feel free to fork, modify, and adapt it to your own projects.