📖 Floating Chat Demo (Docker)

A minimal demo that adds a Node proxy and a static front‑end to an already‑running OpenWebUI instance.
The proxy validates a shared secret, rate‑limits requests, and streams the OpenWebUI response back to the browser.


📦 What’s inside

  • backend – Express proxy (backend/Dockerfile) → image openwebui-proxy
  • frontend – Nginx serving the widget (frontend/Dockerfile) → image openwebui-frontend
  • OpenWebUI – must already be running separately (e.g. docker run -p 3000:8080 ghcr.io/open-webui/open-webui:latest; see the example below)
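
If OpenWebUI is not already running, the command below is one minimal way to start it for this demo. The port mapping matches the defaults used in the rest of this README; the container name and the data volume are illustrative choices, not requirements of this repo.

# Start OpenWebUI on host port 3000 (name and volume are example values)
docker run -d \
  --name openwebui \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:latest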

✅ Prerequisites

  • Docker installed and daemon running.
  • An OpenWebUI container reachable at http://localhost:3000/v1 (or another host/port you know).
  • Host ports 3001 (proxy) and 3002 (front‑end) free, so the demo can bind to them.
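
A quick sanity check that OpenWebUI is reachable before you build anything (the /v1/models path is the OpenAI‑compatible endpoint OpenWebUI exposes; add an Authorization: Bearer <api-key> header if your instance requires one):

# Should return a JSON list of the models OpenWebUI can serve
curl http://localhost:3000/v1/models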

🛠️ Create the proxy configuration

Save the following as backend/.env (next to the Dockerfile).
The PROXY_SHARED_SECRET value must match the one you’ll pass to the front‑end container.

# Base URL of the OpenWebUI API (no trailing slash)
OPENWEBUI_BASE_URL=http://host.docker.internal:3000/v1

# Optional API key for OpenWebUI (leave empty if not required)
OPENWEBUI_API_KEY=

# Default model used when the front‑end does not specify one
DEFAULT_MODEL=gpt-4o-mini

# Shared secret – must match the value passed to the front‑end
PROXY_SHARED_SECRET=demo-proxy-secret-123

📦 Build the Docker images

# From the repository root (where the `oui-demo` folder lives)
cd oui-demo

# Build the proxy image
docker build -t openwebui-proxy ./backend

# Build the static front‑end image
docker build -t openwebui-frontend ./frontend
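
If both builds succeed, the images should now be available locally:

# Confirm the two images exist
docker images | grep openwebui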

🚀 Run the containers

# 1️⃣ Proxy (exposes port 3001)
docker run -d \
  --name chat-proxy \
  --restart unless-stopped \
  -p 3001:3001 \
  --env-file "$(pwd)/backend/.env" \
  openwebui-proxy

# 2️⃣ Front‑end (exposes port 3002)
docker run -d \
  --name chat-frontend \
  --restart unless-stopped \
  -p 3002:80 \
  -e PROXY_URL=http://host.docker.internal:3001/api/chat \
  -e PROXY_SHARED_SECRET=demo-proxy-secret-123 \
  openwebui-frontend

Note: PROXY_SHARED_SECRET must be identical to the value in backend/.env.
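
If either container misbehaves, the logs usually explain why (for example, the proxy failing to reach OPENWEBUI_BASE_URL):

# Follow the proxy logs (Ctrl+C to stop)
docker logs -f chat-proxy

# Same for the front‑end
docker logs -f chat-frontend

On Linux, host.docker.internal does not resolve inside containers by default; either start the proxy with --add-host=host.docker.internal:host-gateway (Docker 20.10+) or point it at OpenWebUI by container name as described under "Further considerations".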


🔍 Verify the stack

# Health‑check the proxy (should return {"status":"ok"})
curl http://localhost:3001/health

# Verify the front‑end is serving static files (HTTP 200)
curl -sI http://localhost:3002 | head -n 1

Open a browser and navigate to http://localhost:3002.
You’ll see a page with a floating chat button in the lower‑right corner. Clicking it sends a request through the front‑end → proxy → OpenWebUI and streams the response back.
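
You can also exercise the proxy directly from the command line. The exact header and body field names depend on the Express code in backend/, so the request below is only a sketch: it assumes the shared secret travels in an x-proxy-secret header and the prompt in a message field; check the backend source and adjust accordingly.

# Hypothetical direct call to the proxy (header and field names are assumptions)
curl -N http://localhost:3001/api/chat \
  -H "Content-Type: application/json" \
  -H "x-proxy-secret: demo-proxy-secret-123" \
  -d '{"message": "Hello from curl"}'

The -N flag disables curl's output buffering so the streamed response appears as it arrives.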


🧹 Cleanup (optional)

docker rm -f chat-proxy chat-frontend
# Images stay cached; remove with `docker rmi` if you wish.

⚙️ Further considerations (optional)
  • Shared secret – protects the proxy from random traffic. Keep it secret.
  • OpenWebUI address – if OpenWebUI runs in another Docker network, replace host.docker.internal with the container name (e.g., http://openwebui:8080/v1); see the sketch after this list.
  • API key – set OPENWEBUI_API_KEY only when your OpenWebUI instance requires authentication.
  • Production hardening – add TLS, proper auth (JWT/OAuth), and Docker health‑checks.
  • UI tweaks – the demo disables input while streaming; you may add a spinner or cancel button for a smoother experience.
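
As an illustration of the "another Docker network" case above, one possible setup looks like this (the network and container names are examples, not part of this repo):

# Put OpenWebUI and the proxy on the same user-defined network
docker network create chatdemo-net

docker run -d --name openwebui --network chatdemo-net \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:latest

# In backend/.env, point the proxy at the container name instead of host.docker.internal:
#   OPENWEBUI_BASE_URL=http://openwebui:8080/v1

docker run -d --name chat-proxy --network chatdemo-net \
  -p 3001:3001 \
  --env-file "$(pwd)/backend/.env" \
  openwebui-proxy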

📜 License

This demo code is provided under the MIT License. Feel free to fork, modify, and adapt it to your own projects.
