A Next.js starter template demonstrating how to build durable video AI pipelines with @mux/ai and the Vercel Workflow DevKit.
| Layer | Pattern | Example |
|---|---|---|
| 1. Primitives | Call functions directly | `getSummaryAndTags()` — instant results |
| 2. Workflows | Run durably via Vercel Workflows | `translateCaptions`, `translateAudio` — retries, progress tracking |
| 3. Connectors | Compose with external tools | Clip creation with Remotion — multi-step pipelines |
This project showcases resumable, durable workflows out of the box:
- Start a workflow (captions, dubbing, or summary).
- Refresh the page, or navigate away and back.
- You should see the workflow still running asynchronously, with status rehydrated from browser localStorage.
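The rehydration pattern above can be sketched as a pair of helpers that persist run IDs and statuses to localStorage and read them back on mount. This is an illustrative sketch, not the app's actual API — the `WorkflowRun` shape, the storage key, and the function names are assumptions; `StorageLike` mirrors the browser's `localStorage` interface so the logic is testable outside the browser.

```typescript
// Illustrative sketch: persist workflow runs so the UI can rehydrate
// status after a refresh. Shapes and names are assumptions, not the
// app's real types.
type WorkflowRun = { runId: string; kind: string; status: "running" | "done" | "failed" };

// Structural subset of the browser's localStorage interface.
interface StorageLike {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const KEY = "workflow-runs";

function loadRuns(storage: StorageLike): WorkflowRun[] {
  const raw = storage.getItem(KEY);
  return raw ? (JSON.parse(raw) as WorkflowRun[]) : [];
}

function saveRun(storage: StorageLike, run: WorkflowRun): void {
  // Replace any existing entry for this runId, then persist.
  const runs = loadRuns(storage).filter((r) => r.runId !== run.runId);
  runs.push(run);
  storage.setItem(KEY, JSON.stringify(runs));
}
```

On page load, a client component would call `loadRuns(window.localStorage)` and resume polling any run still marked `"running"`.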
```bash
npm install
npm run dev
```

Inspect workflow runs locally:

```bash
npx workflow web
```

This demo includes IP-based rate limiting to protect against excessive API costs. Limits are automatically bypassed in development mode.
| Endpoint | Limit | Window |
|---|---|---|
| `translate-audio` | 3 | 24h |
| `translate-captions` | 10 | 24h |
| `render` | 6 | 24h |
| `summary` | 10 | 24h |
| `search` | 50 | 1h |
See DOCS/RATE-LIMITS.md for implementation details and maintenance.
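The limits in the table above amount to a fixed window per IP and endpoint. A minimal sketch of that idea (the class name, storage, and method are illustrative — the project's actual implementation lives behind DOCS/RATE-LIMITS.md and may differ):

```typescript
// Illustrative fixed-window rate limiter keyed by client IP.
// Not the project's real implementation; an in-memory sketch.
type WindowState = { count: number; resetAt: number };

class FixedWindowLimiter {
  private buckets = new Map<string, WindowState>();

  constructor(private limit: number, private windowMs: number) {}

  // Returns true if the request is allowed, false if over the limit.
  allow(ip: string, now: number = Date.now()): boolean {
    const state = this.buckets.get(ip);
    if (!state || now >= state.resetAt) {
      // First request, or the previous window expired: start a new one.
      this.buckets.set(ip, { count: 1, resetAt: now + this.windowMs });
      return true;
    }
    if (state.count < this.limit) {
      state.count++;
      return true;
    }
    return false;
  }
}
```

For example, the `translate-audio` endpoint's limit would be expressed as `new FixedWindowLimiter(3, 24 * 60 * 60 * 1000)`.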
Remotion is used within this example app for composing @mux/ai with video rendering.
```bash
# Open the Remotion Studio for live preview and iteration
npm run remotion:studio

# Render a video locally (for testing)
# Pass the composition name as an argument
npm run remotion:render:local default-composition

# Optionally specify an output path
npm run remotion:render:local default-composition out/foo.mp4

# Deploy the Remotion site to AWS Lambda for serverless rendering
npm run remotion:deploy
```

Note: `remotion:deploy` bundles and deploys your Remotion site to AWS Lambda for production video rendering. This is not for development — use `remotion:studio` and `remotion:render:local` for local dev and testing.

Remotion is automatically deployed to AWS Lambda when changes to `remotion/` are merged into `main`. See DOCS/AUTOMATED-REMOTION-DEPLOYMENTS.md for details.
See AGENTS.md for the full list. At minimum you'll need:
```bash
# Mux credentials
MUX_TOKEN_ID=
MUX_TOKEN_SECRET=

# OpenAI (required for embeddings)
OPENAI_API_KEY=

# Database (PostgreSQL with pgvector) — required to store/search the Mux catalog metadata
DATABASE_URL=
```

This project stores your Mux catalog metadata in Postgres and generates pgvector embeddings for semantic search.
The database schema and migrations are managed with Drizzle (see db/schema.ts and db/migrations/), and the db:* scripts use Drizzle Kit.
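In production, pgvector ranks `video_chunks` rows by cosine distance directly in Postgres (the `<=>` operator). As an in-memory illustration of what that query computes — the `Chunk` shape here is illustrative, not the actual types in `db/schema.ts`:

```typescript
// Illustrative ranking: the same cosine-distance ordering that pgvector's
// <=> operator performs inside Postgres, done in memory for clarity.
type Chunk = { id: number; text: string; embedding: number[] };

function cosineDistance(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  // Cosine distance = 1 - cosine similarity.
  return 1 - dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the k chunks closest to the query embedding.
function topK(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosineDistance(query, x.embedding) - cosineDistance(query, y.embedding))
    .slice(0, k);
}
```

The real search embeds the user's query with OpenAI, then lets Postgres do this ordering over the stored `video_chunks` embeddings.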
Create a .env.local file (this is what both Drizzle and the import script load):
```bash
# Database (PostgreSQL + pgvector)
DATABASE_URL="postgresql://USER:PASSWORD@HOST:5432/DB_NAME"

# Mux (used by the import script)
MUX_TOKEN_ID="..."
MUX_TOKEN_SECRET="..."

# Embeddings (used by the import script)
OPENAI_API_KEY="..."
```

Your Postgres must support pgvector. The first migration will run `CREATE EXTENSION IF NOT EXISTS vector;`.
Apply the migrations in db/migrations/ (creates tables + indexes and enables pgvector):
```bash
npm run db:migrate
```

The import script fetches all ready Mux assets with playback IDs, upserts rows into `videos`, and writes embedding rows into `video_chunks`:

```bash
npm run import-mux-assets
```

To embed subtitles from a specific captions track language, pass `--language` (defaults to `en`). This should match the language of an existing captions track on the source Mux asset — it does not translate captions:

```bash
npm run import-mux-assets -- --language en
```

- `npm run db:generate`: Generates new migration files from `db/schema.ts` (use this after changing the schema).
- `npm run db:migrate`: Applies migrations to the database defined by `DATABASE_URL`.
- `npm run db:studio`: Opens Drizzle Studio to inspect tables/rows locally (also uses `DATABASE_URL`).
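Before an embedding row can be written, the captions text has to be split into pieces small enough for the embedding model. A simplified sketch of overlapping character-window chunking — the sizes and function name are illustrative; the actual script's chunking strategy may differ:

```typescript
// Simplified chunking ahead of embedding: split text into overlapping
// windows so each chunk stays within the embedding model's input budget.
// Window/overlap sizes are illustrative, not the script's real values.
function chunkText(text: string, size = 800, overlap = 100): string[] {
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size));
    // Stop once a window reaches the end of the text.
    if (start + size >= text.length) break;
  }
  return chunks;
}
```

Each chunk would then be embedded (e.g. via the OpenAI embeddings API) and stored as one `video_chunks` row alongside its source text.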
The media detail page (/media/[slug]) is organized into co-located feature folders:
app/media/[slug]/
├── media-content.tsx
├── page.tsx
├── localization/
│ ├── actions.ts (captions & audio translation)
│ ├── constants.ts
│ └── ui.tsx
├── player/
│ ├── context.ts
│ ├── provider.tsx
│ ├── ui.tsx
│ └── use-player.ts
├── social-clips/
│ ├── actions.ts (clip creation & Remotion Lambda rendering)
│ ├── constants.ts
│ ├── preview.tsx (client-side Remotion Player preview)
│ └── ui.tsx
├── summarize-and-tag/
│ ├── actions.ts (start/poll summary generation workflow)
│ └── ui.tsx
├── transcript/
│ ├── actions.ts (semantic search within video transcript)
│ ├── helpers.ts
│ └── ui.tsx
└── workflows-panel/
├── helpers.ts
└── ui.tsx (includes StatusBadge, StepProgress, etc.)
- `context/application-explained.md` — what the app does and why
- `context/design-explained.md` — visual design and UX patterns
- `context/implementation-explained.md` — routes, data model, and code patterns
- `AGENTS.md` — guidance for AI coding assistants
- `DOCS/RATE-LIMITS.md` — rate limiting configuration and maintenance
- `pgvector` — vector embeddings and similarity search for Postgres