Installation Guide
Everything you need to run ContextStream on your own infrastructure — from a single Docker Compose command to a production Kubernetes cluster.
Quick Start — Docker Compose
The fastest way to get ContextStream running locally or on a single server.
ContextStream needs a PostgreSQL database with the pgvector extension. You can use Neon or Supabase for a managed option, or the bundled Postgres container in the Compose file.

```shell
# 1. Clone the repo
git clone https://github.com/jimseiwert/context-stream
cd context-stream

# 2. Copy and configure environment
cp .env.example .env
# Edit .env: set DATABASE_URL, BETTER_AUTH_SECRET, OPENAI_API_KEY

# 3. Start with Docker Compose (migrations run automatically on startup)
docker compose -f docker/docker-compose.yml up -d

# 4. Open http://localhost:3000
# The first user to register automatically becomes SUPER_ADMIN
```
That's it. The application, worker, and Postgres container all start together. Once migrations are done, visit http://localhost:3000 and register your admin account.
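To confirm everything came up, you can check container status and follow the app's logs. The service names `app` and `worker` below are assumptions; check docker/docker-compose.yml for the actual names.

```shell
# List running services and their status
docker compose -f docker/docker-compose.yml ps

# Watch the app logs until migrations finish (service name assumed)
docker compose -f docker/docker-compose.yml logs -f app
```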
Environment Variables
Copy .env.example to .env and set the values below. Required variables must be present before starting the application.
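A minimal .env sketch, using only the variables named elsewhere in this guide (DATABASE_URL, BETTER_AUTH_SECRET, OPENAI_API_KEY, NEXT_PUBLIC_APP_URL); every value is a placeholder to replace:

```shell
# Write a placeholder .env — replace every value before starting the app
cat > .env <<'EOF'
DATABASE_URL=postgresql://user:pass@localhost:5432/contextstream
BETTER_AUTH_SECRET=replace-me
OPENAI_API_KEY=sk-...
NEXT_PUBLIC_APP_URL=http://localhost:3000
EOF

# Generate a strong value for BETTER_AUTH_SECRET
openssl rand -base64 32
```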
Kubernetes / Helm
Use the bundled Helm chart for production Kubernetes deployments with ingress and auto-scaling.
```shell
# Install from the bundled chart (migrations run automatically as a pre-install Job)
helm install contextstream ./helm/contextstream \
  --set secrets.DATABASE_URL="postgresql://user:pass@host:5432/db" \
  --set secrets.BETTER_AUTH_SECRET="$(openssl rand -base64 32)" \
  --set secrets.OPENAI_API_KEY="sk-..." \
  --set env.NEXT_PUBLIC_APP_URL="https://contextstream.example.com" \
  --set ingress.enabled=true \
  --set ingress.hosts[0].host="contextstream.example.com" \
  --set migrations.strategy=job
```
Database Setup
ContextStream requires PostgreSQL 14+ with the pgvector extension for hybrid vector search.
```sql
-- Run once on your Postgres instance before migrating
CREATE EXTENSION IF NOT EXISTS vector;
```
```shell
# Run all pending migrations
npm run db:migrate

# Seed the database (creates admin user + sample source)
npm run db:seed
```
The migration runner automatically installs the vector extension if it is not present. If your database user lacks superuser privileges, install the extension manually first using the command above.
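If you want to verify the extension yourself, the standard Postgres catalog query below returns the installed pgvector version; an empty result means it is not installed.

```sql
-- One row with the version string if pgvector is installed
SELECT extversion FROM pg_extension WHERE extname = 'vector';
```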
Embedding Providers
ContextStream supports three embedding backends. The first admin configures the provider via Admin → System → Embedding Config.
The default provider is OpenAI. To switch, log in as a super admin and open the same Embedding Config screen. All three providers generate 1536-dimensional embeddings, which are stored in the pgvector column alongside BM25 keyword indexes for hybrid search.
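For intuition, a pgvector nearest-neighbor query over 1536-dimensional embeddings looks like the sketch below. The table and column names (`documents`, `embedding`) are illustrative only, not ContextStream's actual schema.

```sql
-- Illustrative only: cosine-distance search with pgvector's <=> operator.
-- In practice the query vector comes from the configured embedding provider.
SELECT id, content
FROM documents
ORDER BY embedding <=> '[0.01, -0.02, ...]'::vector
LIMIT 10;
```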
Worker Dispatch Modes
Choose how scraping and embedding jobs are dispatched. Set via the DISPATCH_MODE environment variable.
To run an external worker alongside the app container:
```shell
docker compose -f docker/docker-compose.yml --profile worker up -d
```
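Once the worker profile is up, you can tail its logs to confirm it is picking up jobs. The service name `worker` is an assumption; check the Compose file for the actual name.

```shell
# Follow the worker's logs (service name assumed)
docker compose -f docker/docker-compose.yml logs -f worker
```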
MCP Server Setup
The MCP server runs at /api/mcp on your deployment. Connect it to Claude Desktop, Cursor, Zed, or any MCP-compatible tool.
Generate your API key in the app under Settings → API Keys, then add the config block below to your AI tool. Replace your-domain.com and YOUR_API_KEY.
```json
{
  "mcpServers": {
    "contextstream": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-http-client"],
      "env": {
        "MCP_SERVER_URL": "https://your-domain.com/api/mcp",
        "MCP_AUTH_HEADER": "Authorization: Bearer YOUR_API_KEY"
      }
    }
  }
}
```

Visit /mcp in the app for pre-generated config snippets tailored to Claude Desktop, Cursor, Zed, and Windsurf.

Upgrading
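As a quick reachability check before wiring up a client, you can hit the endpoint with curl. This only verifies that the URL resolves and your key is accepted; the exact status code for a bare request depends on the MCP transport.

```shell
# Print only the HTTP status code (replace the domain and key with your own)
curl -s -o /dev/null -w "%{http_code}\n" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  https://your-domain.com/api/mcp
```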
Always run database migrations after pulling a new image or chart version.
```shell
# Pull latest images
docker compose -f docker/docker-compose.yml pull

# Restart containers (migrations run automatically on startup)
docker compose -f docker/docker-compose.yml up -d
```
```shell
# Upgrade the Helm release (migrations run automatically as a pre-upgrade Job)
helm upgrade contextstream ./helm/contextstream --reuse-values
```
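If an upgrade misbehaves, standard Helm commands let you inspect the release and roll back to an earlier revision:

```shell
# Inspect the current release and its revision history
helm status contextstream
helm history contextstream

# Roll back to a previous revision if the upgrade failed
helm rollback contextstream <REVISION>
```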
Need help?
Open a GitHub issue for bug reports, feature requests, or questions about self-hosting. Community support is available on the repository discussions tab.