Run Appstrate anywhere. Even offline.
Apache 2.0. Up with Docker Compose in 2 minutes. Progressive infrastructure: start with zero dependencies.
The whole platform, on your hardware, under your rules.
Regulated industries, air-gapped environments, sovereign deployments — many teams can't use cloud AI infrastructure. Period.
Appstrate runs on your servers. BYOM via Ollama/llama.cpp/vLLM. Zero external calls when configured offline. Apache 2.0, no feature gating.
Tier 0 to Tier 3. Scale up as you grow.
Start with PGlite + filesystem storage (no dependencies). Add PostgreSQL, then Redis, then S3 + Docker when you need them.
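One way the tier progression could look in practice, assuming (hypothetically) that tiers are selected by which backing-service URLs are present in the environment; the variable names below mirror the compose file but are not confirmed beyond it:

```shell
# Tier 0 (sketch): no external services configured, so the platform would
# fall back to PGlite + filesystem storage. Zero dependencies beyond the image.
docker run -p 3000:3000 ghcr.io/appstrate/appstrate:latest

# Tier 1 (sketch): add a real PostgreSQL by supplying DATABASE_URL.
docker run -p 3000:3000 \
  -e DATABASE_URL=postgres://appstrate@db.internal/appstrate \
  ghcr.io/appstrate/appstrate:latest
```

Redis, S3, and full Docker orchestration would layer on the same way, one environment variable at a time.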
# compose.yaml
services:
  appstrate:
    image: ghcr.io/appstrate/appstrate:latest
    ports: ["3000:3000"]
    environment:
      DATABASE_URL: postgres://appstrate@db/appstrate
      REDIS_URL: redis://redis:6379
    depends_on: [db, redis]
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: appstrate
      POSTGRES_DB: appstrate
      POSTGRES_HOST_AUTH_METHOD: trust  # matches the password-less DATABASE_URL above
    volumes: ["db:/var/lib/postgresql/data"]
  redis:
    image: redis:7-alpine
volumes:
  db: {}

What makes it work.
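With the compose file above saved as compose.yaml, the stack comes up with one command (standard Docker Compose; the health-check URL assumes the app answers on its published port):

```shell
docker compose up -d             # start app, Postgres, and Redis in the background
docker compose ps                # verify all three services are running
curl -s http://localhost:3000/   # app should answer on the published port
```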
Docker Compose
One file. prod profile. Boots the full stack.
Progressive infra
No Redis until you need it. Same for Docker, S3, and PostgreSQL.
Air-gap capable
BYOM (Ollama/llama.cpp). Zero external calls when configured.
Apache 2.0
Full source. No feature gating. Fork, modify, redistribute.
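One way to sanity-check the air-gapped posture with a local Ollama daemon. Ollama's default port (11434) and its /api/tags endpoint are real; the OLLAMA_BASE_URL variable name is a hypothetical Appstrate setting, not confirmed by the source:

```shell
# Point inference at the local Ollama daemon so no traffic leaves the host
# (variable name is an assumption for illustration).
export OLLAMA_BASE_URL=http://localhost:11434

# List the models already pulled locally -- works with no internet access.
curl -s http://localhost:11434/api/tags
```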
Works great with
Your servers. Your models. Your rules.
docker compose up. First run in 2 minutes. Zero vendor calls.