Overview
How Appstrate integrates with providers, LLMs, personal agents, and coding agents.
Appstrate integrates in two directions.
Inbound: plug things into Appstrate. Providers (Gmail, Slack, Notion, and 57 others) let agents act on your end-users' connected accounts without exposing the underlying credentials. LLMs run via BYOK: you bring an Anthropic, OpenAI, Google, Mistral, Groq, OpenRouter, xAI, Cerebras, Azure OpenAI, Google Vertex, or AWS Bedrock key, or point at any OpenAI-compatible endpoint, and every agent uses the keys you supply.
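The "any OpenAI-compatible endpoint" option relies on the standard chat-completions wire format. As a minimal sketch of that format (the base URL, key, and model name below are placeholders, not Appstrate configuration; the actual setup lives on the LLM Models page):

```python
import json

def chat_completion_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build the URL, headers, and JSON body for an OpenAI-compatible
    /chat/completions call. Any endpoint that speaks this format, hosted
    or self-hosted, can be plugged in as a custom model."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # BYOK: your key, sent per request
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

# Example with placeholder values:
url, headers, body = chat_completion_request(
    "https://llm.internal.example/v1", "sk-example", "my-model", "ping")
```

Anything that produces this request shape, from a vendor API to a local inference server, satisfies the "OpenAI-compatible" requirement.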
Outbound: plug Appstrate into other systems. Personal agents like OpenClaw or Hermes can use Appstrate as an execution backend: they reason locally, then call the Appstrate REST API when they need multi-tenant runs, credential-hidden tool calls, or team-scale parallelism. IDE-based coding agents (Cursor, Claude Code) can drive Appstrate the same way.
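A sketch of the brain-and-arms split: the local agent decides what to do, then hands execution to Appstrate over HTTP. The endpoint path, payload fields, and auth scheme below are illustrative assumptions, not the documented API surface:

```python
import json
import urllib.request

APPSTRATE_URL = "https://api.appstrate.example"  # placeholder base URL
API_KEY = "as-example-key"                       # placeholder API key

def build_run_request(agent_id: str, task: str, tenant: str) -> urllib.request.Request:
    """Prepare (but do not send) a hypothetical 'start a run' call.
    Consult the API reference for the real paths and field names."""
    payload = {
        "agent": agent_id,  # which Appstrate agent executes the task
        "input": task,      # what the local "brain" decided to delegate
        "tenant": tenant,   # multi-tenant isolation: runs scoped per end-user
    }
    return urllib.request.Request(
        f"{APPSTRATE_URL}/v1/runs",  # hypothetical endpoint
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {API_KEY}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# The local agent reasons, then delegates:
req = build_run_request("inbox-triage", "summarize unread email", "user-42")
# urllib.request.urlopen(req) would execute it against a live deployment
```

The point of the pattern: the personal agent never holds end-user credentials or manages isolation itself; it only describes work and lets Appstrate execute it.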
None of these integrations ship as SDKs. They are patterns over the REST API, documented below so you know what to build.
Inbound
| Page | What it covers |
|---|---|
| Built-in Providers | The 60 providers we ship: OAuth flows, API-key flows, per-provider setup pages |
| Custom Providers | Ship your own provider: REST API path and AFPS package path, manifest shapes for each auth mode |
| LLM Models | BYOK Anthropic, OpenAI, Google, Mistral, Groq, OpenRouter, xAI, Cerebras, Azure, Bedrock, and custom endpoints |
Outbound
| Page | Pattern |
|---|---|
| Personal Agents | Your personal agent is the brain; Appstrate is the arms: multi-tenant, isolated, credentialed execution |
| Coding Agents | Drive Appstrate from Cursor, Claude Code, or any agent with HTTP access |