Integrations

Overview

How Appstrate integrates with providers, LLMs, personal agents, and coding agents.

Appstrate integrates in two directions.

Inbound, plug things into Appstrate. Providers (Gmail, Slack, Notion, and 57 others) give agents credentialed access to your end-users' accounts. LLM models run via BYOK: you bring an Anthropic, OpenAI, Google, Mistral, Groq, OpenRouter, xAI, Cerebras, Azure OpenAI, Google Vertex, or AWS Bedrock key, or point at any OpenAI-compatible endpoint, and every agent uses it.

Outbound, plug Appstrate into other systems. Personal agents like OpenClaw or Hermes can use Appstrate as an execution backend: they reason locally, then call the Appstrate REST API when they need multi-tenant runs, credential-hidden tool calls, or team-scale parallelism. IDE-based coding agents (Cursor, Claude Code) can drive Appstrate the same way.
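The personal-agent pattern can be sketched as follows: the agent reasons locally, then delegates execution over HTTP. The endpoint URL, payload fields, and `APPSTRATE_API_KEY` environment variable below are illustrative assumptions, not documented API; only the shape of the flow (reason locally, call the REST API for a credentialed, tenant-scoped run) comes from this page.

```python
import json
import os
import urllib.request

# Hypothetical endpoint -- a placeholder, not the real Appstrate URL.
APPSTRATE_RUNS_URL = "https://appstrate.example.com/v1/runs"

def build_run_request(tool: str, args: dict, tenant: str) -> dict:
    """Assemble a run payload. The local agent has already done its
    reasoning; it only delegates execution to the backend."""
    return {
        "tool": tool,      # e.g. a provider-backed tool the run should call
        "args": args,      # arguments chosen by the local agent
        "tenant": tenant,  # which end-user this run acts on behalf of
    }

def prepare_submit(payload: dict) -> urllib.request.Request:
    """Build the authenticated HTTP call. Provider credentials stay
    server-side: the agent authenticates with its own API key and
    never sees the end-user's tokens."""
    return urllib.request.Request(
        APPSTRATE_RUNS_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ.get('APPSTRATE_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    # urllib.request.urlopen(prepare_submit(payload)) would execute the run
```

The same request shape works from a coding agent: anything that can make an HTTP call can drive the backend this way.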

None of these ships as an SDK. Each is a pattern over the REST API, documented below so you know what to build.

Inbound

| Page | What it covers |
| --- | --- |
| Built-in Providers | The 60 providers we ship: OAuth flows, API-key flows, per-provider setup pages |
| Custom Providers | Ship your own provider: REST API path and AFPS package path, manifest shapes for each auth mode |
| LLM Models | BYOK Anthropic, OpenAI, Google, Mistral, Groq, OpenRouter, xAI, Cerebras, Azure, Bedrock, and custom endpoints |

Outbound

| Page | Pattern |
| --- | --- |
| Personal Agents | Your personal agent is the brain, Appstrate is the arms: multi-tenant, isolated, credentialed execution |
| Coding Agents | Drive Appstrate from Cursor, Claude Code, or any agent with HTTP access |
