KushRouter API Documentation
Welcome to the KushRouter API docs.
- OpenAI-compatible streaming: POST /api/openai/v1/chat/completions
- Anthropic-compatible streaming: POST /api/anthropic/v1/messages
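As a rough sketch of calling the OpenAI-compatible endpoint with only the standard library: the model name and API key are placeholders, and the Bearer auth scheme is an assumption based on common OpenAI-compatible routers.

```python
import json
import urllib.request

BASE_URL = "https://api.kushrouter.com"

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a chat-completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # ask for a streamed (SSE) response
    }
    return urllib.request.Request(
        f"{BASE_URL}/api/openai/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",  # auth scheme assumed
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("YOUR_API_KEY", "gpt-5-mini-2025-08-07", "Hello")
# urllib.request.urlopen(req) would open the stream; omitted here.
```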
Quick start
- Get an API key from your dashboard: kushrouter.com/dashboard/api-keys
- Try our browser Playground: kushrouter.com/dashboard/playground
- Use your favorite SDK and point it to our base URLs
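One way to point an OpenAI-compatible SDK at the router is via environment variables; `OPENAI_BASE_URL` and `OPENAI_API_KEY` are read by the official openai Python SDK (v1+). The key value below is a placeholder.

```python
import os

# Point any tool that honors these variables at the OpenAI-compatible surface.
os.environ["OPENAI_BASE_URL"] = "https://api.kushrouter.com/api/openai/v1/"
os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"

# from openai import OpenAI   # not imported here; shown for context
# client = OpenAI()           # picks up both env vars automatically
```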
Try it now
- Open the Playground prefilled for OpenAI-compatible Chat
  - Model: gpt-5-mini-2025-08-07
  - Prompt: “Write a creative haiku about AI routers”
  - Launch Playground
- Try streaming with Anthropic (prefilled)
  - Model: claude-sonnet-4-5-20250929
  - Prompt: “Summarize: Why use a multi‑provider LLM router?”
  - Launch Streaming
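A streamed response arrives as server-sent events; the minimal parser below assumes the OpenAI-style `data: …` / `data: [DONE]` framing, which is a common convention for these surfaces rather than something confirmed here.

```python
import json

def parse_sse_lines(lines):
    """Yield decoded JSON events from raw SSE lines, stopping at [DONE]."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alives and event-name lines
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break  # end-of-stream sentinel (assumed framing)
        yield json.loads(data)

sample = [
    'data: {"delta": "Hel"}',
    'data: {"delta": "lo"}',
    'data: [DONE]',
]
events = list(parse_sse_lines(sample))
# events == [{"delta": "Hel"}, {"delta": "lo"}]
```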
Base URLs
- Production base URL: https://api.kushrouter.com
- OpenAI: https://api.kushrouter.com/api/openai/v1/
- Anthropic: https://api.kushrouter.com/api/anthropic/v1/
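The Anthropic-compatible surface can be sketched the same way; the `x-api-key` and `anthropic-version` headers and the required `max_tokens` field follow upstream Anthropic conventions and are assumptions here, with placeholder key and version values.

```python
import json
import urllib.request

BASE_URL = "https://api.kushrouter.com"

def build_messages_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an Anthropic-style messages request."""
    payload = {
        "model": model,
        "max_tokens": 256,  # required by the Anthropic schema (assumed)
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/api/anthropic/v1/messages",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "x-api-key": api_key,                  # auth header assumed
            "anthropic-version": "2023-06-01",     # placeholder version
            "content-type": "application/json",
        },
        method="POST",
    )

req = build_messages_request("YOUR_API_KEY", "claude-sonnet-4-5-20250929", "Hello")
```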
What’s covered here
- Endpoints for streaming chat across providers
- Request schemas and validation behaviors
- Parameter compatibility by surface
- CLI integration for OpenAI Codex and Anthropic Claude Code
- Pricing, model aliases, and credit consumption rules
- Usage analytics exports and dashboards
- Tokenize helper and MCP server integrations
- Troubleshooting guidance for common issues
Error responses carry descriptive messages that match the request validators described in the Schemas section.