KushRouter API Documentation
Welcome to the KushRouter API docs.
- Main unified endpoint: POST /api/v1/messages (see the sketch below)
- OpenAI-compatible streaming: POST /api/openai/v1/chat/completions
- Anthropic-compatible streaming: POST /api/anthropic/v1/messages
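A minimal way to call the unified endpoint is a plain HTTP POST. The sketch below assumes a messages-style JSON body (model plus messages) and a Bearer Authorization header; the KUSHROUTER_API_KEY variable name is a placeholder, and the authoritative request format is defined in the Schemas section.

```python
# Minimal sketch: POST to the unified endpoint.
# The body fields (model, messages) and the Bearer auth header are
# assumptions for illustration; see the Schemas section for the
# authoritative request format.
import os
import requests

API_KEY = os.environ["KUSHROUTER_API_KEY"]  # placeholder variable name

response = requests.post(
    "https://api.kushrouter.com/api/v1/messages",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello from the unified endpoint!"}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```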
Quick start
- Get an API key from your dashboard: kushrouter.com/dashboard/api-keys
- Try our browser Playground: kushrouter.com/dashboard/playground
- Use your favorite SDK and point it to our base URLs (see the sketch after this list)
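For example, an OpenAI-style SDK can simply be pointed at the OpenAI-compatible base URL. This sketch assumes the official openai Python package; the KUSHROUTER_API_KEY variable name is a placeholder.

```python
# Sketch: point the official OpenAI Python SDK at the OpenAI-compatible base URL.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.kushrouter.com/api/openai/v1",
    api_key=os.environ["KUSHROUTER_API_KEY"],  # placeholder variable name
)

completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(completion.choices[0].message.content)
```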
Try it now
- Open Playground with a prefilled prompt for the unified endpoint
  - Model: gpt-4o-mini
  - Prompt: “Write a creative haiku about AI routers”
  - Launch Playground
- Try streaming with Anthropic (prefilled), or see the streaming sketch after this list
  - Model: claude-3-5-sonnet-v2@anthropic
  - Prompt: “Summarize: Why use a unified LLM router?”
  - Launch Streaming
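The same streaming call can be made outside the Playground against the Anthropic-compatible endpoint. The sketch below assumes an Anthropic-style request body (model, max_tokens, messages, stream) and a server-sent-events response; the Bearer Authorization header and KUSHROUTER_API_KEY variable are placeholders (the service may expect a different auth header), so confirm the details in the Schemas section.

```python
# Sketch: stream from the Anthropic-compatible endpoint over SSE.
# The body mirrors Anthropic's Messages API (model, max_tokens, messages,
# stream); the auth header style is an assumption.
import os
import requests

API_KEY = os.environ["KUSHROUTER_API_KEY"]  # placeholder variable name

with requests.post(
    "https://api.kushrouter.com/api/anthropic/v1/messages",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "claude-3-5-sonnet-v2@anthropic",
        "max_tokens": 256,
        "messages": [
            {"role": "user", "content": "Summarize: Why use a unified LLM router?"}
        ],
        "stream": True,
    },
    stream=True,
    timeout=60,
) as response:
    response.raise_for_status()
    # SSE frames arrive as "event: ..." and "data: ..." lines.
    for line in response.iter_lines():
        if line:
            print(line.decode("utf-8"))
```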
Base URLs
- Production base URL: https://api.kushrouter.com
- Unified: https://api.kushrouter.com/api/v1/messages
- OpenAI: https://api.kushrouter.com/api/openai/v1/
- Anthropic: https://api.kushrouter.com/api/anthropic/v1/
What’s covered here
- Endpoints for streaming chat across providers
- Request schemas and validation behaviors
- CLI integration for OpenAI Codex and Anthropic Claude Code
- Pricing, model aliases, and credit consumption rules
- Usage analytics exports and dashboards
- File uploads, batch processing, tokenize helper, and MCP server integrations
- Troubleshooting guidance for common issues
If you encounter an error response, the message will be descriptive and aligned with the request validators described in the Schemas section.
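As a sketch of how such an error might be surfaced in client code (assuming the error details are returned as a JSON body, which is an assumption here rather than a documented guarantee):

```python
# Sketch: surface a validation error from the unified endpoint.
# Assumes error details come back as JSON; adjust to the actual error
# shape documented in the Schemas section.
import os
import requests

API_KEY = os.environ["KUSHROUTER_API_KEY"]  # placeholder variable name

response = requests.post(
    "https://api.kushrouter.com/api/v1/messages",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"model": "gpt-4o-mini"},  # deliberately incomplete to trigger validation
    timeout=30,
)

if not response.ok:
    try:
        detail = response.json()  # descriptive validator message (assumed JSON)
    except ValueError:
        detail = response.text
    print(f"Request failed ({response.status_code}): {detail}")
```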