Files API

Upload JSONL inputs or supporting documents for batches and long-running jobs.

Upload a file

Endpoint: POST /api/v1/files

curl -X POST "https://api.kushrouter.com/api/v1/files" \
  -H "Authorization: Bearer $API_KEY" \
  -F "file=@./requests.jsonl"

Response example:

{
  "id": "file_123",
  "object": "file",
  "bytes": 2048,
  "created_at": 1739123456,
  "filename": "requests.jsonl",
  "purpose": "batch"
}

You can also send JSON instead of multipart when your content is already in memory:

curl -X POST "https://api.kushrouter.com/api/v1/files" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "content": "{\"model\":\"gpt-4o-mini\",\"messages\":[{\"role\":\"user\",\"content\":\"Hello\"}]}\n",
    "filename": "requests.jsonl"
  }'
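When building that JSON body programmatically, let a JSON library handle the escaping of the embedded JSONL string instead of hand-writing backslashes. A minimal sketch in Python (only body construction is shown; the HTTP call itself is left to your client of choice, and `build_upload_body` is an illustrative helper, not part of the API):

```python
import json

def build_upload_body(request_objs, filename):
    """Serialize request objects into the JSON upload body.

    Each object becomes one JSONL line; json.dumps on the outer
    body escapes the embedded quotes and newlines correctly.
    """
    content = "".join(json.dumps(obj) + "\n" for obj in request_objs)
    return json.dumps({"content": content, "filename": filename})

body = build_upload_body(
    [{"model": "gpt-4o-mini",
      "messages": [{"role": "user", "content": "Hello"}]}],
    "requests.jsonl",
)
```

The resulting `body` string is what you would send as the `-d` payload in the curl example above.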

List files

curl -X GET "https://api.kushrouter.com/api/v1/files" \
  -H "Authorization: Bearer $API_KEY"

Response:

{
  "object": "list",
  "data": [
    { "id": "file_123", "object": "file", "bytes": 2048, "created_at": 1739123456, "filename": "requests.jsonl", "purpose": "batch" }
  ],
  "has_more": false
}
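If you need to look up a previously uploaded file by name, the listing can be scanned client-side. A sketch using the sample response above (`find_file_id` is an illustrative helper; pagination beyond the `has_more` flag is not shown):

```python
import json

def find_file_id(listing_json, filename):
    """Return the id of the first file whose filename matches, or None."""
    listing = json.loads(listing_json)
    for f in listing.get("data", []):
        if f.get("filename") == filename:
            return f["id"]
    return None

sample = """{
  "object": "list",
  "data": [
    {"id": "file_123", "object": "file", "bytes": 2048,
     "created_at": 1739123456, "filename": "requests.jsonl",
     "purpose": "batch"}
  ],
  "has_more": false
}"""
```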

Retrieve file metadata

curl -X GET "https://api.kushrouter.com/api/v1/files/file_123" \
  -H "Authorization: Bearer $API_KEY"

Download file content

curl -L -X GET "https://api.kushrouter.com/api/v1/files/file_123/content" \
  -H "Authorization: Bearer $API_KEY" \
  -o file_123.jsonl

Delete a file

curl -X DELETE "https://api.kushrouter.com/api/v1/files/file_123" \
  -H "Authorization: Bearer $API_KEY"

Deleting a file immediately invalidates any batches referencing it.

OpenAI-compatible Files

OpenAI-style routes are also available:

  • POST /api/openai/v1/files
  • GET /api/openai/v1/files
  • GET /api/openai/v1/files/{id}
  • GET /api/openai/v1/files/{id}/content
  • DELETE /api/openai/v1/files/{id}

Response shapes mirror OpenAI where applicable (object: "file", list object for listings).

Using files in batches

  • Upload a JSONL file of requests, with each line containing a JSON object accepted by the target endpoint.
  • Store the returned file.id and pass it to the Batches API as input_file_id.
  • Large files are streamed directly to the router; there is no need to chunk them manually.
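The steps above can be sketched as a small helper that writes one request per line and validates the file before upload. The per-line field names here (e.g. `custom_id`) are illustrative; consult the Batches documentation for the exact shape each line must take:

```python
import json

def write_batch_input(path, prompts):
    """Write one chat request per line; each line is a standalone JSON object."""
    with open(path, "w", encoding="utf-8") as f:
        for i, prompt in enumerate(prompts):
            line = {
                "custom_id": f"req-{i}",  # illustrative field name
                "model": "gpt-4o-mini",
                "messages": [{"role": "user", "content": prompt}],
            }
            f.write(json.dumps(line) + "\n")

def validate_jsonl(path):
    """Return True if every non-empty line parses as a JSON object."""
    with open(path, encoding="utf-8") as f:
        return all(isinstance(json.loads(l), dict) for l in f if l.strip())
```

After writing the file, upload it with the curl command shown earlier and pass the returned id to the Batches API.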

Further reading