Signature
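The signature itself is not reproduced here; the following is a plausible TypeScript sketch inferred from the parameter and return tables below. The method name `complete` and the exact type shapes are assumptions, not the confirmed API.

```typescript
// Hypothetical sketch -- names inferred from the tables in this section.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };
type RoutingIntent = { language: string; vertical: string; optimizeFor?: string };
type PipelineConstraints = Record<string, unknown>; // allow-list constraints (shape assumed)

interface CompleteParams {
  messages: ChatMessage[];           // conversation history
  intent: RoutingIntent;             // routing hints
  systemPrompt?: string;             // shortcut for a leading system message
  temperature?: number;              // forwarded to the provider
  maxTokens?: number;                // max completion tokens
  constraints?: PipelineConstraints; // allow-list constraints
}

interface CompleteResult {
  text: string;
  provider: string;
  model: string;
  usage: { promptTokens: number; completionTokens: number };
  failoverCount: number;
  scoresRunId: string | null;
}

declare function complete(
  params: CompleteParams,
  abortSignal?: AbortSignal,
): Promise<CompleteResult>;
```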
Parameters
params: CompleteParams
| Field | Type | Description |
|---|---|---|
| messages | ChatMessage[] | Conversation history. Roles: system, user, assistant. |
| intent | RoutingIntent | Routing hints: language, vertical, and an optional optimizeFor. |
| systemPrompt | string? | Shortcut for a leading system message. Providers that distinguish the system channel use it natively; others fold it into the message list. |
| temperature | number? | Forwarded to the provider. Defaults to the provider’s default. |
| maxTokens | number? | Max completion tokens. Defaults to the provider’s default. |
| constraints | PipelineConstraints? | Allow-list constraints. |
ChatMessage
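The ChatMessage shape is not spelled out here; a minimal sketch consistent with the roles listed above. The field names `role` and `content` are assumptions; only the three roles are confirmed by the table.

```typescript
// Assumed field names; only the three roles come from the table above.
type ChatRole = "system" | "user" | "assistant";

interface ChatMessage {
  role: ChatRole;   // system, user, or assistant
  content: string;  // message text
}

const example: ChatMessage = { role: "user", content: "What is my order status?" };
```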
abortSignal?: AbortSignal
Cancel an in-flight request.
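Cancellation follows the standard AbortController pattern. A minimal sketch, where `complete` is a local stand-in for the proxy client (a real call would reject with an AbortError once the signal fires):

```typescript
// Sketch: cancelling an in-flight request with AbortController.
// `complete` here simulates the client; the 50 ms resolve stands in for a
// slow upstream reply.
function complete(
  params: unknown,
  signal?: AbortSignal,
): Promise<{ text: string }> {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => resolve({ text: "done" }), 50);
    signal?.addEventListener("abort", () => {
      clearTimeout(timer);
      reject(new DOMException("The request was aborted", "AbortError"));
    });
  });
}

async function demo(): Promise<string> {
  const controller = new AbortController();
  setTimeout(() => controller.abort(), 10); // cancel before the reply lands
  try {
    await complete({ messages: [] }, controller.signal);
    return "completed";
  } catch (err) {
    return (err as Error).name; // "AbortError"
  }
}
```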
Returns
CompleteResult
| Field | Type | Description |
|---|---|---|
| text | string | Assistant reply. |
| provider | string | Upstream LLM provider (e.g. openai, anthropic, groq). |
| model | string | Provider-specific model id. |
| usage.promptTokens | number | Prompt token count. |
| usage.completionTokens | number | Completion token count. |
| failoverCount | number | Providers tried before this one succeeded. |
| scoresRunId | string \| null | Scoring run id that selected this provider. |
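A successful response body might look like the following; every value is illustrative, not taken from a real deployment:

```json
{
  "text": "Sure -- your order shipped yesterday.",
  "provider": "anthropic",
  "model": "example-model-id",
  "usage": { "promptTokens": 142, "completionTokens": 18 },
  "failoverCount": 0,
  "scoresRunId": "run_123"
}
```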
Non-streaming (v1)
The `/v1/complete` endpoint is buffered: each call returns one full completion. Streaming is on the roadmap; until then, tool / function calling is also not exposed through the proxy.
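Since each call is a single buffered round trip, a plain HTTP POST is enough. A hedged sketch of a caller: only the `/v1/complete` path comes from this document; the base URL and the bearer-token Authorization scheme are assumptions.

```typescript
// Hypothetical client; base URL and auth scheme are assumptions.
interface CompleteRequest {
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  intent: { language: string; vertical: string; optimizeFor?: string };
  maxTokens?: number;
}

function buildRequest(baseUrl: string, apiKey: string, body: CompleteRequest) {
  return {
    url: `${baseUrl}/v1/complete`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`, // auth scheme assumed
      },
      body: JSON.stringify(body),
    },
  };
}

// One buffered call -- the full completion arrives in a single response.
async function callComplete(baseUrl: string, apiKey: string, body: CompleteRequest) {
  const { url, init } = buildRequest(baseUrl, apiKey, body);
  const res = await fetch(url, init);
  if (!res.ok) throw new Error(`complete failed: ${res.status}`);
  return res.json(); // CompleteResult: text, provider, model, usage, ...
}
```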