Shared ($shared)
Domain Types
API Links
APILinks (object)
API Meta
APIMeta (object)
page (number, optional, format: int64)

pages (number, optional, format: int64)

total (number, optional, format: int64)
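
APIMeta's page, pages, and total fields read like pagination counters. The sketch below shows how such metadata might drive a page loop; the endpoint URL, the "items" key, and the overall response shape are illustrative assumptions, not part of this reference.

```python
# Hypothetical sketch of paging with APIMeta-style counters. Only the meta
# fields (page, pages, total) come from this reference; everything else
# (URL, query parameter, "items" key) is assumed for illustration.
import requests

def fetch_all_pages(url: str) -> list[dict]:
    items: list[dict] = []
    page = 1
    while True:
        resp = requests.get(url, params={"page": page})
        resp.raise_for_status()
        body = resp.json()
        items.extend(body.get("items", []))
        meta = body.get("meta", {})
        # Stop once the current page reaches the reported total page count.
        if not meta.get("pages") or page >= meta["pages"]:
            break
        page += 1
    return items
```
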
Chat Completion Chunk
ChatCompletionChunk (object)

Represents a streamed chunk of a chat completion response returned by the model, based on the provided input.

id (string)

A unique identifier for the chat completion. Each chunk has the same ID.

choices (array of objects)

A list of chat completion choices. Can contain more than one element if n is greater than 1. Can also be empty for the last chunk if you set stream_options: {"include_usage": true}.

Parameters:
delta (object)

A chat completion delta generated by streamed model responses.

Parameters:
content (string, optional)

The contents of the chunk message.

refusal (string, optional)

The refusal message generated by the model.

role (enum, optional)
"developer" OR "user" OR "assistant"

The role of the author of this message.

finish_reason (enum)
"stop" OR "length"

The reason the model stopped generating tokens. This will be stop if the model hit a natural stop point or a provided stop sequence, or length if the maximum number of tokens specified in the request was reached.

index (number)

The index of the choice in the list of choices.

logprobs (object, optional)

Log probability information for the choice.

Parameters:
content (array of ChatCompletionTokenLogprob)

A list of message content tokens with log probability information.

refusal (array of ChatCompletionTokenLogprob)

A list of message refusal tokens with log probability information.

created (number)

The Unix timestamp (in seconds) of when the chat completion was created. Each chunk has the same timestamp.

model (string)

The model used to generate the completion.

object (enum)
"chat.completion.chunk"

The object type, which is always chat.completion.chunk.

usage (object, optional)

An optional field that will only be present when you set stream_options: {"include_usage": true} in your request. When present, it contains a null value except for the last chunk, which contains the token usage statistics for the entire request.

NOTE: If the stream is interrupted or cancelled, you may not receive the final usage chunk which contains the total token usage for the request.

Parameters:
completion_tokens (number)

Number of tokens in the generated completion.

prompt_tokens (number)

Number of tokens in the prompt.

total_tokens (number)

Total number of tokens used in the request (prompt + completion).
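
Taken together, the fields above suggest a consumption pattern: append each delta.content as chunks arrive, check finish_reason, and read usage from the final chunk when stream_options: {"include_usage": true} was requested. A minimal sketch, assuming the chunks are already-parsed dicts matching the ChatCompletionChunk schema; how the stream itself is obtained is left abstract and is not part of this reference.

```python
# Minimal sketch of consuming a chat-completion stream. Chunks are assumed
# to be dicts shaped like ChatCompletionChunk above; the transport that
# yields them is an assumption, not part of this reference.
from typing import Iterable

def collect_stream(chunks: Iterable[dict]) -> tuple[str, dict | None]:
    text_parts: list[str] = []
    usage = None
    for chunk in chunks:
        # choices can be empty on the final chunk when include_usage is set.
        for choice in chunk.get("choices", []):
            delta = choice.get("delta", {})
            if delta.get("content"):
                text_parts.append(delta["content"])
            if choice.get("finish_reason") == "length":
                print("warning: output truncated at the max token limit")
        # usage is null on every chunk except the last one (when requested);
        # if the stream is interrupted, this final chunk may never arrive.
        if chunk.get("usage"):
            usage = chunk["usage"]
    return "".join(text_parts), usage
```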

Chat Completion Token Logprob
ChatCompletionTokenLogprob (object)
token (string)

The token.

bytes (array of numbers)

A list of integers representing the UTF-8 byte representation of the token. Useful in instances where characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be null if there is no byte representation for the token.

logprob (number)

The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value -9999.0 is used to signify that the token is very unlikely.

top_logprobs (array of objects)

List of the most likely tokens and their log probability at this token position. In rare cases, there may be fewer than the number of requested top_logprobs returned.

Parameters:
token (string)

The token.

bytes (array of numbers)

A list of integers representing the UTF-8 byte representation of the token. Useful in instances where characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be null if there is no byte representation for the token.

logprob (number)

The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value -9999.0 is used to signify that the token is very unlikely.
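
Because a single character can be split across several tokens, the per-token bytes arrays are meant to be concatenated before decoding, and logprob values are natural logs that can be mapped back to probabilities. A minimal sketch, assuming the entries are dicts shaped like ChatCompletionTokenLogprob:

```python
import math

def decode_tokens(token_logprobs: list[dict]) -> str:
    # Concatenate the per-token UTF-8 bytes (skipping null bytes fields),
    # then decode the whole sequence at once so characters that span
    # multiple tokens are reassembled correctly.
    raw = bytearray()
    for entry in token_logprobs:
        if entry.get("bytes"):
            raw.extend(entry["bytes"])
    return raw.decode("utf-8", errors="replace")

def token_probability(entry: dict) -> float:
    # logprob is a natural log; exp() recovers the probability. The sentinel
    # -9999.0 (token outside the top 20) maps to effectively zero.
    return math.exp(entry["logprob"])
```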