Shared
API Meta
Chat Completion Chunk
Represents a streamed chunk of a chat completion response returned by the model, based on the provided input.
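As a point of reference, here is a minimal sketch of consuming these chunks with the official Python SDK. The client construction, model name, and prompt are illustrative assumptions, not part of this reference.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Request a streamed completion; iterating the stream yields one
# Chat Completion Chunk object at a time.
stream = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Say hello."}],
    stream=True,
)

for chunk in stream:
    # choices can be empty on the final usage-only chunk (see below),
    # so guard before indexing.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```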
A unique identifier for the chat completion. Each chunk has the same ID.
A list of chat completion choices. Can contain more than one element if n is greater than 1. Can also be empty for the last chunk if you set stream_options: {"include_usage": true}.
A chat completion delta generated by streamed model responses.
The contents of the chunk message.
The refusal message generated by the model.
The role of the author of this message.
The reason the model stopped generating tokens. This will be stop if the model hit a natural stop point or a provided stop sequence, or length if the maximum number of tokens specified in the request was reached.
The index of the choice in the list of choices.
Log probability information for the choice.
A list of message content tokens with log probability information.
A list of message refusal tokens with log probability information.
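When n is greater than 1, a chunk's choices list can interleave deltas for different completions; the index and finish_reason fields are what let you reassemble them. A sketch under that assumption (the n=2 request and variable names are illustrative):

```python
# Reassemble multiple streamed completions by choice index.
# Assumes `stream` came from a request with stream=True and n=2 (illustrative).
texts: dict[int, str] = {}
finish_reasons: dict[int, str] = {}

for chunk in stream:
    for choice in chunk.choices:
        if choice.delta.content:
            texts[choice.index] = texts.get(choice.index, "") + choice.delta.content
        if choice.finish_reason is not None:
            finish_reasons[choice.index] = choice.finish_reason  # e.g. "stop" or "length"

for index, text in sorted(texts.items()):
    print(f"choice {index} ({finish_reasons.get(index)}): {text}")
```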
The Unix timestamp (in seconds) of when the chat completion was created. Each chunk has the same timestamp.
The model used to generate the completion.
The object type, which is always chat.completion.chunk.
An optional field that will only be present when you set stream_options: {"include_usage": true} in your request. When present, it contains a null value except for the last chunk, which contains the token usage statistics for the entire request.
NOTE: If the stream is interrupted or cancelled, you may not receive the final usage chunk, which contains the total token usage for the request.
Number of tokens in the generated completion.
Number of tokens in the prompt.
Total number of tokens used in the request (prompt + completion).
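A sketch of reading these usage fields from the final chunk when stream_options: {"include_usage": true} is set. Field names follow the Python SDK; the model name and prompt are illustrative assumptions.

```python
stream = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative
    messages=[{"role": "user", "content": "Count to three."}],
    stream=True,
    stream_options={"include_usage": True},
)

usage = None
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
    if chunk.usage is not None:
        # Only the last chunk carries usage; its choices list is empty.
        usage = chunk.usage

# usage stays None if the stream was interrupted before the final chunk.
if usage is not None:
    print(f"\nprompt={usage.prompt_tokens} "
          f"completion={usage.completion_tokens} "
          f"total={usage.total_tokens}")
```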
Chat Completion Token Logprob
The token.
A list of integers representing the UTF-8 byte representation of the token. Useful when characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be null if there is no bytes representation for the token.
The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value -9999.0 is used to signify that the token is very unlikely.
A list of the most likely tokens and their log probabilities at this token position. In rare cases, fewer than the requested number of top_logprobs may be returned.
The token.
A list of integers representing the UTF-8 byte representation of the token. Useful when characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be null if there is no bytes representation for the token.
The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value -9999.0 is used to signify that the token is very unlikely.
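Because a single character can be split across tokens, recovering the exact text from these entries means concatenating their byte representations before decoding. A sketch, assuming logprobs were requested and logprob_entries stands in for a choice's logprobs.content list:

```python
def decode_logprob_tokens(logprob_entries) -> str:
    """Concatenate UTF-8 byte fragments from logprob entries and decode them."""
    data = bytearray()
    for entry in logprob_entries:
        if entry.bytes is not None:
            # bytes is a list of integers (one per UTF-8 byte of the token).
            data.extend(entry.bytes)
        else:
            # bytes can be null; fall back to the token string itself.
            data.extend(entry.token.encode("utf-8"))
    # Decode only after joining, so multi-byte characters split across
    # tokens are reassembled correctly.
    return data.decode("utf-8", errors="replace")
```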