Class Message.Builder
public final class Message.Builder

A builder for Message.
-
-
Method Summary
All methods return Message.Builder, except build(), which returns Message.

- id(String id): Unique object identifier.
- id(JsonField<String> id): Unique object identifier.
- content(List<ContentBlock> content): Content generated by the model.
- content(JsonField<List<ContentBlock>> content): Content generated by the model.
- addContent(ContentBlock content): Content generated by the model.
- addContent(TextBlock text): Content generated by the model.
- addContent(ToolUseBlock toolUse): Content generated by the model.
- addContent(ThinkingBlock thinking): Content generated by the model.
- addContent(RedactedThinkingBlock redactedThinking): Content generated by the model.
- addRedactedThinkingContent(String data): Content generated by the model.
- model(Model model): The model that will complete your prompt.
- model(JsonField<Model> model): The model that will complete your prompt.
- model(String value): The model that will complete your prompt.
- role(JsonValue role): Conversational role of the generated message.
- stopReason(Message.StopReason stopReason): The reason that we stopped.
- stopReason(Optional<Message.StopReason> stopReason): The reason that we stopped.
- stopReason(JsonField<Message.StopReason> stopReason): The reason that we stopped.
- stopSequence(String stopSequence): Which custom stop sequence was generated, if any.
- stopSequence(Optional<String> stopSequence): Which custom stop sequence was generated, if any.
- stopSequence(JsonField<String> stopSequence): Which custom stop sequence was generated, if any.
- type(JsonValue type): Object type.
- usage(Usage usage): Billing and rate-limit usage.
- usage(JsonField<Usage> usage): Billing and rate-limit usage.
- additionalProperties(Map<String, JsonValue> additionalProperties)
- putAdditionalProperty(String key, JsonValue value)
- putAllAdditionalProperties(Map<String, JsonValue> additionalProperties)
- removeAdditionalProperty(String key)
- removeAllAdditionalProperties(Set<String> keys)
- build()
-
-
Method Detail
-
id
final Message.Builder id(String id)
Unique object identifier.
The format and length of IDs may change over time.
-
id
final Message.Builder id(JsonField<String> id)
Unique object identifier.
The format and length of IDs may change over time.
-
content
final Message.Builder content(List<ContentBlock> content)
Content generated by the model.
This is an array of content blocks, each of which has a type that determines its shape.

Example:

[{ "type": "text", "text": "Hi, I'm Claude." }]

If the request input messages ended with an assistant turn, then the response content will continue directly from that last turn. You can use this to constrain the model's output.

For example, if the input messages were:

[
  { "role": "user", "content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun" },
  { "role": "assistant", "content": "The best answer is (" }
]

Then the response content might be:

[{ "type": "text", "text": "B)" }]
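The prefill behavior described above can be illustrated without the SDK: the response content continues the assistant's partial turn, so joining the prefill and the returned text recovers the complete sentence. A minimal sketch (plain strings only; no SDK types involved):

```java
public class PrefillDemo {
    // The response `content` continues directly from the assistant
    // prefill, so the full sentence is prefill + continuation.
    static String fullAnswer(String prefill, String continuation) {
        return prefill + continuation;
    }

    public static void main(String[] args) {
        // Final assistant turn from the request input `messages`.
        String prefill = "The best answer is (";
        // Text block the model returns; it picks up mid-sentence.
        String continuation = "B)";
        System.out.println(fullAnswer(prefill, continuation)); // prints "The best answer is (B)"
    }
}
```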
-
content
final Message.Builder content(JsonField<List<ContentBlock>> content)
Content generated by the model.
This is an array of content blocks, each of which has a type that determines its shape.

Example:

[{ "type": "text", "text": "Hi, I'm Claude." }]

If the request input messages ended with an assistant turn, then the response content will continue directly from that last turn. You can use this to constrain the model's output.

For example, if the input messages were:

[
  { "role": "user", "content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun" },
  { "role": "assistant", "content": "The best answer is (" }
]

Then the response content might be:

[{ "type": "text", "text": "B)" }]
-
addContent
final Message.Builder addContent(ContentBlock content)
Content generated by the model.
This is an array of content blocks, each of which has a type that determines its shape.

Example:

[{ "type": "text", "text": "Hi, I'm Claude." }]

If the request input messages ended with an assistant turn, then the response content will continue directly from that last turn. You can use this to constrain the model's output.

For example, if the input messages were:

[
  { "role": "user", "content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun" },
  { "role": "assistant", "content": "The best answer is (" }
]

Then the response content might be:

[{ "type": "text", "text": "B)" }]
-
addContent
final Message.Builder addContent(TextBlock text)
Content generated by the model.
This is an array of content blocks, each of which has a type that determines its shape.

Example:

[{ "type": "text", "text": "Hi, I'm Claude." }]

If the request input messages ended with an assistant turn, then the response content will continue directly from that last turn. You can use this to constrain the model's output.

For example, if the input messages were:

[
  { "role": "user", "content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun" },
  { "role": "assistant", "content": "The best answer is (" }
]

Then the response content might be:

[{ "type": "text", "text": "B)" }]
-
addContent
final Message.Builder addContent(ToolUseBlock toolUse)
Content generated by the model.
This is an array of content blocks, each of which has a type that determines its shape.

Example:

[{ "type": "text", "text": "Hi, I'm Claude." }]

If the request input messages ended with an assistant turn, then the response content will continue directly from that last turn. You can use this to constrain the model's output.

For example, if the input messages were:

[
  { "role": "user", "content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun" },
  { "role": "assistant", "content": "The best answer is (" }
]

Then the response content might be:

[{ "type": "text", "text": "B)" }]
-
addContent
final Message.Builder addContent(ThinkingBlock thinking)
Content generated by the model.
This is an array of content blocks, each of which has a type that determines its shape.

Example:

[{ "type": "text", "text": "Hi, I'm Claude." }]

If the request input messages ended with an assistant turn, then the response content will continue directly from that last turn. You can use this to constrain the model's output.

For example, if the input messages were:

[
  { "role": "user", "content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun" },
  { "role": "assistant", "content": "The best answer is (" }
]

Then the response content might be:

[{ "type": "text", "text": "B)" }]
-
addContent
final Message.Builder addContent(RedactedThinkingBlock redactedThinking)
Content generated by the model.
This is an array of content blocks, each of which has a type that determines its shape.

Example:

[{ "type": "text", "text": "Hi, I'm Claude." }]

If the request input messages ended with an assistant turn, then the response content will continue directly from that last turn. You can use this to constrain the model's output.

For example, if the input messages were:

[
  { "role": "user", "content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun" },
  { "role": "assistant", "content": "The best answer is (" }
]

Then the response content might be:

[{ "type": "text", "text": "B)" }]
-
addRedactedThinkingContent
final Message.Builder addRedactedThinkingContent(String data)
Content generated by the model.
This is an array of content blocks, each of which has a type that determines its shape.

Example:

[{ "type": "text", "text": "Hi, I'm Claude." }]

If the request input messages ended with an assistant turn, then the response content will continue directly from that last turn. You can use this to constrain the model's output.

For example, if the input messages were:

[
  { "role": "user", "content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun" },
  { "role": "assistant", "content": "The best answer is (" }
]

Then the response content might be:

[{ "type": "text", "text": "B)" }]
-
model
final Message.Builder model(Model model)
The model that will complete your prompt.

See models for additional details and options.
-
model
final Message.Builder model(JsonField<Model> model)
The model that will complete your prompt.

See models for additional details and options.
-
model
final Message.Builder model(String value)
The model that will complete your prompt.

See models for additional details and options.
-
role
final Message.Builder role(JsonValue role)
Conversational role of the generated message.
This will always be "assistant".
-
stopReason
final Message.Builder stopReason(Message.StopReason stopReason)
The reason that we stopped.
This may be one of the following values:

- "end_turn": the model reached a natural stopping point
- "max_tokens": we exceeded the requested max_tokens or the model's maximum
- "stop_sequence": one of your provided custom stop_sequences was generated
- "tool_use": the model invoked one or more tools

In non-streaming mode this value is always non-null. In streaming mode, it is null in the message_start event and non-null otherwise.
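The four stop-reason values can be handled exhaustively in client code. A self-contained sketch follows; the nested enum is a stand-in for the SDK's generated Message.StopReason type (its constant names here mirror the documented wire values, but the type itself is hypothetical):

```java
public class StopReasonDemo {
    // Stand-in for the SDK's Message.StopReason; constants mirror
    // the documented wire values "end_turn", "max_tokens",
    // "stop_sequence", and "tool_use".
    enum StopReason { END_TURN, MAX_TOKENS, STOP_SEQUENCE, TOOL_USE }

    // Map each stop reason to the behavior documented above.
    static String describe(StopReason reason) {
        switch (reason) {
            case END_TURN:      return "model reached a natural stopping point";
            case MAX_TOKENS:    return "requested max_tokens (or the model's maximum) was exceeded";
            case STOP_SEQUENCE: return "a custom stop_sequence was generated";
            case TOOL_USE:      return "model invoked one or more tools";
            default:            throw new IllegalArgumentException("unknown: " + reason);
        }
    }

    public static void main(String[] args) {
        System.out.println(describe(StopReason.TOOL_USE)); // prints "model invoked one or more tools"
    }
}
```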
-
stopReason
final Message.Builder stopReason(Optional<Message.StopReason> stopReason)
The reason that we stopped.
This may be one of the following values:

- "end_turn": the model reached a natural stopping point
- "max_tokens": we exceeded the requested max_tokens or the model's maximum
- "stop_sequence": one of your provided custom stop_sequences was generated
- "tool_use": the model invoked one or more tools

In non-streaming mode this value is always non-null. In streaming mode, it is null in the message_start event and non-null otherwise.
-
stopReason
final Message.Builder stopReason(JsonField<Message.StopReason> stopReason)
The reason that we stopped.
This may be one of the following values:

- "end_turn": the model reached a natural stopping point
- "max_tokens": we exceeded the requested max_tokens or the model's maximum
- "stop_sequence": one of your provided custom stop_sequences was generated
- "tool_use": the model invoked one or more tools

In non-streaming mode this value is always non-null. In streaming mode, it is null in the message_start event and non-null otherwise.
-
stopSequence
final Message.Builder stopSequence(String stopSequence)
Which custom stop sequence was generated, if any.
This value will be a non-null string if one of your custom stop sequences was generated.
-
stopSequence
final Message.Builder stopSequence(Optional<String> stopSequence)
Which custom stop sequence was generated, if any.
This value will be a non-null string if one of your custom stop sequences was generated.
-
stopSequence
final Message.Builder stopSequence(JsonField<String> stopSequence)
Which custom stop sequence was generated, if any.
This value will be a non-null string if one of your custom stop sequences was generated.
-
type
final Message.Builder type(JsonValue type)
Object type.
For Messages, this is always "message".
-
usage
final Message.Builder usage(Usage usage)
Billing and rate-limit usage.
Anthropic's API bills and rate-limits by token counts, as tokens represent the underlying cost to our systems.
Under the hood, the API transforms requests into a format suitable for the model. The model's output then goes through a parsing stage before becoming an API response. As a result, the token counts in usage will not match one-to-one with the exact visible content of an API request or response.

For example, output_tokens will be non-zero, even for an empty string response from Claude.

Total input tokens in a request is the summation of input_tokens, cache_creation_input_tokens, and cache_read_input_tokens.
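The input-token total described above is simple arithmetic over three Usage fields. A minimal sketch (the parameter names mirror the documented field names, but this helper class is a stand-in, not an SDK type):

```java
public class UsageTotals {
    // Total input tokens = input_tokens + cache_creation_input_tokens
    //                      + cache_read_input_tokens, per the usage docs.
    static long totalInputTokens(long inputTokens,
                                 long cacheCreationInputTokens,
                                 long cacheReadInputTokens) {
        return inputTokens + cacheCreationInputTokens + cacheReadInputTokens;
    }

    public static void main(String[] args) {
        // E.g. 120 fresh input tokens, 30 written to the cache,
        // 850 read back from the cache.
        System.out.println(totalInputTokens(120, 30, 850)); // prints 1000
    }
}
```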
-
usage
final Message.Builder usage(JsonField<Usage> usage)
Billing and rate-limit usage.
Anthropic's API bills and rate-limits by token counts, as tokens represent the underlying cost to our systems.
Under the hood, the API transforms requests into a format suitable for the model. The model's output then goes through a parsing stage before becoming an API response. As a result, the token counts in usage will not match one-to-one with the exact visible content of an API request or response.

For example, output_tokens will be non-zero, even for an empty string response from Claude.

Total input tokens in a request is the summation of input_tokens, cache_creation_input_tokens, and cache_read_input_tokens.
-
additionalProperties
final Message.Builder additionalProperties(Map<String, JsonValue> additionalProperties)
-
putAdditionalProperty
final Message.Builder putAdditionalProperty(String key, JsonValue value)
-
putAllAdditionalProperties
final Message.Builder putAllAdditionalProperties(Map<String, JsonValue> additionalProperties)
-
removeAdditionalProperty
final Message.Builder removeAdditionalProperty(String key)
-
removeAllAdditionalProperties
final Message.Builder removeAllAdditionalProperties(Set<String> keys)
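The additional-property methods above behave like operations on a mutable map of extra JSON fields. As a rough analogy in plain Java (a LinkedHashMap stands in for the builder's internal map, and String stands in for JsonValue; none of this is SDK code):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

public class AdditionalPropsDemo {
    static Map<String, String> demo() {
        Map<String, String> props = new LinkedHashMap<>();

        // putAdditionalProperty: add or overwrite a single key.
        props.put("trace_id", "abc123");

        // putAllAdditionalProperties: merge a whole map in.
        props.putAll(Map.of("experiment", "a", "shadow", "true"));

        // removeAdditionalProperty: drop one key.
        props.remove("shadow");

        // removeAllAdditionalProperties: drop a set of keys.
        props.keySet().removeAll(Set.of("experiment"));

        return props;
    }

    public static void main(String[] args) {
        System.out.println(demo()); // prints {trace_id=abc123}
    }
}
```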
-