
Anthropic

When using Anthropic models via AWS Bedrock, llm-exe will make POST requests to https://bedrock-runtime.us-west-2.amazonaws.com/model/{MODEL_ID}/invoke.
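The URL pattern above can be sketched as a template string; the region and model id below are example values (the statement above hardcodes us-west-2):

```typescript
// Sketch of the Bedrock invoke URL pattern described above.
// Region and model id are illustrative, not required values.
const region = "us-west-2";
const modelId = "anthropic.claude-sonnet-4-v2:0";
const invokeUrl = `https://bedrock-runtime.${region}.amazonaws.com/model/${modelId}/invoke`;
```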

Setup

Anthropic Chat

ts
import { useLlm } from "llm-exe";

const llm = useLlm("amazon:anthropic.chat.v1", {
  model: "anthropic.claude-sonnet-4-v2:0", // The model id from Bedrock
});

Anthropic-Specific Options

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| anthropicApiKey | string | undefined | API key for Anthropic. Can optionally be set via `process.env.ANTHROPIC_API_KEY`. |
| model | string | | The model to use. Must be specified when using anthropic.chat.v1. |
| temperature | number | undefined | Maps to `temperature`. See Anthropic Docs |
| maxTokens | number | 4096 | Maps to `max_tokens`. See Anthropic Docs |
| topP | number | undefined | Maps to `top_p`. See Anthropic Docs |
| topK | number | undefined | Maps to `top_k`. See Anthropic Docs |
| stopSequences | array | undefined | Maps to `stop_sequences`. See Anthropic Docs |
| stream | boolean | undefined | Note: Not supported yet. |
| metadata | object | undefined | Maps to `metadata`. See Anthropic Docs |
| serviceTier | string | undefined | Maps to `service_tier`. See Anthropic Docs |
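As the table notes, the camelCase options map to snake_case fields in the Anthropic request body. A minimal sketch of that mapping (illustrative only, not llm-exe's actual internals; the `anthropic_version` value is the one AWS documents for Anthropic invoke payloads on Bedrock):

```typescript
// Illustrative mapping from camelCase options to the snake_case
// fields of an Anthropic-on-Bedrock request body. Not llm-exe code.
interface AnthropicOptions {
  maxTokens?: number;
  temperature?: number;
  topP?: number;
  topK?: number;
  stopSequences?: string[];
}

function toRequestBody(opts: AnthropicOptions): Record<string, unknown> {
  const body: Record<string, unknown> = {
    // Bedrock requires this version field on Anthropic invoke payloads
    anthropic_version: "bedrock-2023-05-31",
    max_tokens: opts.maxTokens ?? 4096, // default from the table above
  };
  if (opts.temperature !== undefined) body.temperature = opts.temperature;
  if (opts.topP !== undefined) body.top_p = opts.topP;
  if (opts.topK !== undefined) body.top_k = opts.topK;
  if (opts.stopSequences !== undefined) body.stop_sequences = opts.stopSequences;
  return body;
}
```

Undefined options are omitted rather than sent as `null`, since the API treats absent and explicitly-null fields differently.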

Anthropic Docs: link