Anthropic

When using Anthropic models via AWS Bedrock, llm-exe will make POST requests to https://bedrock-runtime.us-west-2.amazonaws.com/model/{MODEL_ID}/invoke.
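As a minimal sketch, the invoke URL above can be assembled from a region and a model id. The values below are illustrative (taken from the URL shown), not llm-exe internals:

```ts
// Sketch: assembling the Bedrock invoke endpoint from its parts.
// Region and model id are example values, not llm-exe configuration.
const region = "us-west-2";
const modelId = "anthropic.claude-3-sonnet-20240229-v1:0";
const endpoint = `https://bedrock-runtime.${region}.amazonaws.com/model/${modelId}/invoke`;
```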

Setup

Anthropic Chat

```ts
const llm = useLlm("amazon:anthropic.chat.v1", {
  model: "anthropic.claude-3-sonnet-20240229-v1:0", // specify a model
});
```

Anthropic-Specific Options

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| anthropicApiKey | string | undefined | API key for Anthropic. Can optionally be set using process.env.ANTHROPIC_API_KEY. |
| model | string | claude-3-5-sonnet | The model to use (e.g. claude-3-5-sonnet). |
| temperature | number | 0 | Maps to temperature. See Anthropic Docs. |
| maxTokens | number | 500 | Maps to max_tokens. See Anthropic Docs. |
| topP | number | null | Maps to top_p. See Anthropic Docs. |
| stream | boolean | null | Note: not supported yet. |
| stop | ? | null | Maps to stop_sequences. See Anthropic Docs. |
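The table above pairs each camelCase option with the snake_case field it maps to in the Anthropic request body. A minimal sketch of that mapping, using the defaults from the table (the interface and function names are hypothetical, not llm-exe's actual implementation):

```ts
// Hypothetical option shape, mirroring the table above.
interface AnthropicChatOptions {
  model?: string;
  temperature?: number;
  maxTokens?: number;
  topP?: number;
  stop?: string[];
}

// Sketch: map camelCase options to snake_case request fields,
// applying the table's defaults where an option is omitted.
function toRequestBody(opts: AnthropicChatOptions): Record<string, unknown> {
  return {
    model: opts.model ?? "claude-3-5-sonnet",
    temperature: opts.temperature ?? 0,
    max_tokens: opts.maxTokens ?? 500,
    // null defaults: only include these fields when explicitly set.
    ...(opts.topP != null ? { top_p: opts.topP } : {}),
    ...(opts.stop != null ? { stop_sequences: opts.stop } : {}),
  };
}
```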

Anthropic Docs: link