Anthropic

When using Anthropic models, llm-exe will make POST requests to https://api.anthropic.com/v1/messages.

Setup

Anthropic Chat

```ts
import { useLlm } from "llm-exe";

const llm = useLlm("anthropic.chat.v1", {
  model: "claude-3-5-sonnet-20240620", // specify a model
});
```

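Once configured, the llm handle is typically wired into an executor together with a prompt and an output parser. The sketch below assumes llm-exe's createChatPrompt, createParser, and createLlmExecutor helpers (and the "anthropic.chat.v1" provider key used above); treat it as illustrative and confirm the exact signatures against the executor documentation.

```ts
import {
  useLlm,
  createChatPrompt,
  createParser,
  createLlmExecutor,
} from "llm-exe";

// Anthropic-backed LLM, configured as above.
const llm = useLlm("anthropic.chat.v1", {
  model: "claude-3-5-sonnet-20240620",
});

// Prompt with a replaceable {{topic}} variable and a user turn,
// since the Anthropic Messages API expects at least one user message.
const prompt = createChatPrompt<{ topic: string }>(
  "You are a concise assistant."
).addUserMessage("List three facts about {{topic}}.");

// Parse the raw completion into an array of strings.
const parser = createParser("listToArray");

// Tie llm + prompt + parser together and execute.
const executor = createLlmExecutor({ llm, prompt, parser });

const facts = await executor.execute({ topic: "the moon" });
console.log(facts);
```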
Anthropic-Specific Options

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| anthropicApiKey | string | undefined | API key for Anthropic. Can optionally be set using process.env.ANTHROPIC_API_KEY. |
| model | string | claude-3-5-sonnet | The model to use, e.g. claude-3-5-sonnet-20240620. |
| temperature | number | 0 | Maps to temperature. See Anthropic Docs. |
| maxTokens | number | 500 | Maps to max_tokens. See Anthropic Docs. |
| topP | number | null | Maps to top_p. See Anthropic Docs. |
| stream | boolean | null | Note: not supported yet. |
| stop | ? | null | Maps to stop_sequences. See Anthropic Docs. |
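Putting the options together, a more fully configured instance might look like the sketch below. The values are illustrative, the provider key repeats the one assumed above, and stop is shown as an array on the assumption that it maps directly to Anthropic's stop_sequences list.

```ts
import { useLlm } from "llm-exe";

const llm = useLlm("anthropic.chat.v1", {
  // Omit anthropicApiKey to fall back to process.env.ANTHROPIC_API_KEY;
  // never commit a real key to source control.
  anthropicApiKey: "sk-ant-...",
  model: "claude-3-5-sonnet-20240620",
  temperature: 0,   // maps to temperature
  maxTokens: 500,   // maps to max_tokens
  topP: 1,          // maps to top_p
  stop: ["END"],    // maps to stop_sequences
});
```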

Anthropic Docs: https://docs.anthropic.com/en/api/messages