Anthropic

When using Anthropic models, llm-exe will make POST requests to https://api.anthropic.com/v1/messages.
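As a sketch, the JSON body of such a request follows the Anthropic Messages API shape. The field names below come from Anthropic's public API; the exact payload llm-exe assembles from your options may differ:

```typescript
// Minimal sketch of an Anthropic Messages API request body.
// llm-exe builds something equivalent from the options you
// pass to useLlm and POSTs it to /v1/messages.
const requestBody = {
  model: "claude-3-5-sonnet-20240620",
  max_tokens: 500,
  messages: [{ role: "user", content: "Hello, Claude" }],
};

const json = JSON.stringify(requestBody);
```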

Setup

Anthropic Chat

```ts
const llm = useLlm("anthropic.chat.v1", {
  model: "claude-3-5-sonnet-20240620", // specify a model
});
```

Anthropic Chat By Model

```ts
const llm = useLlm("anthropic.claude-3-5-sonnet", {
  // other options,
  // no model needed, using claude-3-5-sonnet
});
```

NOTE

You can use the following models using this shorthand:

  • anthropic.claude-3-5-sonnet
  • anthropic.claude-3-opus
  • anthropic.claude-3-sonnet
  • anthropic.claude-3-haiku

Authentication

To authenticate, you need to provide an Anthropic API key. You can provide the key in several ways, depending on your use case:

  • Pass in as execute options using anthropicApiKey
  • Pass in as setup options using anthropicApiKey
  • Use a default key by setting an environment variable of ANTHROPIC_API_KEY
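The precedence implied by the list above can be sketched as follows, assuming execute options take priority over setup options, which take priority over the environment variable. Note that `resolveApiKey` is a hypothetical helper written for illustration, not part of llm-exe:

```typescript
// Hypothetical helper illustrating key-resolution precedence:
// execute options first, then setup options, then the environment.
function resolveApiKey(
  executeKey?: string,
  setupKey?: string
): string | undefined {
  return executeKey ?? setupKey ?? process.env.ANTHROPIC_API_KEY;
}
```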

Basic Usage

Generally you pass the LLM instance to an LLM Executor and call that. However, it is possible to interact with the LLM object directly if you want.

```ts
// given an array of chat messages, calls chat completion
await llm.call([]);

// given a string prompt, calls completion
await llm.call("");
```

Anthropic-Specific Options

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| `anthropicApiKey` | string | undefined | API key for Anthropic. Optionally can be set using the `ANTHROPIC_API_KEY` environment variable. |
| `model` | string | claude-3-5-sonnet | The model to use. Can be any one of: claude-3-5-sonnet, etc. |
| `temperature` | number | 0 | Maps to `temperature`. See Anthropic Docs. |
| `maxTokens` | number | 500 | Maps to `max_tokens`. See Anthropic Docs. |
| `topP` | number | null | Maps to `top_p`. See Anthropic Docs. |
| `stream` | boolean | null | Note: Not supported yet. |
| `stop` | ? | null | Maps to `stop_sequences`. See Anthropic Docs. |
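The option names above are camelCase wrappers over Anthropic's snake_case request fields. That mapping can be sketched as follows; `toApiParams` is a hypothetical helper for illustration, not llm-exe's actual internals:

```typescript
// Hypothetical mapping from llm-exe style options to
// Anthropic Messages API request fields, using the
// defaults listed in the table above.
interface AnthropicChatOptions {
  temperature?: number;
  maxTokens?: number;
  topP?: number;
  stop?: string[];
}

function toApiParams(opts: AnthropicChatOptions) {
  return {
    temperature: opts.temperature ?? 0,
    max_tokens: opts.maxTokens ?? 500,
    top_p: opts.topP,
    stop_sequences: opts.stop,
  };
}
```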
