# Anthropic

When using Anthropic models, llm-exe will make POST requests to https://api.anthropic.com/v1/messages.
## Setup

### Anthropic Chat

```ts
const llm = useLlm("anthropic.chat.v1", {
  model: "claude-3-5-sonnet-20240620", // specify a model
});
```
### Anthropic Chat By Model

```ts
const llm = useLlm("anthropic.claude-3-5-sonnet", {
  // other options,
  // no model needed, using claude-3-5-sonnet
});
```
NOTE
You can use the following models with this shorthand:
- anthropic.claude-3-5-sonnet
- anthropic.claude-3-opus
- anthropic.claude-3-sonnet
- anthropic.claude-3-haiku
## Authentication

To authenticate, you need to provide an Anthropic API key. You can provide the key in several ways, depending on your use case:
- Pass it in as an execute option using anthropicApiKey
- Pass it in as a setup option using anthropicApiKey
- Set a default key using the ANTHROPIC_API_KEY environment variable
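The three sources above can be pictured as a simple fallback chain. This is an illustrative sketch, not llm-exe's internal code, and the exact precedence shown (execute options, then setup options, then the environment variable) is an assumption:

```typescript
// Hypothetical sketch of API-key resolution. The ordering shown
// (execute options -> setup options -> ANTHROPIC_API_KEY) is an
// assumption, not llm-exe's documented behavior.
interface KeyOptions {
  anthropicApiKey?: string;
}

function resolveAnthropicApiKey(
  executeOptions: KeyOptions = {},
  setupOptions: KeyOptions = {}
): string {
  const key =
    executeOptions.anthropicApiKey ??
    setupOptions.anthropicApiKey ??
    process.env.ANTHROPIC_API_KEY;
  if (!key) {
    throw new Error("Missing Anthropic API key");
  }
  return key;
}
```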
## Basic Usage

Generally you pass the LLM instance off to an LLM Executor and call that. However, you can also interact with the LLM object directly:

```ts
// given an array of chat messages, calls chat completion
await llm.call([]);

// given a string prompt, calls completion
await llm.call("");
```
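One way to picture the difference between the two call forms: a string prompt is equivalent to a messages array containing a single user message. The helper below is a hypothetical illustration of that equivalence, not part of llm-exe's API:

```typescript
// Hypothetical illustration (not llm-exe's API): a string prompt
// can be treated as a single user message in a chat messages array.
type ChatMessage = { role: "user" | "assistant" | "system"; content: string };

function toMessages(input: string | ChatMessage[]): ChatMessage[] {
  if (typeof input === "string") {
    return [{ role: "user", content: input }];
  }
  return input;
}
```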
## Anthropic-Specific Options

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| anthropicApiKey | string | undefined | API key for Anthropic. Optionally can be set using process.env.ANTHROPIC_API_KEY |
| model | string | claude-3-5-sonnet | The model to use. Can be any one of: claude-3-5-sonnet, etc. |
| temperature | number | 0 | Maps to temperature. See Anthropic Docs |
| maxTokens | number | 500 | Maps to max_tokens. See Anthropic Docs |
| topP | number | null | Maps to top_p. See Anthropic Docs |
| stream | boolean | null | Note: Not supported yet. |
| stop | ? | null | Maps to stop_sequences. See Anthropic Docs |
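The "Maps to" column describes a rename from llm-exe's camelCase options to the snake_case fields of Anthropic's /v1/messages request body. A hypothetical sketch of that mapping, using the defaults from the table (the field names on the right come from Anthropic's API; the helper itself and its structure are assumptions, not llm-exe internals):

```typescript
// Hypothetical sketch of the option -> request-body mapping described
// in the table above. Field names (max_tokens, top_p, stop_sequences)
// are Anthropic's; this helper is illustrative, not llm-exe code.
interface AnthropicOptions {
  model?: string;
  temperature?: number;
  maxTokens?: number;
  topP?: number;
  stop?: string[];
}

function toRequestBody(opts: AnthropicOptions) {
  return {
    model: opts.model ?? "claude-3-5-sonnet-20240620",
    temperature: opts.temperature ?? 0,
    max_tokens: opts.maxTokens ?? 500,
    ...(opts.topP !== undefined ? { top_p: opts.topP } : {}),
    ...(opts.stop !== undefined ? { stop_sequences: opts.stop } : {}),
  };
}
```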
Anthropic Docs: https://docs.anthropic.com/en/api/messages