Anthropic
When using Anthropic models via AWS Bedrock, llm-exe will make POST requests to https://bedrock-runtime.us-west-2.amazonaws.com/model/{MODEL_ID}/invoke.
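As a sketch, the invoke endpoint can be assembled from the region and model id. The helper below is illustrative only (it is not part of llm-exe, and authentication via AWS SigV4 is omitted):

```typescript
// Illustrative helper (not part of llm-exe): builds the Bedrock
// invoke endpoint URL for a given region and model id.
function buildInvokeUrl(region: string, modelId: string): string {
  return `https://bedrock-runtime.${region}.amazonaws.com/model/${modelId}/invoke`;
}

// Example: the endpoint used for a Claude model in us-west-2.
const url = buildInvokeUrl("us-west-2", "anthropic.claude-sonnet-4-v2:0");
// → https://bedrock-runtime.us-west-2.amazonaws.com/model/anthropic.claude-sonnet-4-v2:0/invoke
```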
Setup
Anthropic Chat
```ts
const llm = useLlm("amazon:anthropic.chat.v1", {
  model: "anthropic.claude-sonnet-4-v2:0", // This is the model id from Bedrock
});
```

Anthropic-Specific Options
| Option | Type | Default | Description |
|---|---|---|---|
| anthropicApiKey | string | undefined | API key for Anthropic. Can optionally be set via process.env.ANTHROPIC_API_KEY |
| model | string | — | The model to use. Must be specified when using anthropic.chat.v1. |
| temperature | number | undefined | Maps to temperature. See Anthropic Docs |
| maxTokens | number | 4096 | Maps to max_tokens. See Anthropic Docs |
| topP | number | undefined | Maps to top_p. See Anthropic Docs |
| topK | number | undefined | Maps to top_k. See Anthropic Docs |
| stopSequences | array | undefined | Maps to stop_sequences. See Anthropic Docs |
| stream | boolean | undefined | Note: Not supported yet. |
| metadata | object | undefined | Maps to metadata. See Anthropic Docs |
| serviceTier | string | undefined | Maps to service_tier. See Anthropic Docs |
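The camelCase options above map onto snake_case fields in the Anthropic request body. A minimal sketch of that mapping, under the assumption that unset options are simply omitted (the helper and interface names here are illustrative, not llm-exe's actual implementation):

```typescript
// Illustrative sketch (not llm-exe source): maps the camelCase options
// from the table to the snake_case fields Anthropic's API expects.
interface AnthropicOptions {
  model: string;
  temperature?: number;
  maxTokens?: number;
  topP?: number;
  topK?: number;
  stopSequences?: string[];
  metadata?: Record<string, unknown>;
  serviceTier?: string;
}

function toRequestBody(opts: AnthropicOptions): Record<string, unknown> {
  const body: Record<string, unknown> = {
    model: opts.model,
    // maxTokens defaults to 4096 when not provided (per the table above).
    max_tokens: opts.maxTokens ?? 4096,
  };
  if (opts.temperature !== undefined) body.temperature = opts.temperature;
  if (opts.topP !== undefined) body.top_p = opts.topP;
  if (opts.topK !== undefined) body.top_k = opts.topK;
  if (opts.stopSequences !== undefined) body.stop_sequences = opts.stopSequences;
  if (opts.metadata !== undefined) body.metadata = opts.metadata;
  if (opts.serviceTier !== undefined) body.service_tier = opts.serviceTier;
  return body;
}
```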
Anthropic Docs: link
