xAI

When using xAI models, llm-exe makes POST requests to https://api.x.ai/v1/chat/completions. All models are supported: pass xai.chat.v1 as the first argument, then specify a model in the options.

Basic Usage

xAI Chat

```ts
const llm = useLlm("xai.chat.v1", {
  model: "grok-2", // specify a model
});
```

xAI Chat By Model

```ts
const llm = useLlm("xai.grok-2", {
  // other options;
  // no model needed, using grok-2
});
```
INFO
You can use the following models using this shorthand:
  • xai.grok-2

Authentication

To authenticate, you need to provide an xAI API Key. You can provide the API key in various ways, depending on your use case.

  1. Pass in as execute options using xAiApiKey
  2. Pass in as setup options using xAiApiKey
  3. Use a default key by setting an environment variable of XAI_API_KEY
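The precedence implied by this list (execute options first, then setup options, then the environment variable) can be sketched as a small helper. `resolveXaiApiKey` is hypothetical and not part of llm-exe; it only illustrates the lookup order.

```ts
// Hypothetical sketch of the key-resolution order described above:
// execute options take precedence over setup options, which take
// precedence over the XAI_API_KEY environment variable.
function resolveXaiApiKey(
  executeOptions?: { xAiApiKey?: string },
  setupOptions?: { xAiApiKey?: string }
): string | undefined {
  return (
    executeOptions?.xAiApiKey ??
    setupOptions?.xAiApiKey ??
    process.env.XAI_API_KEY
  );
}
```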

Generally you pass the LLM instance off to an LLM Executor and call that. However, it is possible to interact with the LLM object directly if you want.

```ts
// given an array of chat messages, calls the chat completion endpoint
await llm.chat([]);
```
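Since xAI's endpoint is OpenAI-compatible, the messages you pass follow the familiar role/content shape. A minimal sketch of building such an array (the `ChatMessage` type here is illustrative, not imported from llm-exe):

```ts
// Illustrative message shape for an OpenAI-compatible chat endpoint.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

const messages: ChatMessage[] = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Hello, Grok." },
];

// await llm.chat(messages); // performs the POST described above
```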

xAI-Specific Options

In addition to the generic options, the following options are xAI-specific and can be passed in when creating an llm instance.

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| model | string | gpt-4o-mini | The model to use. Can be any valid chat model. See xAI Docs |
| xAiApiKey | string | undefined | API key for xAI. |
| temperature | number | undefined | Maps to temperature.\* |
| maxTokens | number | undefined | Maps to max_tokens. See xAI Docs |
| topP | number | undefined | Maps to top_p. See xAI Docs |
| n | number | undefined | Maps to n. See xAI Docs |
| stream | boolean | undefined | Maps to stream. See xAI Docs. Note: not supported yet. |
| stop | string \| string[] | undefined | Maps to stop. See xAI Docs |
| presencePenalty | number | undefined | Maps to presence_penalty. See xAI Docs |
| frequencyPenalty | number | undefined | Maps to frequency_penalty. See xAI Docs |
| logitBias | object | undefined | Maps to logit_bias. See xAI Docs |
| user | string | undefined | Maps to user. See xAI Docs |

\* For temperature, see the xAI Docs.
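The camelCase option names in the table map onto snake_case fields in the POST body sent to /v1/chat/completions. A hypothetical sketch of that mapping (not llm-exe's actual implementation; it only mirrors the "Maps to" column above):

```ts
// Hypothetical: the xAI-specific options described above.
interface XaiOptions {
  model?: string;
  temperature?: number;
  maxTokens?: number;
  topP?: number;
  n?: number;
  stream?: boolean;
  stop?: string | string[];
  presencePenalty?: number;
  frequencyPenalty?: number;
  logitBias?: Record<string, number>;
  user?: string;
}

// Sketch of how each camelCase option maps onto a snake_case
// request-body field; unset options are omitted from the body.
function toRequestBody(options: XaiOptions): Record<string, unknown> {
  const body: Record<string, unknown> = {};
  if (options.model !== undefined) body.model = options.model;
  if (options.temperature !== undefined) body.temperature = options.temperature;
  if (options.maxTokens !== undefined) body.max_tokens = options.maxTokens;
  if (options.topP !== undefined) body.top_p = options.topP;
  if (options.n !== undefined) body.n = options.n;
  if (options.stream !== undefined) body.stream = options.stream;
  if (options.stop !== undefined) body.stop = options.stop;
  if (options.presencePenalty !== undefined) body.presence_penalty = options.presencePenalty;
  if (options.frequencyPenalty !== undefined) body.frequency_penalty = options.frequencyPenalty;
  if (options.logitBias !== undefined) body.logit_bias = options.logitBias;
  if (options.user !== undefined) body.user = options.user;
  return body;
}
```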