
Hello World

Here is a simple example of an LLM executor.

Step 2 - Prepare Prompt

```ts
export const PROMPT = `We are conducting a test, follow the instructions exactly.

Do not ask questions or make conversation.

The user will provide an input, you need to reply only with:

"Hello World, you said <and then insert here what they said>".

So for example, if they say "Hello", you should reply only with:

Hello World, you said Hello.`;
```
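The prompt above defines a simple contract: the model should echo the input inside a fixed template. As a sketch (the `expectedReply` helper below is hypothetical and not part of llm-exe), the mapping the prompt asks for can be written as a pure function:

```ts
// Hypothetical helper, not part of llm-exe.
// It expresses the input → output contract the prompt instructs the model to follow.
export function expectedReply(input: string): string {
  return `Hello World, you said ${input}.`;
}

// expectedReply("Hello") → "Hello World, you said Hello."
```

This is useful when writing tests for the executor, since the prompt's expected behavior is deterministic.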

Step 3 - Create LLM Executor

Combine the prompt, LLM, and parser into a single function.

```ts
export async function helloWorld(input: string) {
  const llm = useLlm("openai.gpt-4o-mini");
  const prompt = createPrompt("chat", PROMPT);
  const parser = createParser("string");

  prompt.addUserMessage(input);

  return createLlmExecutor({
    name: "extract",
    llm,
    prompt,
    parser,
  }).execute({});
}
```
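Conceptually, an executor wires three pieces into one pipeline: format the prompt into messages, call the model, then parse the raw response. The sketch below illustrates that flow with a stubbed model call; none of these names come from llm-exe and the library's internals may differ.

```ts
// Conceptual sketch of the prompt → model → parser pipeline that an
// LLM executor wires together. All names here are illustrative, not llm-exe APIs.
type ChatMessage = { role: string; content: string };
type Llm = (messages: ChatMessage[]) => Promise<string>;

async function runExecutor(
  llm: Llm,
  systemPrompt: string,
  userInput: string,
  parse: (raw: string) => string
): Promise<string> {
  // 1. The prompt formats system instructions plus the user message.
  const messages: ChatMessage[] = [
    { role: "system", content: systemPrompt },
    { role: "user", content: userInput },
  ];
  // 2. The LLM is called with the formatted messages.
  const raw = await llm(messages);
  // 3. The parser normalizes the raw model output.
  return parse(raw);
}

// Stubbed model that happens to follow the prompt's contract, for demonstration.
const stubLlm: Llm = async (messages) => {
  const user = messages.find((m) => m.role === "user")!;
  return `Hello World, you said ${user.content}.`;
};
```

With the stub in place, `runExecutor(stubLlm, PROMPT, "Hello", (s) => s.trim())` resolves to `"Hello World, you said Hello."`, mirroring what the real executor returns once a live model is attached.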

Step 4 - Use it!

```ts
import { helloWorld } from "<your-file-path>";

const response = await helloWorld("Hello");

console.log(response);
/**
 * Hello World, you said Hello.
 */
```

Complete File

```ts
import { useLlm, createPrompt, createParser, createLlmExecutor } from "llm-exe";

export const PROMPT = `We are conducting a test, follow the instructions exactly.

Do not ask questions or make conversation.

The user will provide an input, you need to reply only with:

"Hello World, you said <and then insert here what they said>".

So for example, if they say "Hello", you should reply only with:

Hello World, you said Hello.`;

export async function helloWorld(input: string) {
  const llm = useLlm("openai.gpt-4o-mini");
  const prompt = createPrompt("chat", PROMPT);
  const parser = createParser("string");

  prompt.addUserMessage(input);

  return createLlmExecutor({
    name: "extract",
    llm,
    prompt,
    parser,
  }).execute({});
}
```