Extract

In this example, we will create a function that extracts structured information from the user's most recent message in the conversation.

This can be useful as:

  • First step in a pipeline, gathering the structured details needed before taking action

This takes advantage of an output parser configured with a JSON schema, which not only enforces the output format but also turns the model's reply into a structured object you can use directly.

Step 1 - Prepare Types

interface ExtractInformationInput {
  chatHistory: IChatMessages;
  mostRecentMessage: string;
}
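
For reference, here is a minimal sketch of an input matching this interface (the messages are placeholder values):

const input: ExtractInformationInput = {
  // prior conversation, used as context for the extraction
  chatHistory: [
    { role: "user", content: "Hi, I'd like to book a hotel." },
    { role: "assistant", content: "Sure, where are you headed?" },
  ],
  // the message we want to extract information from
  mostRecentMessage: "I'm going to be in berlin",
};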

Step 2 - Prepare Prompt

export const PROMPT = `# Instructions: I need you to identify and extract the following information from the context and conversation. Reply with only this information, formatted as valid JSON. Do not carry on a conversation. Make sure you read through the context and work step-by-step to make sure you identify accurate information. If you do not know the value, use the default value.

Your response must EXACTLY follow the JSON Schema specified below:

{{>JsonSchema key='schema'}}

# Context
Today is {{context_formatCurrentDate}}`;

export const INSTRUCT = `Respond with:
{{>JsonSchemaExampleJson key='schema'}}`;
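
Both partials are filled in from the schema value supplied at execution time (see Step 3): JsonSchema renders the JSON Schema itself, while JsonSchemaExampleJson renders an example response derived from it. As a rough illustration (the exact rendering is up to the library's partials), with the hotel schema used in Step 4 the INSTRUCT message might come out along the lines of:

Respond with:
{
  "city": "unknown",
  "startDate": "unknown",
  "endDate": "unknown"
}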

Step 3 - Create LLM Executor

Combine the prompt, LLM, and parser into a single function.

export async function extractInformation<
  S extends JSONSchema,
  I extends ExtractInformationInput
>(llm: BaseLlm, input: I, schema: S) {
  const prompt = createPrompt<I>("chat", PROMPT)
    .addChatHistoryPlaceholder("chatHistory")
    .addMessagePlaceholder(`{{mostRecentMessage}}`)
    .addSystemMessage(INSTRUCT);

  const parser = createParser("json", { schema });

  return createLlmExecutor({
    name: "extract",
    llm,
    prompt,
    parser,
  }).execute(Object.assign(input, { schema }));
}
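
Note that the schema is merged into the input with Object.assign before execution, so the JsonSchema and JsonSchemaExampleJson partials in the prompt have access to it at render time.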

Step 4 - Use it!

import { extractInformation } from "./somewhere"

// an LLM instance, created elsewhere in your app
declare const llm: BaseLlm;

const schema = defineSchema({
  type: "object",
  properties: {
    city: {
      type: "string",
      description: "what city does the user want to book a hotel in",
      default: "unknown",
    },
    startDate: {
      type: "string",
      description: "the date the user would like to start their stay",
      default: "unknown",
    },
    endDate: {
      type: "string",
      description: "the date the user would like to end their stay",
      default: "unknown",
    },
  },
  required: ["city", "startDate", "endDate"],
  additionalProperties: false,
});

// a chat history, loaded from somewhere
const chatHistory: IChatMessages = [];

const response = await extractInformation(llm, {
  // the most recent message, received from somewhere
  mostRecentMessage: "I'm going to be in berlin",
  chatHistory,
}, schema);

/**
 * console.log(response)
 * {
 *   "city": "Berlin",
 *   "startDate": "unknown",
 *   "endDate": "unknown"
 * }
 **/

// the extraction focuses on the most recent message,
// while the chat history provides context
chatHistory.push({
  role: "user",
  content: "I'm going to be in berlin",
});

const response2 = await extractInformation(llm, {
  mostRecentMessage: "I get there the 14th and leave the 18th",
  chatHistory,
}, schema);

/**
 * console.log(response2)
 * {
 *   "city": "Berlin",
 *   "startDate": "06/14/2023",
 *   "endDate": "06/18/2023"
 * }
 **/
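
Because missing values come back as the default "unknown", a downstream step can check what still needs to be collected before moving on. Here is a minimal sketch (the missingFields helper is hypothetical, not part of the library):

function missingFields(result: Record<string, unknown>): string[] {
  // collect every key whose value is still the "unknown" default
  return Object.entries(result)
    .filter(([, value]) => value === "unknown")
    .map(([key]) => key);
}

const stillNeeded = missingFields(response);
if (stillNeeded.length > 0) {
  // after the first message this would be ["startDate", "endDate"]
  console.log(`Ask the user about: ${stillNeeded.join(", ")}`);
}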

Complete File
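
// Note: createPrompt, createParser, createLlmExecutor and the BaseLlm,
// IChatMessages, and JSONSchema types used below are provided by the
// library and should be imported at the top of this file.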

interface ExtractInformationInput {
  chatHistory: IChatMessages;
  mostRecentMessage: string;
}

export const PROMPT = `# Instructions: I need you to identify and extract the following information from the context and conversation. Reply with only this information, formatted as valid JSON. Do not carry on a conversation. Make sure you read through the context and work step-by-step to make sure you identify accurate information. If you do not know the value, use the default value.

Your response must EXACTLY follow the JSON Schema specified below:

{{>JsonSchema key='schema'}}

# Context
Today is {{context_formatCurrentDate}}`;

export const INSTRUCT = `Respond with:
{{>JsonSchemaExampleJson key='schema'}}`;

export async function extractInformation<
  S extends JSONSchema,
  I extends ExtractInformationInput
>(llm: BaseLlm, input: I, schema: S) {
  const prompt = createPrompt<I>("chat", PROMPT)
    .addChatHistoryPlaceholder("chatHistory")
    .addMessagePlaceholder(`{{mostRecentMessage}}`)
    .addSystemMessage(INSTRUCT);

  const parser = createParser("json", { schema });

  return createLlmExecutor({
    name: "extract",
    llm,
    prompt,
    parser,
  }).execute(Object.assign(input, { schema }));
}