Fixing the ‘Missing value for input variable chat_history’ error in LangChain JS

While working on the examples for the šŸ“˜ LangChain for JavaScript Developers book, I encountered the following error:

Error: Missing value for input variable `chat_history`
    at eval (webpack-internal:///(rsc)/./node_modules/@langchain/core/dist/prompts/chat.js:598:31)
    at Array.reduce (<anonymous>)

It turned out that the cause of this `Missing value for input variable chat_history` error was the way I was invoking a chain based on the ChatPromptTemplate class.

LangChain has multiple types of chains we can use. Each of them is useful for different situations.

Given its scope, if you use a ChatPromptTemplate that expects a chat history as the starting link of your chain, you must provide the chat_history parameter when calling the chain's invoke() method:

await chain.invoke({
    input: question,
    // failing to add this will cause the error
    // Missing value for input variable `chat_history`
    chat_history: myChatHistory
})
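
The requirement comes from the prompt itself rather than from invoke(): if the ChatPromptTemplate contains a MessagesPlaceholder named chat_history, that name becomes a required input variable of the chain. Here is a minimal sketch of such a prompt (the full example further down uses the same pattern):

import { ChatPromptTemplate, MessagesPlaceholder } from "@langchain/core/prompts"

// every MessagesPlaceholder name and every {variable} in the template
// becomes a required key of the object passed to invoke()
const prompt = ChatPromptTemplate.fromMessages([
    new MessagesPlaceholder("chat_history"), // expects an array of messages
    ["human", "{input}"]                     // expects a string
])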

If you don't have a chat history at that point, you still need to provide at least an empty array as the starting container:

const history = []
await chain.invoke({
    input: question,
    chat_history: history
})

You can update this array later as the conversation progresses.
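
For example, after each exchange you can push the user's message and the model's reply into the array so the next invoke() call sees the conversation so far (a quick sketch, assuming the chain from above and HumanMessage/AIMessage from @langchain/core/messages):

history.push(new HumanMessage(question))
const answer = await chain.invoke({ input: question, chat_history: history })
history.push(new AIMessage(answer))
// `history` now carries the previous turn into the next call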

Below is a full working code snippet that uses the ChatPromptTemplate with chat_history:

import { ChatOpenAI } from "@langchain/openai"
import { ChatPromptTemplate, MessagesPlaceholder } from "@langchain/core/prompts"
import { HumanMessage, AIMessage } from "@langchain/core/messages"
import { StringOutputParser } from "@langchain/core/output_parsers"

const model = new ChatOpenAI({
    openAIApiKey: process.env.OPENAI_API_KEY, 
    modelName: "gpt-3.5-turbo",
})

const outputParser = new StringOutputParser()

const chatHistory = [
    new HumanMessage(`You are a helpful assistant`)
]

const prompt = ChatPromptTemplate.fromMessages([
    new MessagesPlaceholder("chat_history"),
    ["human", "{input}"]
])

const chain = prompt.pipe(model).pipe(outputParser)

const input = `Tell me a fun fact about cats.`
chatHistory.push(new HumanMessage(input))
const response = await chain.invoke({
    input,
    chat_history: chatHistory
})
chatHistory.push(new AIMessage(response))
console.log(response)
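
Because chatHistory now holds both the question and the answer, the same chain can be invoked again with a follow-up that relies on that context (the follow-up text below is just an illustration):

const followUp = `Tell me another one.`
chatHistory.push(new HumanMessage(followUp))
const followUpResponse = await chain.invoke({
    input: followUp,
    chat_history: chatHistory
})
chatHistory.push(new AIMessage(followUpResponse))
console.log(followUpResponse)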
