📕 LangChain for JavaScript developers is now out!
Build 5 Next.js apps with ChatGPT integrated features using LangChain.

StructuredOutputParser with Zod schema in LangChain (JavaScript version)

The StructuredOutputParser from LangChain.js is perhaps the most versatile output parser we can use. It can return multiple fields and, if we want complex structures such as a JSON object containing arrays of strings, we can combine it with a Zod schema.

Large Language Models like structure! Their output is much more accurate when we define a clear structure for both the input and the output. More info about this here.

While the CommaSeparatedListOutputParser or the StringOutputParser are great tools for simpler, direct responses, when we have to deal with more complex outputs we will need to reach for other types of parsers, such as the StructuredOutputParser or the JsonOutputParser.
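Under the hood, a structured parser essentially instructs the model to wrap a JSON object in its reply, then extracts and parses that object. Here is a simplified plain-JavaScript sketch of the idea (an illustration only, not LangChain's actual implementation):

```javascript
// Simplified sketch: extract a JSON object from an LLM reply
// that may wrap it in a markdown code fence.
function parseStructuredReply(text) {
    // grab the content between ``` fences if present, otherwise use the raw text
    const match = text.match(/```(?:json)?\s*([\s\S]*?)```/)
    const raw = match ? match[1] : text
    return JSON.parse(raw)
}

const reply = '```json\n{"title": "Inception", "year": 2010}\n```'
console.log(parseStructuredReply(reply)) // { title: 'Inception', year: 2010 }
```

The real parser also generates the format instructions that tell the model which fields to produce, which is where the Zod schema comes in below.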

Let's say we want to build the below app:
StructuredOutputParser with Zod schema in LangChain.js example app

Each time the user presses the button the LLM will provide us with a movie suggestion.

If you want to skip ahead you can see the full code of the example here.

Installing LangChain and getting an OpenAI key

We will use Next.js and React to build this app. You can use the create-next-app tool to scaffold the app.

The first thing we need to do is install the required libraries. For this we will need to add LangChain to our Next.js project:

npm install -S langchain @langchain/openai @langchain/core zod

You will need an OpenAI API key, which you can get from here.

Once you have the key, create a .env file in the root of the project and add the following:

// 💾 file: 02-structured-output/.env

OPENAI_API_KEY='my_key_here'

Using the StructuredOutputParser with the Zod Schema to define the structure of the response

All the magic happens in the backend part of the app. In our case, this is the src/app/api/route.js file.

We will create an OpenAI model and add to the chain a StructuredOutputParser containing all the information the model needs about the structure of the output:

// 💾 file: 02-structured-output/src/app/api/route.js

import { ChatOpenAI } from "@langchain/openai"
import { PromptTemplate } from "@langchain/core/prompts"
import { RunnableSequence } from "@langchain/core/runnables"
// adding the StructuredOutputParser
import { StructuredOutputParser } from "langchain/output_parsers"
// we will use a Zod schema to define the types of the returned data
import { z } from "zod"

const model = new ChatOpenAI({
    openAIApiKey: process.env.OPENAI_API_KEY,
    temperature: 0.9
})

//🟠 Zod is used to define whether a field is a string, a number, or an array
const parser = StructuredOutputParser.fromZodSchema(
    z.object({
        title: z.string().describe(
            `tell me a good movie to watch`
        ),
        actors: z
            .array(z.string())
            .describe(`give the main actors`),
        year: z.number().describe(
            `the year the movie was launched`
        )
    })
)

const chain = RunnableSequence.from([
    PromptTemplate.fromTemplate(
        `Answer the user's question as best as possible.\n
        {format_instructions}`
    ),
    model,
    parser,
])

//using the StructuredOutputParser we can now wrap all the
//data into one single structure
const makeQuestionAndAnswers = async () => {
    return await chain.invoke({
        format_instructions: parser.getFormatInstructions(),
    })
}

export async function GET() {
    const data = await makeQuestionAndAnswers()
    return Response.json({data})
}

Even though Zod is a TypeScript-first schema validation library, it's very easy to use directly with plain old JavaScript.
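To illustrate, the shape that the Zod schema above enforces could be hand-checked in plain JavaScript like this (a sketch of the contract only, not how Zod works internally):

```javascript
// Illustration only: a hand-rolled check roughly equivalent to the
// Zod schema above — title is a string, actors an array of strings,
// year a number.
function isMovie(obj) {
    return (
        typeof obj.title === "string" &&
        Array.isArray(obj.actors) &&
        obj.actors.every((a) => typeof a === "string") &&
        typeof obj.year === "number"
    )
}

console.log(isMovie({ title: "Heat", actors: ["Al Pacino"], year: 1995 })) // true
console.log(isMovie({ title: "Heat", year: 1995 })) // false (actors missing)
```

Zod gives us this kind of validation declaratively, plus the human-readable field descriptions that end up in the prompt.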

This is an example of what we will get if we print the data object:
StructuredOutputParser with Zod schema object output
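In case the screenshot doesn't load, the data object looks something like this (the values here are just an illustration; the LLM will pick its own movie):

```
{
  "title": "The Shawshank Redemption",
  "actors": ["Tim Robbins", "Morgan Freeman"],
  "year": 1994
}
```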

One of the cool things about working with LangChain is that this will work out of the box even if we change the model from ChatGPT to something like Llama.

Setting up the frontend

Once we have all of this in place we can just call the endpoint from the frontend and display the result:

// 💾 file: 02-structured-output/src/app/page.js
'use client'
import { useState } from "react"
export default function Home() {
  const [movie, setMovie] = useState()

  const getMovieSuggestion = async ()=> {
    const response = await fetch('/api')
    const { data } = await response.json()
    setMovie(data)
  }

  return (<>
    <h1>🍿 Movie ideas</h1>
    <button onClick={getMovieSuggestion}>
      Give me a suggestion for a movie
    </button>
    {movie && <table><tbody>
      <tr>
        <td>🎥 Movie title</td>
        <td>{movie.title}</td>
      </tr>
      <tr>
        <td>📅 Year</td>
        <td>{movie.year}</td>
      </tr>
      <tr>
        <td>🦹 Actors</td>
        <td>{movie.actors.join(', ')}</td>
      </tr>
      </tbody></table>}
  </>)
}

And that's it! We now have a nice little Next.js app that uses the StructuredOutputParser from LangChain.js to define the structure of the output.

Remember that you can see the full code here.

📖 50 Javascript, React and NextJs Projects

Learn by doing with this FREE ebook! Not sure what to build? Dive in with 50 projects with project briefs and wireframes! Choose from 8 project categories and get started right away.
