"Structured Output" with Ollama and LangChainJS [the return]
The LangChainJS Team is Fantastic ✨, and my previous article is already obsolete
👋 Updated on Tue 11 Feb for @langchain/ollama@0.2.0 (see the update at the end)
In my previous article, Actually Benefiting from Structured Output Support with Ollama and LangChainJS, I explained that the `withStructuredOutput`
method wasn't working with the latest Ollama API version, so I wasn't getting exactly the expected results. However, I proposed another way to achieve the same thing (I'll let you read the article, which aimed to generate random names for role-playing game characters).
I was lucky that Jacob Lee, a key maintainer of LangChain.js, came across my article and told me on X: (👀 tweet)
Just published `@langchain/ollama@0.1.6` that allows you to do:
```typescript
const model = new ChatOllama({ ... }).withStructuredOutput(
  z.object({ ... }),
  { method: "jsonSchema" },
)
```
So naturally, I had to test it 🥰
I updated the dependencies in my `package.json` file:
```json
"dependencies": {
  "@langchain/ollama": "^0.1.6",
  "dotenv": "^16.4.7",
  "langchain": "^0.3.15",
  "prompts": "^2.4.2",
  "zod": "^3.24.1"
}
```
New Code Version
And so, I modified my source code as follows:
```typescript
import { ChatOllama } from "@langchain/ollama"
import { z } from "zod"

const llm = new ChatOllama({
  model: 'qwen2.5:0.5b',
  baseUrl: "http://ollama-service:11434",
  temperature: 1.5,
  repeatLastN: 2,
  repeatPenalty: 2.2,
})

// Zod schema describing the expected structured output
const schema = z.object({
  name: z.string().describe("The name of the NPC"),
  kind: z.string().describe("The kind of NPC"),
})

// Bind the schema to the model using the new jsonSchema method
const structuredLLM = llm.withStructuredOutput(schema, {
  method: "jsonSchema",
})

const systemInstructions = `
You are an expert for games like D&D 5th edition.
You have freedom to be creative to get the best possible output.
`

const kind = "Dwarf"
const userContent = `Generate a random name for a ${kind}.`

const messages = [
  ["system", systemInstructions],
  ["user", userContent],
]

const response = await structuredLLM.invoke(messages)
console.log("Response:", response)
console.log("")
```
And everything works perfectly, generating random names on each run:
```
Response: { name: 'Elthos Caine', kind: 'dwarf' }
Response: { name: 'Nedryllia', kind: 'Dwarf' }
Response: { name: 'Silent Fleece', kind: 'Dwarf' }
...
```
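Note that what you get back is no longer raw text but a parsed object matching the Zod schema's shape. As a minimal sketch of that idea in plain TypeScript (the raw string below is an invented example, not actual model output, and the shape check is a simplification of what the wrapper really does with the schema):

```typescript
// Hypothetical raw completion, as the model might emit it when
// constrained by the JSON schema (this exact string is made up)
const raw = '{"name": "Thrain Ironfist", "kind": "Dwarf"}'

// Conceptually, the structured-output wrapper parses the JSON...
const parsed = JSON.parse(raw)

// ...and validates it against the schema's shape before returning it
const isValid =
  typeof parsed.name === "string" && typeof parsed.kind === "string"

console.log(parsed, isValid) // { name: 'Thrain Ironfist', kind: 'Dwarf' } true
```

This is why the code above can use `response.name` directly instead of parsing the response itself.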
A big thank you for this, Jacob Lee 🎉🙏
Update:
Following this tweet https://x.com/Hacubu/status/1889072811924447676: since the @langchain/ollama@0.2.0
update, this is the default behaviour of `llm.withStructuredOutput()`,
so you no longer need to specify `method: "jsonSchema"`
as a parameter.
So, you can use it like this:
```typescript
const structuredLLM = llm.withStructuredOutput(schema)
```
Or with a `{ name: string }`
option to give the LLM more context:
```typescript
const structuredLLM = llm.withStructuredOutput(schema, { name: "generate-random-name" })
```