How to use BaseChatMessageHistory with LangGraph

Prerequisites

This guide assumes familiarity with LangGraph and with LangChain chat message histories (BaseChatMessageHistory).

We recommend that new LangChain applications take advantage of the built-in LangGraph persistence to implement memory.

In some situations, users may need to keep using an existing persistence solution for chat message history.

Here, we will show how to use LangChain chat message histories (implementations of BaseChatMessageHistory) with LangGraph.

Set up

yarn add @langchain/core @langchain/langgraph @langchain/anthropic

process.env.ANTHROPIC_API_KEY = "YOUR_API_KEY";

ChatMessageHistory

A message history needs to be parameterized by a conversation ID, or perhaps by a 2-tuple of (user ID, conversation ID).

Many LangChain chat message history implementations accept either a sessionId or some namespace for keeping track of different conversations. Refer to the specific implementation to check how it is parameterized.
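
For example, the Redis-backed UpstashRedisChatMessageHistory from @langchain/community is parameterized by an explicit sessionId. A minimal sketch (the URL and token are placeholders):

import { UpstashRedisChatMessageHistory } from "@langchain/community/stores/message/upstash_redis";

const redisHistory = new UpstashRedisChatMessageHistory({
  // Keys this history to a single conversation
  sessionId: "some-conversation-id",
  config: {
    url: "https://YOUR-UPSTASH-URL.upstash.io",
    token: "YOUR_UPSTASH_TOKEN",
  },
});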

The built-in InMemoryChatMessageHistory does not contain such a parameterization, so we’ll create a record keyed by session ID to keep track of the message histories.

import { InMemoryChatMessageHistory } from "@langchain/core/chat_history";

const chatsBySessionId: Record<string, InMemoryChatMessageHistory> = {};

const getChatHistory = (sessionId: string) => {
  let chatHistory: InMemoryChatMessageHistory | undefined =
    chatsBySessionId[sessionId];
  if (!chatHistory) {
    chatHistory = new InMemoryChatMessageHistory();
    chatsBySessionId[sessionId] = chatHistory;
  }
  return chatHistory;
};
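
As a quick sanity check, histories created this way are isolated per session ID. An illustrative sketch (the session IDs are made up):

import { HumanMessage } from "@langchain/core/messages";

const historyA = getChatHistory("session-a");
await historyA.addMessage(new HumanMessage("hello"));

// The same ID returns the same history; a different ID starts empty.
console.log((await getChatHistory("session-a").getMessages()).length); // 1
console.log((await getChatHistory("session-b").getMessages()).length); // 0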

Use with LangGraph

Next, we’ll set up a basic chatbot using LangGraph. If you’re not familiar with LangGraph, check out its Quickstart tutorial first.

We’ll create a LangGraph node for the chat model, and manually manage the conversation history, taking into account the conversation ID passed as part of the RunnableConfig.

The conversation ID can be passed as either part of the RunnableConfig (as we’ll do here), or as part of the graph state.
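
For reference, a minimal sketch of the graph-state alternative (not used in this guide) would extend MessagesAnnotation with a sessionId channel:

import { Annotation, MessagesAnnotation } from "@langchain/langgraph";

const StateWithSession = Annotation.Root({
  ...MessagesAnnotation.spec,
  sessionId: Annotation<string>,
});

// A node would then read state.sessionId instead of
// config.configurable.sessionId.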

import { v4 as uuidv4 } from "uuid";
import { ChatAnthropic } from "@langchain/anthropic";
import {
  StateGraph,
  MessagesAnnotation,
  END,
  START,
} from "@langchain/langgraph";
import { HumanMessage } from "@langchain/core/messages";
import { RunnableConfig } from "@langchain/core/runnables";

// Define a chat model
const model = new ChatAnthropic({ modelName: "claude-3-haiku-20240307" });

// Define the function that calls the model
const callModel = async (
  state: typeof MessagesAnnotation.State,
  config: RunnableConfig
): Promise<Partial<typeof MessagesAnnotation.State>> => {
  if (!config.configurable?.sessionId) {
    throw new Error(
      "Make sure that the config includes the following information: {'configurable': {'sessionId': 'some_value'}}"
    );
  }

  const chatHistory = getChatHistory(config.configurable.sessionId as string);

  const messages = [...(await chatHistory.getMessages()), ...state.messages];

  if (state.messages.length === 1) {
    // First message, ensure it's in the chat history
    await chatHistory.addMessage(state.messages[0]);
  }

  const aiMessage = await model.invoke(messages);

  // Update the chat history
  await chatHistory.addMessage(aiMessage);

  return { messages: [aiMessage] };
};

// Define a new graph
const workflow = new StateGraph(MessagesAnnotation)
  .addNode("model", callModel)
  .addEdge(START, "model")
  .addEdge("model", END);

const app = workflow.compile();

// Create a unique session ID to identify the conversation
const sessionId = uuidv4();
const config = { configurable: { sessionId }, streamMode: "values" as const };

const inputMessage = new HumanMessage("hi! I'm bob");

for await (const event of await app.stream(
  { messages: [inputMessage] },
  config
)) {
  const lastMessage = event.messages[event.messages.length - 1];
  console.log(lastMessage.content);
}

hi! I'm bob
Hello Bob! It's nice to meet you. How can I assist you today?

// Here, let's confirm that the AI remembers our name!
const followUpMessage = new HumanMessage("what was my name?");

for await (const event of await app.stream(
  { messages: [followUpMessage] },
  config
)) {
  const lastMessage = event.messages[event.messages.length - 1];
  console.log(lastMessage.content);
}

what was my name?
You said your name is Bob.
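
Because histories are keyed by session ID, starting a fresh conversation is as simple as generating a new ID. A minimal sketch (reusing the compiled app from above):

const otherSessionId = uuidv4();

// No history exists for this session, so the model cannot know the name.
const result = await app.invoke(
  { messages: [new HumanMessage("what was my name?")] },
  { configurable: { sessionId: otherSessionId } }
);
console.log(result.messages[result.messages.length - 1].content);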

Using with RunnableWithMessageHistory

This how-to guide used the getMessages and addMessage methods of BaseChatMessageHistory directly.

Alternatively, you can use RunnableWithMessageHistory, as LCEL can be used inside any LangGraph node.

To do that, replace the following code:

const callModel = async (
  state: typeof MessagesAnnotation.State,
  config: RunnableConfig
): Promise<Partial<typeof MessagesAnnotation.State>> => {
  if (!config.configurable?.sessionId) {
    throw new Error(
      "Make sure that the config includes the following information: {'configurable': {'sessionId': 'some_value'}}"
    );
  }

  const chatHistory = getChatHistory(config.configurable.sessionId as string);

  const messages = [...(await chatHistory.getMessages()), ...state.messages];

  if (state.messages.length === 1) {
    // First message, ensure it's in the chat history
    await chatHistory.addMessage(state.messages[0]);
  }

  const aiMessage = await model.invoke(messages);

  // Update the chat history
  await chatHistory.addMessage(aiMessage);
  return { messages: [aiMessage] };
};

with the corresponding instance of RunnableWithMessageHistory defined in your current application:

import { RunnableWithMessageHistory } from "@langchain/core/runnables";

const runnable = new RunnableWithMessageHistory({
  // ... configuration from existing code
});

const callModel = async (
  state: typeof MessagesAnnotation.State,
  config: RunnableConfig
): Promise<Partial<typeof MessagesAnnotation.State>> => {
  // RunnableWithMessageHistory takes care of reading the message history
  // and updating it with the new human message and AI response.
  const aiMessage = await runnable.invoke(state.messages, config);
  return {
    messages: [aiMessage],
  };
};
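
Filling in that placeholder might look like the following sketch, assuming the wrapped runnable is simply the chat model (whose input is already a list of messages) and reusing the getChatHistory lookup from earlier:

const runnable = new RunnableWithMessageHistory({
  // Wrap the bare chat model; its input is the list of messages itself,
  // so no inputMessagesKey or historyMessagesKey is needed.
  runnable: model,
  // Reuse the per-session lookup defined earlier in this guide. By default,
  // the session ID is read from config.configurable.sessionId.
  getMessageHistory: async (sessionId: string) => getChatHistory(sessionId),
});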
