Redis-Backed Chat Memory

For longer-term persistence across chat sessions, you can swap out the default in-memory chatHistory that backs chat memory classes like BufferMemory for a Redis instance.

Setup

You will need to install node-redis in your project:

npm install @langchain/openai redis @langchain/community

You will also need a Redis instance to connect to. See instructions on the official Redis website for running the server locally.
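
If you have Docker available, one convenient way to start a local instance is to run the official redis image:

docker run -d -p 6379:6379 redis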

Usage

Each chat history session stored in Redis must have a unique id. You can provide an optional sessionTTL to make sessions expire after a given number of seconds. The config parameter is passed directly into the createClient method of node-redis and accepts all the same arguments; a sketch of passing config appears after the example below.

import { BufferMemory } from "langchain/memory";
import { RedisChatMessageHistory } from "@langchain/community/stores/message/redis";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";

const memory = new BufferMemory({
  chatHistory: new RedisChatMessageHistory({
    sessionId: new Date().toISOString(), // Or some other unique identifier for the conversation
    sessionTTL: 300, // 5 minutes, omit this parameter to make sessions never expire
    url: "redis://localhost:6379", // Default value, override with your own instance's URL
  }),
});

const model = new ChatOpenAI({
  model: "gpt-3.5-turbo",
  temperature: 0,
});

const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
  {
    res1: {
      text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
    }
  }
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });

/*
  {
    res2: {
      text: "You said your name was Jim."
    }
  }
*/
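
As mentioned above, you can pass a config object instead of a bare url, and it is forwarded to node-redis' createClient. A minimal sketch, assuming a password-protected instance (the session id and password here are placeholders):

const memoryWithConfig = new BufferMemory({
  chatHistory: new RedisChatMessageHistory({
    sessionId: "user-1234", // Hypothetical session identifier
    config: {
      url: "redis://localhost:6379",
      password: "your-redis-password", // Any other createClient option can go here as well
    },
  }),
});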

Advanced Usage

You can also directly pass in a previously created node-redis client instance:

import { createClient } from "redis";
import { BufferMemory } from "langchain/memory";
import { RedisChatMessageHistory } from "@langchain/community/stores/message/redis";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";

const client = createClient({
  url: "redis://localhost:6379",
});

const memory = new BufferMemory({
  chatHistory: new RedisChatMessageHistory({
    sessionId: new Date().toISOString(),
    sessionTTL: 300,
    client,
  }),
});

const model = new ChatOpenAI({
  model: "gpt-3.5-turbo",
  temperature: 0,
});

const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
  {
    res1: {
      text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
    }
  }
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });

/*
  {
    res2: {
      text: "You said your name was Jim."
    }
  }
*/
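
Because the underlying store implements the standard chat message history interface, you can also read back or wipe what has been persisted for a session. A minimal sketch, reusing the memory object from above:

// Fetch the raw messages persisted in Redis for this session
const messages = await memory.chatHistory.getMessages();
console.log(messages.map((message) => message.content));

// Remove all messages for this session from Redis
await memory.chatHistory.clear();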

Redis Sentinel Support

You can enable Redis Sentinel backed chat memory using ioredis, which supports Sentinel connections natively.

This will require the installation of ioredis in your project:

npm install ioredis
import { Redis } from "ioredis";
import { BufferMemory } from "langchain/memory";
import { RedisChatMessageHistory } from "@langchain/community/stores/message/ioredis";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";

// Uses ioredis to facilitate Sentinel connections. See the ioredis docs for
// details on setting up more complex Sentinels: https://github.com/redis/ioredis#sentinel
const client = new Redis({
  sentinels: [
    { host: "localhost", port: 26379 },
    { host: "localhost", port: 26380 },
  ],
  name: "mymaster", // The name of the Sentinel master to connect to
});

const memory = new BufferMemory({
  chatHistory: new RedisChatMessageHistory({
    sessionId: new Date().toISOString(),
    sessionTTL: 300,
    client,
  }),
});

const model = new ChatOpenAI({ temperature: 0.5 });

const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
  {
    res1: {
      text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
    }
  }
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });

/*
  {
    res2: {
      text: "You said your name was Jim."
    }
  }
*/
