LangChain v0.3
Last updated: 09.14.24
What's changed
- All LangChain packages now have `@langchain/core` as a peer dependency instead of a direct dependency to help avoid type errors around core version conflicts.
  - You will now need to explicitly install `@langchain/core` rather than relying on an internally resolved version from other packages.
- Callbacks are now backgrounded and non-blocking by default rather than blocking.
  - This means that if you are using e.g. LangSmith for tracing in a serverless environment, you will need to await the callbacks to ensure they finish before your function ends.
- Removed deprecated document loader and self-query entrypoints from `langchain` in favor of entrypoints in `@langchain/community` and integration packages.
- Removed deprecated Google PaLM entrypoints from community in favor of entrypoints in `@langchain/google-vertexai` and `@langchain/google-genai`.
- Deprecated using objects with a `"type"` field as a `BaseMessageLike` in favor of the more OpenAI-like `MessageWithRole`.
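The difference between the deprecated `"type"` shape and the role-based shape can be sketched with plain object literals. This is an illustrative sketch, not the library's type definitions; the `MessageWithRole` shape is simply the OpenAI-style `{ role, content }` object, and `"human"` is shown here as an example of the old `"type"` value for a user turn:

```typescript
// Deprecated in 0.3: message objects keyed by "type"
// ("human" shown as an example value for a user message)
const oldStyle = { type: "human", content: "Hello!" };

// Preferred: the OpenAI-like MessageWithRole shape, keyed by "role"
const newStyle = { role: "user", content: "Hello!" };

console.log(newStyle.role); // "user"
```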
What's new
The following features have been added during the development of 0.2.x:
- Simplified tool definition and usage. Read more here.
- Added a generalized chat model constructor.
- Added the ability to dispatch custom events.
- Released LangGraph.js 0.2.0 and made it the recommended way to create agents with LangChain.js.
- Revamped integration docs and API reference. Read more here.
How to update your code
If you're using `langchain` / `@langchain/community` / `@langchain/core` 0.0 or 0.1, we recommend that you first upgrade to 0.2.
If you're using `@langchain/langgraph`, upgrade to `@langchain/langgraph>=0.2.3`. This will work with either 0.2 or 0.3 versions of all the base packages.
Here is a complete list of all packages that have been released and what we recommend upgrading your version constraints to in your `package.json`.
Any package that now supports `@langchain/core` 0.3 had a minor version bump.
Any package that now supports @langchain/core
0.3 had a minor version bump.
Base packages

| Package | Latest | Recommended package.json constraint |
| --- | --- | --- |
| langchain | 0.3.0 | `>=0.3.0 <0.4.0` |
| @langchain/community | 0.3.0 | `>=0.3.0 <0.4.0` |
| @langchain/textsplitters | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/core | 0.3.0 | `>=0.3.0 <0.4.0` |
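Applied to a project's `package.json`, the base-package constraints above would look something like the following (the package selection is illustrative; use only the rows matching your actual dependencies):

```json
{
  "dependencies": {
    "langchain": ">=0.3.0 <0.4.0",
    "@langchain/community": ">=0.3.0 <0.4.0",
    "@langchain/core": ">=0.3.0 <0.4.0"
  }
}
```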
Downstream packages

| Package | Latest | Recommended package.json constraint |
| --- | --- | --- |
| @langchain/langgraph | 0.2.3 | `>=0.2.3 <0.3.0` |
Integration packages

| Package | Latest | Recommended package.json constraint |
| --- | --- | --- |
| @langchain/anthropic | 0.3.0 | `>=0.3.0 <0.4.0` |
| @langchain/aws | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/azure-cosmosdb | 0.2.0 | `>=0.2.0 <0.3.0` |
| @langchain/azure-dynamic-sessions | 0.2.0 | `>=0.2.0 <0.3.0` |
| @langchain/baidu-qianfan | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/cloudflare | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/cohere | 0.3.0 | `>=0.3.0 <0.4.0` |
| @langchain/exa | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/google-genai | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/google-vertexai | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/google-vertexai-web | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/groq | 0.1.1 | `>=0.1.1 <0.2.0` |
| @langchain/mistralai | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/mixedbread-ai | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/mongodb | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/nomic | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/ollama | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/openai | 0.3.0 | `>=0.3.0 <0.4.0` |
| @langchain/pinecone | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/qdrant | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/redis | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/weaviate | 0.1.0 | `>=0.1.0 <0.2.0` |
| @langchain/yandex | 0.1.0 | `>=0.1.0 <0.2.0` |
Once you've updated to recent versions of the packages, you will need to explicitly install `@langchain/core` if you haven't already:

- npm: `npm install @langchain/core`
- Yarn: `yarn add @langchain/core`
- pnpm: `pnpm add @langchain/core`
We also suggest checking your lockfile or running the appropriate package manager command (e.g. `npm ls @langchain/core`) to make sure that your package manager has only one version of `@langchain/core` installed.
If you are currently running your code in a serverless environment (e.g., a Cloudflare Worker, Edge function, or AWS Lambda function) and you are using LangSmith tracing or other callbacks, you will need to await callbacks to ensure they finish before your function ends. Here's a quick example:

```ts
import { RunnableLambda } from "@langchain/core/runnables";
import { awaitAllCallbacks } from "@langchain/core/callbacks/promises";

const runnable = RunnableLambda.from(() => "hello!");

// A slow custom callback handler, simulating e.g. a tracing call
// that takes 2 seconds to complete.
const customHandler = {
  handleChainEnd: async () => {
    await new Promise((resolve) => setTimeout(resolve, 2000));
    console.log("Call finished");
  },
};

const startTime = new Date().getTime();

// The invocation returns immediately; the callback runs in the background.
await runnable.invoke({ number: "2" }, { callbacks: [customHandler] });

console.log(`Elapsed time: ${new Date().getTime() - startTime}ms`);

// Wait for all backgrounded callbacks to finish before shutting down.
await awaitAllCallbacks();

console.log(`Final elapsed time: ${new Date().getTime() - startTime}ms`);
```

```
Elapsed time: 1ms
Call finished
Final elapsed time: 2164ms
```
You can also set the `LANGCHAIN_CALLBACKS_BACKGROUND` environment variable to `"false"` to make all callbacks blocking:

```ts
process.env.LANGCHAIN_CALLBACKS_BACKGROUND = "false";

const startTimeBlocking = new Date().getTime();

// With blocking callbacks, the invocation does not resolve until
// the slow handler has finished.
await runnable.invoke({ number: "2" }, { callbacks: [customHandler] });

console.log(
  `Initial elapsed time: ${new Date().getTime() - startTimeBlocking}ms`
);
```

```
Call finished
Initial elapsed time: 2002ms
```