The OpenAIEmbeddings class uses the OpenAI API to generate embeddings for a given text. By default, it strips newline characters from the text, as recommended by OpenAI, but you can disable this behavior by passing stripNewLines: false to the constructor.
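As an illustration of what this preprocessing does (a sketch, not the library's internal code), stripping newlines amounts to replacing each newline character with a space before the text is sent to the API:

```typescript
// Sketch of the newline-stripping step: every "\n" becomes a single space.
// This mirrors the behavior described above; it is not the library's source.
function stripNewLines(text: string): string {
  return text.replace(/\n/g, " ");
}
```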

npm install @langchain/openai
import { OpenAIEmbeddings } from "@langchain/openai";

const embeddings = new OpenAIEmbeddings({
  openAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
  batchSize: 512, // Default value if omitted is 512. Max is 2048
  modelName: "text-embedding-3-large",
});
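Conceptually, batchSize controls how many texts are sent to the embeddings endpoint per request: the input list is split into groups of at most batchSize. A minimal sketch of that chunking (a hypothetical helper, not the library's implementation):

```typescript
// Hypothetical sketch of how a batchSize setting chunks inputs:
// split `items` into consecutive groups of at most `batchSize` elements.
function chunk<T>(items: T[], batchSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}
```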

If you're part of an organization, you can set process.env.OPENAI_ORGANIZATION to your OpenAI organization id, or pass it in as organization when initializing the model.
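The precedence described here can be sketched as follows (resolveOrganization is a hypothetical helper for illustration, not part of the library):

```typescript
// Hypothetical helper illustrating the precedence described above:
// an explicit `organization` option wins over the
// OPENAI_ORGANIZATION environment variable.
function resolveOrganization(explicit?: string): string | undefined {
  return explicit ?? process.env.OPENAI_ORGANIZATION;
}
```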

Specifying dimensions

With the text-embedding-3 class of models, you can specify the size of the embeddings you want returned. For example, by default text-embedding-3-large returns embeddings of dimension 3072:

const embeddings = new OpenAIEmbeddings({
  modelName: "text-embedding-3-large",
});

const vectors = await embeddings.embedDocuments(["some text"]);

But by passing in dimensions: 1024, we can reduce the size of our embeddings to 1024:

const embeddings1024 = new OpenAIEmbeddings({
  modelName: "text-embedding-3-large",
  dimensions: 1024,
});

const vectors2 = await embeddings1024.embedDocuments(["some text"]);
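The reduced vectors are used downstream exactly like full-size ones, just with less storage and compute per comparison. As a self-contained sketch (this cosine-similarity helper is for illustration and is not part of @langchain/openai), comparing two 1024-dimension vectors works the same as comparing 3072-dimension ones:

```typescript
// Cosine similarity between two vectors of equal length. Embeddings of any
// dimension (1024, 3072, ...) are compared the same way; smaller vectors are
// simply cheaper to store and compare.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```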

Custom URLs

You can customize the base URL the SDK sends requests to by passing a configuration parameter like this:

const model = new OpenAIEmbeddings({
  configuration: {
    baseURL: "",
  },
});

You can also pass other ClientOptions parameters accepted by the official SDK.
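As a sketch of what such a configuration object might look like (the URL and header here are hypothetical placeholders; verify the exact ClientOptions fields against your installed version of the official SDK):

```typescript
// Example configuration object for the OpenAI SDK client.
// The baseURL and header values below are hypothetical placeholders.
const configuration = {
  baseURL: "http://localhost:8000/v1", // hypothetical proxy endpoint
  timeout: 10_000, // per-request timeout in milliseconds
  maxRetries: 2, // retry failed requests up to twice
  defaultHeaders: { "X-Request-Source": "docs-example" }, // hypothetical header
};
```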

If you are hosting on Azure OpenAI, see the dedicated page instead.