libSQL
Turso is a SQLite-compatible database built on libSQL, the Open Contribution fork of SQLite. Vector Similarity Search is built into Turso and libSQL as a native datatype, enabling you to store and query vectors directly in the database.
LangChain.js supports using a local libSQL or remote Turso database as a vector store, and provides a simple API to interact with it.
This guide provides a quick overview for getting started with libSQL vector stores. For detailed documentation of all libSQL features and configurations, head to the API reference.
Overview
Integration details
Class | Package | PY support | Package latest |
---|---|---|---|
LibSQLVectorStore | @langchain/community | ❌ | |
Setup
To use libSQL vector stores, you'll need to create a Turso account or set up a local SQLite database, and install the @langchain/community integration package.
This guide will also use OpenAI embeddings, which require you to install the @langchain/openai integration package. You can also use other supported embedding models if you wish.
You can use local SQLite when working with the libSQL vector store, or use a hosted Turso Database.
- npm: npm install @libsql/client @langchain/openai @langchain/community
- Yarn: yarn add @libsql/client @langchain/openai @langchain/community
- pnpm: pnpm add @libsql/client @langchain/openai @langchain/community
Now it's time to create a database. You can create one locally, or use a hosted Turso database.
Local libSQL
Create a new local SQLite file and connect to the shell:
sqlite3 file.db
Hosted Turso
Visit sqlite.new to create a new database, give it a name, and create a database auth token.
Make sure to copy the database auth token and the database URL, which should look something like this:
libsql://[database-name]-[your-username].turso.io
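Rather than hardcoding these values, you can keep them in environment variables and read them when creating the client. A minimal sketch; the variable names TURSO_DATABASE_URL and TURSO_AUTH_TOKEN are just a convention, not something the library requires:
import { createClient } from "@libsql/client";

// Hypothetical environment variable names -- use whatever fits your setup.
const tursoClient = createClient({
  url: process.env.TURSO_DATABASE_URL!, // e.g. libsql://[database-name]-[your-username].turso.io
  authToken: process.env.TURSO_AUTH_TOKEN!,
});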
Set up the table and index
Execute the following SQL command to create a new table or add the embedding column to an existing table.
Make sure to modify the following parts of the SQL:
- TABLE_NAME is the name of the table you want to create.
- content is used to store the Document.pageContent values.
- metadata is used to store the Document.metadata object.
- EMBEDDING_COLUMN is used to store the vector values. Use the dimension size of the embedding model you plan to use (1536 for OpenAI).
CREATE TABLE IF NOT EXISTS TABLE_NAME (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  content TEXT,
  metadata TEXT,
  EMBEDDING_COLUMN F32_BLOB(1536) -- 1536-dimensional f32 vector for OpenAI
);
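The dimension in F32_BLOB must match your embedding model. If you're not sure what dimension a model produces, a quick sketch like the following (using OpenAIEmbeddings as an example) lets you check by embedding a throwaway string:
import { OpenAIEmbeddings } from "@langchain/openai";

const checkEmbeddings = new OpenAIEmbeddings({
  model: "text-embedding-3-small",
});

// Embed a sample string and inspect the vector length;
// text-embedding-3-small returns 1536-dimensional vectors.
const sampleVector = await checkEmbeddings.embedQuery("dimension check");
console.log(sampleVector.length); // 1536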
Now create an index on the EMBEDDING_COLUMN column (the index name is important):
CREATE INDEX IF NOT EXISTS idx_TABLE_NAME_EMBEDDING_COLUMN ON TABLE_NAME(libsql_vector_idx(EMBEDDING_COLUMN));
Make sure to replace TABLE_NAME and EMBEDDING_COLUMN with the values you used in the previous step.
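If you'd rather run this setup from code than from a SQL shell, you can send the same statements through the @libsql/client client. A minimal sketch, using the placeholder names from the SQL above and a local file database:
import { createClient } from "@libsql/client";

const setupClient = createClient({ url: "file:./dev.db" });

// Create the table and the vector index in a single write batch.
await setupClient.batch(
  [
    `CREATE TABLE IF NOT EXISTS TABLE_NAME (
      id INTEGER PRIMARY KEY AUTOINCREMENT,
      content TEXT,
      metadata TEXT,
      EMBEDDING_COLUMN F32_BLOB(1536)
    )`,
    `CREATE INDEX IF NOT EXISTS idx_TABLE_NAME_EMBEDDING_COLUMN
      ON TABLE_NAME(libsql_vector_idx(EMBEDDING_COLUMN))`,
  ],
  "write"
);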
Instantiation
To initialize a new LibSQL vector store, provide the database URL and auth token when working remotely, or the filename of a local SQLite database.
import { LibSQLVectorStore } from "@langchain/community/vectorstores/libsql";
import { OpenAIEmbeddings } from "@langchain/openai";
import { createClient } from "@libsql/client";
const embeddings = new OpenAIEmbeddings({
  model: "text-embedding-3-small",
});

const libsqlClient = createClient({
  url: "libsql://[database-name]-[your-username].turso.io",
  authToken: "...",
});

// Local instantiation
// const libsqlClient = createClient({
//   url: "file:./dev.db",
// });

const vectorStore = new LibSQLVectorStore(embeddings, {
  db: libsqlClient,
  table: "TABLE_NAME",
  column: "EMBEDDING_COLUMN",
});
Manage vector store
Add items to vector store
import type { Document } from "@langchain/core/documents";
const documents: Document[] = [
  { pageContent: "Hello", metadata: { topic: "greeting" } },
  { pageContent: "Bye bye", metadata: { topic: "greeting" } },
];
await vectorStore.addDocuments(documents);
Delete items from vector store
await vectorStore.deleteDocuments({ ids: [1, 2] });
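If you want to delete the exact rows you inserted earlier, you can hold on to the ids returned by addDocuments and pass them back. A sketch, assuming addDocuments returns the inserted row ids (as LangChain vector stores generally do):
// Assumes `vectorStore` and `documents` from the snippets above, and that
// addDocuments returns the ids of the inserted rows.
const insertedIds = await vectorStore.addDocuments(documents);

// Later, remove exactly those rows.
await vectorStore.deleteDocuments({ ids: insertedIds });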
Query vector store
Once you have inserted the documents, you can query the vector store.
Query directly
Performing a simple similarity search can be done as follows:
const similaritySearchResults = await vectorStore.similaritySearch("hola", 1);

for (const doc of similaritySearchResults) {
  console.log(`${doc.pageContent} [${JSON.stringify(doc.metadata)}]`);
}
For similarity search with scores:
const similaritySearchWithScoreResults =
  await vectorStore.similaritySearchWithScore("hola", 1);

for (const [doc, score] of similaritySearchWithScoreResults) {
  console.log(
    `${score.toFixed(3)} ${doc.pageContent} [${JSON.stringify(doc.metadata)}]`
  );
}
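You can also wrap the vector store in the standard LangChain retriever interface, which is convenient when composing it into a chain; the query below is just an example:
// Expose the vector store as a retriever that returns the single closest match.
const retriever = vectorStore.asRetriever({ k: 1 });

const retrievedDocs = await retriever.invoke("hola");
console.log(retrievedDocs);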
API reference
For detailed documentation of all LibSQLVectorStore features and configurations, head to the API reference.
Related
- Vector store conceptual guide
- Vector store how-to guides