AzionVectorStore
The AzionVectorStore is used to manage and search through a collection of documents using vector embeddings, directly on Azion's Edge Platform using Edge SQL.
This guide provides a quick overview for getting started with Azion EdgeSQL vector stores. For detailed documentation of all AzionVectorStore features and configurations, head to the API reference.
Overview
Integration details
| Class | Package | PY support | Package latest |
| --- | --- | --- | --- |
| AzionVectorStore | @langchain/community | ❌ | |
Setup
To use the AzionVectorStore vector store, you will need to install the @langchain/community package. Besides that, you will need an Azion account and a token to use the Azion API, configured as the AZION_TOKEN environment variable. Further information about this can be found in the documentation.
This guide will also use OpenAI embeddings, which require you to install the @langchain/openai integration package. You can also use other supported embedding models if you wish.
- npm
- yarn
- pnpm
npm i azion @langchain/openai @langchain/community
yarn add azion @langchain/openai @langchain/community
pnpm add azion @langchain/openai @langchain/community
Credentials
Once you've done this, set the AZION_TOKEN environment variable:
process.env.AZION_TOKEN = "your-api-key";
If you are using OpenAI embeddings for this guide, you'll need to set your OpenAI key as well:
process.env.OPENAI_API_KEY = "YOUR_API_KEY";
If you want to get automated tracing of your model calls, you can also set your LangSmith API key by uncommenting the lines below:
// process.env.LANGCHAIN_TRACING_V2="true"
// process.env.LANGCHAIN_API_KEY="your-api-key"
Instantiation
import { AzionVectorStore } from "@langchain/community/vectorstores/azion_edgesql";
import { OpenAIEmbeddings } from "@langchain/openai";
const embeddings = new OpenAIEmbeddings({
model: "text-embedding-3-small",
});
// Instantiate with the constructor if the database and table have already been created
const vectorStore = new AzionVectorStore(embeddings, {
dbName: "langchain",
tableName: "documents",
});
// If you have not created the database and table yet, you can do so with the setupDatabase method
// await vectorStore.setupDatabase({ columns:["topic","language"], mode: "hybrid" })
// OR instantiate with the static method if the database and table have not been created yet
// const vectorStore = await AzionVectorStore.initialize(embeddings, { dbName: "langchain", tableName: "documents" }, { columns: [], mode: "hybrid" })
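If the database and table do not exist yet, the full setup flow looks roughly like the following sketch, which expands the commented example above into runnable form (the metadata column names are illustrative and should match the metadata you plan to store):
import { AzionVectorStore } from "@langchain/community/vectorstores/azion_edgesql";
import { OpenAIEmbeddings } from "@langchain/openai";

const embeddingModel = new OpenAIEmbeddings({
  model: "text-embedding-3-small",
});

// Creates the Edge SQL database and table if they do not exist, then returns a ready-to-use store.
// "hybrid" mode sets the table up for both full-text and vector search.
const newVectorStore = await AzionVectorStore.initialize(
  embeddingModel,
  { dbName: "langchain", tableName: "documents" },
  { columns: ["topic", "language"], mode: "hybrid" }
);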
Manage vector store
Add items to vector store
import type { Document } from "@langchain/core/documents";
const document1: Document = {
pageContent: "The powerhouse of the cell is the mitochondria",
metadata: { language: "en", topic: "biology" },
};
const document2: Document = {
pageContent: "Buildings are made out of brick",
metadata: { language: "en", topic: "history" },
};
const document3: Document = {
pageContent: "Mitochondria are made out of lipids",
metadata: { language: "en", topic: "biology" },
};
const document4: Document = {
pageContent: "The 2024 Olympics are in Paris",
metadata: { language: "en", topic: "history" },
};
const documents = [document1, document2, document3, document4];
await vectorStore.addDocuments(documents);
Inserting chunks
Inserting chunk 0
Chunks inserted!
Delete items from vector store
await vectorStore.delete(["4"]);
Deleted 1 items from documents
Query vector store
Once your vector store has been created and the relevant documents have been added, you will most likely wish to query it while running your chain or agent.
Query directly
You can query the store directly with a hybrid search, which combines full-text and vector search, or with a pure vector similarity search. The kfts and kvector options set how many full-text and vector results to return, and filter restricts results to rows whose metadata columns match:
const filter = [{ operator: "=", column: "language", value: "en" }];
const hybridSearchResults = await vectorStore.azionHybridSearch("biology", {
kfts: 2,
kvector: 1,
filter: [{ operator: "=", column: "language", value: "en" }],
});
console.log("Hybrid Search Results");
for (const doc of hybridSearchResults) {
console.log(`${JSON.stringify(doc)}`);
}
Hybrid Search Results
[{"pageContent":"The Australian dingo is a unique species that plays a key role in the ecosystem","metadata":{"searchtype":"fulltextsearch"},"id":"6"},-0.25748711028997995]
[{"pageContent":"The powerhouse of the cell is the mitochondria","metadata":{"searchtype":"fulltextsearch"},"id":"16"},-0.31697985337654005]
[{"pageContent":"Australia s indigenous people have inhabited the continent for over 65,000 years","metadata":{"searchtype":"similarity"},"id":"3"},0.14822345972061157]
const similaritySearchResults = await vectorStore.azionSimilaritySearch(
"australia",
{ kvector: 3, filter: [{ operator: "=", column: "topic", value: "history" }] }
);
console.log("Similarity Search Results");
for (const doc of similaritySearchResults) {
console.log(`${JSON.stringify(doc)}`);
}
Similarity Search Results
[{"pageContent":"Australia s indigenous people have inhabited the continent for over 65,000 years","metadata":{"searchtype":"similarity"},"id":"3"},0.4486490488052368]
Query by turning into retriever
You can also transform the vector store into a retriever for easier usage in your chains.
const retriever = vectorStore.asRetriever({
// Optional filter
filter: filter,
k: 2,
});
await retriever.invoke("biology");
[
Document {
pageContent: 'Australia s indigenous people have inhabited the continent for over 65,000 years',
metadata: { searchtype: 'similarity' },
id: '3'
},
Document {
pageContent: 'Mitochondria are made out of lipids',
metadata: { searchtype: 'similarity' },
id: '18'
}
]
Usage for retrieval-augmented generation
For guides on how to use this vector store for retrieval-augmented generation (RAG), see the following sections:
- Tutorials: working with external knowledge.
- How-to: Question and answer with RAG
- Retrieval conceptual docs
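As a quick illustration, a minimal RAG chain over this vector store might look like the following sketch. It assumes the retriever created above and an OpenAI chat model; the model name, prompt, and question are illustrative, and retrieved documents are formatted inline to avoid extra dependencies:
import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";
import { RunnablePassthrough, RunnableSequence } from "@langchain/core/runnables";
import type { Document } from "@langchain/core/documents";

const llm = new ChatOpenAI({ model: "gpt-4o-mini" });

const prompt = ChatPromptTemplate.fromTemplate(
  `Answer the question using only the context below.

Context:
{context}

Question: {question}`
);

// Retrieve documents, stuff their contents into the prompt, and generate an answer
const ragChain = RunnableSequence.from([
  {
    context: retriever.pipe((docs: Document[]) =>
      docs.map((doc) => doc.pageContent).join("\n\n")
    ),
    question: new RunnablePassthrough(),
  },
  prompt,
  llm,
  new StringOutputParser(),
]);

await ragChain.invoke("What is the powerhouse of the cell?");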
API reference
For detailed documentation of all AzionVectorStore features and configurations, head to the API reference.
Related
- Vector store conceptual guide
- Vector store how-to guides