ChatCohere

info

The Cohere Chat API is still in beta. This means Cohere may make breaking changes at any time.

Setup

To use the LangChain.js Cohere integration, you'll need an API key. You can sign up for a Cohere account and create an API key here.

You'll first need to install the @langchain/cohere package:

npm install @langchain/cohere

Usage

import { ChatCohere } from "@langchain/cohere";
import { ChatPromptTemplate } from "@langchain/core/prompts";

const model = new ChatCohere({
  apiKey: process.env.COHERE_API_KEY, // Default
  model: "command", // Default
});
const prompt = ChatPromptTemplate.fromMessages([
  ["ai", "You are a helpful assistant"],
  ["human", "{input}"],
]);
const chain = prompt.pipe(model);
const response = await chain.invoke({
  input: "Hello there friend!",
});
console.log("response", response);
/**
response AIMessage {
  lc_serializable: true,
  lc_namespace: [ 'langchain_core', 'messages' ],
  content: "Hi there! I'm not your friend, but I'm happy to help you in whatever way I can today. How are you doing? Is there anything I can assist you with? I am an AI chatbot capable of generating thorough responses, and I'm designed to have helpful, inclusive conversations with users. \n" +
    '\n' +
    "If you have any questions, feel free to ask away, and I'll do my best to provide you with helpful responses. \n" +
    '\n' +
    'Would you like me to help you with anything in particular right now?',
  additional_kwargs: {
    response_id: 'c6baa057-ef94-4bb0-9c25-3a424963a074',
    generationId: 'd824fcdc-b922-4ae6-8d45-7b65a21cdd6a',
    token_count: {
      prompt_tokens: 66,
      response_tokens: 104,
      total_tokens: 170,
      billed_tokens: 159
    },
    meta: { api_version: [Object], billed_units: [Object] },
    tool_inputs: null
  }
}
*/
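
The chain returns an AIMessage, as shown above. If you only want the generated text rather than the full message object, read it from the message's content field. A minimal sketch, continuing the example above:

// The AIMessage's `content` field holds the generated text.
console.log(response.content);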

info

You can see a LangSmith trace of this example here.

Streaming

Cohere's API also supports streaming token responses. The example below demonstrates how to use this feature.

import { ChatCohere } from "@langchain/cohere";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";

const model = new ChatCohere({
  apiKey: process.env.COHERE_API_KEY, // Default
  model: "command", // Default
});
const prompt = ChatPromptTemplate.fromMessages([
  ["ai", "You are a helpful assistant"],
  ["human", "{input}"],
]);
const outputParser = new StringOutputParser();
const chain = prompt.pipe(model).pipe(outputParser);
const response = await chain.stream({
  input: "Why is the sky blue? Be concise with your answer.",
});
let streamTokens = "";
let streamIters = 0;
for await (const item of response) {
  // Log each streamed chunk as it arrives (this produces the output below).
  console.log("stream item:", item);
  streamTokens += item;
  streamIters += 1;
}
console.log("stream tokens:", streamTokens);
console.log("stream iters:", streamIters);
/**
stream item:
stream item: Hello! I'm here to help answer any questions you
stream item: might have or assist you with any task you'd like to
stream item: accomplish. I can provide information
stream item: on a wide range of topics
stream item: , from math and science to history and literature. I can
stream item: also help you manage your schedule, set reminders, and
stream item: much more. Is there something specific you need help with? Let
stream item: me know!
stream item:
*/
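
The StringOutputParser in this chain converts each streamed chunk into a plain string. If you stream directly from the model instead, each chunk is an AIMessageChunk whose text lives on its content field. A minimal sketch, reusing the model instance from above:

// Stream straight from the model: no prompt template or output parser.
// Each chunk is an AIMessageChunk; its text is on `chunk.content`.
const directStream = await model.stream(
  "Why is the sky blue? Be concise with your answer."
);
for await (const chunk of directStream) {
  console.log("chunk:", chunk.content);
}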

info

You can see a LangSmith trace of this example here.

Stateful conversation API

Cohere's chat API supports stateful conversations. This means the API stores previous chat messages, which can be accessed on later calls by passing the same conversationId option. The example below demonstrates how to use this feature.

import { ChatCohere } from "@langchain/cohere";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatCohere({
  apiKey: process.env.COHERE_API_KEY, // Default
  model: "command", // Default
});

const conversationId = `demo_test_id-${Math.random()}`;

const response = await model.invoke(
  [new HumanMessage("Tell me a joke about bears.")],
  {
    conversationId,
  }
);
console.log("response: ", response.content);
/**
response: Why did the bear go to the dentist?

Because she had bear teeth!

Hope you found that joke about bears to be a little bit tooth-arious!

Would you like me to tell you another one? I could also provide you with a list of jokes about bears if you prefer.

Just let me know if you have any other jokes or topics you'd like to hear about!
*/

const response2 = await model.invoke(
  [new HumanMessage("What was the subject of my last question?")],
  {
    conversationId,
  }
);
console.log("response2: ", response2.content);
/**
response2: Your last question was about bears. You asked me to tell you a joke about bears, which I am programmed to assist with.

Would you like me to assist you with anything else bear-related? I can provide you with facts about bears, stories about bears, or even list other topics that might be of interest to you.

Please let me know if you have any other questions and I will do my best to provide you with a response.
*/

info

You can see the LangSmith traces from this example here and here.

RAG

Cohere also supports retrieval-augmented generation (RAG) out of the box. You can pass documents as context with the API request, and Cohere's models will use them when generating responses.

import { ChatCohere } from "@langchain/cohere";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatCohere({
  apiKey: process.env.COHERE_API_KEY, // Default
  model: "command", // Default
});

const documents = [
  {
    title: "Harrison's work",
    snippet: "Harrison worked at Kensho as an engineer.",
  },
  {
    title: "Harrison's work duration",
    snippet: "Harrison worked at Kensho for 3 years.",
  },
  {
    title: "Polar bears in the Appalachian Mountains",
    snippet:
      "Polar bears have surprisingly adapted to the Appalachian Mountains, thriving in the diverse, forested terrain despite their traditional arctic habitat. This unique situation has sparked significant interest and study in climate adaptability and wildlife behavior.",
  },
];

const response = await model.invoke(
  [new HumanMessage("Where did Harrison work and for how long?")],
  {
    documents,
  }
);
console.log("response: ", response.content);
/**
response: Harrison worked as an engineer at Kensho for about 3 years.
*/
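
The documents option takes plain records, and the title/snippet keys above are just one shape Cohere accepts. If your context comes from a LangChain retriever as Document objects, you can map them into that shape yourself. A minimal sketch, continuing the example above (the toCohereDocuments helper and retrievedDocs are hypothetical):

import { Document } from "@langchain/core/documents";

// Hypothetical helper: map LangChain Document objects (for example, results
// from a retriever) into the plain { title, snippet } records used above.
const toCohereDocuments = (docs: Document[]) =>
  docs.map((doc, i) => ({
    title: (doc.metadata.title as string) ?? `Document ${i + 1}`,
    snippet: doc.pageContent,
  }));

const retrievedDocs = [
  new Document({
    pageContent: "Harrison worked at Kensho as an engineer.",
    metadata: { title: "Harrison's work" },
  }),
];

const ragResponse = await model.invoke(
  [new HumanMessage("Where did Harrison work?")],
  { documents: toCohereDocuments(retrievedDocs) }
);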

info

You can see a LangSmith trace of this example here.

Connectors

The API also supports connectors, which supply context dynamically rather than as static documents. One example is Cohere's web-search connector: you pass in a query, and the API searches the web for relevant documents. The example below demonstrates how to use this feature.

import { ChatCohere } from "@langchain/cohere";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatCohere({
  apiKey: process.env.COHERE_API_KEY, // Default
  model: "command", // Default
});

const response = await model.invoke(
  [new HumanMessage("How tall are the largest penguins?")],
  {
    connectors: [{ id: "web-search" }],
  }
);
console.log("response: ", JSON.stringify(response, null, 2));
/**
response: {
  "lc": 1,
  "type": "constructor",
  "id": [
    "langchain_core",
    "messages",
    "AIMessage"
  ],
  "kwargs": {
    "content": "The tallest penguin species currently in existence is the Emperor Penguin, with a height of 110cm to the top of their head or 115cm to the tip of their beak. This is equivalent to being approximately 3 feet and 7 inches tall.\n\nA fossil of an Anthropornis penguin was found in New Zealand and is suspected to have been even taller at 1.7 metres, though this is uncertain as the fossil is only known from preserved arm and leg bones. The height of a closely related species, Kumimanu biceae, has been estimated at 1.77 metres.\n\nDid you know that because larger-bodied penguins can hold their breath for longer, the colossus penguin could have stayed underwater for 40 minutes or more?",
    "additional_kwargs": {
      "response_id": "a3567a59-2377-439d-894f-0309f7fea1de",
      "generationId": "65dc5b1b-6099-44c4-8338-50eed0d427c5",
      "token_count": {
        "prompt_tokens": 1394,
        "response_tokens": 149,
        "total_tokens": 1543,
        "billed_tokens": 159
      },
      "meta": {
        "api_version": {
          "version": "1"
        },
        "billed_units": {
          "input_tokens": 10,
          "output_tokens": 149
        }
      },
      "citations": [
        {
          "start": 58,
          "end": 73,
          "text": "Emperor Penguin",
          "documentIds": [
            "web-search_3:2",
            "web-search_4:10"
          ]
        },
        {
          "start": 92,
          "end": 157,
          "text": "110cm to the top of their head or 115cm to the tip of their beak.",
          "documentIds": [
            "web-search_4:10"
          ]
        },
        {
          "start": 200,
          "end": 225,
          "text": "3 feet and 7 inches tall.",
          "documentIds": [
            "web-search_3:2",
            "web-search_4:10"
          ]
        },
        {
          "start": 242,
          "end": 262,
          "text": "Anthropornis penguin",
          "documentIds": [
            "web-search_9:4"
          ]
        },
        {
          "start": 276,
          "end": 287,
          "text": "New Zealand",
          "documentIds": [
            "web-search_9:4"
          ]
        },
        {
          "start": 333,
          "end": 343,
          "text": "1.7 metres",
          "documentIds": [
            "web-search_9:4"
          ]
        },
        {
          "start": 403,
          "end": 431,
          "text": "preserved arm and leg bones.",
          "documentIds": [
            "web-search_9:4"
          ]
        },
        {
          "start": 473,
          "end": 488,
          "text": "Kumimanu biceae",
          "documentIds": [
            "web-search_9:4"
          ]
        },
        {
          "start": 512,
          "end": 524,
          "text": "1.77 metres.",
          "documentIds": [
            "web-search_9:4"
          ]
        },
        {
          "start": 613,
          "end": 629,
          "text": "colossus penguin",
          "documentIds": [
            "web-search_3:2"
          ]
        },
        {
          "start": 663,
          "end": 681,
          "text": "40 minutes or more",
          "documentIds": [
            "web-search_3:2"
          ]
        }
      ],
      "documents": [
        {
          "id": "web-search_3:2",
          "snippet": " By comparison, the largest species of penguin alive today, the emperor penguin, is \"only\" about 4 feet tall and can weigh as much as 100 pounds.\n\nInterestingly, because larger bodied penguins can hold their breath for longer, the colossus penguin probably could have stayed underwater for 40 minutes or more. It boggles the mind to imagine the kinds of huge, deep sea fish this mammoth bird might have been capable of hunting.\n\nThe fossil was found at the La Meseta formation on Seymour Island, an island in a chain of 16 major islands around the tip of the Graham Land on the Antarctic Peninsula.",
          "title": "Giant 6-Foot-8 Penguin Discovered in Antarctica",
          "url": "https://www.treehugger.com/giant-foot-penguin-discovered-in-antarctica-4864169"
        },
        {
          "id": "web-search_4:10",
          "snippet": "\n\nWhat is the Tallest Penguin?\n\nThe tallest penguin is the Emperor Penguin which is 110cm to the top of their head or 115cm to the tip of their beak.\n\nHow Tall Are Emperor Penguins in Feet?\n\nAn Emperor Penguin is about 3 feet and 7 inches to the top of its head. They are the largest penguin species currently in existence.\n\nHow Much Do Penguins Weigh in Pounds?\n\nPenguins weigh between 2.5lbs for the smallest species, the Little Penguin, up to 82lbs for the largest species, the Emperor Penguin.\n\nDr. Jackie Symmons is a professional ecologist with a Ph.D. in Ecology and Wildlife Management from Bangor University and over 25 years of experience delivering conservation projects.",
          "title": "How Big Are Penguins? [Height & Weight of Every Species] - Polar Guidebook",
          "url": "https://polarguidebook.com/how-big-are-penguins/"
        },
        {
          "id": "web-search_9:4",
          "snippet": "\n\nA fossil of an Anthropornis penguin found on the island may have been even taller, but this is likely to be an exception. The majority of these penguins were only 1.7 metres tall and weighed around 80 kilogrammes.\n\nWhile Palaeeudyptes klekowskii remains the tallest ever penguin, it is no longer the heaviest. At an estimated 150 kilogrammes, Kumimanu fordycei would have been around three times heavier than any living penguin.\n\nWhile it's uncertain how tall the species was, the height of a closely related species, Kumimanu biceae, has been estimated at 1.77 metres.\n\nThese measurements, however, are all open for debate. Many fossil penguins are only known from preserved arm and leg bones, rather than complete skeletons.",
          "title": "The largest ever penguin species has been discovered in New Zealand | Natural History Museum",
          "url": "https://www.nhm.ac.uk/discover/news/2023/february/largest-ever-penguin-species-discovered-new-zealand.html"
        }
      ],
      "searchResults": [
        {
          "searchQuery": {
            "text": "largest penguin species height",
            "generationId": "908fe321-5d27-48c4-bdb6-493be5687344"
          },
          "documentIds": [
            "web-search_3:2",
            "web-search_4:10",
            "web-search_9:4"
          ],
          "connector": {
            "id": "web-search"
          }
        }
      ],
      "tool_inputs": null,
      "searchQueries": [
        {
          "text": "largest penguin species height",
          "generationId": "908fe321-5d27-48c4-bdb6-493be5687344"
        }
      ]
    }
  }
}
*/

info

You can see a LangSmith trace of this example here.

We can see in the kwargs object that the API request did a few things (the sketch after this list shows how to read these fields back in code):

  • Performed a web search, storing the result data in the searchQueries and searchResults fields. In the searchQueries field we can see the API rephrased our query to largest penguin species height for better results.
  • Retrieved three documents from the search and returned them in the documents field.
  • Generated a list of citations linking spans of the answer back to those documents.
  • Generated a final response grounded in the retrieved content.
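
If you want to read this grounding data back in code, it is exposed on the returned message's additional_kwargs, using the same field names shown in the JSON output above. A minimal sketch (typed loosely here for brevity):

// Pull the grounding metadata off the response from the connectors example.
// The field names mirror the additional_kwargs in the output above.
const { citations, documents, searchQueries } =
  response.additional_kwargs as Record<string, any>;

console.log("rephrased query:", searchQueries?.[0]?.text);
console.log("cited spans:", citations?.map((c: any) => c.text));
console.log("source urls:", documents?.map((d: any) => d.url));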
