
ChatTogetherAI

Setup

  1. Create a TogetherAI account and get your API key here.
  2. Export your API key as an environment variable or set it inline. If no key is passed, the ChatTogetherAI class defaults to process.env.TOGETHER_AI_API_KEY.
export TOGETHER_AI_API_KEY=your-api-key

You can use models provided by TogetherAI as follows:

npm install @langchain/community @langchain/core
tip

We're unifying model params across all packages. We now suggest using model instead of modelName, and apiKey for API keys.
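
For example, a constructor call using the unified names might look like the sketch below; the model name is just the one used later on this page, passed here for illustration:

import { ChatTogetherAI } from "@langchain/community/chat_models/togetherai";

// Unified params: `model` for the model name, `apiKey` for the API key.
const chat = new ChatTogetherAI({
  model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
  apiKey: process.env.TOGETHER_AI_API_KEY,
});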

import { ChatTogetherAI } from "@langchain/community/chat_models/togetherai";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatTogetherAI({
  temperature: 0.9,
  // In Node.js defaults to process.env.TOGETHER_AI_API_KEY
  apiKey: "YOUR-API-KEY",
});

console.log(await model.invoke([new HumanMessage("Hello there!")]));

API Reference:

  • ChatTogetherAI from @langchain/community/chat_models/togetherai
  • HumanMessage from @langchain/core/messages

Tool calling & JSON mode

The TogetherAI chat model supports JSON mode and tool calling.

Tool calling

import { ChatTogetherAI } from "@langchain/community/chat_models/togetherai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { convertToOpenAITool } from "@langchain/core/utils/function_calling";
import { Calculator } from "@langchain/community/tools/calculator";

// Use a pre-built tool
const calculatorTool = convertToOpenAITool(new Calculator());

const modelWithCalculator = new ChatTogetherAI({
  temperature: 0,
  // This is the default env variable name it will look for if none is passed.
  apiKey: process.env.TOGETHER_AI_API_KEY,
  // Together JSON mode/tool calling only supports a select number of models
  model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
}).bind({
  // Bind the tool to the model.
  tools: [calculatorTool],
  tool_choice: calculatorTool, // Specify what tool the model should use
});

const prompt = ChatPromptTemplate.fromMessages([
["system", "You are a super not-so-smart mathmatician."],
["human", "Help me out, how can I add {math}?"],
]);

// Use LCEL to chain the prompt to the model.
const response = await prompt.pipe(modelWithCalculator).invoke({
math: "2 plus 3",
});

console.log(JSON.stringify(response.additional_kwargs.tool_calls));
/**
[
  {
    "id": "call_f4lzeeuho939vs4dilwd7267",
    "type": "function",
    "function": {
      "name": "calculator",
      "arguments": "{\"input\":\"2 + 3\"}"
    }
  }
]
*/

API Reference:

  • ChatTogetherAI from @langchain/community/chat_models/togetherai
  • ChatPromptTemplate from @langchain/core/prompts
  • convertToOpenAITool from @langchain/core/utils/function_calling
  • Calculator from @langchain/community/tools/calculator

tip

See a LangSmith trace of the above example here.
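
Note that the model only selects the tool and produces its arguments; it does not run the tool for you. If you want to execute the tool call returned above, you can parse the arguments yourself and pass them to the Calculator instance. This is a minimal sketch, not part of the original example, and it assumes the tool call shape shown in the output:

// Sketch: execute the first tool call from the response above.
const toolCall = response.additional_kwargs.tool_calls?.[0];
if (toolCall?.function) {
  const args = JSON.parse(toolCall.function.arguments); // e.g. { input: "2 + 3" }
  const result = await new Calculator().invoke(args.input);
  console.log(result); // "5"
}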

JSON mode

To use JSON mode, you must include the string "JSON" somewhere in the prompt. A typical convention is to tell the model to respond in JSON, e.g. "Respond to the user in JSON format."

import { ChatTogetherAI } from "@langchain/community/chat_models/togetherai";
import { ChatPromptTemplate } from "@langchain/core/prompts";

// Define a JSON schema for the response
const responseSchema = {
type: "object",
properties: {
orderedArray: {
type: "array",
items: {
type: "number",
},
},
},
required: ["orderedArray"],
};
const modelWithJsonSchema = new ChatTogetherAI({
  temperature: 0,
  apiKey: process.env.TOGETHER_AI_API_KEY,
  model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
}).bind({
  response_format: {
    type: "json_object", // Define the response format as a JSON object
    schema: responseSchema, // Pass in the schema for the model's response
  },
});

const prompt = ChatPromptTemplate.fromMessages([
["system", "You are a helpful assistant who responds in JSON."],
["human", "Please list this output in order of DESC {unorderedList}."],
]);

// Use LCEL to chain the prompt to the model.
const response = await prompt.pipe(modelWithJsonSchema).invoke({
unorderedList: "[1, 4, 2, 8]",
});

console.log(JSON.parse(response.content as string));
/**
{ orderedArray: [ 8, 4, 2, 1 ] }
*/

API Reference:

  • ChatTogetherAI from @langchain/community/chat_models/togetherai
  • ChatPromptTemplate from @langchain/core/prompts

tip

See a LangSmith trace of the above example here.
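
Instead of calling JSON.parse on the raw message content yourself, you could also chain a JsonOutputParser from @langchain/core/output_parsers onto the model. A minimal sketch, reusing the prompt and modelWithJsonSchema from the example above:

import { JsonOutputParser } from "@langchain/core/output_parsers";

// Sketch: let an output parser handle the JSON.parse step.
const jsonChain = prompt.pipe(modelWithJsonSchema).pipe(new JsonOutputParser());

const parsed = await jsonChain.invoke({ unorderedList: "[1, 4, 2, 8]" });
console.log(parsed); // e.g. { orderedArray: [ 8, 4, 2, 1 ] }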

Behind the scenes, this integration calls TogetherAI's OpenAI-compatible API via the OpenAI SDK, with some caveats:

  • Certain properties are not supported by the TogetherAI API, see here.
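
Because the API is OpenAI-compatible, you can also point an OpenAI-style client at TogetherAI directly. The sketch below is only illustrative and is not part of this integration; it assumes https://api.together.xyz/v1 as the base URL and uses the ChatOpenAI class from @langchain/openai:

import { ChatOpenAI } from "@langchain/openai";

// Sketch: reuse the OpenAI chat model class against TogetherAI's
// OpenAI-compatible endpoint (assumed base URL below).
const togetherViaOpenAI = new ChatOpenAI({
  apiKey: process.env.TOGETHER_AI_API_KEY,
  model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
  configuration: {
    baseURL: "https://api.together.xyz/v1",
  },
});

console.log(await togetherViaOpenAI.invoke("Hello there!"));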
