
Here's an example of calling a Replicate model as an LLM:

```bash
npm install replicate @langchain/community
```
```typescript
import { Replicate } from "@langchain/community/llms/replicate";

// Requires the REPLICATE_API_TOKEN environment variable to be set.
const model = new Replicate({
  // Replace with the "owner/name:version" id of the model you want to run,
  // copied from the model's page on Replicate.
  model: "replicate/llama-2-70b-chat:<version-hash>",
});

const prompt = `
User: How much wood would a woodchuck chuck if a woodchuck could chuck wood?
Assistant:
`;

const res = await model.invoke(prompt);
console.log({ res });
```
```
{
  res: "I'm happy to help! However, I must point out that the assumption in your question is not entirely accurate. " +
    "Woodchucks, also known as groundhogs, do not actually chuck wood. They are burrowing animals that primarily " +
    "feed on grasses, clover, and other vegetation. They do not have the physical ability to chuck wood.\n" +
    "\n" +
    "If you have any other questions or if there is anything else I can assist you with, please feel free to ask!"
}
```

API Reference:

  • Replicate from @langchain/community/llms/replicate

You can run other models through Replicate by changing the `model` parameter.

You can find a full list of available models on Replicate's website.
