⚡ Building applications with LLMs through composability ⚡
Looking for the Python version? Check out LangChain.
Production Support: As you move your LangChains into production, we'd love to offer more comprehensive support. Please fill out this form and we'll set up a dedicated support Slack channel.
To get started, install the package:

yarn add langchain-gpt4all

Then import the module you need, for example the OpenAI LLM wrapper:

import { OpenAI } from "langchain-gpt4all/llms/openai";
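As a minimal usage sketch, assuming the fork keeps the upstream LangChain.js LLM interface and that an OPENAI_API_KEY is set in the environment, you can then call the model directly:

```typescript
import { OpenAI } from "langchain-gpt4all/llms/openai";

// Create the OpenAI LLM wrapper (reads OPENAI_API_KEY from the environment).
const model = new OpenAI({ temperature: 0.9 });

// Send a single prompt string and log the completion.
const res = await model.call(
  "What would be a good company name for a company that makes colorful socks?"
);
console.log(res);
```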
LangChain is written in TypeScript and can be used in:
- Node.js (ESM and CommonJS) - 18.x, 19.x, 20.x
- Cloudflare Workers
- Vercel / Next.js (Browser, Serverless and Edge functions)
- Supabase Edge Functions
- Browser
- Deno
Large language models (LLMs) are emerging as a transformative technology, enabling developers to build applications that they previously could not. But using these LLMs in isolation is often not enough to create a truly powerful app - the real power comes when you can combine them with other sources of computation or knowledge.
This library is aimed at assisting in the development of those types of applications.
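For example, a prompt template and an LLM can be composed into a simple chain. This is a sketch that assumes the fork exposes the same PromptTemplate and LLMChain classes as upstream LangChain.js under the langchain-gpt4all entry points:

```typescript
import { OpenAI } from "langchain-gpt4all/llms/openai";
import { PromptTemplate } from "langchain-gpt4all/prompts";
import { LLMChain } from "langchain-gpt4all/chains";

// A reusable prompt with one input variable.
const prompt = PromptTemplate.fromTemplate(
  "What is a good name for a company that makes {product}?"
);

// Compose the prompt with an LLM and run the resulting chain.
const chain = new LLMChain({ llm: new OpenAI({ temperature: 0 }), prompt });
const res = await chain.call({ product: "colorful socks" });
console.log(res.text);
```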
For full documentation of prompts, chains, agents, and more, please see the documentation site.
This is built to integrate as seamlessly as possible with the LangChain Python package. Specifically, all objects (prompts, LLMs, chains, etc.) are designed so that they can be serialized and shared between languages.
The LangChainHub is a central place for the serialized versions of these prompts, chains, and agents.
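For instance, a serialized prompt can be loaded from the hub by path. This is a hedged sketch: it assumes the fork mirrors the upstream loadPrompt helper, and the lc:// path shown is purely illustrative:

```typescript
import { loadPrompt } from "langchain-gpt4all/prompts/load";

// Load a serialized prompt from LangChainHub by its lc:// path
// (the path below is illustrative, not a guaranteed hub entry).
const prompt = await loadPrompt("lc://prompts/hello-world/prompt.yaml");

// Format the loaded prompt (this example prompt takes no input variables).
console.log(await prompt.format({}));
```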
As an open source project in a rapidly developing field, we are extremely open to contributions, whether it be in the form of a new feature, improved infra, or better documentation.
Check out our contributing guidelines for instructions on how to contribute.