The largest community building the future of LLM apps

LangChain’s flexible abstractions and AI-first toolkit make it the #1 choice for developers when building with GenAI.
Join 1M+ builders standardizing their LLM app development in LangChain's Python and JavaScript frameworks.

A complete set of interoperable building blocks

Build end-to-end applications with an extensive library of components. Want to change your model? Future-proof your application by incorporating vendor optionality into your LLM infrastructure design.

Read the docs
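
For illustration, here is a minimal sketch of what vendor optionality can look like in practice, assuming the langchain-openai and langchain-anthropic integration packages are installed; the model names are placeholders. Because chat models share one interface, the surrounding chain stays identical when you swap providers.

```python
# Sketch: swapping model providers behind the same chain.
# Assumes langchain-openai and langchain-anthropic are installed and API keys are set;
# model names are illustrative placeholders.
from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

prompt = ChatPromptTemplate.from_template("Summarize this in one sentence: {text}")
parser = StrOutputParser()

# The rest of the chain is identical regardless of which provider backs it.
openai_chain = prompt | ChatOpenAI(model="gpt-4o-mini") | parser
anthropic_chain = prompt | ChatAnthropic(model="claude-3-5-sonnet-20240620") | parser

print(openai_chain.invoke({"text": "LangChain provides interoperable building blocks."}))
```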

Customizable chains with a durable runtime

Create a composable app fit for your needs with LangChain Expression Language (LCEL). Get out-of-the-box support for parallelization, fallbacks, batching, streaming, and async methods, freeing you to focus on what matters.

See a tutorial
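
Here is a minimal LCEL sketch, assuming the langchain-openai package and an OpenAI API key are available; the prompt and model name are placeholders. The same chain object exposes invoke, batch, stream, and async variants with no extra code.

```python
# Sketch of an LCEL chain; assumes langchain-openai is installed and
# OPENAI_API_KEY is set. The model name is a placeholder.
import asyncio

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

# Compose the chain declaratively with the | operator.
chain = (
    ChatPromptTemplate.from_template("Tell me a one-line fact about {topic}")
    | ChatOpenAI(model="gpt-4o-mini")
    | StrOutputParser()
)

# The same object supports single calls, batching, streaming, and async.
print(chain.invoke({"topic": "volcanoes"}))
print(chain.batch([{"topic": "otters"}, {"topic": "glaciers"}]))

for chunk in chain.stream({"topic": "comets"}):
    print(chunk, end="", flush=True)
print()

print(asyncio.run(chain.ainvoke({"topic": "tides"})))
```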

Augment the power of LLMs with your data

LangChain connects LLMs to your company’s private data and APIs to build context-aware, reasoning applications. Rapidly move from prototype to production with popular methods like RAG or simple chains.

Build a RAG app
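
A minimal RAG sketch follows, assuming the langchain-openai package and an API key; the documents, model name, and question are placeholder examples, and the in-memory vector store stands in for whatever store you use in production.

```python
# Sketch of a retrieval-augmented generation chain; contents are placeholders.
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

# Index a few private documents (placeholder text) into an in-memory store.
vectorstore = InMemoryVectorStore.from_texts(
    ["Our refund window is 30 days.", "Support is available 24/7 via chat."],
    embedding=OpenAIEmbeddings(),
)
retriever = vectorstore.as_retriever()


def format_docs(docs):
    """Join retrieved documents into a single context string."""
    return "\n\n".join(doc.page_content for doc in docs)


prompt = ChatPromptTemplate.from_template(
    "Answer using only this context:\n{context}\n\nQuestion: {question}"
)

# Retrieve relevant documents, stuff them into the prompt, and generate.
rag_chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | ChatOpenAI(model="gpt-4o-mini")
    | StrOutputParser()
)

print(rag_chain.invoke("What is the refund window?"))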

Smart connections to any source of data or knowledge

Need turnkey observability?

LangSmith shines a light on application behavior and performance. Get prompt-level visibility, coupled with tools to debug, test, evaluate, deploy, and monitor your applications with your team.
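
One common way to switch tracing on is a few environment variables set before any chain runs; the API key and project name below are placeholders.

```python
# Sketch: enable LangSmith tracing for every chain/agent run in this process.
# The API key and project name are placeholders.
import os

os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-api-key>"
os.environ["LANGCHAIN_PROJECT"] = "my-first-traced-app"

# Any LangChain code executed after this point is traced automatically;
# no changes to the chain itself are required.
```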

Why choose LangChain?

LangChain is easy to get started with and gives you choice, flexibility, and power as you scale.

2,600+ contributors
The biggest community of any LLM-centric developer framework.
600+ integrations
The largest library of pluggable integrations.
From 0 to 100
Simple to get started, yet robust enough for production. LCEL and LangServe give you control and a fast path to deploy.

One framework. 
Infinite use cases.

LangChain FAQs

Is LangChain still useful if I’m only using one model or vector database provider?

Yes - LangChain is valuable even if you're using a single provider. LangChain Expression Language (LCEL) standardizes methods such as parallelization, fallbacks, and async for more durable execution. We also provide observability out of the box with LangSmith, making the path to production more seamless.
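
As a small illustration, fallbacks are a single method call on any runnable; the two model names below are placeholders, and the second model is only tried if the first call raises an error.

```python
# Sketch: a primary model with a fallback; assumes langchain-openai is installed
# and OPENAI_API_KEY is set. Model names are placeholders.
from langchain_openai import ChatOpenAI

primary = ChatOpenAI(model="gpt-4o-mini")
backup = ChatOpenAI(model="gpt-3.5-turbo")

# .with_fallbacks() returns a runnable that tries the backup
# only when the primary call raises an error.
resilient_model = primary.with_fallbacks([backup])

print(resilient_model.invoke("Say hello in one word.").content)
```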

Is LangChain open source?

Yes - LangChain is an MIT-licensed open-source library and is free to use.

What are the most common ways people use LangChain?

LangChain is most often used to chain together a series of LLM calls or for retrieval-augmented generation (RAG). If you want to build an agent, we recommend trying LangGraph.
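
For the agent path, here is a sketch using LangGraph's prebuilt ReAct-style agent; the tool, model name, and question are illustrative placeholders, not a prescribed setup.

```python
# Sketch: a minimal tool-calling agent with LangGraph's prebuilt helper.
# Assumes langgraph and langchain-openai are installed; names are placeholders.
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent


@tool
def get_word_length(word: str) -> int:
    """Return the number of characters in a word."""
    return len(word)


# The model decides when to call the tool and when to answer directly.
agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), [get_word_length])

result = agent.invoke(
    {"messages": [{"role": "user", "content": "How long is the word 'observability'?"}]}
)
print(result["messages"][-1].content)
```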

Can I use LangChain in production?

Yes, LangChain 0.1 and later are production-ready. We've streamlined the package, so it has fewer dependencies and better compatibility with the rest of your code base. We're also committed to no breaking changes in any minor version after 0.1, so you can upgrade patch versions within a minor release line (e.g., within 0.2.x) without impact.

Is LangChain suitable for the enterprise?

Yes, LangChain is widely used by Fortune 2000 companies. Many enterprises use LangChain to future-proof their stack, allowing for the easy integration of additional model providers as their needs evolve. Visit our inspiration page to see how companies are using LangChain.

How is LangChain different from LangGraph?

For straightforward chains and retrieval flows, start building with LangChain, using LangChain Expression Language to piece together components. If you're building agents or need complex orchestration, use LangGraph instead.

Ready to start shipping reliable GenAI apps faster?

Get started with LangChain, LangGraph, and LangSmith to enhance your LLM app development, from prototype to production.