Get your LLM app from prototype to production

LangSmith is an all-in-one developer platform for every step of the LLM-powered application lifecycle, whether you’re building with LangChain or not.
Debug, collaborate, test, and monitor your LLM applications.

250K+
Users signed up
1Bn
Traces logged
25K+
Monthly active teams

The platform for your LLM development lifecycle

LLM apps are powerful, but they have peculiar characteristics. Non-determinism, coupled with unpredictable natural-language inputs, makes for countless ways the system can fall short. Traditional engineering best practices need to be re-imagined for working with LLMs, and LangSmith supports all phases of the development lifecycle.

Sign Up

Develop with greater visibility

Unexpected results happen all the time with LLMs. With full visibility into the entire sequence of calls, you can spot the source of errors and performance bottlenecks in real time with surgical precision. Debug. Experiment. Observe. Repeat. Until you’re happy with your results.

Go to Docs

Collaborate with teammates to get app behavior just right.

Building LLM-powered applications requires a close partnership between developers and subject matter experts.

Traces

Easily share a chain trace with colleagues, clients, or end users, bringing explainability to anyone with the shared link.

Hub

Use LangSmith Hub to craft, version, and comment on prompts. No engineering experience required.

Annotation Queues

Try out LangSmith Annotation Queues to add human labels and feedback on traces.

Datasets

Easily collect examples and construct datasets from production data or existing sources. Datasets can be used for evaluations, few-shot prompting, and even fine-tuning.

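Datasets can also be created programmatically with the LangSmith Python SDK. A minimal sketch, assuming an API key is configured in your environment (the dataset name and example values below are made up for illustration):

    from langsmith import Client

    client = Client()  # reads your LangSmith API key from the environment

    # Create a dataset and add one input/output example to it
    dataset = client.create_dataset(
        "support-questions",
        description="Questions collected from production traces",
    )
    client.create_example(
        inputs={"question": "How do I reset my password?"},
        outputs={"answer": "Use the 'Forgot password' link on the sign-in page."},
        dataset_id=dataset.id,
    )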

Get tips for testing your LLM application, from design to production.

Monitor cost, latency, and quality.

See what’s happening with your production application, so you can take action when needed or rest assured while your chains and agents do the hard work.

Go to Docs
User feedback collection
Advanced filtering
Online auto-evaluation
Cost tracking
Inspect anomalies and errors
Spot latency spikes
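
Feedback from end users, for example, can be attached to a trace from your own application code with the Python SDK. A minimal sketch (the run ID below is a placeholder):

    from langsmith import Client

    client = Client()

    # Record a thumbs-up from an end user against an existing run (trace)
    client.create_feedback(
        run_id="00000000-0000-0000-0000-000000000000",  # placeholder: the traced run's ID
        key="user_score",
        score=1,
    )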

LangSmith turns LLM "magic" into enterprise-ready applications.

Got a question?

Can I use LangSmith if I don’t use LangChain?

Yes! Many companies that don’t build with LangChain use LangSmith. You can log traces to LangSmith via the Python SDK, the TypeScript SDK, or the API. See the documentation for more information.
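
For example, with the Python SDK any function can be traced with a decorator; nested calls show up as child steps. A minimal sketch, assuming tracing is enabled via environment variables (the function and its return value are purely illustrative):

    from langsmith import traceable

    @traceable(run_type="chain")
    def answer_question(question: str) -> str:
        # Call your model, retriever, or any other logic here
        return "This is where your LLM call would go."

    answer_question("What is LangSmith?")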

I can’t have data leave my environment. Can I self-host LangSmith?

Yes, we allow customers to self-host LangSmith on our enterprise plan. We deliver the software to run on your Kubernetes cluster, and data will not leave your environment. For more information, check out our documentation.

How easy is it to start using LangSmith if I use LangChain?

Getting started is as easy as setting three environment variables in your LangChain code. When you use the LangSmith SDK, there’s a callback handler that collects traces and sends them to your LangSmith organization. All your trace steps are formatted automatically, so there’s virtually no setup cost.
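
As a sketch, those variables typically look like the following; the exact names can vary by SDK version, so check the docs for your setup (the API key and project name are placeholders):

    import os

    os.environ["LANGCHAIN_TRACING_V2"] = "true"           # turn tracing on
    os.environ["LANGCHAIN_API_KEY"] = "<your-api-key>"    # from your LangSmith settings
    os.environ["LANGCHAIN_PROJECT"] = "my-first-project"  # the project to log traces to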

My application isn’t written in Python or TypeScript. Will LangSmith be helpful?

Yes, we have an API that allows you to programmatically interact with every feature LangSmith has to offer, including logging traces, creating datasets, and running evals. See documentation for more detail.
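
As a rough sketch, a single run (trace step) can be posted to the REST API over HTTPS from any language; the exact field names and formats should be checked against the API reference:

    import datetime
    import os
    import uuid

    import requests

    # Log one run to LangSmith via the REST API
    requests.post(
        "https://api.smith.langchain.com/runs",
        headers={"x-api-key": os.environ["LANGCHAIN_API_KEY"]},
        json={
            "id": str(uuid.uuid4()),
            "name": "answer_question",
            "run_type": "chain",
            "inputs": {"question": "What is LangSmith?"},
            "start_time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        },
    )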

Where is LangSmith data stored?

Traces are stored in GCP us-central1. Organizations' traces are logically separated from each other in a ClickHouse database and encrypted in transit and at rest.

What information is contained in LangSmith traces?

LangSmith traces contain the full inputs and outputs of every step of the application. This level of detail is needed so that users have complete visibility into what's happening in their applications when debugging.

Can I sample the traces that I send to LangSmith?

Yes. Starting with LangSmith Python SDK version 0.0.84 and JS SDK version 0.0.64, you can specify the percentage of traces you send to LangSmith. See the documentation for more detail.
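
As a sketch, the sampling rate is set through an environment variable before your application starts; the variable name may differ between SDK versions, so check the documentation:

    import os

    # Send roughly 10% of traces to LangSmith; the rest are dropped client-side
    os.environ["LANGCHAIN_TRACING_SAMPLING_RATE"] = "0.1"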

Will LangSmith add latency to my application?

No, LangSmith does not add any latency to your application. In the LangSmith SDK, there’s a callback handler that sends traces to a LangSmith trace collector, which runs as an asynchronous, distributed process. Additionally, if LangSmith experiences an incident, your application performance will not be disrupted.

Will you train on the data that I send LangSmith?

We will not train on your data, and you own all rights to your data. See LangSmith Terms of Service for more information.

How much does LangSmith cost?

See our pricing page for more information, and find a plan that works for you.

Ready to start shipping reliable GenAI apps faster?

Get started with LangChain, LangSmith, and LangGraph to enhance your LLM app development, from prototype to production.