The **[AI SDK](https://ai-sdk.dev)** LangChain adapter provides seamless integration between [LangChain](https://langchain.com/) and the AI SDK, enabling you to use LangChain agents and graphs with AI SDK UI components.
```bash
npm install @ai-sdk/langchain @langchain/core
```

> **Note:** `@langchain/core` is a required peer dependency.
- Convert AI SDK `UIMessage` to LangChain `BaseMessage` format
- Transform LangChain/LangGraph streams to AI SDK `UIMessageStream`
- `ChatTransport` implementation for LangSmith deployments
- Full support for text, tool calls, and tool results
- Custom data streaming with typed events (`data-{type}`)
Use `toBaseMessages` to convert AI SDK messages to LangChain format:
```ts
import { toBaseMessages } from '@ai-sdk/langchain';

// Convert UI messages to LangChain format
const langchainMessages = await toBaseMessages(uiMessages);

// Use with any LangChain model
const response = await model.invoke(langchainMessages);
```
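Conceptually, the conversion maps each UI message's `parts` array onto LangChain's role-based message format. The simplified converter below is only an illustration of that idea for text parts; the real `toBaseMessages` also handles tool calls, files, and reasoning parts:

```typescript
// Simplified illustration of UIMessage -> role/content conversion.
// Toy types; the real adapter works with full UIMessage/BaseMessage objects.
type TextPart = { type: 'text'; text: string };
type SimpleUIMessage = {
  role: 'user' | 'assistant' | 'system';
  parts: TextPart[];
};

function toSimpleChatMessages(messages: SimpleUIMessage[]) {
  return messages.map((m) => ({
    role: m.role,
    // Concatenate all text parts into a single content string
    content: m.parts.map((p) => p.text).join(''),
  }));
}
```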
Use `toUIMessageStream` to convert LangGraph streams to AI SDK format:
```ts
import { toBaseMessages, toUIMessageStream } from '@ai-sdk/langchain';
import { createUIMessageStreamResponse } from 'ai';

// Convert messages and stream from a LangGraph graph
const langchainMessages = await toBaseMessages(uiMessages);
const langchainStream = await graph.stream(
  { messages: langchainMessages },
  { streamMode: ['values', 'messages'] },
);

// Convert to UI message stream response
return createUIMessageStreamResponse({
  stream: toUIMessageStream(langchainStream),
});
```
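Why request both `values` and `messages` modes? `messages` events carry incremental chunks as tokens arrive, while `values` events (full state snapshots) finalize whatever is still pending. The toy accumulator below illustrates that interplay under assumed semantics; it is not the adapter's implementation:

```typescript
// Toy model: 'messages' events accumulate streamed text,
// and a 'values' snapshot finalizes the pending message.
// Assumed semantics for illustration only.
type StreamEvent =
  | { mode: 'messages'; text: string }
  | { mode: 'values' };

function collectFinalizedText(events: StreamEvent[]): string[] {
  const finalized: string[] = [];
  let pending = '';
  for (const e of events) {
    if (e.mode === 'messages') {
      pending += e.text; // accumulate streamed chunks
    } else if (pending !== '') {
      finalized.push(pending); // a state snapshot closes the message
      pending = '';
    }
  }
  return finalized;
}
```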
You can also use `toUIMessageStream` with `streamEvents()` for more granular event handling:
```ts
import { toBaseMessages, toUIMessageStream } from '@ai-sdk/langchain';
import { createUIMessageStreamResponse } from 'ai';

// Using streamEvents with an agent
const langchainMessages = await toBaseMessages(uiMessages);
const streamEvents = agent.streamEvents(
  { messages: langchainMessages },
  { version: 'v2' },
);

// Convert to UI message stream response
return createUIMessageStreamResponse({
  stream: toUIMessageStream(streamEvents),
});
```
The adapter automatically detects the stream type and handles:
- `on_chat_model_stream` events for text streaming
- `on_tool_start` and `on_tool_end` events for tool calls
- Reasoning content from `contentBlocks`
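As a mental model, this handling can be pictured as routing on the event name. The sketch below is purely illustrative: the returned chunk-type names are assumptions, and the real adapter emits full `UIMessageChunk` objects rather than strings:

```typescript
// Schematic sketch of routing streamEvents events to UI-facing
// chunk kinds. Not the adapter's actual implementation.
type LangChainEvent =
  | { event: 'on_chat_model_stream'; data: { chunk: { text: string } } }
  | { event: 'on_tool_start'; name: string }
  | { event: 'on_tool_end'; name: string; data: { output: unknown } };

function routeEvent(e: LangChainEvent): string {
  switch (e.event) {
    case 'on_chat_model_stream':
      return 'text-delta'; // token streaming -> text chunks
    case 'on_tool_start':
      return 'tool-input-start'; // tool call begins
    case 'on_tool_end':
      return 'tool-output-available'; // tool result ready
  }
}
```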
LangChain tools can emit custom data events using `config.writer()`. The adapter converts these to typed `data-{type}` parts:
```ts
import { tool, type ToolRuntime } from 'langchain';
import { z } from 'zod';

const analyzeDataTool = tool(
  async ({ query }, config: ToolRuntime) => {
    // Emit progress updates - becomes 'data-progress' in the UI
    config.writer?.({
      type: 'progress',
      id: 'analysis-1', // Include 'id' to persist in message.parts
      step: 'fetching',
      message: 'Fetching data...',
      progress: 50,
    });

    // ... perform analysis ...

    // Emit status update - becomes 'data-status' in the UI
    config.writer?.({
      type: 'status',
      id: 'analysis-1-status',
      status: 'complete',
      message: 'Analysis finished',
    });

    return 'Analysis complete';
  },
  {
    name: 'analyze_data',
    description: 'Analyze data with progress updates',
    schema: z.object({ query: z.string() }),
  },
);
```
Enable the `custom` stream mode to receive these events:

```ts
const stream = await graph.stream(
  { messages: langchainMessages },
  { streamMode: ['values', 'messages', 'custom'] },
);
```
Custom data behavior:

- Data with an `id` field is persistent (added to `message.parts` for rendering)
- Data without an `id` is transient (only delivered via the `onData` callback)
- The `type` field determines the event name: `{ type: 'progress' }` → `data-progress`
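These rules can be sketched as a small reducer: every event reaches the `onData` callback, but only events carrying an `id` become parts, and a repeated `id` replaces the earlier part rather than appending a new one. The following is a toy model of that behavior based on the rules above, not the adapter's source:

```typescript
// Toy model of persistent vs. transient custom data handling.
// Assumed semantics derived from the documented rules.
type DataEvent = { type: string; id?: string; [key: string]: unknown };
type DataPart = { type: string; id: string; data: DataEvent };

function applyDataEvent(
  parts: DataPart[],
  event: DataEvent,
  onData: (e: DataEvent) => void,
): DataPart[] {
  onData(event); // every event is delivered to the onData callback
  if (event.id === undefined) return parts; // transient: never persisted
  const part: DataPart = { type: `data-${event.type}`, id: event.id, data: event };
  const i = parts.findIndex((p) => p.id === event.id);
  // Same id: replace in place (e.g. progress updates); otherwise append
  if (i >= 0) return [...parts.slice(0, i), part, ...parts.slice(i + 1)];
  return [...parts, part];
}
```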
Use `LangSmithDeploymentTransport` with the AI SDK `useChat` hook to connect directly to a LangGraph deployment from the browser:
```tsx
import { useChat } from '@ai-sdk/react';
import { LangSmithDeploymentTransport } from '@ai-sdk/langchain';
import { useMemo } from 'react';

function Chat() {
  // Memoize the transport so it is not recreated on every render
  const transport = useMemo(
    () =>
      new LangSmithDeploymentTransport({
        url: 'https://your-deployment.us.langgraph.app',
        apiKey: process.env.LANGSMITH_API_KEY,
      }),
    [],
  );

  const { messages, sendMessage } = useChat({ transport });

  return <div>{/* render messages; send new ones with sendMessage */}</div>;
}
```
## API Reference

### toBaseMessages

Converts AI SDK `UIMessage` objects to LangChain `BaseMessage` objects.

**Parameters:**

- `messages: UIMessage[]` - Array of AI SDK UI messages

**Returns:** `Promise<BaseMessage[]>`

### Converting ModelMessage objects

Converts AI SDK `ModelMessage` objects to LangChain `BaseMessage` objects.

**Parameters:**

- `modelMessages: ModelMessage[]` - Array of model messages

**Returns:** `BaseMessage[]`

### toUIMessageStream

Converts a LangChain/LangGraph stream to an AI SDK `UIMessageStream`.

**Parameters:**

- `stream: AsyncIterable | ReadableStream` - A stream from LangChain `model.stream()`, LangGraph `graph.stream()`, or `streamEvents()`

**Returns:** `ReadableStream`

**Supported stream types:**

- Model streams - Direct `AIMessageChunk` streams from `model.stream()`
- LangGraph streams - Streams with `streamMode: ['values', 'messages']`
- streamEvents - Event streams from `agent.streamEvents()` or `model.streamEvents()`

**Supported LangGraph stream events:**

- `messages` - Streaming message chunks (text, tool calls)
- `values` - State updates that finalize pending message chunks
- `custom` - Custom data events (emitted as `data-{type}` chunks)

**Supported streamEvents events:**

- `on_chat_model_stream` - Token streaming from chat models
- `on_tool_start` - Tool execution start
- `on_tool_end` - Tool execution end with output

### LangSmithDeploymentTransport

A `ChatTransport` implementation for LangSmith/LangGraph deployments.

**Constructor Parameters:**

- `options: LangSmithDeploymentTransportOptions` - Configuration for the RemoteGraph connection
  - `url: string` - LangSmith deployment URL or local server URL
  - `apiKey?: string` - API key for authentication (optional for local development)
  - `graphId?: string` - The ID of the graph to connect to (defaults to `'agent'`)

**Implements:** `ChatTransport`

Please check out the [AI SDK documentation](https://ai-sdk.dev) for more information.