
Commit e574a12

core (ai/mcp): update experimental MCP client documentation for Streamable HTTP transport usage (#5972) (#6093)
## Background

The current experimental MCP client already supports the new HTTP transport through the custom transport option (as opposed to passing in a config).

## Summary

Updates documentation to illustrate usage of the new Streamable HTTP transport with `experimental_createMCPClient` for tool conversion. As of now, `experimental_createMCPClient` is only used in the AI SDK as a way to fetch and call MCP server tools; for this reason, we are **not** adding support for session management or resumable streams, which are new features supported by the new HTTP transport.

Examples have been updated to illustrate basic usage, and a new Next.js example has been added to illustrate setting up a **stateless** MCP **server** to be used with `useChat` and `streamText`.

## Future Work

Given the deprecation of the SSE transport, we should likely deprecate it on our end as well. We should also consider fully deprecating native support for transports (e.g. our custom `stdio` transport) and requiring users to always pass in a custom transport, to limit the maintenance burden on our end.

Fixes #5984

---------

Co-authored-by: Grace Yun <[email protected]>
Parent commit: 6caff7b

File tree: 15 files changed, +641 −157 lines

## content/cookbook/01-next/73-mcp-tools.mdx

Lines changed: 20 additions & 7 deletions
````diff
@@ -12,10 +12,15 @@ The AI SDK supports Model Context Protocol (MCP) tools by offering a lightweight
 
 Let's create a route handler for `/api/completion` that will generate text based on the input prompt and MCP tools that can be called at any time during a generation. The route will call the `streamText` function from the `ai` module, which will then generate text based on the input prompt and stream it to the client.
 
+To use the `StreamableHTTPClientTransport`, you will need to install the official Typescript SDK for Model Context Protocol:
+
+<Snippet text="pnpm install @modelcontextprotocol/sdk" />
+
 ```ts filename="app/api/completion/route.ts"
 import { experimental_createMCPClient, streamText } from 'ai';
 import { Experimental_StdioMCPTransport } from 'ai/mcp-stdio';
 import { openai } from '@ai-sdk/openai';
+import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp';
 
 export async function POST(req: Request) {
   const { prompt }: { prompt: string } = await req.json();
@@ -38,17 +43,17 @@ export async function POST(req: Request) {
     },
   });
 
-  // Similarly to the stdio example, you can pass in your own custom transport as long as it implements the `MCPTransport` interface:
-  const transport = new MyCustomTransport({
-    // ...
-  });
-  const customTransportClient = await experimental_createMCPClient({
+  // Similarly to the stdio example, you can pass in your own custom transport as long as it implements the `MCPTransport` interface (e.g. `StreamableHTTPClientTransport`):
+  const transport = new StreamableHTTPClientTransport(
+    new URL('http://localhost:3000/mcp'),
+  );
+  const customClient = await experimental_createMCPClient({
     transport,
   });
 
   const toolSetOne = await stdioClient.tools();
   const toolSetTwo = await sseClient.tools();
-  const toolSetThree = await customTransportClient.tools();
+  const toolSetThree = await customClient.tools();
   const tools = {
     ...toolSetOne,
     ...toolSetTwo,
@@ -63,7 +68,15 @@ export async function POST(req: Request) {
     onFinish: async () => {
       await stdioClient.close();
       await sseClient.close();
-      await customTransportClient.close();
+      await customClient.close();
+    },
+    // Closing clients onError is optional
+    // - Closing: Immediately frees resources, prevents hanging connections
+    // - Not closing: Keeps connection open for retries
+    onError: async error => {
+      await stdioClient.close();
+      await sseClient.close();
+      await customClient.close();
     },
   });
 
````
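The route above merges the tool sets from several MCP clients with object spreads, so on a name collision the tool from the later client silently wins. A small standalone sketch of that merge behavior, using hypothetical tool names and simplified string values in place of real tool definitions:

```typescript
// Hypothetical tool sets, standing in for the results of `client.tools()`.
// Values are simplified to strings; real tool sets map names to tool objects.
const toolSetOne = { 'get-user-info': 'from stdio server' };
const toolSetTwo = { 'get-weather': 'from sse server' };
const toolSetThree = { 'get-user-info': 'from http server' };

// Later spreads override earlier keys, so a duplicate tool name resolves
// to the definition from the last client listed.
const tools = {
  ...toolSetOne,
  ...toolSetTwo,
  ...toolSetThree,
};

console.log(tools['get-user-info']); // definition from toolSetThree wins
```

If two servers legitimately expose tools with the same name, consider namespacing the keys before merging rather than relying on spread order.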

## content/docs/03-ai-sdk-core/15-tools-and-tool-calling.mdx

Lines changed: 17 additions & 5 deletions
````diff
@@ -679,7 +679,7 @@ Create an MCP client using either:
 
 - `SSE` (Server-Sent Events): Uses HTTP-based real-time communication, better suited for remote servers that need to send data over the network
 - `stdio`: Uses standard input and output streams for communication, ideal for local tool servers running on the same machine (like CLI tools or local services)
-- Custom transport: Bring your own transport by implementing the `MCPTransport` interface
+- Custom transport: Bring your own transport by implementing the `MCPTransport` interface, ideal when implementing transports from MCP's official Typescript SDK (e.g. `StreamableHTTPClientTransport`)
 
 #### SSE Transport
 
@@ -719,18 +719,30 @@ const mcpClient = await createMCPClient({
 
 #### Custom Transport
 
-You can also bring your own transport by implementing the `MCPTransport` interface:
+You can also bring your own transport, as long as it implements the `MCPTransport` interface. Below is an example of using the new `StreamableHTTPClientTransport` from MCP's official Typescript SDK:
 
 ```typescript
-import { MCPTransport, createMCPClient } from 'ai';
+import {
+  MCPTransport,
+  experimental_createMCPClient as createMCPClient,
+} from 'ai';
+import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp';
 
+const url = new URL('http://localhost:3000/mcp');
 const mcpClient = await createMCPClient({
-  transport: new MyCustomTransport({
-    // ...
+  transport: new StreamableHTTPClientTransport(url, {
+    sessionId: 'session_123',
   }),
 });
 ```
 
+<Note>
+  The client returned by the `experimental_createMCPClient` function is a
+  lightweight client intended for use in tool conversion. It currently does not
+  support all features of the full MCP client, such as: authorization, session
+  management, resumable streams, and receiving notifications.
+</Note>
+
 #### Closing the MCP Client
 
 After initialization, you should close the MCP client based on your usage pattern:
````
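For orientation, the `MCPTransport` contract referenced above can be satisfied by a very small class. The sketch below assumes the transport shape used by the MCP Typescript SDK (`start`/`send`/`close` methods plus optional `onmessage`/`onerror`/`onclose` callbacks); the in-memory loopback behavior and the `InMemoryLoopbackTransport` name are purely illustrative, not part of any SDK:

```typescript
// Minimal JSON-RPC message shape; the real SDK exports a richer union type.
type JSONRPCMessage = { jsonrpc: '2.0'; id?: number | string | null } & Record<
  string,
  unknown
>;

// An illustrative in-memory transport: every message sent is echoed back
// to `onmessage`, which is enough to exercise the interface contract.
class InMemoryLoopbackTransport {
  onmessage?: (message: JSONRPCMessage) => void;
  onerror?: (error: Error) => void;
  onclose?: () => void;

  async start(): Promise<void> {
    // A real transport would open its connection here.
  }

  async send(message: JSONRPCMessage): Promise<void> {
    // Loop the message straight back, as if the server echoed it.
    this.onmessage?.(message);
  }

  async close(): Promise<void> {
    this.onclose?.();
  }
}

// Usage: wire up a callback, send a message, observe the loopback.
const transport = new InMemoryLoopbackTransport();
const received: JSONRPCMessage[] = [];
transport.onmessage = message => received.push(message);

void transport.start();
void transport.send({ jsonrpc: '2.0', id: 1, method: 'ping' });
void transport.close();
```

A transport like this is mainly useful in tests; in production you would reach for `StreamableHTTPClientTransport` or the built-in `stdio` transport instead.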

## examples/mcp/README.md

Lines changed: 15 additions & 1 deletion
````diff
@@ -18,6 +18,20 @@ pnpm install
 pnpm build
 ```
 
+## Streamable HTTP Transport (Stateful)
+
+Start server
+
+```sh
+pnpm http:server
+```
+
+Run example:
+
+```sh
+pnpm http:client
+```
+
 ## Stdio Transport
 
 Build
@@ -32,7 +46,7 @@ Run example:
 pnpm stdio:client
 ```
 
-## SSE Transport
+## SSE Transport (Legacy)
 
 Start server
 
````
## examples/mcp/package.json

Lines changed: 3 additions & 1 deletion
````diff
@@ -7,13 +7,15 @@
     "sse:client": "tsx src/sse/client.ts",
     "stdio:build": "tsc src/stdio/server.ts --outDir src/stdio/dist --target es2023 --module nodenext",
     "stdio:client": "tsx src/stdio/client.ts",
+    "http:server": "tsx src/http/server.ts",
+    "http:client": "tsx src/http/client.ts",
     "custom-transport:build": "tsc src/custom-transport/server.ts --outDir src/custom-transport/dist --target es2023 --module nodenext",
     "custom-transport:client": "tsx src/custom-transport/client.ts",
     "type-check": "tsc --build"
   },
   "dependencies": {
     "@ai-sdk/openai": "workspace:*",
-    "@modelcontextprotocol/sdk": "^1.7.0",
+    "@modelcontextprotocol/sdk": "^1.10.2",
     "ai": "workspace:*",
     "dotenv": "16.4.5",
     "express": "5.0.1",
````

## examples/mcp/src/http/client.ts (new file)

Lines changed: 37 additions & 0 deletions
```ts
import { openai } from '@ai-sdk/openai';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';
import { experimental_createMCPClient, generateText } from 'ai';
import 'dotenv/config';

async function main() {
  const transport = new StreamableHTTPClientTransport(
    new URL('http://localhost:3000/mcp'),
  );

  const mcpClient = await experimental_createMCPClient({
    transport,
  });

  try {
    const tools = await mcpClient.tools();

    const { text: answer } = await generateText({
      model: openai('gpt-4o-mini'),
      tools,
      maxSteps: 10,
      onStepFinish: async ({ toolResults }) => {
        console.log(`STEP RESULTS: ${JSON.stringify(toolResults, null, 2)}`);
      },
      system: 'You are a helpful chatbot',
      prompt: 'Look up information about user with the ID foo_123',
    });

    console.log(`FINAL ANSWER: ${answer}`);
  } catch (error) {
    console.error('Error:', error);
  } finally {
    await mcpClient.close();
  }
}

main();
```

## examples/mcp/src/http/server.ts (new file)

Lines changed: 105 additions & 0 deletions
```ts
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';
import express from 'express';
import { z } from 'zod';

// Stateless Mode: see https://github.com/modelcontextprotocol/typescript-sdk/tree/main/src/examples#stateless-mode for more details

const app = express();
app.use(express.json());

app.post('/mcp', async (req, res) => {
  const server = new McpServer({
    name: 'example-http-server',
    version: '1.0.0',
  });

  server.tool(
    'get-user-info',
    'Get user info',
    {
      userId: z.string(),
    },
    async ({ userId }) => {
      return {
        content: [
          {
            type: 'text',
            text: `Here is information about user ${userId}:`,
          },
          {
            type: 'text',
            text: `Name: John Doe`,
          },
          {
            type: 'text',
            text: `Email: [email protected]`,
          },
          {
            type: 'text',
            text: `Age: 30`,
          },
        ],
      };
    },
  );

  try {
    const transport = new StreamableHTTPServerTransport({
      sessionIdGenerator: undefined,
    });
    await server.connect(transport);
    await transport.handleRequest(req, res, req.body);
    res.on('close', () => {
      transport.close();
      server.close();
    });
  } catch (error) {
    console.error('Error handling MCP request:', error);
    if (!res.headersSent) {
      res.status(500).json({
        jsonrpc: '2.0',
        error: {
          code: -32603,
          message: 'Internal server error',
        },
        id: null,
      });
    }
  }
});

app.get('/mcp', async (_req, res) => {
  console.log('Received GET MCP request');
  res.writeHead(405).end(
    JSON.stringify({
      jsonrpc: '2.0',
      error: {
        code: -32000,
        message: 'Method not allowed.',
      },
      id: null,
    }),
  );
});

app.delete('/mcp', async (_req, res) => {
  console.log('Received DELETE MCP request');
  res.writeHead(405).end(
    JSON.stringify({
      jsonrpc: '2.0',
      error: {
        code: -32000,
        message: 'Method not allowed.',
      },
      id: null,
    }),
  );
});

app.listen(3000);

process.on('SIGINT', async () => {
  console.log('Shutting down server...');
  process.exit(0);
});
```
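The 405 and 500 handlers above build their JSON-RPC error envelopes inline. A small helper (hypothetical, not part of the MCP SDK) makes the shape explicit; per the JSON-RPC 2.0 spec, `-32603` is the reserved "internal error" code and `-32000` falls in the server-defined error range:

```typescript
// JSON-RPC 2.0 error envelope, matching what the handlers above send.
type JsonRpcError = {
  jsonrpc: '2.0';
  error: { code: number; message: string };
  id: null;
};

// Hypothetical helper to build the envelope consistently.
// `id: null` is used because these errors are not tied to a specific request.
function jsonRpcError(code: number, message: string): JsonRpcError {
  return { jsonrpc: '2.0', error: { code, message }, id: null };
}

const internalError = jsonRpcError(-32603, 'Internal server error');
const methodNotAllowed = jsonRpcError(-32000, 'Method not allowed.');

console.log(JSON.stringify(methodNotAllowed));
```

With such a helper, each handler reduces to `res.writeHead(405).end(JSON.stringify(methodNotAllowed))`, keeping the envelope shape in one place.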

## examples/next-openai/app/api/mcp/route.ts

Lines changed: 0 additions & 35 deletions. This file was deleted.

Lines changed: 42 additions & 0 deletions
```ts
import { openai } from '@ai-sdk/openai';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';
import { experimental_createMCPClient, streamText } from 'ai';

export async function POST(req: Request) {
  const url = new URL('http://localhost:3000/mcp/server');
  const transport = new StreamableHTTPClientTransport(url);

  const [client, { messages }] = await Promise.all([
    experimental_createMCPClient({
      transport,
    }),
    req.json(),
  ]);

  try {
    const tools = await client.tools();

    const result = streamText({
      model: openai('gpt-4o-mini'),
      tools,
      maxSteps: 5,
      onStepFinish: async ({ toolResults }) => {
        console.log(`STEP RESULTS: ${JSON.stringify(toolResults, null, 2)}`);
      },
      system: 'You are a helpful chatbot capable of basic arithmetic problems',
      messages,
      onFinish: async () => {
        await client.close();
      },
      // Optional, enables immediate clean up of resources but connection will not be retained for retries:
      // onError: async error => {
      //   await client.close();
      // },
    });

    return result.toDataStreamResponse();
  } catch (error) {
    console.error(error);
    return Response.json({ error: 'Unexpected error' }, { status: 500 });
  }
}
```
