Anthropic SDK integration for Microsoft Teams.ai - Use Claude models (Opus, Sonnet, Haiku) in your Teams.ai applications with just a few lines of code.
- Type-Safe Model Selection - Use enums instead of error-prone strings
- Streaming Support - Get responses token-by-token with an `onChunk` callback
- Function Calling - Auto-execute functions with Claude's tool use
- Multi-Part Messages - Send text and handle complex conversations
- Full IChatModel Interface - Drop-in replacement for OpenAI models
- Configurable - Set temperature, max tokens, and all Anthropic parameters
For Teams Anthropic Integration: Use the teams-anthropic-integration skill to quickly set up Teams apps with You.com MCP server integration.
```shell
# Install the Teams Anthropic integration skill
npx skills add youdotcom-oss/agent-skills --skill teams-anthropic-integration
```

Once installed, ask your AI agent: "Add Anthropic Claude to my Teams app" or "Create a Teams app with You.com MCP and Anthropic".
Supported AI agents: Claude Code, Cursor, Windsurf, Cody, Continue, and more.
See Skill Documentation for complete integration guide.
Get up and running with Claude in your Teams.ai app in 3 quick steps:
```shell
npm install @youdotcom-oss/teams-anthropic @anthropic-ai/sdk
```

Get your API key from console.anthropic.com and set it in your environment:

```shell
export ANTHROPIC_API_KEY=your-api-key-here
```

```typescript
import { AnthropicChatModel, AnthropicModel } from '@youdotcom-oss/teams-anthropic';

const model = new AnthropicChatModel({
  model: AnthropicModel.CLAUDE_SONNET_4_6,
});

const response = await model.send(
  { role: 'user', content: 'What is the capital of France?' }
);

console.log(response.content); // "The capital of France is Paris."
```

That's it! Your Teams.ai app now uses Claude models.
Integrate with You.com MCP server for web search and AI capabilities:
```typescript
import { App } from '@microsoft/teams.apps';
import { ChatPrompt } from '@microsoft/teams.ai';
import { ConsoleLogger } from '@microsoft/teams.common';
import { McpClientPlugin } from '@microsoft/teams.mcpclient';
import { AnthropicChatModel, AnthropicModel } from '@youdotcom-oss/teams-anthropic';

// Validate required environment variables
if (!process.env.YDC_API_KEY) {
  throw new Error('YDC_API_KEY environment variable is required');
}
if (!process.env.ANTHROPIC_API_KEY) {
  throw new Error('ANTHROPIC_API_KEY environment variable is required');
}

const logger = new ConsoleLogger('mcp-client', { level: 'info' });

const prompt = new ChatPrompt(
  {
    instructions: 'You are a helpful assistant with access to web search and AI capabilities.',
    model: new AnthropicChatModel({
      model: AnthropicModel.CLAUDE_SONNET_4_6,
      apiKey: process.env.ANTHROPIC_API_KEY,
    }),
  },
  [new McpClientPlugin({ logger })]
).usePlugin('mcpClient', {
  url: process.env.MCP_SERVER_URL || 'http://localhost:4000/mcp',
  params: {
    headers: {
      'User-Agent': 'teams-ai-mcp-client/1.0.0 (teams.ai; anthropic)',
      Authorization: `Bearer ${process.env.YDC_API_KEY}`,
    },
  },
});

const app = new App();

app.on('message', async ({ send, activity }) => {
  await send({ type: 'typing' });
  const result = await prompt.send(activity.text);
  if (result.content) {
    await send(result.content);
  }
});

app.start().catch(console.error);
```

Complete template available: `node_modules/@youdotcom-oss/teams-anthropic/templates/mcp-client.ts`
Send a message and get a response:
```typescript
import { AnthropicChatModel, AnthropicModel } from '@youdotcom-oss/teams-anthropic';

const model = new AnthropicChatModel({
  model: AnthropicModel.CLAUDE_SONNET_4_6,
  requestOptions: {
    max_tokens: 2048,
    temperature: 0.7,
  },
});

const response = await model.send(
  { role: 'user', content: 'Explain quantum computing in simple terms' },
  {
    system: {
      role: 'system',
      content: 'You are a helpful teacher who explains complex topics simply.',
    },
  }
);

console.log(response.content);
```

Get responses token-by-token for a better user experience:
```typescript
const model = new AnthropicChatModel({
  model: AnthropicModel.CLAUDE_SONNET_4_6,
});

const response = await model.send(
  { role: 'user', content: 'Write a short story about a robot' },
  {
    onChunk: async (delta) => {
      // Stream each token as it arrives
      process.stdout.write(delta);
    },
  }
);

console.log('\n\nFull response:', response.content);
```

Let Claude call functions to get information:
```typescript
const model = new AnthropicChatModel({
  model: AnthropicModel.CLAUDE_SONNET_4_6,
});

const response = await model.send(
  { role: 'user', content: 'What is the weather in San Francisco?' },
  {
    functions: {
      get_weather: {
        description: 'Get the current weather for a location',
        parameters: {
          location: { type: 'string', description: 'City name' },
        },
        handler: async (args: { location: string }) => {
          // Your API call here
          return { temperature: 72, conditions: 'Sunny' };
        },
      },
    },
  }
);

console.log(response.content); // Claude uses the function result to answer
```

Maintain context across multiple messages:
```typescript
import { LocalMemory } from '@microsoft/teams.ai';

const memory = new LocalMemory();

const model = new AnthropicChatModel({
  model: AnthropicModel.CLAUDE_SONNET_4_6,
});

// First message
await model.send(
  { role: 'user', content: 'My name is Alice' },
  { messages: memory }
);

// Second message - Claude remembers the context
const response = await model.send(
  { role: 'user', content: 'What is my name?' },
  { messages: memory }
);

console.log(response.content); // "Your name is Alice."
```

Choose from the latest Claude models using type-safe enums:
| Enum | Model ID | Description |
|---|---|---|
| `AnthropicModel.CLAUDE_OPUS_4_6` | `claude-opus-4-6` | Most capable, best for complex tasks and agents |
| `AnthropicModel.CLAUDE_SONNET_4_6` | `claude-sonnet-4-6` | Best combination of speed and intelligence |
| `AnthropicModel.CLAUDE_HAIKU_4_5` | `claude-haiku-4-5-20251001` | Fastest, near-frontier intelligence |
| `AnthropicModel.CLAUDE_OPUS_4_5` | `claude-opus-4-5-20251101` | Previous generation Opus |
| `AnthropicModel.CLAUDE_SONNET_4_5` | `claude-sonnet-4-5-20250929` | Previous generation Sonnet |
| `AnthropicModel.CLAUDE_3_HAIKU` | `claude-3-haiku-20240307` | Legacy (deprecated, retiring 2026-04-19) |
See all available models with helper functions:
```typescript
import { AnthropicModel, getAllModels, getModelDisplayName, getModelFamily } from '@youdotcom-oss/teams-anthropic';

const models = getAllModels();
const displayName = getModelDisplayName(AnthropicModel.CLAUDE_SONNET_4_6); // "Claude Sonnet 4.6"
const family = getModelFamily(AnthropicModel.CLAUDE_HAIKU_4_5); // "haiku"
```

Customize the model behavior with configuration options:
`AnthropicChatModelOptions`

```typescript
const model = new AnthropicChatModel({
  // Required: Type-safe model selection
  model: AnthropicModel.CLAUDE_SONNET_4_6,

  // Optional: API key (defaults to ANTHROPIC_API_KEY env var)
  apiKey: 'your-api-key',

  // Optional: Custom base URL for proxies
  baseUrl: 'https://your-proxy.com',

  // Optional: Custom headers
  headers: {
    'X-Custom-Header': 'value',
  },

  // Optional: Request timeout in milliseconds
  timeout: 60_000,

  // Optional: Default request options
  requestOptions: {
    max_tokens: 4096,
    temperature: 0.7,
    top_p: 0.9,
    top_k: 40,
  },

  // Optional: Custom logger
  logger: myLogger,
});
```

Request Options (per message)
```typescript
const response = await model.send(message, {
  // System message
  system: { role: 'system', content: 'You are a helpful assistant' },

  // Memory for conversation context
  messages: memory,

  // Streaming callback
  onChunk: async (delta) => console.log(delta),

  // Function/tool definitions
  functions: {
    function_name: {
      description: 'Function description',
      parameters: { /* JSON schema */ },
      handler: async (args) => { /* implementation */ },
    },
  },

  // Auto-execute functions (default: true)
  autoFunctionCalling: true,

  // Override default request options
  request: {
    max_tokens: 2048,
    temperature: 0.5,
  },
});
```

Problem: You're getting an authentication error when trying to use the model.
Solution: Make sure you've set your Anthropic API key:
```shell
# Option 1: Environment variable
export ANTHROPIC_API_KEY=your-api-key-here
```

```typescript
// Option 2: Pass the key directly in code
const model = new AnthropicChatModel({
  model: AnthropicModel.CLAUDE_SONNET_4_6,
  apiKey: 'your-api-key-here',
});
```

Problem: You're passing a string instead of using the enum.
Solution: Always use the AnthropicModel enum:
```typescript
// ✅ Correct
const model = new AnthropicChatModel({
  model: AnthropicModel.CLAUDE_SONNET_4_6,
});

// ❌ Wrong
const model = new AnthropicChatModel({
  model: 'claude-sonnet-4-6', // Type error!
});
```

Problem: You're not seeing token-by-token responses.
Solution: Make sure you provide the onChunk callback:
```typescript
const response = await model.send(message, {
  onChunk: async (delta) => {
    // This callback is required for streaming
    process.stdout.write(delta);
  },
});
```

Problem: Function calls are returned but not executed.
Solution: Functions auto-execute by default. If you want to control execution manually, set autoFunctionCalling: false:
```typescript
const response = await model.send(message, {
  functions: myFunctions,
  autoFunctionCalling: false, // Disable auto-execution
});

// Now response.function_calls will contain the calls to execute manually
```

API documentation is provided via TypeScript types and TSDoc comments in the source code. See the examples above and TypeScript IntelliSense in your IDE for complete API details.
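With auto-execution disabled, you are responsible for dispatching each returned call to the matching handler yourself. The exact shape of `response.function_calls` is defined by the package's types (check them in your IDE); the sketch below assumes a minimal `{ name, arguments }` shape, and `dispatchFunctionCalls` is a hypothetical helper of our own, not part of the package:

```typescript
// Hypothetical minimal shape for a returned call; verify the real
// fields against the package's TypeScript types before relying on it.
interface FunctionCall {
  name: string;
  arguments: Record<string, unknown>;
}

type Handlers = Record<string, (args: Record<string, unknown>) => Promise<unknown>>;

// Dispatch each returned call to its registered handler and collect
// the results keyed by function name.
async function dispatchFunctionCalls(
  calls: FunctionCall[],
  handlers: Handlers,
): Promise<Record<string, unknown>> {
  const results: Record<string, unknown> = {};
  for (const call of calls) {
    const handler = handlers[call.name];
    if (!handler) {
      throw new Error(`No handler registered for function: ${call.name}`);
    }
    results[call.name] = await handler(call.arguments);
  }
  return results;
}
```

You could then feed the collected results back to the model in a follow-up message, mirroring what `autoFunctionCalling: true` does for you automatically.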
See the templates directory for integration templates:
- mcp-client.ts - Complete MCP client integration with custom user agent headers
Access the template after installation:
```shell
# Template location in your node_modules
node_modules/@youdotcom-oss/teams-anthropic/templates/mcp-client.ts
```

See AGENTS.md for development setup and contribution guidelines.
MIT
- Issues: GitHub Issues
- Email: support@you.com