Remote Ollama Model #2984
Replies: 3 comments
-
Hi @lohitslohit, I'm not entirely sure, but it seems possible. BTW, for further questions about VS Code usage, you should use Stack Overflow, which is the official channel according to the VS Code documentation. For now, this discussion forum is intended for extension development only. Hope this helps!
-
Hi! Yes, you can connect a remote Ollama instance to VS Code Chat. Here's how to set it up:

Setup Steps

1. Configure Remote Ollama Server

First, ensure your remote Ollama instance is accessible over the network. On your remote machine/server:
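A minimal sketch of this step, assuming a default Ollama install (port 11434), where the server binds only to localhost unless `OLLAMA_HOST` says otherwise:

```shell
# Make Ollama listen on all interfaces instead of localhost only.
# (Assumption: default port 11434; adjust if yours differs.)
export OLLAMA_HOST=0.0.0.0:11434
# Then restart the server so the new binding takes effect, e.g.:
#   ollama serve
```

On systemd-managed installs the same variable would go in the service's environment instead of your shell.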
2. Configure VS Code to Use Remote Ollama

In VS Code settings, or via settings.json:

```json
{
  "chat.models": [
    {
      "provider": "ollama",
      "model": "llama2", // or your preferred model
      "endpoint": "http://YOUR_REMOTE_IP:11434"
    }
  ]
}
```

3. For MCP (Model Context Protocol) Integration

If you're specifically using MCP for tool calling:
```json
{
  "mcp.providers": [
    {
      "name": "remote-ollama",
      "type": "ollama",
      "baseUrl": "http://YOUR_REMOTE_IP:11434",
      "models": ["llama2", "codellama"] // Your available models
    }
  ]
}
```

4. Test the Connection
You can also test the connection directly:

```shell
curl http://YOUR_REMOTE_IP:11434/api/tags
```

This should return a list of available models if the connection is working.

Security Considerations
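One hedged option, assuming you have SSH access to the remote machine: keep Ollama bound to localhost there and forward the port over SSH instead of exposing it to the network. A sketch (the user and host are placeholders):

```shell
# Forward local port 11434 to the remote machine's localhost-only Ollama.
# Run this on your workstation (placeholder address):
#   ssh -N -L 11434:localhost:11434 user@YOUR_REMOTE_IP
# VS Code can then talk to the remote instance as if it were local:
OLLAMA_URL="http://localhost:11434"
echo "$OLLAMA_URL"
```

With the tunnel up, use http://localhost:11434 as the endpoint in the configs above, and nothing needs to listen on a public interface.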
Troubleshooting

If it doesn't work:
Alternative Approach: Using Continue.dev

If the built-in VS Code Chat doesn't support remote Ollama well, consider using the Continue extension:
```json
// ~/.continue/config.json
{
  "models": [
    {
      "title": "Remote Llama2",
      "provider": "ollama",
      "model": "llama2",
      "apiBase": "http://YOUR_REMOTE_IP:11434"
    }
  ]
}
```

Continue has excellent support for remote Ollama instances and MCP-style tool calling.

Additional Resources
-
Yes! You can connect a remote Ollama instance to VS Code for use with MCP-based tool calling.

Setting up Remote Ollama:

For MCP tool calling specifically:

Security note: If exposing Ollama over a network, consider using a reverse proxy with authentication or SSH tunneling rather than exposing it directly.

Hope this helps!
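Whichever setup you choose, the quickest sanity check is the /api/tags endpoint, which returns JSON of the form {"models":[{"name":...},...]}. A dependency-free sketch of pulling out just the model names, shown here against a canned sample response rather than a live server:

```shell
# Sample of the JSON shape /api/tags returns (trimmed to the "name" fields):
SAMPLE='{"models":[{"name":"llama2"},{"name":"codellama"}]}'
# Extract the model names without jq, using grep and cut:
echo "$SAMPLE" | grep -o '"name":"[^"]*"' | cut -d'"' -f4
# → llama2
# → codellama
```

Against a live server, replace the echo with `curl -s http://YOUR_REMOTE_IP:11434/api/tags`.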
-
Is there a way to connect a remote Ollama instance (running on another machine/server) to VS Code Chat so that it can be used for MCP-based tool calling?