
feat: add /add-ollama skill for local model inference#712

Merged
gavrielc merged 2 commits into qwibitai:main from daniviber:feat/ollama-mcp-server
Mar 4, 2026

Conversation

@daniviber
Contributor

@daniviber daniviber commented Mar 4, 2026

Summary

Adds an /add-ollama skill that integrates local Ollama models as MCP tools for the container agent. Claude stays as the orchestrator but can offload cheaper/faster tasks (summarization, translation, general queries) to local models.

Skill contents

  • ollama-mcp-stdio.ts — stdio MCP server with two tools (a sketch follows this list):
    • ollama_list_models: lists the installed Ollama models
    • ollama_generate: sends a prompt to a specified model and returns the response with timing metadata
  • ollama-watch.sh — macOS notification watcher for Ollama activity
  • Modifications to index.ts (MCP server config + allowedTools; an illustrative shape appears after the Usage section) and container-runner.ts ([OLLAMA] log surfacing at info level)
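
For orientation, here is a minimal sketch of what a stdio server with these two tools can look like. It assumes the @modelcontextprotocol/sdk TypeScript package, zod, and Ollama's default HTTP API on localhost:11434; everything beyond the two tool names is illustrative, not the skill's actual code.

```ts
// Illustrative sketch of an Ollama stdio MCP server (not the skill's actual source).
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const OLLAMA_URL = process.env.OLLAMA_URL ?? "http://localhost:11434"; // assumed default port

const server = new McpServer({ name: "ollama", version: "0.1.0" });

// Tool 1: list locally installed models via Ollama's /api/tags endpoint.
server.tool("ollama_list_models", async () => {
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  const { models } = (await res.json()) as { models: { name: string }[] };
  return { content: [{ type: "text", text: models.map((m) => m.name).join("\n") }] };
});

// Tool 2: send a prompt to a named model and return the response plus timing metadata.
server.tool(
  "ollama_generate",
  { model: z.string(), prompt: z.string() },
  async ({ model, prompt }) => {
    const started = Date.now();
    const res = await fetch(`${OLLAMA_URL}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt, stream: false }),
    });
    const data = (await res.json()) as { response: string; eval_count?: number };
    const text = `${data.response}\n\n[${model} | ${Date.now() - started} ms | ${data.eval_count ?? "?"} tokens]`;
    return { content: [{ type: "text", text }] };
  },
);

// stdio transport: the host process spawns this script and speaks JSON-RPC over stdin/stdout.
await server.connect(new StdioServerTransport());
```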

Usage

/add-ollama

Then ask the agent: "use ollama to summarize: ..."
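
The index.ts change amounts to registering the stdio server and allow-listing its tools for the agent. The snippet below is a hypothetical illustration of that shape (the mcp__<server>__<tool> naming follows the convention Claude Code-style agents use for MCP tools); NanoClaw's actual config fields may differ.

```ts
// Hypothetical shape of the index.ts wiring; field names mirror common
// Claude agent SDK options, not necessarily NanoClaw's actual config.
export const mcpServers = {
  ollama: {
    command: "npx",
    args: ["tsx", "scripts/ollama-mcp-stdio.ts"], // path is illustrative
  },
};

export const allowedTools = [
  // MCP tools are typically exposed to the agent as mcp__<server>__<tool>.
  "mcp__ollama__ollama_list_models",
  "mcp__ollama__ollama_generate",
];
```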

Test plan

  • Run /add-ollama on a fresh NanoClaw install
  • Verify ollama_list_models returns the installed models (a quick connectivity check is sketched after this list)
  • Verify ollama_generate produces responses
  • Verify scripts/ollama-watch.sh shows macOS notifications
  • Verify the skill can be cleanly applied via npx tsx scripts/apply-skill.ts
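
Before walking through these checks, it can save time to confirm the local Ollama daemon is reachable at all. A throwaway check, assuming the default port and Node 18+ (file name is illustrative):

```ts
// check-ollama.ts: quick precondition check. Is Ollama up, and are any models installed?
// Run with: npx tsx check-ollama.ts
const res = await fetch("http://localhost:11434/api/tags");
if (!res.ok) throw new Error(`Ollama not reachable: HTTP ${res.status}`);
const { models } = (await res.json()) as { models: { name: string }[] };
console.log(
  models.length
    ? models.map((m) => m.name).join("\n")
    : "No models installed; run `ollama pull <model>` first",
);
```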

🤖 Generated with Claude Code

Adds a skill that integrates Ollama as an MCP server, allowing the
container agent to offload tasks to local models (summarization,
translation, general queries) while keeping Claude as orchestrator.

Skill contents:
- ollama-mcp-stdio.ts: stdio MCP server with ollama_list_models and
  ollama_generate tools
- ollama-watch.sh: macOS notification watcher for Ollama activity
- Modifications to index.ts (MCP config) and container-runner.ts
  (log surfacing)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@daniviber daniviber force-pushed the feat/ollama-mcp-server branch from 28c9ec6 to 7098366 on March 4, 2026 at 18:49
@daniviber daniviber changed the title from "feat: add Ollama MCP server for local model inference" to "feat: add /add-ollama skill for local model inference" on Mar 4, 2026
chore: rename skill from /add-ollama to /add-ollama-tool

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@gavrielc
Collaborator

gavrielc commented Mar 4, 2026

Thanks @daniviber for contributing a great skill! 🙌

Renamed from /add-ollama to /add-ollama-tool to clarify that this adds Ollama as an MCP tool (for the agent to delegate tasks to), not as the agent model itself.

@daniviber
Contributor Author

daniviber commented Mar 5, 2026 via email

zhnq pushed a commit to zhnq/nanoclaw that referenced this pull request Mar 6, 2026
jenskock pushed a commit to jenskock/nanoclaw that referenced this pull request Mar 6, 2026
terrylica pushed a commit to terrylica/nanoclaw that referenced this pull request Mar 8, 2026
ortalis97 pushed a commit to ortalis97/alfred that referenced this pull request Mar 8, 2026
idgmatrix pushed a commit to Gurufin-AI/nanoclaw that referenced this pull request Mar 9, 2026
squarewings pushed a commit to squarewings/nanoclaw that referenced this pull request Mar 15, 2026
bogdano2 pushed a commit to bogdano2/nanoclaw that referenced this pull request Mar 17, 2026
onlyforart pushed a commit to onlyforart/nanoclaw that referenced this pull request Mar 27, 2026
XiRoSe pushed a commit to XiRoSe/nova-agent that referenced this pull request Apr 9, 2026
dm-j pushed a commit to dm-j/nanoclaw that referenced this pull request Apr 13, 2026