feat: add /add-ollama skill for local model inference#712
Merged
gavrielc merged 2 commits into qwibitai:main on Mar 4, 2026
Conversation
Adds a skill that integrates Ollama as an MCP server, allowing the container agent to offload tasks to local models (summarization, translation, general queries) while keeping Claude as orchestrator.

Skill contents:
- ollama-mcp-stdio.ts: stdio MCP server with ollama_list_models and ollama_generate tools
- ollama-watch.sh: macOS notification watcher for Ollama activity
- Modifications to index.ts (MCP config) and container-runner.ts (log surfacing)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
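For context, here is a minimal sketch of the plumbing an ollama_generate tool might use against Ollama's HTTP API (POST /api/generate on the default localhost:11434 daemon). The helper names buildGenerateRequest and summarizeTiming are illustrative, not the skill's actual exports; the response fields follow Ollama's documented API, where durations are reported in nanoseconds.

```typescript
// Hypothetical helpers for an ollama_generate-style tool. Names are
// illustrative; field names match Ollama's /api/generate response.

interface GenerateResponse {
  model: string;
  response: string;
  total_duration: number; // nanoseconds, whole request
  eval_count: number;     // tokens generated
  eval_duration: number;  // nanoseconds spent generating
}

// Build the JSON body for POST /api/generate. stream: false makes
// Ollama return a single JSON object instead of NDJSON chunks.
function buildGenerateRequest(model: string, prompt: string): string {
  return JSON.stringify({ model, prompt, stream: false });
}

// Condense Ollama's nanosecond timing fields into one readable line,
// the kind of "timing metadata" the tool surfaces alongside the text.
function summarizeTiming(r: GenerateResponse): string {
  const totalSec = r.total_duration / 1e9;
  const tokPerSec = r.eval_count / (r.eval_duration / 1e9);
  return `${r.model}: ${r.eval_count} tokens in ${totalSec.toFixed(2)}s (${tokPerSec.toFixed(1)} tok/s)`;
}

// At runtime the tool would presumably do something like:
//   const res = await fetch("http://localhost:11434/api/generate", {
//     method: "POST",
//     body: buildGenerateRequest("llama3", prompt),
//   });
//   const data: GenerateResponse = await res.json();
//   return `${data.response}\n\n${summarizeTiming(data)}`;
```

With stream: false the whole response arrives as one JSON object, which keeps the MCP tool implementation a single request/response round trip.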
Force-pushed 28c9ec6 to 7098366
Collaborator
Thanks @daniviber for contributing a great skill! 🙌 Renamed from /add-ollama to /add-ollama-tool.
This was referenced Mar 5, 2026
Contributor (Author)
Thanks for accepting my PR! Really like the project approach!
This was referenced Mar 6, 2026
zhnq pushed a commit to zhnq/nanoclaw that referenced this pull request on Mar 6, 2026:

* feat: add /add-ollama skill for local model inference

  Adds a skill that integrates Ollama as an MCP server, allowing the container agent to offload tasks to local models (summarization, translation, general queries) while keeping Claude as orchestrator.

  Skill contents:
  - ollama-mcp-stdio.ts: stdio MCP server with ollama_list_models and ollama_generate tools
  - ollama-watch.sh: macOS notification watcher for Ollama activity
  - Modifications to index.ts (MCP config) and container-runner.ts (log surfacing)

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* chore: rename skill from /add-ollama to /add-ollama-tool

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: gavrielc <gabicohen22@yahoo.com>
jenskock pushed a commit to jenskock/nanoclaw that referenced this pull request on Mar 6, 2026.
terrylica pushed a commit to terrylica/nanoclaw that referenced this pull request on Mar 8, 2026:

# [1.3.0](v1.2.0...v1.3.0) (2026-03-08)

### Bug Fixes

* add-voice-transcription skill drops WhatsApp registerChannel call ([qwibitai#766](https://github.com/terrylica/nanoclaw/issues/766)) ([47ad2e6](47ad2e6))
* aggressive false positive prevention — 5-layer MiniMax pipeline, devil's advocate round, FP learning ([8bfa372](8bfa372))
* atomic claim prevents scheduled tasks from executing twice ([qwibitai#657](https://github.com/terrylica/nanoclaw/issues/657)) ([f794185](f794185)), closes [qwibitai#138](https://github.com/terrylica/nanoclaw/issues/138) [qwibitai#211](https://github.com/terrylica/nanoclaw/issues/211) [qwibitai#300](https://github.com/terrylica/nanoclaw/issues/300) [qwibitai#578](https://github.com/terrylica/nanoclaw/issues/578) [qwibitai#601](https://github.com/terrylica/nanoclaw/issues/601)
* cc-skills now reads label strategy + content types; Claude JSON parsing hardened ([fd7fc7f](fd7fc7f))
* correct misleading send_message tool description for scheduled tasks ([qwibitai#729](https://github.com/terrylica/nanoclaw/issues/729)) ([ec0e42b](ec0e42b))
* **db:** add LIMIT to unbounded message history queries ([qwibitai#692](https://github.com/terrylica/nanoclaw/issues/692)) ([qwibitai#735](https://github.com/terrylica/nanoclaw/issues/735)) ([74b02c8](74b02c8))
* format src/index.ts to pass CI prettier check ([qwibitai#711](https://github.com/terrylica/nanoclaw/issues/711)) ([df2bac6](df2bac6)), closes [qwibitai#710](https://github.com/terrylica/nanoclaw/issues/710)
* grant write permissions to CLAUDE.md maintenance claude -p call ([9ddb433](9ddb433))
* rename _chatJid to chatJid in onMessage callback ([1436186](1436186))
* use 'state' instead of 'stateReason' for gh compatibility on bigblack ([a4f2e92](a4f2e92))
* **whatsapp:** add error handling to messages.upsert handler ([qwibitai#695](https://github.com/terrylica/nanoclaw/issues/695)) ([5e3d8b6](5e3d8b6))
* **whatsapp:** write pairing code to file for immediate access ([qwibitai#745](https://github.com/terrylica/nanoclaw/issues/745)) ([be19911](be19911))

### Features

* add /add-ollama skill for local model inference ([qwibitai#712](https://github.com/terrylica/nanoclaw/issues/712)) ([298c3ea](298c3ea))
* add ast-grep rules for Python static analysis ([a548761](a548761))
* add mise deploy task for bigblack deployment ([c39a1f4](c39a1f4))
* add NDJSON telemetry logging for all Telegram messages ([7f64ea6](7f64ea6))
* add update_task tool and return task ID from schedule_task ([68123fd](68123fd))
* cc-skills integration — enhanced issue creation with taxonomy-aware labels, type-specific templates, and discovery provenance ([602e65d](602e65d))
* CLAUDE.md maintenance creates GitHub issues with full link to Telegram ([ba34620](ba34620))
* CLAUDE.md maintenance, devil's advocate fix, OpenGrep + proactive scanning ([ce66e88](ce66e88))
* confidence scoring, verification scripts, log rotation — 3 more FP prevention layers ([0ff2c3c](0ff2c3c))
* iterative MiniMax self-validation (3 adversarial rounds) ([fc05aff](fc05aff))
* Phase 0 — enable Telegram channel and Docker Compose deployment ([ebbf59c](ebbf59c))
* Phase 2 — MiniMax orchestrator loop for continuous validation ([17e90a3](17e90a3))
* proactive algo correctness scanning with full Telegram + GitHub issue reporting ([4b68c3e](4b68c3e))
* **skills:** add image vision skill for WhatsApp ([qwibitai#770](https://github.com/terrylica/nanoclaw/issues/770)) ([af937d6](af937d6))
* **skills:** add pdf-reader skill ([qwibitai#772](https://github.com/terrylica/nanoclaw/issues/772)) ([0b260ec](0b260ec))
* **skills:** add use-local-whisper skill package ([qwibitai#702](https://github.com/terrylica/nanoclaw/issues/702)) ([03f792b](03f792b))
* timezone-aware context injection for agent prompts ([qwibitai#691](https://github.com/terrylica/nanoclaw/issues/691)) ([632713b](632713b)), closes [qwibitai#483](https://github.com/terrylica/nanoclaw/issues/483) [qwibitai#526](https://github.com/terrylica/nanoclaw/issues/526)
* whole-repo scanning instead of 3-file batches ([1ace951](1ace951))
* wire trace UUIDs into all Telegram notifications ([b48f0e9](b48f0e9))
ortalis97 pushed a commit to ortalis97/alfred that referenced this pull request on Mar 8, 2026.
idgmatrix pushed a commit to Gurufin-AI/nanoclaw that referenced this pull request on Mar 9, 2026.
squarewings pushed a commit to squarewings/nanoclaw that referenced this pull request on Mar 15, 2026.
bogdano2 pushed a commit to bogdano2/nanoclaw that referenced this pull request on Mar 17, 2026.
onlyforart pushed a commit to onlyforart/nanoclaw that referenced this pull request on Mar 27, 2026.
XiRoSe pushed a commit to XiRoSe/nova-agent that referenced this pull request on Apr 9, 2026.
dm-j pushed a commit to dm-j/nanoclaw that referenced this pull request on Apr 13, 2026.
Summary
Adds an /add-ollama skill that integrates local Ollama models as MCP tools for the container agent. Claude stays as orchestrator but can offload cheaper/faster tasks (summarization, translation, general queries) to local models.

Skill contents

- ollama-mcp-stdio.ts — stdio MCP server with two tools:
  - ollama_list_models: lists installed Ollama models
  - ollama_generate: sends a prompt to a specified model, returns the response with timing metadata
- ollama-watch.sh — macOS notification watcher for Ollama activity
- Modifications to index.ts (MCP server config + allowedTools) and container-runner.ts ([OLLAMA] log surfacing at info level)

Usage
Then ask the agent: "use ollama to summarize: ..."
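For orientation, a hypothetical sketch of the kind of wiring the skill adds to index.ts. The "ollama" key, the tsx invocation, and the script path are assumptions based on the PR description; the { command, args } shape is the standard MCP stdio-server config, and the mcp__<server>__<tool> naming is the usual Claude Code convention for allowing MCP tools.

```typescript
// Hypothetical sketch (not the skill's exact code): register the stdio
// MCP server so the agent can spawn it, and allow its two tools.
const mcpServers = {
  ollama: {
    command: "npx",
    args: ["tsx", "scripts/ollama-mcp-stdio.ts"],
  },
};

// Tool names follow the mcp__<server>__<tool> convention:
const allowedTools = [
  "mcp__ollama__ollama_list_models",
  "mcp__ollama__ollama_generate",
];
```

Once registered, prompts like the one above let Claude decide when to route a subtask to a local model instead of handling it itself.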
Test plan
- /add-ollama on a fresh NanoClaw install
- ollama_list_models returns installed models
- ollama_generate produces responses
- scripts/ollama-watch.sh shows macOS notifications
- npx tsx scripts/apply-skill.ts

🤖 Generated with Claude Code
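As a companion to the test plan, here is a small sketch of how ollama_list_models presumably extracts model names from Ollama's GET /api/tags response. The response shape ({ models: [{ name, ... }] }) follows Ollama's documented API; listModelNames is an illustrative helper, not the skill's actual export.

```typescript
// Hypothetical helper: turn Ollama's /api/tags response into a sorted
// list of model names, e.g. for an ollama_list_models tool result.

interface OllamaTag {
  name: string;          // e.g. "llama3:latest"
  size?: number;         // bytes on disk
  modified_at?: string;  // ISO timestamp of last modification
}

function listModelNames(tags: { models: OllamaTag[] }): string[] {
  return tags.models.map((m) => m.name).sort();
}

// At runtime the tool would fetch from the local daemon:
//   const tags = await (await fetch("http://localhost:11434/api/tags")).json();
//   return listModelNames(tags);
```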