Replies: 1 comment
Hey @WSINTRA! Cool to see you and your son exploring local LLMs with MaverickMCP!

**Quick Model Recommendations**

For Ollama, these work well with MaverickMCP's tools:

**Prompting Tips That Actually Work**

Direct is better with local models. Skip the conversational style:

I just added 12 test examples to the README today - they're battle-tested and work great with local models. Start with these:

Easy mode:

Challenge mode:

**The Reality Check: Open vs Commercial**

Local models get you ~75% of the way there vs commercial models' 95%+. But honestly? For most stock analysis tasks, that 75% is plenty. Local models are faster (no network lag), private, and free to run.

Main difference: commercial models handle ambiguity better. Local models need clearer instructions but absolutely get the job done.

**Pro Tips for Your Setup**

Your son's learning with real tools - that's awesome! Let me know which models work best for you both. Always interested in real-world feedback from the community.

Happy hacking!
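To make the "direct is better" point concrete, here's a minimal sketch of how a tool-calling model sees your tools, and why `stock_price(AAPL)` is easier for a small local model than a conversational ask. The `stock_price` tool name and its schema here are hypothetical placeholders, not MaverickMCP's actual tool list, and the request shape is the OpenAI-style function schema that Ollama's chat API accepts:

```python
# Hypothetical tool schema in the OpenAI-style format Ollama's
# tool calling uses. Not MaverickMCP's real schema - a placeholder.
stock_price_tool = {
    "type": "function",
    "function": {
        "name": "stock_price",
        "description": "Get the latest price for a stock ticker",
        "parameters": {
            "type": "object",
            "properties": {
                "ticker": {
                    "type": "string",
                    "description": "Ticker symbol, e.g. AAPL",
                },
            },
            "required": ["ticker"],
        },
    },
}

# A direct prompt names the tool and argument explicitly, so the
# model only has to echo them back as a tool call:
direct_prompt = "stock_price(AAPL)"

# A conversational prompt makes the model infer both the tool and
# the argument ("apple" -> AAPL), which is where small models slip:
conversational_prompt = "what's apple trading at today?"


def build_request(model: str, prompt: str) -> dict:
    """Assemble a chat request body with the tool attached."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "tools": [stock_price_tool],
    }
```

With a body like `build_request("qwen3:30b", direct_prompt)`, a capable model should respond with a tool call rather than prose; the conversational version works too, just less reliably on smaller models.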
I'm using Ollama with Open WebUI and got the MCP server up and running with mcpo, but tool calling seems hit and miss. I've been trying gpt-oss and qwen3-30 with mixed results.
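For reference, here's roughly how my setup talks to the server: mcpo exposes each MCP tool as its own REST endpoint, so a tool call is just a POST with a JSON body. This is only a sketch of my understanding - the `stock_price` tool name, the `ticker` argument, and the port are placeholders, not MaverickMCP's actual interface:

```python
import json
import urllib.request


def build_tool_call(base_url: str, tool: str, args: dict) -> tuple[str, bytes]:
    """Build the URL and JSON body for a POST to an mcpo-proxied tool."""
    return f"{base_url.rstrip('/')}/{tool}", json.dumps(args).encode()


def call_tool(base_url: str, tool: str, args: dict, timeout: float = 30.0) -> dict:
    """POST the tool call and decode the JSON response (needs a running mcpo)."""
    url, body = build_tool_call(base_url, tool, args)
    req = urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())


# Placeholder example, assuming mcpo is serving on localhost:8000:
# call_tool("http://localhost:8000", "stock_price", {"ticker": "AAPL"})
```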
I'm also not really up to speed on these kinds of tools; I'm mostly playing around so my son can have a go. I shared the creator's blog so he can learn about the technical goings-on. But my question: what open models are good at calling Maverick, and what kinds of prompts are good for triggering the tools - direct calls like stock_price(ticker), or more conversational stuff like "stock price for apple today"?
How big of a difference does it make using open-source models vs OpenAI/Anthropic SOTA models?