Error: error creating provider: model "llama3.2" not found in model_list #958

@lynnchin

Description

I need some help. I keep getting this error message and can't figure out what's wrong.

Here is the relevant section of my config.json:

```json
"agents": {
  "defaults": {
    "workspace": "~/.picoclaw/workspace",
    "restrict_to_workspace": true,
    "provider": "ollama",
    "model": "llama3.2",
    "max_tokens": 8192,
    "max_tool_iterations": 50
  }
},
```
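One thing I noticed while writing this up: `ollama list` reports the model as `llama3.2:latest`, while the config uses the bare name `llama3.2`. If picoclaw matches the configured name exactly against its model list (an assumption on my part; I haven't checked the source), using the full tag might be worth trying:

```json
"provider": "ollama",
"model": "llama3.2:latest",
```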

Here is the output of `ollama list` in the terminal:

```
NAME                        ID              SIZE      MODIFIED
tinyllama:latest            2644915ede35    637 MB    38 minutes ago
llama3.2:latest             a80c4f17acd5    2.0 GB    About an hour ago
tomng/nanbeige4.1:latest    dd2a8ce0f368    4.2 GB    10 hours ago
```
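To illustrate how the error could arise (this is not picoclaw's actual lookup code, just a sketch of one plausible cause): Ollama lists models with a tag suffix, so an exact string comparison against the list above fails for the bare name `llama3.2`, while a match that treats a bare name as equivalent to its `:latest` tag succeeds.

```python
# Hypothetical illustration of why a bare model name can fail an
# exact-match lookup against Ollama's tagged model names.
# Names below are taken from the `ollama list` output above.

models = ["tinyllama:latest", "llama3.2:latest", "tomng/nanbeige4.1:latest"]

def exact_match(name, model_list):
    """Strict string comparison, as a literal model_list check might do."""
    return name in model_list

def normalized_match(name, model_list):
    """Treat a bare name as equivalent to its ':latest' tag."""
    candidates = {name, name if ":" in name else f"{name}:latest"}
    return any(m in candidates for m in model_list)

print(exact_match("llama3.2", models))       # False: bare name has no tag
print(normalized_match("llama3.2", models))  # True: matches llama3.2:latest
```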

And here is the output of `./picoclaw status`:

```
Workspace: /home/stanley/.picoclaw/workspace ✓
Model: llama3.2:latest
OpenRouter API: not set
Anthropic API: not set
OpenAI API: ✓
Gemini API: not set
Zhipu API: not set
Qwen API: not set
Groq API: not set
Moonshot API: not set
DeepSeek API: not set
VolcEngine API: not set
Nvidia API: not set
vLLM/Local: not set
Ollama: ✓ http://localhost:11434
```

Could you provide some hints on how to get picoclaw working? Thanks!
