Introducing ollama-client-micropython: Connecting MicroPython Boards to Ollama

ollama-client-micropython is a new client library that makes it easy to connect microcontroller units (MCUs) running MicroPython to an Ollama server. The library lets your MicroPython board send HTTP(S) requests over TCP/IP to the Ollama server, providing a simple chat interface for sending prompts and receiving replies.

What is Ollama?

Ollama is a platform that lets you run large language models (LLMs) on your own PC. Once set up, it provides a web API for interacting with these models, similar to having your own private 'ChatGPT.' It's free to use as long as your PC is powerful enough to handle it.
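To give a feel for that web API, here is a minimal sketch of a chat request made from an ordinary PC with Python's requests library (this is an illustration of the Ollama HTTP API, not of this library's own interface). It assumes a local Ollama server on the default port 11434 and that a model named llama3 has already been pulled; adjust both to match your setup.

```python
import requests

# Ask a locally running Ollama server for a single, non-streamed reply.
# Assumes Ollama listens on localhost:11434 and the "llama3" model is pulled.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Say hello in one short sentence."}],
    "stream": False,  # return one JSON object instead of a stream of chunks
}

resp = requests.post("http://localhost:11434/api/chat", json=payload, timeout=60)
resp.raise_for_status()

# The non-streamed /api/chat response carries the reply under "message".
print(resp.json()["message"]["content"])
```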

Why is This Important?

Integrating AI with IoT devices opens up many exciting possibilities. For example, you could use sensor data from your smart home to fine-tune a model, creating AI agents that automate tasks and make your home smarter. The potential applications are vast.

Getting Started

Our first goal with ollama-client-micropython is to connect MicroPython boards to the Ollama server: sending prompts and receiving responses, which is enough for basic chat interactions. The large Python libraries for chatbots and AI agents can't run on resource-limited MCUs such as ESP32 or RP2040 boards, but we can use those libraries on a powerful PC to fine-tune models and then talk to the resulting models from MicroPython through the Ollama API, as sketched below.
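As an illustration (not necessarily this library's own API), here is a minimal sketch of what such a request can look like on a MicroPython board. It assumes the board is already connected to Wi-Fi, that an Ollama server is reachable at the hypothetical address 192.168.1.50 on the default port 11434, and that the llama3 model has been pulled on that machine.

```python
import ujson
import urequests  # HTTP client bundled with most MicroPython ports

# Hypothetical address of the PC running the Ollama server; change to yours.
OLLAMA_URL = "http://192.168.1.50:11434/api/generate"

def ask(prompt, model="llama3"):
    # Non-streamed request: the server answers with a single JSON object.
    payload = {"model": model, "prompt": prompt, "stream": False}
    resp = urequests.post(
        OLLAMA_URL,
        data=ujson.dumps(payload),
        headers={"Content-Type": "application/json"},
    )
    try:
        return resp.json()["response"]
    finally:
        resp.close()  # free the socket; important on memory-constrained boards

print(ask("What is MicroPython in one sentence?"))
```

Note that parsing a whole reply at once can strain the RAM of a small board, so keeping prompts and replies short (or reading a streamed response in chunks) is a reasonable precaution.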

Next Steps

By establishing a connection between MicroPython boards and the Ollama server, we can start exploring more advanced applications. This is just the beginning, and we look forward to seeing what innovative solutions can be developed by integrating AI with IoT.

Let's get started and see where this journey takes us!
