Releases: hellotunamayo/macLlama
v1.0.1
macLlama v1.0.1 - In-App Server Control & UI Polish! 🎉
This update to macLlama focuses on making your experience with Ollama on macOS even smoother. We've added the ability to start the Ollama server directly from within the app and polished the user interface for a more intuitive experience.
✨ Highlights
- 🚀 In-App Server Control: Start the Ollama server directly from macLlama – no more terminal juggling just to get started!
- 🎨 UI Improvements: Enjoy a more polished and intuitive user interface.
- 📚 Updated README: The README has been revamped with clearer instructions for setup, system requirements, and usage.
🚀 Getting Started
- System & Ollama Ready?
  - Ensure your Mac meets the System Requirements (macOS 14+ Sonoma, Apple Silicon).
  - Make sure you have the Ollama CLI installed. If not, install it (e.g., `brew install ollama`).
- Get macLlama:
  - Download `macLlama.app.zip` from the Assets section below.
  - Unzip it and drag `macLlama.app` to your `/Applications` folder.
- Start Ollama Server:
  - Use the new in-app button in macLlama.
  - Alternatively, run `ollama serve` in your terminal.
- Pull an Ollama Model:
  - If you're new or want a different model, use `ollama pull llama3:8b-instruct` (or any model from the Ollama Library).
- Launch & Chat! Open macLlama and start interacting with your models.
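If you prefer the terminal, the setup steps above boil down to a few commands. This is a minimal sketch; `llama3:8b-instruct` is just an example model name, and you can substitute any model from the Ollama Library:

```shell
# Install the Ollama CLI via Homebrew (skip if already installed)
brew install ollama

# Start the Ollama server in the background
ollama serve &

# Pull an example model (replace with any model from the Ollama Library)
ollama pull llama3:8b-instruct
```

Once the server is running and a model is pulled, macLlama will be able to connect and chat with it.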
🤝 Contributing
We love contributions! Check out our README.md for details on how to contribute, or open an issue with your ideas and suggestions.
Project Link: https://github.com/hellotunamayo/macLlama
Enjoy the update! 😊
1.0.0-alpha
Version 1.0.0-alpha
Initial Release