Releases: hellotunamayo/macLlama

v1.0.1

12 May 07:41

macLlama v1.0.1 - In-App Server Control & UI Polish! 🎉

This update to macLlama focuses on making your experience with Ollama on macOS even smoother. We've added the ability to start the Ollama server directly from within the app and polished the user interface for a more intuitive experience.

✨ Highlights

  • 🚀 In-App Server Control: Start the Ollama server directly from macLlama – no more terminal juggling just to get started!
  • 🎨 UI Improvements: Enjoy a more polished and intuitive user interface.
  • 📚 Updated README: The README has been revamped with clearer instructions for setup, system requirements, and usage.

🚀 Getting Started

  1. System & Ollama Ready?
    • Ensure your Mac meets the System Requirements (macOS 14+ Sonoma, Apple Silicon).
    • Make sure you have the Ollama CLI installed. If not, install it (e.g., brew install ollama).
  2. Get macLlama:
    • Download macLlama.app.zip from the Assets section below.
    • Unzip it and drag macLlama.app to your /Applications folder.
  3. Start Ollama Server:
    • Use the new in-app button in macLlama.
    • Alternatively, run ollama serve in your terminal.
  4. Pull an Ollama Model:
    • If you're new or want a different model, use ollama pull llama3:8b-instruct (or any model from the Ollama Library).
  5. Launch & Chat! Open macLlama and start interacting with your models.
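For terminal users, the setup steps above can be sketched as the following commands. This assumes Homebrew is installed; the model name llama3:8b-instruct matches the example in step 4, but any model from the Ollama Library works.

```shell
# Install the Ollama CLI via Homebrew (skip if already installed)
brew install ollama

# Start the Ollama server (or use the in-app button in macLlama instead)
ollama serve &

# Pull a model to chat with
ollama pull llama3:8b-instruct
```

These commands are environment-dependent sketches, not a scripted installer — run them interactively and watch for errors at each step.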

🤝 Contributing

We love contributions! Check out our README.md for details on how to contribute, or open an issue with your ideas and suggestions.


Project Link: https://github.com/hellotunamayo/macLlama

Enjoy the update! 😊

1.0.0-alpha

09 May 07:10

Version 1.0.0-alpha

Initial Release