Releases: hellotunamayo/macLlama
v1.0.9(2) - Release
macLlama 1.0.9(2) Release Note
- UI Performance Improvements: This build (1.0.9(2)) includes optimizations to address reported UI performance issues on macOS 26.1 and later.
macLlama 1.0.9(2) Pre-Release Note
- UI Performance Improvements (Experimental): This build (1.0.9(2)) includes optimizations to address reported UI performance issues on macOS 26.1 and later. Please note this is an experimental build and may contain bugs. We strongly encourage you to test thoroughly and provide feedback; your input is crucial as we work toward the final release.
v1.0.8(5) - Release
macLlama 1.0.8(5) Release Notes
This version includes all updates from 1.0.8(4) and features multilingual support.
v1.0.8(4) - macLlama Experimental Release
macLlama Experimental Release Notes - 1.0.8(4)
This is an experimental build for macLlama.
In this version, we've added the "Resuming conversation from history" feature, building on the web search capability introduced in the 1.0.8(3) experimental build.
If you find any bugs or problems, please notify us in Discussions.
Thanks as always!
v1.0.8(3) - macLlama Experimental Release
This release introduces several minor bug fixes and a new experimental feature: Web Search capability.
Key Changes:
- Bug Fixes: Several minor bug fixes have been implemented.
- Experimental Web Search Capability: macLlama now includes an experimental Web Search feature, which allows your model to search the web directly from within the application.
  - Configuration Required: To enable the Web Search capability, you will need to provide a Google Cloud Console Custom Search API key and a CX (Custom Search Engine ID) value. These keys can be configured in the application’s preference settings under the “Web Search” tab. A request sketch follows this list.
- Apple Foundation Models Support (macOS 26.0+): Users running macOS 26.0 or later benefit from additional Apple Foundation Models support when web search capabilities are enabled.
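For reference, the Web Search feature is built around Google's Custom Search JSON API. The sketch below shows what a request using an API key and CX value can look like; the endpoint and query parameters are Google's public API, while the Swift type and method names are illustrative assumptions rather than macLlama's internal code.

```swift
import Foundation

// Minimal sketch of a Google Custom Search JSON API request, assuming an API key and
// CX (Custom Search Engine ID) like the ones configured in Preferences > Web Search.
// The type and method names here are illustrative, not macLlama's internal implementation.
struct WebSearchClient {
    let apiKey: String   // Google Cloud Console Custom Search API key
    let cx: String       // Custom Search Engine ID (the "cx" value)

    func search(_ query: String) async throws -> [String] {
        var components = URLComponents(string: "https://www.googleapis.com/customsearch/v1")!
        components.queryItems = [
            URLQueryItem(name: "key", value: apiKey),
            URLQueryItem(name: "cx", value: cx),
            URLQueryItem(name: "q", value: query)
        ]
        let (data, _) = try await URLSession.shared.data(from: components.url!)
        // The "items" array and "title" field are part of Google's documented response shape.
        let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
        let items = json?["items"] as? [[String: Any]] ?? []
        return items.compactMap { $0["title"] as? String }
    }
}
```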
v1.0.8(2) - Release
What's new in this version
- Chat Settings Redesign: The chat window settings have been redesigned and are now presented in a modal window.
- Bug Fixes: Several minor bug fixes.
Important Notes:
* This is an experimental release and may contain bugs.
* We value your feedback! Please report any issues you encounter.
* This release supersedes all previous experimental builds.
v1.0.8(1) - macLlama Experimental Release
macLlama experimental build: 1.0.8(1)
We're introducing a new model suggestion feature and have refined the GUI design to adopt Apple's Liquid Glass.
Model Suggestion
You can find this feature under the Model Suggestion button in Preferences > Model Management.
Notice
This build contains experimental features and may have bugs or unexpected behavior.
Please report any errors in our discussions.
v1.0.7(3) - Hotfix
macLlama - Release Notes - Version 1.0.7(3) Hotfix
This version resolves a connection problem some users were seeing with the Ollama server.
Thanks for your patience and feedback.
Thanks to
v1.0.7(2) - Release
macLlama - Release Notes - Version 1.0.7(2)
This release contains a minor update addressing a specific data type correction.
Changes:
- Host Port Type Correction: The type of hostPort has been changed from string to integer. This change addresses a suggestion from @doncornelius01 and ensures correct data handling.
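For illustration only, the sketch below shows the kind of change this implies, with the port stored and handled as a number; the surrounding type and property names are assumptions, not macLlama's actual source.

```swift
import Foundation

// Illustrative sketch only: the type and property names are assumptions, not
// macLlama's actual code. The point of the 1.0.7(2) change is that the port is
// held as an integer (previously a string such as "11434").
struct ServerConfiguration: Codable {
    var hostAddress: String = "http://localhost"
    var hostPort: Int = 11434   // Ollama's default port, now numeric

    // Building the server URL from the numeric port avoids string-parsing issues.
    var baseURL: URL? {
        URL(string: "\(hostAddress):\(hostPort)")
    }
}
```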
v1.0.7(1) - Release
macLlama - Release Notes - Version 1.0.7(1)
This release introduces usability enhancements and expanded configuration options for macLlama.
Changes:
- User Interface: The Advanced settings panel has been relocated from the bottom of the interface to a dedicated sidebar for improved accessibility and organization.
- Advanced Settings: Two new parameters have been added to the Advanced settings panel:
  - num_predict: Controls the number of tokens to predict. A value of -1 indicates no limit.
  - Temperature: Influences the randomness and creativity of the generated text. The default value is 0.7.
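For context, these two settings correspond to Ollama's documented generation options. The sketch below shows how they might be carried in a request to Ollama's /api/generate endpoint; the endpoint and option names are Ollama's public API, while the Swift types and the model name are illustrative assumptions rather than macLlama's internal code.

```swift
import Foundation

// Minimal sketch of an Ollama /api/generate request body carrying the two new
// Advanced settings. The option names (num_predict, temperature) are Ollama's
// documented options; the types and model name below are illustrative assumptions.
struct GenerateOptions: Codable {
    var num_predict: Int = -1       // number of tokens to predict; -1 means no limit
    var temperature: Double = 0.7   // higher values increase randomness/creativity
}

struct GenerateRequest: Codable {
    let model: String
    let prompt: String
    let options: GenerateOptions
}

let body = GenerateRequest(model: "llama3", prompt: "Hello!", options: GenerateOptions())
var request = URLRequest(url: URL(string: "http://localhost:11434/api/generate")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = try? JSONEncoder().encode(body)
```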
Notes:
- This release focuses on usability and provides finer control over model generation.
- Refer to the documentation for detailed information on the new parameters and their effects.
v1.0.6(6) - Release
macLlama - Release Notes - Version 1.0.6(6)
- Added local prefix/suffix options in the chat window's advanced settings.
- Minor bug fixes and UI updates.