BeautifyOllama

BeautifyOllama Logo

A modern, beautiful web interface for Ollama AI models

Transform your local AI interactions with an elegant, feature-rich chat interface



📖 About

BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.

🎉 Latest Updates (v1.6.9)

  • 🔧 Custom Port Support - Configure Ollama to run on any port
  • ⚙️ Comprehensive Settings Panel - Complete Ollama service management
  • 🔍 Enhanced Web Search - Improved search with source tracking and reliability
  • 📱 Cross-Platform Desktop App - Native desktop application with Tauri
  • 🧠 Advanced AI Features - Thinking mode, verbose stats, and conversation management
  • 🛠️ Developer Experience - Improved error handling and Windows compatibility

⚠️ Early Development Notice
This project is in active development. Features and APIs may change. We welcome contributions and feedback from the community.

🎥 Demo

Demo video: beautifyollamavideocompressed.mp4

✨ Features

Current Features

  • 🔍 Intelligent Web Search - Real-time internet search with SearxNG integration and source tracking
  • 🧠 Thinking Mode Control - Toggle AI reasoning traces on/off with clean rendering
  • 🌐 Multi-Engine Fallback - Multiple SearxNG instances for reliability and uptime
  • 🔧 Custom Port Support - Configure Ollama to run on any port (not just 11434)
  • ⚙️ Comprehensive Settings - Complete Ollama management, model downloads, and configuration
  • 🎬 Animated Shine Borders - Eye-catching animated message borders with color cycling
  • 📱 Responsive Design - Mobile-first approach with seamless cross-device compatibility
  • 🌙 Theme System - Dark/light mode with system preference detection
  • ⚡ Real-time Streaming - Live response streaming from Ollama models with typing effects
  • 🎯 Clean Interface - Simplified message rendering focused on readability
  • 🔄 Advanced Model Management - Download, delete, and switch between Ollama models
  • 📊 Verbose Statistics - Toggle detailed timing and performance stats for responses
  • 💬 Conversation Management - Persistent chat history with sidebar navigation
  • 🖥️ Cross-Platform Support - Windows, macOS, and Linux compatibility with platform-specific optimizations
  • ⌨️ Smart Input - Keyboard shortcuts (Enter to send, Shift+Enter for newlines)
  • 🎨 Modern UI/UX - Glassmorphism effects, smooth micro-animations, and polished design
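
The real-time streaming feature above relies on Ollama's API, which streams responses as newline-delimited JSON chunks. A minimal sketch of how a client can accumulate those chunks into the displayed message (the function name is illustrative, not the app's actual `ollamaService.ts` API; the chunk shape follows Ollama's `/api/chat` response format):

```typescript
// Each streamed line from Ollama's /api/chat is a JSON object like:
//   {"message":{"role":"assistant","content":"Hel"},"done":false}
// Concatenating the "content" fields yields the full reply.
interface ChatChunk {
  message?: { role: string; content: string };
  done: boolean;
}

function accumulateChunks(ndjson: string): string {
  let text = "";
  for (const line of ndjson.split("\n")) {
    if (!line.trim()) continue; // skip blank lines between chunks
    const chunk = JSON.parse(line) as ChatChunk;
    if (chunk.message) text += chunk.message.content;
    if (chunk.done) break; // the final chunk carries timing stats, no content
  }
  return text;
}
```

In the app, the same accumulation happens incrementally as chunks arrive, which is what produces the typing effect.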

🚧 Upcoming Features

  • 📁 File Upload Support - Document and image processing capabilities
  • 🌐 Multi-language Support - Internationalization for global users
  • 📊 Advanced Usage Analytics - Enhanced token usage tracking and conversation insights
  • 🔌 Plugin System - Extensible architecture for third-party integrations
  • ☁️ Cloud Sync - Optional cloud backup for conversations and settings
  • 🔐 Multi-API Support - Integration with OpenAI, Anthropic, and other AI providers
  • 🎯 Advanced Prompt Templates - Pre-built and custom prompt management
  • 🔒 Enhanced Security - API key encryption and secure credential storage

🚀 Installation

Prerequisites

Ensure you have the following installed on your system:

  • Node.js (v18 or higher)
  • npm, yarn, or pnpm
  • Ollama (for local AI model serving)

Step 1: Install Ollama

# macOS
brew install ollama

# Linux
curl -fsSL https://ollama.com/install.sh | sh

# Windows
# Download the installer from https://ollama.ai/download

Step 2: Setup Ollama Models

# Start Ollama service
ollama serve

# Pull recommended models
ollama pull llama2
ollama pull codellama
ollama pull mistral

# For web search feature, also pull a small model:
ollama pull qwen2:0.5b

# Verify installation
ollama list

Step 3: Setup Web Search (Optional)

For enhanced web search capabilities, BeautifyOllama includes integrated Python-based web search:

# Install Python dependencies for web search
pip install ollama requests

# The web search feature uses multiple SearxNG instances
# No additional setup required - it's built-in!

For detailed web search setup and configuration, see Web Search Integration Guide.
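
The multi-instance fallback described above can be sketched as trying each configured SearxNG instance in order until one responds (the instance URLs and fetcher signature here are illustrative assumptions, not the app's actual configuration):

```typescript
// Hypothetical list of SearxNG instances; the real app ships its own set.
const SEARX_INSTANCES = [
  "https://searx.example.org",
  "https://search.example.net",
];

type Fetcher = (url: string) => Promise<string[]>; // returns result snippets

// Try each instance in order; fall through to the next on any failure,
// so a single unreachable instance does not break web search.
async function searchWithFallback(
  query: string,
  fetcher: Fetcher,
  instances: string[] = SEARX_INSTANCES
): Promise<string[]> {
  for (const base of instances) {
    try {
      return await fetcher(`${base}/search?q=${encodeURIComponent(query)}&format=json`);
    } catch {
      continue; // instance down or rate-limited: try the next one
    }
  }
  throw new Error("all SearxNG instances failed");
}
```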

Step 4: Install BeautifyOllama

Download and install the latest release from the project's GitHub Releases page.

Step 5: Access the Application

Open your app launcher and search for BeautifyOllama.

📚 Usage

Basic Chat Interface

  1. Start a Conversation: Type your message in the input field
  2. Send Messages: Press Enter or click the send button
  3. New Lines: Use Shift + Enter for multi-line messages
  4. Switch Models: Use the model selector in the header
  5. Theme Toggle: Click the theme button to switch between light/dark modes
  6. Enable Features: Use the toggle buttons below the input for:
    • Stats Mode: View detailed response timing and performance
    • Thinking Mode: See AI reasoning process (when supported)
    • Web Search: Include real-time internet search in responses
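
The Enter/Shift+Enter behavior above comes down to a single decision in the input's keydown handler. A minimal sketch (the function name is illustrative):

```typescript
// Decide what a keydown in the chat input should do:
// Enter alone sends the message, Shift+Enter inserts a newline,
// and anything else is ordinary typing.
function keyAction(key: string, shiftKey: boolean): "send" | "newline" | "type" {
  if (key !== "Enter") return "type";
  return shiftKey ? "newline" : "send";
}
```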

Advanced Features

  • Settings Panel: Click the gear icon to access:
    • Connection Settings: Configure custom Ollama ports
    • Model Management: Download new models or delete existing ones
    • Service Control: Start/stop Ollama service
    • Command Logs: View detailed operation logs
  • Conversation Management: Navigate between chats using the sidebar
  • Response Features: View sources for web search results and detailed statistics
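
A connection setting like the custom port reduces to building the Ollama base URL once and reusing it for every request. A sketch assuming a simple host/port setting (function names are illustrative; 11434 is Ollama's default port):

```typescript
// Build the base URL for Ollama's HTTP API from user settings.
// 11434 is Ollama's default; the settings panel lets users override it.
function ollamaBaseUrl(port: number = 11434, host: string = "localhost"): string {
  if (!Number.isInteger(port) || port < 1 || port > 65535) {
    throw new Error(`invalid port: ${port}`);
  }
  return `http://${host}:${port}`;
}

// The same endpoint used in Troubleshooting to verify connectivity.
const tagsEndpoint = (port?: number) => `${ollamaBaseUrl(port)}/api/tags`;
```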

Mobile Usage

  • Access Sidebar: Tap the menu button on mobile devices
  • Touch Gestures: Swipe to navigate
  • Responsive Layout: Optimized for all screen sizes

🏗️ Architecture

Technology Stack

| Layer | Technology | Purpose |
| --- | --- | --- |
| Frontend | Next.js 15 + React 19 | Modern React framework with App Router |
| Backend | Tauri + Rust | Native desktop integration and system calls |
| Styling | TailwindCSS 4 | Utility-first CSS framework |
| Animation | Framer Motion | Smooth animations and transitions |
| Language | TypeScript + Rust | Type safety and high-performance backend |
| State Management | React Hooks | Local state management |
| Theme | next-themes | Dark/light mode functionality |
| Search | Python + SearxNG | Integrated web search capabilities |

Project Structure

beautifyollama/
├── src/
│   ├── app/                    # Next.js App Router
│   │   ├── globals.css        # Global styles
│   │   ├── layout.tsx         # Root layout
│   │   ├── page.tsx           # Home page
│   │   └── services/          # Service layer
│   │       └── ollamaService.ts
│   ├── components/            # React components
│   │   ├── Chat.tsx          # Main chat interface
│   │   ├── Settings.tsx      # Settings modal
│   │   ├── MarkdownRenderer.tsx
│   │   ├── ThinkingRenderer.tsx
│   │   └── ui/               # Reusable UI components
│   ├── config/               # Configuration files
│   ├── hooks/                # Custom React hooks
│   ├── lib/                  # Utility functions
│   └── types/                # TypeScript type definitions
├── src-tauri/                # Tauri Rust backend
│   ├── src/
│   │   ├── main.rs          # Main Tauri entry
│   │   └── lib.rs           # Core backend logic
│   └── Cargo.toml          # Rust dependencies
├── tools/                    # External tools
│   ├── web-search/          # Python web search integration
│   └── README.md           # Tools documentation
├── public/                   # Static assets
├── docs/                     # Documentation
└── tests/                    # Test files

🤝 Contributing

We welcome contributions from the community! BeautifyOllama is an early-stage project with lots of opportunities to make an impact.

Ways to Contribute

  1. 🐛 Bug Reports - Help us identify and fix issues
  2. 💡 Feature Requests - Suggest new functionality
  3. 📝 Code Contributions - Submit pull requests
  4. 📚 Documentation - Improve README, guides, and code comments
  5. 🎨 Design - UI/UX improvements and suggestions
  6. 🧪 Testing - Help test new features and edge cases

Development Setup

  1. Fork the repository on GitHub
  2. Clone your fork locally:
    git clone https://github.com/your-username/BeautifyOllama.git
    cd BeautifyOllama
  3. Create a branch for your feature:
    git checkout -b feature/your-feature-name
  4. Install dependencies:
    npm install
  5. Start development server:
    npm run dev

Contribution Guidelines

  • Code Style: Follow the existing code style and use TypeScript
  • Commits: Use conventional commit messages (feat:, fix:, docs:, etc.)
  • Testing: Add tests for new features when applicable
  • Documentation: Update README and inline comments for new features
  • Pull Requests: Provide clear descriptions and link related issues

Development Scripts

npm run dev          # Start development server
npm run build        # Build for production  
npm run start        # Start production server
npm run lint         # Run ESLint
npm run type-check   # Run TypeScript compiler
npm run tauri dev    # Start Tauri development mode
npm run tauri build  # Build Tauri desktop application
npm test             # Run tests (when available)

🛣️ Roadmap

Phase 1: Core Features (Current)

  • Basic chat interface with real-time streaming
  • Ollama integration with custom port support
  • Theme system (dark/light mode)
  • Responsive design for all devices
  • Web search integration with SearxNG
  • Comprehensive settings and model management
  • Conversation history and management
  • Advanced thinking and verbose modes
  • Cross-platform support (Windows, macOS, Linux)
  • Enhanced error handling and user feedback
  • Performance optimizations for large conversations

Phase 2: Advanced Features (Next)

  • File upload and document processing
  • Advanced prompt templates and management
  • Export/import conversations (JSON, Markdown)
  • Custom model parameter configuration
  • Plugin architecture foundation
  • Enhanced search within conversation history

Phase 3: Enterprise Features (Future)

  • Multi-user support
  • Cloud synchronization
  • Plugin architecture
  • Usage analytics
  • Advanced security features

Phase 4: Ecosystem (Long-term)

  • Mobile applications
  • Desktop applications
  • API for third-party integrations
  • Marketplace for extensions

📊 Project Status

| Feature | Status | Priority |
| --- | --- | --- |
| Core Chat | ✅ Complete | High |
| Web Search | ✅ Complete | High |
| Settings Panel | ✅ Complete | High |
| Model Management | ✅ Complete | High |
| Theme System | ✅ Complete | High |
| Mobile Support | ✅ Complete | High |
| Custom Ports | ✅ Complete | Medium |
| File Upload | 📋 Planned | Medium |
| Multi-API Support | 📋 Planned | Medium |
| Cloud Sync | 📋 Planned | Low |

🐛 Troubleshooting

Common Issues

Ollama Connection Failed

# Check if Ollama is running
ollama serve

# Verify models are available
ollama list

# Test API endpoint
curl http://localhost:11434/api/tags

# For custom ports, test the specific port
curl http://localhost:YOUR_PORT/api/tags

Model Loading Issues (Windows)

# Force refresh models in the app or try:
ollama list
ollama pull llama2
# Then refresh the model list in BeautifyOllama settings

Web Search Not Working

# Ensure Python dependencies are installed
pip install ollama requests

# Check if SearxNG instances are accessible
# The app will automatically try multiple instances

Build Errors

# Clear Next.js cache
rm -rf .next

# Reinstall dependencies
rm -rf node_modules package-lock.json
npm install

# For Tauri build issues
cd src-tauri && cargo clean

Hydration Errors

  • Clear browser cache and localStorage
  • Restart development server
  • Check for theme provider issues

Getting Help

If your issue isn't covered here, open an issue or start a discussion on GitHub.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Ollama Team - For the excellent local AI runtime
  • Next.js Team - For the amazing React framework
  • Vercel - For seamless deployment platform
  • TailwindCSS - For the utility-first CSS framework
  • Framer Motion - For beautiful animations
  • All Contributors - For making this project better

⭐ Star History

Star History Chart


Made with ❤️ by the BeautifyOllama team

⭐ Star us on GitHub · 🐛 Report Bug · 💬 Join Discussion
