A sophisticated Chrome extension that helps you take control of your social media experience through advanced content filtering and rating, powered by local AI processing with Ollama.
- Real-time content analysis using Ollama AI models
- 0-100 rating scale with visual indicators
- Content classification with icons (💬 Personal, 💼 Business, 💻 Tech, etc.)
- Comprehensive rating components:
- Content Quality (40%)
- Emotional Impact (30%)
- User Preferences (30%)
- Platform-specific content detection (Twitter, Facebook, Reddit, LinkedIn)
- Customizable filtering thresholds
- Visual feedback through color-coded ratings
- Quick actions: Hide/Block content
- Automatic content dimming or hiding based on quality scores
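The weighted combination of rating components above can be sketched in TypeScript. The interface and function names here are illustrative, not the extension's actual API:

```typescript
// Sketch of the composite 0-100 rating built from the three documented
// components (weights taken from the list above).
interface RatingComponents {
  contentQuality: number;   // 0-100
  emotionalImpact: number;  // 0-100
  userPreferences: number;  // 0-100
}

function compositeScore(c: RatingComponents): number {
  const score =
    c.contentQuality * 0.4 +
    c.emotionalImpact * 0.3 +
    c.userPreferences * 0.3;
  // Clamp and round to an integer 0-100 rating
  return Math.round(Math.min(100, Math.max(0, score)));
}

// Example: strong quality but weak preference match
// 90*0.4 + 70*0.3 + 40*0.3 = 36 + 21 + 12 = 69
compositeScore({ contentQuality: 90, emotionalImpact: 70, userPreferences: 40 });
```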
- All processing happens locally using Ollama
- No data sent to external servers
- Complete control over AI model selection
- Your social media content never leaves your device
- Track content quality trends over time
- Identify top authors and content types
- Export analytics data (CSV/JSON)
- LinkedIn-specific feed analysis
- Efficient post detection and processing
- Smart caching system with 24-hour expiration
- Minimal impact on browsing experience
- Real-time processing counter
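The 24-hour rating cache could look like the following sketch. The real extension persists ratings through the Chrome Storage API; this in-memory version (with illustrative names) shows only the expiry logic:

```typescript
// Minimal sketch of a rating cache with 24-hour expiration.
const TTL_MS = 24 * 60 * 60 * 1000; // 24 hours

interface CacheEntry {
  score: number;
  storedAt: number; // epoch milliseconds
}

class RatingCache {
  private entries = new Map<string, CacheEntry>();

  // Clock is injectable for testing; defaults to Date.now.
  constructor(private now: () => number = Date.now) {}

  set(postId: string, score: number): void {
    this.entries.set(postId, { score, storedAt: this.now() });
  }

  // Returns the cached score, or undefined if missing or older than 24h.
  get(postId: string): number | undefined {
    const entry = this.entries.get(postId);
    if (!entry) return undefined;
    if (this.now() - entry.storedAt > TTL_MS) {
      this.entries.delete(postId); // evict stale entry
      return undefined;
    }
    return entry.score;
  }
}
```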
- Clean, intuitive interface
- Dark mode support
- Material Design components
- Responsive overlays and popups
- Welcome screen with feature overview
- Frontend: React 18, TypeScript, Material-UI
- Styling: TailwindCSS, PostCSS
- AI: Ollama (local LLM server)
- Build: Webpack 5, Babel
- Storage: Chrome Storage API, SQL.js for analytics
Before you begin, ensure you have the following:
- Node.js and npm
  - Download and install from nodejs.org
  - Required version: 16.x or higher
- Ollama
  - Download and install from ollama.ai
  - Run Ollama locally: `ollama serve`
  - Pull a model (e.g., `ollama pull llama3.2` or `ollama pull gemma3:4b`)
- Git
  - For cloning the repository

Clone and build the extension:

```bash
git clone https://github.com/yourusername/chrome-extension-hardcode-blackout.git
cd chrome-extension-hardcode-blackout
npm install
npm run build
```

Load it into Chrome:

- Open Chrome and navigate to `chrome://extensions/`
- Enable "Developer mode" (toggle in top right)
- Click "Load unpacked"
- Select the `dist` directory from the project
Make sure Ollama is running on your system:

```bash
ollama serve
```

Common development commands:

```bash
# Development build with watch mode
npm run dev

# Production build
npm run build
npm run prod

# Type checking
npm run type-check

# Linting
npm run lint
npm run lint:fix

# Testing
npm test
npm run test:watch
npm run test:coverage
```

Project structure:

```
src/
├── background/              # Background service worker
│   ├── index.ts             # Main background script
│   ├── ollama-service.ts    # Ollama API integration
│   └── database-service.ts  # Analytics database
├── content/                 # Content scripts
│   ├── content-script.ts    # Main content script
│   └── feed-analyzer.ts     # LinkedIn feed analytics
├── ui/                      # React UI components
│   ├── popup/               # Extension popup
│   └── options/             # Options page
└── utils/                   # Shared utilities and types
```
- Start Ollama: Ensure Ollama is running locally
- Run dev build: `npm run dev` for automatic rebuilding
- Reload extension: Click refresh in `chrome://extensions/` after changes
- Debug: Use Chrome DevTools for debugging
- Install Ollama models:

  ```bash
  # Recommended models
  ollama pull llama3.2   # Fast, balanced
  ollama pull gemma3:4b  # Google's Gemma
  ollama pull mistral    # Mistral AI
  ollama pull phi3       # Microsoft Phi-3
  ```

- Configure in extension:
  - Click the extension icon
  - Go to Settings
  - Select your preferred Ollama model
  - Adjust inference settings (temperature, max tokens)
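Under the hood, the background service worker talks to the local Ollama server with these settings. The sketch below shows one plausible shape for that call, using Ollama's `/api/generate` REST endpoint; the helper names, prompt, and settings interface are assumptions, not the extension's actual code:

```typescript
// Sketch of a call from the background service worker to the local
// Ollama server (default endpoint http://localhost:11434).
interface InferenceSettings {
  model: string;        // e.g. "llama3.2"
  temperature: number;  // sampling temperature
  maxTokens: number;    // mapped to Ollama's num_predict option
}

// Pure helper: build the JSON body for Ollama's /api/generate route.
function buildGenerateRequest(prompt: string, s: InferenceSettings) {
  return {
    model: s.model,
    prompt,
    stream: false, // single JSON response instead of a token stream
    options: {
      temperature: s.temperature,
      num_predict: s.maxTokens,
    },
  };
}

async function ratePost(text: string, settings: InferenceSettings): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(`Rate this post 0-100:\n${text}`, settings)),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response; // Ollama returns the completion in "response"
}
```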
- Auto-Hide Threshold: Posts below this score are hidden (default: 20)
- Dim Threshold: Posts below this score are dimmed (default: 40)
- Highlight Threshold: Posts above this score are highlighted (default: 80)
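These thresholds could be applied roughly as follows (defaults taken from the list above; the type and function names are illustrative):

```typescript
// Sketch of mapping a 0-100 score to a display action using the three
// configurable thresholds.
type DisplayAction = "hide" | "dim" | "normal" | "highlight";

interface Thresholds {
  autoHide: number;   // posts below this are hidden (default 20)
  dim: number;        // posts below this are dimmed (default 40)
  highlight: number;  // posts above this are highlighted (default 80)
}

const DEFAULTS: Thresholds = { autoHide: 20, dim: 40, highlight: 80 };

function actionForScore(score: number, t: Thresholds = DEFAULTS): DisplayAction {
  if (score < t.autoHide) return "hide";
  if (score < t.dim) return "dim";
  if (score > t.highlight) return "highlight";
  return "normal";
}
```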
- Initial Setup:
  - Install the extension
  - Ensure Ollama is running
  - Visit the options page for initial configuration
- Browsing Social Media:
  - Navigate to supported platforms (Twitter, Facebook, Reddit, LinkedIn)
  - Posts are automatically analyzed and rated
  - Look for the rating overlay in the top-right of each post
- Understanding Ratings:
  - 🔴 0-20: Very Poor (auto-hidden)
  - 🟠 20-40: Poor (dimmed)
  - 🟡 40-60: Fair (slightly dimmed)
  - 🟢 60-80: Good (normal display)
  - 🟢 80-100: Excellent (may be highlighted)
- Content Classification Icons:
  - 💬 Personal/Social
  - 💼 Business
  - 💻 Technology
  - 💰 Finance
  - 📰 News
  - 🎬 Entertainment
  - 📚 Education
  - 📢 Promotion/Ads
  - 🏛️ Politics
  - 🌐 Other
- Feed Analytics (LinkedIn only):
  - View content quality trends
  - Track top authors
  - Export data for analysis
- Quick Actions:
  - Hide: Temporarily hide a post
  - Block: Permanently block similar content
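The CSV side of the analytics export could be as simple as the following sketch; the row shape and field names are assumptions for illustration (the extension stores analytics via SQL.js):

```typescript
// Sketch of serializing analytics rows to CSV for export.
interface AnalyticsRow {
  author: string;
  category: string;
  score: number;
}

function toCsv(rows: AnalyticsRow[]): string {
  const header = "author,category,score";
  // Quote fields containing commas, quotes, or newlines per CSV convention.
  const escape = (s: string) =>
    /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  const lines = rows.map((r) =>
    [escape(r.author), escape(r.category), String(r.score)].join(",")
  );
  return [header, ...lines].join("\n");
}
```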
- "Ollama Not Available" error:
  - Ensure Ollama is running: `ollama serve`
  - Check if it's accessible at `http://localhost:11434`
  - Try clicking "Check Ollama Status" in settings
- No ratings appearing:
  - Check the extension console for errors
  - Verify Ollama has a model loaded
  - Ensure you're on a supported platform
- Performance issues:
  - Try a smaller Ollama model (e.g., `phi3`)
  - Adjust max tokens in settings
  - Clear cached ratings in Chrome storage
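A status check like the "Check Ollama Status" action can probe the local server with Ollama's `/api/tags` endpoint, which lists installed models; the helper names below are illustrative, not the extension's actual code:

```typescript
// Sketch of an Ollama availability probe.
interface TagsResponse {
  models: Array<{ name: string }>;
}

// Pure helper: extract model names from a /api/tags payload.
function modelNames(tags: TagsResponse): string[] {
  return tags.models.map((m) => m.name);
}

async function checkOllamaStatus(base = "http://localhost:11434"): Promise<string[]> {
  const res = await fetch(`${base}/api/tags`);
  if (!res.ok) {
    throw new Error("Ollama not available - is `ollama serve` running?");
  }
  return modelNames((await res.json()) as TagsResponse);
}
```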
We welcome contributions! Please see our Contributing Guidelines for details.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
- Local Processing: All AI inference happens on your device via Ollama
- No External APIs: No content is sent to external services
- Data Storage: Only ratings and analytics are stored locally
- Open Source: Full code transparency
This project is licensed under the MIT License - see the LICENSE file for details.
- Ollama for local LLM inference
- Material-UI for UI components
- The open-source community for inspiration and support
Note: This extension requires Ollama to be installed and running on your local machine. It does not include any AI models - you must download them separately through Ollama.
