This repository was archived by the owner on Nov 23, 2025. It is now read-only.
Feature/adding OpenAI provider and some refactoring #19
Merged: doen1el merged 11 commits into doen1el:main from pwnyprod:feature/adding-openai-provider on Nov 18, 2025
Conversation
- add interface for AI modules to standardize functionality
- implement DuckAIModule for interaction with Duck.ai
- create ChatGPTModule for potential future integration
- update service to initialize and communicate with AI modules

📝 docs(docker-compose): document AI provider configuration
- add comments for AI provider options in docker-compose file

📦 build(requirements): add OpenAI package dependency
- include openai package for ChatGPT module functionality

♻️ refactor(ai_service): streamline AI service functions
- consolidate AI module initialization and function forwarding
- remove redundant code for improved maintainability
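To make the commit above more concrete, here is a minimal sketch of what such an interface could look like, assuming an abstract process_recipe_part method (the name used in a later commit); the real signatures in scrapers/ai_modules/ai_module_interface.py may differ.

```python
# Hypothetical sketch of the AI module interface described in the commit above.
from abc import ABC, abstractmethod


class AIModuleInterface(ABC):
    """Standard contract every AI backend (Duck.ai, OpenAI, ...) must fulfil."""

    @abstractmethod
    def process_recipe_part(self, prompt: str, context: dict | None = None) -> str:
        """Send one recipe-related prompt to the backend and return its raw reply."""


class DuckAIModule(AIModuleInterface):
    def process_recipe_part(self, prompt: str, context: dict | None = None) -> str:
        ...  # placeholder: the real module talks to Duck.ai here


class ChatGPTModule(AIModuleInterface):
    def process_recipe_part(self, prompt: str, context: dict | None = None) -> str:
        ...  # placeholder: the real module calls the OpenAI API here
```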
- simplify docker-compose by removing version declaration
- ensure compatibility across different Docker Compose versions

- change default model from gpt-4 to gpt-5 for improved performance
- ensure compatibility with updated AI module imports in service

- remove outdated comments related to DuckAI prompt logic
- improve code readability by eliminating unnecessary comments

♻️ refactor(ai_service): simplify AI module import comments
- remove German comments from AI module imports
- enhance code clarity by standardizing comments to English

- eliminate browser interactions from scrape_for_mealie.py and scrape_for_tandoor.py
- update recipe processing functions to directly use recipe data without browser

- eliminate unused AIModuleInterface import to clean up code
- improve readability by reducing unnecessary dependencies

- change import paths to reflect new module structure
- improve clarity and organization of the codebase
- change openai.ChatCompletion.create to openai.resources.chat.completions.create for improved code clarity
- ensure compatibility with updated OpenAI API structure
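For context, a hedged example of the non-deprecated call style in openai >= 1.0; the exact attribute path and model used in chat_gpt.py may differ from this sketch.

```python
# Sketch of a chat completion call with the openai >= 1.0 client.
# The model name follows the PR's new default ("gpt-5"); treat it as an assumption.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-5",
    messages=[{"role": "user", "content": "Extract the ingredients of this recipe as JSON."}],
)
print(response.choices[0].message.content)
```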
- enhance process_recipe_part method to accept context for improved handling
- update all implementations of process_recipe_part to support context

💄 style(chat_gpt): improve prompt formatting for clarity
- clean up prompt definitions to ensure clarity and consistency
- provide clear structure for expected JSON responses

✨ feat(api_service): implement payload validation for Tandoor API
- create validate_tandoor_payload function to enforce required fields
- ensure JSON structure adheres to Tandoor API specifications

✨ feat(scrape_for_tandoor): enhance recipe scraping logic
- modify scraping logic to use context for recipe steps and servings
- integrate validation for final JSON output before saving
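A rough sketch of what such a validation helper could look like; the required field names below (name, steps, servings) are assumptions drawn from the scraping changes described in this commit, not the exact Tandoor API schema.

```python
# Hypothetical validate_tandoor_payload; the required fields are assumed, not confirmed.
def validate_tandoor_payload(payload: dict) -> None:
    """Raise ValueError if the payload lacks fields the Tandoor API is expected to need."""
    required_fields = ("name", "steps", "servings")
    missing = [field for field in required_fields if field not in payload]
    if missing:
        raise ValueError(f"Tandoor payload is missing required fields: {missing}")
    if not isinstance(payload["steps"], list) or not payload["steps"]:
        raise ValueError("Tandoor payload must contain a non-empty list of steps")
```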
- add thumbnails directory to .gitignore for cleaner repo
- ignore docker-compose.override.yml to avoid local overrides
- create ScraperService to handle recipe scraping
- integrate MealieProvider and TandoorProvider for flexibility
- replace individual scrape functions in workers with service calls

♻️ refactor(main): simplify main scraping logic
- replace specific scrape functions with ScraperService
- streamline provider selection logic for better maintainability

🔧 chore(scraper_modules): create provider interface
- define RecipeProviderInterface for consistent provider structure
- ensure both MealieProvider and TandoorProvider implement the interface

💄 style(scrape_for_tandoor): remove deprecated scrape functions
- delete old scrape_for_tandoor and scrape_for_mealie files
- clean up codebase for improved readability and organization

💄 style(scraper_modules): organize provider modules
- create separate files for MealieProvider and TandoorProvider
- enhance structure and modularity of the scraping functionality
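A condensed sketch of how the provider interface and service described above could fit together; the method names (process_recipe, scrape) are illustrative assumptions, not the repository's actual API.

```python
# Hypothetical outline of the recipe provider interface and the wrapping service.
from abc import ABC, abstractmethod


class RecipeProviderInterface(ABC):
    """Common contract for recipe backends such as Mealie and Tandoor."""

    @abstractmethod
    def process_recipe(self, recipe_data: dict) -> dict:
        """Turn scraped recipe data into the backend-specific payload."""


class MealieProvider(RecipeProviderInterface):
    def process_recipe(self, recipe_data: dict) -> dict:
        ...  # build and submit a Mealie payload


class TandoorProvider(RecipeProviderInterface):
    def process_recipe(self, recipe_data: dict) -> dict:
        ...  # build, validate, and submit a Tandoor payload


class ScraperService:
    """Hides the concrete provider behind a single scraping entry point."""

    def __init__(self, provider: RecipeProviderInterface):
        self.provider = provider

    def scrape(self, recipe_data: dict) -> dict:
        return self.provider.process_recipe(recipe_data)
```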
Contributor (Author)
PS: Also updated the Tandoor API interactions to work with the new Tandoor 2.2 API.

Owner
Thanks for your contribution!
This pull request introduces a major refactor to the AI module integration for recipe scraping, adding support for multiple AI providers (DuckAI and OpenAI/ChatGPT) and improving modularity and configurability. The changes include a new AI module interface, provider-specific implementations, and updates to the main entry point to support dynamic provider selection via environment variables. Additionally, configuration options for AI providers have been added to docker-compose.yml.

AI Provider Abstraction and Integration
- Added AIModuleInterface in scrapers/ai_modules/ai_module_interface.py, defining a standard interface for AI modules to support multiple providers.
- Added ChatGPTModule and DuckAIModule classes implementing the interface in scrapers/ai_modules/chat_gpt.py and scrapers/ai_modules/duck_ai.py, respectively, allowing seamless switching between OpenAI and DuckAI backends for recipe processing. [1] [2]
- Updated the main entry point (main.py) to use a new ScraperService abstraction and set the RECIPE_PROVIDER environment variable based on CLI arguments, enabling dynamic selection of the AI provider. [1] [2]
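A small sketch of what this environment-driven backend selection could look like; the AI_MODULE values ("duckai", "openai") and the fallback default are assumptions, not confirmed by the diff.

```python
# Hypothetical selection of the AI backend from the AI_MODULE environment variable.
import os

from scrapers.ai_modules.chat_gpt import ChatGPTModule
from scrapers.ai_modules.duck_ai import DuckAIModule


def build_ai_module():
    """Return the configured AI module, falling back to DuckAI (assumed default)."""
    module_name = os.getenv("AI_MODULE", "duckai").lower()
    if module_name == "openai":
        return ChatGPTModule()
    return DuckAIModule()
```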
Configuration and Setup
- Updated docker-compose.yml to add configuration options for selecting the AI provider (AI_MODULE) and supplying the OpenAI API key, making it easier to switch providers and manage secrets.
- Removed the version declaration from docker-compose.yml for compatibility with newer Docker Compose versions.

These changes make the codebase more flexible, maintainable, and ready for further expansion with additional AI providers in the future.
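For completeness, a hedged excerpt of what the resulting docker-compose.yml options might look like; the service name, image, and default value are assumptions, only AI_MODULE and the OpenAI API key are taken from the PR description.

```yaml
# Hypothetical docker-compose.yml excerpt; service name, image, and defaults are assumed.
services:
  recipe-scraper:
    image: example/recipe-scraper:latest
    environment:
      # Select the AI backend, e.g. "duckai" (assumed default) or "openai".
      AI_MODULE: duckai
      # Only needed when AI_MODULE is set to "openai".
      OPENAI_API_KEY: ${OPENAI_API_KEY}
```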