Merge dev branch #7260
Conversation
This reverts commit 977ffba.
Pull Request Overview
This PR merges the dev branch, upgrading several Python package dependencies and reworking Gradio image handling to pass file paths instead of PIL objects.
- Updates multiple Python package versions including transformers, bitsandbytes, and exllamav3
- Changes Gradio image components from PIL to filepath handling for better compatibility (see the sketch after this list)
- Adds flash-linear-attention dependency and improves ExLlamav3 model handling
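As context for the image-handling change, here is a minimal sketch of what the component switch implies on the Gradio side, assuming a simple Blocks demo; the `describe_image` handler and component names are illustrative only, not code from this repository:

```python
import gradio as gr
from PIL import Image

# With type="pil" the callback used to receive a PIL.Image object directly.
# With type="filepath" it receives a path string instead, so the handler
# opens the file itself only when it actually needs the pixel data.
def describe_image(image_path):
    if image_path is None:  # no image attached
        return "No image attached."
    with Image.open(image_path) as img:
        return f"Received a {img.size[0]}x{img.size[1]} image from {image_path}"

with gr.Blocks() as demo:
    image_input = gr.Image(type="filepath", label="Attach an image")  # was type="pil"
    output = gr.Textbox(label="Result")
    image_input.change(describe_image, inputs=image_input, outputs=output)

if __name__ == "__main__":
    demo.launch()
```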
Reviewed Changes
Copilot reviewed 25 out of 25 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| requirements files | Updates llama-cpp-binaries from v0.46.0 to v0.49.0, transformers to 4.57.*, bitsandbytes to 0.48.*, and adds flash-linear-attention |
| modules/ui_chat.py | Changes Gradio Image components from type='pil' to type='filepath' |
| modules/chat.py | Refactors image handling to work with file paths instead of PIL objects |
| modules/exllamav3.py | Improves model loading and adds logits processing functionality |
| modules/exllamav3_hf.py | Streamlines forward pass implementation and fixes sequence handling |
| modules/models.py | Updates ExLlamav3 loader to return tuple format |
| modules/torch_utils.py | Adds device detection from model attribute (see the device sketch below) |
| modules/logits.py | Extends ExLlamav3 support in logits processing |
| download-model.py | Handles HTTP 416 status code for completed downloads (see the resume sketch below) |
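For the modules/torch_utils.py row, a minimal sketch of one common way to detect a device from a model attribute in PyTorch; the `get_device` helper name and its fallback order are assumptions, not the repository's actual logic:

```python
import torch

def get_device(model) -> torch.device:
    # Some wrappers (e.g. Hugging Face models) expose the device as an attribute.
    device = getattr(model, "device", None)
    if device is not None:
        return torch.device(device)
    # Otherwise fall back to the device of the first parameter, if any.
    try:
        return next(model.parameters()).device
    except StopIteration:
        return torch.device("cpu")
```

For the download-model.py row, a minimal sketch of the general resume pattern with `requests`, where an HTTP 416 response indicates the local file is already complete; the `resume_download` helper, its arguments, and the chunk size are hypothetical, not the script's actual code:

```python
import os
import requests

def resume_download(url: str, output_path: str) -> None:
    # Ask the server for only the bytes we do not have yet.
    existing = os.path.getsize(output_path) if os.path.exists(output_path) else 0
    headers = {"Range": f"bytes={existing}-"} if existing else {}
    response = requests.get(url, headers=headers, stream=True)

    # 416 (Range Not Satisfiable) means the local file already covers the
    # full content, so the download is complete and there is nothing to do.
    if response.status_code == 416:
        print(f"{output_path} is already fully downloaded.")
        return

    response.raise_for_status()
    mode = "ab" if existing and response.status_code == 206 else "wb"
    with open(output_path, mode) as f:
        for chunk in response.iter_content(chunk_size=1024 * 1024):
            f.write(chunk)
```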
Co-authored-by: Copilot <[email protected]>
…n continuing downloads" This reverts commit 1aa2b92.
No description provided.