[lora] Fix bug with training without validation #2106


Merged
1 commit merged on Jan 25, 2023

Conversation

orenwang (Contributor)

Currently, due to an indentation mistake, training LoRA without validation (to be able to run at 6.5 GB VRAM as mentioned here) throws UnboundLocalError: local variable 'images' referenced before assignment.

This should be fixed by this PR.
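The failure mode described above can be sketched as follows. This is a minimal illustration of the bug pattern, not the actual diffusers training script; the function and variable names are hypothetical.

```python
def run_epoch(validation_prompt=None):
    """Illustrates the indentation bug: `images` is only bound when
    validation runs, so any later use of it must stay inside the guard."""
    logs = {}
    if validation_prompt is not None:
        # Validation pass: `images` exists only on this branch.
        images = [f"image generated for {validation_prompt!r}"]
        # In the buggy version, the line below was dedented one level and
        # therefore ran even when no validation happened, raising
        # UnboundLocalError: local variable 'images' referenced before assignment.
        logs["validation_images"] = images  # fix: keep inside the if-block
    return logs


print(run_epoch())                       # no validation: no crash after the fix
print(run_epoch("a photo of a cat"))     # validation: images are logged
```

Re-indenting the logging statement so it only executes on the validation branch, as the PR does, removes the unbound-variable path entirely.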

HuggingFaceDocBuilderDev commented Jan 25, 2023

The documentation is not available anymore as the PR was closed or merged.

@pcuenca (Member) left a comment

You are right! Thanks a lot!

pcuenca commented Jan 25, 2023

/cc @sayakpaul @patil-suraj just fyi :)

@pcuenca pcuenca merged commit fb98acf into huggingface:main Jan 25, 2023
@orenwang orenwang deleted the fix-lora-training-bug branch January 26, 2023 08:12
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024