
Conversation

Contributor

@eustlb eustlb commented Jun 19, 2025

What does this PR do?

Adds Kyutai's new STT model 🚀

Contributor Author
@eustlb eustlb left a comment


@ArthurZucker I commented below on the last changes that require your validation 🤗

Comment on lines +4660 to +4664

```diff
 # Update: to extend _keep_in_fp32_modules flag feature, it can also be used to force modules that should stay in fp32
 if model._keep_in_fp32_modules is not None and (
-    torch_dtype == torch.float16 or getattr(hf_quantizer, "use_keep_in_fp32_modules", False)
+    torch_dtype == torch.float16
+    or torch_dtype == torch.bfloat16
+    or getattr(hf_quantizer, "use_keep_in_fp32_modules", False)
```

as discussed offline, let's extend `_keep_in_fp32_modules` for more intuitive functioning
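The idea behind the extended check can be sketched as follows: after the model is cast to a half-precision dtype (now bfloat16 as well as float16), any module whose name matches the `_keep_in_fp32_modules` list is forced back to fp32. This is a minimal illustration, not the actual `transformers` loading code; the model, module names, and `KEEP_IN_FP32_MODULES` list are hypothetical.

```python
import torch
from torch import nn

# Hypothetical keep-list; in transformers this lives on the model class
# as _keep_in_fp32_modules.
KEEP_IN_FP32_MODULES = ["codec"]

class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.codec = nn.Linear(4, 4)  # should stay in fp32
        self.head = nn.Linear(4, 4)   # may be downcast

torch_dtype = torch.bfloat16
model = TinyModel().to(torch_dtype)

# The extended condition: trigger for bfloat16 as well as float16.
if torch_dtype in (torch.float16, torch.bfloat16):
    for name, module in model.named_modules():
        if any(key in name for key in KEEP_IN_FP32_MODULES):
            module.to(torch.float32)  # force matching modules back to fp32

print(model.codec.weight.dtype)  # torch.float32
print(model.head.weight.dtype)   # torch.bfloat16
```

Before the change, a bfloat16 load would have skipped this upcast entirely, leaving the keep-list modules in reduced precision.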

"processing",
"image_processing",
"video_processing",
"feature_extractor",

discussed offline with @Cyrilvallez, the correct convention for file naming is currently `feature_extraction`, yet I do agree that `feature_extractor` sounds better. Nonetheless, let's keep it as it is for now for consistency

@ArthurZucker
Collaborator

Yes let's go!

@eustlb eustlb enabled auto-merge (squash) June 24, 2025 15:24
@eustlb eustlb disabled auto-merge June 24, 2025 15:51
@eustlb eustlb enabled auto-merge (squash) June 24, 2025 15:55
@LysandreJik LysandreJik disabled auto-merge June 24, 2025 16:01
@LysandreJik LysandreJik merged commit 6bdd4ec into huggingface:main Jun 24, 2025
20 checks passed


4 participants