QNN: mobilebert failed to generate Qnn context binary #7946
Comments
Hi @guangy10,
Mine is 2.26. Let me try upgrading to 2.28.
@winskuo-quic Got another issue after updating the QNN version to 2.28. What is the numpy version you are using? I'm getting the following error:
I believe the error above is not directly related to the QNN version; it is more likely caused by a Python library version mismatch. For me, I have:
Yeah, I'm running the same installation script. The Transformers version is the same, and the numpy version is slightly different, but I guess it doesn't matter since it's a patched version. Here is my detailed env:
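As an aside, when comparing environments like this it helps to capture the exact installed versions programmatically rather than from memory. A minimal sketch using only the standard library (`installed_versions` is a hypothetical helper name, not part of the ExecuTorch repo):

```python
import importlib.metadata as md

def installed_versions(packages):
    """Return {distribution name: version string, or None if not installed}."""
    versions = {}
    for pkg in packages:
        try:
            versions[pkg] = md.version(pkg)
        except md.PackageNotFoundError:
            # Package is not installed in this environment.
            versions[pkg] = None
    return versions

# Report the packages discussed in this thread.
print(installed_versions(("numpy", "transformers", "torch")))
```

Pasting the output of something like this into the issue makes environment diffs unambiguous.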
@winskuo-quic It's hard to debug a local dev env. If you think the model is working fine, should we just enable it in the CI? The setup there can be the source of truth for future reference. Here are the QNN models we are currently running on CI: https://github.com/pytorch/executorch/blob/main/.github/workflows/trunk.yml#L305-L329 Can you add mobilebert to it? The CI only needs to test it in
@guangy10,
The same comment applies here as well: #7634 (comment)
#8616 is merged. Can we close it now?
🐛 Describe the bug
python -m examples.qualcomm.scripts.mobilebert_fine_tune -b cmake-out -m SM8450 --compile_only --use_fp16
stacktrace:
Versions
trunk
cc @cccclai @winskuo-quic @shewu-quic