Support exporting QNN models with Python wheels out-of-the-box #9474
Labels: module: build/install, partner: qualcomm
As part of #9019, we recently added CoreML export support "out of the box" for the executorch pip package: #9483. Developers can now export to the CoreML backend just by pip installing executorch; previously this required cloning the ExecuTorch repo and building everything from source. We anticipate this will encourage ExecuTorch adoption for CoreML.
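For reference, a minimal sketch of what that CoreML export flow looks like with just the pip package (following the ExecuTorch CoreML backend docs; exact module paths may differ between releases):

```python
import torch
import torchvision.models as models
from torchvision.models.mobilenetv2 import MobileNet_V2_Weights

from executorch.backends.apple.coreml.partition import CoreMLPartitioner
from executorch.exir import to_edge_transform_and_lower

# Export a stock MobileNetV2 and lower the supported subgraphs to the CoreML backend.
mobilenet_v2 = models.mobilenet_v2(weights=MobileNet_V2_Weights.DEFAULT).eval()
sample_inputs = (torch.randn(1, 3, 224, 224),)

et_program = to_edge_transform_and_lower(
    torch.export.export(mobilenet_v2, sample_inputs),
    partitioner=[CoreMLPartitioner()],
).to_executorch()

with open("mv2_coreml.pte", "wb") as f:
    et_program.write_to_file(f)
```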
We want to provide the same developer experience for exporting models to Android devices running on Qualcomm chips.
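For comparison, here is a rough sketch of the flow we would like to enable out of the box. The helper names below (QnnPartitioner, generate_qnn_executorch_compiler_spec, generate_htp_compiler_spec, QcomChipset) follow what the Qualcomm backend exposes when built from source today, but the exact module paths and signatures are assumptions and may differ:

```python
# Illustrative sketch only: module paths and helper signatures are assumptions.
import torch

from executorch.exir import to_edge_transform_and_lower
from executorch.backends.qualcomm.partition.qnn_partitioner import QnnPartitioner
from executorch.backends.qualcomm.utils.utils import (
    generate_htp_compiler_spec,
    generate_qnn_executorch_compiler_spec,
)
# Chipset enum location is an assumption; it has moved between releases.
from executorch.backends.qualcomm.serialization.qc_schema import QcomChipset


class Model(torch.nn.Module):
    def forward(self, x):
        return torch.nn.functional.relu(x)


sample_inputs = (torch.randn(1, 3, 224, 224),)
exported = torch.export.export(Model().eval(), sample_inputs)

# Target an HTP-capable SoC. With the QNN libraries bundled into the wheel,
# this step would not require a separately downloaded QNN SDK.
compiler_specs = generate_qnn_executorch_compiler_spec(
    soc_model=QcomChipset.SM8650,
    backend_options=generate_htp_compiler_spec(use_fp16=True),
)

et_program = to_edge_transform_and_lower(
    exported,
    partitioner=[QnnPartitioner(compiler_specs)],
).to_executorch()

with open("model_qnn.pte", "wb") as f:
    et_program.write_to_file(f)
```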
Currently, this requires developers to manually download the QNN SDK, clone the ExecuTorch repo, and build everything from source. As with CoreML, we should bundle these dependencies into the ExecuTorch pip package. Given that the QNN libraries are currently only distributed as a zip file, here are some potential next steps:
cc @larryliu0820 @cccclai @winskuo-quic @shewu-quic @cbilgin @lucylq