Failed to load data for backend XnnpackBackend #3848
Comments
@allzero-kwon Thanks for the feedback! How did you export xnn_llava_encoder.pte? Could you share the command?
I exported the .pte with XnnpackPartitioner following this guide: add the lines below to llava_encoder and run the export (a rough sketch of that flow follows).
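(For reference, a minimal sketch of that partitioner-based export. The build_llava_encoder() helper is hypothetical, and exact API names can differ between ExecuTorch releases:)

```python
import torch
from executorch.exir import to_edge
from executorch.backends.xnnpack.partition.xnnpack_partitioner import XnnpackPartitioner

# Assumed: build_llava_encoder() returns the LLaVA image encoder module (hypothetical helper).
model = build_llava_encoder().eval()
sample_inputs = (torch.randn(1, 3, 336, 336),)

# Capture the graph, lower it to the Edge dialect, delegate supported
# subgraphs to XNNPACK, then serialize the program to a .pte file.
exported_program = torch.export.export(model, sample_inputs)
edge_program = to_edge(exported_program)
edge_program = edge_program.to_backend(XnnpackPartitioner())
executorch_program = edge_program.to_executorch()

with open("xnn_llava_encoder.pte", "wb") as f:
    f.write(executorch_program.buffer)
```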
@iseeyuan I resolved it! I exported the model with the aot_compiler script and got a result. By the way, I have another problem during inference.
@allzero-kwon Glad you've resolved it! Yes, export should be used to get the graph.
@allzero-kwon could you share how you did the quantization?
I tried with
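(For context, a typical PT2E quantization flow targeting the XNNPACK delegate looks roughly like the sketch below. This is illustrative only, not necessarily the setup used here; the quantizer import path and graph-capture entry point have moved between PyTorch/ExecuTorch releases, so treat the module paths as assumptions:)

```python
import torch
from torch.ao.quantization.quantize_pt2e import prepare_pt2e, convert_pt2e
from torch.ao.quantization.quantizer.xnnpack_quantizer import (
    XNNPACKQuantizer,
    get_symmetric_quantization_config,
)
from executorch.exir import to_edge
from executorch.backends.xnnpack.partition.xnnpack_partitioner import XnnpackPartitioner

# Assumed: `model` and `sample_inputs` are the same encoder and (1, 3, 336, 336)
# inputs as in the export sketch above.
# Note: older releases capture with torch._export.capture_pre_autograd_graph instead.
captured = torch.export.export_for_training(model, sample_inputs).module()

# Annotate with the XNNPACK quantizer (symmetric 8-bit config), run a
# calibration pass with representative inputs, then convert the graph.
quantizer = XNNPACKQuantizer().set_global(get_symmetric_quantization_config())
prepared = prepare_pt2e(captured, quantizer)
prepared(*sample_inputs)  # calibration
quantized = convert_pt2e(prepared)

# Lower the quantized graph to XNNPACK exactly as in the float flow.
edge = to_edge(torch.export.export(quantized, sample_inputs))
edge = edge.to_backend(XnnpackPartitioner())
with open("xnn_llava_encoder_quantized.pte", "wb") as f:  # hypothetical output name
    f.write(edge.to_executorch().buffer)
```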
I'm trying to build llava_encoder using XNNPACK and the Android toolchain.
Whenever I attempt to pass inputs to my model, the runtime fails to load data for the XnnpackBackend delegate.
I've already confirmed that the .pte model works with xnn_executor_runner on Linux (command below), and that the input tensor shape matches the Android input tensor shape (1, 3, 336, 336).
If anyone can help, I would greatly appreciate it.
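(For anyone reproducing this, a rough Python-side check of the same .pte through the ExecuTorch pybind runtime is sketched below; it assumes the pybind module was built with the XNNPACK backend enabled, and the import path may differ depending on how ExecuTorch was installed:)

```python
import torch
# Assumed: portable_lib was built with the XNNPACK backend, otherwise the
# delegate will fail to load here as well.
from executorch.extension.pybindings.portable_lib import _load_for_executorch

module = _load_for_executorch("xnn_llava_encoder.pte")

# Same shape as the Android input: (1, 3, 336, 336).
image = torch.randn(1, 3, 336, 336)
outputs = module.forward([image])
print(outputs[0].shape)
```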
<MainActivity.java>