TypeError: custom_FFT() takes 2 positional arguments but 5 were given #1078
Comments
fft_fft takes a few additional arguments you may want to account for: https://github.com/pytorch/pytorch/blob/e47e946bbf488890858fe1491df3bffa441d9011/aten/src/ATen/native/native_functions.yaml#L13172
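For reference, the schema at that line is roughly `fft_fft(self, n=None, dim=-1, norm=None)`, so whatever you register for `aten::fft_fft` is called with the graph context plus all four of those arguments, i.e. five positionals, which is what the TypeError is counting. A minimal sketch under that assumption (the `custom_domain::FFT` target and the pass-through body are placeholders, not a real FFT lowering):

```python
import torch

# Hypothetical symbolic function for the TorchScript exporter: it receives the
# graph context plus every fft_fft argument (self, n, dim, norm).
def custom_FFT(g, self, n, dim, norm):
    # Placeholder: emit a node in a made-up custom domain and keep the input's
    # type. A real implementation (e.g. one dispatching to an onnxscript
    # function) would build the actual FFT computation here.
    return g.op("custom_domain::FFT", self).setType(self.type())

torch.onnx.register_custom_op_symbolic("aten::fft_fft", custom_FFT, 17)
```

The body will differ in your Colab; the point is only that the registered function's Python signature has to match the ATen overload's full argument list.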
Thanks @justinchuby, that worked like a charm! I now have an ONNX graph that looks like this:
What would be the domain of the symbolic layer of a custom layer initialised as `class FFTLayer(nn.Module): ...`?
The new torch.onnx.dynamo_export should be able to capture layers in the graph by default. Could you try that and let us know? You may need the torch 2.1 rc or nightly build. Documentation: https://pytorch.org/docs/2.1/onnx_dynamo.html#torch.onnx.dynamo_export
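A minimal sketch of that path, assuming torch 2.1 (the tiny `FFTLayer` below is only a stand-in for the module in your Colab):

```python
import torch
import torch.nn as nn

# Stand-in for the actual FFT layer; replace with your own module.
class FFTLayer(nn.Module):
    def forward(self, x):
        return torch.fft.fft(x).real

model = FFTLayer().eval()
x = torch.randn(1, 16)

# Dynamo-based exporter from torch 2.1 (rc/nightly at the time): trace the
# module and save the resulting ONNX model.
onnx_program = torch.onnx.dynamo_export(model, x)
onnx_program.save("fft_layer.onnx")
```

Any ATen ops the exporter cannot yet handle are surfaced through its diagnostics (the SARIF report mentioned below).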
Hey @justinchuby, thank you so much for guiding me through the steps so far. Unfortunately, when I used torch.onnx.dynamo_export it generated the attached SARIF file, which indicates that aten._fft_r2c.default is not supported. I added support in the same fashion as I did for "aten::imag", "aten::real", and "aten::fft_fft"; however, I still got the same error. I also looked into alternatives for converting my entire nn.Module into a single graph node and found the following description: https://ramkrishna2910.medium.com/what-why-and-how-onnx-script-74dd21ab396f (see item 2). Is this method outdated? Again, thank you for your help so far.
Thanks for catching this. Indeed, torch dynamo uses _fft_r2c for FFT ops instead of the fft_fft used in TorchScript. We are implementing the operator in #926. It is a work in progress and a little tricky to get right.
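If you want to confirm this on your side, one rough way (assuming torch 2.1's `torch.export` and reusing the stand-in `FFTLayer` from the earlier sketch) is to print the traced ATen graph and look for `_fft_r2c`:

```python
import torch
import torch.nn as nn

# Same stand-in module as in the earlier sketch.
class FFTLayer(nn.Module):
    def forward(self, x):
        return torch.fft.fft(x).real

# The ATen-level graph should contain torch.ops.aten._fft_r2c.default for a
# real-valued input, rather than an fft_fft node.
ep = torch.export.export(FFTLayer(), (torch.randn(1, 16),))
print(ep.graph)
```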
This is done. |
Hey everyone,
I am trying to export my FFT layer to ONNX using onnxscript; however, I get the following error:
"TypeError: custom_FFT() takes 2 positional arguments but 5 were given
(Occurred when translating fft_fft)."
Could you guys give me a hint where I am going wrong?
I have attached my Colab notebook in case you wish to do some experimentation:
https://colab.research.google.com/drive/12fdcvc-BMciFAzfVhz0ARfwxrfQ3M9V6?usp=sharing