[torchlib] Allow calling shared functions from other onnx functions #834
Comments
Done by #725?
Ahh, that’s right! I remember this PR, but I got a test error when I tried this, so I must have confused myself. Thanks!
So we need to use … I can think of two options:
@justinchuby We don't need
This change introduces two shared operators, `Rank` and `IsScalar`, which replace the `Size(Shape())` pattern for code reuse and readability. I used a hack to always include these shared functions in the model proto, because without #834 we cannot dynamically add them to the model as they are used. I added a TODO for this. The first usage is in `aten_all`; I will update the rest of the functions in a separate PR. #1095
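To illustrate what the shared helpers compute, here is a minimal NumPy sketch. The function names mirror the `Rank` and `IsScalar` operators described above, but these are hypothetical stand-ins, not the torchlib implementation:

```python
import numpy as np

def rank(x):
    # Equivalent of the ONNX Size(Shape(x)) pattern: Shape yields the
    # shape tensor, Size counts its elements, i.e. the number of dimensions.
    return np.size(np.shape(x))

def is_scalar(x):
    # A scalar is a tensor of rank 0.
    return rank(x) == 0
```

Factoring this pattern into named shared functions makes each call site read as intent (`IsScalar(x)`) rather than mechanics (`Size(Shape(x)) == 0`).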
Currently the torchscript evaluator does not support calling an OnnxFunction from within another OnnxFunction. We should enable this to allow shared implementations. To do this, we will need to keep references to the called functions inside an OnnxFunction.
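One way to keep those references is to record callees on each function object and walk them transitively when serializing, so every needed FunctionProto ends up in the model. A minimal sketch under assumed names (`OnnxFunctionSketch`, `callees`, and `collect_functions` are hypothetical, not the onnxscript API):

```python
class OnnxFunctionSketch:
    """Toy stand-in for an OnnxFunction that remembers which functions it calls."""

    def __init__(self, name, callees=()):
        self.name = name
        self.callees = list(callees)  # references to other function objects

def collect_functions(fn, seen=None):
    """Collect fn and every transitively called function (depth-first),
    keyed by name, so all of them can be added to the model proto."""
    if seen is None:
        seen = {}
    if fn.name in seen:
        return seen
    seen[fn.name] = fn
    for callee in fn.callees:
        collect_functions(callee, seen)
    return seen

# Example dependency chain: aten_all -> IsScalar -> Rank
rank = OnnxFunctionSketch("Rank")
is_scalar = OnnxFunctionSketch("IsScalar", [rank])
aten_all = OnnxFunctionSketch("aten_all", [is_scalar])
```

With this bookkeeping in place, serializing `aten_all` can pull in `IsScalar` and `Rank` automatically instead of relying on the always-include hack mentioned above.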