feat(atenlib): separate inputs from attributes #368
Conversation
Codecov Report
@@            Coverage Diff             @@
##             main     #368      +/-   ##
==========================================
- Coverage   73.66%   73.50%   -0.17%
==========================================
  Files          97       99       +2
  Lines        9710     9821     +111
==========================================
+ Hits         7153     7219      +66
- Misses       2557     2602      +45
    default: Any = _EmptyDefault
    is_input: bool = True

    def __repr__(self) -> str:
Is the return value supposed to be valid Python or just a description?
Just wondering whether str or repr is more appropriate?
Good catch, thanks!
    def __repr__(self) -> str:
        param_kind = "INPUT" if self.is_input else "ATTRIBUTE"
        text = f"{self.name}<{param_kind}>: {self.type}"
nit (optional): f"{self.name} : {param_kind}({self.type})" or f"{param_kind} {self.name} : {self.type}" would be more readable?
Sounds good!
a<INPUT>: INT64
vs.
INPUT a : INT64
a : INPUT(INT64)
I will go with a: Input[INT64]
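For reference, a minimal sketch of a __repr__ that produces the agreed a: Input[INT64] style, using a stand-in dataclass with the same is_input flag quoted above; the PR's final ParamSchema.__repr__ may differ.

    # Hedged sketch only; _ParamReprDemo is a stand-in, not the PR's class.
    import dataclasses


    @dataclasses.dataclass
    class _ParamReprDemo:
        name: str
        type: str
        is_input: bool = True

        def __repr__(self) -> str:
            # Render e.g. "a: Input[INT64]" or "dim: Attribute[int]".
            param_kind = "Input" if self.is_input else "Attribute"
            return f"{self.name}: {param_kind}[{self.type}]"


    print(_ParamReprDemo("a", "INT64"))                  # a: Input[INT64]
    print(_ParamReprDemo("dim", "int", is_input=False))  # dim: Attribute[int]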
    Attributes:
        name: The name of the parameter.
        type: The type of the parameter.
Since we have multiple type representations, which one is this supposed to be? E.g., is this supposed to be Python types (like "int" and "List[FLOAT]")?
I think Python types sound good. For now it is not really enforced, and it could be anything.
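Putting the quoted pieces together, here is a hedged sketch of what the ParamSchema dataclass under discussion could look like with Python types documented in the docstring. The _EmptyDefault sentinel, the is_attribute property name, and the exact field set are assumptions, not the PR's final definition.

    import dataclasses
    from typing import Any


    class _Empty:
        """Sentinel type for parameters without a default (assumed helper)."""


    _EmptyDefault = _Empty()


    @dataclasses.dataclass
    class ParamSchema:
        """A schema for a single function parameter.

        Attributes:
            name: The name of the parameter.
            type: The Python type of the parameter, e.g. int or Sequence[float].
                Not enforced for now; it could be anything.
            default: The default value, or _EmptyDefault if the parameter is required.
            is_input: True if the parameter is an ONNX input; False if it is an attribute.
        """

        name: str
        type: Any = None
        default: Any = _EmptyDefault
        is_input: bool = True

        @property
        def is_attribute(self) -> bool:
            """An attribute is any parameter that is not an input."""
            return not self.is_input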
        return not self.is_input


def extract_param_schema_from_function(onnx_func: values.OnnxFunction) -> List[ParamSchema]:
The current OnnxFunction IR loses the positional-order information between inputs and attributes (since we didn't care about it then, given our assumption that all inputs come first). We can't reconstruct it subsequently.
Perhaps we should consider updating the OnnxFunction IR to store a List[ParamSchema] in the first place ... just thinking out loud, haven't read the rest of the PR yet.
I like this idea! I was going to propose exposing ParamSchema somewhere in the function and op as the next thing to do, so we have a common interface as in #320.
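As a rough illustration of that proposal, the schemas could be captured from the Python signature at decoration time, while the interleaved input/attribute order is still known. This reuses the ParamSchema sketch above; the rule used here to classify a parameter as an attribute (annotated with a plain Python type) is an assumption, not onnxscript's actual logic.

    import inspect
    from typing import Callable, List

    # Assumed rule: parameters annotated with plain Python types are attributes,
    # everything else is treated as a tensor input.
    _ATTRIBUTE_ANNOTATIONS = (int, float, str, bool)


    def param_schemas_from_signature(func: Callable) -> List[ParamSchema]:
        """Build ParamSchemas in declaration order from a Python signature (sketch)."""
        schemas: List[ParamSchema] = []
        for name, param in inspect.signature(func).parameters.items():
            is_input = param.annotation not in _ATTRIBUTE_ANNOTATIONS
            default = (
                _EmptyDefault
                if param.default is inspect.Parameter.empty
                else param.default
            )
            schemas.append(
                ParamSchema(name=name, type=param.annotation, default=default, is_input=is_input)
            )
        return schemas

For example, a function like def add(a: TensorType, b: TensorType, alpha: float = 1.0) would yield schemas in the order a, b, alpha, with alpha classified as an attribute under this assumed rule.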
        onnx_attributes[param.name] = kwargs[param.name]
    else:
        # input doesn't have default
        onnx_attributes[param.name] = param.default
I haven't seen the usage of this function. Perhaps some usage contexts would not want default values to be filled in. Anyway, I guess we can cross that bridge when we come to it.
I added a fill_defaults option to the function.
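For context, a hedged sketch of how a separation helper with such a fill_defaults switch could look, reusing the ParamSchema and _EmptyDefault sketches above. The function name, signature, and missing-argument handling are assumptions based on the PR description, not the code as merged.

    from typing import Any, Dict, List, Sequence, Tuple


    def separate_input_attributes_from_arguments(
        param_schemas: Sequence[ParamSchema],
        args: Sequence[Any],
        kwargs: Dict[str, Any],
        fill_defaults: bool = True,
    ) -> Tuple[List[Any], Dict[str, Any]]:
        """Split Python args/kwargs into ONNX inputs and attributes (sketch)."""
        onnx_inputs: List[Any] = []
        onnx_attributes: Dict[str, Any] = {}
        for i, param in enumerate(param_schemas):
            if i < len(args):
                value = args[i]
            elif param.name in kwargs:
                value = kwargs[param.name]
            elif fill_defaults and param.default is not _EmptyDefault:
                value = param.default
            else:
                # Not supplied, and either required or defaults are not being
                # filled in; the real helper may raise here instead of skipping.
                continue
            if param.is_input:
                onnx_inputs.append(value)
            else:
                onnx_attributes[param.name] = value
        return onnx_inputs, onnx_attributes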
Create param_manipulation for separating inputs and attributes from Python args and kwargs.
Rename typing.py to onnxscript/function_libs/torch_aten/tensor_typing.py.