Add a new op embedding. #306


Merged
merged 6 commits into from
Jan 13, 2023

Conversation

fatcat-z
Contributor

Add a new op embedding and its test.
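The forward pass of an embedding op is simply a row lookup into the weight matrix, which maps to a single ONNX Gather on axis 0. The sketch below illustrates that semantics with NumPy; it is a conceptual model of the op, not the onnxscript implementation from this PR.

```python
import numpy as np

def embedding(weight: np.ndarray, indices: np.ndarray) -> np.ndarray:
    """Look up rows of `weight` at `indices` -- the forward semantics of
    aten::embedding, equivalent to ONNX Gather(weight, indices, axis=0)."""
    return np.take(weight, indices, axis=0)

weight = np.arange(12, dtype=np.float32).reshape(4, 3)  # 4 embeddings of dim 3
indices = np.array([[0, 2], [3, 1]])
out = embedding(weight, indices)
assert out.shape == (2, 2, 3)                 # indices shape + embedding dim
assert np.array_equal(out[0, 1], weight[2])   # row 2 was gathered
```

The output shape is the indices shape with the embedding dimension appended, which is exactly the Gather broadcasting behavior.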

@fatcat-z fatcat-z requested a review from xiaowuhu January 11, 2023 01:15
@codecov

codecov bot commented Jan 11, 2023

Codecov Report

Merging #306 (f80920b) into main (edfa7c1) will increase coverage by 0.02%.
The diff coverage is 100.00%.

❗ Current head f80920b differs from pull request most recent head 78f075d. Consider uploading reports for the commit 78f075d to get more accurate results.

@@            Coverage Diff             @@
##             main     #306      +/-   ##
==========================================
+ Coverage   72.92%   72.95%   +0.02%     
==========================================
  Files          97       97              
  Lines        9445     9446       +1     
==========================================
+ Hits         6888     6891       +3     
+ Misses       2557     2555       -2     
Impacted Files Coverage Δ
...t/function_libs/torch_aten/ops_correctness_test.py 95.53% <ø> (ø)
onnxscript/function_libs/torch_aten/ops/core.py 62.61% <100.00%> (+0.09%) ⬆️
onnxscript/converter.py 91.33% <0.00%> (+0.13%) ⬆️


@justinchuby justinchuby added the module: torchlib Related to the torch/aten function lib in development label Jan 11, 2023
weight: TTensor,
indices: TTensor,
**kwargs,  # pylint: disable=unused-argument
) -> TensorType:
Collaborator
Suggested change
**kwargs,  # pylint: disable=unused-argument
**_,

Don’t know if ** works in onnxscript, but if it does:
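In plain Python, the suggested `**_` pattern accepts and discards any keyword arguments without needing a pylint suppression; whether onnxscript's translator supports variadic keyword parameters is the open question in this thread. A minimal plain-Python sketch (the function body here is illustrative, not the real op):

```python
def embedding(weight, indices, **_):
    """Illustration only: `**_` swallows keyword arguments the function
    does not use (e.g. padding_idx, scale_grad_by_freq, sparse)."""
    return [weight[i] for i in indices]

weight = [[1.0, 2.0], [3.0, 4.0]]
# Extra kwargs are accepted and silently ignored.
out = embedding(weight, [1, 0], padding_idx=0, sparse=False)
assert out == [[3.0, 4.0], [1.0, 2.0]]
```

Naming the parameter `_` signals to both readers and linters that the values are intentionally unused.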

Collaborator

Do we need to change behavior based on padding_idx?

Contributor Author

According to the current implementation, padding_idx only impacts training. I think ONNX Script won't know whether the current export runs in training mode, so the warning about a possibly wrong training result should be thrown by the exporter, not onnx-script.

Thoughts?
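The point being discussed can be sketched concretely: padding_idx does not change the forward lookup at all; it only matters during training, where the padded row accumulates no gradient. A hedged NumPy sketch of both halves (the backward function here is a simplified illustration, not PyTorch's actual autograd kernel):

```python
import numpy as np

def embedding_forward(weight, indices, padding_idx=None):
    # padding_idx has no effect on the forward lookup.
    return np.take(weight, indices, axis=0)

def embedding_backward(grad_out, indices, num_embeddings, padding_idx=None):
    """Simplified sketch of where padding_idx matters: the row it names
    accumulates no gradient, so it stays frozen during training."""
    grad_weight = np.zeros((num_embeddings, grad_out.shape[-1]))
    for pos, idx in enumerate(indices):
        if idx != padding_idx:
            grad_weight[idx] += grad_out[pos]
    return grad_weight

weight = np.arange(8, dtype=np.float32).reshape(4, 2)
idx = np.array([0, 1, 0])
# Forward output is identical with or without padding_idx.
assert np.array_equal(embedding_forward(weight, idx),
                      embedding_forward(weight, idx, padding_idx=0))
grad = np.ones((3, 2), dtype=np.float32)
gw = embedding_backward(grad, idx, 4, padding_idx=0)
assert np.array_equal(gw[0], np.zeros(2))  # padded row gets no gradient
assert np.array_equal(gw[1], np.ones(2))   # ordinary row accumulates grad
```

This supports the argument above: an inference-time export of the op can safely ignore padding_idx, since only the gradient path differs.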

Collaborator

I think we should also support any training behaviors when possible.

@justinchuby
Collaborator

justinchuby commented Jan 12, 2023

@gramalingam so variadic keyword arguments are supported?

@fatcat-z fatcat-z merged commit b29a9fe into microsoft:main Jan 13, 2023
@fatcat-z fatcat-z mentioned this pull request Jan 14, 2023