
Add a new op embedding. #306

Merged

fatcat-z merged 6 commits into microsoft:main from fatcat-z:new_ops_2
Jan 13, 2023

Conversation

@fatcat-z
Contributor

Add a new op embedding and its test.
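For context, ATen `embedding` is essentially a row lookup into the weight matrix, which maps naturally to ONNX `Gather` along axis 0. A minimal reference sketch of that semantics in NumPy (illustrative only; the function name and shapes are assumptions, not the PR's actual implementation):

```python
import numpy as np

def embedding_reference(weight: np.ndarray, indices: np.ndarray) -> np.ndarray:
    """Reference semantics of aten::embedding as a gather on axis 0.

    ONNX Gather(weight, indices, axis=0) behaves like NumPy fancy
    indexing on the first axis: each index selects one embedding row.
    """
    return weight[indices]

# 4 embeddings of dimension 3; indices may have any shape.
weight = np.arange(12, dtype=np.float32).reshape(4, 3)
indices = np.array([[0, 2], [3, 1]])
out = embedding_reference(weight, indices)

# Output shape is indices.shape + (embedding_dim,).
assert out.shape == (2, 2, 3)
```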

@fatcat-z fatcat-z requested a review from xiaowuhu January 11, 2023 01:15
@codecov

codecov bot commented Jan 11, 2023

Codecov Report

Merging #306 (f80920b) into main (edfa7c1) will increase coverage by 0.02%.
The diff coverage is 100.00%.

❗ Current head f80920b differs from pull request most recent head 78f075d. Consider uploading reports for the commit 78f075d to get more accurate results

@@            Coverage Diff             @@
##             main     #306      +/-   ##
==========================================
+ Coverage   72.92%   72.95%   +0.02%     
==========================================
  Files          97       97              
  Lines        9445     9446       +1     
==========================================
+ Hits         6888     6891       +3     
+ Misses       2557     2555       -2     
Impacted Files Coverage Δ
...t/function_libs/torch_aten/ops_correctness_test.py 95.53% <ø> (ø)
onnxscript/function_libs/torch_aten/ops/core.py 62.61% <100.00%> (+0.09%) ⬆️
onnxscript/converter.py 91.33% <0.00%> (+0.13%) ⬆️


@justinchuby justinchuby added the module: torchlib Related to the torch/aten function lib in development label Jan 11, 2023
weight: TTensor,
indices: TTensor,
**kwargs,  # pylint: disable=unused-argument
) -> TensorType:
Collaborator

Suggested change:
-    **kwargs,  # pylint: disable=unused-argument
+    **_,

Don't know if ** works in onnxscript, but if it does:
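In plain Python, `**_` collects and discards any unrecognized keyword arguments just like `**kwargs` does, while signaling that they are intentionally unused; whether onnxscript's translator accepts either form is the open question in this thread. A minimal illustration (the stub function and its keyword names are hypothetical, not onnxscript code):

```python
def aten_embedding_stub(weight, indices, **_):
    # Extra keyword arguments (e.g. sparse=..., scale_grad_by_freq=...)
    # are silently accepted and ignored via **_.
    return [weight[i] for i in indices]

# Callers may pass keywords the op does not use without raising TypeError.
result = aten_embedding_stub([10, 20, 30], [2, 0], sparse=False)
assert result == [30, 10]
```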

Collaborator

Do we need to change behavior based on padding_idx?

Contributor Author

According to the current implementation, padding_idx only impacts training. I don't think ONNX Script can know whether the current export is in training mode, so the warning about a possibly wrong training result should be thrown by the exporter, not onnx-script.

Thoughts?
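A quick check of the claim above (illustrative NumPy, not the PR's code): in PyTorch, `padding_idx` leaves the forward lookup unchanged and only zeroes the gradient of that weight row during training, so an inference-only gather can ignore it:

```python
import numpy as np

weight = np.arange(12, dtype=np.float32).reshape(4, 3)
padding_idx = 1          # assumed example value
indices = np.array([0, 1, 3])

# Forward pass: a plain gather. Row 1 is returned as-is even though it
# is the padding row; padding_idx matters only in the backward pass,
# where the gradient w.r.t. weight[padding_idx] is forced to zero.
out = weight[indices]
assert (out[1] == weight[padding_idx]).all()
```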

Collaborator

I think we should also support any training behaviors when possible.

@justinchuby
Collaborator

justinchuby commented Jan 12, 2023

@gramalingam so variadic keyword arguments are supported?

Co-authored-by: Justin Chu <justinchuby@users.noreply.github.com>
@fatcat-z fatcat-z merged commit b29a9fe into microsoft:main Jan 13, 2023
@fatcat-z fatcat-z mentioned this pull request Jan 14, 2023
