AddOp(embedding bag) | feat(torchlib) #909


Merged: 40 commits merged into main on Aug 4, 2023

Conversation

xiaowuhu
Contributor

@xiaowuhu xiaowuhu commented Jul 24, 2023

  • This PR only implements the aten_embedding_bag function.
  • The function has 4 outputs; we only care about the first one. For the other 3, we just make the shapes correct and fill them with zeros.
  • aten_embedding_bag_padding_idx will come in another PR.
  • max_norm: I think this is a rare case, but I am not sure. If given, each embedding vector with a norm larger than max_norm is renormalized to have norm max_norm. Note: this modifies weight in place. Not sure whether we need to implement an embedding_renorm function.
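For context, here is a minimal numpy sketch of the embedding_bag semantics described in the bullets above (sum mode only). The function name, the use of numpy, and the exact auxiliary-output shapes are illustrative assumptions for this sketch, not the torchlib implementation; like this PR, it zero-fills the 3 auxiliary outputs and only computes the first one for real:

```python
import numpy as np

def embedding_bag_sum(weight, indices, offsets, max_norm=None):
    """Sketch of aten::embedding_bag semantics (mode='sum').

    Returns 4 outputs as PyTorch does; per the PR, only the first
    carries real values, the other 3 only need the right shape.
    """
    weight = weight.astype(np.float64).copy()
    if max_norm is not None:
        # max_norm renormalizes every looked-up embedding row whose
        # L2 norm exceeds max_norm (PyTorch does this in place on
        # weight; this sketch works on a copy instead).
        for i in np.unique(indices):
            norm = np.linalg.norm(weight[i])
            if norm > max_norm:
                weight[i] *= max_norm / norm
    num_bags = len(offsets)
    dim = weight.shape[1]
    output = np.zeros((num_bags, dim))
    for b in range(num_bags):
        # Bag b covers indices[offsets[b]:offsets[b+1]] (the last
        # bag runs to the end of indices).
        start = offsets[b]
        end = offsets[b + 1] if b + 1 < num_bags else len(indices)
        output[b] = weight[indices[start:end]].sum(axis=0)
    # Auxiliary outputs: zero-filled, shapes chosen for illustration.
    offset2bag = np.zeros(len(indices), dtype=np.int64)
    bag_size = np.zeros(num_bags, dtype=np.int64)
    max_indices = np.zeros(num_bags, dtype=np.int64)
    return output, offset2bag, bag_size, max_indices

# Example: 4 embeddings of dim 2, two bags of two indices each.
out, _, _, _ = embedding_bag_sum(
    np.arange(8.0).reshape(4, 2),   # rows [0,1],[2,3],[4,5],[6,7]
    np.array([0, 1, 2, 3]),
    np.array([0, 2]),
)
# out[0] = row0 + row1 = [2, 4]; out[1] = row2 + row3 = [10, 12]
```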

@xiaowuhu xiaowuhu mentioned this pull request Jul 24, 2023
@codecov

codecov bot commented Jul 24, 2023

Codecov Report

Merging #909 (9e45476) into main (c8959ff) will increase coverage by 0.17%.
The diff coverage is 99.00%.

@@            Coverage Diff             @@
##             main     #909      +/-   ##
==========================================
+ Coverage   76.68%   76.86%   +0.17%     
==========================================
  Files         112      112              
  Lines       13837    13936      +99     
  Branches     1417     1437      +20     
==========================================
+ Hits        10611    10712     +101     
+ Misses       2875     2873       -2     
  Partials      351      351              
Files Changed | Coverage Δ
...ipt/tests/function_libs/torch_lib/ops_test_data.py | 95.52% <ø> (ø)
onnxscript/function_libs/torch_lib/ops/core.py | 78.13% <98.52%> (+0.57%) ⬆️
...ript/tests/function_libs/torch_lib/extra_opinfo.py | 97.83% <100.00%> (+0.34%) ⬆️

... and 1 file with indirect coverage changes

@titaiwangms titaiwangms added the module: torchlib Related to the torch/aten function lib in development label Jul 24, 2023
@justinchuby justinchuby self-requested a review July 24, 2023 23:28
@justinchuby justinchuby self-requested a review July 25, 2023 14:22
@justinchuby justinchuby self-requested a review July 25, 2023 18:17
@xiaowuhu xiaowuhu marked this pull request as draft July 27, 2023 01:26
@github-actions

github-actions bot commented Jul 28, 2023

Test Results

         18 files  ±  0         18 suites  ±0   1h 1m 31s ⏱️ - 1m 44s
    9 748 tests +  5    7 239 ✔️ +  5      2 509 💤 ±0  0 ±0 
135 303 runs  +45  29 568 ✔️ +45  105 735 💤 ±0  0 ±0 

Results for commit 9e45476. ± Comparison against base commit c8959ff.

This pull request removes 128 and adds 133 tests. Note that renamed tests count towards both.
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_131_aten_embedding
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_132_aten_hardtanh
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_133_aten_leaky_relu
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_134_aten_log_sigmoid
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_135_aten_nll_loss_weight
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_136_aten_nll_loss
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_137_aten_reflection_pad2d
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_138_aten_relu
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_139_aten_relu6
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_140_aten_replication_pad2d
…
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_131_aten_embedding_bag
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_132_aten_embedding
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_133_aten_hardtanh
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_134_aten_leaky_relu
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_135_aten_log_sigmoid
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_136_aten_nll_loss_weight
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_137_aten_nll_loss
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_138_aten_reflection_pad2d
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_139_aten_relu
onnxscript.tests.function_libs.torch_lib.ops_test.TestFunctionValidity ‑ test_function_has_op_schema_140_aten_relu6
…

♻️ This comment has been updated with latest results.

@xiaowuhu xiaowuhu marked this pull request as ready for review July 29, 2023 01:50
@xiaowuhu xiaowuhu marked this pull request as draft July 29, 2023 02:18
@xiaowuhu xiaowuhu marked this pull request as ready for review August 3, 2023 23:56
@xiaowuhu xiaowuhu requested a review from gramalingam August 4, 2023 00:22
@xiaowuhu xiaowuhu merged commit 2f39b94 into main Aug 4, 2023
@xiaowuhu xiaowuhu deleted the xiaowu/AddOp(embedding_bag) branch August 4, 2023 02:19