
fix missed mask arg in torchtext transformer #1758


Merged: 1 commit, merged Jun 6, 2022
Conversation

erichan1
Contributor

@erichan1 erichan1 commented Jun 2, 2022

Add src_key_padding_mask arg to fix numerical difference if there is padding.
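For context, the mask matters because without it, attention also attends to padded positions, which shifts the outputs for padded batches. A minimal sketch (hypothetical helper, not torchtext's actual code) of building a key-padding mask from sequence lengths, following the `src_key_padding_mask` convention of `torch.nn.TransformerEncoder` where `True` marks positions to ignore:

```python
def key_padding_mask(lengths, max_len):
    # One row per sequence in the batch; True marks a padding
    # position that attention should ignore, matching the
    # src_key_padding_mask convention in torch.nn.TransformerEncoder.
    return [[pos >= n for pos in range(max_len)] for n in lengths]

# Batch of two sequences of lengths 3 and 5, padded to length 5:
# the first row masks out its last two (padded) positions.
mask = key_padding_mask([3, 5], max_len=5)
```

This boolean mask would then be passed through to the encoder layers, so that padded and unpadded batches produce numerically identical results for the real tokens.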

@parmeet
Contributor

parmeet commented Jun 2, 2022

Thanks @erichan1 for providing the quick fix to this issue. Unfortunately, I am seeing a failure with the scripted model. Could you please take a look?

test/integration_tests/test_models.py::TestRobertaEncoders::test_roberta_base_model_0_jit Fatal Python error: Aborted

@erichan1
Contributor Author

erichan1 commented Jun 3, 2022

> Thanks @erichan1 for providing the quick fix to this issue. Unfortunately, I am seeing a failure with the scripted model. Could you please take a look?
>
> test/integration_tests/test_models.py::TestRobertaEncoders::test_roberta_base_model_0_jit Fatal Python error: Aborted

This fix relies on pytorch/pytorch#78832 to work. With the two together, the integration tests pass on a local run.

@parmeet
Contributor

parmeet commented Jun 3, 2022

> This fix relies on pytorch/pytorch#78832 to work. With the two together, the integration tests pass on a local run.

Sounds great! Let's just wait for the pytorch nightlies to reflect these changes and we can land/cherry-pick this PR in the release branch.

@parmeet
Contributor

parmeet commented Jun 3, 2022

@Nayef211 This is one of the candidates for cherry-picking.

@Nayef211
Contributor

Nayef211 commented Jun 3, 2022

> @Nayef211 This is one of the candidates for cherry-picking.

Sounds good, thanks for calling it out!

@parmeet
Contributor

parmeet commented Jun 6, 2022

Looks like the model-specific tests are passing. The failures in this PR are related to old issues that are fixed in the main branch. I will go ahead and merge it. cc: @Nayef211

Contributor

@parmeet parmeet left a comment


LGTM! Thanks @erichan1 for providing the fix :)

@parmeet parmeet merged commit cfd1ae2 into main Jun 6, 2022
@parmeet parmeet deleted the erichan1/bt-padding-fix branch June 6, 2022 18:05
Nayef211 pushed a commit to Nayef211/text that referenced this pull request Jun 6, 2022
Nayef211 added a commit that referenced this pull request Jun 7, 2022

4 participants