Add ability to specify pipelineable preproc modules to ignore during SDD model rewrite #2149


Closed
wants to merge 1 commit

Conversation

@sarckk sarckk commented Jun 20, 2024

Summary:
Make TorchRec automatically pipeline any modules that do not have trainable parameters during sparse data dist pipelining.

TL;DR: with some traversal-logic changes, the TorchRec sparse data dist pipeline can support arbitrary input transformations at the input dist stage, as long as they are composed of either nn.Module calls or currently supported ops (mainly getattr and getitem).
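The eligibility rule described above can be sketched in plain Python. All names below (`can_pipeline`, `ALLOWED_CALL_FUNCTIONS`, the op-pair format) are invented for illustration and are not TorchRec's actual API: a traced op chain is pipelineable ahead of the sparse data dist stage only if every op is an nn.Module call on a module without trainable parameters, or one of the supported ops (getattr/getitem).

```python
# Hypothetical sketch of the traversal rule described above; the names
# here are invented for illustration and are not TorchRec's actual API.

# Ops allowed at the input dist stage besides nn.Module calls.
ALLOWED_CALL_FUNCTIONS = {"getattr", "getitem"}

def can_pipeline(traced_ops, trainable_modules):
    """Return True if every traced op is safe to move into the
    sparse data dist (input dist) stage.

    traced_ops: list of (kind, target) pairs, e.g.
        [("call_module", "preproc"), ("getitem", "features")]
    trainable_modules: names of modules that have trainable parameters.
    """
    for kind, target in traced_ops:
        if kind == "call_module":
            # Modules without trainable params are pipelined automatically.
            if target in trainable_modules:
                return False
        elif kind not in ALLOWED_CALL_FUNCTIONS:
            # Anything else (arbitrary function calls, math ops, ...)
            # blocks pipelining of this chain.
            return False
    return True
```

Under this sketch, a parameter-free preproc module followed by a getitem access would be pipelined, while a chain that calls a trainable module (or an unsupported op) would be left in place.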

Differential Revision: D57944338

@facebook-github-bot added the CLA Signed label (this label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) on Jun 20, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D57944338


@sarckk force-pushed the export-D57944338 branch from f497136 to eab0f86 on June 24, 2024 17:29
sarckk added a commit to sarckk/torchrec that referenced this pull request Jun 24, 2024
…SDD model rewrite (pytorch#2149)


@sarckk force-pushed the export-D57944338 branch from eab0f86 to 54047a5 on June 24, 2024 18:00
sarckk added a commit to sarckk/torchrec that referenced this pull request Jun 24, 2024
…SDD model rewrite (pytorch#2149)

…SDD model rewrite (pytorch#2149)

Pull Request resolved: pytorch#2149

Reviewed By: dstaay-fb

Differential Revision: D57944338

Labels: CLA Signed, fb-exported