add support for apply probability to CutMix and MixUp #6448


Merged: 1 commit merged into pytorch:main on Aug 18, 2022

Conversation

@pmeier (Collaborator) commented on Aug 18, 2022:

No description provided.

The change to `_BaseMixupCutmix`:

```python
# Before:
def __init__(self, *, alpha: float) -> None:
    super().__init__()

# After:
class _BaseMixupCutmix(_RandomApplyTransform):
    def __init__(self, *, alpha: float, p: float = 0.5) -> None:
```
@pmeier (Collaborator, Author) commented:

The default for other transforms is `p=0.5`:

```python
class RandomErasing(_RandomApplyTransform):
    def __init__(
        self,
        p: float = 0.5,
```

```python
def __init__(self, num_classes: int, p: float = 0.5, alpha: float = 1.0, inplace: bool = False) -> None:
```

In the classification reference we hardcode `p=1.0` for CutMix and MixUp:

```python
if args.mixup_alpha > 0.0:
    mixup_transforms.append(transforms.RandomMixup(num_classes, p=1.0, alpha=args.mixup_alpha))
if args.cutmix_alpha > 0.0:
    mixup_transforms.append(transforms.RandomCutmix(num_classes, p=1.0, alpha=args.cutmix_alpha))
```

I would keep `p=0.5` here and set `p=1.0` in the references, as before. No strong opinion, though.
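For context, a `_RandomApplyTransform` base class gates whether the wrapped transform fires at all, which is what the new `p` parameter controls. A minimal sketch of that pattern (hypothetical class names; not torchvision's actual implementation):

```python
import random


class RandomApplyTransform:
    """Sketch of a transform that applies itself with probability p."""

    def __init__(self, p: float = 0.5) -> None:
        if not 0.0 <= p <= 1.0:
            raise ValueError("p must be in [0, 1]")
        self.p = p

    def _transform(self, sample):
        # Subclasses (e.g. a MixUp/CutMix-style transform) implement this.
        raise NotImplementedError

    def __call__(self, sample):
        # With probability 1 - p, skip the transform and return the input unchanged.
        if random.random() >= self.p:
            return sample
        return self._transform(sample)


class AddOne(RandomApplyTransform):
    """Toy subclass used only to demonstrate the p gating."""

    def _transform(self, sample):
        return sample + 1


always = AddOne(p=1.0)  # always applies: always(1) == 2
never = AddOne(p=0.0)   # never applies:  never(1) == 1
```

With this structure, `p=1.0` recovers the old unconditional behavior (as hardcoded in the classification reference), while `p=0.5` matches the default of the other random transforms.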

@datumbox (Contributor) left a comment:

LGTM, thanks!

@pmeier pmeier merged commit b83d5f7 into pytorch:main Aug 18, 2022
@pmeier pmeier deleted the cutmix-mixup-prob branch August 18, 2022 13:47
facebook-github-bot pushed a commit that referenced this pull request on Aug 25, 2022:
Reviewed By: datumbox

Differential Revision: D39013659

fbshipit-source-id: c63b0ef93980b515e306c0a30f8ab22f2df2476c
3 participants