[proto] Draft for Transforms API v2 #6205
Changes from 22 commits
@@ -7,7 +7,52 @@
 F = TypeVar("F", bound="_Feature")


-class _Feature(torch.Tensor):
+class _TransformsMixin:
+    def __init__(self, *args, **kwargs):
+        super().__init__()
+
+        # To avoid circular dependency between features and transforms
+        from ..transforms import functional as F
+
+        self._F = F
+
+    def horizontal_flip(self):
+        # Just output itself
+        # How dangerous to do this instead of raising an error ?
+        return self
Review comments on this no-op default:

- "I would raise a 'not implemented' error here and implement the no-op explicitly on the implementations. So if bbox can't blur, then it should explicitly implement it as a no-op, as opposed to leaving the default implementation which returns self. Could help us avoid issues, but if you disagree I'm happy to leave as-is and discuss later."
- "I agree that raising …"
- "Up to you. I've added this for your consideration. Feel free to ignore or postpone for later."
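The alternative raised in the review (fail loudly by default, and make each feature type opt into a no-op explicitly) can be sketched as follows. This is a hypothetical illustration, not the PR's actual code: the method `gaussian_blur` and the simplified `BoundingBox`/`Image` classes are stand-ins chosen to match the reviewer's "bbox can't blur" example.

```python
class _TransformsMixin:
    def gaussian_blur(self, kernel_size):
        # Reviewer's suggestion: raise by default instead of
        # silently returning self.
        raise NotImplementedError(
            f"{type(self).__name__} does not implement gaussian_blur"
        )


class BoundingBox(_TransformsMixin):
    def gaussian_blur(self, kernel_size):
        # Blurring cannot change box coordinates, so the no-op is
        # declared explicitly here rather than inherited implicitly.
        return self


class Image(_TransformsMixin):
    pass  # would implement the real kernel; omitted in this sketch


box = BoundingBox()
assert box.gaussian_blur(3) is box  # explicit, intentional no-op

try:
    Image().gaussian_blur(3)
except NotImplementedError as e:
    print(e)  # prints: Image does not implement gaussian_blur
```

The trade-off discussed in the thread: the silent default is convenient, but an explicit per-type no-op documents intent and turns a forgotten implementation into an immediate error instead of a silently wrong pipeline.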
+
+    def vertical_flip(self):
+        # Just output itself
+        # How dangerous to do this instead of raising an error ?
+        return self
+
+    def resize(self, size, *, interpolation, max_size, antialias):
+        # Just output itself
+        # How dangerous to do this instead of raising an error ?
+        return self
+
+    def center_crop(self, output_size):
+        # Just output itself
+        # How dangerous to do this instead of raising an error ?
+        return self
+
+    def resized_crop(self, top, left, height, width, *, size, interpolation, antialias):
+        # Just output itself
+        # How dangerous to do this instead of raising an error ?
+        return self
+
+    def pad(self, padding, *, fill, padding_mode):
+        # Just output itself
+        # How dangerous to do this instead of raising an error ?
+        return self
+
+    def rotate(self, angle, *, interpolation, expand, fill, center):
+        # Just output itself
+        # How dangerous to do this instead of raising an error ?
+        return self
+
+
+class _Feature(_TransformsMixin, torch.Tensor):

     def __new__(
         cls: Type[F],
         data: Any,
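The overall pattern in the diff, a transforms mixin with no-op defaults, mixed into a `torch.Tensor` subclass that dispatches to the functional API, can be sketched end to end. This is an illustrative reduction, not the PR's exact implementation: `_FakeFunctional` stands in for torchvision's `transforms.functional` module (the real code defers `from ..transforms import functional as F` inside `__init__` to break the circular import), and the simplified `Image`/`BoundingBox` subclasses are assumptions for demonstration.

```python
import torch


class _FakeFunctional:
    # Stand-in for torchvision's functional module.
    @staticmethod
    def horizontal_flip(t):
        return torch.flip(t, dims=[-1])


class _TransformsMixin:
    def __init__(self, *args, **kwargs):
        super().__init__()
        # The PR defers the real import here to avoid a
        # features <-> transforms circular dependency; stubbed out.
        self._F = _FakeFunctional

    def horizontal_flip(self):
        # Default from the draft: silently return self.
        return self


class _Feature(_TransformsMixin, torch.Tensor):
    def __new__(cls, data):
        # Re-tag the tensor's Python class without copying its storage.
        return torch.Tensor._make_subclass(cls, torch.as_tensor(data))


class Image(_Feature):
    def horizontal_flip(self):
        # Dispatch to the (stubbed) functional kernel and re-wrap.
        return Image(self._F.horizontal_flip(self))


class BoundingBox(_Feature):
    # Inherits the mixin's silent no-op: exactly the behavior the
    # review thread flags as potentially dangerous, since flipping
    # *should* change box coordinates.
    pass


img = Image([[1.0, 2.0], [3.0, 4.0]])
flipped = img.horizontal_flip()
assert flipped.tolist() == [[2.0, 1.0], [4.0, 3.0]]

box = BoundingBox([0.0, 0.0, 1.0, 1.0])
assert box.horizontal_flip() is box  # falls through to the no-op default
```

The `BoundingBox` case shows concretely why the review asks "how dangerous" the default is: the call succeeds and returns unchanged coordinates, which would be silently wrong in a real flip pipeline.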