[WIP] Aligning functional/pil types with functional/tensor #4323
Conversation
I'm trying to use `Union` typing for the alignment, but JIT still doesn't seem to be happy.

It seems TorchScript cannot deal with …

TorchScript can deal with the supported types listed at https://pytorch.org/docs/master/jit_language_reference.html#supported-type

The problem is that the length of …

Ah, the JIT limitations... Using …
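As context for the limitation being discussed (a hedged sketch, not code from the thread): the one union-like annotation TorchScript has long supported is `Optional[T]`, where a `None` check acts as type refinement:

```python
from typing import Optional

import torch

@torch.jit.script
def maybe_tensor(data: Optional[float]) -> torch.Tensor:
    if data is None:
        # On this branch `data` is known to be None
        return torch.tensor(0.0)
    # Here TorchScript refines `data` from Optional[float] to float
    return torch.tensor(data)
```

General `Union[X, Y]` annotations are a strict superset of this, which is why they need the same kind of branch-based refinement to compile.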
Hm, it seems we are missing something about the …

```python
from typing import List, Union

import torch
from torch import jit

def try_script(fn):
    name = fn.__name__
    try:
        jit.script(fn)
    except RuntimeError as error:
        print(f"Scripting of {name} failed with\n{error}")
    else:
        print(f"Scripting of {name} was successful!")
    print("#" * 80)

def foo(data: float) -> torch.Tensor:
    return torch.tensor(data)

def bar(data: List[float]) -> torch.Tensor:
    return torch.tensor(data)

def baz(data: Union[float, List[float]]) -> torch.Tensor:
    return torch.tensor(data)

try_script(foo)
try_script(bar)
try_script(baz)
```
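Whether a `Union`-annotated signature like `baz` scripts successfully depends on the installed torch version, since general `Union` support landed in TorchScript only in later releases. A small probe (a hypothetical sketch, not code from the thread) to check the local build:

```python
from typing import List, Union

import torch
from torch import jit

def baz(data: Union[float, List[float]]) -> torch.Tensor:
    return torch.tensor(data)

# jit.script raises a RuntimeError when an annotation is unsupported
try:
    jit.script(baz)
    union_supported = True
except RuntimeError:
    union_supported = False

print(f"Union annotations scriptable: {union_supported}")
```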
Thoughts? cc @ansley
This is a limitation of our statically-typed language: Type inference can be triggered with a conditional assert statement like … Which all sounds reasonable enough, but, stupidly, you end up writing code like this to be type safe:

```python
import torch
from typing import List, Union

@torch.jit.script
def fn(data: Union[float, List[float]]) -> torch.Tensor:
    if isinstance(data, float):
        # Matches to the `torch.tensor` overload that takes a float arg
        return torch.tensor(data)
    else:
        # Type inferred to be the `torch.tensor` overload that takes a
        # `List[float]` since it's the one remaining Union type we
        # haven't used in all in-scope branches. This is the best
        # type inference I could safely do
        return torch.tensor(data)
```

I would like to do some sort of follow-up where we take values typed as …
@ansley Thanks a lot for the additional information. Just so I got it right: although a function like …
@pmeier Yeah, in the worst case (i.e. when there isn't additional type information), we rely on the user to annotate the code so that we can uniquely identify a function signature at compile time. We can't delay figuring out which function to call until runtime. Maybe I'm just being pedantic, but I genuinely hope it will help make the problem clearer for me to say: … You can see examples of the times we can automatically refine types in the test suite for Union here.
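The conditional-assert refinement mentioned above can be sketched like this (a hypothetical example, not code from the PR): after an `assert isinstance(...)`, TorchScript narrows the `Union` to the asserted member for the rest of the function, so the overload can be resolved at compile time.

```python
from typing import List, Union

import torch

@torch.jit.script
def as_list_tensor(data: Union[float, List[float]]) -> torch.Tensor:
    # The assert refines `data` from Union[float, List[float]] to
    # List[float] below it, letting the compiler pick the List[float]
    # overload of torch.tensor. Calling with a float fails at runtime.
    assert isinstance(data, list)
    return torch.tensor(data)
```

This trades a runtime check for static resolvability, which is exactly the "stupid but type safe" pattern described above.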
Tries to solve #4282.
I think it won't be possible until `Union` is fully supported by TorchScript.
I will update as I try.