Fixed issues with dtype in geom functional transforms v2 #7211
vfdev-5 merged 16 commits into pytorch:main from … proto-apply-grid-transform-restore-dtype-cast

Conversation
```python
    img: torch.Tensor, grid: torch.Tensor, mode: str, fill: datapoints.FillTypeJIT
) -> torch.Tensor:
    ...
    fp = img.dtype == grid.dtype
```
Could you please explain why we're sure that `img.dtype` is float iff it's the same as the grid dtype? Can't we just use `is_floating_dtype()`?
Yes, we can use `is_floating_dtype`. I relied on the contextual knowledge that `grid` should have a float dtype.
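To make the discussion concrete, here is a hedged sketch of the cast-and-restore pattern being reviewed. The helper names `_is_floating_dtype` and `apply_grid_transform_sketch`, and the rounding step, are illustrative assumptions rather than the actual torchvision implementation; only the `fp` check and the overall shape mirror the diff above.

```python
import torch


def _is_floating_dtype(dtype: torch.dtype) -> bool:
    # Stand-in for the is_floating_dtype() check suggested in review:
    # grid_sample only operates on floating-point inputs.
    return dtype.is_floating_point


def apply_grid_transform_sketch(
    img: torch.Tensor, grid: torch.Tensor, mode: str = "bilinear"
) -> torch.Tensor:
    # Hypothetical sketch of the dtype-restore pattern: cast non-float
    # images to the grid's (float) dtype, sample, then cast back.
    fp = _is_floating_dtype(img.dtype)
    original_dtype = img.dtype
    if not fp:
        img = img.to(grid.dtype)
    out = torch.nn.functional.grid_sample(
        img, grid, mode=mode, padding_mode="zeros", align_corners=False
    )
    if not fp:
        # Round before casting back so integer outputs don't truncate.
        out = out.round_().to(original_dtype)
    return out
```

A usage sketch: feeding a `uint8` image through an identity grid returns a `uint8` result, whereas calling `grid_sample` directly on the integer tensor would fail.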
Thanks for the PR @vfdev-5. It's not really clear to me which bugs this PR is trying to address. Could you help me understand?
@NicolasHug This addresses the second point in #7159 (comment). TL;DR: You saw tests for …
Hey @vfdev-5! You merged this PR, but no labels were added. The list of valid labels is available at https://github.com/pytorch/vision/blob/main/.github/process_commit.py
Description story:
Philip (@pmeier) tried to check whether other floating dtypes (f16, f64, etc.) work for the transforms here: #7195.
It seems that f16 is not working even for the stable transforms (ref), but for f64 we had a few issues that this PR is supposed to fix.
Some transforms can't work properly with f64, e.g. perspective; JIT is inconsistent, etc.
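As an illustration of the kind of f64 dtype friction mentioned above (an assumed repro, not code taken from this PR): `grid_sample` requires the image and the grid to share a dtype, so a float64 grid paired with a float32 image fails outright unless one side is cast.

```python
import torch
import torch.nn.functional as F

img = torch.rand(1, 1, 4, 4, dtype=torch.float32)

# Build a float64 identity grid; mixing it with a float32 image is the
# kind of mismatch the dtype-cast fix has to handle.
theta = torch.eye(2, 3, dtype=torch.float64).unsqueeze(0)
grid = F.affine_grid(theta, (1, 1, 4, 4), align_corners=False)

try:
    F.grid_sample(img, grid, align_corners=False)
except RuntimeError as exc:
    # grid_sample rejects mismatched input/grid dtypes.
    print("dtype mismatch:", exc)

# Casting the image to the grid's dtype makes the call succeed.
out = F.grid_sample(img.to(grid.dtype), grid, align_corners=False)
```

The cast-to-grid-dtype-then-restore approach in this PR resolves exactly this class of failure for non-default float dtypes.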
cc @bjuncek @pmeier