Added annotation typing to densenet #2860
Conversation
Thanks for the PR!
mypy is complaining because of `Tuple[int]`, could you have a look?
Otherwise looks good to me!
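(For reference — in `typing`, `Tuple[int]` denotes a tuple of exactly one `int`, while densenet's `block_config` is a 4-tuple such as `(6, 12, 24, 16)`; a minimal sketch of the distinction:)

```python
from typing import Tuple

# Tuple[int] means "a tuple with exactly one int element".
one: Tuple[int] = (6,)

# A fixed-length 4-tuple needs all four element types spelled out,
# which is the form the PR ends up using for block_config.
block_config: Tuple[int, int, int, int] = (6, 12, 24, 16)

# For an arbitrary-length homogeneous tuple, use an ellipsis.
any_length: Tuple[int, ...] = (6, 12, 24)
```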
I'm still getting some errors, and I'm not sure how you prefer to handle them:
We could directly set the attribute modules, but that would mean extra changes. I'd need a minimal change suggestion on this one 😅
Maybe @pmeier knows of a trick, but I would prefer to disable mypy in this block instead of rewriting the code.
We will always face issues if we dynamically assign attributes, since mypy cannot track attributes created via `add_module`. The workaround is to annotate the attribute before registering the module:

```python
self.norm1: nn.BatchNorm2d
self.add_module('norm1', nn.BatchNorm2d(num_input_features))
```
Alright, I'll change that then!
Probably an artifact from a previous implementation.
Since we don't need to touch the lines for the typing, you could do so in a follow-up PR.
FYI @pmeier, I tried and unfortunately, on my end, it's still throwing the same error :/
Hi @frgfm! Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have you on file. In order for us to review and merge your code, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA. If you have received this in error or have any questions, please contact us at [email protected]. Thanks!
This checks fine:

```python
from torch import nn

class DenseLayer(nn.Module):
    def __init__(self, num_features: int):
        super().__init__()
        self.norm1: nn.BatchNorm2d
        self.add_module("norm1", nn.BatchNorm2d(num_features))
```

Could you compile a minimal example for the error you are seeing?
OK, I think I found why I'm still getting errors @pmeier: mypy seems to handle the commas at the end of the lines quite badly. The error is gone if I remove them, but stays if I leave them. Should I remove the commas?
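(For context, a hypothetical sketch of the pattern under discussion — the real densenet lines may differ. The trailing commas are the artifact of a previous implementation mentioned above; in Python, a trailing comma turns the statement into a one-element tuple expression rather than a bare call, which is plausibly what trips up mypy:)

```python
from torch import nn

class DenseLayer(nn.Module):
    def __init__(self, num_input_features: int) -> None:
        super().__init__()
        self.norm1: nn.BatchNorm2d
        # The trailing comma below makes this line evaluate to the tuple
        # (self.add_module('norm1', ...),) instead of a plain call;
        # per this thread, removing it made the mypy error go away.
        self.add_module('norm1', nn.BatchNorm2d(num_input_features)),
```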
Sure. I only suggested otherwise before, since I thought they were unrelated to this PR. If that is not the case, we can act on this.
Two nitpicks. Otherwise LGTM!
torchvision/models/densenet.py (Outdated)

```diff
@@ -114,7 +130,7 @@ def forward(self, init_features):

 class _Transition(nn.Sequential):
-    def __init__(self, num_input_features, num_output_features):
+    def __init__(self, num_input_features: int, num_output_features: int):
```
```diff
-    def __init__(self, num_input_features: int, num_output_features: int):
+    def __init__(self, num_input_features: int, num_output_features: int) -> None:
```
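(Context for the suggestion: PEP 484 recommends annotating `__init__` with an explicit `-> None` return type; without it, stricter mypy settings such as `--disallow-incomplete-defs` flag the signature as incomplete. A minimal sketch:)

```python
from torch import nn

class _Transition(nn.Sequential):
    # With `-> None`, mypy treats __init__ as fully annotated;
    # the argument annotations alone leave the signature incomplete
    # under strict checking.
    def __init__(self, num_input_features: int, num_output_features: int) -> None:
        super().__init__()
        self.add_module('norm', nn.BatchNorm2d(num_input_features))
```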
torchvision/models/densenet.py (Outdated)

```python
def _densenet(arch: str, growth_rate: int, block_config: Tuple[int, int, int, int], num_init_features: int,
              pretrained: bool, progress: bool, **kwargs: Any) -> DenseNet:
```
Could you format this with `black`?
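(Roughly what `black` would produce for this signature — a sketch, since the exact output depends on the configured line length:)

```python
def _densenet(
    arch: str,
    growth_rate: int,
    block_config: Tuple[int, int, int, int],
    num_init_features: int,
    pretrained: bool,
    progress: bool,
    **kwargs: Any,
) -> DenseNet:
    ...
```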
@frgfm As of late, you are required to sign the CLA as detailed in #2860 (comment). Without that, the work you have done cannot be committed into our repository.
@pmeier done ✔️
Thanks a lot!
```python
self.norm1: nn.BatchNorm2d
self.add_module('norm1', nn.BatchNorm2d(num_input_features))
self.relu1: nn.ReLU
self.add_module('relu1', nn.ReLU(inplace=True))
self.conv1: nn.Conv2d
```
Note for the future: we should just use `self.norm1 = nn.BatchNorm2d(...)` instead of the `add_module` alternative, as the names are not dynamic here.
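(A sketch of the two equivalent styles — `nn.Module.__setattr__` registers submodules on direct assignment, so `add_module` is only needed when the attribute name is computed at runtime:)

```python
from torch import nn

class _DenseLayer(nn.Module):
    def __init__(self, num_input_features: int) -> None:
        super().__init__()
        # Direct assignment: registers the submodule and lets mypy
        # infer the attribute type without a separate annotation.
        self.norm1 = nn.BatchNorm2d(num_input_features)
        self.relu1 = nn.ReLU(inplace=True)

        # add_module: equivalent at runtime, but mypy needs an
        # explicit annotation since the assignment is dynamic.
        self.norm2: nn.BatchNorm2d
        self.add_module('norm2', nn.BatchNorm2d(num_input_features))
```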
Just waiting on the CLA to get propagated and will merge the PR
* style: Added annotation typing for densenet
* fix: Fixed import
* refactor: Removed un-necessary import
* fix: Fixed constructor typing
* chore: Updated mypy.ini
* fix: Fixed tuple typing
* style: Ignored some mypy errors
* style: Fixed typing
* fix: Added missing constructor typing
Hi there!

As per #2025, annotation typing is welcome in torchvision, so this PR focuses on `torchvision.models.densenet`. Any feedback on this PR is welcome!