
Added annotation typing to densenet #2860


Merged
merged 9 commits into pytorch:master from frgfm:densenet-typing on Nov 6, 2020

Conversation

@frgfm frgfm (Contributor) commented Oct 21, 2020

Hi there!

As per #2025, annotation typing is welcome in torchvision, so this PR focuses on torchvision.models.densenet!

Any feedback is welcome!

@pmeier pmeier self-assigned this Oct 21, 2020
@pmeier pmeier (Collaborator) commented Oct 22, 2020

@frgfm Before I review, could you enable mypy by removing

vision/mypy.ini

Lines 19 to 21 in 65591f1

[mypy-torchvision.models.densenet.*]
ignore_errors = True

I remember I deactivated it in the first step because there were already errors present.

@frgfm frgfm (Contributor, Author) commented Oct 22, 2020

> @frgfm Before I review, could you enable mypy by removing
>
> vision/mypy.ini
>
> Lines 19 to 21 in 65591f1
>
> [mypy-torchvision.models.densenet.*]
> ignore_errors = True
>
> I remember I deactivated it in the first step because there were already errors present.

Sure, just did it!

@fmassa fmassa (Member) left a comment

Thanks for the PR!

mypy is complaining because of Tuple[int], could you have a look?

Otherwise looks good to me!
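
For context, Tuple[int] is the type of a tuple with exactly one int. DenseNet's block config has four stages, so the fix is presumably to spell them out, or to use Tuple[int, ...] for a variable-length tuple. A minimal sketch (the values are densenet121's block sizes):

from typing import Tuple

# Tuple[int] would only accept a 1-tuple; four stages need four slots
block_config: Tuple[int, int, int, int] = (6, 12, 24, 16)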

@frgfm frgfm (Contributor, Author) commented Oct 23, 2020

I'm still getting some errors, and I'm not sure how you'd prefer to handle them:

torchvision/models/densenet.py:32: error: "add_module" of "Module" does not return a value  [func-returns-value]
            self.add_module('norm1', nn.BatchNorm2d(num_input_features)),
            ^
torchvision/models/densenet.py:33: error: "add_module" of "Module" does not return a value  [func-returns-value]
            self.add_module('relu1', nn.ReLU(inplace=True)),
            ^
torchvision/models/densenet.py:34: error: "add_module" of "Module" does not return a value  [func-returns-value]
            self.add_module('conv1', nn.Conv2d(num_input_features, bn_size *
            ^
torchvision/models/densenet.py:37: error: "add_module" of "Module" does not return a value  [func-returns-value]
            self.add_module('norm2', nn.BatchNorm2d(bn_size * growth_rate)),
            ^
torchvision/models/densenet.py:38: error: "add_module" of "Module" does not return a value  [func-returns-value]
            self.add_module('relu2', nn.ReLU(inplace=True)),
            ^
torchvision/models/densenet.py:39: error: "add_module" of "Module" does not return a value  [func-returns-value]
            self.add_module('conv2', nn.Conv2d(bn_size * growth_rate, growth_rate,
            ^
torchvision/models/densenet.py:47: error: "Tensor" not callable  [operator]
            bottleneck_output = self.conv1(self.relu1(self.norm1(concated_features)))  # noqa: ...
                                ^

We could assign the modules as attributes directly, but that would mean extra changes. I'd need a suggestion for a minimal change on this one 😅
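
For context, a sketch of why mypy reports "Tensor" not callable here: attributes registered via add_module are only resolved at runtime through nn.Module.__getattr__, whose return annotation (in the stubs of this era) is Union[Tensor, Module], so a static checker cannot know self.conv1 is a callable module. The class below is a hypothetical minimal reproduction:

from torch import nn, Tensor

class Layer(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.add_module("conv1", nn.Conv2d(3, 8, kernel_size=1))

    def forward(self, x: Tensor) -> Tensor:
        # mypy resolves self.conv1 via nn.Module.__getattr__, which is
        # annotated as returning Union[Tensor, Module], hence the error
        return self.conv1(x)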

@fmassa fmassa (Member) commented Oct 23, 2020

Maybe @pmeier knows of a trick, but I would prefer to disable mypy in this block instead of rewriting the code.

@pmeier pmeier (Collaborator) commented Oct 23, 2020

We will always face issues if we dynamically assign attributes, since mypy is a static type checker. The only thing we can do here is to add lines that annotate this statically:

self.norm1: nn.BatchNorm2d
self.add_module('norm1', nn.BatchNorm2d(num_input_features))

@frgfm frgfm (Contributor, Author) commented Oct 23, 2020

Alright, I'll change that then!
Quick question: is there any specific reason why there are commas at the end of each self.add_module line?
Should I remove them?

@pmeier pmeier (Collaborator) commented Oct 23, 2020

> Quick question: is there any specific reason why there are commas at the end of each self.add_module line?

Probably an artifact from a previous implementation.

> Should I remove them?

Since we don't need to touch the lines for the typing, you could do so in a follow-up PR.

@frgfm frgfm (Contributor, Author) commented Oct 25, 2020

> We will always face issues if we dynamically assign attributes, since mypy is a static type checker. The only thing we can do here is to add lines that annotate this statically:
>
> self.norm1: nn.BatchNorm2d
> self.add_module('norm1', nn.BatchNorm2d(num_input_features))

FYI @pmeier, I tried, and unfortunately it's still throwing the same error on my end :/

@facebook-github-bot commented

Hi @frgfm!

Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have you on file.

In order for us to review and merge your code, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!

@frgfm frgfm (Contributor, Author) commented Nov 4, 2020

Any suggestions about this error, @pmeier? The same thing goes for #2864.

@pmeier pmeier (Collaborator) commented Nov 4, 2020

@frgfm

This checks fine:

from torch import nn


class DenseLayer(nn.Module):
    def __init__(self, num_features: int):
        super().__init__()
        self.norm1: nn.BatchNorm2d
        self.add_module("norm1", nn.BatchNorm2d(num_features))

Could you compile a minimal example for the error you are seeing?

@frgfm frgfm (Contributor, Author) commented Nov 4, 2020

OK, I think I found why I'm still getting errors, @pmeier: mypy seems to take the commas at the end of the lines quite badly. The error is gone if I remove them, but stays if I leave them. Should I remove the commas?
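
For readers following along, the likely mechanics (a sketch, with a hypothetical _Demo class): a trailing comma turns the whole statement into a one-element tuple expression, so the None returned by add_module is "used" as a tuple element, which is exactly what [func-returns-value] flags:

from torch import nn

class _Demo(nn.Module):
    def __init__(self, num_input_features: int) -> None:
        super().__init__()
        self.norm1: nn.BatchNorm2d
        self.norm2: nn.BatchNorm2d
        # Trailing comma: the statement is the tuple (None,), so the None
        # return value of add_module is "used" -> [func-returns-value]
        self.add_module('norm1', nn.BatchNorm2d(num_input_features)),
        # No trailing comma: plain call, result discarded -> no error
        self.add_module('norm2', nn.BatchNorm2d(num_input_features))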

@pmeier pmeier (Collaborator) commented Nov 4, 2020

> Should I remove the commas?

Sure. I only suggested otherwise before, since I thought they were unrelated to this PR. As that is not the case, we can act on this.

@pmeier pmeier (Collaborator) left a comment

Two nitpicks. Otherwise LGTM!

@@ -114,7 +130,7 @@ def forward(self, init_features):


 class _Transition(nn.Sequential):
-    def __init__(self, num_input_features, num_output_features):
+    def __init__(self, num_input_features: int, num_output_features: int):

Suggested change
-    def __init__(self, num_input_features: int, num_output_features: int):
+    def __init__(self, num_input_features: int, num_output_features: int) -> None:

Comment on lines 242 to 243
def _densenet(arch: str, growth_rate: int, block_config: Tuple[int, int, int, int], num_init_features: int,
              pretrained: bool, progress: bool, **kwargs: Any) -> DenseNet:

Could you format this with black?
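
For reference, black would put one parameter per line here, since the original line exceeds its default length limit. A sketch of the likely output (black's exact result may differ in details):

from typing import Any, Tuple

from torchvision.models.densenet import DenseNet


def _densenet(
    arch: str,
    growth_rate: int,
    block_config: Tuple[int, int, int, int],
    num_init_features: int,
    pretrained: bool,
    progress: bool,
    **kwargs: Any,
) -> DenseNet:
    ...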

@pmeier pmeier requested a review from fmassa November 5, 2020 09:09
@pmeier pmeier (Collaborator) commented Nov 5, 2020

@frgfm As of late, you are required to sign the CLA, as detailed in #2860 (comment). Without that, the work you have done cannot be merged into our repository.

@frgfm frgfm (Contributor, Author) commented Nov 6, 2020

@pmeier done ✔️

@fmassa fmassa (Member) left a comment

Thanks a lot!

Comment on lines +32 to +36
self.norm1: nn.BatchNorm2d
self.add_module('norm1', nn.BatchNorm2d(num_input_features))
self.relu1: nn.ReLU
self.add_module('relu1', nn.ReLU(inplace=True))
self.conv1: nn.Conv2d

note for the future: we should just use

self.norm1 = nn.BatchNorm2d(...)

instead of the add_module alternative, as the names are not dynamic here.
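
A sketch of that alternative for the first block, as a hypothetical rewrite (not part of this PR; the Conv2d arguments mirror the existing densenet code):

from torch import nn

class _DenseLayer(nn.Module):
    def __init__(self, num_input_features: int, growth_rate: int, bn_size: int) -> None:
        super().__init__()
        # Plain attribute assignment registers submodules just like
        # add_module, and mypy infers the types without extra annotations
        self.norm1 = nn.BatchNorm2d(num_input_features)
        self.relu1 = nn.ReLU(inplace=True)
        self.conv1 = nn.Conv2d(num_input_features, bn_size * growth_rate,
                               kernel_size=1, stride=1, bias=False)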

@fmassa fmassa (Member) commented Nov 6, 2020

Just waiting on the CLA to get propagated, then I'll merge the PR.

@fmassa fmassa merged commit a8d8496 into pytorch:master Nov 6, 2020
@frgfm frgfm deleted the densenet-typing branch November 19, 2020 08:09
bryant1410 pushed a commit to bryant1410/vision-1 that referenced this pull request Nov 22, 2020
* style: Added annotation typing for densenet

* fix: Fixed import

* refactor: Removed un-necessary import

* fix: Fixed constructor typing

* chore: Updated mypy.ini

* fix: Fixed tuple typing

* style: Ignored some mypy errors

* style: Fixed typing

* fix: Added missing constructor typing
vfdev-5 pushed a commit to Quansight/vision that referenced this pull request Dec 4, 2020
@frgfm frgfm mentioned this pull request Oct 26, 2022