[fix] Dropout bug fix #247
@@ -38,10 +38,14 @@ def build_backbone(self, input_shape: Tuple[int, ...]) -> None:
         self.config.update(
             {"num_units_%d" % (i): num for i, num in enumerate(neuron_counts)}
         )
-        if self.config['use_dropout'] and self.config["max_dropout"] > 0.05:
+        # We skip the last layer: get_shaped_neuron_counts is built for getting
+        # neuron counts, so it adds the out_features as the last entry. For
+        # dropout we don't want that; we only need the shape and can ignore
+        # the output layer.
+        if self.config['use_dropout']:
             dropout_shape = get_shaped_neuron_counts(
-                self.config['resnet_shape'], 0, 0, 1000, self.config['num_groups']
-            )
+                self.config['resnet_shape'], 0, 0, 1000, self.config['num_groups'] + 1
+            )[:-1]

             dropout_shape = [
                 dropout / 1000 * self.config["max_dropout"] for dropout in dropout_shape
             ]

ravinkohli marked this conversation as resolved.

Reviewer: Hi, I did not immediately understand this, but the comment is very clear and I get the intention. Can we add a unit test for this? So that we call …

Author: sure
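To make the off-by-one concrete, here is a minimal, runnable sketch. The funnel-shaped `shaped_counts` helper below is a hypothetical stand-in for `get_shaped_neuron_counts` (whose real implementation is not shown in this diff); the point it illustrates is that requesting `num_groups + 1` counts and slicing off the trailing `out_features` entry leaves exactly one dropout value per residual group.

```python
from typing import List

def shaped_counts(n: int, max_value: int = 1000) -> List[int]:
    """Funnel-shaped stand-in for get_shaped_neuron_counts: n values that
    shrink linearly from max_value; the last entry plays the role of the
    out_features slot that the real helper appends."""
    return [max(1, round(max_value * (n - i) / n)) for i in range(n)]

num_groups, max_dropout = 3, 0.5

# The fix: ask for num_groups + 1 counts, then drop the trailing
# out_features entry so exactly one value per residual group remains.
dropout_shape = shaped_counts(num_groups + 1)[:-1]
assert len(dropout_shape) == num_groups

# Rescale the 0..1000 shape into per-group dropout probabilities.
dropout_per_group = [d / 1000 * max_dropout for d in dropout_shape]
print(dropout_per_group)  # e.g. [0.5, 0.375, 0.25], one value per group
```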
@@ -61,7 +65,7 @@ def build_backbone(self, input_shape: Tuple[int, ...]) -> None:
                     out_features=self.config["num_units_%d" % i],
                     blocks_per_group=self.config["blocks_per_group"],
                     last_block_index=(i - 1) * self.config["blocks_per_group"],
-                    dropout=self.config['use_dropout']
+                    dropout=self.config[f'dropout_{i}'] if self.config['use_dropout'] else None
                 )
             )
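This second hunk matters because the old code passed the `use_dropout` boolean itself as the dropout probability. Since `bool` is an `int` subclass in Python, `nn.Dropout(True)` passes the `0 <= p <= 1` check as `p == 1` and zeroes every activation during training. A standalone illustration in plain PyTorch, not the project's code:

```python
import torch
import torch.nn as nn

use_dropout = True

# The buggy pattern: p=True is accepted because True == 1, so in training
# mode this layer drops every unit and the network learns nothing.
buggy = nn.Dropout(use_dropout)
buggy.train()
print(buggy(torch.randn(2, 4)))  # all zeros: PyTorch drops everything at p == 1

# The fixed pattern forwards the tuned per-group value from the config.
fixed = nn.Dropout(0.123)
print(fixed.p)  # 0.123
```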
Reviewer: Can you please add a unit test where you create a pipeline with a config with dropout=0.123, and just check that the dropout of the layer is set to 0.123, to make sure this does not happen again?

Author: sure
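A sketch of the test the reviewer asks for. It is written against a tiny hypothetical stand-in (`build_stand_in_backbone`) rather than autoPyTorch's real `ResNetBackbone`, whose constructor and config schema are not shown in this diff; the assertion is the one that matters, namely that the configured value of 0.123, and not the `use_dropout` boolean, reaches the `nn.Dropout` modules.

```python
import torch.nn as nn

def build_stand_in_backbone(config: dict) -> nn.Module:
    """Stand-in for build_backbone that wires dropout the way the fixed
    diff does: per-group config values, only when use_dropout is set."""
    layers = []
    for i in range(1, config['num_groups'] + 1):
        layers.append(nn.Linear(16, 16))
        if config['use_dropout']:
            layers.append(nn.Dropout(config[f'dropout_{i}']))
    return nn.Sequential(*layers)

def test_dropout_value_is_forwarded() -> None:
    config = {'num_groups': 2, 'use_dropout': True,
              'dropout_1': 0.123, 'dropout_2': 0.123}
    backbone = build_stand_in_backbone(config)
    dropout_ps = [m.p for m in backbone.modules() if isinstance(m, nn.Dropout)]
    # The configured value, not the use_dropout boolean, must reach nn.Dropout.
    assert dropout_ps == [0.123, 0.123]

test_dropout_value_is_forwarded()
```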