One of the dropout layers was "wrongly" inserted.
The original final layers of the Caffe version (https://gist.github.com/ksimonyan/211839e770f7b538e2d8) are:
self.classifier = nn.Sequential(
nn.Linear(512 * 7 * 7, 4096),
nn.ReLU(True),
nn.Dropout(),
nn.Linear(4096, 4096),
nn.ReLU(True),
nn.Dropout(),
nn.Linear(4096, 1000),
)
This won't make a difference when we use model.eval(), but it will cause a discrepancy if we want to finetune VGGNet by loading Caffe's parameters.
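A minimal sketch of why the two layer orderings agree under model.eval() but diverge during finetuning (assuming PyTorch is available; the shapes and p=0.5 here are just illustrative defaults):

```python
import torch
import torch.nn as nn

# nn.Dropout is the identity in eval mode, so moving it around the
# classifier cannot change inference outputs. In train mode it zeroes
# activations and rescales the survivors by 1/(1-p), so the position
# of the dropout relative to each Linear layer matters for gradients.
torch.manual_seed(0)
drop = nn.Dropout(p=0.5)
x = torch.ones(1, 8)

drop.eval()
assert torch.equal(drop(x), x)  # identity: no discrepancy at inference

drop.train()
y = drop(x)
# Surviving activations are scaled by 1/(1-0.5) = 2.0; the rest are zeroed.
assert set(y.unique().tolist()) <= {0.0, 2.0}
```

This is why loading Caffe weights and evaluating looks correct, while finetuning applies dropout to different activations than the original training run did.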