Hi,
I'm getting an error after adding a new token to dictionary.txt:
```
Error(s) in loading state_dict for LitBTTR:
size mismatch for bttr.decoder.word_embed.0.weight: copying a param with shape torch.Size([113, 256]) from checkpoint, the shape in current model is torch.Size([115, 256]).
size mismatch for bttr.decoder.proj.weight: copying a param with shape torch.Size([113, 256]) from checkpoint, the shape in current model is torch.Size([115, 256]).
size mismatch for bttr.decoder.proj.bias: copying a param with shape torch.Size([113]) from checkpoint, the shape in current model is torch.Size([115]).
```
Could you help me figure out how to fix this error?
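For context, the mismatch arises because adding tokens grows the vocabulary (here from 113 to 115 entries), so the decoder's embedding and projection layers no longer match the checkpoint's shapes. One common workaround is to expand the affected checkpoint tensors before loading, copying the old rows and initializing the new ones. The sketch below is a hedged illustration using plain PyTorch tensor operations, not this repo's actual API; the key names are taken from the error message, and the zero-initialization choice is an assumption:

```python
import torch

def expand_rows(t: torch.Tensor, new_rows: int) -> torch.Tensor:
    """Return a tensor with `new_rows` rows: old rows copied, new rows zero-initialized."""
    out = torch.zeros((new_rows,) + t.shape[1:], dtype=t.dtype)
    out[: t.shape[0]] = t
    return out

# Dummy tensors shaped like the ones in the error message; in practice these
# would come from torch.load(<checkpoint path>)["state_dict"] or similar.
state = {
    "bttr.decoder.word_embed.0.weight": torch.randn(113, 256),
    "bttr.decoder.proj.weight": torch.randn(113, 256),
    "bttr.decoder.proj.bias": torch.randn(113),
}

# Expand each mismatched tensor to the new vocabulary size (115).
for key in list(state):
    state[key] = expand_rows(state[key], 115)

print(state["bttr.decoder.proj.bias"].shape)  # torch.Size([115])
```

After expanding, the modified state_dict should load without shape errors, though the rows for the new tokens carry no trained weights, so some fine-tuning on data containing the new tokens would still be needed.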