SimCLR Loss: False Output (Possibly with Correction) #106

@vaydemir3

Description

Hi,

First of all, thanks a lot for sharing your code :) I ran into an issue with a particular input to the SimCLR loss.
Here is the code that produces the incorrect output:

# passing a case where every feature vector is identical
import torch
import torch.nn as nn

from losses import SupConLoss

featTensor = torch.ones((8, 2, 10))  # batch size 8, 2 augmented views, feature dimension 10
featTensor = nn.functional.normalize(featTensor, dim=-1)  # L2-normalize the feature vectors
criterion = SupConLoss(temperature=0.07)
loss = criterion(featTensor)

When every feature vector is identical, the loss should equal log_e(2N-1), where N is the batch size: all pairwise similarities are equal, so for each anchor the softmax over the other 2N-1 samples is uniform, and the negative log-probability of the positive is log_e(2N-1). With N = 8 this is log_e(15) ≈ 2.708.

The code above fails to output log_e(2N-1).
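The expected value can be checked directly without the repository's loss class. Below is a minimal sketch of the plain SimCLR/InfoNCE computation for identical L2-normalized inputs (this is my own reimplementation for verification, not `SupConLoss` itself):

```python
import math

import torch
import torch.nn.functional as F

# With 2N identical L2-normalized vectors, every pairwise cosine
# similarity is 1, so for each anchor the softmax over the other
# 2N-1 samples is uniform and -log p(positive) = log(2N-1).
N = 8
feats = F.normalize(torch.ones(2 * N, 10), dim=-1)  # 2N views stacked
sim = feats @ feats.T                               # every entry == 1
mask = ~torch.eye(2 * N, dtype=torch.bool)          # drop self-similarity
logits = sim[mask].view(2 * N, 2 * N - 1)           # each row: 2N-1 candidates
# the positive is any one of the 2N-1 identical candidates
loss = -torch.log(torch.softmax(logits, dim=1)[:, 0]).mean()
print(float(loss), math.log(2 * N - 1))  # both ≈ 2.708
```

Note that dividing the logits by any temperature leaves the uniform softmax unchanged here, since all entries are equal, so the expected value log(2N-1) is temperature-independent for this degenerate input.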

Correct usage of the code seems to depend on setting both the temperature and base_temperature arguments of the loss function (thanks to my friend @sstojanov):

import torch
import torch.nn as nn
import numpy as np

from losses import SupConLoss

the_answer = np.log(15)  # log(2N - 1) with N = 8
print("the answer we want", the_answer)

featTensor = torch.ones((8, 2, 10))
featTensor = nn.functional.normalize(featTensor, dim=-1)

criterion = SupConLoss(
        temperature=1.0,
        base_temperature=1.0)

loss = criterion(featTensor)
print("the answer we get", loss)
