Description
🐛 Bugs / Unexpected behaviors
The `cotcurv` method of the `mesh_laplacian_smoothing` function is supposed to compute the loss as

\sum_j w_{ij} (u_j - u_i) / (4 A[i]), where w_{ij} = \cot(a_{ij}) + \cot(b_{ij}),

following reference [1] in the docs. However, the implementation does not follow this; it computes

(\sum_j w_{ij} u_j - u_i) / (4 A[i])

These two are not equivalent: \sum_j w_{ij} is not 1 in general, so \sum_j w_{ij} u_j - u_i differs from \sum_j w_{ij} (u_j - u_i).
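As a concrete sanity check, here is a minimal sketch. The flat vertex surrounded by six equilateral triangles is a hypothetical configuration, not something taken from the library: each edge around such a vertex is opposite two 60° angles, so every weight is \cot(60°) + \cot(60°) = 2/\sqrt{3}, and the row sum is about 6.93, nowhere near 1.

```python
import math

# Hypothetical illustration: a flat vertex ringed by 6 equilateral triangles.
# Each edge (i, j) is opposite two 60-degree angles, so
# w_ij = cot(60) + cot(60) = 2 / sqrt(3) for every neighbour j.
w_ij = 2.0 / math.sqrt(3.0)
print(sum(w_ij for _ in range(6)))  # ~6.93 -- clearly not 1
```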
Instructions To Reproduce the Issue:
The effect of this bug is evident when you compute the 'cotcurv' laplacian loss for ico-spheres of different levels. Since the 'cotcurv' laplacian is supposed to capture mean curvature, and every ico-sphere level approximates the same unit sphere (whose mean curvature is constant), we expect the loss to be roughly the same at every level. However, that's not the case:
```python
from pytorch3d.loss import mesh_laplacian_smoothing
from pytorch3d.utils import ico_sphere

for i in range(5):
    print(mesh_laplacian_smoothing(ico_sphere(i), 'cotcurv'))
# Output:
# tensor(0.1652)
# tensor(1.3907)
# tensor(6.1977)
# tensor(25.4255)
# tensor(102.4149)
```
The implementation can be fixed inside the 'cotcurv' branch of `mesh_laplacian_smoothing`, using its existing `L`, `verts_packed`, and `norm_w` variables:

```python
# Row sums of the sparse cotangent Laplacian, i.e. sum_j w_ij for each vertex i.
L_sum = torch.sparse.sum(L, dim=1).to_dense().view(-1, 1)
# L.mm(verts_packed) is sum_j w_ij u_j; subtracting L_sum * verts_packed
# turns it into sum_j w_ij (u_j - u_i), matching the formula from [1].
loss = (L.mm(verts_packed) - L_sum * verts_packed) * norm_w
```
Once fixed, the 'cotcurv' laplacian loss remains essentially the same across ico_sphere levels:

```python
for i in range(5):
    print(mesh_laplacian_smoothing(ico_sphere(i), 'cotcurv'))
# Output:
# tensor(0.3333)
# tensor(0.3354)
# tensor(0.3341)
# tensor(0.3335)
# tensor(0.3334)
```
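For anyone who wants to check the corrected behaviour without patching the installed library, here is a minimal standalone sketch. The helper name `cotcurv_loss_fixed` is hypothetical; the sketch assumes `pytorch3d.ops.cot_laplacian` is available (returning the packed cotangent Laplacian and per-vertex inverse areas), that `norm_w = 0.25 * inv_areas` corresponds to the 1/(4 A[i]) factor above, and that the mean per-vertex norm roughly mirrors what `mesh_laplacian_smoothing` returns for a single mesh.

```python
import torch
from pytorch3d.ops import cot_laplacian
from pytorch3d.utils import ico_sphere

def cotcurv_loss_fixed(mesh):
    # Packed vertices (V, 3) and faces (F, 3) of a single mesh.
    verts = mesh.verts_packed()
    faces = mesh.faces_packed()
    # Sparse cotangent Laplacian L (V, V) and inverse vertex areas (V, 1).
    L, inv_areas = cot_laplacian(verts, faces)
    norm_w = 0.25 * inv_areas  # assumed to be the 1 / (4 A[i]) factor
    L_sum = torch.sparse.sum(L, dim=1).to_dense().view(-1, 1)
    # sum_j w_ij (u_j - u_i), scaled by 1 / (4 A[i]).
    lap = (L.mm(verts) - L_sum * verts) * norm_w
    # Mean per-vertex norm, matching the single-mesh reduction of the loss.
    return lap.norm(dim=1).mean()

for i in range(5):
    print(cotcurv_loss_fixed(ico_sphere(i)))  # should hover around 0.33
```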