
Bug in 'cotcurv' mesh_laplacian_smoothing #551

Closed · @shubham-goel

Description

🐛 Bugs / Unexpected behaviors

The cotcurv method of the mesh_laplacian_smoothing function is supposed to compute the loss as:
\sum_j w_{ij} (u_j - u_i) / (4 A_i)
where w_{ij} = cot a_{ij} + cot b_{ij}, following reference [1] in the docs.
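Typeset, writing N(i) for the one-ring neighbours of vertex i, u for the vertex positions, and A_i for the per-vertex area term used for normalization:

$$
\mathcal{L}_i \;=\; \frac{\sum_{j \in N(i)} w_{ij}\,(u_j - u_i)}{4\,A_i},
\qquad
w_{ij} \;=\; \cot a_{ij} + \cot b_{ij}
$$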

However, the implementation does not follow this; it computes the loss as:
(\sum_j w_{ij} u_j - u_i) / (4 A_i)

```python
# current implementation: subtracts the bare vertex, not the weight-scaled vertex
loss = (L.mm(verts_packed) - verts_packed) * norm_w
```

These two aren't equivalent, because \sum_j w_{ij} does not sum to 1.
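A quick way to check this claim — a minimal sketch, assuming pytorch3d is installed. It rebuilds the cotangent weights from scratch using the standard formula rather than pytorch3d's internal helper, so the names `W` and `row_sums` are local to the sketch:

```python
import torch
from pytorch3d.utils import ico_sphere

mesh = ico_sphere(1)
verts = mesh.verts_packed()  # (V, 3) vertex positions
faces = mesh.faces_packed()  # (F, 3) vertex indices

# Edge lengths opposite each corner: a opposite v0, b opposite v1, c opposite v2.
v0, v1, v2 = verts[faces[:, 0]], verts[faces[:, 1]], verts[faces[:, 2]]
a = (v1 - v2).norm(dim=1)
b = (v0 - v2).norm(dim=1)
c = (v0 - v1).norm(dim=1)

# Triangle areas via Heron's formula.
s = 0.5 * (a + b + c)
area = (s * (s - a) * (s - b) * (s - c)).clamp(min=1e-12).sqrt()

# Cotangent of the angle at each corner: cot(A) = (b^2 + c^2 - a^2) / (4 * area).
cot0 = (b * b + c * c - a * a) / (4.0 * area)
cot1 = (a * a + c * c - b * b) / (4.0 * area)
cot2 = (a * a + b * b - c * c) / (4.0 * area)

# Each corner's cotangent is the weight of the opposite edge; accumulating over
# the two faces sharing an edge gives w_ij = cot a_ij + cot b_ij.
V = verts.shape[0]
ii = torch.cat([faces[:, 1], faces[:, 0], faces[:, 0]])
jj = torch.cat([faces[:, 2], faces[:, 2], faces[:, 1]])
w = torch.cat([cot0, cot1, cot2])
idx = torch.stack([torch.cat([ii, jj]), torch.cat([jj, ii])])
W = torch.sparse_coo_tensor(idx, torch.cat([w, w]), (V, V)).coalesce()

row_sums = torch.sparse.sum(W, dim=1).to_dense()
print(row_sums.min().item(), row_sums.max().item())  # far from 1.0
```

The printed row sums are nowhere near 1, so replacing \sum_j w_{ij} u_i with u_i changes the result.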

Instructions To Reproduce the Issue:

The effect of this bug is evident when you compute the cotcurv Laplacian loss for ico-spheres of different levels. Since the 'cotcurv' Laplacian is supposed to capture mean curvature, we expect the loss to be (approximately) the same across ico-sphere levels. However, that is not the case:

```python
from pytorch3d.loss import mesh_laplacian_smoothing
from pytorch3d.utils import ico_sphere

for i in range(5):
    print(mesh_laplacian_smoothing(ico_sphere(i), 'cotcurv'))

# Output:
# tensor(0.1652)
# tensor(1.3907)
# tensor(6.1977)
# tensor(25.4255)
# tensor(102.4149)
```

The implementation can be fixed as:

```python
# Scale each vertex by the row-sum of its cotangent weights before subtracting,
# so that L.mm(verts_packed) - L_sum * verts_packed == \sum_j w_ij (u_j - u_i).
L_sum = torch.sparse.sum(L, dim=1).to_dense().view(-1, 1)
loss = (L.mm(verts_packed) - L_sum * verts_packed) * norm_w
```
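Why this works: the sparse matrix L holds only the off-diagonal weights w_{ij}, so subtracting each vertex scaled by its row sum reconstructs the difference form that the formula asks for:

$$
(L u)_i - \Big(\sum_j w_{ij}\Big) u_i \;=\; \sum_j w_{ij}\, u_j - \sum_j w_{ij}\, u_i \;=\; \sum_j w_{ij}\,(u_j - u_i)
$$

In other words, L_sum supplies the diagonal of the graph Laplacian that the original expression implicitly replaced with 1.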

Once fixed, the cotcurv Laplacian loss remains the same across ico_sphere levels:

```python
for i in range(5):
    print(mesh_laplacian_smoothing(ico_sphere(i), 'cotcurv'))

# Output:
# tensor(0.3333)
# tensor(0.3354)
# tensor(0.3341)
# tensor(0.3335)
# tensor(0.3334)
```
