
Commit ab16559

Fix two warnings (#676)

- torch.nn.functional.sigmoid is deprecated in favor of torch.sigmoid.
- Clip the cosh input in sechsq to avoid overflow.

1 parent 07a7dcf commit ab16559

File tree

2 files changed: +3 −1 lines changed


thinc/backends/ops.py

Lines changed: 2 additions & 0 deletions

@@ -1001,6 +1001,8 @@ def erf(self, X: FloatsType) -> FloatsType:
         return out

     def sechsq(self, X: FloatsType) -> FloatsType:
+        # Avoid overflow in cosh. Clipping at |20| has an error of 1.7e-17.
+        X = self.xp.clip(X, -20.0, 20.0)
         return (1 / self.xp.cosh(X)) ** 2

     def gelu_approx(self, X: FloatsType, inplace: bool = False) -> FloatsType:
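As a sketch of why the clip is needed: sech(x)^2 is the derivative of tanh, and for large |x| the intermediate cosh(x) overflows even though the final result is effectively zero. Clipping at |20| keeps cosh finite while introducing an error no larger than sech(20)^2 ≈ 1.7e-17, matching the comment in the patch. The standalone NumPy helper below is an illustration under those assumptions, not Thinc's actual Ops method:

```python
import numpy as np

def sechsq(X, xp=np):
    # sech(x)^2 = (1 / cosh(x))^2, the derivative of tanh(x).
    # Clip to [-20, 20] so cosh never overflows; sech(20)^2 ~= 1.7e-17,
    # so the clipped result differs from the true value by at most that much.
    X = xp.clip(X, -20.0, 20.0)
    return (1 / xp.cosh(X)) ** 2

x = np.array([0.0, 5.0, 1000.0], dtype="float32")
out = sechsq(x)  # finite everywhere, no overflow warning; sechsq(0) == 1
```

Without the clip, `np.cosh(np.float32(1000.0))` overflows to inf with a RuntimeWarning, which is exactly the warning the commit silences.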

thinc/tests/backends/test_ops.py

Lines changed: 1 addition & 1 deletion

@@ -62,7 +62,7 @@ def torch_hard_swish_mobilenet(x):
     return torch.nn.functional.hardswish(x)

 def torch_sigmoid(x):
-    return torch.nn.functional.sigmoid(x)
+    return torch.sigmoid(x)

 # https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py#L37
 def torch_gelu_approx(x):
