
Commit 5483674: Fixes typos and references (#657)
1 parent: feb1230

19 files changed: +98 / -70 lines

docs/backend.rst (1 addition, 1 deletion)

@@ -3,7 +3,7 @@
 Backend selection and use
 =========================
 
-`tslearn` proposes different backends (`NumPy` and `PyTorch`)
+``tslearn`` proposes different backends (`NumPy` and `PyTorch`)
 to compute time series metrics such as `DTW` and `Soft-DTW`.
 The `PyTorch` backend can be used to compute gradients of
 metric functions thanks to automatic differentiation. The `PyTorch`

docs/conf.py (0 additions, 1 deletion)

@@ -96,7 +96,6 @@ def matplotlib_svg_scraper(*args, **kwargs):
 
 # General information about the project.
 project = u'tslearn'
-copyright = u'2025, Romain Tavenard'
 
 # The version info for the project you're documenting, acts as replacement for
 # |version| and |release|, also used in various other places throughout the

docs/examples/classification/plot_early_classification.py (3 additions, 3 deletions)

@@ -7,11 +7,11 @@
 
 Early classifiers are implemented in the
 :mod:`tslearn.early_classification` module and in this example
-we use the method from [1].
+we use the method from [1]_.
 
 
-[1] A. Dachraoui, A. Bondu & A. Cornuejols. Early classification of time
-series as a non myopic sequential decision making problem. ECML/PKDD 2015
+.. [1] A. Dachraoui, A. Bondu & A. Cornuejols. Early classification of time
+    series as a non myopic sequential decision making problem. ECML/PKDD 2015
 """
 
 # Author: Romain Tavenard
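The recurring fix in this commit converts plain bracketed references such as `[1]` into reStructuredText citation references. A minimal sketch of the convention (author and title here are illustrative, not from the commit): the inline `[1]_` marker must have a matching `.. [1]` citation target, with continuation lines indented, for Sphinx to render a hyperlink instead of literal text.

```rst
This example uses the method from [1]_.

.. [1] A. Author and B. Author, "An illustrative paper title",
    Some Venue, 2000.
```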

docs/examples/classification/plot_svm.py (3 additions, 2 deletions)

@@ -7,13 +7,14 @@
 support vector classification.
 
 This metric is defined in the :ref:`tslearn.metrics <mod-metrics>` module and
-explained in details in [1].
+explained in details in [1]_.
 
 In this example, a `TimeSeriesSVC` model that uses GAK as kernel is fit and the
 support vectors for each class are reported.
 
 
-[1] M. Cuturi, "Fast global alignment kernels," ICML 2011.
+.. [1] M. Cuturi, "Fast global alignment kernels",
+    ICML 2011.
 """
 
 # Author: Romain Tavenard
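The Global Alignment kernel cited above sums a local Gaussian kernel over all alignments of two series, computable by dynamic programming. Below is a minimal pure-Python sketch of that recursion, not tslearn's implementation: it uses a plain Gaussian local kernel and omits the diagonal-weighting refinements of Cuturi's paper; the function name `gak` and the `sigma` parameter are illustrative.

```python
import math


def gak(x, y, sigma=1.0):
    """Unnormalized Global Alignment kernel between two univariate time
    series (simplified sketch of Cuturi, ICML 2011)."""
    n, m = len(x), len(y)
    # M[i][j] accumulates the kernel mass of all alignments of x[:i] with y[:j].
    M = [[0.0] * (m + 1) for _ in range(n + 1)]
    M[0][0] = 1.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Gaussian local kernel between the two aligned samples.
            local = math.exp(-((x[i - 1] - y[j - 1]) ** 2) / (2.0 * sigma ** 2))
            # Sum over the three possible predecessor alignments.
            M[i][j] = (M[i - 1][j] + M[i][j - 1] + M[i - 1][j - 1]) * local
    return M[n][m]
```

By construction the recursion is symmetric in its arguments, and close series accumulate far more alignment mass than distant ones.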

docs/examples/clustering/plot_barycenter_interpolate.py (3 additions, 3 deletions)

@@ -5,7 +5,7 @@
 
 This example presents the weighted Soft-DTW time series barycenter method.
 
-Soft-DTW [1] is a differentiable loss function for Dynamic Time Warping,
+Soft-DTW [1]_ is a differentiable loss function for Dynamic Time Warping,
 allowing for the use of gradient-based algorithms. The barycenter corresponds
 to the time series that minimizes the sum of the distances between that time
 series and all the time series from a dataset. It is thus an optimization

@@ -18,8 +18,8 @@
 time series the barycenter is, the higher the weight for this time series
 is.
 
-[1] M. Cuturi and M. Blondel, "Soft-DTW: a Differentiable Loss Function for
-Time-Series". International Conference on Machine Learning, 2017.
+.. [1] M. Cuturi and M. Blondel, "Soft-DTW: a Differentiable Loss Function for
+    Time-Series". International Conference on Machine Learning, 2017.
 """
 
 # Author: Romain Tavenard

docs/examples/clustering/plot_barycenters.py (11 additions, 11 deletions)

@@ -18,25 +18,25 @@
 * *DTW Barycenter Averaging (DBA)* is an iteratively refined barycenter,
   starting out with a (potentially) bad candidate and improving it
   until convergence criteria are met. The optimization can be accomplished
-  with (a) expectation-maximization [1] and (b) stochastic subgradient
-  descent [2]. Empirically, the latter "is [often] more stable and finds better
-  solutions in shorter time" [2].
+  with (a) expectation-maximization [1]_ and (b) stochastic subgradient
+  descent [2]_. Empirically, the latter "is [often] more stable and finds better
+  solutions in shorter time" [2]_.
 * *Soft-DTW barycenter* uses a differentiable loss function to iteratively
-  find a barycenter [3]. The method itself and the parameter
+  find a barycenter [3]_. The method itself and the parameter
   :math:`\\gamma=1.0` is described in more detail in the section on
   :ref:`DTW<dtw>`. There is also a dedicated
   :ref:`example<sphx_glr_auto_examples_clustering_plot_barycenter_interpolate.py>`
   available.
 
-[1] F. Petitjean, A. Ketterlin & P. Gancarski. A global averaging method for
-dynamic time warping, with applications to clustering. Pattern Recognition,
-Elsevier, 2011, Vol. 44, Num. 3, pp. 678-693.
+.. [1] F. Petitjean, A. Ketterlin & P. Gancarski. A global averaging method for
+    dynamic time warping, with applications to clustering. Pattern Recognition,
+    Elsevier, 2011, Vol. 44, Num. 3, pp. 678-693.
 
-[2] D. Schultz & B. Jain. Nonsmooth Analysis and Subgradient Methods for
-Averaging in Dynamic Time Warping Spaces. Pattern Recognition, 74, 340-358.
+.. [2] D. Schultz & B. Jain. Nonsmooth Analysis and Subgradient Methods for
+    Averaging in Dynamic Time Warping Spaces. Pattern Recognition, 74, 340-358.
 
-[3] M. Cuturi & M. Blondel. Soft-DTW: a Differentiable Loss Function for
-Time-Series. ICML 2017.
+.. [3] M. Cuturi & M. Blondel. Soft-DTW: a Differentiable Loss Function for
+    Time-Series. ICML 2017.
 """
 
 # Author: Romain Tavenard, Felix Divo

docs/examples/clustering/plot_kernel_kmeans.py (6 additions, 5 deletions)

@@ -3,19 +3,20 @@
 Kernel k-means
 ==============
 
-This example uses Global Alignment kernel (GAK, [1]) at the core of a kernel
-:math:`k`-means algorithm [2] to perform time series clustering.
+This example uses Global Alignment kernel (GAK, [1]_) at the core of a kernel
+:math:`k`-means algorithm [2]_ to perform time series clustering.
 
 Note that, contrary to :math:`k`-means, a centroid cannot be computed when
 using kernel :math:`k`-means. However, one can still report cluster
 assignments, which is what is provided here: each subfigure represents the set
 of time series from the training set that were assigned to the considered
 cluster.
 
-[1] M. Cuturi, "Fast global alignment kernels," ICML 2011.
+.. [1] M. Cuturi, "Fast global alignment kernels",
+    ICML 2011.
 
-[2] I. S. Dhillon, Y. Guan, B. Kulis. Kernel k-means, Spectral Clustering and \
-Normalized Cuts. KDD 2004.
+.. [2] I. S. Dhillon, Y. Guan, B. Kulis. Kernel k-means, Spectral Clustering and
+    Normalized Cuts. KDD 2004.
 """
 
 # Author: Romain Tavenard

docs/examples/clustering/plot_kmeans.py (7 additions, 7 deletions)

@@ -6,8 +6,8 @@
 This example uses :math:`k`-means clustering for time series. Three variants of
 the algorithm are available: standard
 Euclidean :math:`k`-means, DBA-:math:`k`-means (for DTW Barycenter
-Averaging [1])
-and Soft-DTW :math:`k`-means [2].
+Averaging [1]_)
+and Soft-DTW :math:`k`-means [2]_.
 
 In the figure below, each row corresponds to the result of a different
 clustering. In a row, each sub-figure corresponds to a cluster.

@@ -29,11 +29,11 @@
 time series is scaled independently and there is hence no such thing as an
 overall data range.
 
-[1] F. Petitjean, A. Ketterlin & P. Gancarski. A global averaging method \
-for dynamic time warping, with applications to clustering. Pattern \
-Recognition, Elsevier, 2011, Vol. 44, Num. 3, pp. 678-693
-[2] M. Cuturi, M. Blondel "Soft-DTW: a Differentiable Loss Function for \
-Time-Series," ICML 2017.
+.. [1] F. Petitjean, A. Ketterlin & P. Gancarski. A global averaging method
+    for dynamic time warping, with applications to clustering. Pattern
+    Recognition, Elsevier, 2011, Vol. 44, Num. 3, pp. 678-693
+.. [2] M. Cuturi, M. Blondel "Soft-DTW: a Differentiable Loss Function for
+    Time-Series", ICML 2017.
 """
 
 # Author: Romain Tavenard

docs/examples/clustering/plot_kshape.py (3 additions, 3 deletions)

@@ -3,12 +3,12 @@
 KShape
 ======
 
-This example uses the KShape clustering method [1] that is based on
+This example uses the KShape clustering method [1]_ that is based on
 cross-correlation to cluster time series.
 
 
-[1] J. Paparrizos & L. Gravano. k-Shape: Efficient and Accurate Clustering \
-of Time Series. SIGMOD 2015. pp. 1855-1870.
+.. [1] J. Paparrizos & L. Gravano. k-Shape: Efficient and Accurate Clustering
+    of Time Series. SIGMOD 2015. pp. 1855-1870.
 """
 
 # Author: Romain Tavenard

docs/examples/metrics/plot_lb_keogh.py (4 additions, 4 deletions)

@@ -4,7 +4,7 @@
 ========
 
 This example illustrates the principle of time series envelope and its
-relationship to the "LB_Keogh" lower bound [1].
+relationship to the "LB_Keogh" lower bound [1]_.
 
 The envelope of a time series consists of two time series such that the
 original time series is between the two time series. Denoting the original

@@ -21,7 +21,7 @@
 
 where :math:`r` is the radius of the envelope.
 
-The distance between a time series $Q$ and an envelope :math:`(L, U)` is
+The distance between a time series :math:`Q` and an envelope :math:`(L, U)` is
 defined as:
 
 .. math::

@@ -36,8 +36,8 @@
 
 So it is simply the Euclidean distance between :math:`Q` and the envelope.
 
-[1] E. Keogh and C. A. Ratanamahatana, "Exact indexing of dynamic time
-warping". Knowledge and Information Systems, 7(3), 358-386 (2004).
+.. [1] E. Keogh and C. A. Ratanamahatana, "Exact indexing of dynamic time
+    warping". Knowledge and Information Systems, 7(3), 358-386 (2004).
 """
 
 # Author: Romain Tavenard
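The envelope and LB_Keogh definitions referenced in this docstring can be sketched directly in pure Python. This is an illustrative implementation of the formulas as stated (function names `envelope` and `lb_keogh` are ours, not tslearn's API): the envelope takes the windowed min/max with radius `r`, and the bound is the Euclidean distance from the query to wherever it leaves the envelope.

```python
import math


def envelope(c, r):
    """Lower/upper envelope of series c with radius r:
    L_i = min(c[i-r..i+r]), U_i = max(c[i-r..i+r])."""
    n = len(c)
    lower, upper = [], []
    for i in range(n):
        window = c[max(0, i - r):min(n, i + r + 1)]
        lower.append(min(window))
        upper.append(max(window))
    return lower, upper


def lb_keogh(q, lower, upper):
    """LB_Keogh lower bound (Keogh & Ratanamahatana, 2004): Euclidean
    distance from q to the envelope; zero wherever q lies inside it."""
    total = 0.0
    for qi, li, ui in zip(q, lower, upper):
        if qi > ui:
            total += (qi - ui) ** 2
        elif qi < li:
            total += (qi - li) ** 2
    return math.sqrt(total)
```

A series always has zero LB_Keogh distance to its own envelope, which is what makes the bound useful for pruning candidates before computing full DTW.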
