## Commit 73be250

**Add optional inference params to example (#15)**

*Description of changes:* This PR adds optional inference params such as `num_samples`, `top_k`, etc. to the example in the README for clarity. By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

1 parent: ef786e9

File tree: 1 file changed (+9 / -1 lines)

### README.md (9 additions, 1 deletion)
````diff
@@ -37,6 +37,7 @@ pip install git+https://github.com/amazon-science/chronos-forecasting.git
 A minimal example showing how to perform inference using Chronos models:
 
 ```python
+# for plotting, run: pip install pandas matplotlib
 import matplotlib.pyplot as plt
 import numpy as np
 import pandas as pd
@@ -55,7 +56,14 @@ df = pd.read_csv("https://raw.githubusercontent.com/AileenNielsen/TimeSeriesAnal
 # or a left-padded 2D tensor with batch as the first dimension
 context = torch.tensor(df["#Passengers"])
 prediction_length = 12
-forecast = pipeline.predict(context, prediction_length)  # shape [num_series, num_samples, prediction_length]
+forecast = pipeline.predict(
+    context,
+    prediction_length,
+    num_samples=20,
+    temperature=1.0,
+    top_k=50,
+    top_p=1.0,
+)  # forecast shape: [num_series, num_samples, prediction_length]
 
 # visualize the forecast
 forecast_index = range(len(df), len(df) + prediction_length)
````
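The parameters surfaced by this change (`temperature`, `top_k`, `top_p`, plus `num_samples` for the number of sampled trajectories) are standard token-sampling knobs. As an illustration only — `sample_next_token` below is a hypothetical helper, not part of the Chronos API — a minimal sketch of how temperature, top-k, and top-p shape a single sampling step over a vector of logits:

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, top_k=50, top_p=1.0, rng=None):
    """Illustrative sketch of temperature / top-k / top-p sampling.

    Hypothetical helper for exposition; Chronos delegates sampling to the
    underlying language model's generation loop.
    """
    rng = rng or np.random.default_rng(0)
    # temperature: <1 sharpens the distribution, >1 flattens it
    logits = np.asarray(logits, dtype=float) / temperature
    # top-k: keep only the k highest-scoring tokens
    if top_k is not None and top_k < logits.size:
        cutoff = np.sort(logits)[-top_k]
        logits = np.where(logits < cutoff, -np.inf, logits)
    # softmax over the surviving logits
    probs = np.exp(logits - np.max(logits))
    probs /= probs.sum()
    # top-p (nucleus): keep the smallest set of tokens whose
    # cumulative probability reaches top_p, then renormalize
    if top_p < 1.0:
        order = np.argsort(probs)[::-1]
        cum = np.cumsum(probs[order])
        keep = order[: np.searchsorted(cum, top_p) + 1]
        masked = np.zeros_like(probs)
        masked[keep] = probs[keep]
        probs = masked / masked.sum()
    return int(rng.choice(len(probs), p=probs))
```

With the defaults added in this commit (`temperature=1.0`, `top_k=50`, `top_p=1.0`), sampling is restricted to the 50 most likely tokens with no nucleus truncation; each of the `num_samples` forecast trajectories is drawn independently with these settings.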
