@@ -26,22 +26,22 @@ started:
To install PyTorch/XLA stable build in a new TPU VM:

```
-pip install torch~=2.5.0 torch_xla[tpu]~=2.5.0 -f https://storage.googleapis.com/libtpu-releases/index.html
+pip install torch==2.5.1 torch_xla[tpu]==2.5.1 -f https://storage.googleapis.com/libtpu-releases/index.html
```

To install PyTorch/XLA nightly build in a new TPU VM:

```
pip3 install --pre torch torchvision --index-url https://download.pytorch.org/whl/nightly/cpu
-pip install 'torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.5.0.dev-cp310-cp310-linux_x86_64.whl' -f https://storage.googleapis.com/libtpu-releases/index.html
+pip install 'torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.6.0.dev-cp310-cp310-linux_x86_64.whl' -f https://storage.googleapis.com/libtpu-releases/index.html
```

### GPU Plugin

PyTorch/XLA now provides GPU support through a plugin package similar to `libtpu`:

```
-pip install torch~=2.5.0 torch_xla~=2.5.0 https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla_cuda_plugin-2.5.0-py3-none-any.whl
+pip install torch==2.5.1 torch_xla==2.5.1 https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla_cuda_plugin-2.5.1-py3-none-any.whl
```

## Getting Started
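The install commands in this hunk replace the compatible-release pin `~=2.5.0` with the exact pin `==2.5.1`. The difference matters: per PEP 440, `~=2.5.0` accepts any `2.5.x` patch release, while `==2.5.1` accepts exactly one version. A minimal sketch of the `~=` acceptance rule; `satisfies_compatible_release` is a hypothetical illustration, not a torch_xla or pip API:

```python
# Hypothetical helper illustrating PEP 440 compatible-release (~=) matching
# at patch level: ~=X.Y.Z means >=X.Y.Z and ==X.Y.* (same major.minor).
def satisfies_compatible_release(version: str, pin: str) -> bool:
    """True if `version` would satisfy the specifier `~=<pin>`."""
    v = [int(x) for x in version.split(".")]
    p = [int(x) for x in pin.split(".")]
    return v[:2] == p[:2] and v >= p

print(satisfies_compatible_release("2.5.1", "2.5.0"))  # True: ~=2.5.0 allows 2.5.1
print(satisfies_compatible_release("2.6.0", "2.5.0"))  # False: ~=2.5.0 stops below 2.6
```

So the old `~=2.5.0` specifier would already have pulled in 2.5.1; the switch to `==2.5.1` additionally prevents any future 2.5.x from being picked up silently.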
@@ -154,12 +154,12 @@ GPU and nightly builds are available in our public GCS bucket.

| Version | Cloud GPU VM Wheels |
| --- | ----------- |
-| 2.5 (CUDA 12.1 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.0-cp39-cp39-manylinux_2_28_x86_64.whl` |
-| 2.5 (CUDA 12.1 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
-| 2.5 (CUDA 12.1 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.0-cp311-cp311-manylinux_2_28_x86_64.whl` |
-| 2.5 (CUDA 12.4 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.0-cp39-cp39-manylinux_2_28_x86_64.whl` |
-| 2.5 (CUDA 12.4 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
-| 2.5 (CUDA 12.4 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.0-cp311-cp311-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.1 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp39-cp39-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.1 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp310-cp310-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.1 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp311-cp311-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.4 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.1-cp39-cp39-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.4 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.1-cp310-cp310-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.4 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.1-cp311-cp311-manylinux_2_28_x86_64.whl` |
| nightly (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.6.0.dev-cp310-cp310-linux_x86_64.whl` |
| nightly (CUDA 12.1 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.6.0.dev-cp310-cp310-linux_x86_64.whl` |
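The wheel filenames in the tables above follow the standard Python wheel naming convention (PEP 427): `<dist>-<version>-<python tag>-<abi tag>-<platform>.whl`, so the `cpXY` tag must match your interpreter. A minimal sketch of selecting the right wheel from the 2.5.1 CUDA 12.1 rows; `wheel_for` is a hypothetical helper, not a torch_xla API:

```python
import re

# URLs taken from the 2.5.1 / CUDA 12.1 rows of the table above.
WHEELS = [
    "https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp39-cp39-manylinux_2_28_x86_64.whl",
    "https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp310-cp310-manylinux_2_28_x86_64.whl",
    "https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp311-cp311-manylinux_2_28_x86_64.whl",
]

def wheel_for(python_version, wheels=WHEELS):
    """Return the wheel URL whose cpXY tag matches (major, minor), else None."""
    tag = "cp%d%d" % python_version  # e.g. (3, 10) -> "cp310"
    for url in wheels:
        if re.search("-%s-%s-" % (tag, tag), url):
            return url
    return None

print(wheel_for((3, 10)))  # the cp310 wheel URL
```

In practice you would pass `(sys.version_info.major, sys.version_info.minor)`; installing a wheel whose tag does not match the interpreter fails at pip's compatibility check.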
@@ -193,6 +193,7 @@ The torch wheel version `2.5.0.dev20240820+cpu` can be found at https://download

| Version | Cloud TPU VMs Wheel |
| ---------| -------------------|
+| 2.5 (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.5.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
| 2.4 (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.4.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
| 2.3 (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.3.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
| 2.2 (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.2.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
@@ -203,6 +204,12 @@ The torch wheel version `2.5.0.dev20240820+cpu` can be found at https://download

| Version | GPU Wheel |
| --- | ----------- |
+| 2.5.1 (CUDA 12.1 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp39-cp39-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.1 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp310-cp310-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.1 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp311-cp311-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.4 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.1-cp39-cp39-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.4 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.1-cp310-cp310-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.4 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.1-cp311-cp311-manylinux_2_28_x86_64.whl` |
| 2.5 (CUDA 12.1 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.0-cp39-cp39-manylinux_2_28_x86_64.whl` |
| 2.5 (CUDA 12.1 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
| 2.5 (CUDA 12.1 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.0-cp311-cp311-manylinux_2_28_x86_64.whl` |
@@ -226,6 +233,7 @@ The torch wheel version `2.5.0.dev20240820+cpu` can be found at https://download

| Version | Cloud TPU VMs Docker |
| --- | ----------- |
+| 2.5.1 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.5.1_3.10_tpuvm` |
| 2.5 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.5.0_3.10_tpuvm` |
| 2.4 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.4.0_3.10_tpuvm` |
| 2.3 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.3.0_3.10_tpuvm` |
@@ -243,6 +251,7 @@ docker run --privileged --net host --shm-size=16G -it us-central1-docker.pkg.dev

| Version | GPU CUDA 12.4 Docker |
| --- | ----------- |
+| 2.5.1 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.5.1_3.10_cuda_12.4` |
| 2.5 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.5.0_3.10_cuda_12.4` |
| 2.4 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.4.0_3.10_cuda_12.4` |
@@ -251,6 +260,7 @@ docker run --privileged --net host --shm-size=16G -it us-central1-docker.pkg.dev

| Version | GPU CUDA 12.1 Docker |
| --- | ----------- |
+| 2.5.1 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.5.1_3.10_cuda_12.1` |
| 2.5 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.5.0_3.10_cuda_12.1` |
| 2.4 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.4.0_3.10_cuda_12.1` |
| 2.3 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.3.0_3.10_cuda_12.1` |