Commit fc6376b

Minor fix to distillation README (#221)

1 parent 2e3c04f

2 files changed: 2 additions and 2 deletions

README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -31,7 +31,7 @@ Cosmos-Transfer1 includes the following:
 
 ## News
 
-- [2025/08] **Cosmos-Transfer1-7B Edge Distilled** is available! Now you can generate videos in a single diffusion step (vs. 35 steps), significantly speeding up inference. We provide the distillation recipe and training code, so you can even distill your own models! Try it out and tell us what you think!
+- [2025/08] **Cosmos-Transfer1-7B Edge Distilled** is available! Now you can generate videos in a single diffusion step (vs. 36 steps), significantly speeding up inference. We provide the distillation recipe and training code, so you can even distill your own models! Try it out and tell us what you think!
 
 - [Inference guide](examples/inference_cosmos_transfer1_7b.md#example-2-distilled-single-control-edge)
 
```

examples/distillation_cosmos_transfer1_7b.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 # Distilling Cosmos-Transfer1 Models
 
-In July 2025, we released a distilled version of the Cosmos-Transfer1-7B Edge model. We distilled the original 35-step Cosmos-Transfer1-7B Edge model into a single-step model while preserving output quality.
+We previously released a distilled version of the Cosmos-Transfer1-7B Edge model. While the original model required 72 total inferences (36 steps x 2) due to classifier-free guidance (CFG), the distilled model requires only a single inference without CFG. This achieves a 72x speedup while maintaining output quality.
 
 We now provide our distillation recipe and training code, so that you can replicate the diffusion step distillation process using your own models and data.
 
```
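The 72x figure in the corrected text comes from simple counting: with classifier-free guidance, every denoising step requires two forward passes (conditioned and unconditioned), so a 36-step CFG sampler performs 72 model calls, versus 1 for the distilled single-step model. A minimal sketch of that arithmetic; `total_model_calls` is an illustrative helper, not part of the Cosmos-Transfer1 API:

```python
# Sketch of the inference-count arithmetic behind the 72x speedup claim.
# With classifier-free guidance (CFG), each denoising step runs the model
# twice: once on the conditional input, once on the unconditional input.

def total_model_calls(num_steps: int, use_cfg: bool) -> int:
    """Number of forward passes a diffusion sampler performs."""
    calls_per_step = 2 if use_cfg else 1
    return num_steps * calls_per_step

baseline = total_model_calls(num_steps=36, use_cfg=True)   # 36 x 2 = 72
distilled = total_model_calls(num_steps=1, use_cfg=False)  # 1
speedup = baseline // distilled                            # 72
```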