- comfyui: bump version to v0.3.69
- llama.cpp: bump version to b7091
- llama.cpp: fix #5
| Project | | Image |
|---|---|---|
| ROCm | ╦═ | docker.io/mixa3607/rocm-gfx906:7.1.0-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:7.0.2-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:7.0.0-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:6.4.4-complete |
| | ╚═ | docker.io/mixa3607/rocm-gfx906:6.3.3-complete |
| PyTorch | ╦═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-6.3.3 |
| | ╚═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-7.0.2 |
| ComfyUI | ╦═ | docker.io/mixa3607/comfyui-gfx906:v0.3.69-torch-v2.9.0-rocm-7.0.2 |
| | ╚═ | docker.io/mixa3607/comfyui-gfx906:v0.3.69-torch-v2.9.0-rocm-6.3.3 |
| vLLM | ╦═ | docker.io/mixa3607/vllm-gfx906:0.11.0-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/vllm-gfx906:0.10.2-rocm-6.3.3 |
| | ╚═ | docker.io/mixa3607/vllm-gfx906:0.8.5-rocm-6.3.3 |
| llama.cpp | ╦═ | docker.io/mixa3607/llama.cpp-gfx906:full-b7091-rocm-7.1.0 |
| | ╚═ | docker.io/mixa3607/llama.cpp-gfx906:full-b7091-rocm-6.3.3 |
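A minimal sketch of pulling and running one of the images above on a gfx906 host. The tag is copied from the table; the device-passthrough flags are the standard ROCm-on-Docker setup, not something specific to these images:

```shell
# Pull one of the images listed in the table
docker pull docker.io/mixa3607/llama.cpp-gfx906:full-b7091-rocm-6.3.3

# Run with the usual ROCm device passthrough
docker run --rm -it \
  --device=/dev/kfd --device=/dev/dri \
  --security-opt seccomp=unconfined \
  docker.io/mixa3607/llama.cpp-gfx906:full-b7091-rocm-6.3.3
```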
Full Changelog: 2025110...2025111