Issue Type
Description
Following the steps in INSTALL.md, the final check `PYTHONPATH=$(pwd) python scripts/test_environment.py` fails when importing `vllm` with:
```
Attempting to import critical packages...
[SUCCESS] torch found
[SUCCESS] torchvision found
[SUCCESS] transformers found
[SUCCESS] megatron.core found
[SUCCESS] transformer_engine found
INFO ... Automatically detected platform cuda.
[ERROR] Package not successfully imported: vllm
[SUCCESS] pandas found

Traceback (most recent call last):
  ...
  File ".../vllm/transformers_utils/configs/ovis.py", line 75, in <module>
    AutoConfig.register("aimv2", AIMv2Config)
  File ".../transformers/models/auto/configuration_auto.py", line 1350, in register
    CONFIG_MAPPING.register(model_type, config, exist_ok=exist_ok)
  File ".../transformers/models/auto/configuration_auto.py", line 1037, in register
    raise ValueError(f"'{key}' is already used by a Transformers config, pick another name.")
ValueError: 'aimv2' is already used by a Transformers config, pick another name.
```
Steps to Reproduce (for bugs)
- Install per the current INSTALL.md.
- Run `PYTHONPATH=$(pwd) python scripts/test_environment.py` or `PYTHONPATH=$(pwd) python -c "import vllm; print('ok')"`.
- Observe the error above.
Expected Behavior
`scripts/test_environment.py` completes without errors on the versions implied by INSTALL.md.
Actual Behavior
```
Attempting to import critical packages...
[SUCCESS] torch found
[SUCCESS] torchvision found
[SUCCESS] transformers found
[SUCCESS] megatron.core found
[SUCCESS] transformer_engine found
INFO ... Automatically detected platform cuda.
[ERROR] Package not successfully imported: vllm
```

A manual import shows:

```
ValueError: 'aimv2' is already used by a Transformers config, pick another name.
```
Environment
- Model: nvidia-cosmos/cosmos-transfer1
- Framework (e.g., Docker, Conda): conda 25.7.0
- Python Version: 3.12.11 (conda env `cosmos-transfer1`)
- OS/Platform: Linux, Ubuntu 24.04
- GPU/CPU: NVIDIA H200 (143GB VRAM, CUDA 12.8)
- Additional Dependencies:
  - vllm==0.9.0
  - transformers==4.56.1
  - tokenizers==0.20.3
  - torch==2.7.0+cu128
  - torchvision==0.22.0+cu128
  - transformer_engine==2.6.0.post1
Additional Context
`vllm==0.9.0` registers custom configs via `transformers.AutoConfig.register(...)`.
In `transformers>=4.56.x`, the key `"aimv2"` already exists in `CONFIG_MAPPING`, causing a duplicate-registration `ValueError`.
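The failure mode can be illustrated with a plain-dict stand-in for `CONFIG_MAPPING` (a hypothetical sketch of the registration semantics, not the actual transformers code):

```python
class ConfigRegistry:
    """Minimal stand-in for the CONFIG_MAPPING registration check."""

    def __init__(self):
        self._mapping = {}

    def register(self, key, value, exist_ok=False):
        # Mirrors the check that raises in configuration_auto.py:
        # a key already present is rejected unless exist_ok is set.
        if key in self._mapping and not exist_ok:
            raise ValueError(
                f"'{key}' is already used by a Transformers config, pick another name."
            )
        self._mapping[key] = value


registry = ConfigRegistry()
registry.register("aimv2", object())       # first registration (transformers' own): fine
try:
    registry.register("aimv2", object())   # second registration (vllm's): raises
except ValueError as e:
    print(e)
registry.register("aimv2", object(), exist_ok=True)  # a guarded path would succeed
```

This is why the error only appears once both transformers (which now ships its own `aimv2` config) and vllm (which tries to register one) are on the conflicting versions.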
Deliverable
Workaround (confirmed)

```
pip install "tokenizers>=0.21.1"
pip install "transformers==4.45.1"
PYTHONPATH=$(pwd) python scripts/test_environment.py
# -> SUCCESS
```
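To sanity-check that the pin took effect before re-running the environment check, a minimal version comparison can be done in pure Python (a sketch assuming plain numeric `X.Y.Z` version strings, not full PEP 440 handling; in practice the input would come from `transformers.__version__`):

```python
def version_tuple(v: str) -> tuple:
    """Parse a plain numeric version string like '4.45.1' into a comparable tuple."""
    return tuple(int(part) for part in v.split(".") if part.isdigit())

# The conflict appears at transformers>=4.56; the pinned 4.45.1 is below it.
BAD_THRESHOLD = version_tuple("4.56")

def is_known_bad(installed: str) -> bool:
    return version_tuple(installed) >= BAD_THRESHOLD

print(is_known_bad("4.56.1"))  # True: hits the aimv2 conflict
print(is_known_bad("4.45.1"))  # False: the workaround pin
```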
Proposed Fix
- Docs quick fix: pin compatible versions in INSTALL.md until the code is adjusted:

  ```
  pip install "tokenizers>=0.21.1" "transformers==4.45.1"
  ```

  or specify a range, e.g.:

  ```
  transformers>=4.44,<4.51
  tokenizers>=0.21.1
  ```

  and note the known conflict with `transformers>=4.56.x`.
- Code fix (preferred): guard the registration to avoid duplicates, e.g.:

  ```python
  from transformers.models.auto.configuration_auto import CONFIG_MAPPING

  if "aimv2" not in CONFIG_MAPPING:
      AutoConfig.register("aimv2", AIMv2Config)
  ```

  (or use an `exist_ok`-style approach if supported), and/or pin a tested vllm/transformers pair in the project requirements.
- Alternative: update to a vllm release that is compatible with `transformers>=4.56.x` and reflect that pin in INSTALL.md.
Impact
Blocks a clean install following INSTALL.md: users hit an error on the final environment check unless they manually adjust versions.