Validations
Problem
The vLLM API now allows deploying different reranking models and exposes endpoints compatible with both the Cohere API and the Jina AI API. However, when the model name is specified in the configuration, e.g. `"reranker": { "name": "cohere", "params": { "model": model_name } }`, the Cohere provider only accepts a fixed list of model names, while vLLM lets the user serve any model under any name.
The model-name check should therefore either be removed, or a separate reranking provider (e.g. vLLM) should be added that supports the current capabilities.
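A minimal sketch of how a dedicated vLLM provider configuration might look. The provider name `vllm`, the `base_url` parameter, and the model shown are assumptions for illustration, extrapolated from the Cohere snippet above; the actual key layout would depend on the implementation:

```json
{
  "reranker": {
    "name": "vllm",
    "params": {
      "base_url": "http://localhost:8000",
      "model": "BAAI/bge-reranker-v2-m3"
    }
  }
}
```

Since vLLM accepts arbitrary served model names, the provider would pass `model` through to the rerank endpoint without validating it against a fixed list.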
Additional links:
https://docs.vllm.ai/en/stable/getting_started/examples/cohere_rerank_client.html
https://docs.vllm.ai/en/stable/getting_started/examples/jinaai_rerank_client.html
vllm-project/vllm#12376 [Frontend] Rerank API (Jina- and Cohere-compatible API)
Solution
No response