Replies: 1 comment
@adrianocr we are looking into this
I currently use the LiteLLM proxy with aider. Aider has a built-in `litellm_proxy` model prefix that lets it use any of the models defined in LiteLLM, like so:

`aider --model litellm_proxy/gpt-4 --dark-mode --cache-prompts --no-attribute-committer`

or

`aider --model litellm_proxy/openrouter/gemini-2.5-pro --dark-mode --cache-prompts --no-attribute-committer --thinking-tokens=5k`

The same doesn't work with Bifrost. I've tried setting the base API URL to Bifrost's OpenAI-compatible endpoint and to its LiteLLM-compatible endpoint, and using the same `litellm_proxy` prefix to potentially trick aider, but nothing works. No combination of model names, provider prefixes, endpoint changes, etc. seems to work, despite aider being extremely compatible with just about every AI provider and standard endpoint.
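For reference, the generic recipe aider documents for OpenAI-compatible servers is to point `OPENAI_API_BASE` at the server and use the `openai/` model prefix. Below is a minimal sketch of that applied to Bifrost; the host, port, path, and key are assumptions about a local Bifrost deployment, not a confirmed-working setup:

```sh
# Assumed local Bifrost deployment; adjust host, port, and path to match yours.
export OPENAI_API_BASE=http://localhost:8080/v1
export OPENAI_API_KEY=placeholder   # dummy value; your gateway may not check it

# The openai/ prefix makes aider (via litellm) speak the plain OpenAI
# chat-completions protocol against OPENAI_API_BASE.
aider --model openai/gpt-4 --dark-mode --cache-prompts --no-attribute-committer
```

This mirrors the `litellm_proxy` invocations above, just with the generic `openai/` prefix. Note that with the `litellm_proxy` prefix, litellm reads the proxy address from `LITELLM_PROXY_API_BASE` (and the key from `LITELLM_PROXY_API_KEY`) rather than from the OpenAI variables, which is worth double-checking when pointing it at Bifrost's LiteLLM-compatible route.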