Thibault00

- Add new 'requesty' model provider for enterprise LLM routing
- Support custom HTTP headers via the ModelConfig.Headers field (configuration sketched below)
- Default base_url to https://router.requesty.ai/v1
- Maintain full OpenAI API compatibility with enhanced routing and access to 300+ models
- Add nil-safe usage parsing for enterprise-grade reliability (sketched below)

Enables seamless integration with Requesty's enterprise LLM gateway,
serving 15k+ developers, while preserving existing OpenAI workflows.
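
A minimal sketch of what the resulting configuration surface could look like. Only the provider name, the Headers field, and the default base_url come from this PR; the other ModelConfig fields, the `withDefaults` helper, and the example header are assumptions for illustration.

```go
package main

import "fmt"

// ModelConfig mirrors the shape implied by the PR: a provider name, an
// optional base URL, and custom HTTP headers forwarded on every request.
type ModelConfig struct {
	Provider string
	BaseURL  string            // defaults to https://router.requesty.ai/v1 when empty
	Model    string
	Headers  map[string]string // custom HTTP headers (e.g. routing or auth metadata)
}

// withDefaults fills in the Requesty router URL when no base URL is given,
// matching the default described above.
func withDefaults(c ModelConfig) ModelConfig {
	if c.Provider == "requesty" && c.BaseURL == "" {
		c.BaseURL = "https://router.requesty.ai/v1"
	}
	return c
}

func main() {
	cfg := withDefaults(ModelConfig{
		Provider: "requesty",
		Model:    "openai/gpt-4o", // hypothetical model id routed through Requesty
		Headers: map[string]string{
			"X-Requesty-Tag": "team-billing", // hypothetical header, for illustration only
		},
	})
	fmt.Println(cfg.BaseURL) // https://router.requesty.ai/v1
}
```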
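And a sketch of what "nil-safe usage parsing" can look like: the OpenAI-compatible usage block may be absent from a response, so the parser should return zero counts instead of dereferencing a nil pointer. The type and function names here are illustrative, not the PR's actual code.

```go
package main

import "fmt"

// Usage mirrors the OpenAI-style usage object.
type Usage struct {
	PromptTokens     int `json:"prompt_tokens"`
	CompletionTokens int `json:"completion_tokens"`
}

// ChatResponse is a trimmed response shape; Usage is a pointer because
// some providers omit the field entirely.
type ChatResponse struct {
	Usage *Usage `json:"usage,omitempty"`
}

// tokenCounts never panics: a missing usage block simply yields zeros.
func tokenCounts(r *ChatResponse) (prompt, completion int) {
	if r == nil || r.Usage == nil {
		return 0, 0
	}
	return r.Usage.PromptTokens, r.Usage.CompletionTokens
}

func main() {
	var resp ChatResponse // no usage block returned
	p, c := tokenCounts(&resp)
	fmt.Println(p, c) // 0 0
}
```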