Sagemaker inference endpoint support #10756
twerkmeister started this conversation in Ideas
Replies: 0 comments
Describe the feature or potential improvement
Currently, when adding LLM connections in the UI, one can choose between adapters for OpenAI, Anthropic, Azure, Bedrock, Google Vertex AI, and Google AI Studio.
We host the fine-tuned model for our highest-volume use case on SageMaker inference endpoints, and as far as I can see there is no way to add it as an LLM connection to run experiments on datasets comparing different versions of that model.
Is this adapter planned, or is it possible to make it work through clever configuration of one of the existing adapters?
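For context, a minimal sketch of how we currently call such an endpoint directly via the AWS SDK (`boto3`). The payload shape is an assumption here: Hugging Face-style SageMaker LLM containers typically accept `{"inputs": ..., "parameters": {...}}`, but the exact format depends on the serving container. The function names are placeholders.

```python
import json


def build_payload(prompt: str, max_new_tokens: int = 256) -> str:
    # Assumed payload shape for an HF-style SageMaker LLM container;
    # adjust to whatever your serving container expects.
    return json.dumps({
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens},
    })


def invoke_llm(endpoint_name: str, prompt: str) -> str:
    # boto3 is imported locally so build_payload stays usable
    # without the AWS SDK installed.
    import boto3

    client = boto3.client("sagemaker-runtime")
    response = client.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/json",
        Body=build_payload(prompt),
    )
    return response["Body"].read().decode("utf-8")
```

A SageMaker adapter would essentially need to wrap this `invoke_endpoint` call and translate between the UI's chat/completion format and the container's payload format.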
Additional information
No response