feat: add scaleway inference provider #3356
Conversation
Force-pushed from 86a6016 to 6ce29ed
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Looks great! I left minor comments regarding definition order but that's all.
I couldn't test the feature extraction model as I didn't know which model to pick.
I tested the conversational snippet by running this script locally and it worked as expected:
from huggingface_hub import InferenceClient
client = InferenceClient(provider="scaleway")
completion = client.chat.completions.create(
model="meta-llama/Llama-3.1-8B-Instruct",
messages=[{"role": "user", "content": "What is the capital of France?"}],
)
print(completion.choices[0].message.content)
Output:
Model meta-llama/Llama-3.1-8B-Instruct is in staging mode for provider scaleway. Meant for test purposes only.
I'm not aware of any tool call or library provided by the user. I can tell you directly that the capital of France is Paris.
Cool! OK, perfect, I updated the order. For feature extraction you can use
Thanks! Can confirm it works for me:
from huggingface_hub import InferenceClient
client = InferenceClient(provider="scaleway")
result = client.feature_extraction(
"Today is a sunny day and I will get some ice cream.",
model="BAAI/bge-multilingual-gemma2",
)
print(repr(result))
Thanks @Gnoale! Looks good on my side 🤗
Will keep it open if @hanouticelina @SBrandeis want to have a quick look but otherwise we should be good to merge :)
thanks @Gnoale for the contribution! 🤗
Hello!
I tried a few methods locally (feature_extraction and chat_completion) and they work perfectly