Closed
I was surprised that the `chat` method doesn't use the configured default model.
Example:

```ruby
ChatGPT.configure do |config|
  config.api_key = ENV["OPENAI_API_KEY"]
  config.api_version = "v1"
  config.default_engine = "gpt-4" # <--- default configured here
  config.request_timeout = 120
  config.default_parameters = {
    max_tokens: nil,
    temperature: 0.0,
    top_p: 0.3,
    n: 1,
  }
end

resp = client.chat(...)
puts resp.dig("model") # outputs: gpt-3.5-turbo
```
Relevant source:

```ruby
def chat(messages, params = {})
  url = "#{@endpoint}/chat/completions"
  data = @config.default_parameters.merge(
    model: params[:model] || 'gpt-3.5-turbo', # <--- hard-coded default
    messages: messages,
    temperature: params[:temperature],
    top_p: params[:top_p],
    n: params[:n],
    stream: params[:stream] || false
  ).compact
  request_api(url, data)
end
```
I may have misunderstood the intended behavior, but I expected the configured `default_engine` to be used instead.
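One possible fix would be to fall back to `@config.default_engine` before the hard-coded string, so the precedence becomes: explicit `params[:model]`, then the configured default, then `'gpt-3.5-turbo'`. A minimal sketch of that precedence, isolated in a standalone helper (`resolve_model` is a hypothetical name, not part of the library; `"gpt-4o"` is just an example model string):

```ruby
# Hypothetical helper sketching the suggested lookup order:
# explicit param -> configured default_engine -> hard-coded fallback.
def resolve_model(params, default_engine)
  params[:model] || default_engine || 'gpt-3.5-turbo'
end

puts resolve_model({}, 'gpt-4')                  # => "gpt-4" (configured default wins)
puts resolve_model({ model: 'gpt-4o' }, 'gpt-4') # => "gpt-4o" (explicit param wins)
puts resolve_model({}, nil)                      # => "gpt-3.5-turbo" (last resort)
```

In `chat` itself this would amount to changing the `model:` line to `params[:model] || @config.default_engine || 'gpt-3.5-turbo'`.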
Thanks for the library, I have found it very useful!