
Using models via OpenRouter (as Anthropic Claude 3.5) returns an error. #1522

Open
@ErykCh

Description


First of all, it would be nice to have support for OpenRouter in the Chat Model API.

I think adoption of this tool is already very high and a lot of people are using it, especially since it lets you use OpenAI's o1 model without any tier restrictions.

Referencing OpenAI models via OpenRouter using the OpenAI Chat Model API works fine.
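For context, this is roughly the setup I use to point the OpenAI Chat Model at OpenRouter. A minimal sketch, assuming the standard Spring AI OpenAI starter properties; the model id and the API key are placeholders:

```properties
# Route the OpenAI-compatible client at OpenRouter instead of api.openai.com
spring.ai.openai.base-url=https://openrouter.ai/api/v1
spring.ai.openai.api-key=${OPENROUTER_API_KEY}
# OpenRouter model id for Claude 3.5 (placeholder; check OpenRouter's model list)
spring.ai.openai.chat.options.model=anthropic/claude-3.5-sonnet
```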

However, I get the following error when I try to use the OpenAI Chat Model API to refer to the Claude 3.5 model via OpenRouter:

Caused by: com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type org.springframework.ai.openai.api.OpenAiApi$ChatCompletionFinishReason from String "end_turn": not one of the values accepted for Enum class: [stop, function_call, length, content_filter, tool_call, tool_calls]
at [Source: REDACTED (StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION disabled); line: 283, column: 198] (through reference chain: org.springframework.ai.openai.api.OpenAiApi$ChatCompletion["choices"]->java.util.ArrayList[0]->org.springframework.ai.openai.api.OpenAiApi$ChatCompletion$Choice["finish_reason"])

The question is, can this be fixed by extending the OpenAI Chat Model API?

If not, can I create my own response parser implementation and plug it into ChatModel? How would I do that?
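One possible direction, sketched with plain Jackson: unknown enum strings such as "end_turn" can be mapped to a catch-all constant via @JsonEnumDefaultValue, so deserialization no longer throws. The enum below is a hypothetical stand-in for OpenAiApi.ChatCompletionFinishReason, not the actual Spring AI type, which I cannot modify directly:

```java
import com.fasterxml.jackson.annotation.JsonEnumDefaultValue;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class FinishReasonFallback {

    // Hypothetical mirror of the finish-reason enum from the error message,
    // extended with a default constant for values OpenRouter passes through
    // from non-OpenAI providers (e.g. Anthropic's "end_turn").
    enum FinishReason {
        @JsonProperty("stop") STOP,
        @JsonProperty("function_call") FUNCTION_CALL,
        @JsonProperty("length") LENGTH,
        @JsonProperty("content_filter") CONTENT_FILTER,
        @JsonProperty("tool_call") TOOL_CALL,
        @JsonProperty("tool_calls") TOOL_CALLS,
        @JsonEnumDefaultValue UNKNOWN
    }

    public static void main(String[] args) throws Exception {
        // Tell Jackson to fall back to the @JsonEnumDefaultValue constant
        // instead of throwing InvalidFormatException on unknown values.
        ObjectMapper mapper = new ObjectMapper()
            .enable(DeserializationFeature.READ_UNKNOWN_ENUM_VALUES_USING_DEFAULT_VALUE);

        FinishReason r = mapper.readValue("\"end_turn\"", FinishReason.class);
        System.out.println(r); // UNKNOWN
    }
}
```

Whether the ObjectMapper used inside the OpenAI Chat Model API can be configured this way from the outside is the part I am unsure about.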
