feat:OpenAPI: oneOf responses, nullable JSON, federated and tool metadata #143
base: main
Conversation
Walkthrough
The OpenAPI schema in src/libs/Ollama/openapi.yaml was updated: the response format now uses oneOf and supports JSON/object values with nullability; the "think" controls were generalized; federated metadata fields (remote_model, remote_host) were added across requests, responses, and models; CreateModelRequest gained several nullable properties; tool metadata fields were added; and various descriptions were updated.
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
Pre-merge checks and finishing touches
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 0
🧹 Nitpick comments (4)
src/libs/Ollama/openapi.yaml (4)
313-329: Unify and harden the `think` union; add a DRY schema and validate codegen.
- Mixed boolean|string enum unions can trip some generators. Consider a shared component to keep consistency, and add examples.
- Add notes on precedence if both boolean and string are provided by clients (server behavior).
Suggested component and reuse:
```diff
@@
+    ThinkOption:
+      nullable: true
+      oneOf:
+        - type: boolean
+        - type: string
+          enum: [high, medium, low]
+      description: "Controls whether thinking/reasoning models will think before responding."
@@ GenerateCompletionRequest.properties
-      think:
-        oneOf:
-          - type: boolean
-          - enum:
-              - high
-              - medium
-              - low
-            type: string
-        description: "Controls whether thinking/reasoning models will think before responding.\nCan be:\n- boolean: true/false to enable/disable thinking\n- string: \"high\", \"medium\", \"low\" to set thinking intensity level\n"
-        nullable: true
+      think:
+        $ref: '#/components/schemas/ThinkOption'
@@ GenerateChatCompletionRequest.properties
-      think:
-        oneOf:
-          - type: boolean
-          - enum:
-              - high
-              - medium
-              - low
-            type: string
-        description: "Controls whether thinking/reasoning models will think before responding.\nCan be:\n- boolean: true/false to enable/disable thinking\n- string: \"high\", \"medium\", \"low\" to set thinking intensity level\n"
-        nullable: true
+      think:
+        $ref: '#/components/schemas/ThinkOption'
```
Please verify your generators (e.g., openapi-generator, swagger-codegen) handle boolean|string oneOf as expected.
Also applies to: 594-610
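On the examples point: OpenAPI 3.0 only allows a single `example` per Schema Object, so one hedged way to cover both branches is to put one value on the shared component and show the other in a request-level example. A minimal sketch, assuming the `ThinkOption` component above is adopted (the model name and prompt are placeholders):

```yaml
components:
  schemas:
    ThinkOption:
      nullable: true
      oneOf:
        - type: boolean
        - type: string
          enum: [high, medium, low]
      description: Controls whether thinking/reasoning models will think before responding.
      example: high            # string form of the union
    GenerateCompletionRequest:
      type: object
      properties:
        think:
          $ref: '#/components/schemas/ThinkOption'
      example:
        model: llama3.2        # placeholder model name
        prompt: Why is the sky blue?
        think: true            # boolean form of the union
```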
543-551: Remote metadata: add URI format and consider a reusable mixin.
- remote_host should use format: uri and ideally include examples.
- Consider a small component (e.g., RemoteMetadata with remote_model/remote_host) and compose it via allOf where needed to avoid drift.
Example tweak:
```diff
-        remote_host:
-          type: string
+        remote_host:
+          type: string
+          format: uri
           description: URL of the upstream Ollama host (when model is federated from another instance)
           nullable: true
```
Optionally:
```yaml
    RemoteMetadata:
      type: object
      properties:
        remote_model:
          type: string
          nullable: true
          description: Name of the upstream remote model
        remote_host:
          type: string
          format: uri
          nullable: true
          description: URL of the upstream Ollama host
```
Then use `allOf: [ { $ref: '#/components/schemas/RemoteMetadata' }, { ...existing schema... } ]`.
Also applies to: 665-673, 912-919, 1095-1102
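If the optional `RemoteMetadata` component is adopted, the composition described above could look roughly like this; the `model` property shown is only a stand-in for each schema's existing fields:

```yaml
    Model:
      allOf:
        - $ref: '#/components/schemas/RemoteMetadata'
        - type: object
          properties:
            model:
              type: string
            # ...remaining existing properties stay here unchanged...
```

The same pattern would apply to the other schemas listed in the affected line ranges.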
716-719: tool_name conditionality. Consider expressing (if feasible) that tool_name is required when role=tool. In OAS 3.0 this is awkward; a pragmatic alternative is to add guidance and an example. If you move to OAS 3.1, you can use dependentRequired or if/then/else.
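For reference, if the document were bumped to OAS 3.1 (JSON Schema 2020-12), the conditional requirement could be sketched roughly as below; this is not a change proposed for the current 3.0 spec, and the surrounding Message properties are abbreviated:

```yaml
    Message:
      type: object
      properties:
        role:
          type: string
          enum: [system, user, assistant, tool]
        tool_name:
          type: string
      if:
        properties:
          role:
            const: tool
        required: [role]
      then:
        required: [tool_name]
```

Note that `dependentRequired` triggers on the presence of a property rather than on its value, so `if`/`then` is the closer fit for a role-based condition.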
769-772: Constrain index and add an example. Add `minimum: 0`, clarify that indexing is zero-based, and optionally add an example showing streaming order.
```diff
         index:
           type: integer
+          minimum: 0
           description: The index of the tool call in the list of tool calls
```
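A small sketch of what the suggested wording and example could look like on that property (both are illustrative, not taken from the spec):

```yaml
        index:
          type: integer
          minimum: 0
          description: >-
            Zero-based index of the tool call in the list of tool calls;
            streamed chunks belonging to the same call share the same index.
          example: 0
```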
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (42)
All excluded by `!**/generated/**`:
- src/libs/Ollama/Generated/Ollama..JsonSerializerContext.g.cs
- src/libs/Ollama/Generated/Ollama.ChatClient.GenerateChatCompletion.g.cs
- src/libs/Ollama/Generated/Ollama.CompletionsClient.GenerateCompletion.g.cs
- src/libs/Ollama/Generated/Ollama.IChatClient.GenerateChatCompletion.g.cs
- src/libs/Ollama/Generated/Ollama.ICompletionsClient.GenerateCompletion.g.cs
- src/libs/Ollama/Generated/Ollama.IModelsClient.CreateModel.g.cs
- src/libs/Ollama/Generated/Ollama.JsonConverters.GenerateChatCompletionRequestFormatEnum.g.cs
- src/libs/Ollama/Generated/Ollama.JsonConverters.GenerateChatCompletionRequestFormatEnumNullable.g.cs
- src/libs/Ollama/Generated/Ollama.JsonConverters.GenerateChatCompletionRequestThink.g.cs
- src/libs/Ollama/Generated/Ollama.JsonConverters.GenerateChatCompletionRequestThinkNullable.g.cs
- src/libs/Ollama/Generated/Ollama.JsonConverters.GenerateCompletionRequestFormatEnum.g.cs
- src/libs/Ollama/Generated/Ollama.JsonConverters.GenerateCompletionRequestFormatEnumNullable.g.cs
- src/libs/Ollama/Generated/Ollama.JsonConverters.GenerateCompletionRequestThink.g.cs
- src/libs/Ollama/Generated/Ollama.JsonConverters.GenerateCompletionRequestThinkNullable.g.cs
- src/libs/Ollama/Generated/Ollama.JsonConverters.OneOf2.g.cs
- src/libs/Ollama/Generated/Ollama.JsonSerializerContextTypes.g.cs
- src/libs/Ollama/Generated/Ollama.Models.CreateModelRequest.g.cs
- src/libs/Ollama/Generated/Ollama.Models.CreateModelRequestAdapters.Json.g.cs
- src/libs/Ollama/Generated/Ollama.Models.CreateModelRequestAdapters.g.cs
- src/libs/Ollama/Generated/Ollama.Models.CreateModelRequestFiles.Json.g.cs
- src/libs/Ollama/Generated/Ollama.Models.CreateModelRequestFiles.g.cs
- src/libs/Ollama/Generated/Ollama.Models.CreateModelRequestParameters.Json.g.cs
- src/libs/Ollama/Generated/Ollama.Models.CreateModelRequestParameters.g.cs
- src/libs/Ollama/Generated/Ollama.Models.GenerateChatCompletionRequest.g.cs
- src/libs/Ollama/Generated/Ollama.Models.GenerateChatCompletionRequestFormatEnum.g.cs
- src/libs/Ollama/Generated/Ollama.Models.GenerateChatCompletionRequestFormatEnum2.Json.g.cs
- src/libs/Ollama/Generated/Ollama.Models.GenerateChatCompletionRequestFormatEnum2.g.cs
- src/libs/Ollama/Generated/Ollama.Models.GenerateChatCompletionRequestThink.g.cs
- src/libs/Ollama/Generated/Ollama.Models.GenerateChatCompletionResponse.g.cs
- src/libs/Ollama/Generated/Ollama.Models.GenerateCompletionRequest.g.cs
- src/libs/Ollama/Generated/Ollama.Models.GenerateCompletionRequestFormatEnum.g.cs
- src/libs/Ollama/Generated/Ollama.Models.GenerateCompletionRequestFormatEnum2.Json.g.cs
- src/libs/Ollama/Generated/Ollama.Models.GenerateCompletionRequestFormatEnum2.g.cs
- src/libs/Ollama/Generated/Ollama.Models.GenerateCompletionRequestThink.g.cs
- src/libs/Ollama/Generated/Ollama.Models.GenerateCompletionResponse.g.cs
- src/libs/Ollama/Generated/Ollama.Models.Message.g.cs
- src/libs/Ollama/Generated/Ollama.Models.Model.g.cs
- src/libs/Ollama/Generated/Ollama.Models.ModelInfo.g.cs
- src/libs/Ollama/Generated/Ollama.Models.ToolCallFunction.g.cs
- src/libs/Ollama/Generated/Ollama.ModelsClient.CreateModel.g.cs
- src/libs/Ollama/Generated/OneOf.2.Json.g.cs
- src/libs/Ollama/Generated/OneOf.2.g.cs
📒 Files selected for processing (1)
src/libs/Ollama/openapi.yaml (12 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Test / Build, test and publish
🔇 Additional comments (3)
src/libs/Ollama/openapi.yaml (3)
287-287: LGTM: clarified raw description. Good clarification; no functional/schema impact.
823-823: Require at least one source in CreateModelRequest schema
Add an `anyOf` under `CreateModelRequest` to enforce the presence of one of `modelfile`, `path`, `from`, `files`, or `adapters`, matching server expectations.
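A minimal sketch of that constraint, assuming the listed properties already exist on `CreateModelRequest` (only the added keyword is shown):

```yaml
    CreateModelRequest:
      anyOf:
        - required: [modelfile]
        - required: [path]
        - required: [from]
        - required: [files]
        - required: [adapters]
      # ...existing type, properties, and required entries stay unchanged...
```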
468-475: DRY the response format via $ref; relocate `default` out of the `oneOf` subschema
- Replace the duplicated inline schemas with `$ref: '#/components/schemas/ResponseFormat'`.
- Although OpenAPI allows `default` in a `oneOf` branch, many generators ignore it; move the default to the parent `format` property or component and validate with your target generator(s).
Also applies to: 289-299, 572-581
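A hedged sketch of the suggested shared component; the `ResponseFormat` name comes from the comment above, while the exact union members, the default value, and the property wiring are assumptions to be checked against the current inline schemas:

```yaml
    ResponseFormat:
      nullable: true
      description: >-
        The format to return a response in. Use "json" for JSON mode,
        or provide a JSON schema object for structured outputs.
      oneOf:
        - type: string
          enum: [json]
        - type: object
          additionalProperties: true
    GenerateCompletionRequest:
      type: object
      properties:
        format:
          allOf:                 # allOf wrapper so the sibling default is not dropped by 3.0 $ref rules
            - $ref: '#/components/schemas/ResponseFormat'
          default: json          # placeholder default, kept on the parent property rather than inside the oneOf
```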
Summary by CodeRabbit
New Features
Improvements