This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Add response format parameter to LLM chat completion call #234

Merged 1 commit on Dec 9, 2024

Conversation

@ptelang (Contributor) commented Dec 9, 2024

Setting the response format parameter ensures that the LLM response is JSON.
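The PR's diff is not shown here, but the change it describes can be sketched as follows. This is a minimal, hypothetical illustration assuming an OpenAI-style chat completions request body, where passing `response_format={"type": "json_object"}` instructs the model to emit valid JSON; the actual function and parameter names used in this repository are not visible in this conversation.

```python
def build_chat_request(model: str, messages: list, json_response: bool = False) -> dict:
    """Build a chat completion request body, optionally forcing JSON output.

    Hypothetical helper for illustration; the field names follow the
    OpenAI-style chat completions API, not necessarily this repo's code.
    """
    payload = {"model": model, "messages": messages}
    if json_response:
        # response_format={"type": "json_object"} asks the model to return
        # a valid JSON object instead of free-form text.
        payload["response_format"] = {"type": "json_object"}
    return payload

request = build_chat_request(
    "gpt-4o-mini",
    [{"role": "user", "content": "Summarize this as JSON."}],
    json_response=True,
)
```

Without `json_response=True`, the payload is unchanged, so existing callers are unaffected by adding such a parameter.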

@ptelang ptelang merged commit d666bb6 into main Dec 9, 2024
3 checks passed