
Ensure we properly support structured outputs #49

Merged
jeffpaul merged 4 commits into develop from fix/structured-outputs on Apr 22, 2026

Conversation

@dkotter (Collaborator) commented Apr 20, 2026

Description of the Change

Ollama supports models that can produce structured outputs, but we weren't properly parsing those responses ourselves. This meant any API request that asked for a structured output didn't work properly.
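To illustrate what "properly parse structured outputs" involves, here is a minimal Python sketch (not the plugin's actual PHP code). It assumes the OpenAI-compatible request/response shape Ollama exposes at `/v1/chat/completions`; the `response_format` variants Ollama accepts depend on the Ollama version, and all names and values below are illustrative:

```python
import json

def build_request(prompt: str, schema: dict) -> dict:
    """Build a chat-completion payload asking for schema-constrained JSON.

    The response_format shape mirrors the OpenAI spec; whether a given
    Ollama version honors json_schema is an assumption here.
    """
    return {
        "model": "qwen3.6:latest",  # illustrative model name from this thread
        "messages": [{"role": "user", "content": prompt}],
        "response_format": {
            "type": "json_schema",
            "json_schema": {"name": "classification", "schema": schema},
        },
    }

def parse_structured_output(response: dict) -> dict:
    """The gist of the fix: the structured output arrives as a JSON *string*
    in choices[0].message.content and must be decoded, not used verbatim."""
    content = response["choices"][0]["message"]["content"]
    return json.loads(content)

# Illustrative response (shape assumed, values made up):
resp = {"choices": [{"message": {
    "role": "assistant",
    "content": '{"category": "technology", "confidence": 0.92}',
}}]}
print(parse_structured_output(resp)["category"])  # → technology
```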

How to test the Change

  1. Pull this PR down and set up the Ollama provider
  2. Install the AI plugin and turn on Content Classification and Review Notes (both use structured outputs)
  3. Test both and ensure they work, noting that depending on the model you're using, you may run into timeouts on Review Notes

Changelog Entry

Changed - Increased the standard timeout to 60 seconds
Fixed - Properly parse structured outputs

Credits

Props @dkotter


@dkotter dkotter added this to the 1.1.0 milestone Apr 20, 2026
@dkotter dkotter self-assigned this Apr 20, 2026
@dkotter dkotter requested a review from jeffpaul April 20, 2026 23:01
@jeffpaul (Collaborator) commented:

I'm running 7.0-RC2-62242 and among other models have qwen3.6:latest available, but keep getting this error:

Network error occurred while sending request to http://localhost:11434/v1/chat/completions: cURL error 28: Operation timed out after 30001 milliseconds with 0 bytes received

@dkotter (Collaborator, Author) commented Apr 21, 2026

> I'm running 7.0-RC2-62242 and among other models have qwen3.6:latest available, but keep getting this error:
>
> Network error occurred while sending request to http://localhost:11434/v1/chat/completions: cURL error 28: Operation timed out after 30001 milliseconds with 0 bytes received

I had run into timeouts a few times in testing, though not consistently. That 30-second timeout comes from upstream in the AI Client, but I just pushed up a fix that changes the default to 60 seconds when using this provider. That seemed high enough in my testing to never hit timeouts, but let me know if you're still having issues.
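The timeout change described above can be sketched as a per-provider override of the upstream default. This is a hypothetical Python illustration, not the plugin's actual API; the function and constant names are invented:

```python
# Assumption: the AI Client applies a 30 s default per-request timeout,
# and a provider can substitute its own value.
AI_CLIENT_DEFAULT_TIMEOUT = 30  # seconds, upstream default
OLLAMA_TIMEOUT = 60             # local inference can be slow to first byte

def request_timeout(provider: str) -> int:
    """Pick the per-request timeout: Ollama gets a longer window because
    locally run models may take longer to start responding."""
    return OLLAMA_TIMEOUT if provider == "ollama" else AI_CLIENT_DEFAULT_TIMEOUT

print(request_timeout("ollama"))  # → 60
```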

@dkotter dkotter mentioned this pull request Apr 21, 2026
@jeffpaul (Collaborator) commented:

Not surprisingly, now getting:

Network error occurred while sending request to http://localhost:11434/v1/chat/completions: cURL error 28: Operation timed out after 60002 milliseconds with 0 bytes received

I've got a whole bunch of older models locally but also some newer, larger ones. Perhaps I'm getting routed to a poorly performing model here?

@dkotter (Collaborator, Author) commented Apr 21, 2026

> I've got a whole bunch of older models locally but also some newer, larger ones. Perhaps I'm getting routed to a poorly performing model here?

Hmm, maybe. The issue right now is that there's no easy way to check which model was used for the request.

If you can, install this version of the AI plugin (from PR 437):

pr-437-f588fc861783a4ec9306ca070a51a25e40c6dfd4.zip

Then go to the AI settings page and activate the AI Request Logging experiment. This should add a page under Tools > AI Request Logs. Now run Content Classification or Review Notes and refresh that log page. It should show which model was used, which I can then try to replicate on my end:

[Screenshot: AI request logs]

@jeffpaul (Collaborator) commented:

[Screenshots attached]

@dkotter (Collaborator, Author) commented Apr 21, 2026

Interesting. I'm using the same model but not running into this. Is it consistent across all features? Or just the review notes experiment?

@jeffpaul (Collaborator) commented:

Updated Ollama to 0.21.1 and classification and review notes both work, let's go!

@jeffpaul jeffpaul merged commit 254e8f1 into develop Apr 22, 2026
14 checks passed
@jeffpaul jeffpaul deleted the fix/structured-outputs branch April 22, 2026 20:19