v0.9.0

@jurasofish released this 02 Mar 06:32
· 7 commits to main since this release
4dc75e2

What's Changed

  • Large functions and large whole files are now handled properly, so the LLM no longer makes things up because it can't read large chunks: chunks over 10k chars (configurable) are now split into pieces, each under 10k chars. Previously these stayed as single large chunks, and the `default_response_max_chars` option (default 20k chars) meant the LLM couldn't see them at all; in my experience the LLM would then just guess at their contents.
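The splitting behaviour described above can be sketched roughly as follows. This is an illustrative sketch only, not the actual mcpunk implementation: the function name `split_chunk` and the line-boundary strategy are assumptions, with only the 10k-char limit taken from the release notes.

```python
def split_chunk(text: str, max_chars: int = 10_000) -> list[str]:
    """Split text into pieces of at most max_chars, preferring line boundaries.

    Illustrative sketch of splitting oversized chunks so each piece stays
    under the (configurable) character limit.
    """
    if len(text) <= max_chars:
        return [text]
    pieces: list[str] = []
    current: list[str] = []
    current_len = 0
    for line in text.splitlines(keepends=True):
        # A single line longer than the limit is hard-split.
        while len(line) > max_chars:
            if current:
                pieces.append("".join(current))
                current, current_len = [], 0
            pieces.append(line[:max_chars])
            line = line[max_chars:]
        # Flush the current piece before it would exceed the limit.
        if current_len + len(line) > max_chars and current:
            pieces.append("".join(current))
            current, current_len = [], 0
        current.append(line)
        current_len += len(line)
    if current:
        pieces.append("".join(current))
    return pieces
```

Splitting on line boundaries (rather than at arbitrary offsets) keeps each piece readable on its own, which matters when the consumer is an LLM reading one piece at a time.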

Full Changelog: v0.8.0...v0.9.0