v0.9.0
What's Changed
- Split large chunks by @jurasofish in #46: chunks over 10k characters (configurable) are now split so each piece stays under 10k characters. Previously a large function or a large whole file became a single oversized chunk, and the `default_response_max_chars` option (default 20k characters) meant the LLM couldn't read it at all, so in practice the LLM would often just guess at the contents. No more LLM making stuff up because it can't read large chunks.
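A minimal sketch of the splitting behavior, assuming a simple character-based split (the name `split_chunk` and the splitting strategy are illustrative only; the actual implementation may split on more meaningful boundaries):

```python
MAX_CHUNK_CHARS = 10_000  # the configurable limit mentioned above

def split_chunk(text: str, max_chars: int = MAX_CHUNK_CHARS) -> list[str]:
    """Split text into pieces of at most max_chars characters each."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

# A 25k-char chunk becomes three pieces, each within the limit,
# so none of them is hidden by default_response_max_chars.
parts = split_chunk("x" * 25_000)
print([len(p) for p in parts])  # → [10000, 10000, 5000]
```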
Full Changelog: v0.8.0...v0.9.0