Key changes:
- Use any model via LiteLLM: `Agent(model="litellm/<provider>/<model_name>")`, e.g. `Agent(model="litellm/anthropic/claude-3-5-sonnet-20240620")` (see the sketch below)
- Enable more complex output types on Agents
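A minimal sketch of the LiteLLM usage mentioned above, assuming the package is installed with the litellm extra (e.g. `pip install "openai-agents[litellm]"`) and that the provider's API key is available in the environment; the agent name and prompt here are illustrative only:

```python
from agents import Agent, Runner

# Any LiteLLM-supported model can be referenced with the "litellm/" prefix.
agent = Agent(
    name="Assistant",
    instructions="Reply concisely.",
    model="litellm/anthropic/claude-3-5-sonnet-20240620",
)

# Run the agent synchronously and print its final output.
result = Runner.run_sync(agent, "Say hello.")
print(result.final_output)
```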
What's Changed
- Run CI on all commits, not just ones on main by @rm-openai in #521
- Extract chat completions conversion code into helper by @rm-openai in #522
- Extract chat completions streaming helpers by @rm-openai in #523
- Show repo name/data in docs by @rm-openai in #525
- Litellm integration by @rm-openai in #524
- Docs for LiteLLM integration by @rm-openai in #532
- Docs: Switch to o3 model; exclude translated pages from search by @seratch in #533
- Examples for image inputs by @rm-openai in #553
- Enable non-strict output types by @rm-openai in #539
- Start and finish streaming trace in impl method by @rm-openai in #540
- Fix visualize graph filename to exclude the extension by @yuya-haruna in #554
- RFC: automatically use litellm if possible by @rm-openai in #534
- Docs and tests for litellm by @rm-openai in #561
- Pass through organization/project headers to tracing backend, fix speech_group enum by @stevenheidel in #562
- v0.0.12 by @rm-openai in #564
New Contributors
- @yuya-haruna made their first contribution in #554
Full Changelog: v0.0.11...v0.0.12