Releases: he-yufeng/CoreCoder

v0.3.0 — LiteLLM backend for non-OpenAI providers

04 May 15:27

What's new

LiteLLM backend for non-OpenAI providers (#10, contributed by @RheagalFire)

Set CORECODER_PROVIDER=litellm and use any LiteLLM model string —
anthropic/claude-3-haiku, bedrock/anthropic.claude-v2, vertex_ai/gemini-pro, etc.

pip install 'corecoder[litellm]'
export CORECODER_PROVIDER=litellm
export CORECODER_MODEL=anthropic/claude-3-haiku
export ANTHROPIC_API_KEY=sk-ant-...
corecoder

One model-string convention routes requests to 100+ providers (Bedrock, Vertex AI, Cohere, Groq, Replicate, Anyscale, …).
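As an illustration (not from the release notes themselves), switching to a Bedrock model should follow the same convention as the Anthropic example above; the AWS credentials and region variable here are the standard ones LiteLLM reads, assumed rather than documented by this release:

```shell
# Sketch: same CORECODER_* convention, different LiteLLM model string.
export CORECODER_PROVIDER=litellm
export CORECODER_MODEL=bedrock/anthropic.claude-v2
export AWS_ACCESS_KEY_ID=...        # standard AWS credentials,
export AWS_SECRET_ACCESS_KEY=...    # picked up by LiteLLM/boto3
export AWS_REGION_NAME=us-east-1    # assumed: LiteLLM's Bedrock region variable
corecoder
```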

Compatibility

The default openai backend is unchanged. Existing setups using OpenAI-compatible endpoints (Kimi, DeepSeek, Qwen, Ollama, …) keep working with no migration.

Install

pip install corecoder            # default
pip install 'corecoder[litellm]' # with LiteLLM

v0.2.0 - Renamed to CoreCoder

06 Apr 12:12

CoreCoder v0.2.0

Renamed from NanoCoder to CoreCoder to avoid confusion with Nano-Collective/nanocoder.

What changed

  • Package name: nanocoderagent → corecoder
  • CLI command: nanocoder → corecoder
  • Env vars: NANOCODER_* → CORECODER_*
  • Config dir: ~/.nanocoder → ~/.corecoder
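A minimal migration sketch for the config-dir rename above (hypothetical, not an official script; env vars in your shell profile still need renaming by hand):

```shell
# Move an old NanoCoder config dir to the CoreCoder location, if present.
# Hypothetical helper; the directory names come from the rename list above.
migrate_config() {
  old="$1/.nanocoder"
  new="$1/.corecoder"
  if [ -d "$old" ] && [ ! -d "$new" ]; then
    mv "$old" "$new"
  fi
}

migrate_config "$HOME"
```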

Install

pip install corecoder

Old links to he-yufeng/NanoCoder redirect here automatically.

v0.1.0

02 Apr 04:52

NanoCoder v0.1.0 — 512,000 lines of Claude Code → 1,300 lines of Python.

Install: pip install nanocoderagent

  • 7 tools (bash, read, write, edit, glob, grep, agent)
  • Parallel tool execution, 3-layer context compression, sub-agents
  • Any OpenAI-compatible LLM: Kimi K2.5, Claude Opus 4.6, GPT-5, DeepSeek V3, Qwen 3.5, Ollama
  • Architecture deep dive (7 articles)