Add LM Studio as model provider #1134
Conversation
Summary of Changes (Gemini Code Assist): This pull request enhances the platform's local LLM capabilities by introducing native support for LM Studio. Users can now configure and use models served by LM Studio, with improved integration and clear guidance. The changes keep the user experience consistent across the application and its documentation, making local model deployment more accessible.
Pull request overview
Adds LM Studio as a first-class LLM provider across the backend provider registry, Console UI, and documentation, enabling CoPaw to connect to LM Studio’s OpenAI-compatible local server and auto-discover loaded models.
Changes:
- Introduce `LMStudioProvider` and register it as a built-in provider in `ProviderManager` (a sketch of the model-discovery call follows this list).
- Update Console UI to support LM Studio endpoint hints/defaults and eligibility logic.
- Expand English/Chinese docs + README Docker guidance to include LM Studio setup.
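As a rough sketch of what the model-discovery step amounts to (illustrative names only, not the actual CoPaw code): LM Studio's local server exposes the OpenAI-compatible `GET /v1/models` route, by default at `http://localhost:1234/v1`, and a provider can list whatever models are currently loaded:

```python
# Minimal sketch of LM Studio model discovery (hypothetical class/method names,
# not the CoPaw implementation). Assumes LM Studio's local server is running
# with its default OpenAI-compatible endpoint at http://localhost:1234/v1.
import requests


class LMStudioProviderSketch:
    def __init__(self, base_url: str = "http://localhost:1234/v1", timeout: float = 5.0):
        self.base_url = base_url.rstrip("/")
        self.timeout = timeout

    def list_models(self) -> list[str]:
        """Return the IDs of models currently loaded in LM Studio."""
        resp = requests.get(f"{self.base_url}/models", timeout=self.timeout)
        resp.raise_for_status()
        # OpenAI-compatible shape: {"object": "list", "data": [{"id": "..."}, ...]}
        return [m["id"] for m in resp.json().get("data", [])]


if __name__ == "__main__":
    print(LMStudioProviderSketch().list_models())
```

Because the endpoint is OpenAI-compatible, a local LM Studio server typically needs no API key, so the Console only has to collect a base URL for this provider.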
Reviewed changes
Copilot reviewed 17 out of 17 changed files in this pull request and generated 2 comments.
Summary per file:
| File | Description |
|---|---|
| website/public/docs/models.zh.md | Document LM Studio provider usage + Docker networking notes (ZH). |
| website/public/docs/models.en.md | Document LM Studio provider usage + context-length/Docker notes (EN). |
| website/public/docs/console.zh.md | Add LM Studio section to Console docs and update quick index (ZH). |
| website/public/docs/console.en.md | Add LM Studio section to Console docs and update quick reference (EN). |
| website/public/docs/config.zh.md | Expand built-in provider table to include LM Studio + others (ZH). |
| website/public/docs/config.en.md | Expand built-in provider table to include LM Studio + others (EN). |
| src/copaw/providers/provider_manager.py | Register LM Studio as a built-in provider and load it from persisted data. |
| src/copaw/providers/lm_studio_provider.py | New provider class that fetches models from LM Studio’s OpenAI-compatible endpoint. |
| console/src/pages/Settings/Models/components/sections/ModelsSection.tsx | Allow LM Studio providers in “eligible” selection when base_url is set. |
| console/src/pages/Settings/Models/components/modals/ProviderConfigModal.tsx | Add LM Studio endpoint hint + default base URL behavior. |
| console/src/pages/Settings/Models/components/cards/RemoteProviderCard.tsx | Mark LM Studio as configured in provider card status logic. |
| console/src/locales/zh.json | Add models.lmstudioEndpointHint translation (ZH). |
| console/src/locales/ru.json | Add models.lmstudioEndpointHint translation (RU). |
| console/src/locales/ja.json | Add models.lmstudioEndpointHint translation (JA). |
| console/src/locales/en.json | Add models.lmstudioEndpointHint translation (EN). |
| README_zh.md | Generalize Docker “host.docker.internal” Base URL guidance to include LM Studio (ZH). |
| README.md | Generalize Docker “host.docker.internal” Base URL guidance to include LM Studio (EN); a request sketch follows this table. |
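The README entries above generalize the Docker networking note: when CoPaw runs in a container while LM Studio runs on the host, the Base URL must use `host.docker.internal` rather than `localhost`. A minimal sketch of such a call, assuming LM Studio's default port 1234 and a placeholder model name:

```python
# Sketch of calling LM Studio's OpenAI-compatible chat endpoint from inside a
# Docker container. host.docker.internal resolves to the host on Docker Desktop;
# on Linux it may require `--add-host=host.docker.internal:host-gateway`.
import requests

BASE_URL = "http://host.docker.internal:1234/v1"  # assumed default LM Studio port

payload = {
    "model": "your-loaded-model",  # placeholder; use an ID returned by GET /v1/models
    "messages": [{"role": "user", "content": "Hello from inside Docker"}],
}
resp = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```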
Code Review
This pull request introduces LM Studio as a new model provider, which is a great addition for users running local models. The changes are comprehensive, covering the backend provider implementation, frontend UI updates for configuration, and thorough documentation for users. My review focuses on improving the maintainability and robustness of the new LMStudioProvider implementation. I've suggested a refactoring to reduce code duplication and enhance error logging, which will make the code cleaner and easier to debug in the future.
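The suggested refactor itself is not shown in this excerpt; as a hypothetical illustration of the direction it describes (one shared request helper so provider methods stop repeating request/error handling and failures get logged), it might look like:

```python
# Hypothetical illustration only, not the PR's actual code: route all provider
# HTTP calls through one helper that logs and re-raises failures.
import logging

import requests

logger = logging.getLogger(__name__)


def fetch_json(url: str, timeout: float = 5.0) -> dict:
    """GET a JSON payload, logging and re-raising on failure."""
    try:
        resp = requests.get(url, timeout=timeout)
        resp.raise_for_status()
        return resp.json()
    except requests.RequestException:
        logger.exception("Request to %s failed", url)
        raise


def list_lm_studio_models(base_url: str = "http://localhost:1234/v1") -> list[str]:
    data = fetch_json(f"{base_url.rstrip('/')}/models")
    return [m["id"] for m in data.get("data", [])]
```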
Pull request overview
Copilot reviewed 18 out of 18 changed files in this pull request and generated 5 comments.
Pull request overview
Copilot reviewed 17 out of 17 changed files in this pull request and generated 4 comments.
Pull request overview
Copilot reviewed 17 out of 17 changed files in this pull request and generated 1 comment.
Description
As the title says. Also polishes the docs.