Extract packages in both input and output using the LLM the user called with #214
Conversation
Instead of always using the local LLM to extract the packages, use the LLM the user called with. This is implemented in a new module llm_utils - the way it is implemented is a bit of a kludge, as it really should reuse more of the providers. But because of time constraints, let's start this way and refactor later. The important thing is the PackageExtractor class that uses the provided string to extract packages out of the prompt.
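To make the idea concrete, here is a minimal sketch of what a PackageExtractor delegating to the caller's LLM could look like. This is an illustration only, not the PR's actual code: the class and method names (`LLMClient`, `complete`, `extract_packages`) and the JSON response contract are assumptions for the example.

```python
# Hypothetical sketch: extract package names via whichever LLM provider
# the user's request came in with, defaulting to a local model.
import json
from typing import List, Optional


class LLMClient:
    """Minimal stand-in for the client described in the PR (names assumed)."""

    def __init__(self, provider: Optional[str] = None, model: Optional[str] = None):
        # provider/model would come from the user's original request when available
        self.provider = provider or "local"
        self.model = model or "local-default"

    def complete(self, system: str, user: str) -> str:
        # A real implementation would call the provider's chat endpoint.
        raise NotImplementedError


class PackageExtractor:
    SYSTEM_PROMPT = (
        "Extract every software package name mentioned in the user text. "
        'Respond with JSON: {"packages": ["name1", "name2"]}'
    )

    def __init__(self, client: LLMClient):
        self.client = client

    def extract_packages(self, prompt: str) -> List[str]:
        raw = self.client.complete(self.SYSTEM_PROMPT, prompt)
        try:
            return json.loads(raw).get("packages", [])
        except json.JSONDecodeError:
            # Be conservative: return no packages rather than crash mid-pipeline
            return []
```

The key design point is that the extractor takes the client as a constructor argument, so the pipeline can hand it a client built from the user's own request instead of a hardwired local one.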
Thanks @jhrozek lgtm
import structlog

from codegate.codegate_logging import LogFormat, LogLevel, setup_logging
from codegate.config import Config, ConfigurationError
from codegate.db.connection import init_db_sync
from codegate.server import init_app
from src.codegate.storage.utils import restore_storage_backup
We should not use "src" in the package name here. It should be: "from codegate.storage.utils ..."
@@ -0,0 +1,4 @@
from src.codegate.llm_utils.extractor import PackageExtractor
from src.codegate.llm_utils.llmclient import LLMClient
Same comment as above re: "src".
Will fix in a follow-up.
class LLMClient:
    """
    Base class for LLM interactions handling both local and cloud providers.
This class only handles chat requests currently. Do we need to implement code-completion support in it eventually?
Yes
For FIM pipelines we will.
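The thread above notes that LLMClient currently handles only chat requests, with FIM (fill-in-the-middle) support planned. A rough sketch of how the two entry points might sit side by side; the method names (`chat`, `fim`, `_send`) are hypothetical, not the PR's actual API:

```python
# Illustrative only: separating chat-style and FIM-style entry points.
from typing import List, Optional


class LLMClient:
    """Base class for LLM interactions handling both local and cloud providers."""

    def chat(self, system_prompt: str, user_prompt: str) -> str:
        # Current behavior discussed in the thread: chat-style requests.
        return self._send(
            [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_prompt},
            ]
        )

    def fim(self, prefix: str, suffix: Optional[str] = None) -> str:
        # Future FIM support: a completion-style request built from a
        # prefix/suffix pair rather than chat messages.
        raise NotImplementedError("FIM pipelines not implemented yet")

    def _send(self, messages: List[dict]) -> str:
        # Provider subclasses would implement the actual network call.
        raise NotImplementedError
```

Keeping FIM behind its own method lets FIM pipelines reuse the same provider plumbing later without forcing completion requests through a chat-message shape.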
Looks great! I have a couple of comments, but nothing to block the PR.