This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Extract packages in both input and output using the LLM the user called with #214

Merged 1 commit on Dec 5, 2024

Conversation

@jhrozek (Contributor) commented on Dec 5, 2024

Instead of always using the local LLM to extract the packages, use the
LLM the user called with.

This is implemented in a new module, llm_utils. The implementation is a bit of a kludge, as it really should reuse more of the providers, but because of time constraints let's start this way and refactor later.

The important thing is the PackageExtractor class that uses the provided
string to extract packages out of the prompt.
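As a rough illustration of the approach described above, here is a minimal, hypothetical sketch of a PackageExtractor that delegates to whichever LLM the user's request targeted. Only the names PackageExtractor and llm_utils come from the PR; the completion-callable interface and the JSON prompt are assumptions, not the actual codegate API.

```python
# Hypothetical sketch only: extract package names from a prompt using
# a completion callable that proxies to the LLM the user called with.
import json
import re


class PackageExtractor:
    """Extracts package names from a prompt via the caller's LLM."""

    # Assumed system prompt; the real prompt in codegate may differ.
    SYSTEM_PROMPT = (
        "List every software package mentioned in the user's text "
        "as a JSON array of strings. Return [] if none."
    )

    def __init__(self, complete):
        # `complete` is any callable(system, user) -> str that forwards
        # the request to the provider/model the user originally called.
        self._complete = complete

    def extract_packages(self, text):
        raw = self._complete(self.SYSTEM_PROMPT, text)
        # Tolerate extra prose around the JSON array in the reply.
        match = re.search(r"\[.*\]", raw, re.DOTALL)
        return json.loads(match.group(0)) if match else []


# Usage with a stubbed completion function standing in for the LLM:
def fake_llm(system, user):
    return '["requests", "numpy"]'

extractor = PackageExtractor(fake_llm)
print(extractor.extract_packages("please install requests and numpy"))
```

Passing the completion callable in, rather than hard-coding a local model, is what lets the same extractor run against whichever provider the user's request already targeted.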

@lukehinds left a comment:

Thanks @jhrozek, LGTM.

import structlog

from codegate.codegate_logging import LogFormat, LogLevel, setup_logging
from codegate.config import Config, ConfigurationError
from codegate.db.connection import init_db_sync
from codegate.server import init_app
from src.codegate.storage.utils import restore_storage_backup
Contributor commented:
We should not use "src" in the package name here. It should be: "from codegate.storage.utils ..."

@@ -0,0 +1,4 @@
from src.codegate.llm_utils.extractor import PackageExtractor
from src.codegate.llm_utils.llmclient import LLMClient
Contributor commented:
Same comment as above re: "src".

Contributor Author commented:
Will fix in a follow-up.


class LLMClient:
"""
Base class for LLM interactions handling both local and cloud providers.
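The snippet above is truncated; as a hedged sketch of the idea it describes, the base class might dispatch a chat request to whichever provider the user called with. The provider-mapping constructor and chat signature below are assumptions for illustration, not the real codegate interface.

```python
# Hypothetical sketch only: dispatch chat requests to the provider the
# user's request targeted (local or cloud), looked up by name.
class LLMClient:
    """Base class for LLM interactions handling both local and cloud providers."""

    def __init__(self, providers):
        # providers: mapping of provider name -> callable(messages) -> str.
        # Stand-ins for the real codegate provider objects.
        self._providers = providers

    def chat(self, provider_name, messages):
        try:
            provider = self._providers[provider_name]
        except KeyError:
            raise ValueError(f"unknown provider: {provider_name}")
        return provider(messages)


# Usage with a stubbed "local" provider:
client = LLMClient({"local": lambda msgs: "ok: " + msgs[-1]["content"]})
print(client.chat("local", [{"role": "user", "content": "hello"}]))
```

A chat-only interface like this matches the reviewer's observation below; FIM/code-completion support would need a separate entry point, since completion requests carry a prefix/suffix rather than a message list.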
Contributor commented:
This class only handles chat requests currently. Do we need to implement code-completion support in it eventually?

Contributor Author replied:
Yes

Contributor Author added:
For FIM pipelines we will need it.

@ptelang left a comment:
Looks great! I have a couple of comments, but nothing to block the PR.

@jhrozek jhrozek merged commit d15ba88 into stacklok:main Dec 5, 2024