
[Bug] ChatQnA cannot support authorized LLM service #1453

@gavinlichn


Priority

P2-High

OS type

Ubuntu

Hardware type

Xeon-GNR

Installation method

  • Pull docker images from hub.docker.com
  • Build docker images from source
  • Other

Deploy method

  • Docker
  • Docker Compose
  • Kubernetes Helm Charts
  • Kubernetes GMC
  • Other

Running nodes

Single Node

What's the version?

v1.2

Description

For the ChatQnA without-wrapper flavor, the megaservice does not support an authorized LLM service (or other authorized services).
Previously the token / OpenAI API key was handled by the LLM microservice wrapper, so the default (no-wrapper) flavor cannot reach an authorized LLM so far.

Modifying the mega code as below resolves the issue, but a proper design is needed so that it stays compatible with all examples and all services.

diff --git a/comps/cores/mega/orchestrator.py b/comps/cores/mega/orchestrator.py
index 8a75f9cf..3fa7f99e 100644
--- a/comps/cores/mega/orchestrator.py
+++ b/comps/cores/mega/orchestrator.py
@@ -196,7 +196,10 @@ class ServiceOrchestrator(DAG):
             response = requests.post(
                 url=endpoint,
                 data=json.dumps(inputs),
-                headers={"Content-type": "application/json"},
+                headers={
+                    "Content-type": "application/json",
+                    "Authorization": "Bearer API_KEY_TOKENS"
+                    },
                 proxies={"http": None},
                 stream=True,
                 timeout=1000,
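
As one possible direction for the "compatible with all examples and all services" requirement, below is a minimal sketch (not the project's actual design) of how the orchestrator could attach the Authorization header only when a key is configured, instead of hard-coding it. The environment variable name LLM_API_KEY and the helper functions are assumptions for illustration.

import json
import os

import requests


def build_headers() -> dict:
    # Hypothetical sketch: read an optional API key from the environment so that
    # deployments without an authorized LLM service keep working unchanged.
    headers = {"Content-type": "application/json"}
    api_key = os.environ.get("LLM_API_KEY")  # assumed variable name, not in the repo
    if api_key:
        # Attach the bearer token only when a key is actually configured.
        headers["Authorization"] = f"Bearer {api_key}"
    return headers


def post_to_service(endpoint: str, inputs: dict) -> requests.Response:
    # Mirrors the orchestrator's existing requests.post call, but with the
    # headers built conditionally above.
    return requests.post(
        url=endpoint,
        data=json.dumps(inputs),
        headers=build_headers(),
        proxies={"http": None},
        stream=True,
        timeout=1000,
    )

With such an approach, an authorized deployment would only need to export the key (e.g. export LLM_API_KEY=...) before starting the megaservice, while existing unauthorized setups would be unaffected.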

Reproduce steps

n/a

Raw log

Attachments

No response
