[Bug] ChatQnA cannot support configuring llm protocol #1504

@Yugar-1

Description

Priority

P2-High

OS type

Ubuntu

Hardware type

Xeon-GNR

Installation method

  • Pull docker images from hub.docker.com
  • Build docker images from source
  • Other

Deploy method

  • Docker
  • Docker Compose
  • Kubernetes Helm Charts
  • Kubernetes GMC
  • Other

Running nodes

Single Node

What's the version?

latest

Description

When ChatQnA connects to an authenticated LLM service endpoint, that endpoint may be served over https. However, the LLM service protocol currently defaults to http and cannot be configured:

llm = MicroService(
    name="llm",
    host=LLM_SERVER_HOST_IP,
    port=LLM_SERVER_PORT,
    endpoint="/v1/chat/completions",
    use_remote_service=True,
    service_type=ServiceType.LLM,
)

Reproduce steps

n/a

Raw log

Attachments

No response

Metadata

Labels

A2 ready to fix, bug (Something isn't working)

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
