Priority: P1
Description
Leverage OpenTelemetry to enable effective observability for ChatQnA and AgentQnA on TGI and vLLM. OPEA integrated OpenTelemetry Protocol (OTLP) tracing in v1.2. In v1.3, we need to enable tracing for ChatQnA and AgentQnA and related components such as the orchestrator, embedding, LLMs, reranking, and retriever; export tracing data to Prometheus; and visualize it with Grafana dashboards.
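
For reference, the snippet below is a minimal sketch of what OTLP tracing setup can look like in a Python microservice, using the standard opentelemetry-sdk API. The service name and collector endpoint are illustrative placeholders, not the actual OPEA configuration.

```python
# Minimal sketch (not the OPEA implementation): configure an OTLP span
# exporter so a microservice can emit traces to a collector.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# "chatqna-orchestrator" and the endpoint below are hypothetical values.
resource = Resource.create({"service.name": "chatqna-orchestrator"})
provider = TracerProvider(resource=resource)
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://otel-collector:4318/v1/traces"))
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("retrieval"):
    pass  # e.g. call the retriever microservice here
```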
Tracing
Components
- Add OpenTelemetry tracing into the OPEA DAG and the microservice code paths related to ChatQnA (GenAIComps#1122)
- [Telemetry] Use the existing env variable instead of introducing a new one (GenAIComps#1251)
- Update requirements-hpu.txt for OpenTelemetry tracing support (HabanaAI/vllm-fork#857, optional)
- Enable telemetry tracing in the Agent component and add the class name along with the function name as the span name for tracing (GenAIComps#1503); a sketch of the span-naming idea follows this list
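
To illustrate the span-naming change tracked in GenAIComps#1503, here is a hedged sketch of a decorator that builds the span name from the class name plus the function name. The traced helper and Retriever class are hypothetical examples, not the GenAIComps code.

```python
# Illustrative only: name spans '<ClassName>.<func_name>' when tracing methods.
import functools
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

def traced(func):
    """Wrap a method in a span named after its class and function."""
    @functools.wraps(func)
    def wrapper(self, *args, **kwargs):
        span_name = f"{type(self).__name__}.{func.__name__}"
        with tracer.start_as_current_span(span_name):
            return func(self, *args, **kwargs)
    return wrapper

class Retriever:  # hypothetical component
    @traced
    def query(self, text):
        return ["doc-1", "doc-2"]  # placeholder result

Retriever().query("what is OPEA?")  # emits a span named 'Retriever.query'
```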
Examples
Data Visualization
Deployment
Documentation