From 2a18013cc6cf25ec59262a4862f7cee2fd13c031 Mon Sep 17 00:00:00 2001 From: letonghan Date: Tue, 14 Jan 2025 10:21:45 +0800 Subject: [PATCH 01/15] clean retrievers docker image in docker_images_list Signed-off-by: letonghan --- docker_images_list.md | 11 ++--------- 1 file changed, 2 insertions(+), 9 deletions(-) diff --git a/docker_images_list.md b/docker_images_list.md index dd934ae827..bb5f0d5454 100644 --- a/docker_images_list.md +++ b/docker_images_list.md @@ -2,7 +2,7 @@ A list of released OPEA docker images in https://hub.docker.com/, contains all relevant images from the GenAIExamples, GenAIComps and GenAIInfra projects. Please expect more public available images in the future release. -Take ChatQnA for example. ChatQnA is a chatbot application service based on the Retrieval Augmented Generation (RAG) architecture. It consists of [opea/embedding](https://hub.docker.com/r/opea/embedding), [opea/retriever-redis](https://hub.docker.com/r/opea/retriever-redis), [opea/reranking-tei](https://hub.docker.com/r/opea/reranking-tei), [opea/llm-textgen](https://hub.docker.com/r/opea/llm-textgen), [opea/dataprep-redis](https://hub.docker.com/r/opea/dataprep-redis), [opea/chatqna](https://hub.docker.com/r/opea/chatqna), [opea/chatqna-ui](https://hub.docker.com/r/opea/chatqna-ui) and [opea/chatqna-conversation-ui](https://hub.docker.com/r/opea/chatqna-conversation-ui) (Optional) multiple microservices. Other services are similar, see the corresponding README for details. +Take ChatQnA for example. ChatQnA is a chatbot application service based on the Retrieval Augmented Generation (RAG) architecture. 
It consists of multiple microservices: [opea/embedding](https://hub.docker.com/r/opea/embedding), [opea/retriever](https://hub.docker.com/r/opea/retriever), [opea/reranking-tei](https://hub.docker.com/r/opea/reranking-tei), [opea/llm-textgen](https://hub.docker.com/r/opea/llm-textgen), [opea/dataprep-redis](https://hub.docker.com/r/opea/dataprep-redis), [opea/chatqna](https://hub.docker.com/r/opea/chatqna), [opea/chatqna-ui](https://hub.docker.com/r/opea/chatqna-ui) and, optionally, [opea/chatqna-conversation-ui](https://hub.docker.com/r/opea/chatqna-conversation-ui). Other services are similar; see the corresponding README for details. ## Example images @@ -80,14 +80,7 @@ Take ChatQnA for example. ChatQnA is a chatbot application service based on the | [opea/nginx](https://hub.docker.com/r/opea/nginx) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/third_parties/nginx/src/Dockerfile) | The docker image exposed the OPEA nginx microservice for GenAI application use | | [opea/promptregistry-mongo-server](https://hub.docker.com/r/opea/promptregistry-mongo-server) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/prompt_registry/src/Dockerfile) | The docker image exposes the OPEA Prompt Registry microservices which based on MongoDB database, designed to store and retrieve user's preferred prompts | | [opea/reranking]() | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/rerankings/src/Dockerfile) | The docker image exposed the OPEA reranking microservice based on tei docker image for GenAI application use | -| [opea/retriever-milvus](https://hub.docker.com/r/opea/retriever-milvus) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/milvus/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on milvus vectordb for GenAI application use | -| 
[opea/retriever-pathway](https://hub.docker.com/r/opea/retriever-pathway) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/pathway/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice with pathway for GenAI application use | -| [opea/retriever-pgvector](https://hub.docker.com/r/opea/retriever-pgvector) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/pgvector/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on pgvector vectordb for GenAI application use | -| [opea/retriever-pinecone](https://hub.docker.com/r/opea/retriever-pinecone) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/pinecone/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on pinecone vectordb for GenAI application use | -| [opea/retriever-qdrant](https://hub.docker.com/r/opea/retriever-qdrant) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/qdrant/haystack/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on qdrant vectordb for GenAI application use | -| [opea/retriever-redis](https://hub.docker.com/r/opea/retriever-redis) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/redis/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on redis vectordb for GenAI application use | -| [opea/retriever-redis-llamaindex](https://hub.docker.com/r/opea/retriever-redis-llamaindex) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/redis/llama_index/Dockerfile) | The docker image exposed the OPEA retriever service based on LlamaIndex for GenAI application use | -| [opea/retriever-vdms](https://hub.docker.com/r/opea/retriever-vdms) | 
[Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/vdms/langchain/Dockerfile) | The docker image exposed the OPEA retriever service based on Visual Data Management System for GenAI application use | +| [opea/retriever]() | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/src/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on milvus vectordb for GenAI application use | | [opea/speecht5](https://hub.docker.com/r/opea/speecht5) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/tts/src/integrations/dependency/speecht5/Dockerfile) | The docker image exposed the OPEA SpeechT5 service for GenAI application use | | [opea/speecht5-gaudi](https://hub.docker.com/r/opea/speecht5-gaudi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/tts/src/integrations/dependency/speecht5/Dockerfile.intel_hpu) | The docker image exposed the OPEA SpeechT5 service on Gaudi2 for GenAI application use | | [opea/tei-gaudi](https://hub.docker.com/r/opea/tei-gaudi/tags) | [Link](https://github.com/huggingface/tei-gaudi/blob/habana-main/Dockerfile-hpu) | The docker image powered by HuggingFace Text Embedding Inference (TEI) on Gaudi2 for deploying and serving Embedding Models | From 52dcd0da014eccb38626d56c4e5e6d60fb80ad5b Mon Sep 17 00:00:00 2001 From: letonghan Date: Tue, 14 Jan 2025 10:28:43 +0800 Subject: [PATCH 02/15] modify retriever docker image path in READMEs Signed-off-by: letonghan --- ChatQnA/docker_compose/amd/gpu/rocm/README.md | 2 +- ChatQnA/docker_compose/intel/cpu/aipc/README.md | 2 +- ChatQnA/docker_compose/intel/cpu/xeon/README.md | 2 +- ChatQnA/docker_compose/intel/hpu/gaudi/README.md | 2 +- ChatQnA/docker_compose/nvidia/gpu/README.md | 2 +- DocIndexRetriever/docker_compose/intel/cpu/xeon/README.md | 2 +- DocIndexRetriever/docker_compose/intel/hpu/gaudi/README.md | 2 
+- MultimodalQnA/docker_compose/amd/gpu/rocm/README.md | 2 +- MultimodalQnA/docker_compose/intel/cpu/xeon/README.md | 2 +- MultimodalQnA/docker_compose/intel/hpu/gaudi/README.md | 2 +- ProductivitySuite/docker_compose/intel/cpu/xeon/README.md | 2 +- VideoQnA/docker_compose/intel/cpu/xeon/README.md | 2 +- 12 files changed, 12 insertions(+), 12 deletions(-) diff --git a/ChatQnA/docker_compose/amd/gpu/rocm/README.md b/ChatQnA/docker_compose/amd/gpu/rocm/README.md index 400cf325d3..0d477b5a6b 100644 --- a/ChatQnA/docker_compose/amd/gpu/rocm/README.md +++ b/ChatQnA/docker_compose/amd/gpu/rocm/README.md @@ -94,7 +94,7 @@ cd GenAIComps ### 2. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile . +docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 3. Build Dataprep Image diff --git a/ChatQnA/docker_compose/intel/cpu/aipc/README.md b/ChatQnA/docker_compose/intel/cpu/aipc/README.md index 860629fa46..2b6455dd07 100644 --- a/ChatQnA/docker_compose/intel/cpu/aipc/README.md +++ b/ChatQnA/docker_compose/intel/cpu/aipc/README.md @@ -21,7 +21,7 @@ export https_proxy="Your_HTTPs_Proxy" ### 1. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile . +docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 2. 
Build Dataprep Image diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/README.md b/ChatQnA/docker_compose/intel/cpu/xeon/README.md index 91aa867897..d5b9fa3eda 100644 --- a/ChatQnA/docker_compose/intel/cpu/xeon/README.md +++ b/ChatQnA/docker_compose/intel/cpu/xeon/README.md @@ -105,7 +105,7 @@ cd GenAIComps ### 1. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile . +docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 2. Build Dataprep Image diff --git a/ChatQnA/docker_compose/intel/hpu/gaudi/README.md b/ChatQnA/docker_compose/intel/hpu/gaudi/README.md index 5276321e6f..d256bcfae9 100644 --- a/ChatQnA/docker_compose/intel/hpu/gaudi/README.md +++ b/ChatQnA/docker_compose/intel/hpu/gaudi/README.md @@ -78,7 +78,7 @@ cd GenAIComps ### 1. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile . +docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 2. Build Dataprep Image diff --git a/ChatQnA/docker_compose/nvidia/gpu/README.md b/ChatQnA/docker_compose/nvidia/gpu/README.md index 4b21130f17..793d049ebc 100644 --- a/ChatQnA/docker_compose/nvidia/gpu/README.md +++ b/ChatQnA/docker_compose/nvidia/gpu/README.md @@ -104,7 +104,7 @@ cd GenAIComps ### 2. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile . 
+docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 3. Build Dataprep Image diff --git a/DocIndexRetriever/docker_compose/intel/cpu/xeon/README.md b/DocIndexRetriever/docker_compose/intel/cpu/xeon/README.md index a4f085e8b0..1d6683e920 100644 --- a/DocIndexRetriever/docker_compose/intel/cpu/xeon/README.md +++ b/DocIndexRetriever/docker_compose/intel/cpu/xeon/README.md @@ -15,7 +15,7 @@ DocRetriever are the most widely adopted use case for leveraging the different m - Retriever Vector store Image ```bash - docker build -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile . + docker build -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` - Rerank TEI Image diff --git a/DocIndexRetriever/docker_compose/intel/hpu/gaudi/README.md b/DocIndexRetriever/docker_compose/intel/hpu/gaudi/README.md index f30d017e8e..bc1db26124 100644 --- a/DocIndexRetriever/docker_compose/intel/hpu/gaudi/README.md +++ b/DocIndexRetriever/docker_compose/intel/hpu/gaudi/README.md @@ -15,7 +15,7 @@ DocRetriever are the most widely adopted use case for leveraging the different m - Retriever Vector store Image ```bash - docker build -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile . + docker build -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . 
``` - Rerank TEI Image diff --git a/MultimodalQnA/docker_compose/amd/gpu/rocm/README.md b/MultimodalQnA/docker_compose/amd/gpu/rocm/README.md index 2e16848a72..a6d38f3d70 100644 --- a/MultimodalQnA/docker_compose/amd/gpu/rocm/README.md +++ b/MultimodalQnA/docker_compose/amd/gpu/rocm/README.md @@ -45,7 +45,7 @@ docker build --no-cache -t opea/lvm-llava:latest --build-arg https_proxy=$https_ ### 3. Build retriever-multimodal-redis Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile . +docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 4. Build dataprep-multimodal-redis Image diff --git a/MultimodalQnA/docker_compose/intel/cpu/xeon/README.md b/MultimodalQnA/docker_compose/intel/cpu/xeon/README.md index 71706732ad..03e5b0bfa5 100644 --- a/MultimodalQnA/docker_compose/intel/cpu/xeon/README.md +++ b/MultimodalQnA/docker_compose/intel/cpu/xeon/README.md @@ -124,7 +124,7 @@ docker build --no-cache -t opea/embedding:latest --build-arg https_proxy=$https_ ### 2. Build retriever-multimodal-redis Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile . +docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 3. 
Build LVM Images diff --git a/MultimodalQnA/docker_compose/intel/hpu/gaudi/README.md b/MultimodalQnA/docker_compose/intel/hpu/gaudi/README.md index 10acba6da0..a148bfcafa 100644 --- a/MultimodalQnA/docker_compose/intel/hpu/gaudi/README.md +++ b/MultimodalQnA/docker_compose/intel/hpu/gaudi/README.md @@ -75,7 +75,7 @@ docker build --no-cache -t opea/embedding:latest --build-arg https_proxy=$https_ ### 2. Build retriever-multimodal-redis Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile . +docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 3. Build LVM Images diff --git a/ProductivitySuite/docker_compose/intel/cpu/xeon/README.md b/ProductivitySuite/docker_compose/intel/cpu/xeon/README.md index 8faa43e3c2..10afc85340 100644 --- a/ProductivitySuite/docker_compose/intel/cpu/xeon/README.md +++ b/ProductivitySuite/docker_compose/intel/cpu/xeon/README.md @@ -19,7 +19,7 @@ docker build --no-cache -t opea/embedding:latest --build-arg https_proxy=$https_ ### 2. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile . +docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 3. Build Rerank Image diff --git a/VideoQnA/docker_compose/intel/cpu/xeon/README.md b/VideoQnA/docker_compose/intel/cpu/xeon/README.md index 921f1175db..3047c527c6 100644 --- a/VideoQnA/docker_compose/intel/cpu/xeon/README.md +++ b/VideoQnA/docker_compose/intel/cpu/xeon/README.md @@ -59,7 +59,7 @@ docker build -t opea/embedding-multimodal-clip:latest --build-arg https_proxy=$h ### 2. 
Build Retriever Image ```bash -docker build -t opea/retriever-vdms:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/vdms/langchain/Dockerfile . +docker build -t opea/retriever-vdms:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 3. Build Reranking Image From 9230fc7d79bdd12952581697586950983ecc80fc Mon Sep 17 00:00:00 2001 From: "pre-commit-ci[bot]" <66853113+pre-commit-ci[bot]@users.noreply.github.com> Date: Tue, 14 Jan 2025 02:31:57 +0000 Subject: [PATCH 03/15] [pre-commit.ci] auto fixes from pre-commit.com hooks for more information, see https://pre-commit.ci --- docker_images_list.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docker_images_list.md b/docker_images_list.md index bb5f0d5454..53f50eb5d6 100644 --- a/docker_images_list.md +++ b/docker_images_list.md @@ -80,7 +80,7 @@ Take ChatQnA for example. ChatQnA is a chatbot application service based on the | [opea/nginx](https://hub.docker.com/r/opea/nginx) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/third_parties/nginx/src/Dockerfile) | The docker image exposed the OPEA nginx microservice for GenAI application use | | [opea/promptregistry-mongo-server](https://hub.docker.com/r/opea/promptregistry-mongo-server) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/prompt_registry/src/Dockerfile) | The docker image exposes the OPEA Prompt Registry microservices which based on MongoDB database, designed to store and retrieve user's preferred prompts | | [opea/reranking]() | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/rerankings/src/Dockerfile) | The docker image exposed the OPEA reranking microservice based on tei docker image for GenAI application use | -| [opea/retriever]() | 
[Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/src/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on milvus vectordb for GenAI application use | +| [opea/retriever]() | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/src/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on milvus vectordb for GenAI application use | | [opea/speecht5](https://hub.docker.com/r/opea/speecht5) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/tts/src/integrations/dependency/speecht5/Dockerfile) | The docker image exposed the OPEA SpeechT5 service for GenAI application use | | [opea/speecht5-gaudi](https://hub.docker.com/r/opea/speecht5-gaudi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/tts/src/integrations/dependency/speecht5/Dockerfile.intel_hpu) | The docker image exposed the OPEA SpeechT5 service on Gaudi2 for GenAI application use | | [opea/tei-gaudi](https://hub.docker.com/r/opea/tei-gaudi/tags) | [Link](https://github.com/huggingface/tei-gaudi/blob/habana-main/Dockerfile-hpu) | The docker image powered by HuggingFace Text Embedding Inference (TEI) on Gaudi2 for deploying and serving Embedding Models | From 421fa6d6141bae6c1a17e3970182c8834c126951 Mon Sep 17 00:00:00 2001 From: letonghan Date: Wed, 15 Jan 2025 11:00:44 +0800 Subject: [PATCH 04/15] specify comps branch for ci test Signed-off-by: letonghan --- .github/workflows/_run-docker-compose.yml | 1 + .github/workflows/pr-docker-compose-e2e.yml | 2 +- .github/workflows/pr-dockerfile-path-and-build-yaml-scan.yml | 1 + 3 files changed, 3 insertions(+), 1 deletion(-) diff --git a/.github/workflows/_run-docker-compose.yml b/.github/workflows/_run-docker-compose.yml index daf87add83..a1eb5464ac 100644 --- a/.github/workflows/_run-docker-compose.yml +++ 
b/.github/workflows/_run-docker-compose.yml @@ -134,6 +134,7 @@ jobs: SERVING_TOKEN: ${{ secrets.SERVING_TOKEN }} IMAGE_REPO: ${{ inputs.registry }} IMAGE_TAG: ${{ inputs.tag }} + opea_branch: "refactor_retrievers" example: ${{ inputs.example }} hardware: ${{ inputs.hardware }} test_case: ${{ matrix.test_case }} diff --git a/.github/workflows/pr-docker-compose-e2e.yml b/.github/workflows/pr-docker-compose-e2e.yml index fe052f90a1..446afa9250 100644 --- a/.github/workflows/pr-docker-compose-e2e.yml +++ b/.github/workflows/pr-docker-compose-e2e.yml @@ -4,7 +4,7 @@ name: E2E test with docker compose on: - pull_request_target: + pull_request: branches: ["main", "*rc"] types: [opened, reopened, ready_for_review, synchronize] # added `ready_for_review` since draft is skipped paths: diff --git a/.github/workflows/pr-dockerfile-path-and-build-yaml-scan.yml b/.github/workflows/pr-dockerfile-path-and-build-yaml-scan.yml index 9fa9d03342..214f340658 100644 --- a/.github/workflows/pr-dockerfile-path-and-build-yaml-scan.yml +++ b/.github/workflows/pr-dockerfile-path-and-build-yaml-scan.yml @@ -22,6 +22,7 @@ jobs: run: | cd .. 
git clone https://github.com/opea-project/GenAIComps.git + cd GenAIComps && git checkout refactor_retrievers - name: Check for Missing Dockerfile Paths in GenAIComps run: | From e838915ceb723883785f6895061fc4f8aaef7bf0 Mon Sep 17 00:00:00 2001 From: letonghan Date: Wed, 15 Jan 2025 11:21:23 +0800 Subject: [PATCH 05/15] modify all related image build path & unify retrievers image name Signed-off-by: letonghan --- .../docker_compose/amd/gpu/rocm/compose.yaml | 4 +++- .../docker_compose/intel/cpu/aipc/compose.yaml | 4 +++- .../intel/cpu/xeon/README_pinecone.md | 2 +- .../intel/cpu/xeon/README_qdrant.md | 2 +- .../docker_compose/intel/cpu/xeon/compose.yaml | 4 +++- .../intel/cpu/xeon/compose_pinecone.yaml | 4 +++- .../intel/cpu/xeon/compose_qdrant.yaml | 8 +++++--- .../intel/cpu/xeon/compose_vllm.yaml | 4 +++- .../intel/cpu/xeon/compose_without_rerank.yaml | 4 +++- .../intel/hpu/gaudi/compose.yaml | 4 +++- .../intel/hpu/gaudi/compose_guardrails.yaml | 4 +++- .../intel/hpu/gaudi/compose_vllm.yaml | 4 +++- .../hpu/gaudi/compose_without_rerank.yaml | 4 +++- ChatQnA/docker_compose/nvidia/gpu/compose.yaml | 4 +++- ChatQnA/docker_image_build/build.yaml | 18 +++--------------- .../docker_compose/intel/cpu/xeon/compose.yaml | 3 ++- .../intel/cpu/xeon/compose_without_rerank.yaml | 3 ++- .../intel/hpu/gaudi/compose.yaml | 3 ++- .../docker_image_build/build.yaml | 6 +++--- GraphRAG/docker_image_build/build.yaml | 6 +++--- .../docker_compose/amd/gpu/rocm/compose.yaml | 5 +++-- .../docker_compose/intel/cpu/xeon/compose.yaml | 5 +++-- .../intel/hpu/gaudi/compose.yaml | 5 +++-- MultimodalQnA/docker_image_build/build.yaml | 6 +++--- .../docker_compose/intel/cpu/xeon/compose.yaml | 4 ++-- .../docker_image_build/build.yaml | 6 +++--- .../docker_compose/intel/cpu/xeon/compose.yaml | 8 +++++--- VideoQnA/docker_image_build/build.yaml | 6 +++--- 28 files changed, 80 insertions(+), 60 deletions(-) diff --git a/ChatQnA/docker_compose/amd/gpu/rocm/compose.yaml 
b/ChatQnA/docker_compose/amd/gpu/rocm/compose.yaml index dd0c4ddc7e..1a7b9ad9b4 100644 --- a/ChatQnA/docker_compose/amd/gpu/rocm/compose.yaml +++ b/ChatQnA/docker_compose/amd/gpu/rocm/compose.yaml @@ -49,7 +49,7 @@ services: security_opt: - seccomp:unconfined chatqna-retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: chatqna-retriever-redis-server depends_on: - chatqna-redis-vector-db @@ -63,6 +63,8 @@ services: REDIS_URL: ${CHATQNA_REDIS_URL} INDEX_NAME: ${CHATQNA_INDEX_NAME} TEI_EMBEDDING_ENDPOINT: ${CHATQNA_TEI_EMBEDDING_ENDPOINT} + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped chatqna-tei-reranking-service: image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 diff --git a/ChatQnA/docker_compose/intel/cpu/aipc/compose.yaml b/ChatQnA/docker_compose/intel/cpu/aipc/compose.yaml index f2fe08c833..7cc74b0f37 100644 --- a/ChatQnA/docker_compose/intel/cpu/aipc/compose.yaml +++ b/ChatQnA/docker_compose/intel/cpu/aipc/compose.yaml @@ -39,7 +39,7 @@ services: https_proxy: ${https_proxy} command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -55,6 +55,8 @@ services: INDEX_NAME: ${INDEX_NAME} TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80 HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tei-reranking-service: image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/README_pinecone.md b/ChatQnA/docker_compose/intel/cpu/xeon/README_pinecone.md index cd1737d420..d7b307d1de 100644 --- a/ChatQnA/docker_compose/intel/cpu/xeon/README_pinecone.md +++ 
b/ChatQnA/docker_compose/intel/cpu/xeon/README_pinecone.md @@ -352,7 +352,7 @@ click [here](https://raw.githubusercontent.com/opea-project/GenAIComps/v1.1/comp Or run this command to get the file on a terminal. ```bash -wget https://raw.githubusercontent.com/opea-project/GenAIComps/main/comps/retrievers/redis/data/nke-10k-2023.pdf +wget https://raw.githubusercontent.com/opea-project/GenAIComps/v1.1/comps/retrievers/redis/data/nke-10k-2023.pdf ``` diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/README_qdrant.md b/ChatQnA/docker_compose/intel/cpu/xeon/README_qdrant.md index 7cb4241ee3..c7813ed6ab 100644 --- a/ChatQnA/docker_compose/intel/cpu/xeon/README_qdrant.md +++ b/ChatQnA/docker_compose/intel/cpu/xeon/README_qdrant.md @@ -73,7 +73,7 @@ cd GenAIComps ### 1. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-qdrant:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/qdrant/haystack/Dockerfile . +docker build --no-cache -t opea/retriever-qdrant:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 2. 
Build Dataprep Image diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/compose.yaml b/ChatQnA/docker_compose/intel/cpu/xeon/compose.yaml index 0c290b8683..6e94a9f998 100644 --- a/ChatQnA/docker_compose/intel/cpu/xeon/compose.yaml +++ b/ChatQnA/docker_compose/intel/cpu/xeon/compose.yaml @@ -39,7 +39,7 @@ services: https_proxy: ${https_proxy} command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -55,6 +55,8 @@ services: INDEX_NAME: ${INDEX_NAME} TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80 HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tei-reranking-service: image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/compose_pinecone.yaml b/ChatQnA/docker_compose/intel/cpu/xeon/compose_pinecone.yaml index f42fd6fd2d..d72b49363f 100644 --- a/ChatQnA/docker_compose/intel/cpu/xeon/compose_pinecone.yaml +++ b/ChatQnA/docker_compose/intel/cpu/xeon/compose_pinecone.yaml @@ -37,7 +37,7 @@ services: https_proxy: ${https_proxy} command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate retriever: - image: ${REGISTRY:-opea}/retriever-pinecone:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-pinecone-server ports: - "7000:7000" @@ -51,6 +51,8 @@ services: LANGCHAIN_API_KEY: ${LANGCHAIN_API_KEY} TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80 HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_PINECONE" restart: unless-stopped tei-reranking-service: image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/compose_qdrant.yaml 
b/ChatQnA/docker_compose/intel/cpu/xeon/compose_qdrant.yaml index ad7df8fa79..af69531c21 100644 --- a/ChatQnA/docker_compose/intel/cpu/xeon/compose_qdrant.yaml +++ b/ChatQnA/docker_compose/intel/cpu/xeon/compose_qdrant.yaml @@ -22,8 +22,8 @@ services: https_proxy: ${https_proxy} QDRANT_HOST: qdrant-vector-db QDRANT_PORT: 6333 - COLLECTION_NAME: ${INDEX_NAME} - TEI_ENDPOINT: http://tei-embedding-service:80 + QDRANT_INDEX_NAME: ${INDEX_NAME} + TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80 HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} tei-embedding-service: image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 @@ -39,7 +39,7 @@ services: https_proxy: ${https_proxy} command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate retriever: - image: ${REGISTRY:-opea}/retriever-qdrant:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-qdrant-server depends_on: - qdrant-vector-db @@ -54,6 +54,8 @@ services: QDRANT_PORT: 6333 INDEX_NAME: ${INDEX_NAME} TEI_EMBEDDING_ENDPOINT: ${TEI_EMBEDDING_ENDPOINT} + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_QDRANT" restart: unless-stopped tei-reranking-service: image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/compose_vllm.yaml b/ChatQnA/docker_compose/intel/cpu/xeon/compose_vllm.yaml index 33725f47ec..f34868b6de 100644 --- a/ChatQnA/docker_compose/intel/cpu/xeon/compose_vllm.yaml +++ b/ChatQnA/docker_compose/intel/cpu/xeon/compose_vllm.yaml @@ -39,7 +39,7 @@ services: https_proxy: ${https_proxy} command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -55,6 +55,8 @@ services: INDEX_NAME: ${INDEX_NAME} TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80 HUGGINGFACEHUB_API_TOKEN: 
${HUGGINGFACEHUB_API_TOKEN} + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tei-reranking-service: image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/compose_without_rerank.yaml b/ChatQnA/docker_compose/intel/cpu/xeon/compose_without_rerank.yaml index 938a6690d3..92d7fcf7bc 100644 --- a/ChatQnA/docker_compose/intel/cpu/xeon/compose_without_rerank.yaml +++ b/ChatQnA/docker_compose/intel/cpu/xeon/compose_without_rerank.yaml @@ -39,7 +39,7 @@ services: https_proxy: ${https_proxy} command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -55,6 +55,8 @@ services: INDEX_NAME: ${INDEX_NAME} TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80 HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tgi-service: image: ghcr.io/huggingface/text-generation-inference:2.4.0-intel-cpu diff --git a/ChatQnA/docker_compose/intel/hpu/gaudi/compose.yaml b/ChatQnA/docker_compose/intel/hpu/gaudi/compose.yaml index 8748a31b44..cc75704aef 100644 --- a/ChatQnA/docker_compose/intel/hpu/gaudi/compose.yaml +++ b/ChatQnA/docker_compose/intel/hpu/gaudi/compose.yaml @@ -40,7 +40,7 @@ services: https_proxy: ${https_proxy} command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate --otlp-endpoint $OTEL_EXPORTER_OTLP_TRACES_ENDPOINT retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -57,6 +57,8 @@ services: TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80 HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} TELEMETRY_ENDPOINT: ${TELEMETRY_ENDPOINT} + 
LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tei-reranking-service: image: ghcr.io/huggingface/tei-gaudi:1.5.0 diff --git a/ChatQnA/docker_compose/intel/hpu/gaudi/compose_guardrails.yaml b/ChatQnA/docker_compose/intel/hpu/gaudi/compose_guardrails.yaml index 55230d5829..4f062dce3f 100644 --- a/ChatQnA/docker_compose/intel/hpu/gaudi/compose_guardrails.yaml +++ b/ChatQnA/docker_compose/intel/hpu/gaudi/compose_guardrails.yaml @@ -78,7 +78,7 @@ services: https_proxy: ${https_proxy} command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -94,6 +94,8 @@ services: INDEX_NAME: ${INDEX_NAME} TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80 HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tei-reranking-service: image: ghcr.io/huggingface/tei-gaudi:1.5.0 diff --git a/ChatQnA/docker_compose/intel/hpu/gaudi/compose_vllm.yaml b/ChatQnA/docker_compose/intel/hpu/gaudi/compose_vllm.yaml index 50e2f00591..5c7bd8e0d2 100644 --- a/ChatQnA/docker_compose/intel/hpu/gaudi/compose_vllm.yaml +++ b/ChatQnA/docker_compose/intel/hpu/gaudi/compose_vllm.yaml @@ -39,7 +39,7 @@ services: https_proxy: ${https_proxy} command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -55,6 +55,8 @@ services: INDEX_NAME: ${INDEX_NAME} TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80 HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tei-reranking-service: image: 
ghcr.io/huggingface/tei-gaudi:1.5.0 diff --git a/ChatQnA/docker_compose/intel/hpu/gaudi/compose_without_rerank.yaml b/ChatQnA/docker_compose/intel/hpu/gaudi/compose_without_rerank.yaml index 524b44c1a0..8da9ecc0e4 100644 --- a/ChatQnA/docker_compose/intel/hpu/gaudi/compose_without_rerank.yaml +++ b/ChatQnA/docker_compose/intel/hpu/gaudi/compose_without_rerank.yaml @@ -39,7 +39,7 @@ services: https_proxy: ${https_proxy} command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -55,6 +55,8 @@ services: INDEX_NAME: ${INDEX_NAME} TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80 HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tgi-service: image: ghcr.io/huggingface/tgi-gaudi:2.0.6 diff --git a/ChatQnA/docker_compose/nvidia/gpu/compose.yaml b/ChatQnA/docker_compose/nvidia/gpu/compose.yaml index ba504c2eb3..40f45491c8 100644 --- a/ChatQnA/docker_compose/nvidia/gpu/compose.yaml +++ b/ChatQnA/docker_compose/nvidia/gpu/compose.yaml @@ -40,7 +40,7 @@ services: https_proxy: ${https_proxy} command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -55,6 +55,8 @@ services: REDIS_HOST: redis-vector-db INDEX_NAME: ${INDEX_NAME} TEI_EMBEDDING_ENDPOINT: ${TEI_EMBEDDING_ENDPOINT} + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tei-reranking-service: image: ghcr.io/huggingface/text-embeddings-inference:1.5 diff --git a/ChatQnA/docker_image_build/build.yaml b/ChatQnA/docker_image_build/build.yaml index ac85d0ab07..7ae42b6029 100644 --- 
a/ChatQnA/docker_image_build/build.yaml +++ b/ChatQnA/docker_image_build/build.yaml @@ -47,24 +47,12 @@ services: dockerfile: comps/embeddings/src/Dockerfile extends: chatqna image: ${REGISTRY:-opea}/embedding:${TAG:-latest} - retriever-redis: + retriever: build: context: GenAIComps - dockerfile: comps/retrievers/redis/langchain/Dockerfile + dockerfile: comps/retrievers/src/Dockerfile extends: chatqna - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} - retriever-qdrant: - build: - context: GenAIComps - dockerfile: comps/retrievers/qdrant/haystack/Dockerfile - extends: chatqna - image: ${REGISTRY:-opea}/retriever-qdrant:${TAG:-latest} - retriever-pinecone: - build: - context: GenAIComps - dockerfile: comps/retrievers/pinecone/langchain/Dockerfile - extends: chatqna - image: ${REGISTRY:-opea}/retriever-pinecone:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} reranking: build: context: GenAIComps diff --git a/DocIndexRetriever/docker_compose/intel/cpu/xeon/compose.yaml b/DocIndexRetriever/docker_compose/intel/cpu/xeon/compose.yaml index 206bdfb11b..6384312e9b 100644 --- a/DocIndexRetriever/docker_compose/intel/cpu/xeon/compose.yaml +++ b/DocIndexRetriever/docker_compose/intel/cpu/xeon/compose.yaml @@ -67,7 +67,7 @@ services: LOGFLAG: ${LOGFLAG} restart: unless-stopped retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -83,6 +83,7 @@ services: HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} TEI_EMBEDDING_ENDPOINT: ${TEI_EMBEDDING_ENDPOINT} LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tei-reranking-service: image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 diff --git a/DocIndexRetriever/docker_compose/intel/cpu/xeon/compose_without_rerank.yaml b/DocIndexRetriever/docker_compose/intel/cpu/xeon/compose_without_rerank.yaml index 
a0a3e7d726..81baf2da3a 100644 --- a/DocIndexRetriever/docker_compose/intel/cpu/xeon/compose_without_rerank.yaml +++ b/DocIndexRetriever/docker_compose/intel/cpu/xeon/compose_without_rerank.yaml @@ -67,7 +67,7 @@ services: LOGFLAG: ${LOGFLAG} restart: unless-stopped retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -83,6 +83,7 @@ services: HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80 LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped doc-index-retriever-server: image: ${REGISTRY:-opea}/doc-index-retriever:${TAG:-latest} diff --git a/DocIndexRetriever/docker_compose/intel/hpu/gaudi/compose.yaml b/DocIndexRetriever/docker_compose/intel/hpu/gaudi/compose.yaml index 903bb8d635..a73970f36c 100644 --- a/DocIndexRetriever/docker_compose/intel/hpu/gaudi/compose.yaml +++ b/DocIndexRetriever/docker_compose/intel/hpu/gaudi/compose.yaml @@ -72,7 +72,7 @@ services: LOGFLAG: ${LOGFLAG} restart: unless-stopped retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -86,6 +86,7 @@ services: REDIS_URL: ${REDIS_URL} INDEX_NAME: ${INDEX_NAME} LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tei-reranking-service: image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 diff --git a/DocIndexRetriever/docker_image_build/build.yaml b/DocIndexRetriever/docker_image_build/build.yaml index 956c46fe48..4619a9962d 100644 --- a/DocIndexRetriever/docker_image_build/build.yaml +++ b/DocIndexRetriever/docker_image_build/build.yaml @@ -17,12 +17,12 @@ services: dockerfile: comps/embeddings/src/Dockerfile extends: doc-index-retriever image: 
${REGISTRY:-opea}/embedding:${TAG:-latest} - retriever-redis: + retriever: build: context: GenAIComps - dockerfile: comps/retrievers/redis/langchain/Dockerfile + dockerfile: comps/retrievers/src/Dockerfile extends: doc-index-retriever - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} reranking: build: context: GenAIComps diff --git a/GraphRAG/docker_image_build/build.yaml b/GraphRAG/docker_image_build/build.yaml index 0be2bcb523..870b15a674 100644 --- a/GraphRAG/docker_image_build/build.yaml +++ b/GraphRAG/docker_image_build/build.yaml @@ -11,15 +11,15 @@ services: context: ../ dockerfile: ./Dockerfile image: ${REGISTRY:-opea}/graphrag:${TAG:-latest} - retriever-neo4j-llamaindex: + retriever: build: args: http_proxy: ${http_proxy} https_proxy: ${https_proxy} no_proxy: ${no_proxy} context: GenAIComps - dockerfile: comps/retrievers/neo4j/llama_index/Dockerfile - image: ${REGISTRY:-opea}/retriever-neo4j-llamaindex:${TAG:-latest} + dockerfile: comps/retrievers/src/Dockerfile + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} dataprep-neo4j-llamaindex: build: args: diff --git a/MultimodalQnA/docker_compose/amd/gpu/rocm/compose.yaml b/MultimodalQnA/docker_compose/amd/gpu/rocm/compose.yaml index bea1632c63..e38f175f94 100644 --- a/MultimodalQnA/docker_compose/amd/gpu/rocm/compose.yaml +++ b/MultimodalQnA/docker_compose/amd/gpu/rocm/compose.yaml @@ -73,7 +73,7 @@ services: MULTIMODAL_EMBEDDING: true restart: unless-stopped retriever-redis: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis depends_on: - redis-vector-db @@ -87,7 +87,8 @@ services: REDIS_URL: ${REDIS_URL} INDEX_NAME: ${INDEX_NAME} BRIDGE_TOWER_EMBEDDING: ${BRIDGE_TOWER_EMBEDDING} - RETRIEVER_TYPE: "redis" + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tgi-rocm: image: 
ghcr.io/huggingface/text-generation-inference:3.0.1-rocm diff --git a/MultimodalQnA/docker_compose/intel/cpu/xeon/compose.yaml b/MultimodalQnA/docker_compose/intel/cpu/xeon/compose.yaml index d865d0e41c..48c40f3bb3 100644 --- a/MultimodalQnA/docker_compose/intel/cpu/xeon/compose.yaml +++ b/MultimodalQnA/docker_compose/intel/cpu/xeon/compose.yaml @@ -73,7 +73,7 @@ services: MULTIMODAL_EMBEDDING: true restart: unless-stopped retriever-redis: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis depends_on: - redis-vector-db @@ -87,7 +87,8 @@ services: REDIS_URL: ${REDIS_URL} INDEX_NAME: ${INDEX_NAME} BRIDGE_TOWER_EMBEDDING: ${BRIDGE_TOWER_EMBEDDING} - RETRIEVER_TYPE: "redis" + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped lvm-llava: image: ${REGISTRY:-opea}/lvm-llava:${TAG:-latest} diff --git a/MultimodalQnA/docker_compose/intel/hpu/gaudi/compose.yaml b/MultimodalQnA/docker_compose/intel/hpu/gaudi/compose.yaml index 346a008fd8..7a2641c9a5 100644 --- a/MultimodalQnA/docker_compose/intel/hpu/gaudi/compose.yaml +++ b/MultimodalQnA/docker_compose/intel/hpu/gaudi/compose.yaml @@ -73,7 +73,7 @@ services: MULTIMODAL_EMBEDDING: true restart: unless-stopped retriever-redis: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis depends_on: - redis-vector-db @@ -87,7 +87,8 @@ services: REDIS_URL: ${REDIS_URL} INDEX_NAME: ${INDEX_NAME} BRIDGE_TOWER_EMBEDDING: ${BRIDGE_TOWER_EMBEDDING} - RETRIEVER_TYPE: "redis" + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tgi-gaudi: image: ghcr.io/huggingface/tgi-gaudi:2.0.6 diff --git a/MultimodalQnA/docker_image_build/build.yaml b/MultimodalQnA/docker_image_build/build.yaml index a3159bac29..9c26d99d8e 100644 --- a/MultimodalQnA/docker_image_build/build.yaml +++ 
b/MultimodalQnA/docker_image_build/build.yaml @@ -29,12 +29,12 @@ services: dockerfile: comps/embeddings/src/Dockerfile extends: multimodalqna image: ${REGISTRY:-opea}/embedding:${TAG:-latest} - retriever-redis: + retriever: build: context: GenAIComps - dockerfile: comps/retrievers/redis/langchain/Dockerfile + dockerfile: comps/retrievers/src/Dockerfile extends: multimodalqna - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} lvm-llava: build: context: GenAIComps diff --git a/ProductivitySuite/docker_compose/intel/cpu/xeon/compose.yaml b/ProductivitySuite/docker_compose/intel/cpu/xeon/compose.yaml index 67921ec35b..1872f12923 100644 --- a/ProductivitySuite/docker_compose/intel/cpu/xeon/compose.yaml +++ b/ProductivitySuite/docker_compose/intel/cpu/xeon/compose.yaml @@ -69,7 +69,7 @@ services: LOGFLAG: ${LOGFLAG} restart: unless-stopped retriever: - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-redis-server depends_on: - redis-vector-db @@ -85,8 +85,8 @@ services: INDEX_NAME: ${INDEX_NAME} TEI_EMBEDDING_ENDPOINT: ${TEI_EMBEDDING_ENDPOINT} HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} - RETRIEVER_TYPE: ${RETRIEVER_TYPE} LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS" restart: unless-stopped tei-reranking-service: image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 diff --git a/ProductivitySuite/docker_image_build/build.yaml b/ProductivitySuite/docker_image_build/build.yaml index 7090e7ac0d..3af5c1e106 100644 --- a/ProductivitySuite/docker_image_build/build.yaml +++ b/ProductivitySuite/docker_image_build/build.yaml @@ -17,12 +17,12 @@ services: dockerfile: comps/embeddings/src/Dockerfile extends: chatqna image: ${REGISTRY:-opea}/embedding:${TAG:-latest} - retriever-redis: + retriever: build: context: GenAIComps - dockerfile: comps/retrievers/redis/langchain/Dockerfile + dockerfile: 
comps/retrievers/src/Dockerfile extends: chatqna - image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} reranking: build: context: GenAIComps diff --git a/VideoQnA/docker_compose/intel/cpu/xeon/compose.yaml b/VideoQnA/docker_compose/intel/cpu/xeon/compose.yaml index 8610b90aed..f52ceef414 100644 --- a/VideoQnA/docker_compose/intel/cpu/xeon/compose.yaml +++ b/VideoQnA/docker_compose/intel/cpu/xeon/compose.yaml @@ -41,7 +41,7 @@ services: - /home/$USER/.cache/huggingface/hub:/home/user/.cache/huggingface/hub restart: unless-stopped retriever: - image: ${REGISTRY:-opea}/retriever-vdms:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-vdms-server depends_on: - vdms-vector-db @@ -52,10 +52,12 @@ services: no_proxy: ${no_proxy} http_proxy: ${http_proxy} https_proxy: ${https_proxy} - INDEX_NAME: ${INDEX_NAME} + VDMS_INDEX_NAME: ${INDEX_NAME} VDMS_HOST: ${VDMS_HOST} VDMS_PORT: ${VDMS_PORT} - USECLIP: ${USECLIP} + VDMS_USE_CLIP: ${USECLIP} + LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_VDMS" entrypoint: sh -c 'sleep 30 && python retriever_vdms.py' restart: unless-stopped volumes: diff --git a/VideoQnA/docker_image_build/build.yaml b/VideoQnA/docker_image_build/build.yaml index 9fb5a752d4..8f000f7295 100644 --- a/VideoQnA/docker_image_build/build.yaml +++ b/VideoQnA/docker_image_build/build.yaml @@ -29,12 +29,12 @@ services: dockerfile: comps/third_parties/clip/src/Dockerfile extends: videoqna image: ${REGISTRY:-opea}/embedding-multimodal-clip:${TAG:-latest} - retriever-vdms: + retriever: build: context: GenAIComps - dockerfile: comps/retrievers/vdms/langchain/Dockerfile + dockerfile: comps/retrievers/src/Dockerfile extends: videoqna - image: ${REGISTRY:-opea}/retriever-vdms:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} reranking: build: context: GenAIComps From e783bd2b7586dff3785cd2d047955ad3c2f80784 Mon Sep 17 00:00:00 2001 From: 
letonghan Date: Wed, 15 Jan 2025 11:25:05 +0800 Subject: [PATCH 06/15] update Signed-off-by: letonghan --- GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml b/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml index 4b5817a190..5d80d6ff31 100644 --- a/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml +++ b/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml @@ -95,7 +95,7 @@ services: LOGFLAG: ${LOGFLAG} restart: unless-stopped retriever-neo4j-llamaindex: - image: ${REGISTRY:-opea}/retriever-neo4j-llamaindex:${TAG:-latest} + image: ${REGISTRY:-opea}/retriever:${TAG:-latest} container_name: retriever-neo4j-server depends_on: - neo4j-apoc @@ -122,6 +122,7 @@ services: EMBEDDING_MODEL_ID: ${EMBEDDING_MODEL_ID} LLM_MODEL_ID: ${LLM_MODEL_ID} LOGFLAG: ${LOGFLAG} + RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_NEO4J" restart: unless-stopped graphrag-gaudi-backend-server: image: ${REGISTRY:-opea}/graphrag:${TAG:-latest} From 7b679fb970d1baf98b24e8aa44f9a6d868ca7122 Mon Sep 17 00:00:00 2001 From: letonghan Date: Wed, 15 Jan 2025 11:31:19 +0800 Subject: [PATCH 07/15] update readmes Signed-off-by: letonghan --- ChatQnA/docker_compose/amd/gpu/rocm/README.md | 4 ++-- ChatQnA/docker_compose/intel/cpu/aipc/README.md | 4 ++-- ChatQnA/docker_compose/intel/cpu/xeon/README.md | 4 ++-- ChatQnA/docker_compose/intel/cpu/xeon/README_pinecone.md | 6 +++--- ChatQnA/docker_compose/intel/cpu/xeon/README_qdrant.md | 4 ++-- ChatQnA/docker_compose/intel/hpu/gaudi/README.md | 4 ++-- .../intel/hpu/gaudi/how_to_validate_service.md | 2 +- ChatQnA/docker_compose/nvidia/gpu/README.md | 4 ++-- ChatQnA/kubernetes/gmc/README.md | 2 +- DocIndexRetriever/docker_compose/intel/cpu/xeon/README.md | 2 +- DocIndexRetriever/docker_compose/intel/hpu/gaudi/README.md | 2 +- MultimodalQnA/docker_compose/amd/gpu/rocm/README.md | 4 ++-- MultimodalQnA/docker_compose/intel/cpu/xeon/README.md | 4 
++-- MultimodalQnA/docker_compose/intel/hpu/gaudi/README.md | 4 ++-- .../kubernetes/intel/cpu/xeon/manifest/chatqna.yaml | 2 +- VideoQnA/docker_compose/intel/cpu/xeon/README.md | 6 ++---- 16 files changed, 28 insertions(+), 30 deletions(-) diff --git a/ChatQnA/docker_compose/amd/gpu/rocm/README.md b/ChatQnA/docker_compose/amd/gpu/rocm/README.md index 0d477b5a6b..b3a5069ab1 100644 --- a/ChatQnA/docker_compose/amd/gpu/rocm/README.md +++ b/ChatQnA/docker_compose/amd/gpu/rocm/README.md @@ -94,7 +94,7 @@ cd GenAIComps ### 2. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . +docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 3. Build Dataprep Image @@ -143,7 +143,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a Then run the command `docker images`, you will have the following 5 Docker Images: -1. `opea/retriever-redis:latest` +1. `opea/retriever:latest` 2. `opea/dataprep-redis:latest` 3. `opea/chatqna:latest` 4. `opea/chatqna-ui:latest` or `opea/chatqna-react-ui:latest` diff --git a/ChatQnA/docker_compose/intel/cpu/aipc/README.md b/ChatQnA/docker_compose/intel/cpu/aipc/README.md index 2b6455dd07..6b339270c0 100644 --- a/ChatQnA/docker_compose/intel/cpu/aipc/README.md +++ b/ChatQnA/docker_compose/intel/cpu/aipc/README.md @@ -21,7 +21,7 @@ export https_proxy="Your_HTTPs_Proxy" ### 1. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . +docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 2. 
Build Dataprep Image @@ -61,7 +61,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a Then run the command `docker images`, you will have the following 6 Docker Images: 1. `opea/dataprep-redis:latest` -2. `opea/retriever-redis:latest` +2. `opea/retriever:latest` 3. `opea/chatqna:latest` 4. `opea/chatqna-ui:latest` 5. `opea/nginx:latest` diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/README.md b/ChatQnA/docker_compose/intel/cpu/xeon/README.md index d5b9fa3eda..f908470d71 100644 --- a/ChatQnA/docker_compose/intel/cpu/xeon/README.md +++ b/ChatQnA/docker_compose/intel/cpu/xeon/README.md @@ -105,7 +105,7 @@ cd GenAIComps ### 1. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . +docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 2. Build Dataprep Image @@ -167,7 +167,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a Then run the command `docker images`, you will have the following 5 Docker Images: 1. `opea/dataprep-redis:latest` -2. `opea/retriever-redis:latest` +2. `opea/retriever:latest` 3. `opea/chatqna:latest` or `opea/chatqna-without-rerank:latest` 4. `opea/chatqna-ui:latest` 5. `opea/nginx:latest` diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/README_pinecone.md b/ChatQnA/docker_compose/intel/cpu/xeon/README_pinecone.md index d7b307d1de..dce5b0a540 100644 --- a/ChatQnA/docker_compose/intel/cpu/xeon/README_pinecone.md +++ b/ChatQnA/docker_compose/intel/cpu/xeon/README_pinecone.md @@ -108,7 +108,7 @@ cd GenAIComps ### 1. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-pinecone:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/pinecone/langchain/Dockerfile . 
+docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 2. Build Dataprep Image @@ -170,7 +170,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a Then run the command `docker images`, you will have the following 5 Docker Images: 1. `opea/dataprep-pinecone:latest` -2. `opea/retriever-pinecone:latest` +2. `opea/retriever:latest` 3. `opea/chatqna:latest` or `opea/chatqna-without-rerank:latest` 4. `opea/chatqna-ui:latest` 5. `opea/nginx:latest` @@ -352,7 +352,7 @@ click [here](https://raw.githubusercontent.com/opea-project/GenAIComps/v1.1/comp Or run this command to get the file on a terminal. ```bash -wget https://raw.githubusercontent.com/opea-project/GenAIComps/1,1/comps/retrievers/redis/data/nke-10k-2023.pdf +wget https://raw.githubusercontent.com/opea-project/GenAIComps/v1.1/comps/retrievers/redis/data/nke-10k-2023.pdf ``` diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/README_qdrant.md b/ChatQnA/docker_compose/intel/cpu/xeon/README_qdrant.md index c7813ed6ab..6688f25370 100644 --- a/ChatQnA/docker_compose/intel/cpu/xeon/README_qdrant.md +++ b/ChatQnA/docker_compose/intel/cpu/xeon/README_qdrant.md @@ -73,7 +73,7 @@ cd GenAIComps ### 1. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-qdrant:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . +docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 2. Build Dataprep Image @@ -128,7 +128,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a Then run the command `docker images`, you will have the following 5 Docker Images: 1. `opea/dataprep-qdrant:latest` -2. `opea/retriever-qdrant:latest` +2. `opea/retriever:latest` 3. `opea/chatqna:latest` 4. 
`opea/chatqna-ui:latest` 5. `opea/nginx:latest` diff --git a/ChatQnA/docker_compose/intel/hpu/gaudi/README.md b/ChatQnA/docker_compose/intel/hpu/gaudi/README.md index d256bcfae9..85b0338549 100644 --- a/ChatQnA/docker_compose/intel/hpu/gaudi/README.md +++ b/ChatQnA/docker_compose/intel/hpu/gaudi/README.md @@ -78,7 +78,7 @@ cd GenAIComps ### 1. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . +docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 2. Build Dataprep Image @@ -156,7 +156,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a Then run the command `docker images`, you will have the following 5 Docker Images: -- `opea/retriever-redis:latest` +- `opea/retriever:latest` - `opea/dataprep-redis:latest` - `opea/chatqna:latest` - `opea/chatqna-ui:latest` diff --git a/ChatQnA/docker_compose/intel/hpu/gaudi/how_to_validate_service.md b/ChatQnA/docker_compose/intel/hpu/gaudi/how_to_validate_service.md index ca778e7e91..3834d5b8cc 100644 --- a/ChatQnA/docker_compose/intel/hpu/gaudi/how_to_validate_service.md +++ b/ChatQnA/docker_compose/intel/hpu/gaudi/how_to_validate_service.md @@ -46,7 +46,7 @@ bee1132464cd opea/chatqna:latest "python c f810f3b4d329 opea/embedding:latest "python embedding_te…" 2 minutes ago Up 2 minutes 0.0.0.0:6000->6000/tcp, :::6000->6000/tcp embedding-server 325236a01f9b opea/llm-textgen:latest "python llm.py" 2 minutes ago Up 2 minutes 0.0.0.0:9000->9000/tcp, :::9000->9000/tcp llm-textgen-gaudi-server 2fa17d84605f opea/dataprep-redis:latest "python prepare_doc_…" 2 minutes ago Up 2 minutes 0.0.0.0:6007->6007/tcp, :::6007->6007/tcp dataprep-redis-server -69e1fb59e92c opea/retriever-redis:latest "/home/user/comps/re…" 2 minutes ago Up 2 minutes 
0.0.0.0:7000->7000/tcp, :::7000->7000/tcp retriever-redis-server +69e1fb59e92c opea/retriever:latest "/home/user/comps/re…" 2 minutes ago Up 2 minutes 0.0.0.0:7000->7000/tcp, :::7000->7000/tcp retriever-redis-server 313b9d14928a opea/reranking-tei:latest "python reranking_te…" 2 minutes ago Up 2 minutes 0.0.0.0:8000->8000/tcp, :::8000->8000/tcp reranking-tei-gaudi-server 174bd43fa6b5 ghcr.io/huggingface/tei-gaudi:1.5.0 "text-embeddings-rou…" 2 minutes ago Up 2 minutes 0.0.0.0:8090->80/tcp, :::8090->80/tcp tei-embedding-gaudi-server 05c40b636239 ghcr.io/huggingface/tgi-gaudi:2.0.6 "text-generation-lau…" 2 minutes ago Exited (1) About a minute ago tgi-gaudi-server diff --git a/ChatQnA/docker_compose/nvidia/gpu/README.md b/ChatQnA/docker_compose/nvidia/gpu/README.md index 793d049ebc..edf9dc12f4 100644 --- a/ChatQnA/docker_compose/nvidia/gpu/README.md +++ b/ChatQnA/docker_compose/nvidia/gpu/README.md @@ -104,7 +104,7 @@ cd GenAIComps ### 2. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . +docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 3. Build Dataprep Image @@ -153,7 +153,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a Then run the command `docker images`, you will have the following 5 Docker Images: -1. `opea/retriever-redis:latest` +1. `opea/retriever:latest` 2. `opea/dataprep-redis:latest` 3. `opea/chatqna:latest` 4. 
`opea/chatqna-ui:latest` or `opea/chatqna-react-ui:latest` diff --git a/ChatQnA/kubernetes/gmc/README.md b/ChatQnA/kubernetes/gmc/README.md index 2c849c5079..db8b3466f1 100644 --- a/ChatQnA/kubernetes/gmc/README.md +++ b/ChatQnA/kubernetes/gmc/README.md @@ -16,7 +16,7 @@ The ChatQnA uses the below prebuilt images if you choose a Xeon deployment - redis-vector-db: redis/redis-stack:7.2.0-v9 - tei_embedding_service: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 -- retriever: opea/retriever-redis:latest +- retriever: opea/retriever:latest - tei_xeon_service: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 - tgi-service: ghcr.io/huggingface/text-generation-inference:2.4.0-intel-cpu - chaqna-xeon-backend-server: opea/chatqna:latest diff --git a/DocIndexRetriever/docker_compose/intel/cpu/xeon/README.md b/DocIndexRetriever/docker_compose/intel/cpu/xeon/README.md index 1d6683e920..5699ece356 100644 --- a/DocIndexRetriever/docker_compose/intel/cpu/xeon/README.md +++ b/DocIndexRetriever/docker_compose/intel/cpu/xeon/README.md @@ -15,7 +15,7 @@ DocRetriever are the most widely adopted use case for leveraging the different m - Retriever Vector store Image ```bash - docker build -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . + docker build -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . 
``` - Rerank TEI Image diff --git a/DocIndexRetriever/docker_compose/intel/hpu/gaudi/README.md b/DocIndexRetriever/docker_compose/intel/hpu/gaudi/README.md index bc1db26124..f2de0048a8 100644 --- a/DocIndexRetriever/docker_compose/intel/hpu/gaudi/README.md +++ b/DocIndexRetriever/docker_compose/intel/hpu/gaudi/README.md @@ -15,7 +15,7 @@ DocRetriever are the most widely adopted use case for leveraging the different m - Retriever Vector store Image ```bash - docker build -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . + docker build -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` - Rerank TEI Image diff --git a/MultimodalQnA/docker_compose/amd/gpu/rocm/README.md b/MultimodalQnA/docker_compose/amd/gpu/rocm/README.md index a6d38f3d70..af0812d84d 100644 --- a/MultimodalQnA/docker_compose/amd/gpu/rocm/README.md +++ b/MultimodalQnA/docker_compose/amd/gpu/rocm/README.md @@ -45,7 +45,7 @@ docker build --no-cache -t opea/lvm-llava:latest --build-arg https_proxy=$https_ ### 3. Build retriever-multimodal-redis Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . +docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 4. Build dataprep-multimodal-redis Image @@ -86,7 +86,7 @@ Then run the command `docker images`, you will have the following 8 Docker Image 1. `opea/dataprep-multimodal-redis:latest` 2. `ghcr.io/huggingface/text-generation-inference:2.4.1-rocm` 3. `opea/lvm:latest` -4. `opea/retriever-multimodal-redis:latest` +4. `opea/retriever:latest` 5. `opea/embedding:latest` 6. `opea/embedding-multimodal-bridgetower:latest` 7. 
`opea/multimodalqna:latest` diff --git a/MultimodalQnA/docker_compose/intel/cpu/xeon/README.md b/MultimodalQnA/docker_compose/intel/cpu/xeon/README.md index 03e5b0bfa5..5a72491c32 100644 --- a/MultimodalQnA/docker_compose/intel/cpu/xeon/README.md +++ b/MultimodalQnA/docker_compose/intel/cpu/xeon/README.md @@ -124,7 +124,7 @@ docker build --no-cache -t opea/embedding:latest --build-arg https_proxy=$https_ ### 2. Build retriever-multimodal-redis Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . +docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 3. Build LVM Images @@ -181,7 +181,7 @@ Then run the command `docker images`, you will have the following 11 Docker Imag 1. `opea/dataprep-multimodal-redis:latest` 2. `opea/lvm:latest` 3. `opea/lvm-llava:latest` -4. `opea/retriever-multimodal-redis:latest` +4. `opea/retriever:latest` 5. `opea/whisper:latest` 6. `opea/redis-vector-db` 7. `opea/embedding:latest` diff --git a/MultimodalQnA/docker_compose/intel/hpu/gaudi/README.md b/MultimodalQnA/docker_compose/intel/hpu/gaudi/README.md index a148bfcafa..598797b74f 100644 --- a/MultimodalQnA/docker_compose/intel/hpu/gaudi/README.md +++ b/MultimodalQnA/docker_compose/intel/hpu/gaudi/README.md @@ -75,7 +75,7 @@ docker build --no-cache -t opea/embedding:latest --build-arg https_proxy=$https_ ### 2. Build retriever-multimodal-redis Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . +docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 3. 
Build LVM Images @@ -130,7 +130,7 @@ Then run the command `docker images`, you will have the following 11 Docker Imag 1. `opea/dataprep-multimodal-redis:latest` 2. `opea/lvm:latest` 3. `ghcr.io/huggingface/tgi-gaudi:2.0.6` -4. `opea/retriever-multimodal-redis:latest` +4. `opea/retriever:latest` 5. `opea/whisper:latest` 6. `opea/redis-vector-db` 7. `opea/embedding:latest` diff --git a/ProductivitySuite/kubernetes/intel/cpu/xeon/manifest/chatqna.yaml b/ProductivitySuite/kubernetes/intel/cpu/xeon/manifest/chatqna.yaml index 624c0b0081..c921efea55 100644 --- a/ProductivitySuite/kubernetes/intel/cpu/xeon/manifest/chatqna.yaml +++ b/ProductivitySuite/kubernetes/intel/cpu/xeon/manifest/chatqna.yaml @@ -811,7 +811,7 @@ spec: runAsUser: 1000 seccompProfile: type: RuntimeDefault - image: "opea/retriever-redis:latest" + image: "opea/retriever:latest" imagePullPolicy: IfNotPresent ports: - name: retriever-usvc diff --git a/VideoQnA/docker_compose/intel/cpu/xeon/README.md b/VideoQnA/docker_compose/intel/cpu/xeon/README.md index 3047c527c6..6c5af3d84f 100644 --- a/VideoQnA/docker_compose/intel/cpu/xeon/README.md +++ b/VideoQnA/docker_compose/intel/cpu/xeon/README.md @@ -59,7 +59,7 @@ docker build -t opea/embedding-multimodal-clip:latest --build-arg https_proxy=$h ### 2. Build Retriever Image ```bash -docker build -t opea/retriever-vdms:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . +docker build -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 3. Build Reranking Image @@ -108,15 +108,13 @@ Then run the command `docker images`, you will have the following 8 Docker Image 1. `opea/dataprep-multimodal-vdms:latest` 2. `opea/embedding-multimodal-clip:latest` -3. `opea/retriever-vdms:latest` - <<<<<<< HEAD +3. `opea/retriever:latest` 4. `opea/reranking:latest` 5. `opea/video-llama-lvm-server:latest` 6. 
# `opea/lvm-video-llama:latest` 7. `opea/reranking-tei:latest` 8. `opea/lvm-video-llama:latest` 9. `opea/lvm:latest` - > > > > > > > d93597cbfd9da92b956adb3673c9e5d743c181af 10. `opea/videoqna:latest` 11. `opea/videoqna-ui:latest` From 1e8bbbe06c86bd3621c633dad564ba170a7843b8 Mon Sep 17 00:00:00 2001 From: letonghan Date: Wed, 15 Jan 2025 11:38:06 +0800 Subject: [PATCH 08/15] modify service name in test scripts Signed-off-by: letonghan --- ChatQnA/tests/test_compose_guardrails_on_gaudi.sh | 2 +- ChatQnA/tests/test_compose_on_gaudi.sh | 2 +- ChatQnA/tests/test_compose_on_rocm.sh | 2 +- ChatQnA/tests/test_compose_on_xeon.sh | 2 +- ChatQnA/tests/test_compose_pinecone_on_xeon.sh | 2 +- ChatQnA/tests/test_compose_qdrant_on_xeon.sh | 2 +- ChatQnA/tests/test_compose_vllm_on_gaudi.sh | 2 +- ChatQnA/tests/test_compose_vllm_on_xeon.sh | 2 +- ChatQnA/tests/test_compose_without_rerank_on_gaudi.sh | 2 +- ChatQnA/tests/test_compose_without_rerank_on_xeon.sh | 2 +- DocIndexRetriever/tests/test_compose_on_xeon.sh | 2 +- DocIndexRetriever/tests/test_compose_without_rerank_on_xeon.sh | 2 +- MultimodalQnA/tests/test_compose_on_gaudi.sh | 2 +- MultimodalQnA/tests/test_compose_on_rocm.sh | 2 +- MultimodalQnA/tests/test_compose_on_xeon.sh | 2 +- ProductivitySuite/docker_compose/intel/cpu/xeon/README.md | 2 +- 16 files changed, 16 insertions(+), 16 deletions(-) diff --git a/ChatQnA/tests/test_compose_guardrails_on_gaudi.sh b/ChatQnA/tests/test_compose_guardrails_on_gaudi.sh index b0376affb5..8fe8dc733f 100644 --- a/ChatQnA/tests/test_compose_guardrails_on_gaudi.sh +++ b/ChatQnA/tests/test_compose_guardrails_on_gaudi.sh @@ -19,7 +19,7 @@ function build_docker_images() { git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ echo "Build all the images with --no-cache, check docker_image_build.log for details..." 
- service_list="chatqna-guardrails chatqna-ui dataprep-redis retriever-redis guardrails nginx" + service_list="chatqna-guardrails chatqna-ui dataprep-redis retriever guardrails nginx" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker pull ghcr.io/huggingface/tgi-gaudi:2.0.6 diff --git a/ChatQnA/tests/test_compose_on_gaudi.sh b/ChatQnA/tests/test_compose_on_gaudi.sh index 9cfe519b87..22eccb2d5d 100644 --- a/ChatQnA/tests/test_compose_on_gaudi.sh +++ b/ChatQnA/tests/test_compose_on_gaudi.sh @@ -19,7 +19,7 @@ function build_docker_images() { git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ echo "Build all the images with --no-cache, check docker_image_build.log for details..." - service_list="chatqna chatqna-ui dataprep-redis retriever-redis nginx" + service_list="chatqna chatqna-ui dataprep-redis retriever nginx" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker pull ghcr.io/huggingface/tgi-gaudi:2.0.6 diff --git a/ChatQnA/tests/test_compose_on_rocm.sh b/ChatQnA/tests/test_compose_on_rocm.sh index 09a79e9d81..e1cd6adb67 100644 --- a/ChatQnA/tests/test_compose_on_rocm.sh +++ b/ChatQnA/tests/test_compose_on_rocm.sh @@ -52,7 +52,7 @@ function build_docker_images() { git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ echo "Build all the images with --no-cache, check docker_image_build.log for details..." 
- service_list="chatqna chatqna-ui dataprep-redis retriever-redis nginx" + service_list="chatqna chatqna-ui dataprep-redis retriever nginx" docker compose -f build.yaml build ${service_list} --no-cache > "${LOG_PATH}"/docker_image_build.log docker pull ghcr.io/huggingface/text-generation-inference:2.3.1-rocm diff --git a/ChatQnA/tests/test_compose_on_xeon.sh b/ChatQnA/tests/test_compose_on_xeon.sh index 25ca70bc77..babca0cd43 100644 --- a/ChatQnA/tests/test_compose_on_xeon.sh +++ b/ChatQnA/tests/test_compose_on_xeon.sh @@ -19,7 +19,7 @@ function build_docker_images() { git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ echo "Build all the images with --no-cache, check docker_image_build.log for details..." - service_list="chatqna chatqna-ui dataprep-redis retriever-redis nginx" + service_list="chatqna chatqna-ui dataprep-redis retriever nginx" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker pull ghcr.io/huggingface/text-generation-inference:2.4.0-intel-cpu diff --git a/ChatQnA/tests/test_compose_pinecone_on_xeon.sh b/ChatQnA/tests/test_compose_pinecone_on_xeon.sh index 63147e4eb7..c7e33fcdb7 100755 --- a/ChatQnA/tests/test_compose_pinecone_on_xeon.sh +++ b/ChatQnA/tests/test_compose_pinecone_on_xeon.sh @@ -19,7 +19,7 @@ function build_docker_images() { git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ echo "Build all the images with --no-cache, check docker_image_build.log for details..." 
- service_list="chatqna chatqna-ui dataprep-pinecone retriever-pinecone nginx" + service_list="chatqna chatqna-ui dataprep-pinecone retriever nginx" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker pull ghcr.io/huggingface/text-generation-inference:2.4.0-intel-cpu diff --git a/ChatQnA/tests/test_compose_qdrant_on_xeon.sh b/ChatQnA/tests/test_compose_qdrant_on_xeon.sh index 79108ddd47..ee4b4efb0a 100644 --- a/ChatQnA/tests/test_compose_qdrant_on_xeon.sh +++ b/ChatQnA/tests/test_compose_qdrant_on_xeon.sh @@ -19,7 +19,7 @@ function build_docker_images() { git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ echo "Build all the images with --no-cache, check docker_image_build.log for details..." - service_list="chatqna chatqna-ui dataprep-qdrant retriever-qdrant nginx" + service_list="chatqna chatqna-ui dataprep-qdrant retriever nginx" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker images && sleep 1s diff --git a/ChatQnA/tests/test_compose_vllm_on_gaudi.sh b/ChatQnA/tests/test_compose_vllm_on_gaudi.sh index f68d246a0b..b66ebe877a 100644 --- a/ChatQnA/tests/test_compose_vllm_on_gaudi.sh +++ b/ChatQnA/tests/test_compose_vllm_on_gaudi.sh @@ -20,7 +20,7 @@ function build_docker_images() { git clone https://github.com/HabanaAI/vllm-fork.git echo "Build all the images with --no-cache, check docker_image_build.log for details..." 
- service_list="chatqna chatqna-ui dataprep-redis retriever-redis vllm-gaudi nginx" + service_list="chatqna chatqna-ui dataprep-redis retriever vllm-gaudi nginx" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker pull ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 diff --git a/ChatQnA/tests/test_compose_vllm_on_xeon.sh b/ChatQnA/tests/test_compose_vllm_on_xeon.sh index 72f0dd465a..6d95c68a91 100644 --- a/ChatQnA/tests/test_compose_vllm_on_xeon.sh +++ b/ChatQnA/tests/test_compose_vllm_on_xeon.sh @@ -20,7 +20,7 @@ function build_docker_images() { git clone https://github.com/vllm-project/vllm.git echo "Build all the images with --no-cache, check docker_image_build.log for details..." - service_list="chatqna chatqna-ui dataprep-redis retriever-redis vllm nginx" + service_list="chatqna chatqna-ui dataprep-redis retriever vllm nginx" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker pull ghcr.io/huggingface/tgi-gaudi:2.0.6 diff --git a/ChatQnA/tests/test_compose_without_rerank_on_gaudi.sh b/ChatQnA/tests/test_compose_without_rerank_on_gaudi.sh index 1f2f94eba0..e1187bfcf9 100644 --- a/ChatQnA/tests/test_compose_without_rerank_on_gaudi.sh +++ b/ChatQnA/tests/test_compose_without_rerank_on_gaudi.sh @@ -19,7 +19,7 @@ function build_docker_images() { git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ echo "Build all the images with --no-cache, check docker_image_build.log for details..." 
- service_list="chatqna-without-rerank chatqna-ui dataprep-redis retriever-redis nginx" + service_list="chatqna-without-rerank chatqna-ui dataprep-redis retriever nginx" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker pull ghcr.io/huggingface/tgi-gaudi:2.0.6 diff --git a/ChatQnA/tests/test_compose_without_rerank_on_xeon.sh b/ChatQnA/tests/test_compose_without_rerank_on_xeon.sh index e530cdf1b0..230b8a5d60 100644 --- a/ChatQnA/tests/test_compose_without_rerank_on_xeon.sh +++ b/ChatQnA/tests/test_compose_without_rerank_on_xeon.sh @@ -19,7 +19,7 @@ function build_docker_images() { git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ echo "Build all the images with --no-cache, check docker_image_build.log for details..." - service_list="chatqna-without-rerank chatqna-ui dataprep-redis retriever-redis nginx" + service_list="chatqna-without-rerank chatqna-ui dataprep-redis retriever nginx" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker pull ghcr.io/huggingface/tgi-gaudi:2.0.6 diff --git a/DocIndexRetriever/tests/test_compose_on_xeon.sh b/DocIndexRetriever/tests/test_compose_on_xeon.sh index 8c52a32228..43e39da5a5 100644 --- a/DocIndexRetriever/tests/test_compose_on_xeon.sh +++ b/DocIndexRetriever/tests/test_compose_on_xeon.sh @@ -21,7 +21,7 @@ function build_docker_images() { echo "Cloning GenAIComps repository" git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ fi - service_list="dataprep-redis embedding retriever-redis reranking doc-index-retriever" + service_list="dataprep-redis embedding retriever reranking doc-index-retriever" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker pull 
ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 diff --git a/DocIndexRetriever/tests/test_compose_without_rerank_on_xeon.sh b/DocIndexRetriever/tests/test_compose_without_rerank_on_xeon.sh index fb499fb657..a65dbac6a7 100644 --- a/DocIndexRetriever/tests/test_compose_without_rerank_on_xeon.sh +++ b/DocIndexRetriever/tests/test_compose_without_rerank_on_xeon.sh @@ -21,7 +21,7 @@ function build_docker_images() { echo "Cloning GenAIComps repository" git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ fi - service_list="dataprep-redis embedding retriever-redis doc-index-retriever" + service_list="dataprep-redis embedding retriever doc-index-retriever" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker pull ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 diff --git a/MultimodalQnA/tests/test_compose_on_gaudi.sh b/MultimodalQnA/tests/test_compose_on_gaudi.sh index ed73dce0c1..85e2af3e24 100644 --- a/MultimodalQnA/tests/test_compose_on_gaudi.sh +++ b/MultimodalQnA/tests/test_compose_on_gaudi.sh @@ -22,7 +22,7 @@ function build_docker_images() { cd $WORKPATH/docker_image_build git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ echo "Build all the images with --no-cache, check docker_image_build.log for details..." 
- service_list="multimodalqna multimodalqna-ui embedding-multimodal-bridgetower embedding retriever-redis lvm dataprep-multimodal-redis whisper" + service_list="multimodalqna multimodalqna-ui embedding-multimodal-bridgetower embedding retriever lvm dataprep-multimodal-redis whisper" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker pull ghcr.io/huggingface/tgi-gaudi:2.0.6 diff --git a/MultimodalQnA/tests/test_compose_on_rocm.sh b/MultimodalQnA/tests/test_compose_on_rocm.sh index 68a7e02b23..058d9984db 100644 --- a/MultimodalQnA/tests/test_compose_on_rocm.sh +++ b/MultimodalQnA/tests/test_compose_on_rocm.sh @@ -23,7 +23,7 @@ function build_docker_images() { git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ echo "Build all the images with --no-cache, check docker_image_build.log for details..." - service_list="multimodalqna multimodalqna-ui embedding-multimodal-bridgetower embedding retriever-redis lvm dataprep-multimodal-redis whisper" + service_list="multimodalqna multimodalqna-ui embedding-multimodal-bridgetower embedding retriever lvm dataprep-multimodal-redis whisper" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker images && sleep 1m diff --git a/MultimodalQnA/tests/test_compose_on_xeon.sh b/MultimodalQnA/tests/test_compose_on_xeon.sh index 7d030e930d..9a8eeec8bf 100644 --- a/MultimodalQnA/tests/test_compose_on_xeon.sh +++ b/MultimodalQnA/tests/test_compose_on_xeon.sh @@ -22,7 +22,7 @@ function build_docker_images() { cd $WORKPATH/docker_image_build git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ echo "Build all the images with --no-cache, check docker_image_build.log for details..." 
- service_list="multimodalqna multimodalqna-ui embedding-multimodal-bridgetower embedding retriever-redis lvm-llava lvm dataprep-multimodal-redis whisper" + service_list="multimodalqna multimodalqna-ui embedding-multimodal-bridgetower embedding retriever lvm-llava lvm dataprep-multimodal-redis whisper" docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log docker images && sleep 1m diff --git a/ProductivitySuite/docker_compose/intel/cpu/xeon/README.md b/ProductivitySuite/docker_compose/intel/cpu/xeon/README.md index 10afc85340..32b68f092f 100644 --- a/ProductivitySuite/docker_compose/intel/cpu/xeon/README.md +++ b/ProductivitySuite/docker_compose/intel/cpu/xeon/README.md @@ -19,7 +19,7 @@ docker build --no-cache -t opea/embedding:latest --build-arg https_proxy=$https_ ### 2. Build Retriever Image ```bash -docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . +docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile . ``` ### 3. 
Build Rerank Image From 2b6ca6bce4b7597f91bf129e19175b7255dfe00c Mon Sep 17 00:00:00 2001 From: letonghan Date: Wed, 15 Jan 2025 21:01:53 +0800 Subject: [PATCH 09/15] fix chatqna xeon pinecone service port Signed-off-by: letonghan --- ChatQnA/docker_compose/intel/cpu/xeon/compose_pinecone.yaml | 2 -- ChatQnA/tests/test_compose_pinecone_on_xeon.sh | 3 ++- 2 files changed, 2 insertions(+), 3 deletions(-) diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/compose_pinecone.yaml b/ChatQnA/docker_compose/intel/cpu/xeon/compose_pinecone.yaml index d72b49363f..022fe3b612 100644 --- a/ChatQnA/docker_compose/intel/cpu/xeon/compose_pinecone.yaml +++ b/ChatQnA/docker_compose/intel/cpu/xeon/compose_pinecone.yaml @@ -12,8 +12,6 @@ services: - tei-embedding-service ports: - "6007:6007" - - "6008:6008" - - "6009:6009" environment: no_proxy: ${no_proxy} http_proxy: ${http_proxy} diff --git a/ChatQnA/tests/test_compose_pinecone_on_xeon.sh b/ChatQnA/tests/test_compose_pinecone_on_xeon.sh index c7e33fcdb7..35c58f6754 100755 --- a/ChatQnA/tests/test_compose_pinecone_on_xeon.sh +++ b/ChatQnA/tests/test_compose_pinecone_on_xeon.sh @@ -38,6 +38,7 @@ function start_services() { export PINECONE_INDEX_NAME="langchain-test" export INDEX_NAME="langchain-test" export HUGGINGFACEHUB_API_TOKEN=${HUGGINGFACEHUB_API_TOKEN} + export LOGFLAG=true # Start Docker Containers docker compose -f compose_pinecone.yaml up -d > ${LOG_PATH}/start_services_with_compose.log @@ -111,7 +112,7 @@ function validate_microservices() { # test /v1/dataprep/delete_file validate_service \ - "http://${ip_address}:6009/v1/dataprep/delete_file" \ + "http://${ip_address}:6007/v1/dataprep/delete_file" \ '{"status":true}' \ "dataprep_del" \ "dataprep-pinecone-server" From 77d3a6141e97fe96cd56c2f1579cceec0f9f7610 Mon Sep 17 00:00:00 2001 From: letonghan Date: Wed, 15 Jan 2025 21:08:24 +0800 Subject: [PATCH 10/15] fix graphrag test retriever service port Signed-off-by: letonghan --- 
GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml | 2 +- GraphRAG/tests/test_compose_on_gaudi.sh | 4 ++-- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml b/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml index 5d80d6ff31..427cd962ae 100644 --- a/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml +++ b/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml @@ -102,7 +102,7 @@ services: - tgi-gaudi-service - tei-embedding-service ports: - - "6009:6009" + - "7000:7000" ipc: host environment: no_proxy: ${no_proxy} diff --git a/GraphRAG/tests/test_compose_on_gaudi.sh b/GraphRAG/tests/test_compose_on_gaudi.sh index 3525936ae9..2d5866e92c 100755 --- a/GraphRAG/tests/test_compose_on_gaudi.sh +++ b/GraphRAG/tests/test_compose_on_gaudi.sh @@ -127,8 +127,8 @@ function validate_microservices() { # retrieval microservice validate_service \ - "${ip_address}:6009/v1/retrieval" \ - "Retrieval of answers from community summaries successful" \ + "${ip_address}:7000/v1/retrieval" \ + "retrieved_docs" \ "retriever_community_answers_neo4j" \ "retriever-neo4j-server" \ "{\"model\": \"gpt-4o-mini\",\"messages\": [{\"role\": \"user\",\"content\": \"Who is John Brady and has he had any confrontations?\"}]}" From 103c4fd6cecfc239b05571c7867785844fad0a22 Mon Sep 17 00:00:00 2001 From: letonghan Date: Wed, 15 Jan 2025 21:12:03 +0800 Subject: [PATCH 11/15] enlarge sleep time for lvm llava service to start up Signed-off-by: letonghan --- MultimodalQnA/tests/test_compose_on_rocm.sh | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/MultimodalQnA/tests/test_compose_on_rocm.sh b/MultimodalQnA/tests/test_compose_on_rocm.sh index 058d9984db..061c3e5f12 100644 --- a/MultimodalQnA/tests/test_compose_on_rocm.sh +++ b/MultimodalQnA/tests/test_compose_on_rocm.sh @@ -208,7 +208,7 @@ function validate_microservices() { "retriever-redis" \ "{\"text\":\"test\",\"embedding\":${your_embedding}}" - sleep 3m + sleep 5m # 
llava server echo "Evaluating lvm-llava" From f264cb923aa834453ab4ed9c6b985ab0792baa6f Mon Sep 17 00:00:00 2001 From: letonghan Date: Wed, 15 Jan 2025 21:34:12 +0800 Subject: [PATCH 12/15] enlarge sleep time for retriever Signed-off-by: letonghan --- MultimodalQnA/tests/test_compose_on_rocm.sh | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/MultimodalQnA/tests/test_compose_on_rocm.sh b/MultimodalQnA/tests/test_compose_on_rocm.sh index 061c3e5f12..65fe94390d 100644 --- a/MultimodalQnA/tests/test_compose_on_rocm.sh +++ b/MultimodalQnA/tests/test_compose_on_rocm.sh @@ -196,7 +196,7 @@ function validate_microservices() { "dataprep_get" \ "dataprep-multimodal-redis" - sleep 1m + sleep 2m # multimodal retrieval microservice echo "Validating retriever-redis" From 3cb661953369d287e2ea8cda4481c02d3b192cf5 Mon Sep 17 00:00:00 2001 From: letonghan Date: Wed, 15 Jan 2025 22:04:27 +0800 Subject: [PATCH 13/15] update Signed-off-by: letonghan --- GraphRAG/tests/test_compose_on_gaudi.sh | 2 ++ 1 file changed, 2 insertions(+) diff --git a/GraphRAG/tests/test_compose_on_gaudi.sh b/GraphRAG/tests/test_compose_on_gaudi.sh index 2d5866e92c..515938729b 100755 --- a/GraphRAG/tests/test_compose_on_gaudi.sh +++ b/GraphRAG/tests/test_compose_on_gaudi.sh @@ -125,6 +125,8 @@ function validate_microservices() { "extract_graph_neo4j" \ "dataprep-neo4j-server" + sleep 2m + # retrieval microservice validate_service \ "${ip_address}:7000/v1/retrieval" \ From e3fdce76a5b2e168289c4f00655009096a9023c1 Mon Sep 17 00:00:00 2001 From: letonghan Date: Thu, 16 Jan 2025 10:33:37 +0800 Subject: [PATCH 14/15] fix graphrag port and env var issue Signed-off-by: letonghan --- GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml | 4 ++-- GraphRAG/tests/test_compose_on_gaudi.sh | 1 + 2 files changed, 3 insertions(+), 2 deletions(-) diff --git a/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml b/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml index 427cd962ae..baf7b95a9d 100644 --- 
a/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml +++ b/GraphRAG/docker_compose/intel/hpu/gaudi/compose.yaml @@ -111,7 +111,7 @@ services: host_ip: ${host_ip} HUGGING_FACE_HUB_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} HF_TOKEN: ${HF_TOKEN} - NEO4J_URL: ${NEO4J_URL} + NEO4J_URI: ${NEO4J_URL} NEO4J_USERNAME: ${NEO4J_USERNAME} NEO4J_PASSWORD: ${NEO4J_PASSWORD} TGI_LLM_ENDPOINT: ${TGI_LLM_ENDPOINT} @@ -140,7 +140,7 @@ services: - http_proxy=${http_proxy} - MEGA_SERVICE_HOST_IP=graphrag-gaudi-backend-server - RETRIEVER_SERVICE_HOST_IP=retriever-neo4j-llamaindex - - RETRIEVER_SERVICE_PORT=6009 + - RETRIEVER_SERVICE_PORT=7000 - LLM_SERVER_HOST_IP=tgi-gaudi-service - LLM_SERVER_PORT=${LLM_SERVER_PORT:-80} - LOGFLAG=${LOGFLAG} diff --git a/GraphRAG/tests/test_compose_on_gaudi.sh b/GraphRAG/tests/test_compose_on_gaudi.sh index 515938729b..96e671b3f6 100755 --- a/GraphRAG/tests/test_compose_on_gaudi.sh +++ b/GraphRAG/tests/test_compose_on_gaudi.sh @@ -38,6 +38,7 @@ function start_services() { export TEI_EMBEDDING_ENDPOINT="http://${ip_address}:6006" export TGI_LLM_ENDPOINT="http://${ip_address}:6005" export host_ip=${ip_address} + export LOGFLAG=true # Start Docker Containers sed -i "s|container_name: graphrag-gaudi-backend-server|container_name: graphrag-gaudi-backend-server\n volumes:\n - \"${WORKPATH}\/docker_image_build\/GenAIComps:\/home\/user\/GenAIComps\"|g" compose.yaml From c0d81c942b1e748fb2dafe3788690334717bc094 Mon Sep 17 00:00:00 2001 From: letonghan Date: Thu, 16 Jan 2025 14:16:57 +0800 Subject: [PATCH 15/15] Revert "specify comps branch for ci test" Revert CI related branch settings for merge. This reverts commit 421fa6d6141bae6c1a17e3970182c8834c126951. 
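Patches 11–13 above work around slow service start-up by lengthening fixed `sleep` calls (`sleep 3m` → `sleep 5m`, `sleep 1m` → `sleep 2m`). A more robust pattern for such test scripts is to poll the service until it actually answers. The helper below is a hypothetical sketch, not part of this patch series; the URL, timeout, and health-check path are placeholder assumptions.

```shell
# wait_for_http: poll a URL until it returns an HTTP success, or fail after a timeout.
# Usage: wait_for_http <url> [timeout_seconds] [interval_seconds]
wait_for_http() {
    local url=$1 timeout=${2:-300} interval=${3:-10} elapsed=0
    # curl -sf exits non-zero on connection errors and HTTP >= 400
    until curl -sf -o /dev/null "$url"; do
        if [ "$elapsed" -ge "$timeout" ]; then
            echo "Timed out after ${timeout}s waiting for $url" >&2
            return 1
        fi
        sleep "$interval"
        elapsed=$((elapsed + interval))
    done
}

# Example: instead of a fixed `sleep 2m` before hitting the retriever,
# wait for it to respond (endpoint path is an assumption):
# wait_for_http "http://${ip_address}:7000/v1/health_check" 300 10
```

This keeps the tests fast when services come up quickly and only waits the full budget in the slow case the fixed sleeps were guarding against.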
--- .github/workflows/_run-docker-compose.yml | 1 - .github/workflows/pr-docker-compose-e2e.yml | 2 +- .github/workflows/pr-dockerfile-path-and-build-yaml-scan.yml | 1 - 3 files changed, 1 insertion(+), 3 deletions(-) diff --git a/.github/workflows/_run-docker-compose.yml b/.github/workflows/_run-docker-compose.yml index a1eb5464ac..daf87add83 100644 --- a/.github/workflows/_run-docker-compose.yml +++ b/.github/workflows/_run-docker-compose.yml @@ -134,7 +134,6 @@ jobs: SERVING_TOKEN: ${{ secrets.SERVING_TOKEN }} IMAGE_REPO: ${{ inputs.registry }} IMAGE_TAG: ${{ inputs.tag }} - opea_branch: "refactor_retrievers" example: ${{ inputs.example }} hardware: ${{ inputs.hardware }} test_case: ${{ matrix.test_case }} diff --git a/.github/workflows/pr-docker-compose-e2e.yml b/.github/workflows/pr-docker-compose-e2e.yml index 446afa9250..fe052f90a1 100644 --- a/.github/workflows/pr-docker-compose-e2e.yml +++ b/.github/workflows/pr-docker-compose-e2e.yml @@ -4,7 +4,7 @@ name: E2E test with docker compose on: - pull_request: + pull_request_target: branches: ["main", "*rc"] types: [opened, reopened, ready_for_review, synchronize] # added `ready_for_review` since draft is skipped paths: diff --git a/.github/workflows/pr-dockerfile-path-and-build-yaml-scan.yml b/.github/workflows/pr-dockerfile-path-and-build-yaml-scan.yml index c87cbae3b4..dc62caaa35 100644 --- a/.github/workflows/pr-dockerfile-path-and-build-yaml-scan.yml +++ b/.github/workflows/pr-dockerfile-path-and-build-yaml-scan.yml @@ -22,7 +22,6 @@ jobs: run: | cd .. git clone https://github.com/opea-project/GenAIComps.git - cd GenAIComps && git checkout refactor_retrievers - name: Check for Missing Dockerfile Paths in GenAIComps run: |
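Several of the test-script changes above adjust what `validate_service` expects from a microservice response (e.g. the GraphRAG retriever check now matches `retrieved_docs` on port 7000 instead of a fixed sentence on 6009). A minimal, self-contained sketch of that content check follows; the function name and simplified signature are assumptions, and the real helper in these scripts additionally curls the endpoint and dumps container logs on failure.

```shell
# validate_response: succeed only if the response body contains the expected marker.
# A simplified stand-in for the validate_service helper used in the test scripts.
validate_response() {
    local body=$1 expected=$2 name=${3:-service}
    case "$body" in
        *"$expected"*)
            echo "[ $name ] Content is as expected."
            ;;
        *)
            echo "[ $name ] Content does not match expected result: $body" >&2
            return 1
            ;;
    esac
}

# Example mirroring the updated GraphRAG retriever expectation:
validate_response '{"retrieved_docs": [], "initial_query": "test"}' "retrieved_docs" "retriever-neo4j-server"
```

Matching a stable field name like `retrieved_docs` is less brittle than matching full answer text, which is why the patch above swaps the expected string.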