
Commit 88fde62

fix image build issue on push (#780)
Signed-off-by: chensuyue <[email protected]>
1 parent 1144fae commit 88fde62

2 files changed: 4 additions & 2 deletions

.github/workflows/push-image-build.yml
Lines changed: 2 additions & 0 deletions

@@ -18,6 +18,8 @@ concurrency:
 jobs:
   job1:
     uses: ./.github/workflows/_get-test-matrix.yml
+    with:
+      test_mode: "docker_image_build/build.yaml"

   image-build:
     needs: job1
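
In this hunk, `job1` calls the reusable workflow `_get-test-matrix.yml`, and the fix passes `docker_image_build/build.yaml` as the `test_mode` input via the new `with:` block, so the test matrix is derived from the image-build config on push. A minimal sketch of the `workflow_call` interface the reusable workflow would need in order to accept this input is shown below; only the `test_mode` input name and the value passed above come from the diff, while the description, default, and placeholder job are assumptions for illustration.

```yaml
# Hypothetical sketch of .github/workflows/_get-test-matrix.yml's calling
# interface, assuming a standard reusable-workflow declaration. Only the
# `test_mode` input name and the value passed by the caller come from the
# diff; the default, description, and placeholder job are illustrative.
name: Get test matrix

on:
  workflow_call:
    inputs:
      test_mode:
        description: "File name/pattern used to select which services enter the matrix"
        type: string
        required: false
        default: "docker_compose/compose.yaml"  # assumed default for non-build callers

jobs:
  get-test-matrix:  # placeholder job; the real workflow computes and exports a matrix
    runs-on: ubuntu-latest
    steps:
      - name: Show selected test mode
        run: echo "Selecting services based on ${{ inputs.test_mode }} files"
```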

docker_images_list.md
Lines changed: 2 additions & 2 deletions

@@ -46,9 +46,9 @@ Take ChatQnA for example. ChatQnA is a chatbot application service based on the
 | [opea/gmcrouter](https://hub.docker.com/r/opea/gmcrouter) | [Link](https://github.com/opea-project/GenAIInfra/blob/main/microservices-connector/Dockerfile.manager) | The docker image served as one of key parts of the OPEA GenAI Microservice Connector(GMC) to route the traffic among the microservices defined in GMC |
 | [opea/gmcmanager](https://hub.docker.com/r/opea/gmcmanager) | [Link](https://github.com/opea-project/GenAIInfra/blob/main/microservices-connector/Dockerfile.router) | The docker image served as one of key parts of the OPEA GenAI Microservice Connector(GMC) to be controller manager to handle GMC CRD |
 | [opea/guardrails-tgi](https://hub.docker.com/r/opea/guardrails-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/llama_guard/langchain/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide content review for GenAI application use |
-| [opea/guardrails-pii-detection](https://hub.docker.com/r/opea/guardrails-pii-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/pii_detection/docker/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide PII detection for GenAI application use |
+| [opea/guardrails-pii-detection](https://hub.docker.com/r/opea/guardrails-pii-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/pii_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide PII detection for GenAI application use |
 | [opea/habanalabs](https://hub.docker.com/r/opea/habanalabs) | | |
-| [opea/knowledge_graphs](https://hub.docker.com/r/opea/knowledge_graphs) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/knowledgegraphs/langchain/docker/Dockerfile) | The docker image served as knowledge graph gateway to enhance question answering with graph knowledge searching. |
+| [opea/knowledge_graphs](https://hub.docker.com/r/opea/knowledge_graphs) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/knowledgegraphs/langchain/Dockerfile) | The docker image served as knowledge graph gateway to enhance question answering with graph knowledge searching. |
 | [opea/llm-docsum-tgi](https://hub.docker.com/r/opea/llm-docsum-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/summarization/tgi/langchain/Dockerfile) | This docker image is designed to build a document summarization microservice using the HuggingFace Text Generation Inference(TGI) framework. The microservice accepts document input and generates a document summary. |
 | [opea/llm-faqgen-tgi](https://hub.docker.com/r/opea/llm-faqgen-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/faq-generation/tgi/langchain/Dockerfile) | This docker image is designed to build a frequently asked questions microservice using the HuggingFace Text Generation Inference(TGI) framework. The microservice accepts document input and generates a FAQ. |
 | [opea/llm-ollama](https://hub.docker.com/r/opea/llm-ollama) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/text-generation/ollama/langchain/Dockerfile) | The docker image exposed the OPEA LLM microservice based on ollama for GenAI application use |
