Conversation

@jeremytourville

jeremytourville commented Feb 26, 2025

Fixes #716

@gerrod3
Collaborator

gerrod3 commented Mar 4, 2025

@jeremytourville Thanks for the PR! What is the outcome you would like to see? Are you able to build your own FIPS-enabled images, and is this contribution just an example for other users? Or would you like us to build and publish this FIPS image? I'm averse to accepting new images to maintain and publish (I'm actually on a crusade to simplify and reduce the number we currently maintain). The changes needed to add FIPS seem simple enough to add as an example in the docs, and that's the direction I'm currently leaning for this PR.

@jeremytourville
Author

@gerrod3 I am NOT a developer! (Just a devops guy who knows enough to be dangerous and write a little code.) I hope you can review this image for correctness; it seems to work, but you might find issues. I was hoping your team could build and publish a FIPS-enabled image as a base standard, and then leave it up to the person deploying to determine whether the host OS should run FIPS or not. I am unable to use the Pulp project as a whole unless I can run a FIPS-enabled image (see the issue I posted); FIPS is mandated for many of us who support government-sector work.

@jeremytourville
Author

jeremytourville commented Mar 5, 2025

@gerrod3 @ggainey @mikedep333 @git-hyagi

As @git-hyagi notes in the email quoted below, there is a module that prevents building an image that can run with FIPS enabled. I have had only limited success getting an image that contains all of the components.

I can't speak to how challenging this may be to fix. I hope you will all give this issue some consideration in the near future and put it on the roadmap. It would certainly increase the user base, since more of us could use the images in Podman, Docker, or Kubernetes/OpenShift. Thank you!

@git-hyagi Thank you for the email. (It is copied here for posterity's sake)

Hi @Jeremy_Tourville!

Sorry for the late response! ggainey and mikedep333 were my top picks/recommendations, but I noticed that you had already spoken with them in GH issues.
I was trying to reproduce this error in my lab env and, even though I could see a “non-compliant” package, I could not rebuild and reinstall it.

Here are the steps I’ve done so far:

  • try to find the non-FIPS-compliant module (the output was very messy, but it helped to find it):
$ kubectl debug -it $(kubectl get pods -oname -l app=pulp-content) --image=quay.io/pulp/pulp-minimal:stable

pod$ pip3 list | awk '{gsub(/-/,"_") ;print $1}' > /tmp/c

pod$ while read i ; do echo -n "$i   "; python3 -c "import $i" ; done </tmp/c
  • with the above script, I could identify that createrepo_c was the impacted module/package
  • then I cloned the pulp-oci-images repo (still on the host with FIPS enabled)
  • modified the Containerfile to install some dependencies and rebuild/reinstall createrepo_c:
$ vim images/pulp-minimal/stable/Containerfile.core

  -c /constraints.txt && \

  rm -rf /root/.cache/pip


+ RUN dnf -y install gcc python3-devel cmake bzip2-devel bzip2-libs libcurl-devel libxml2-devel glib2-devel sqlite-devel rpm-devel libmodulemd python3-libmodulemd

+ RUN pip3 install --ignore-installed --no-binary :all: createrepo_c


# Prevent pip-installed /usr/local/bin/pulp-content from running instead of the one we are building (/usr/bin/pulp-content).

RUN rm -f /usr/local/bin/pulp-content

...


  • but it is failing to build:
$ podman build -t pulp/base:latest -f images/Containerfile.core.base .

$ podman build -t pulp/pulp-minimal:latest -f images/pulp-minimal/stable/Containerfile.core .

...
  --   Package 'modulemd-2.0', required by 'virtual:world', not found

  CMake Error at /usr/share/cmake/Modules/FindPkgConfig.cmake:607 (message):

    A required package was not found

  Call Stack (most recent call first):

    /usr/share/cmake/Modules/FindPkgConfig.cmake:829 (_pkg_check_modules_internal)

    CMakeLists.txt:94 (pkg_check_modules)

...
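For context on that failure: the CMake message means pkg-config could not find the modulemd-2.0 development metadata (its .pc file). The libmodulemd and python3-libmodulemd packages installed above only provide the runtime library; on EL9-style images the .pc file ships in libmodulemd-devel. So, as a guess based on the error rather than anything verified in this thread, the dnf line in the modified Containerfile would likely need the -devel package instead, e.g.:

+ RUN dnf -y install gcc python3-devel cmake bzip2-devel libcurl-devel libxml2-devel glib2-devel sqlite-devel rpm-devel libmodulemd-devel

before the pip3 install --ignore-installed --no-binary :all: createrepo_c step has a chance of succeeding.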

@dralley
Contributor

dralley commented Mar 16, 2025

This PR should make createrepo_c FIPS compliant again.

rpm-software-management/createrepo_c#446

@gerrod3
Collaborator

gerrod3 commented Mar 20, 2025

@jeremytourville Can you try running the latest images in FIPS mode and see if they work?

@jeremytourville
Author

jeremytourville commented Mar 24, 2025 via email

@dralley
Contributor

dralley commented Mar 24, 2025

Which version of createrepo_c is installed in your image?

@jeremytourville
Author

jeremytourville commented Mar 24, 2025 via email

@gerrod3
Collaborator

gerrod3 commented Mar 28, 2025

Can you run pip list | grep createrepo_c inside the container? We want to make sure that the image you are using has the latest version of createrepo_c, which should contain the fix that enables running in FIPS mode.

@jeremytourville
Author

pulp/pulp-minimal:latest

[root@ab822cdefd12 /]# pip list | grep createrepo_c
createrepo_c                             1.2.1

pulp/pulp:latest

[root@d43431258adb /]# pip list | grep createrepo_c
createrepo_c                             1.2.1

So it looks like the single-process and multi-process images both run the same version of createrepo_c.

Logs for pulp/pulp:latest still show: crypto/fips/fips.c:154: OpenSSL internal error: FATAL FIPS SELFTEST FAILURE

Attempting to run pulp/pulp-minimal:latest as part of a compose file fails; I get a bunch of exit code 125 and 127 errors. The only way I could run the pulp image at all was in debug mode as a single container. I have not modified the compose file in any way; I am using exactly what is published on GitHub.

[root@sgsir-podman01 compose]# podman compose up -d
Executing external compose provider "/bin/podman-compose". Please refer to the documentation for details.

podman-compose version: 1.0.6
['podman', '--version', '']
using podman version: 4.9.4-rhel
** excluding:  set()
['podman', 'ps', '--filter', 'label=io.podman.compose.project=compose', '-a', '--format', '{{ index .Labels "io.podman.compose.config-hash"}}']
podman volume inspect pg_datadev || podman volume create pg_datadev
['podman', 'volume', 'inspect', 'pg_datadev']
['podman', 'network', 'exists', 'compose_default']
podman run --name=compose_postgres_1 -d --label io.podman.compose.config-hash=259b30dc8eb88e8b280ed84c28d1b408d89ab1d099c6b305af8b0bf66351509e --label io.podman.compose.project=compose --label io.podman.compose.version=1.0.6 --label [email protected] --label com.docker.compose.project=compose --label com.docker.compose.project.working_dir=/opt/pulp-prod/pulp-oci-images/images/compose --label com.docker.compose.project.config_files=compose.yml --label com.docker.compose.container-number=1 --label com.docker.compose.service=postgres -e POSTGRES_USER=pulp -e POSTGRES_PASSWORD=password -e POSTGRES_DB=pulp -e POSTGRES_INITDB_ARGS=--auth-host=scram-sha-256 -e POSTGRES_HOST_AUTH_METHOD=scram-sha-256 -v pg_datadev:/var/lib/postgresql/data -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/postgres/passwd:/etc/passwd:Z --net compose_default --network-alias postgres -p 5432:5432 --restart always --healthcheck-command /bin/sh -c 'pg_isready -U pulp' --healthcheck-interval 10s --healthcheck-timeout 5s --healthcheck-retries 5 docker.io/library/postgres:13
5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d
exit code: 0
podman volume inspect redis_datadev || podman volume create redis_datadev
['podman', 'volume', 'inspect', 'redis_datadev']
['podman', 'network', 'exists', 'compose_default']
podman run --name=compose_redis_1 -d --label io.podman.compose.config-hash=259b30dc8eb88e8b280ed84c28d1b408d89ab1d099c6b305af8b0bf66351509e --label io.podman.compose.project=compose --label io.podman.compose.version=1.0.6 --label [email protected] --label com.docker.compose.project=compose --label com.docker.compose.project.working_dir=/opt/pulp-prod/pulp-oci-images/images/compose --label com.docker.compose.project.config_files=compose.yml --label com.docker.compose.container-number=1 --label com.docker.compose.service=redis -v redis_datadev:/data --net compose_default --network-alias redis --restart always --healthcheck-command /bin/sh -c 'redis-cli ping' --healthcheck-interval 10s --healthcheck-timeout 5s --healthcheck-retries 5 docker.io/library/redis:latest
b9e56833921538371e49f8b3123304ce6aab6c64537f6ff17fc77ad032396d50
exit code: 0
podman volume inspect pulpdev || podman volume create pulpdev
['podman', 'volume', 'inspect', 'pulpdev']
['podman', 'network', 'exists', 'compose_default']
podman run --name=compose_migration_service_1 -d --requires=compose_postgres_1 --label io.podman.compose.config-hash=259b30dc8eb88e8b280ed84c28d1b408d89ab1d099c6b305af8b0bf66351509e --label io.podman.compose.project=compose --label io.podman.compose.version=1.0.6 --label [email protected] --label com.docker.compose.project=compose --label com.docker.compose.project.working_dir=/opt/pulp-prod/pulp-oci-images/images/compose --label com.docker.compose.project.config_files=compose.yml --label com.docker.compose.container-number=1 --label com.docker.compose.service=migration_service -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/settings.py:/etc/pulp/settings.py:z -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/certs:/etc/pulp/certs:z -v pulpdev:/var/lib/pulp --net compose_default --network-alias migration_service pulp/pulp-minimal:latest pulpcore-manager migrate --noinput
a25ffeba5a50464806ebc958b0ca31cbcd7abe00edc0adcc9c25655daf854f09
exit code: 0
podman volume inspect pulpdev || podman volume create pulpdev
['podman', 'volume', 'inspect', 'pulpdev']
['podman', 'network', 'exists', 'compose_default']
podman run --name=compose_set_init_password_service_1 -d --requires=compose_postgres_1 --label io.podman.compose.config-hash=259b30dc8eb88e8b280ed84c28d1b408d89ab1d099c6b305af8b0bf66351509e --label io.podman.compose.project=compose --label io.podman.compose.version=1.0.6 --label [email protected] --label com.docker.compose.project=compose --label com.docker.compose.project.working_dir=/opt/pulp-prod/pulp-oci-images/images/compose --label com.docker.compose.project.config_files=compose.yml --label com.docker.compose.container-number=1 --label com.docker.compose.service=set_init_password_service -e PULP_DEFAULT_ADMIN_PASSWORD=password -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/settings.py:/etc/pulp/settings.py:z -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/certs:/etc/pulp/certs:z -v pulpdev:/var/lib/pulp --net compose_default --network-alias set_init_password_service pulp/pulp-minimal:latest set_init_password.sh
47000b22eb6ee22c5b1f4b3a27cb9e9570ebde898ef61136ed6fd02dc3d48d72
exit code: 0
podman volume inspect pulpdev || podman volume create pulpdev
['podman', 'volume', 'inspect', 'pulpdev']
['podman', 'network', 'exists', 'compose_default']
podman run --name=compose_signing_key_service_1 -d --requires=compose_migration_service_1,compose_postgres_1 --label io.podman.compose.config-hash=259b30dc8eb88e8b280ed84c28d1b408d89ab1d099c6b305af8b0bf66351509e --label io.podman.compose.project=compose --label io.podman.compose.version=1.0.6 --label [email protected] --label com.docker.compose.project=compose --label com.docker.compose.project.working_dir=/opt/pulp-prod/pulp-oci-images/images/compose --label com.docker.compose.project.config_files=compose.yml --label com.docker.compose.container-number=1 --label com.docker.compose.service=signing_key_service -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/settings.py:/etc/pulp/settings.py:z -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/certs:/etc/pulp/certs:z -v pulpdev:/var/lib/pulp --net compose_default --network-alias signing_key_service pulp/pulp-minimal:latest sh -c add_signing_service.sh
Error: generating dependency graph for container 6654f76395231a15d507705eeb157936f0264fe5582c29f0e01c8212a51aeb34: container a25ffeba5a50464806ebc958b0ca31cbcd7abe00edc0adcc9c25655daf854f09 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 127
podman start compose_signing_key_service_1
Error: unable to start container "6654f76395231a15d507705eeb157936f0264fe5582c29f0e01c8212a51aeb34": generating dependency graph for container 6654f76395231a15d507705eeb157936f0264fe5582c29f0e01c8212a51aeb34: container a25ffeba5a50464806ebc958b0ca31cbcd7abe00edc0adcc9c25655daf854f09 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 125
podman volume inspect pulpdev || podman volume create pulpdev
['podman', 'volume', 'inspect', 'pulpdev']
['podman', 'network', 'exists', 'compose_default']
podman run --name=compose_pulp_content_1 -d --requires=compose_redis_1,compose_migration_service_1,compose_postgres_1 --label io.podman.compose.config-hash=259b30dc8eb88e8b280ed84c28d1b408d89ab1d099c6b305af8b0bf66351509e --label io.podman.compose.project=compose --label io.podman.compose.version=1.0.6 --label [email protected] --label com.docker.compose.project=compose --label com.docker.compose.project.working_dir=/opt/pulp-prod/pulp-oci-images/images/compose --label com.docker.compose.project.config_files=compose.yml --label com.docker.compose.container-number=1 --label com.docker.compose.service=pulp_content -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/settings.py:/etc/pulp/settings.py:z -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/certs:/etc/pulp/certs:z -v pulpdev:/var/lib/pulp --net compose_default --network-alias pulp_content -u pulp --hostname pulp-content --restart always --healthcheck-command /bin/sh -c 'readyz.py /pulp/content/' --healthcheck-interval 10s --healthcheck-timeout 5s --healthcheck-retries 5 pulp/pulp-minimal:latest pulp-content
Error: generating dependency graph for container 6295049303f8c9afcbb7a01c6732864bbc1327f603582fed499b0d9662db0624: container a25ffeba5a50464806ebc958b0ca31cbcd7abe00edc0adcc9c25655daf854f09 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 127
podman start compose_pulp_content_1
Error: unable to start container "6295049303f8c9afcbb7a01c6732864bbc1327f603582fed499b0d9662db0624": generating dependency graph for container 6295049303f8c9afcbb7a01c6732864bbc1327f603582fed499b0d9662db0624: container a25ffeba5a50464806ebc958b0ca31cbcd7abe00edc0adcc9c25655daf854f09 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 125
podman volume inspect pulpdev || podman volume create pulpdev
['podman', 'volume', 'inspect', 'pulpdev']
['podman', 'network', 'exists', 'compose_default']
podman run --name=compose_pulp_content_2 -d --requires=compose_redis_1,compose_migration_service_1,compose_postgres_1 --label io.podman.compose.config-hash=259b30dc8eb88e8b280ed84c28d1b408d89ab1d099c6b305af8b0bf66351509e --label io.podman.compose.project=compose --label io.podman.compose.version=1.0.6 --label [email protected] --label com.docker.compose.project=compose --label com.docker.compose.project.working_dir=/opt/pulp-prod/pulp-oci-images/images/compose --label com.docker.compose.project.config_files=compose.yml --label com.docker.compose.container-number=2 --label com.docker.compose.service=pulp_content -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/settings.py:/etc/pulp/settings.py:z -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/certs:/etc/pulp/certs:z -v pulpdev:/var/lib/pulp --net compose_default --network-alias pulp_content -u pulp --hostname pulp-content --restart always --healthcheck-command /bin/sh -c 'readyz.py /pulp/content/' --healthcheck-interval 10s --healthcheck-timeout 5s --healthcheck-retries 5 pulp/pulp-minimal:latest pulp-content
Error: generating dependency graph for container 06d1b11877cdb9e57cfafa8613e3f903a74acb9b97d25cefdd2b3ef4d8a78f0c: container a25ffeba5a50464806ebc958b0ca31cbcd7abe00edc0adcc9c25655daf854f09 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 127
podman start compose_pulp_content_2
Error: unable to start container "06d1b11877cdb9e57cfafa8613e3f903a74acb9b97d25cefdd2b3ef4d8a78f0c": generating dependency graph for container 06d1b11877cdb9e57cfafa8613e3f903a74acb9b97d25cefdd2b3ef4d8a78f0c: container a25ffeba5a50464806ebc958b0ca31cbcd7abe00edc0adcc9c25655daf854f09 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 125
podman volume inspect pulpdev || podman volume create pulpdev
['podman', 'volume', 'inspect', 'pulpdev']
['podman', 'network', 'exists', 'compose_default']
podman run --name=compose_pulp_worker_1 -d --requires=compose_redis_1,compose_migration_service_1,compose_postgres_1 --label io.podman.compose.config-hash=259b30dc8eb88e8b280ed84c28d1b408d89ab1d099c6b305af8b0bf66351509e --label io.podman.compose.project=compose --label io.podman.compose.version=1.0.6 --label [email protected] --label com.docker.compose.project=compose --label com.docker.compose.project.working_dir=/opt/pulp-prod/pulp-oci-images/images/compose --label com.docker.compose.project.config_files=compose.yml --label com.docker.compose.container-number=1 --label com.docker.compose.service=pulp_worker -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/settings.py:/etc/pulp/settings.py:z -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/certs:/etc/pulp/certs:z -v pulpdev:/var/lib/pulp --net compose_default --network-alias pulp_worker -u pulp --restart always pulp/pulp-minimal:latest pulp-worker
Error: generating dependency graph for container 5fb0498777fb2961e5a13bce79ca06a46271fad5a2436e9bdff33aa0b55b3923: container a25ffeba5a50464806ebc958b0ca31cbcd7abe00edc0adcc9c25655daf854f09 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 127
podman start compose_pulp_worker_1
Error: unable to start container "5fb0498777fb2961e5a13bce79ca06a46271fad5a2436e9bdff33aa0b55b3923": generating dependency graph for container 5fb0498777fb2961e5a13bce79ca06a46271fad5a2436e9bdff33aa0b55b3923: container a25ffeba5a50464806ebc958b0ca31cbcd7abe00edc0adcc9c25655daf854f09 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 125
podman volume inspect pulpdev || podman volume create pulpdev
['podman', 'volume', 'inspect', 'pulpdev']
['podman', 'network', 'exists', 'compose_default']
podman run --name=compose_pulp_worker_2 -d --requires=compose_redis_1,compose_migration_service_1,compose_postgres_1 --label io.podman.compose.config-hash=259b30dc8eb88e8b280ed84c28d1b408d89ab1d099c6b305af8b0bf66351509e --label io.podman.compose.project=compose --label io.podman.compose.version=1.0.6 --label [email protected] --label com.docker.compose.project=compose --label com.docker.compose.project.working_dir=/opt/pulp-prod/pulp-oci-images/images/compose --label com.docker.compose.project.config_files=compose.yml --label com.docker.compose.container-number=2 --label com.docker.compose.service=pulp_worker -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/settings.py:/etc/pulp/settings.py:z -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/certs:/etc/pulp/certs:z -v pulpdev:/var/lib/pulp --net compose_default --network-alias pulp_worker -u pulp --restart always pulp/pulp-minimal:latest pulp-worker
Error: generating dependency graph for container ad86bf8e74c806db4c42fb1249b316f3ffb3395964febcf538499242a56c111d: container a25ffeba5a50464806ebc958b0ca31cbcd7abe00edc0adcc9c25655daf854f09 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 127
podman start compose_pulp_worker_2
Error: unable to start container "ad86bf8e74c806db4c42fb1249b316f3ffb3395964febcf538499242a56c111d": generating dependency graph for container ad86bf8e74c806db4c42fb1249b316f3ffb3395964febcf538499242a56c111d: container a25ffeba5a50464806ebc958b0ca31cbcd7abe00edc0adcc9c25655daf854f09 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 125
podman volume inspect pulpdev || podman volume create pulpdev
['podman', 'volume', 'inspect', 'pulpdev']
['podman', 'network', 'exists', 'compose_default']
podman run --name=compose_pulp_api_1 -d --requires=compose_redis_1,compose_signing_key_service_1,compose_set_init_password_service_1,compose_migration_service_1,compose_postgres_1 --label io.podman.compose.config-hash=259b30dc8eb88e8b280ed84c28d1b408d89ab1d099c6b305af8b0bf66351509e --label io.podman.compose.project=compose --label io.podman.compose.version=1.0.6 --label [email protected] --label com.docker.compose.project=compose --label com.docker.compose.project.working_dir=/opt/pulp-prod/pulp-oci-images/images/compose --label com.docker.compose.project.config_files=compose.yml --label com.docker.compose.container-number=1 --label com.docker.compose.service=pulp_api -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/settings.py:/etc/pulp/settings.py:z -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/certs:/etc/pulp/certs:z -v pulpdev:/var/lib/pulp --net compose_default --network-alias pulp_api -u pulp --hostname pulp-api --restart always --healthcheck-command /bin/sh -c 'readyz.py /pulp/api/v3/status/' --healthcheck-interval 10s --healthcheck-timeout 5s --healthcheck-retries 5 pulp/pulp-minimal:latest pulp-api
Error: generating dependency graph for container ebd04acf6f8293c20e162d8320fddb6267cac6df9928275de8444557e0dd9976: container 47000b22eb6ee22c5b1f4b3a27cb9e9570ebde898ef61136ed6fd02dc3d48d72 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 127
podman start compose_pulp_api_1
Error: unable to start container "ebd04acf6f8293c20e162d8320fddb6267cac6df9928275de8444557e0dd9976": generating dependency graph for container ebd04acf6f8293c20e162d8320fddb6267cac6df9928275de8444557e0dd9976: container 6654f76395231a15d507705eeb157936f0264fe5582c29f0e01c8212a51aeb34 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 125
podman volume inspect pulpdev || podman volume create pulpdev
['podman', 'volume', 'inspect', 'pulpdev']
['podman', 'network', 'exists', 'compose_default']
podman run --name=compose_pulp_api_2 -d --requires=compose_redis_1,compose_signing_key_service_1,compose_set_init_password_service_1,compose_migration_service_1,compose_postgres_1 --label io.podman.compose.config-hash=259b30dc8eb88e8b280ed84c28d1b408d89ab1d099c6b305af8b0bf66351509e --label io.podman.compose.project=compose --label io.podman.compose.version=1.0.6 --label [email protected] --label com.docker.compose.project=compose --label com.docker.compose.project.working_dir=/opt/pulp-prod/pulp-oci-images/images/compose --label com.docker.compose.project.config_files=compose.yml --label com.docker.compose.container-number=2 --label com.docker.compose.service=pulp_api -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/settings.py:/etc/pulp/settings.py:z -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/certs:/etc/pulp/certs:z -v pulpdev:/var/lib/pulp --net compose_default --network-alias pulp_api -u pulp --hostname pulp-api --restart always --healthcheck-command /bin/sh -c 'readyz.py /pulp/api/v3/status/' --healthcheck-interval 10s --healthcheck-timeout 5s --healthcheck-retries 5 pulp/pulp-minimal:latest pulp-api
Error: generating dependency graph for container 4275d3cfa57001c9298e458ccc9d56a08b262b5b879e8ffd3d5134d3c97de6c3: container 47000b22eb6ee22c5b1f4b3a27cb9e9570ebde898ef61136ed6fd02dc3d48d72 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 127
podman start compose_pulp_api_2
Error: unable to start container "4275d3cfa57001c9298e458ccc9d56a08b262b5b879e8ffd3d5134d3c97de6c3": generating dependency graph for container 4275d3cfa57001c9298e458ccc9d56a08b262b5b879e8ffd3d5134d3c97de6c3: container 6654f76395231a15d507705eeb157936f0264fe5582c29f0e01c8212a51aeb34 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 125
['podman', 'network', 'exists', 'compose_default']
podman run --name=compose_pulp_web_1 -d --requires=compose_pulp_api_1,compose_pulp_api_2,compose_redis_1,compose_signing_key_service_1,compose_pulp_content_1,compose_pulp_content_2,compose_set_init_password_service_1,compose_migration_service_1,compose_postgres_1 --label io.podman.compose.config-hash=259b30dc8eb88e8b280ed84c28d1b408d89ab1d099c6b305af8b0bf66351509e --label io.podman.compose.project=compose --label io.podman.compose.version=1.0.6 --label [email protected] --label com.docker.compose.project=compose --label com.docker.compose.project.working_dir=/opt/pulp-prod/pulp-oci-images/images/compose --label com.docker.compose.project.config_files=compose.yml --label com.docker.compose.container-number=1 --label com.docker.compose.service=pulp_web -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/bin/nginx.sh:/usr/bin/nginx.sh:Z -v /opt/pulp-prod/pulp-oci-images/images/compose/assets/nginx/nginx.conf.template:/etc/nginx/nginx.conf.template:Z --net compose_default --network-alias pulp_web -p 8080:8080 -u root --hostname pulp --restart always pulp/pulp-web:latest /usr/bin/nginx.sh
Error: generating dependency graph for container 035f38c5d4c5b01420eefceb2aa6c8e8e66d1d1e40815ffa98b7aae39b2eac5b: container 06d1b11877cdb9e57cfafa8613e3f903a74acb9b97d25cefdd2b3ef4d8a78f0c depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 127
podman start compose_pulp_web_1
Error: unable to start container "035f38c5d4c5b01420eefceb2aa6c8e8e66d1d1e40815ffa98b7aae39b2eac5b": generating dependency graph for container 035f38c5d4c5b01420eefceb2aa6c8e8e66d1d1e40815ffa98b7aae39b2eac5b: container 6654f76395231a15d507705eeb157936f0264fe5582c29f0e01c8212a51aeb34 depends on container 5b6d6d55aa155f9607f0b92521550ff0acc875ce454b9d8e733304b7b3973c1d not found in input list: no such container
exit code: 125

@dralley
Contributor

dralley commented Mar 31, 2025

Well, it's the correct (new) version at least, so that's not the issue. And are the openssl errors raised when you try to import the createrepo_c package?

@jeremytourville
Author

jeremytourville commented Mar 31, 2025

And are the openssl errors raised when you try to import the createrepo_c package?

The openssl errors happen when I simply try to run either:

  • the compose file for the single-process image
  • podman run for the multi-process image

I'm not doing anything that specifically imports createrepo_c; if there's an import happening, the code must be doing it.
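One way to narrow that down (a sketch; it assumes the host has FIPS mode enabled and that pulp/pulp-minimal:latest is the tag the compose file uses) is to trigger the import directly, without the rest of the stack:

$ fips-mode-setup --check
$ podman run --rm pulp/pulp-minimal:latest python3 -c "import createrepo_c; print('import ok')"

If the import alone aborts with the same crypto/fips/fips.c:154 selftest failure, the problem lives in the createrepo_c wheel (or the OpenSSL copy it bundles) rather than in the compose setup.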

@stale

stale bot commented May 24, 2025

This pull request has been marked 'stale' due to lack of recent activity. If there is no further activity, the PR will be closed in another 30 days. Thank you for your contribution!

@stale stale bot added the stale label May 24, 2025
@dralley
Contributor

dralley commented Jun 3, 2025

I still encounter the same issue - crypto/fips/fips.c:154: OpenSSL internal error: FATAL FIPS SELFTEST FAILURE

So, I think this is the real key detail.

I've had a couple of discussions with folks, and it seems like this pinpoints the point of failure as the self-integrity check that openssl performs when running in FIPS mode. Since the package is installed from PyPI, it's pulling a pre-built binary wheel processed with manylinux, which does a number of things, including bundling a copy of openssl to be used instead of the system package. I believe this bundling process might be breaking the integrity check.

If that's the case, then building createrepo_c from source inside the container, so that the system libraries are used instead of the bundled ones, should probably work, though it's an unfortunate thing to need to do. Otherwise, an image based on Katello's RPM packages might work.
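A minimal sketch of that from-source approach, assuming an EL9-style base image (the exact list of -devel packages is a guess and may need adjusting), could look like:

RUN dnf -y install gcc cmake python3-devel glib2-devel libcurl-devel libxml2-devel \
        bzip2-devel xz-devel zlib-devel openssl-devel sqlite-devel rpm-devel libmodulemd-devel && \
    pip3 install --ignore-installed --no-binary :all: createrepo_c && \
    dnf clean all

The key part is --no-binary :all:, which forces pip to build from the source distribution so the extension links against the container's system OpenSSL instead of the copy bundled into the manylinux wheel.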

@stale

stale bot commented Jun 3, 2025

This issue is no longer marked for closure.

@stale stale bot removed the stale label Jun 3, 2025
@dralley
Contributor

dralley commented Jun 3, 2025

Alternatively maybe something along these lines works:

https://stackoverflow.com/questions/70788681/pyinstaller-fatal-fips-selftest-failure

This seems to be an old post but just in case...

I use PyInstaller quite often, but after upgrading from CentOS 7.x to RHEL 8.x I began experiencing the same behavior. Here is what I found: PyInstaller pulls in the correct openssl shared libraries but does not bundle the associated HMAC files (in our case /usr/lib64/.libcrypto.so.1.1.hmac and /usr/lib64/.libssl.so.1.1.hmac).

I chose to just bundle the hmac files generated by the system. I did this by editing the associated .spec file that pyinstaller uses to create the bundled executable.

Changed:

binaries=[],

to:

binaries=[('/usr/lib64/.libcrypto.so.1.1.hmac','.'),('/usr/lib64/.libssl.so.1.1.hmac','.')],

No other modifications were required. If you don't use a .spec file then I suppose this could also be accomplished via the cmdline.


Piggybacking off of gcgold's answer (11766308) since I don't have enough reputation to comment yet.

That solution worked for us on RHEL8 as well, with minor tweaks. In our case, the libraries on the build server were version so.1.1.1k, but when bundled by PyInstaller they were renamed to version so.1.1. Thus, we had to copy and rename the HMAC files to match. E.g.: .libcrypto.so.1.1.1k.hmac to .libcrypto.so.1.1.hmac.

If you are not creating a onefile build, then renaming the HMAC files to match (in the resulting dist/program/_internal directory, after the build) works. But for a onefile build we had to copy and rename the HMAC files to a temporary directory ahead of time and then provide the renamed HMAC files via the --add-binary option or the spec file.

That might work if it's failing because the fingerprints of the bundled openssl are simply missing; in that case, re-bundling the fingerprints would potentially help.

With that said, I can't find those .hmac files anywhere on the CentOS Stream 9 system in the first place.
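A quick way to see where those fingerprints do (or don't) exist, both on the host and next to the wheel's bundled libraries (a sketch; the pip-derived site-packages path and the bundled-library layout are assumptions about how auditwheel packaged the wheel):

# on the FIPS-enabled host: legacy OpenSSL 1.1-era fingerprints, if any
$ ls -la /usr/lib64/.libcrypto*.hmac /usr/lib64/.libssl*.hmac 2>/dev/null || echo "no system .hmac files"

# inside the image: look for the bundled libcrypto and any .hmac sitting next to it
$ podman run --rm pulp/pulp-minimal:latest sh -c \
    'loc=$(pip3 show createrepo_c | sed -n "s/^Location: //p"); find "$loc" -name "libcrypto*" -o -name ".*.hmac"'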

@stale

stale bot commented Jul 3, 2025

This pull request has been marked 'stale' due to lack of recent activity. If there is no further activity, the PR will be closed in another 30 days. Thank you for your contribution!

@stale stale bot added the stale label Jul 3, 2025
@stale

stale bot commented Aug 2, 2025

This pull request has been closed due to inactivity. If you feel this is in error, please reopen the pull request or file a new PR with the relevant details.

@stale stale bot closed this Aug 2, 2025