
mca/base update #449


Merged — 3 commits merged into open-mpi:master on Apr 16, 2015

Conversation

@hjelmn (Member) commented Mar 6, 2015

No description provided.

@hjelmn (Member, Author) commented Mar 6, 2015

@jsquyres This is what I was talking about: it enables differentiating frameworks/components from different projects in the MCA variable system.

This PR will probably need a little more work before it is really ready to be merged into master.
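For context, here is a minimal self-contained sketch (illustration only, not part of this PR's diff; the btl/tcp/if_include names are just a familiar example) of what project-scoped matching adds: the same variable becomes addressable by a project_framework_component_variable name in addition to the traditional framework_component_variable name.

```c
/* Mock illustration -- not the Open MPI implementation. */
#include <stdio.h>

int main(void)
{
    const char *project   = "opal", *framework = "btl";
    const char *component = "tcp",  *variable  = "if_include";

    /* Traditional name: framework_component_variable */
    printf("%s_%s_%s\n", framework, component, variable);

    /* Project-qualified name: project_framework_component_variable.
     * This is what lets two projects reuse the same framework name
     * without their variables colliding. */
    printf("%s_%s_%s_%s\n", project, framework, component, variable);
    return 0;
}
```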

@mellanox-github commented:

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/326/
Test PASSed.

@jsquyres (Member) commented:

This is related to #475.

@mellanox-github commented:

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/379/

Build Log
last 50 lines

[...truncated 1165 lines...]
configure.ac:44: installing 'confdb/config.sub'
configure.ac:34: installing 'confdb/install-sh'
configure.ac:34: installing 'confdb/missing'
Makefile.am: installing 'confdb/depcomp'
autoreconf: Leaving directory `.'

7. Running autotools on top-level tree

==> Remove stale files
==> Writing m4 file with autogen.pl results
==> Generating opal_get_version.sh
Running: autom4te --language=m4sh opal_get_version.m4sh -o opal_get_version.sh
==> Running autoreconf
Running: autoreconf -ivf --warnings=all,no-obsolete,no-override -I config
autoreconf: Entering directory `.'
autoreconf: configure.ac: not using Gettext
autoreconf: running: aclocal -I config --force --warnings=all,no-obsolete,no-override -I config
autoreconf: configure.ac: tracing
autoreconf: running: libtoolize --copy --force
libtoolize: putting auxiliary files in AC_CONFIG_AUX_DIR, `config'.
libtoolize: copying file `config/ltmain.sh'
libtoolize: putting macros in AC_CONFIG_MACRO_DIR, `config'.
libtoolize: copying file `config/libtool.m4'
libtoolize: copying file `config/ltoptions.m4'
libtoolize: copying file `config/ltsugar.m4'
libtoolize: copying file `config/ltversion.m4'
libtoolize: copying file `config/lt~obsolete.m4'
autoreconf: running: /hpc/local/bin/autoconf --include=config --force --warnings=all,no-obsolete,no-override
autoreconf: running: /hpc/local/bin/autoheader --include=config --force --warnings=all,no-obsolete,no-override
autoreconf: running: automake --add-missing --copy --force-missing --warnings=all,no-obsolete,no-override
configure.ac:83: installing 'config/compile'
configure.ac:73: installing 'config/config.guess'
configure.ac:73: installing 'config/config.sub'
configure.ac:93: installing 'config/install-sh'
configure.ac:93: installing 'config/missing'
automake: error: cannot open < ompi/mca/Makefile.am: No such file or directory
autoreconf: automake failed with exit status: 1
Command failed: autoreconf -ivf --warnings=all,no-obsolete,no-override -I config
Build step 'Execute shell' marked build as failure
TAP Reports Processing: START
Looking for TAP results report in workspace using pattern: **/*.tap
Did not find any matching files.
Anchor chain: could not read file with links: /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/jenkins_sidelinks.txt (No such file or directory)
[copy-to-slave] The build is taking place on the master node, no copy back to the master will take place.
Setting commit status on GitHub for https://github.com/open-mpi/ompi/commit/7e361427e37cb1b215d6b2b5c38226d02b110986
[BFA] Scanning build for known causes...

[BFA] Done. 0s
Setting status of 73404fd4edc40ceb27a6ab706de04e48d651aeee to FAILURE with url http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr/379/ and message: Merged build finished.

Test FAILed.

@mellanox-github commented:

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/380/

Build Log
last 50 lines

[...truncated 1164 lines...]
configure.ac:44: installing 'confdb/config.sub'
configure.ac:34: installing 'confdb/install-sh'
configure.ac:34: installing 'confdb/missing'
Makefile.am: installing 'confdb/depcomp'
autoreconf: Leaving directory `.'

7. Running autotools on top-level tree

==> Remove stale files
==> Writing m4 file with autogen.pl results
==> Generating opal_get_version.sh
Running: autom4te --language=m4sh opal_get_version.m4sh -o opal_get_version.sh
==> Running autoreconf
Running: autoreconf -ivf --warnings=all,no-obsolete,no-override -I config
autoreconf: Entering directory `.'
autoreconf: configure.ac: not using Gettext
autoreconf: running: aclocal -I config --force --warnings=all,no-obsolete,no-override -I config
autoreconf: configure.ac: tracing
autoreconf: running: libtoolize --copy --force
libtoolize: putting auxiliary files in AC_CONFIG_AUX_DIR, `config'.
libtoolize: copying file `config/ltmain.sh'
libtoolize: putting macros in AC_CONFIG_MACRO_DIR, `config'.
libtoolize: copying file `config/libtool.m4'
libtoolize: copying file `config/ltoptions.m4'
libtoolize: copying file `config/ltsugar.m4'
libtoolize: copying file `config/ltversion.m4'
libtoolize: copying file `config/lt~obsolete.m4'
autoreconf: running: /hpc/local/bin/autoconf --include=config --force --warnings=all,no-obsolete,no-override
autoreconf: running: /hpc/local/bin/autoheader --include=config --force --warnings=all,no-obsolete,no-override
autoreconf: running: automake --add-missing --copy --force-missing --warnings=all,no-obsolete,no-override
configure.ac:83: installing 'config/compile'
configure.ac:73: installing 'config/config.guess'
configure.ac:73: installing 'config/config.sub'
configure.ac:93: installing 'config/install-sh'
configure.ac:93: installing 'config/missing'
automake: error: cannot open < ompi/mca/Makefile.am: No such file or directory
autoreconf: automake failed with exit status: 1
Command failed: autoreconf -ivf --warnings=all,no-obsolete,no-override -I config
Build step 'Execute shell' marked build as failure
TAP Reports Processing: START
Looking for TAP results report in workspace using pattern: **/*.tap
Did not find any matching files.
Anchor chain: could not read file with links: /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/jenkins_sidelinks.txt (No such file or directory)
[copy-to-slave] The build is taking place on the master node, no copy back to the master will take place.
Setting commit status on GitHub for https://github.com/open-mpi/ompi/commit/4787895063086bb541c79488aa8ef6e522856932
[BFA] Scanning build for known causes...

[BFA] Done. 0s
Setting status of 969dd9d5080657e82d1dd682cc5c1d98b69d3b78 to FAILURE with url http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr/380/ and message: Merged build finished.

Test FAILed.

@mellanox-github commented:

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/381/

Build Log
last 50 lines

[...truncated 5586 lines...]
  CCLD     libmca_timer.la
make[3]: Entering directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/opal/mca/timer'
make[3]: Nothing to be done for `install-exec-am'.
 /bin/mkdir -p '/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/include/openmpi/opal/mca/timer'
 /usr/bin/install -c -m 644  timer.h '/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/include/openmpi/opal/mca/timer/.'
 /bin/mkdir -p '/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/include/openmpi/opal/mca/timer/base'
 /usr/bin/install -c -m 644  base/base.h base/timer_base_null.h '/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/include/openmpi/opal/mca/timer/base'
make[3]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/opal/mca/timer'
make[2]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/opal/mca/timer'
Making install in mca/backtrace/execinfo
make[2]: Entering directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/opal/mca/backtrace/execinfo'
  CC       backtrace_execinfo.lo
  CC       backtrace_execinfo_component.lo
  CCLD     libmca_backtrace_execinfo.la
make[3]: Entering directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/opal/mca/backtrace/execinfo'
make[3]: Nothing to be done for `install-exec-am'.
make[3]: Nothing to be done for `install-data-am'.
make[3]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/opal/mca/backtrace/execinfo'
make[2]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/opal/mca/backtrace/execinfo'
Making install in mca/dl/dlopen
make[2]: Entering directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/opal/mca/dl/dlopen'
  CC       dl_dlopen_component.lo
  CC       dl_dlopen_module.lo
dl_dlopen_component.c:47: error: 'MCA_BASE_VERSION_2_0_0' undeclared here (not in a function)
dl_dlopen_component.c:47: warning: initialization makes integer from pointer without a cast
dl_dlopen_component.c:47: error: initializer element is not computable at load time
dl_dlopen_component.c:47: error: (near initialization for 'mca_dl_dlopen_component.base.base_version.mca_minor_version')
dl_dlopen_component.c:47: warning: missing braces around initializer
dl_dlopen_component.c:47: warning: (near initialization for 'mca_dl_dlopen_component.base.base_version.mca_project_name')
dl_dlopen_component.c:50: warning: initialization makes integer from pointer without a cast
dl_dlopen_component.c:50: error: initializer element is not computable at load time
dl_dlopen_component.c:50: error: (near initialization for 'mca_dl_dlopen_component.base.base_version.mca_project_name[2]')
make[2]: *** [dl_dlopen_component.lo] Error 1
make[2]: *** Waiting for unfinished jobs....
make[2]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/opal/mca/dl/dlopen'
make[1]: *** [install-recursive] Error 1
make[1]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/opal'
make: *** [install-recursive] Error 1
Build step 'Execute shell' marked build as failure
TAP Reports Processing: START
Looking for TAP results report in workspace using pattern: **/*.tap
Did not find any matching files.
Anchor chain: could not read file with links: /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/jenkins_sidelinks.txt (No such file or directory)
[copy-to-slave] The build is taking place on the master node, no copy back to the master will take place.
Setting commit status on GitHub for https://github.com/open-mpi/ompi/commit/ba7c724768076e72eac8f53a89e0fb2243e02e96
[BFA] Scanning build for known causes...

[BFA] Done. 0s
Setting status of e058f352f785d5498b846d8a24c5d2a1b0036d7d to FAILURE with url http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr/381/ and message: Merged build finished.

Test FAILed.

hjelmn added 3 commits March 27, 2015 10:59
This commit adds support for project_framework_component_* parameter
matching. This is the first step in allowing the same framework name
in multiple projects. This change also bumps the MCA component version
to 2.1.0.

All master frameworks have been updated to use the new component
versioning macro. An mca.h has been added to each project to add a
project specific versioning macro of the form
PROJECT_MCA_VERSION_2_1_0.

Signed-off-by: Nathan Hjelm <[email protected]>
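To make the commit message above concrete, here is a self-contained mock (not the real Open MPI headers; the struct layout and macro name are assumptions for illustration) of what a project-specific macro of the PROJECT_MCA_VERSION_2_1_0 form provides: the component's version block records the owning project alongside the MCA struct version.

```c
/* Mock types and macro -- NOT the real mca.h definitions. */
#include <stdio.h>

struct mock_component_version {
    int         mca_major_version;     /* MCA struct version, now 2.1.0 */
    int         mca_minor_version;
    int         mca_release_version;
    const char *mca_project_name;      /* new: owning project ("opal", "ompi", ...) */
    int         project_major_version;
};

/* Hypothetical project-specific macro; the real one lives in each
 * project's mca.h and may take different arguments. */
#define MOCK_OPAL_MCA_VERSION_2_1_0                                            \
    .mca_major_version = 2, .mca_minor_version = 1, .mca_release_version = 0, \
    .mca_project_name = "opal", .project_major_version = 1

static const struct mock_component_version dl_dlopen_version = {
    MOCK_OPAL_MCA_VERSION_2_1_0,
};

int main(void)
{
    printf("project=%s mca=%d.%d.%d\n",
           dl_dlopen_version.mca_project_name,
           dl_dlopen_version.mca_major_version,
           dl_dlopen_version.mca_minor_version,
           dl_dlopen_version.mca_release_version);
    return 0;
}
```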
@mellanox-github commented:

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/389/
Test PASSed.

@hppritcha (Member) commented:

Is this PR ready for merging?

@hjelmn (Member, Author) commented Mar 30, 2015

Not quite. I need to make a few changes to make sure MCA variable groups use the project name. Should be ready later today.

@mellanox-github commented:

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/400/

Build Log
last 50 lines

[...truncated 21981 lines...]
malloc debug: Request for 4 zeroed elements of size 0 (grpcomm_rcd.c, 99)
malloc debug: Request for 4 zeroed elements of size 0 (grpcomm_rcd.c, 99)
Hello, world, I am 5 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1428-g8cb2cc0, Unreleased developer copy, 138)
Hello, world, I am 6 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1428-g8cb2cc0, Unreleased developer copy, 138)
Hello, world, I am 1 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1428-g8cb2cc0, Unreleased developer copy, 138)
Hello, world, I am 7 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1428-g8cb2cc0, Unreleased developer copy, 138)
Hello, world, I am 0 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1428-g8cb2cc0, Unreleased developer copy, 138)
Hello, world, I am 2 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1428-g8cb2cc0, Unreleased developer copy, 138)
Hello, world, I am 4 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1428-g8cb2cc0, Unreleased developer copy, 138)
Hello, world, I am 3 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1428-g8cb2cc0, Unreleased developer copy, 138)
[jenkins01:12613] *** Process received signal ***
[jenkins01:12613] Signal: Segmentation fault (11)
[jenkins01:12613] Signal code: Address not mapped (1)
[jenkins01:12613] Failing at address: 0x7ffff3ee6b30
[jenkins01:12613] [ 0] /lib64/libpthread.so.0[0x3d6980f710]
[jenkins01:12613] [ 1] /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/lib/libopen-pal.so.0(mca_base_var_deregister+0xa9)[0x7ffff7a37328]
[jenkins01:12613] [ 2] /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/lib/libopen-pal.so.0(mca_base_var_group_deregister+0xb4)[0x7ffff7a4078c]
[jenkins01:12613] [ 3] /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/lib/libopen-pal.so.0(mca_base_var_group_deregister+0x190)[0x7ffff7a40868]
[jenkins01:12613] [ 4] /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/lib/libopen-pal.so.0(mca_base_framework_close+0xf2)[0x7ffff7a42521]
[jenkins01:12613] [ 5] /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/lib/openmpi/mca_ess_hnp.so(+0x51ea)[0x7ffff63881ea]
[jenkins01:12613] [ 6] /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/lib/libopen-rte.so.0(orte_finalize+0x9d)[0x7ffff7d1f699]
[jenkins01:12613] [ 7] mpirun[0x40584e]
[jenkins01:12613] [ 8] mpirun[0x403714]
[jenkins01:12613] [ 9] /lib64/libc.so.6(__libc_start_main+0xfd)[0x3d6901ed1d]
[jenkins01:12613] [10] mpirun[0x403639]
[jenkins01:12613] *** End of error message ***
Build step 'Execute shell' marked build as failure
TAP Reports Processing: START
Looking for TAP results report in workspace using pattern: **/*.tap
Saving reports...
Processing '/var/lib/jenkins/jobs/gh-ompi-master-pr/builds/400/tap-master-files/cov_stat.tap'
Parsing TAP test result [/var/lib/jenkins/jobs/gh-ompi-master-pr/builds/400/tap-master-files/cov_stat.tap].
not ok - coverity detected 916 failures in all_400 # SKIP http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr//ws/cov_build/all_400/output/errors/index.html
not ok - coverity detected 5 failures in oshmem_400 # TODO http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr//ws/cov_build/oshmem_400/output/errors/index.html
ok - coverity found no issues for yalla_400
ok - coverity found no issues for mxm_400
not ok - coverity detected 2 failures in fca_400 # TODO http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr//ws/cov_build/fca_400/output/errors/index.html
ok - coverity found no issues for hcoll_400

TAP Reports Processing: FINISH
coverity_for_all    http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr//ws/cov_build/all_400/output/errors/index.html
coverity_for_oshmem http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr//ws/cov_build/oshmem_400/output/errors/index.html
coverity_for_fca    http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr//ws/cov_build/fca_400/output/errors/index.html
[copy-to-slave] The build is taking place on the master node, no copy back to the master will take place.
Setting commit status on GitHub for https://api.github.com/repos/open-mpi/ompi/commit/8cb2cc0c498165f3d77bbec755cca3143dfc4089
[BFA] Scanning build for known causes...

[BFA] Done. 0s
Setting status of 94851e44d59281fadf3a23874e98c486ec9424ee to FAILURE with url http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr/400/ and message: Merged build finished.

Test FAILed.

@hjelmn (Member, Author) commented Apr 6, 2015

bot:retest

@mellanox-github commented:

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/416/

Build Log
last 50 lines

[...truncated 21791 lines...]
+ '[' yes == yes ']'
+ timeout -s SIGKILL 10m mpirun -np 8 -bind-to core -mca pml ob1 -mca btl self,tcp /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/examples/hello_c
Hello, world, I am 7 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1469-g566b0ea, Unreleased developer copy, 138)
Hello, world, I am 0 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1469-g566b0ea, Unreleased developer copy, 138)
Hello, world, I am 2 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1469-g566b0ea, Unreleased developer copy, 138)
Hello, world, I am 4 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1469-g566b0ea, Unreleased developer copy, 138)
Hello, world, I am 5 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1469-g566b0ea, Unreleased developer copy, 138)
Hello, world, I am 6 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1469-g566b0ea, Unreleased developer copy, 138)
Hello, world, I am 1 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1469-g566b0ea, Unreleased developer copy, 138)
Hello, world, I am 3 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-1469-g566b0ea, Unreleased developer copy, 138)
[jenkins01:07230] *** Process received signal ***
[jenkins01:07230] Signal: Segmentation fault (11)
[jenkins01:07230] Signal code: Address not mapped (1)
[jenkins01:07230] Failing at address: 0x7ffff3ee3b30
[jenkins01:07230] [ 0] /lib64/libpthread.so.0[0x3d6980f710]
[jenkins01:07230] [ 1] /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/lib/libopen-pal.so.0(mca_base_var_deregister+0xa9)[0x7ffff7a34751]
[jenkins01:07230] [ 2] /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/lib/libopen-pal.so.0(mca_base_var_group_deregister+0xb4)[0x7ffff7a3dd50]
[jenkins01:07230] [ 3] /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/lib/libopen-pal.so.0(mca_base_var_group_deregister+0x190)[0x7ffff7a3de2c]
[jenkins01:07230] [ 4] /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/lib/libopen-pal.so.0(mca_base_framework_close+0xf2)[0x7ffff7a3faf9]
[jenkins01:07230] [ 5] /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/lib/openmpi/mca_ess_hnp.so(+0x51ea)[0x7ffff63851ea]
[jenkins01:07230] [ 6] /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace-2/ompi_install1/lib/libopen-rte.so.0(orte_finalize+0x9d)[0x7ffff7d1e699]
[jenkins01:07230] [ 7] mpirun[0x40590c]
[jenkins01:07230] [ 8] mpirun[0x4037a4]
[jenkins01:07230] [ 9] /lib64/libc.so.6(__libc_start_main+0xfd)[0x3d6901ed1d]
[jenkins01:07230] [10] mpirun[0x4036c9]
[jenkins01:07230] *** End of error message ***
Build step 'Execute shell' marked build as failure
TAP Reports Processing: START
Looking for TAP results report in workspace using pattern: **/*.tap
Saving reports...
Processing '/var/lib/jenkins/jobs/gh-ompi-master-pr/builds/416/tap-master-files/cov_stat.tap'
Parsing TAP test result [/var/lib/jenkins/jobs/gh-ompi-master-pr/builds/416/tap-master-files/cov_stat.tap].
not ok - coverity detected 909 failures in all_416 # SKIP http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr//ws/cov_build/all_416/output/errors/index.html
not ok - coverity detected 5 failures in oshmem_416 # TODO http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr//ws/cov_build/oshmem_416/output/errors/index.html
ok - coverity found no issues for yalla_416
ok - coverity found no issues for mxm_416
not ok - coverity detected 2 failures in fca_416 # TODO http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr//ws/cov_build/fca_416/output/errors/index.html
ok - coverity found no issues for hcoll_416

TAP Reports Processing: FINISH
coverity_for_all    http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr//ws/cov_build/all_416/output/errors/index.html
coverity_for_oshmem http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr//ws/cov_build/oshmem_416/output/errors/index.html
coverity_for_fca    http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr//ws/cov_build/fca_416/output/errors/index.html
[copy-to-slave] The build is taking place on the master node, no copy back to the master will take place.
Setting commit status on GitHub for https://api.github.com/repos/open-mpi/ompi/commit/566b0eabfe95ddccd1c5394b246c2fccfd03d9ba
[BFA] Scanning build for known causes...

[BFA] Done. 0s
Setting status of 94851e44d59281fadf3a23874e98c486ec9424ee to FAILURE with url http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr/416/ and message: Merged build finished.

Test FAILed.

@hjelmn force-pushed the mca_base_update branch from 94851e4 to 6d1a416 on April 8, 2015
@mellanox-github commented:

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/422/
Test PASSed.

@hjelmn (Member, Author) commented Apr 14, 2015

bot:retest:

@mellanox-github commented:

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/437/
Test PASSed.

@hjelmn (Member, Author) commented Apr 15, 2015

bot:retest

@mellanox-github commented:

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/444/
Test PASSed.

hjelmn added a commit that referenced this pull request Apr 16, 2015
@hjelmn merged commit 3436f29 into open-mpi:master on Apr 16, 2015
@rolfv commented Apr 17, 2015

@hjelmn I have a bunch of new failures from last night's run that I assume are from these changes. You can look at the MTT runs from last night for more details, but here is what a failure looks like. Note that I compiled Open MPI with --enable-debug, if that makes any difference.

[rvandevaart@ivy4 collective]$ mpirun -np 4 ineighbor_allgather
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 11490 on node ivy4 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------


(gdb) where
#0  0x00007fcb7bc941d0 in ?? ()
#1  <signal handler called>
#2  0x00007fcb844c111d in mca_base_component_repository_open (framework=0x7fcb8538f600, ri=0x236e540)
    at ../../../../opal/mca/base/mca_base_component_repository.c:321
#3  0x00007fcb844bffb1 in find_dyn_components (path=0x0, framework=0x7fcb8538f600, names=0x0,
    include_mode=true) at ../../../../opal/mca/base/mca_base_component_find.c:262
#4  0x00007fcb844bfb9d in mca_base_component_find (directory=0x0, framework=0x7fcb8538f600,
    ignore_requested=false, open_dso_components=true)
    at ../../../../opal/mca/base/mca_base_component_find.c:133
#5  0x00007fcb844c1dd7 in mca_base_framework_components_open (framework=0x7fcb8538f600,
    flags=MCA_BASE_OPEN_FIND_COMPONENTS) at ../../../../opal/mca/base/mca_base_components_open.c:61
#6  0x00007fcb8512e549 in mca_topo_base_open (flags=MCA_BASE_OPEN_FIND_COMPONENTS)
    at ../../../../ompi/mca/topo/base/topo_base_frame.c:71
#7  0x00007fcb8512f3a9 in mca_topo_base_lazy_init () at ../../../../ompi/mca/topo/base/topo_base_lazy_init.c:39
#8  0x00007fcb8512b88a in mca_topo_base_comm_select (comm=0x604220, preferred_module=0x0,
    selected_module=0x7ffff809aea0, type=256) at ../../../../ompi/mca/topo/base/topo_base_comm_select.c:87
#9  0x00007fcb850a234b in PMPI_Cart_create (old_comm=0x604220, ndims=2, dims=0x7ffff809b070,
    periods=0x7ffff809b060, reorder=0, comm_cart=0x7ffff809b090) at pcart_create.c:95
#10 0x0000000000401408 in main (argc=1, argv=0x7ffff809b1c8) at ineighbor_allgather.c:86
(gdb) up
#1  <signal handler called>
(gdb) up
#2  0x00007fcb844c111d in mca_base_component_repository_open (framework=0x7fcb8538f600, ri=0x236e540)
    at ../../../../opal/mca/base/mca_base_component_repository.c:321
321            if (0 == strcmp(mitem->cli_component->mca_component_name, ri->ri_name)) {
(gdb) print ri->ri_name
$1 = "basic", '\000' <repeats 58 times>
(gdb) print mitem->cli_component->mca_component_name
Cannot access memory at address 0x48
(gdb) 

@hjelmn (Member, Author) commented Apr 17, 2015

OK, it looks like the basic component needs to be updated to use C99-style (designated) initialization. Fixing now.
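For readers following along: a common way components break when a new field (such as the project name added by this PR) is inserted into a shared struct is that an old positional initializer silently shifts every later member, leaving later pointers aimed at the wrong data or NULL, which is consistent with the unreadable mca_component_name in the gdb session above. A self-contained mock (not the real topo/basic code) of the fix hjelmn describes:

```c
/* Mock structs -- NOT the real Open MPI definitions. */
#include <stdio.h>

struct mock_component {
    int         mca_major_version;
    const char *mca_project_name;    /* newly inserted field */
    const char *mca_component_name;
};

/* Old positional style, written before mca_project_name existed:
 * "basic" now lands in mca_project_name and mca_component_name is NULL. */
static const struct mock_component broken = { 2, "basic" };

/* C99 designated initializers name each field, so inserting a new
 * member leaves the existing assignments correct. */
static const struct mock_component fixed = {
    .mca_major_version  = 2,
    .mca_component_name = "basic",
};

int main(void)
{
    printf("positional : component name = %s\n",
           broken.mca_component_name ? broken.mca_component_name : "(null!)");
    printf("designated : component name = %s\n", fixed.mca_component_name);
    return 0;
}
```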

@hjelmn deleted the mca_base_update branch on May 23, 2016 17:44
jsquyres pushed a commit to jsquyres/ompi that referenced this pull request Sep 21, 2016
Bring over the hwloc assembly changes