
Improve verbose output of alt_e2eshark runner #462


Open
rkayaith wants to merge 1 commit into main

Conversation

rkayaith (Member)

With the verbose flag on, the runner now:

  • prints executed shell commands as they're run
  • prints the output of failed commands

An example of what a failed test looks like now:

Stages to be run: ['setup', 'import_model', 'preprocessing', 'compilation', 'construct_inputs', 'native_inference', 'compiled_inference', 'postprocessing']
Test list: ['for_loop_basic']
running test for_loop_basic...
  $ python -m iree.compiler.tools.import_onnx /home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/test-run/for_loop_basic/model.onnx --num-elements-threshold=100 --params-scope=model -o /home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/test-run/for_loop_basic/model.torch_onnx.mlir
  $ iree-compile /home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/test-run/for_loop_basic/model.torch_onnx.mlir --iree-hal-target-backends=llvm-cpu --iree-llvmcpu-target-cpu=host -o /home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/test-run/for_loop_basic/compiled_model.vmfb
        FAILED (compilation)                    
Traceback (most recent call last):
  File "/home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/run.py", line 224, in run_tests
    compiled_artifact = config.compile(model_artifact, save_to=artifact_save_to, extra_options=options.compilation_options)
  File "/home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/e2e_testing/test_configs/onnxconfig.py", line 207, in compile
    return self.backend.compile(mlir_module, save_to=save_to, extra_options=extra_options)
  File "/home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/e2e_testing/backends.py", line 117, in compile
    run_command_and_log(compile_command, save_to, "compilation", verbose=self.verbose)
  File "/home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/e2e_testing/logging_utils.py", line 42, in run_command_and_log
    raise RuntimeError(error_msg)
RuntimeError: failure executing command:
  iree-compile /home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/test-run/for_loop_basic/model.torch_onnx.mlir --iree-hal-target-backends=llvm-cpu --iree-llvmcpu-target-cpu=host -o /home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/test-run/for_loop_basic/compiled_model.vmfb
Error detail in '/home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/test-run/for_loop_basic/detail/compilation.detail.log'
stderr:
  /home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/test-run/for_loop_basic/model.torch_onnx.mlir:8:12: error: failed to legalize unresolved materialization from ('tensor<i64>') to ('tensor<1xi64>') that remained live after conversion
        %3 = torch.operator "onnx.Add"(%arg3, %arg1) : (!torch.vtensor<[1],si64>, !torch.vtensor<[1],si64>) -> !torch.vtensor<[1],si64> 
             ^
  /home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/test-run/for_loop_basic/model.torch_onnx.mlir:8:12: note: see current operation: %9 = "builtin.unrealized_conversion_cast"(%8) : (tensor<i64>) -> tensor<1xi64>
  /home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/test-run/for_loop_basic/model.torch_onnx.mlir:8:12: note: see existing live user here: 
  %7 = linalg.generic {indexing_maps = [affine_map<(d0) -> (0)>, affine_map<(d0) -> (0)>, affine_map<(d0) -> (d0)>], iterator_types = ["parallel"]} ins(%1, %5 : tensor<1xi64>, tensor<1xi64>) outs(%6 : tensor<1xi64>) {
  ^bb0(%in: i64, %in_0: i64, %out: i64):
    %9 = arith.muli %in_0, %c1_i64 : i64
    %10 = arith.addi %in, %9 : i64
    linalg.yield %10 : i64
  } -> tensor<1xi64>


Test Summary:
        PASSES: 0
        TOTAL: 1
results stored in /home/rkayaith/repos/SHARK-TestSuite/alt_e2eshark/test-run
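The verbose behavior described above (echoing each shell command as it runs, and surfacing stderr on failure) could be sketched roughly like this. This is a hypothetical standalone helper, not the PR's actual `run_command_and_log` from `e2e_testing/logging_utils.py`, which additionally writes `detail/*.log` files:

```python
import shlex
import subprocess

def run_command_and_log(command, stage, verbose=False):
    """Run `command` (a list of argv strings) and raise with its stderr on failure.

    Sketch of the verbose-mode behavior only; the real alt_e2eshark helper
    also logs command and output to files under test-run/<test>/.
    """
    if verbose:
        # Echo the command as it is run, like the `$ ...` lines above.
        print("  $ " + shlex.join(command))
    result = subprocess.run(command, capture_output=True, text=True)
    if result.returncode != 0:
        error_msg = (
            f"failure executing command ({stage}):\n"
            f"  {shlex.join(command)}\n"
            f"stderr:\n{result.stderr}"
        )
        raise RuntimeError(error_msg)
    return result.stdout
```

With `verbose=False`, a successful command stays silent; a failing one still raises with its stderr attached, so the error is visible without digging through log files.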

@rkayaith requested a review from zjgarvey on March 19, 2025 at 17:37
@zjgarvey (Contributor)

This information is already contained in test-run/test-name/detail/, and the command in test-run/test-name/commands/. Is there a reason you want these printed?

I'd prefer not to make this change unless the verbosity level can be managed with a bit more granularity.

@rkayaith (Member, Author)

The file output is nice when running a set of tests, but when iterating on a single test it's quite inconvenient to dig through the files to find this information.
