[Benchmark] fail test if model artifact does not exist #8482
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/8482
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures
As of commit f2b250e with merge base a0924f7.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@pytorchbot label "topic: not user facing"
Force-pushed from 31d500b to 5ff1e8f.
Force-pushed from 09fa7a9 to a0d81f8.
LGTM! The build is broken at the moment, with a fix coming in #8474 (review), so let's wait for that fix to land and then rebase.
Force-pushed from 70af338 to 102a0dc.
Force-pushed from 102a0dc to 9e38e39.
Force-pushed from 9e38e39 to f2b250e.
Summary
Fixes #8125
Fails the benchmark-on-device job at the verification step "Verify Test Spec", which attempts to download the test-spec, if artifacts don't exist.
Details
Move prepare-test-specs under export-models and add a dependency on the export-models job.
prepare-test-specs will fail to generate a test-spec if no model exists in S3, which in turn causes the benchmark-on-device job to fail at the "Verify Test Spec" step.
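For illustration, a minimal GitHub Actions sketch of the dependency chain described above might look like the following. The job IDs match the names used in this summary, but the runners, steps, and commands are placeholders, not the actual workflow changed in this PR.

```yaml
# Hypothetical sketch only; the real executorch workflow files differ in
# structure and step contents.
jobs:
  export-models:
    runs-on: ubuntu-latest
    steps:
      - name: Export models
        run: echo "export model artifacts and upload them (placeholder)"

  prepare-test-specs:
    # Depends on export-models, so it only runs once exporting has finished.
    needs: export-models
    runs-on: ubuntu-latest
    steps:
      - name: Generate test spec
        # Spec generation fails here when no model exists in S3.
        run: echo "generate test-spec from exported models (placeholder)"

  benchmark-on-device:
    needs: prepare-test-specs
    runs-on: ubuntu-latest
    steps:
      - name: Verify Test Spec
        # With the change in this PR, the job fails at this step if the spec
        # (and hence the model artifact) was never produced.
        run: echo "download and verify test-spec (placeholder)"
```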
Others
The failure in this PR seems unrelated to this change.
See test PR #8484: it just adds an echo and still fails.