Support benchmark using prebuilt artifacts #8246
Comments
AI: Circle back with @kirklandsign and @shoumikhin on what prebuilt artifacts need to be on the device besides the app and the exported model.
@huydhn if you're asking about the benchmarking app, we need to put all …
@shoumikhin @kirklandsign I believe @huydhn is referring to the new artifacts generated by the profilers, e.g. ETDump, Chrome trace, simpleperf, etc., specifically those generated from this work:
For example, for users to be able to access the artifacts and visualize them in flamegraphs, we will need to store the artifacts somewhere, e.g. S3. Same for ETDump.
@shoumikhin @kirklandsign fyi, Huy will be on leave from Feb 25. If you need help storing the artifacts in the DB, accessing the artifacts from the dashboard UI, or showing the URL in the job log, Huy can help with it or give you a pointer before he is on leave.
Follow-up on the discussion around supporting artifacts generated from profilers. I think if we can provide a generic script that allows uploading arbitrary blobs (typically zipped) to S3, it should provide the flexibility to store any artifact from profiling. wdyt @shoumikhin @kirklandsign @huydhn? @huydhn is there a size limit for uploading to S3? If yes, what is it? Put together with the work in #8245, the UX would look like what I described here: #8402 (comment)
Uploading from a Linux/macOS runner should be OK per Huy.
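To make the "generic upload script" idea concrete, here is a minimal sketch, assuming boto3 is available on the runner; the bucket name, key prefix, and CLI shape are placeholders for illustration, not the actual ExecuTorch infrastructure:

```python
# upload_artifact.py -- hypothetical sketch of a generic S3 upload helper.
# Bucket name, key prefix, and CLI flags are assumptions, not the real infra.
import argparse
import os

import boto3


def upload(path: str, bucket: str, prefix: str) -> str:
    """Upload a local file (typically a zipped blob of profiler output, e.g.
    an ETDump or Chrome trace) to S3 and return a plain HTTPS URL that can be
    recorded in the benchmark DB or printed in the job log."""
    key = f"{prefix}/{os.path.basename(path)}"
    s3 = boto3.client("s3")
    s3.upload_file(path, bucket, key)
    return f"https://{bucket}.s3.amazonaws.com/{key}"


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Upload an arbitrary benchmark artifact to S3")
    parser.add_argument("path", help="Local file to upload, e.g. etdump.zip or trace.json.zip")
    parser.add_argument("--bucket", default="gha-artifacts", help="Assumed bucket name")
    parser.add_argument("--prefix", default="benchmark-artifacts", help="Assumed key prefix")
    args = parser.parse_args()
    print(upload(args.path, args.bucket, args.prefix))
```

The returned URL could then be stored alongside the benchmark results or surfaced in the dashboard UI, which is the integration point the #8245 discussion refers to.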
🚀 The feature, motivation and pitch
Enable this for the on-demand workflow only, to offer developers additional flexibility and efficiency.
Scenarios where benchmarking with prebuilt artifacts is needed:
UX:
Source of the artifacts to be used in the benchmark workflow:
We will need to define the UX to support this feature. For example, allow users to upload prebuilt artifacts via a script. The script will return links to the artifacts. Then users can schedule an on-demand workflow via the UI, or do everything via the script (a sketch of this flow is shown after this list).
Policy and TTL to keep the uploaded artifacts.
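A hedged sketch of what the script-only flow could look like end to end: upload the prebuilt artifact, get back a link, then trigger the on-demand benchmark workflow through GitHub's workflow_dispatch REST endpoint. The workflow file name, input name, and repository default are illustrative assumptions, not the actual ExecuTorch setup; `upload` refers to the hypothetical helper sketched above.

```python
# run_benchmark_with_prebuilt.py -- hypothetical end-to-end UX sketch.
# Workflow file name, input names, and repo default are assumptions.
import argparse
import os

import requests

from upload_artifact import upload  # the S3 upload sketch above

GITHUB_API = "https://api.github.com"


def dispatch_benchmark(repo: str, workflow_file: str, artifact_url: str, ref: str = "main") -> None:
    """Trigger the on-demand benchmark workflow, passing the artifact URL as a workflow input."""
    resp = requests.post(
        f"{GITHUB_API}/repos/{repo}/actions/workflows/{workflow_file}/dispatches",
        headers={
            "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
            "Accept": "application/vnd.github+json",
        },
        json={"ref": ref, "inputs": {"prebuilt_artifact_url": artifact_url}},
        timeout=30,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Upload a prebuilt artifact and schedule an on-demand benchmark")
    parser.add_argument("artifact", help="Path to the prebuilt artifact, e.g. model.pte or app.apk")
    parser.add_argument("--repo", default="pytorch/executorch")
    parser.add_argument("--workflow", default="benchmark-on-demand.yml", help="Assumed workflow file name")
    args = parser.parse_args()

    url = upload(args.artifact, bucket="gha-artifacts", prefix="prebuilt")
    print(f"Uploaded artifact to {url}")
    dispatch_benchmark(args.repo, args.workflow, url)
    print("On-demand benchmark workflow dispatched")
```

Whatever TTL policy is chosen for the uploaded artifacts could be enforced on the bucket side (e.g. an S3 lifecycle rule on the prefix) rather than in the script itself.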
CC: @digantdesai @kimishpatel @cccclai
Alternatives
No response
Additional context
No response
RFC (Optional)
No response
cc @huydhn @kirklandsign @shoumikhin @mergennachin @byjlw