This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →

(maybe) Add Tooling and Automation for Benchmarking/Profiling #21


Closed
ericsnowcurrently opened this issue Mar 19, 2021 · 3 comments
@ericsnowcurrently
Collaborator

ericsnowcurrently commented Mar 19, 2021

Once we have our benchmarking machine set up, there are a number of things we should consider doing to make life easier for us. Most will involve working through the persistent Azure host we'll be using for tunneling anyway.

Here is some possible tooling and automation to think about:

  • move the scripts into a repo (e.g. https://github.com/faster-cpython/tools)
  • <bench>
    • add a job queue (for benchmarking & profiling; see the job-queue sketch after this list)
      • CLI tool ("add", "list", "remove")
      • cron job or service/daemon to run jobs out of the queue
      • upload the results
    • add a cron job or service to keep master up-to-date
  • <portal>
    • run the job queue (from above) here instead of on the benchmarking machine
    • artifact storage, including benchmark results
    • a web interface for benchmark results (see speed.python.org and https://github.com/tobami/codespeed)
    • a simple web interface for the job queue
    • set up a cron job or service to generate up-to-date benchmarking results
      • for a curated list of tags (plus any new ones) and for master (maybe once each day)
      • we'll use these for comparison
  • <pyperformance>
    • a script to set up a "compile" config for a job (see the config sketch after this list)
  • <github>
    • create a github action for requesting a benchmarking run and checking the results
    • do benchmarking runs for PRs; options:
      • a PR check (fail if performance regresses)
      • a bot to listen for requests in PR comments (see the webhook sketch after this list)
  • <speed.python.org>
    • keep those up-to-date results here instead
  • ...
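
As a rough illustration of the job-queue item above, here is a minimal sketch of what the CLI ("add", "list", "remove") and the cron-driven worker could look like. The spool directory, the `benchjobs` name, and the `run-benchmark-job` runner script are all assumptions, not anything that exists yet.

```python
#!/usr/bin/env python3
"""Minimal sketch of the proposed job queue (hypothetical layout and names).

Jobs are JSON files dropped into a spool directory; a cron job (or systemd
timer) calls "run-next" to pop and execute the oldest one.
"""
import argparse
import json
import subprocess
import time
from pathlib import Path

QUEUE_DIR = Path("~/benchmark-jobs/queue").expanduser()   # assumed location


def add_job(revision, kind):
    QUEUE_DIR.mkdir(parents=True, exist_ok=True)
    job = {"revision": revision, "kind": kind, "requested": time.time()}
    path = QUEUE_DIR / f"{int(time.time())}-{revision}.json"
    path.write_text(json.dumps(job, indent=2))
    print(f"queued {path.name}")


def list_jobs():
    for path in sorted(QUEUE_DIR.glob("*.json")):
        print(path.name)


def remove_job(name):
    (QUEUE_DIR / name).unlink()


def run_next():
    jobs = sorted(QUEUE_DIR.glob("*.json"))
    if not jobs:
        return
    job = json.loads(jobs[0].read_text())
    jobs[0].unlink()
    # Hypothetical runner script; in practice this would be the
    # pyperformance/profiling wrapper living on the bench machine.
    subprocess.run(["run-benchmark-job", job["revision"], job["kind"]], check=True)


def main():
    parser = argparse.ArgumentParser(prog="benchjobs")
    sub = parser.add_subparsers(dest="cmd", required=True)
    p_add = sub.add_parser("add")
    p_add.add_argument("revision")
    p_add.add_argument("--kind", choices=["benchmark", "profile"], default="benchmark")
    sub.add_parser("list")
    p_rm = sub.add_parser("remove")
    p_rm.add_argument("name")
    sub.add_parser("run-next")

    args = parser.parse_args()
    if args.cmd == "add":
        add_job(args.revision, args.kind)
    elif args.cmd == "list":
        list_jobs()
    elif args.cmd == "remove":
        remove_job(args.name)
    elif args.cmd == "run-next":
        run_next()


if __name__ == "__main__":
    main()
```

A cron entry on the bench machine such as `*/5 * * * * benchjobs run-next` (again, hypothetical) would then drain the queue; the same loop could later move to the portal host instead.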
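
For the pyperformance item, a small script could template a per-job "compile" config and then invoke `pyperformance compile`. The section and option names below are modeled on pyperformance's sample benchmark.conf from memory and should be checked against the real sample file; the paths are placeholders.

```python
#!/usr/bin/env python3
"""Sketch of generating a per-job "compile" config for pyperformance."""
import configparser
import subprocess
from pathlib import Path


def write_compile_config(job_dir: Path) -> Path:
    # Option names approximate pyperformance's sample benchmark.conf;
    # verify against the real sample before using.
    cfg = configparser.ConfigParser()
    cfg["config"] = {"json_dir": str(job_dir / "results")}
    cfg["scm"] = {"repo_dir": str(job_dir / "cpython"), "update": "True"}
    cfg["compile"] = {"bench_dir": str(job_dir / "build"), "lto": "True", "pgo": "True"}
    cfg["run_benchmark"] = {"system_tune": "True", "upload": "False"}
    path = job_dir / "bench.conf"
    with path.open("w") as f:
        cfg.write(f)
    return path


def compile_and_run(job_dir: Path, revision: str) -> None:
    config = write_compile_config(job_dir)
    # "pyperformance compile CONFIG REVISION" builds that revision of CPython
    # and runs the benchmark suite with it.
    subprocess.run(["pyperformance", "compile", str(config), revision], check=True)


if __name__ == "__main__":
    compile_and_run(Path("~/benchmark-jobs/job-0001").expanduser(), "main")
```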
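
And for the "bot listens for requests in PR comments" option, the core of it is a handler for GitHub's issue_comment webhook events that enqueues a job when it sees a trigger phrase. Everything here (the trigger phrase, the `benchjobs` CLI from the first sketch, the omission of signature verification) is illustrative.

```python
"""Sketch of a PR-comment bot: react to issue_comment webhook payloads."""
import subprocess

TRIGGER = "!benchmark"   # assumed trigger phrase


def handle_issue_comment(payload: dict) -> None:
    # Only react to comments on pull requests that contain the trigger.
    issue = payload.get("issue", {})
    comment = payload.get("comment", {})
    if "pull_request" not in issue:
        return
    if TRIGGER not in comment.get("body", ""):
        return
    pr_number = issue["number"]
    # Queue a benchmark run for the PR; resolving the PR's head SHA via the
    # GitHub API is left out of this sketch.
    subprocess.run(["benchjobs", "add", f"pr-{pr_number}"], check=True)
```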
@gvanrossum
Collaborator

How are we doing access control? Note that Mark (and other core devs who might help out) won’t have MS intranet access.

@ericsnowcurrently
Collaborator Author

Good question. For shell access we'll use SSH keys. For any web interface we might introduce I suppose we can use OAuth through GitHub.
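
For illustration only: since GitHub publishes each user's public SSH keys at https://github.com/<user>.keys, a small sync script could grant shell access to a curated list of GitHub accounts without any MS intranet involvement. The account list and the file handling below are placeholders, assuming that approach.

```python
"""Sketch: sync SSH public keys for an allow-list of GitHub accounts."""
import urllib.request
from pathlib import Path

ALLOWED_USERS = ["ericsnowcurrently", "gvanrossum", "markshannon"]  # example list
AUTHORIZED_KEYS = Path("~/.ssh/authorized_keys").expanduser()


def sync_keys() -> None:
    lines = []
    for user in ALLOWED_USERS:
        # GitHub serves each account's public SSH keys as plain text here.
        with urllib.request.urlopen(f"https://github.com/{user}.keys") as resp:
            keys = resp.read().decode().strip()
        for key in keys.splitlines():
            lines.append(f"{key} {user}@github")
    # Note: this replaces the whole file; a real setup would merge with any
    # existing keys instead.
    AUTHORIZED_KEYS.write_text("\n".join(lines) + "\n")


if __name__ == "__main__":
    sync_keys()
```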

@gvanrossum
Collaborator

gvanrossum commented Mar 22, 2021 via email

@faster-cpython faster-cpython locked and limited conversation to collaborators Dec 2, 2021
@gramster gramster moved this to Todo in Fancy CPython Board Jan 10, 2022
@gramster gramster moved this from Todo to Done in Fancy CPython Board Jan 10, 2022
