Create a collaborative benchmark repo #1287
We have recently benchmarked different Wasm execution environments on a set of tests. If you want, you could use them for your benchmark, and I can help with the setup. There are 8 tests:
All of these tests have adjustable parameters exposed as environment variables. Furthermore, we have Redis with Lua ported to Wasm, which could be compiled without any imported functions; it could be a good test if we predefine some Redis commands in the resulting binary like this.
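As an illustration of what running one of these parameterized tests might look like, here is a minimal Python sketch. The variable name `ITERATIONS`, the module name `test.wasm`, and the use of the wasmtime CLI are assumptions for illustration, not the actual test setup.

```python
import os
import subprocess
import time

# Hypothetical knob: the real tests expose their own variable names.
iterations = os.environ.get("ITERATIONS", "1000")

start = time.perf_counter()
# Forward the parameter into the Wasm module's environment via the wasmtime CLI.
subprocess.run(
    ["wasmtime", "run", "--env", f"ITERATIONS={iterations}", "test.wasm"],
    check=True,
)
print(f"ITERATIONS={iterations}: {time.perf_counter() - start:.3f}s")
```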
The emscripten benchmark suite has a bunch of real-world codebases like poppler, bullet, box2d, lua, and sqlite. Some builds are here; I can update those if it would be useful. They're pretty easy to build too: basically, edit that benchmark python file to pick which VMs to run, etc., then run it.
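A rough sketch of the kind of driver described above, where you choose which VMs to run and then time each one on a benchmark; the runtime commands and the `box2d.wasm` module name are illustrative assumptions, not the suite's actual configuration:

```python
import subprocess
import time

# Edit this dict to pick which VMs to run; commands here are illustrative.
VMS = {
    "wasmtime": ["wasmtime", "run"],
    "wasmer": ["wasmer", "run"],
}

BENCHMARK = "box2d.wasm"  # hypothetical build of one of the codebases above

for name, cmd in VMS.items():
    start = time.perf_counter()
    subprocess.run(cmd + [BENCHMARK], check=True)
    print(f"{name}: {time.perf_counter() - start:.3f}s")
```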
Wasmer has a speed center (see the blog post). It's open source (https://github.com/wasmerio/wasmer-bench) and anybody can install it. It lets you track performance, compare different modules (or runtimes), etc. cc @bjfish
The benchmarks repository has been created! For any questions, please use that repository's issue tracker.
We discussed this in the 2019 CG meeting and decided to create a new collaborative benchmark repository.
See https://github.com/WebAssembly/meetings/blob/master/2019/CG-06.md#collaborative-benchmark-suite--hr.
cc @titzer