Automatic website generation for integration test case executions #190
```diff
@@ -10,12 +10,15 @@ concurrency:
   group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
          - zkvm: openvm
            threads: 2
```
This keeps the old setup: OpenVM uses quite a lot of RAM per case, so pushing parallelization very far makes the CI machine struggle. But since OpenVM emulation is very fast, the low thread count doesn't hurt CI duration much.
For the rest, I configured 12 threads rather than the CI machine's default of 16, mainly to give some stability to wall-clock time runs.
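A minimal sketch of how the matrix might encode these per-zkVM thread counts (the non-OpenVM entry name and how `threads` is consumed downstream are assumptions, not the exact workflow):

```yaml
# Hypothetical sketch: per-zkVM parallelism in the test matrix.
strategy:
  matrix:
    include:
      # OpenVM uses a lot of RAM per case, so keep parallelism minimal;
      # its fast emulation keeps CI duration acceptable anyway.
      - zkvm: openvm
        threads: 2
      # Stay below the CI machine's default of 16 threads for more
      # stable wall-clock measurements.
      - zkvm: sp1   # illustrative entry
        threads: 12
```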
```yaml
run: |
  ${{ matrix.zkvm == 'openvm' && 'RAYON_NUM_THREADS=1' || '' }} RUST_LOG=warn,benchmark_runner=info ZKVM=${{ matrix.zkvm }} EL=${{ matrix.el }} cargo test --release -p integration-tests -- --test-threads=1 ${{ matrix.test }}
zkvm: zisk  # See https://github.com/eth-act/ere/issues/186
uses: ./.github/workflows/run-benchmark.yml
```
Most of the deleted code in this file was extracted into this separate component for better modularity and to avoid repetition.
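A minimal sketch of what calling such a reusable workflow can look like (the input names are assumptions based on the matrix keys above, not the component's actual interface):

```yaml
# Hypothetical sketch: invoking the extracted reusable workflow per matrix entry.
jobs:
  benchmark:
    uses: ./.github/workflows/run-benchmark.yml
    with:
      zkvm: zisk   # zkVM backend to run (assumed input name)
      el: 'none'   # execution-layer client, or 'none' (assumed input name)
      threads: 12  # parallelism for the run (assumed input name)
```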
```yaml
el: 'none'
threads: 12

generate-benchmark-website:
```
This job and the next ones are new: they pull the uploaded artifacts, merge them, generate the website, and publish it.
Pending: only run this on master -- for now it always runs so the PR can be tested.
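A minimal sketch of how these jobs can be wired together (the action versions and the site-generation command are assumptions, not the actual workflow):

```yaml
# Hypothetical sketch: merge uploaded results, build the site, publish it.
generate-benchmark-website:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    # Pull every uploaded integration-test result into a single folder.
    - uses: actions/download-artifact@v4
      with:
        path: zkevm-metrics
        merge-multiple: true
    # Build the static website from the merged metrics (command is illustrative).
    - run: cargo run --release -p benchmark-website -- --input zkevm-metrics --output site
    # Hand the generated site to the publishing step (e.g. GitHub Pages).
    - uses: actions/upload-pages-artifact@v3
      with:
        path: site
```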
```rust
)?;
let config = RunConfig {
    output_folder: cli.output_folder,
    sub_folder: Some(el.as_ref().to_lowercase()),
```
I would like to eventually reconsider this solution, but probably closer to when we potentially accept raw ELFs for externally compiled guest programs.
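A minimal sketch of the effect that `sub_folder` field has on the output layout (`RunConfig` here is a simplified stand-in, not the real type, and the EL/zkVM names in `main` are illustrative):

```rust
use std::path::PathBuf;

// Simplified stand-in for the real RunConfig.
struct RunConfig {
    output_folder: PathBuf,
    sub_folder: Option<String>, // e.g. the lowercased EL name
}

impl RunConfig {
    /// Resolve where a zkVM's results land: `<output>/<el>/<zkvm>/`
    /// when a sub-folder is set, `<output>/<zkvm>/` otherwise.
    fn results_dir(&self, zkvm: &str) -> PathBuf {
        let mut dir = self.output_folder.clone();
        if let Some(el) = &self.sub_folder {
            dir.push(el);
        }
        dir.push(zkvm);
        dir
    }
}

fn main() {
    let cfg = RunConfig {
        output_folder: PathBuf::from("zkevm-metrics"),
        sub_folder: Some("reth".to_string()),
    };
    assert_eq!(cfg.results_dir("sp1"), PathBuf::from("zkevm-metrics/reth/sp1"));
}
```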
This PR's primary goal is to leverage the work done in the integration test pipeline to generate an automated website displaying zkVM execution results.
Summary of changes:
- Introduced a `WORKLOAD_OUTPUT_DIR` environment variable so the test results aren't discarded but saved to a desired path. It uses `tempdir` (as it always did before) or the defined path, depending on that variable (see the sketch at the end of this description).
- The CI now sets `WORKLOAD_OUTPUT_DIR` so all the integration test run results are uploaded as artifacts. This can be useful to further inspect runs, or to keep a medium-term historical view.
- For `stateless-validator` guest program runs, the `zkevm-metrics` folder structure has a new "EL" level: instead of `zkevm-metrics/<zkvm>/[results]` it is now `zkevm-metrics/<el>/<zkvm>/[results]`. Apart from being more organized, this lets the generated website identify ELs correctly.

The idea is to generate execution reports automatically:
https://eth-act.github.io/zkevm-benchmark-workload/
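A minimal sketch of the output-directory selection described in the first bullet above, assuming the `tempfile` crate and an illustrative function name:

```rust
use std::env;
use std::path::PathBuf;

/// Pick the results directory: an explicit WORKLOAD_OUTPUT_DIR if set,
/// otherwise a temporary directory (the old behavior).
fn workload_output_dir() -> std::io::Result<(PathBuf, Option<tempfile::TempDir>)> {
    match env::var("WORKLOAD_OUTPUT_DIR") {
        // Explicit path: results persist, so CI can upload them as artifacts.
        Ok(dir) => Ok((PathBuf::from(dir), None)),
        // Unset: fall back to a tempdir that is deleted when the guard drops.
        Err(_) => {
            let tmp = tempfile::tempdir()?;
            let path = tmp.path().to_path_buf();
            Ok((path, Some(tmp)))
        }
    }
}
```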