
test: benchmark CI from laurentsenta #292

Closed
wants to merge 3 commits into from

Conversation

@SgtPooki (Member) commented Oct 26, 2023

@SgtPooki requested a review from a team as a code owner October 26, 2023 22:55
@SgtPooki requested a review from laurentsenta October 27, 2023 00:05
@laurentsenta commented Oct 27, 2023

@SgtPooki thanks for looking into this! You might have a reproducibility problem as it is right now: when I experimented, 6 runs gave 6 different results (see the Attempts button at the top). You might need self-hosted runners. cc @galargh

@SgtPooki (Member, Author) commented Oct 27, 2023

Good call.

In addition to self-hosted runners, I wonder if there are other methods we could use to get a more accurate picture of performance across runners. Something like running a no-op calibration test and using it as a baseline against which to compare the actual runs in that CI action. We could get a relative result via a normalization function that takes the environment into account.

Here is a good article on this I just checked out; it isn't directly useful here, but it discusses normalization in CI environments: https://pythonspeed.com/articles/consistent-benchmarking-in-ci/

Another good read, with dense solution details, is https://arxiv.org/pdf/1608.04295.pdf. It covers the theory behind the system the Julia language uses for benchmarking in noisy environments.
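A minimal sketch of the baseline-normalization idea described above (all names and the calibration workload are illustrative, not part of this PR). Instead of a literal no-op, the baseline is a small fixed CPU-bound workload, since a true no-op finishes in near-zero time and makes the ratio unstable:

```python
import time

def measure(fn, runs=5):
    """Return the median wall-clock duration of fn over several runs.

    The median is less sensitive to one-off scheduler hiccups on a
    shared CI runner than the mean would be.
    """
    durations = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        durations.append(time.perf_counter() - start)
    durations.sort()
    return durations[len(durations) // 2]

def normalized_score(benchmark, baseline, runs=5):
    """Express the benchmark's cost as a multiple of the baseline's cost.

    Because both workloads run on the same machine in the same CI job,
    the ratio should be more comparable across differently-provisioned
    runners than raw wall-clock times are.
    """
    baseline_time = measure(baseline, runs)
    benchmark_time = measure(benchmark, runs)
    return benchmark_time / baseline_time

if __name__ == "__main__":
    # Hypothetical calibration workload: a fixed CPU-bound loop.
    calibration = lambda: sum(range(50_000))
    # The "real" benchmark, here doing roughly 4x the calibration work.
    real_benchmark = lambda: sum(range(200_000))
    print(f"normalized score: {normalized_score(real_benchmark, calibration):.2f}x")
```

This only compensates for overall machine speed; it would not fix run-to-run variance caused by noisy neighbors mid-run, which is where the statistical approach in the Julia paper comes in.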

@SgtPooki marked this pull request as draft November 1, 2023 17:14
@SgtPooki closed this Jan 4, 2024
Development

Successfully merging this pull request may close these issues.

feat: run benchmarks via github action and publish results