Description
Most of the time we use BenchmarkTools not because we want to know how fast A is, but whether A is faster than B and by how much. In my opinion, a very good addition to BenchmarkTools would be a macro that compares A vs. B vs. … X directly, instead of our guessing whether one is faster than the others from their individual statistics. Such a macro could also reduce internal bias (by reloading A and B and …, etc.), and running it for a long time should also account for machine/OS-level noise.
And I concur. When benchmarking and optimizing a function, I often define function_old() and function_new() and check whether changes to function_new() have the runtime impact I expect. Ideally, a benchmarking package lets me perform that comparison correctly, easily, quickly, and precisely. A well-crafted varargs @benchmark that supports @benchmark function_old() function_new() would be ideal.
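For context, here is a minimal sketch of the manual workflow described above using BenchmarkTools' existing judge/ratio API, which the proposed macro would streamline. The function bodies and variable names are illustrative placeholders, not from the issue:

```julia
using BenchmarkTools

# Illustrative baseline and candidate implementations.
function_old(v) = sum(x -> x^2, v)
function_new(v) = sum(abs2, v)

v = rand(1000)

# Benchmark each implementation separately (sample counts reduced
# here to keep the example quick).
b_old = @benchmark function_old($v) samples=200
b_new = @benchmark function_new($v) samples=200

# `judge` compares two estimates and classifies the difference as
# :improvement, :regression, or :invariant relative to a tolerance.
result = judge(median(b_new), median(b_old))
println(result)
```

Running the two benchmarks separately like this is exactly where machine/OS drift between runs can bias the comparison, which is what an interleaved varargs @benchmark could mitigate.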
This extension also has the potential to help users like me avoid common benchmark-comparison pitfalls, like those discussed in the linked Discourse thread.