astropy / astropy-benchmarks

Benchmarks for the astropy project
https://spacetelescope.github.io/bench/astropy-benchmarks/
BSD 3-Clause "New" or "Revised" License

Add benchmarks from unyt paper (Goldbaum 2018) #54

Closed · pllim closed this 6 years ago

pllim commented 6 years ago

Fix #53. xref astropy/astropy#7546 and astropy/astropy#7549
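
For context, asv picks up benchmarks by naming convention: any function or method named time_* inside the benchmarks/ package is timed automatically, and a setup method runs before each measurement without being timed. A minimal sketch of what a Quantity benchmark in that style could look like (the module, class, and method names here are illustrative, not the actual benchmarks added in this PR):

```python
# Illustrative asv-style benchmark (hypothetical module benchmarks/units_sketch.py);
# asv times any method whose name starts with "time_" and calls setup() first.
import numpy as np
import astropy.units as u


class TimeQuantityOps:
    def setup(self):
        # setup() runs before the timed calls and is excluded from the timing
        self.data = np.arange(1_000_000, dtype=float) * u.km

    def time_multiply(self):
        # scalar multiplication of a large Quantity array
        self.data * 2.0

    def time_convert(self):
        # unit conversion of a large Quantity array
        self.data.to(u.m)
```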

ngoldbaum commented 6 years ago

This is probably a separate issue, but don't astropy's benchmarks use airspeed velocity? The last time I checked, asv doesn't have a way to control for CPU throttling or other effects that might introduce benchmark variance. You may want to consider integrating perf into your airspeed velocity setup to reduce issues like that.
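
For reference, perf (distributed today as pyperf) spawns several calibrated worker processes and reports the spread across them, which is how it reduces run-to-run variance. A minimal sketch of benchmarking a single function with it, assuming the pyperf package is installed (the workload being timed is just an example):

```python
# Minimal pyperf benchmark script; pyperf.Runner spawns calibrated worker
# processes and aggregates the timings across them.
import numpy as np
import astropy.units as u
import pyperf


def convert():
    # example workload: build a large Quantity and convert its unit
    (np.arange(1_000_000, dtype=float) * u.km).to(u.m)


runner = pyperf.Runner()
runner.bench_func("quantity_convert", convert)
```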

pllim commented 6 years ago

You may want to consider integrating perf into your airspeed velocity setup

Perhaps @astrofrog can comment on this. Locally, I run it on a single core on my machine, but I don't know whether my OS does throttling or not. I think we see small-scale variations but the overall trend is reliable. Ref: http://www.astropy.org/astropy-benchmarks/
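
(On Linux systems that expose the cpufreq sysfs interface, one way to check whether the OS is allowed to scale the CPU frequency is to look at the governor; a small sketch, assuming that interface is present:)

```python
# Check the CPU frequency governor on Linux (cpufreq sysfs interface); a
# "performance" governor keeps the clock fixed, other governors may scale it.
from pathlib import Path

governor_file = Path("/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor")
if governor_file.exists():
    print("cpu0 governor:", governor_file.read_text().strip())
else:
    print("cpufreq interface not available on this system")
```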

(Yes, it is a separate issue. This PR simply introduces new benchmarks according to your paper.)

astrofrog commented 6 years ago

@ngoldbaum:

This is probably a separate issue, but don't astropy's benchmarks use airspeed velocity? The last time I checked, asv doesn't have a way to control for CPU throttling or other effects that might introduce benchmark variance

We only run asv on physical machines (not on CI) and prevent core-swapping with taskset -c 0 asv, so we get reasonably stable results:

http://www.astropy.org/astropy-benchmarks/#coordinates.time_latitude
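
(taskset pins the process to a fixed core; the same effect can be achieved from inside Python via the scheduler affinity API. A Linux-only sketch:)

```python
# Pin the current process and its children to CPU core 0, equivalent to
# launching under `taskset -c 0`; os.sched_setaffinity is Linux-only.
import os

os.sched_setaffinity(0, {0})  # pid 0 means "the calling process"
print("allowed cores:", sorted(os.sched_getaffinity(0)))
```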

pllim commented 6 years ago

@mhvk and @ngoldbaum, I think I addressed all your comments.

@astrofrog, I ran the benchmarks and the results should be in https://github.com/pllim/astropy-benchmarks/tree/lim-test, but I am having trouble getting them to show up in https://pllim.github.io/astropy-benchmarks/ . I think it is the same problem as #50, but I am not sure where the script that you updated for the fix lives.

pllim commented 6 years ago

Update: Turns out I also needed to copy results/benchmarks.json over to the "results" branch, not just the results/machine_name/ folder. The new benchmarks (for a limited number of commits) appear at https://pllim.github.io/astropy-benchmarks, so subclassing seems to work!
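
For anyone hitting the same problem: the published site is built from the results branch, and it needs both the per-machine result directories and the top-level results/benchmarks.json index to be present there. A quick sanity check before pushing, as a sketch (the machine directory name is whatever asv recorded for your machine):

```python
# Sanity-check that the results directory has everything the published site
# needs: the benchmarks.json index plus at least one per-machine directory.
from pathlib import Path

results = Path("results")
assert (results / "benchmarks.json").is_file(), "missing results/benchmarks.json"
machine_dirs = [p.name for p in results.iterdir() if p.is_dir()]
assert machine_dirs, "no per-machine result directories found"
print("machines with results:", machine_dirs)
```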