zarr-developers / perfcapture: Issues
Capture the performance of a computer system whilst running a set of benchmark workloads.
MIT License · 2 stars · 2 forks
#29 · Update sphinx requirement from <8.0.3 to <8.1.4 · dependabot[bot] · closed 1 week ago · 0 comments
#28 · Update sphinx requirement from <8.0.1 to <8.0.3 · dependabot[bot] · closed 2 months ago · 0 comments
#27 · Update sphinx requirement from <7.4.8 to <8.0.1 · dependabot[bot] · closed 2 months ago · 0 comments
#26 · Update sphinx requirement from <7.4.5 to <7.4.8 · dependabot[bot] · closed 3 months ago · 0 comments
#25 · Update sphinx requirement from <7.3.8 to <7.4.5 · dependabot[bot] · closed 3 months ago · 0 comments
#24 · Update sphinx requirement from <7.2.7 to <7.3.8 · dependabot[bot] · closed 6 months ago · 0 comments
#23 · Add selected_datasets option to restrict datasets that are tested · jbms · closed 11 months ago · 1 comment
#22 · Resolve block device symlinks · jbms · closed 11 months ago · 1 comment
#21 · CI is failing · JackKelly · opened 1 year ago · 0 comments
#20 · `PerfCounters` should return a `results` DataFrame, not format its output · JackKelly · closed 1 year ago · 0 comments
#19 · `perfcapture` should output a file, which we then read in an `ipynb` · JackKelly · closed 1 year ago · 0 comments
#18 · Remove `parameterize.py` if we don't use it · JackKelly · opened 1 year ago · 0 comments
#17 · Allow users to select workload(s) & dataset(s) at the CLI · JackKelly · closed 11 months ago · 1 comment
#16 · Enable `perfcapture` to be run against recipes on GitHub so users don't have to clone `zarr-benchmark` · JackKelly · opened 1 year ago · 0 comments
#13 · Update sphinx requirement from <7.2.5 to <7.2.7 · dependabot[bot] · closed 1 year ago · 0 comments
#12 · Consider defining workloads & benchmarks in yaml to be consistent with `zarr_implementations` · JackKelly · opened 1 year ago · 0 comments
#11 · Maybe rename `Dataset.prepare` to `Dataset.create` · JackKelly · closed 1 year ago · 0 comments
#10 · Automated integration test which runs the CLI against all examples · JackKelly · opened 1 year ago · 0 comments
#9 · Document terminology & make it consistent throughout the code · JackKelly · opened 1 year ago · 2 comments
#8 · Consider different names for project (instead of `perfcapture`)? · JackKelly · opened 1 year ago · 0 comments
#7 · Document how to add a new workload and dataset · JackKelly · opened 1 year ago · 0 comments
#6 · Implement parameterised `Workload.run` & `Dataset.prepare` methods · JackKelly · closed 1 year ago · 1 comment
#5 · Increment the `Development status` in setup.cfg · JackKelly · closed 1 year ago · 0 comments
#4 · Measure & record performance while benchmarks run! · JackKelly · closed 1 year ago · 1 comment
#3 · Configure entry point to make it easier to run `cli.py` · JackKelly · opened 1 year ago · 1 comment
#2 · Automatically release to PyPI & publish release on GitHub · JackKelly · opened 1 year ago · 0 comments
#1 · Automate building & publishing of docs · JackKelly · opened 1 year ago · 0 comments
#14 · Simple API for specifying each benchmark workload · JackKelly · closed 1 year ago · 2 comments
#15 · What to measure during benchmarking? · JackKelly · opened 1 year ago · 1 comment