DeepLabCut / DLC-inferencespeed-benchmark

A database of inference speed benchmark results on various platforms and architectures
https://deeplabcut.github.io/DLC-inferencespeed-benchmark/
MIT License

Benchmarking #4

Closed · AlexEMG closed this issue 3 years ago

AlexEMG commented 3 years ago

Just ran the benchmark on my NVIDIA RTX TITAN with Ubuntu 20.04!
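For reference, a minimal sketch of how such a run might be produced with the DeepLabCut-live benchmarking utilities; the model and video paths are placeholders, and the exact keyword arguments may differ between `dlclive` versions:

```python
# Sketch: run the inference speed benchmark with dlclive (paths are placeholders).
from dlclive import benchmark_videos

# Benchmarks an exported DeepLabCut model on one or more videos and writes
# the per-frame inference times to a pickle file in the output directory.
benchmark_videos(
    "/path/to/exported_model",      # exported DLC model directory
    ["/path/to/test_video.avi"],    # video(s) used to time inference
    output="/path/to/results",      # directory where the results pickle is saved
)
```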

glopesdev commented 3 years ago

@AlexEMG I've added a GitHub Action and a preprocessing script to try out on the most recent datasets. I don't think we will need the stats object after all, given that we can run Python and the inference times are already in the pickle file.
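For illustration, a rough sketch of what such a preprocessing step might look like when extracting inference times from a results pickle; the file name and the `inference_times` key are assumptions about the pickle layout, not the repository's actual schema:

```python
# Sketch: summarize inference times from a benchmark results pickle.
# The file name and the "inference_times" key are assumed, not confirmed.
import pickle
import numpy as np

with open("benchmark_results.pickle", "rb") as f:
    results = pickle.load(f)

# Assumed layout: an array of per-frame inference times in seconds.
times = np.asarray(results["inference_times"])
print(f"mean inference time: {times.mean() * 1000:.2f} ms per frame")
print(f"throughput: {1.0 / times.mean():.1f} FPS")
```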

I will merge this pull request and probably remove the oldest benchmark pickles, as they might not be compatible with the final benchmark format.