decile-team / cords: Issues
Reduce end-to-end training time from days to hours (or hours to minutes), and energy requirements/costs by an order of magnitude, using coresets and data selection.
https://cords.readthedocs.io/en/latest/
MIT License | 316 stars | 53 forks
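The core idea behind the repository is that training on a small, well-chosen subset (coreset) of the data can approach full-data accuracy at a fraction of the cost. As a hedged illustration of the general technique (this is a textbook k-center greedy selector in NumPy, not CORDS's actual API or any of its strategies such as GLISTER or GradMatch):

```python
import numpy as np

def k_center_greedy(X, k, seed=0):
    """Greedy k-center coreset selection: repeatedly pick the point
    farthest from the already-selected set. This is the classic
    2-approximation to the k-center problem, often used as a simple
    diversity-based data-selection baseline."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    first = int(rng.integers(n))
    selected = [first]
    # distance from every point to its nearest selected center
    dists = np.linalg.norm(X - X[first], axis=1)
    while len(selected) < k:
        nxt = int(np.argmax(dists))          # farthest point so far
        selected.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(X - X[nxt], axis=1))
    return np.array(selected)
```

The returned indices can be fed to any framework's subset/sampler machinery (e.g. `torch.utils.data.Subset`) so the model trains only on the selected fraction of the data.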
#16 Replace apricot by submodlib in cords (opened 3 years ago by rishabhk108, 2 comments)
#15 Added checkpoints to save the model and updated documentation (closed 3 years ago by dheerajnbhat, 0 comments)
#14 Documentation Improvement (opened 3 years ago by krishnatejakk, 0 comments)
#13 Implement a better version of OMP algorithm (closed 2 years ago by krishnatejakk, 1 comment)
#12 Feature: add support for hyperparameter tuning with subset selection (closed 3 years ago by savan77, 0 comments)
#11 New Gradient Computation Code (closed 3 years ago by krishnatejakk, 0 comments)
#10 CONFIG Files Pull (closed 3 years ago by krishnatejakk, 0 comments)
#9 Dev (closed 3 years ago by krishnatejakk, 0 comments)
#8 Modify submodular selection strategy to include different submodular functions (opened 3 years ago by krishnatejakk, 0 comments)
#7 Implement master train strategy code for all selection strategies (closed 3 years ago by krishnatejakk, 1 comment)
#6 Update readme.md file with new information (closed 3 years ago by krishnatejakk, 0 comments)
#5 CORDS: Get an efficient version of computing gradients for general loss function (closed 3 years ago by krishnatejakk, 1 comment)
#4 Regularized versions for GLISTER, GradMatch for subset selection (opened 3 years ago by krishnatejakk, 0 comments)
#3 Inclusion of new tutorials for CORDS with better documentation (closed 3 years ago by krishnatejakk, 0 comments)
#2 CORDS: Empirical study of different selection strategies in different setting like regression, classification, object detection (opened 3 years ago by krishnatejakk, 1 comment)
#1 CORDS gradient calculations for different loss functions (closed 3 years ago by krishnatejakk, 1 comment)
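Issue #13 above asks for a better OMP implementation; the GradMatch strategy referenced in issue #4 is built on orthogonal matching pursuit to match subset gradients to the full-data gradient. For context, a minimal textbook OMP sketch in NumPy (this is an assumption-laden illustration of the generic algorithm, not CORDS's actual implementation):

```python
import numpy as np

def omp(A, b, k):
    """Orthogonal Matching Pursuit: greedily select k columns of A
    that best explain b, re-fitting the coefficients by least
    squares after each selection."""
    residual = b.copy()
    support = []
    for _ in range(k):
        # pick the column most correlated with the current residual
        correlations = np.abs(A.T @ residual)
        correlations[support] = -np.inf   # never re-pick a column
        support.append(int(np.argmax(correlations)))
        # least-squares re-fit on the selected support
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x
```

In the GradMatch setting, the columns of `A` would be per-sample (or per-minibatch) gradients and `b` the full-data gradient, so the recovered sparse `x` gives both which samples to keep and their weights.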