Standard counting can be quite time-consuming for large tables.
I use Postgres, and after downgrading the database plan the performance is acceptable for the project overall, but the item counts on the Dashboard page got noticeably slower: for some large tables (>8 million rows) a count now takes >15s, which in total leads to a timeout error. Rather than turning the statistics off entirely, it would be nice to be able to change the counting strategy. For example, in a model I could redefine it like this:
def self.count(arg = nil)
  # Only approximate when no column is given and the relation carries
  # no custom SQL (e.g. a WHERE clause); otherwise count exactly.
  can_be_approximate = !arg && !try(:to_sql)
  return super(arg) unless can_be_approximate

  # Read the planner's row estimate from pg_class instead of scanning
  # the table; quote the table name rather than interpolating it raw.
  sql = "SELECT reltuples AS approximate_row_count " \
        "FROM pg_class WHERE relname = #{connection.quote(table_name)}"
  connection.exec_query(sql)[0]['approximate_row_count'].to_i
end
Under conditions like mine, for some large tables this runs more than 100 times faster (>15s => ~0.15s).
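As a side note, a minimal sketch of building the estimate query itself, runnable without a database. The helper name `approximate_count_sql` is hypothetical; matching on `oid = '…'::regclass` instead of `relname` disambiguates tables that share a name across schemas, and the estimate is only as fresh as the last VACUUM/ANALYZE (autovacuum normally keeps it close):

```ruby
# Hypothetical helper: builds the Postgres row-estimate query for a table.
# The table name is escaped as a string literal here because this sketch
# has no ActiveRecord connection; in a model, connection.quote does this.
def approximate_count_sql(table_name)
  quoted = "'#{table_name.gsub("'", "''")}'" # minimal literal escaping
  "SELECT reltuples::bigint AS approximate_row_count " \
    "FROM pg_class WHERE oid = #{quoted}::regclass"
end

puts approximate_count_sql("events")
```

Casting `reltuples` to bigint avoids the float value Postgres stores internally.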