Closed codekitchen closed 6 months ago
You can batch updates yourself by introducing an intermediary object that handles batching according to your app's business rules (this is known as the Mediator Design Pattern). You would explicitly data-bind the table to the Mediator object instead of the original Model, and the Mediator object would actually set the attribute that is data-bound to the table (e.g. `mediator.records` if the table is data-bound as `cell_rows <=> [mediator, :records]`) once its value has changed and is ready for display to the user. Alternatively, you can call `notify_observers(:attribute_name)` from the object that is data-bound to the table once the attribute data is ready to display to the user (Glimmer automatically adds that `notify_observers(attr)` method to any object data-bound to a table).
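To make the Mediator idea concrete, here is a minimal plain-Ruby sketch (independent of Glimmer, so the names `Mediator`, `model_changed`, and `flush_interval` are illustrative, not Glimmer API). The point is that the view observes only the Mediator's `records` attribute, which the Mediator republishes at most once per flush window no matter how often the underlying model churns:

```ruby
# Hypothetical Mediator that batches rapid model updates before exposing
# them to the view. The table would be data-bound to `mediator.records`,
# not to the model directly.
class Mediator
  attr_reader :records # the attribute the table would be data-bound to

  def initialize(model, flush_interval: 0.1)
    @model = model
    @flush_interval = flush_interval
    @records = model.records.dup
    @dirty = false
    @last_flush = Time.now
  end

  # Called on every rapid model change; republishes to the view-facing
  # attribute at most once per flush_interval.
  def model_changed
    @dirty = true
    flush if Time.now - @last_flush >= @flush_interval
  end

  def flush
    return unless @dirty
    @records = @model.records.dup # in Glimmer, setting the bound attribute triggers the table update
    @dirty = false
    @last_flush = Time.now
  end
end
```

In a real Glimmer app you would assign through an attribute writer (or call `notify_observers`) so the data-binding fires, and you would also flush the final partial batch on a timer.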
Alternatively, you can have the table use implicit data-binding instead of explicit data-binding.
Here is an example with explicit data-binding, which would get affected if the model attribute (e.g. `user.contacts`) got updated 100 times per second or so:
```ruby
table {
  text_column('Name')
  text_column('Email')
  text_column('Phone')
  text_column('City')
  text_column('State')

  cell_rows <=> [user, :contacts]
}
```
Here is the same example written with implicit data-binding, which binds the table to the data collection only once but does not observe the model attribute's updates (so if `user.contacts` got replaced by a new collection multiple times per second, that would not affect the table):
```ruby
table {
  text_column('Name')
  text_column('Email')
  text_column('Phone')
  text_column('City')
  text_column('State')

  cell_rows user.contacts
}
```
In the case of implicit data-binding, you'd have to add an extra observer to finally update the table after a batch of changes occurs, like this:
```ruby
@table = table {
  text_column('Name')
  text_column('Email')
  text_column('Phone')
  text_column('City')
  text_column('State')

  cell_rows user.contacts
}

# ...

# In other code in the View (usually in an `after_body` hook on a
# `Glimmer::LibUI::Application` or `Glimmer::LibUI::CustomControl`),
# you do something like this:
@table.cell_rows = new_contacts
```
If you want to load data gradually, you can do so with a different paginated table control called `refined_table`. That is what we do in the Internet Radio app, rubio-radio, which gradually loads 33,000 radio stations into a Glimmer DSL for LibUI table using `refined_table` (data is added gradually to the `@presenter.stations` array, which is implicitly data-bound to `refined_table` by being passed as its `model_array` option).
https://github.com/kojix2/rubio-radio/blob/main/lib/rubio/view/radio.rb
Also, you can initialize a normal `table` from a large dataset gradually, row by row as the user scrolls down through the `table`, by setting `cell_rows` to an object that is an `Enumerator` or `Enumerator::Lazy` instead of an `Array`.
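Conceptually, such an enumerator yields rows only on demand, which is what lets a table fetch rows lazily as the user scrolls. A plain-Ruby sketch of that kind of lazy row source (independent of Glimmer; the row contents here are made up for illustration):

```ruby
# A lazy row source: rows are computed only when something iterates
# over them, so only the visible portion of a large dataset is built.
lazy_rows = Enumerator.new do |yielder|
  1.upto(33_000) do |i|
    # In a real app, each row might come from a DB cursor or an API page.
    yielder << ["Name #{i}", "email#{i}@example.com", "555-#{i}"]
  end
end.lazy

# Only the first three rows are ever materialized by this call:
first_three = lazy_rows.first(3)
```

An object like `lazy_rows` is the sort of value you would pass as `cell_rows`, per the example linked below.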
Example: https://github.com/AndyObtiva/glimmer-dsl-libui/blob/master/examples/lazy_table4.rb
In conclusion, there are several options to explore:
1. `table` with explicit data-binding through a Mediator object that batches updates
2. `table` with implicit data-binding plus a manual `cell_rows` update after each batch of changes
3. `refined_table`, a custom control for paginating a table while loading its data incrementally
4. An `Enumerator` or `Enumerator::Lazy` as the `table` `cell_rows` value, which loads data into the table row by row as the user is scrolling down

If your query is adding records one by one, I think options 3 and 4 work best. I would probably start with option 4 first as I believe it is the simplest.
If your data just needs to be shown to the user after it is fully loaded, without gradually loading into the table, then options 1 and 2 work best.
(If you come up with a general observer batching solution, you are welcome to contribute to Glimmer)
If you need further help, feel free to ask. I have a lot of experience with GUI problems and have built many GUI apps using Glimmer with and without a database and/or online API.
You can also hit me up on Gitter Chat (on Element) if you prefer to chat directly instead: https://gitter.im/AndyObtiva/glimmer?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
Oh that is great, thanks! I hadn't fully understood "implicit data-binding" before. That's perfect for this use case, where frequent updates to existing rows are what is causing the perf issue. I just tried it out and it simplifies things a lot. Now I'm essentially just doing (simplified):
```ruby
Glimmer::LibUI.timer(0.1) { @table.cell_rows = self.conns }
```
and this batches up the frequent updates to the existing table rows. Stress testing with thousands of requests per second goes smoothly now. Thanks!
Hi! Is there any ability in glimmer to batch up observer updates in any way? I have a table with a connection query count, and when queries are coming in quickly it can trigger thousands of updates in one second, causing the UI to lock up for a few seconds while it catches up:
I'm looking at the observer code in glimmer and nothing is jumping out at me, but I wanted to make sure I'm not missing anything. Ideally it'd be nice to configure an observed property to limit fired change events to one every 100 ms or something. Right now I'm working around it with some custom event queueing logic that tries to combine updates in a 100 ms window and only update the model once per batch, but of course it'd be great to avoid having to do that in the application code.
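For reference, the 100 ms coalescing workaround described above can be sketched in plain Ruby (the `UpdateBatcher` name and API are illustrative, not part of Glimmer; a real version would also need a trailing timer to flush the last partial batch):

```ruby
# Coalesces many rapid change events into at most one model update
# per time window, so observers fire once per batch instead of once
# per event.
class UpdateBatcher
  def initialize(window: 0.1, &apply)
    @window = window
    @apply = apply                      # the single batched update to perform
    @pending = 0
    @last_applied = Time.now - window   # allow the first event through immediately
  end

  # Called on every incoming event; applies at most once per window.
  def record_event
    @pending += 1
    maybe_apply
  end

  def maybe_apply
    return if @pending.zero?
    return if Time.now - @last_applied < @window
    count = @pending
    @pending = 0
    @last_applied = Time.now
    @apply.call(count) # e.g. set the observed model attribute once for the whole batch
  end
end
```

The block passed to the constructor would be where the model attribute gets set (firing the table's data-binding) once per batch.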