htsv opened this issue 4 years ago
Are you using `moor_ffi` or `moor_flutter`? Are you using batches to insert?
I'm using `moor_ffi`. I'm using batch inserts as of now.
I'm also writing the same entry to an FTS table through a trigger, and it does not get updated if I use the replace mode on the insert, so I'm deleting the entry first and then inserting it, as in the code below.
```dart
Future<void> batchInsertProducts(List<Product> entries) async {
  return await batch((batch) {
    for (final entry in entries) {
      // Delete first so the FTS delete trigger fires, then re-insert.
      batch.delete(db.products, entry);
      batch.insert(db.products, entry);
    }
  });
}
```
`Product` is a class declared as `class Product extends DataClass implements Insertable<Product>`.
This is the method I use to insert. I have to use `await` to ensure that an update to another table is made after the values are inserted into this table.
Thanks for the quick reply!
Sounds like some memory problem; I'm afraid I can't help you there. Try updating moor and moor_ffi, though I'm not sure if that makes a difference.
Any difference between OS, simulator and real device?
The package versions I'm currently using are `moor: ^3.0.2` and `moor_ffi: ^0.5.0`. I'll upgrade them to the latest and try it again.
Yes, the behaviour varies on the simulator and the real device. Tested it just now. Thanks for the advice!
I faced the issue on real devices. On the simulator (both Android and iOS), the wait time is pretty low, something that I wouldn't mind. This should mean that there is a memory problem, right? Is there something that I can do about this? Please let me know.
It seems like some memory problem but like I said, I have no idea. Just asking questions that might help solve this.
I learned today that sqlite can use some sort of temp file: https://github.com/simolus3/moor/issues/621#issuecomment-641318713 Maybe this can help on Android?
I updated the packages, but it wasn't of much use.
So I went ahead and implemented the suggested solution, i.e. the one mentioned in issue #621, and it was not of much help either; things stayed the same. Also, I think providing a temporary path is useful in situations where a query returns a large number of rows (I was able to test this, as I had a table with 200k products and a query returning about 14k products in the result; this was not possible before without the temp file). I'm not sure, though.
Do you have any other suggestions? I'm unsure of what else could be done.
No, you probably have to wait for @simolus3's input.
> As I'm also writing the same entry to an FTS table, through a trigger and it does not update if I use the replace mode on the insert
Interesting. It looks like an `INSERT` statement that behaves like an update doesn't call triggers listening on inserts. I assume this should work if you have another trigger to propagate updates into the fts table, but I haven't tried it.
I also think that this is likely some memory issue: a `batch` implicitly creates a transaction if there isn't one, so there should only be one write to disk. Does the situation improve at all if you split the work across multiple batches and use a `transaction` around all of that? That might reduce memory usage.
Thanks for the suggestion! I'll test it out and let you know if it helps.
Hey, I split the work into multiple batches wrapped in a transaction but there is no improvement. Could this be due to something else? Is there any other thing that I could try?
Nothing obvious comes to mind, sorry. I would have to do some research to find out why this is happening. It might be that fts5 tokenization and indexing is just taking a long time on slow devices, but that's just speculation.
If this is causing frame drops for you, you could try running the operation in a background isolate. That won't make it any faster, but at least it doesn't block the main thread.
I'm already running it on a background isolate; there is no frame-drop issue, it's the speed that's bothering me. I'll see if I can change the way I'm entering the data, and I'll add it here if I find a workaround.
Anyway, thank you for the suggestions!
Yeah, I also noticed that the first insert into a table takes around 10x longer than subsequent inserts, individually. I'm not sure why this is the case.
```
I/flutter ( 9726): ⏳ TIMER [NEW T1 INSERT] STARTED
I/flutter ( 9726): ⏳ TIMER [NEW T1 INSERT] STOPPED AT => 0:00:00.243582
I/flutter ( 9726): ⏳ TIMER [NEW LIST_O1 INSERT] STARTED
I/flutter ( 9726): ⏳ TIMER [NEW LIST_O1 INSERT] STOPPED AT => 0:00:00.182205
I/flutter ( 9726): ⏳ TIMER [NEW T_O_REL_1 INSERT] STARTED
I/flutter ( 9726): ⏳ TIMER [NEW T_O_REL_1 INSERT] STOPPED AT => 0:00:00.033983
I/flutter ( 9726): ⏳ TIMER [NEW T2 INSERT] STARTED
I/flutter ( 9726): ⏳ TIMER [NEW T2 INSERT] STOPPED AT => 0:00:00.001626
I/flutter ( 9726): ⏳ TIMER [NEW LIST_O2 INSERT] STARTED
I/flutter ( 9726): ⏳ TIMER [NEW LIST_O2 INSERT] STOPPED AT => 0:00:00.000998
I/flutter ( 9726): ⏳ TIMER [NEW T_O_REL_2 INSERT] STARTED
I/flutter ( 9726): ⏳ TIMER [NEW T_O_REL_2 INSERT] STOPPED AT => 0:00:00.000458
```
I'm using `moor: ^3.0.2`. The first insert into a table that has about 10k entries takes a long time, whereas this does not happen for tables that have about 1k entries.
I'm facing this issue when trying to sync with the database on the backend. The calls are paginated to fetch at most 1000 entries per page. I'm batch-inserting these entries; I had also tried single entries. Both cases cause the same issue.
But once the first insert is done, everything works fine as it should.
Please let me know what can be done. Hope you reply soon!