If you're using MySQL, try updating TNTSearch to the latest version with `composer require teamtnt/tntsearch` and then run `tntsearch:import Model` again.
Let us know if it works
It's importing 1000 at a time. Well done! Any news on filtering?
Filtering is planned, but it's hard to say exactly when it'll be done, since we're working on this project in our spare time.
Alright, thank you.
@Melcus How long did it take for you to index those 5 million rows? And what's the search execution time?
It goes at about 1000 rows/sec (I'm on Homestead, so theoretically it can go even faster). I haven't imported all of them yet; the data is still growing. The only thing I tested was a search on 20k rows, which took ~54ms.
If I'm trying to use it to create a search engine for a dynamic website with records updated daily, do I have to import the new records every time there is a new entry? Or can I import the Model once when I install it?
@nivanmorgan you only import all of the records once, and that's it. Everything else is handled automatically by Scout, meaning it will sync the index on each update, delete, or insert.
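For context, that automatic syncing comes from Laravel Scout's `Searchable` trait on the model; a minimal sketch (the `Post` model and column names are just placeholders) might look like this:

```php
<?php

namespace App;

use Illuminate\Database\Eloquent\Model;
use Laravel\Scout\Searchable;

class Post extends Model
{
    // The Searchable trait hooks into the model's save/delete events,
    // so the TNTSearch index stays in sync without extra code.
    use Searchable;

    /**
     * Limit the indexed payload to the columns you actually search on.
     */
    public function toSearchableArray()
    {
        return [
            'id'    => $this->id,
            'title' => $this->title,
            'body'  => $this->body,
        ];
    }
}
```

With that in place, a regular `Post::create([...])`, `$post->save()`, or `$post->delete()` updates the index on its own, so the one-off `tntsearch:import` is only needed for records that existed before Scout was set up.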
Hi,
Is there any way to disable the handlers?
I need to disable indexing while importing data, and then refresh the index afterwards.
The automatic index synchronization makes my data imports very slow.
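Not an answer given in the thread, but depending on your Scout version you can pause syncing around a bulk import and rebuild the index afterwards; a rough sketch (the `Post` model and `$rows` array are hypothetical) could be:

```php
<?php

use App\Post;

// Pause Scout's model observers so the bulk import doesn't trigger
// an index write for every single row.
Post::withoutSyncingToSearch(function () use ($rows) {
    foreach ($rows as $row) {
        Post::create($row);
    }
});

// Rebuild the index for all records in one pass afterwards.
Post::makeAllSearchable();
```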
I want to import a table with about 5 million rows, and it runs out of memory. Is there a way to import them in batches using `tntsearch:import`?
Also, having filtering would make this package a must in every project.
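Not from the thread itself, but if the import command exhausts memory, one workaround is to chunk the query yourself and push each chunk to the index via Scout's `searchable()` collection macro; a sketch assuming a placeholder `Post` model:

```php
<?php

use App\Post;

// Hydrate only 1000 models at a time and index each chunk,
// keeping memory usage flat for very large tables.
Post::query()
    ->orderBy('id')
    ->chunk(1000, function ($posts) {
        $posts->searchable();
    });
```

Using `chunkById(1000, ...)` instead of `chunk()` is a safer choice if new rows may be inserted while the import is running.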