This would be very useful. I will certainly implement support for caching.
great! thanks a lot
thanks for fixing my issues! when are you going to release a new version for using it with laravel?
Each search result is now cached, and the number of queries needed to fetch a limited set of results has been reduced. These improvements are available in version 1.2.* for Laravel 4.2.
I'm already using Laravel 5.1, any idea when it will also be available there?
I will try to make the necessary changes for Laravel 5.0 and 5.1 next week.
great, thanks a lot
All improvements have been ported. Try v2.1.2.* for Laravel 5.1.
thank you very much, I'll try them asap
can you summarise the changes I have to implement to make use of the caching? or is this caching used automagically?
There is one big change: new filtering of models. In `SearchableInterface`, `isSearchable()` has been replaced with `getSearchableIds()`. Caching of results (in memory, as an array) is enabled by default.
Are there any problems?
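For anyone migrating, here is a minimal sketch of what that change might look like on a model. The interface namespace, the static modifier, and the exact signature of `getSearchableIds()` are assumptions based on this thread, not taken from the package source, so check `SearchableInterface` in your installed version.

```php
<?php

use Illuminate\Database\Eloquent\Model;
// Namespace assumed from the Config class mentioned elsewhere in this thread.
use Nqxcode\LuceneSearch\Model\SearchableInterface;

class Product extends Model implements SearchableInterface
{
    // Old API: each instance reported whether it was searchable.
    // public function isSearchable()
    // {
    //     return (bool) $this->published;
    // }

    // New API (assumed shape): return the IDs of all records that should be
    // searchable, so hits can be filtered against this list.
    public static function getSearchableIds()
    {
        // lists() is the Laravel 5.1-era name; it was renamed to pluck() in 5.2.
        return static::where('published', true)->lists('id')->all();
    }
}
```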
well, I have a database with roughly 75,000 rows in it, and Lucene only searches the name field. I didn't implement the `SearchableInterface`, because all items need to be searchable.
I noticed with Laravel Debugbar that the laravel-lucene-search package still queries all item IDs from the database, and the `Nqxcode\LuceneSearch\Model\Config->models()` method checks whether the hits are also in the IDs array. Maybe it makes more sense to skip this check when a model class doesn't implement the `SearchableInterface`? I forked this repo and tried it out:
You can speed up fetching results by not selecting all results at once; get the results in parts instead (use `paginate()`, or `limit()` together with `get()`).
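For reference, fetching in parts looks roughly like this. The `Search` facade and the `paginate()` / `limit()` / `get()` calls follow the method names mentioned above, but treat the exact signatures (in particular the `limit()` argument order) as something to confirm against the README of the version you have installed.

```php
<?php

// Sketch of fetching results in parts rather than materializing every hit at once.
// Assumes the Search facade registered by nqxcode/laravel-lucene-search.

// Paginated results, e.g. 25 models per page:
$results = \Search::query('some phrase')->paginate(25);

// Or an explicit slice: 25 hits starting at offset 50 (argument order assumed).
$results = \Search::query('some phrase')->limit(25, 50)->get();
```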
The `Nqxcode\LuceneSearch\Model\Config->models()` method filters out nonexistent hits (in other words, it passes through only the hits allowed by `getSearchableIds()`).
If you disable this filtering, the results will be incorrect: they may contain models that no longer exist. You would gain a negligible speed-up, but the logic of fetching results would be broken.
I made some improvements to the `get()` function (see commit d2a6155), please pull them.
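To make the filtering point above concrete, here is a purely illustrative snippet (not the package's actual code): the Lucene index can still hold hits for rows that have since been deleted, and checking hit IDs against the searchable IDs is what keeps those out of the results.

```php
<?php

// Hypothetical illustration of the filtering done in Config->models():
// hits coming back from the Lucene index are kept only if their IDs are
// still among the IDs reported as searchable for that model.

/**
 * @param int[] $hitIds        IDs extracted from the Lucene hits
 * @param int[] $searchableIds IDs the model reports via getSearchableIds()
 * @return int[]               hits that still map to existing, allowed models
 */
function filterHits(array $hitIds, array $searchableIds)
{
    return array_values(array_intersect($hitIds, $searchableIds));
}

// Example: the index still references a deleted record (ID 7), which is dropped.
$hits = filterHits([3, 7, 12], [1, 2, 3, 12]); // [3, 12]
```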
This is more like a feature request than an issue, but it would be great if this library supported caching. Currently the library runs a select query for every hit the Lucene search engine returns. Caching the results of these queries would be a huge speed improvement.
What do you think?