philipboomy opened 6 years ago
So I was partly wrong. It is working correctly. The reason it was still pushing 4 records to Algolia was that I had 4 files in my folder, but here is the weird part: I only had 3 records in the CP. This is probably just an anomaly, so let's ignore it for now.
I do wonder about something else. I deleted a record in the CP and the Algolia index was not updated, even though I have toggled on "Search auto-index" in the settings.
Does that mean I have to run `php please search:update collections/recipes` every time I delete a record? That's not something a client can do. I am not sure if this is a bug or if I should file a feature request so that Algolia is updated after deleting an entry.
If it's possible to delete an entry in Algolia when deleting it in the CP, it would also be nice if it didn't delete all the records in Algolia and re-push them, as bigger sites will quickly hit the "INDEXING OPERATIONS" limit in Algolia. Imagine a client with 2000 entries doing 2-3 deletes a day (which I doubt, but still).
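For what it's worth, removing a single record through Algolia's REST API counts as just one indexing operation rather than thousands. A minimal sketch of what a targeted delete looks like (the index name `recipes`, the placeholder credentials, and the objectID are all assumptions; Statamic may map entry IDs to objectIDs differently):

```shell
# Delete one record from the "recipes" index instead of rebuilding everything.
# APP_ID, ADMIN_API_KEY, and OBJECT_ID are placeholders for your own
# credentials and the record's objectID.
curl -X DELETE \
  -H "X-Algolia-Application-Id: ${APP_ID}" \
  -H "X-Algolia-API-Key: ${ADMIN_API_KEY}" \
  "https://${APP_ID}.algolia.net/1/indexes/recipes/${OBJECT_ID}"
```

This requires a live Algolia app and an admin API key, so it is only a sketch of the per-record approach, not something the CP does today.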
Update to the above issue regarding 3 or 4 entries: I am wondering if one of the posts was a draft. That could explain the issue, and if so it was a user error. Sorry about that.
I have tested the Algolia indexing after deleting an entry and it works fine. So I believe the earlier indexing issue was due to an entry being a draft.
I will keep this issue open so you can see that the issue is "fixed".
What is best practice here if I want to keep this bit for you to consider:
> I do wonder about something else. I deleted a record in the CP and the Algolia index was not updated, even though I have toggled on "Search auto-index" in the settings.
> Does that mean I have to run `php please search:update collections/recipes` every time I delete a record? That's not something a client can do. I am not sure if this is a bug or if I should file a feature request so that Algolia is updated after deleting an entry.
> If it's possible to delete an entry in Algolia when deleting it in the CP, it would also be nice if it didn't delete all the records in Algolia and re-push them, as bigger sites will quickly hit the "INDEXING OPERATIONS" limit in Algolia. Imagine a client with 2000 entries doing 2-3 deletes a day (which I doubt, but still).
Should I open an FR?
Once again, sorry about the draft entry.
Don't know if this helps, @philipboomy, but this is related to #1839, in that collection entries aren't updated in search indexes when using individual collection-specific indexes. A workaround is to either regularly rebuild the Algolia index or use an add-on to watch for those changes and update it yourself. :/
Ah, I see. I should have found that issue myself. Thanks for linking to it. Hopefully it can be fixed, but in the meantime I might set up a cronjob or something simple.
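If anyone else goes the cronjob route, a minimal crontab sketch (the project path and log file are placeholders for your own install; `collections/recipes` is the index from my setup):

```shell
# Rebuild the Algolia index for the recipes collection nightly at 02:00.
# /var/www/mysite and the log path are placeholders; adjust for your server.
0 2 * * * cd /var/www/mysite && php please search:update collections/recipes >> storage/search-update.log 2>&1
```

Note that a full nightly rebuild still re-pushes every record, so it counts against the indexing-operations quota accordingly.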
Any updates on this issue?
Expected behaviour
I expect Algolia to be updated when running the update command so that it reflects what's in the Statamic install.
Actual behaviour
Deleted entries in Statamic are still in the Algolia index, even after running the cache:clear command.
Steps to reproduce
Server Details
Operating System: Mac
Web Server: Valet
PHP Version: 7.1
Statamic Version: 2.8.10
Updated from an older Statamic or fresh install: 2.8.6
List of installed addons: No addons
Logs