Closed phcaptjim closed 6 years ago
Hi Jim, An existing store in Redis should not be cleared just because you restarted your Streams application or changed the tuple schema. Please check whether someone restarted the Redis server, which would cause it to lose its in-memory contents.
On a different note, it is not correct to change the tuple schema when an existing store already holds data created with a different schema. If you must change your schema, then start inserting data based on the new schema into a completely different store. Then, using a separate test application, migrate the existing data from the old store to the new store by transforming the stored tuples to the new schema. After this migration, you can get rid of the old store via the dpsRemoveStore API within your test application.
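A migration along those lines might look like the following SPL sketch inside a Custom operator. This is illustrative only: the store name `ADTStoreV2`, the type `ADTInfoTupleV2` (the old schema plus the new fields), and the exact iteration call signatures are assumptions; check the DPS toolkit reference for the precise API.

```spl
// Hypothetical one-shot migration sketch (names and signatures assumed).
mutable uint64 err = 0ul;
rstring dummyRstring = "";
// Old store with the original schema, new store with the extended schema.
uint64 oldStore = dpsCreateOrGetStore("ADTStore", dummyRstring, ADTInfoTuple, err);
uint64 newStore = dpsCreateOrGetStore("ADTStoreV2", dummyRstring, ADTInfoTupleV2, err);

mutable rstring key = "";
mutable ADTInfoTuple oldValue = {};
uint64 it = dpsBeginIteration(oldStore, err);
while (dpsGetNext(oldStore, it, key, oldValue, err)) {
    // Transform the old tuple into the new schema; the added
    // fields keep their default values unless you set them here.
    mutable ADTInfoTupleV2 newValue = {};
    assignFrom(newValue, oldValue); // copies the matching attributes
    dpsPut(newStore, key, newValue, err);
}
dpsEndIteration(oldStore, it, err);

// After verifying the new store's contents, retire the old store.
dpsRemoveStore(oldStore, err);
```

From that point on, the main application would open only `ADTStoreV2` with the new tuple type.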
Hi Nysenthil, Thanks for your comments on the data store. What you are saying makes perfect sense. I will try to find out how the store was reset, since we did not restart Redis, and going forward we will apply your technique of creating a new store when the schema changes. Thanks!
We are running Streams 4.2.1 and use Redis for a real-time lookup of patient beds. Last week we changed some logic in our SPL code and migrated that code to our test environment. We noticed that when we restarted the SPL jobs and the HAPI microservices, Redis was cleared during this process. I am trying to figure out why that happened.
We are using the following command when checking for the data store:
store_handle = dpsCreateOrGetStore("ADTStore", dummyRstring, ADTInfoTuple, err);
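For context, in the DPS toolkit the second and third arguments here are dummy variables whose SPL types declare the store's key and value schema; only the types matter, not the values. A minimal sketch of that pattern (variable names assumed):

```spl
// The dummy arguments only convey the key and value TYPES to the store.
rstring dummyRstring = "";       // key type: rstring
ADTInfoTuple dummyTuple = {};    // value type: the current tuple schema
mutable uint64 err = 0ul;
uint64 store_handle = dpsCreateOrGetStore("ADTStore", dummyRstring, dummyTuple, err);
```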
One thing that did change is the tuple schema: we added 4 additional fields to our tuples, so perhaps that is why the data store was cleared? Does anyone have experience with this? I want to see if there is a way to make this kind of update in the future without clearing our data store. Thanks!