Original comment by mitja.le...@gmail.com
on 20 Apr 2012 at 1:10
So what are we supposed to do if we encounter this problem?
Original comment by Rabarija...@gmail.com
on 20 Apr 2012 at 8:33
The problem should be fixed in the last svn commit. Indexes have to be recreated,
since the initialization vectors (IVs) in CBC encryption are not necessarily
correct if data is read back in a different order.
Original comment by mitja.le...@gmail.com
on 20 Apr 2012 at 3:35
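For context on the fix above: if pages can be read back in any order, the CBC IV for each page has to be reconstructible at read time rather than carried over from whatever was decrypted previously. A minimal sketch of one way to do that, deriving the IV from a per-index salt and the page number (both the class and the layout are hypothetical here, not LuceneTransform's actual internals):

```java
import java.nio.ByteBuffer;
import java.security.MessageDigest;

public class DeterministicIv {

    // Derive a 16-byte AES-CBC IV from a secret per-index salt and the
    // page number, so the same page always decrypts with the same IV
    // no matter in which order pages are read back.
    static byte[] ivForPage(byte[] salt, long pageNumber) throws Exception {
        MessageDigest sha = MessageDigest.getInstance("SHA-256");
        sha.update(salt);
        sha.update(ByteBuffer.allocate(Long.BYTES).putLong(pageNumber).array());
        byte[] iv = new byte[16];                   // AES block size
        System.arraycopy(sha.digest(), 0, iv, 0, 16);
        return iv;
    }
}
```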
Thank you :) I'll check it this week.
Original comment by Rabarija...@gmail.com
on 21 Apr 2012 at 7:48
I checked out the latest svn commit, but I still get the error, and more often
than with version 0.9.2.2 of LuceneTransform. I'm indexing more than 40 GB of
data. Is it because of the data volume? When I indexed 300 MB of data, I didn't
get it.
Original comment by Rabarija...@gmail.com
on 23 Apr 2012 at 8:42
When I indexed 2.5 million records, the index was hanging forever (it was a
deadlock) during concurrent updates and reads using SolrJ. I had to go with an
alternate approach to encryption (field-level encryption).
Original comment by sudheerp...@gmail.com
on 23 Apr 2012 at 4:00
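For anyone considering the same workaround: a minimal sketch of field-level encryption, where each sensitive value is encrypted before it goes into the document and decrypted after retrieval. The class and method names are hypothetical, not from SolrJ or LuceneTransform; you would store the Base64 string this produces as the field value. The trade-off is that an encrypted field can be stored and fetched but not meaningfully searched, sorted, or faceted, since a fresh IV makes every ciphertext different:

```java
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;
import java.util.Base64;

public class FieldCrypto {

    private final SecretKeySpec key;
    private final SecureRandom random = new SecureRandom();

    FieldCrypto(byte[] rawAesKey) {                 // 16/24/32 bytes
        this.key = new SecretKeySpec(rawAesKey, "AES");
    }

    // Encrypt one field value; the random IV is prepended to the
    // ciphertext so decryption is self-contained.
    String encryptField(String plaintext) throws Exception {
        byte[] iv = new byte[16];
        random.nextBytes(iv);                       // fresh IV per value
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ct = cipher.doFinal(plaintext.getBytes(StandardCharsets.UTF_8));
        byte[] out = new byte[iv.length + ct.length];
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ct, 0, out, iv.length, ct.length);
        return Base64.getEncoder().encodeToString(out);
    }

    // Reverse of encryptField: split off the IV, then decrypt.
    String decryptField(String encoded) throws Exception {
        byte[] in = Base64.getDecoder().decode(encoded);
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE, key,
                    new IvParameterSpec(Arrays.copyOfRange(in, 0, 16)));
        byte[] pt = cipher.doFinal(in, 16, in.length - 16);
        return new String(pt, StandardCharsets.UTF_8);
    }
}
```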
Original issue reported on code.google.com by sudheerp...@gmail.com
on 31 Jan 2012 at 5:37