mitjale / lucenetransform

Automatically exported from code.google.com/p/lucenetransform

TransformedIndexInput throws IOException during search when a sort order based on a date field is provided #3

Closed — GoogleCodeExporter closed this issue 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?

1. Configure Solr with the EncryptedDirectoryFactory
2. Search Solr with a sort order on a date field
3. TransformedIndexInput throws IOException

What is the expected output? What do you see instead?
Expected: results returned in the requested sort order. Instead, an IOException is thrown with the message:
java.io.IOException: Invalid compression chunk location 131072!=4

What version of the product are you using? On what operating system?
Solr version : 3.3.0
LuceneTransform-0.9.2.2.jar
Operating System: Ubuntu 10.04 
JVM version : Java HotSpot(TM) Server VM (build 17.0-b16, mixed mode)

Please provide any additional information below.
Same error is happening when adding a facet with date range.

Stack Trace:
HTTP Status 500 - Invalid compression chunk location 131072!=4

java.io.IOException: Invalid compression chunk location 131072!=4
    at org.apache.lucene.store.transform.TransformedIndexInput.readDecompressImp(TransformedIndexInput.java:471)
    at org.apache.lucene.store.transform.TransformedIndexInput.readDecompress(TransformedIndexInput.java:430)
    at org.apache.lucene.store.transform.TransformedIndexInput.readBytes(TransformedIndexInput.java:563)
    at org.apache.lucene.index.TermBuffer.read(TermBuffer.java:82)
    at org.apache.lucene.index.SegmentTermEnum.next(SegmentTermEnum.java:131)
    at org.apache.lucene.search.FieldCacheImpl$LongCache.createValue(FieldCacheImpl.java:512)
    at org.apache.lucene.search.FieldCacheImpl$Cache.get(FieldCacheImpl.java:191)
    at org.apache.lucene.search.FieldCacheImpl.getLongs(FieldCacheImpl.java:478)
    at org.apache.lucene.search.FieldComparator$LongComparator.setNextReader(FieldComparator.java:513)
    at org.apache.lucene.search.TopFieldCollector$OneComparatorNonScoringCollector.setNextReader(TopFieldCollector.java:95)
    at org.apache.solr.search.DocSetDelegateCollector.setNextReader(DocSetHitCollector.java:147)
    at org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:523)
    at org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:320)
    at org.apache.solr.search.SolrIndexSearcher.getDocListAndSetNC(SolrIndexSearcher.java:1295)
    at org.apache.solr.search.SolrIndexSearcher.getDocListC(SolrIndexSearcher.java:1062)
    at org.apache.solr.search.SolrIndexSearcher.search(SolrIndexSearcher.java:358)
    at org.apache.solr.handler.component.QueryComponent.process(QueryComponent.java:258)
    at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:194)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:1368)
    at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:356)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:252)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:224)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:175)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:462)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:164)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:100)
    at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:851)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:405)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:278)
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:515)
    at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:300)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:619)

Original issue reported on code.google.com by sudheerp...@gmail.com on 31 Jan 2012 at 5:37

GoogleCodeExporter commented 9 years ago

Original comment by mitja.le...@gmail.com on 20 Apr 2012 at 1:10

GoogleCodeExporter commented 9 years ago
So what are we supposed to do if we encounter this problem?

Original comment by Rabarija...@gmail.com on 20 Apr 2012 at 8:33

GoogleCodeExporter commented 9 years ago
The problem should be fixed in the latest svn commit. Indexes have to be recreated, since the initialization vectors (IVs) used in CBC encryption are not necessarily correct when data is read in a different order.
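To illustrate the point about IVs (this is a standalone sketch using the standard `javax.crypto` API, not the actual LuceneTransform code): if each encrypted chunk carries its own IV, any chunk can be decrypted on its own, regardless of the order in which a search (e.g. a sorted query building a field cache) reads them. Deriving the IV from the preceding chunk only works when chunks are read strictly sequentially.

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;

public class ChunkCrypto {
    // Encrypts one chunk with a freshly generated IV and prepends the IV
    // to the ciphertext, so the chunk can later be decrypted on its own.
    static byte[] encryptChunk(SecretKey key, byte[] plain) throws Exception {
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);
        Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
        c.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ct = c.doFinal(plain);
        byte[] out = new byte[16 + ct.length];
        System.arraycopy(iv, 0, out, 0, 16);
        System.arraycopy(ct, 0, out, 16, ct.length);
        return out;
    }

    // Decrypts a chunk using the IV stored in its first 16 bytes; no other
    // chunk is needed, so chunks may be read in any order.
    static byte[] decryptChunk(SecretKey key, byte[] stored) throws Exception {
        Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
        c.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(stored, 0, 16));
        return c.doFinal(stored, 16, stored.length - 16);
    }

    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        byte[] c0 = encryptChunk(key, "chunk zero".getBytes(StandardCharsets.UTF_8));
        byte[] c1 = encryptChunk(key, "chunk one".getBytes(StandardCharsets.UTF_8));
        // Decrypt out of order: chunk 1 first, then chunk 0.
        System.out.println(new String(decryptChunk(key, c1), StandardCharsets.UTF_8));
        System.out.println(new String(decryptChunk(key, c0), StandardCharsets.UTF_8));
    }
}
```

This also explains why existing indexes must be recreated after the fix: chunks written under the old IV scheme cannot be decrypted independently.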

Original comment by mitja.le...@gmail.com on 20 Apr 2012 at 3:35

GoogleCodeExporter commented 9 years ago
Thank you :) I'll check it this week.

Original comment by Rabarija...@gmail.com on 21 Apr 2012 at 7:48

GoogleCodeExporter commented 9 years ago
I checked out the latest svn commit, but I still get the error, and more often than with version 0.9.2.2 of LuceneTransform. I'm indexing more than 40 GB of data. Is it because of the data volume? When I indexed 300 MB of data, I didn't get the error.

Original comment by Rabarija...@gmail.com on 23 Apr 2012 at 8:42

GoogleCodeExporter commented 9 years ago
When I indexed 2.5 million records, the index hung forever (a deadlock) during concurrent updates and reads via SolrJ. I had to switch to an alternate approach to encryption (field-level encryption).
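For reference, the field-level approach mentioned here can be sketched as encrypting individual field values before they are handed to Solr and decrypting them on retrieval; the index structures themselves stay unencrypted, which sidesteps the directory-level transform entirely. The helper names below are hypothetical illustrations (standard `javax.crypto` only), not part of this project:

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;

public class FieldLevelEncryption {
    // Encrypts a field value to a Base64 string (random IV prepended);
    // the resulting string would be stored in the Solr field instead of
    // the plaintext.
    static String encryptField(SecretKey key, String value) throws Exception {
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);
        Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
        c.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ct = c.doFinal(value.getBytes(StandardCharsets.UTF_8));
        byte[] out = new byte[iv.length + ct.length];
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ct, 0, out, iv.length, ct.length);
        return Base64.getEncoder().encodeToString(out);
    }

    // Reverses the transformation when reading the field back.
    static String decryptField(SecretKey key, String stored) throws Exception {
        byte[] in = Base64.getDecoder().decode(stored);
        Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
        c.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(in, 0, 16));
        return new String(c.doFinal(in, 16, in.length - 16), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        String cipherText = encryptField(key, "secret value");
        System.out.println(decryptField(key, cipherText));
    }
}
```

The trade-off is significant: an encrypted field is opaque to the index, so it cannot be meaningfully sorted, faceted, or searched server-side — which is exactly the functionality the directory-level encryption in this issue was trying to preserve.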

Original comment by sudheerp...@gmail.com on 23 Apr 2012 at 4:00