Stratio / cassandra-lucene-index

Lucene based secondary indexes for Cassandra
Apache License 2.0

Invalid Request - Error parsing long with timestamp pattern #219

Closed Merkaban closed 8 years ago

Merkaban commented 8 years ago

Hello everybody, we just updated our Cassandra from 2.2.6 to 3.9 and therefore updated the cassandra-lucene-index to 3.9.0. The problem we ran into occurs no matter whether we update the versions on a running system with existing data or deploy a new system from scratch on our test servers:

Query:

SELECT cid, state, gwid, start, end FROM foobar.stateseries WHERE cid = '1' AND state = 'up' AND stratio_col = '{ filter : { type : "date_range", field:"duration", from : 6277910400000, to : 7193059200000} }';

Leads to exception:

Error parsing Long with value '1475321693728' using date pattern yyyy/MM/dd HH:mm:ss.SSS Z
com.datastax.driver.core.exceptions.InvalidQueryException: Error parsing Long with value '1475321693728' using date pattern yyyy/MM/dd HH:mm:ss.SSS Z
    at com.datastax.driver.core.exceptions.InvalidQueryException.copy(InvalidQueryException.java:50)
    at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:37)
    at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:245)
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:68)
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:51)
    at com.diehl.dcs.stratus.service_database.services.impl.DatabaseServiceImpl.retrieveStateSeriesByCustomer(DatabaseServiceImpl.java:1138)
    at com.diehl.dcs.stratus.service_database.services.impl.StateSeriesServiceImpl.retrievePerMonthByCustomerId(StateSeriesServiceImpl.java:125)
    at com.diehl.dcs.stratus.service_database.receiver.ReceiverMq.process(ReceiverMq.java:624)
    at com.diehl.dcs.stratus.service_database.receiver.ReceiverMq.receive(ReceiverMq.java:530)
    at sun.reflect.GeneratedMethodAccessor66.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.springframework.util.MethodInvoker.invoke(MethodInvoker.java:269)
    at org.springframework.amqp.rabbit.listener.adapter.MessageListenerAdapter.invokeListenerMethod(MessageListenerAdapter.java:387)
    at org.springframework.amqp.rabbit.listener.adapter.MessageListenerAdapter.onMessage(MessageListenerAdapter.java:298)
    at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.doInvokeListener(AbstractMessageListenerContainer.java:757)
    at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.invokeListener(AbstractMessageListenerContainer.java:680)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.access$001(SimpleMessageListenerContainer.java:93)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer$1.invokeListener(SimpleMessageListenerContainer.java:183)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.invokeListener(SimpleMessageListenerContainer.java:1358)
    at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.executeListener(AbstractMessageListenerContainer.java:661)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.doReceiveAndExecute(SimpleMessageListenerContainer.java:1102)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.receiveAndExecute(SimpleMessageListenerContainer.java:1086)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.access$1100(SimpleMessageListenerContainer.java:93)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer$AsyncMessageProcessingConsumer.run(SimpleMessageListenerContainer.java:1203)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: Error parsing Long with value '1475321693728' using date pattern yyyy/MM/dd HH:mm:ss.SSS Z
    at com.datastax.driver.core.Responses$Error.asException(Responses.java:136)
    at com.datastax.driver.core.DefaultResultSetFuture.onSet(DefaultResultSetFuture.java:179)
    at com.datastax.driver.core.RequestHandler.setFinalResult(RequestHandler.java:177)
    at com.datastax.driver.core.RequestHandler.access$2500(RequestHandler.java:46)
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.setFinalResult(RequestHandler.java:797)
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.onSet(RequestHandler.java:631)
    at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1068)
    at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:991)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:328)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:321)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:328)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:321)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:328)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:321)
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:328)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:321)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1280)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:328)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:890)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:564)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:505)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:419)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:391)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:112)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:145)
    ... 1 common frames omitted

If I use cqlsh to run the query, the result is:

InvalidRequest: Error from server: code=2200 [Invalid query] message="Error parsing Long with value '6277910400000' using date pattern yyyy/MM/dd HH:mm:ss.SSS Z"

Additional information:

describe table stateSeries:

CREATE TABLE stratus.stateseries (
    cid text,
    state text,
    gwid text,
    start timestamp,
    end timestamp,
    stratio_col text,
    PRIMARY KEY ((cid, state), gwid, start)
) WITH CLUSTERING ORDER BY (gwid ASC, start ASC)
    AND bloom_filter_fp_chance = 0.01
    AND caching = {'keys': 'ALL', 'rows_per_partition': 'NONE'}
    AND comment = ''
    AND compaction = {'class': 'org.apache.cassandra.db.compaction.SizeTieredCompactionStrategy', 'max_threshold': '32', 'min_threshold': '4'}
    AND compression = {'chunk_length_in_kb': '64', 'class': 'org.apache.cassandra.io.compress.LZ4Compressor'}
    AND crc_check_chance = 1.0
    AND dclocal_read_repair_chance = 0.1
    AND default_time_to_live = 0
    AND gc_grace_seconds = 864000
    AND max_index_interval = 2048
    AND memtable_flush_period_in_ms = 0
    AND min_index_interval = 128
    AND read_repair_chance = 0.0
    AND speculative_retry = '99PERCENTILE';
CREATE CUSTOM INDEX status_index ON stratus.stateseries (stratio_col)
USING 'com.stratio.cassandra.lucene.Index'
WITH OPTIONS = {
    'refresh_seconds': '60',
    'max_merge_mb': '5',
    'ram_buffer_mb': '64',
    'max_cached_mb': '30',
    'schema': '{
        default_analyzer : "english",
        fields : {
            customerid    : {type : "string"},
            businessstate : {type : "string"},
            gatewayid     : {type : "string"},
            start         : {type : "date", pattern: "yyyy-mm-dd HH:mm:ssZ"},
            end           : {type : "date", pattern: "yyyy-mm-dd HH:mm:ssZ"},
            duration      : {type : "date_range", from : "start", to : "end"}
        }
    }'
};

Please note that this was already up and running successfully with the former versions (C* 2.2.6 and the corresponding lucene-plugin version).

Any help would be appreciated. Thanks

Merkaban commented 8 years ago

Just an update regarding this issue: everything works fine if I use a date string with the correct pattern. The problems only occur if I try to use longs (epoch time) within the query. I hope this is useful...
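For reference, a minimal sketch of that workaround in Java (not the reporter's actual code: the class and method names are made up, a com.datastax.driver.core.Session is assumed to be set up elsewhere, and the pattern used is the plugin default shown in the error message):

import java.text.SimpleDateFormat;
import java.util.Date;
import com.datastax.driver.core.Session;

public class DateRangeQueryExample {

    // Formats epoch milliseconds with the date pattern reported in the error
    // message and embeds the resulting strings in the Lucene date_range search.
    static void queryByDuration(Session session, long fromMillis, long toMillis) {
        SimpleDateFormat format = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss.SSS Z");
        String from = format.format(new Date(fromMillis));
        String to = format.format(new Date(toMillis));
        String cql = "SELECT cid, state, gwid, start, end FROM foobar.stateseries"
                + " WHERE cid = '1' AND state = 'up' AND stratio_col = '{ filter : {"
                + " type : \"date_range\", field : \"duration\","
                + " from : \"" + from + "\", to : \"" + to + "\" } }'";
        session.execute(cql);
    }
}

The only change from the failing query is that the from/to bounds are passed as formatted date strings instead of raw epoch milliseconds.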

adelapena commented 8 years ago

Hi,

Parsing UNIX timestamps as dates was an undocumented and problematic feature that was removed in 3.0.7.2.

The problem with parsing UNIX timestamps was that it produced ambiguities with some date patterns. For example, given a pattern such as "yyMMdd", a numeric value such as 161127 can be interpreted both as a timestamp and as a text date satisfying the pattern.

That was the reason to move to a pure date pattern approach.
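As a hedged illustration (plain Java, not code from the plugin), here are the two readings of the same value:

import java.text.SimpleDateFormat;
import java.util.Date;

public class DatePatternAmbiguity {

    public static void main(String[] args) throws Exception {
        String value = "161127";
        // Reading 1: a text date matching the pattern "yyMMdd" -> 2016-11-27.
        Date asPatternDate = new SimpleDateFormat("yyMMdd").parse(value);
        // Reading 2: a UNIX timestamp in milliseconds -> a few minutes after
        // 1970-01-01T00:00:00Z.
        Date asTimestamp = new Date(Long.parseLong(value));
        System.out.println("as pattern date: " + asPatternDate);
        System.out.println("as timestamp:    " + asTimestamp);
    }
}

The value parses cleanly under both interpretations, so without extra information the index cannot tell which one was intended.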

Merkaban commented 8 years ago

Hi adelapena, thank you very much for your help! That was fast and is much appreciated. I will continue using date strings. Regards, Klaus