jprante / elasticsearch-jdbc

JDBC importer for Elasticsearch

Can't see river state #658

tienkhoi opened this issue 9 years ago (status: Open)

tienkhoi commented 9 years ago

Hi all,

I'm trying to see the state of the rivers by accessing

http://localhost:9200/_river/jdbc/*/_state?pretty and http://localhost:9200/_river/jdbc/service/_state?pretty (My river is called service)

But I get this result:

No handler found for uri [/_river/jdbc/*/_state?pretty] and method [GET]

And

No handler found for uri [/_river/jdbc/service/_state?pretty] and method [GET]

jprante commented 9 years ago

The river version is not supported in recent releases.

tienkhoi commented 9 years ago

@jprante Thanks for your quick reply.

Is there any way I can check the last time it indexed the SQL table?

I'm trying to make it fetch from the database every minute.

This is my creation request:

{
   "type":"jdbc",
   "strategy":"simple",
   "schedule":"0 0/1 * 1/1 * ? *",
   "jdbc":{
      "driver":"com.microsoft.sqlserver.jdbc.SQLServerDriver",
      "url":"jdbc:sqlserver://192.168.1.12;databaseName=maindatabase",
      "user":"username",
      "password":"password",
      "sql":"select  * from  emailtable",
      "index":"serviceweb",
      "type":"caseemail",
      "autocommit":true,
   }
}

However, it only runs the first time and never repeats.

jprante commented 9 years ago

Which version is this?

Please put all JDBC importer parameters inside the jdbc structure or they will be ignored.
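
For reference, a rough sketch of the definition above with strategy and schedule moved inside the jdbc structure as described (same values as in the example, untested):

{
   "type":"jdbc",
   "jdbc":{
      "strategy":"simple",
      "schedule":"0 0/1 * 1/1 * ? *",
      "driver":"com.microsoft.sqlserver.jdbc.SQLServerDriver",
      "url":"jdbc:sqlserver://192.168.1.12;databaseName=maindatabase",
      "user":"username",
      "password":"password",
      "sql":"select * from emailtable",
      "index":"serviceweb",
      "type":"caseemail",
      "autocommit":true
   }
}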

If you don't index with a _timestamp field, you cannot find the last time a document was indexed.
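
For example (a rough sketch, assuming an Elasticsearch 1.x/2.x cluster where the _timestamp meta field is still available, and reusing the serviceweb/caseemail names and localhost:9200 from this thread), you could enable and store _timestamp in the mapping and then ask for the most recently indexed document:

curl -XPUT 'localhost:9200/serviceweb/_mapping/caseemail' -d '{
  "caseemail": {
    "_timestamp": { "enabled": true, "store": true }
  }
}'

curl -XGET 'localhost:9200/serviceweb/caseemail/_search?pretty' -d '{
  "size": 1,
  "sort": [ { "_timestamp": { "order": "desc" } } ],
  "fields": [ "_timestamp" ]
}'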

tienkhoi commented 9 years ago

Oops! It works now. I didn't realize it's no longer a plugin. I haven't come back to this repo for a couple of months.

Btw, is there any way I can track the SQL queries?

I'm using "select incremental data" method and it response with no rows every time it fetches.

Thank you !

feaster83 commented 9 years ago

You can see the executed statements in the log file or console (depending on your logger configuration). The SQL statements are logged at level DEBUG, so set the output level to DEBUG or TRACE.

To enable logging, you have to change your local log4j2.xml configuration and make sure the file is referenced in the feeder start script:

java -cp "${lib}/*" \
       -Dlog4j.configurationFile=${bin}/log4j2.xml \
       org.xbib.tools.Runner \
       org.xbib.tools.JDBCImporter

Example configuration that writes the statements to the log file "logs/jdbc.log" (root logger level changed from INFO to DEBUG):

<?xml version="1.0" encoding="UTF-8"?>
<configuration status="OFF">
    <appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="[%d{ABSOLUTE}][%-5p][%-25c][%t] %m%n"/>
        </Console>
        <File name="File" fileName="logs/jdbc.log" immediateFlush="true"  append="true">
            <PatternLayout pattern="[%d{ABSOLUTE}][%-5p][%-25c][%t] %m%n"/>
        </File>
    </appenders>
    <Loggers>
        <Root level="debug"> <!--- Changed from INFO to DEBUG!! -->
            <AppenderRef ref="File" />
        </Root>
        <!-- set this level to trace to debug SQL value mapping -->
        <Logger name="importer.jdbc.source.standard" level="info">
            <appender-ref ref="Console"/>
        </Logger>
        <Logger name="metrics.source.plain" level="info">
            <appender-ref ref="Console"/>
        </Logger>
        <Logger name="metrics.sink.plain" level="info">
            <appender-ref ref="Console"/>
        </Logger>
        <Logger name="metrics.source.json" level="info">
            <appender-ref ref="Console"/>
        </Logger>
        <Logger name="metrics.sink.json" level="info">
            <appender-ref ref="Console"/>
        </Logger>
    </Loggers>
</configuration>
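
If you prefer to keep the root logger at INFO, it should also be enough to raise only the SQL source logger already declared in the config above to DEBUG, for example (untested sketch; additivity="false" just avoids duplicate lines via the root logger):

<Logger name="importer.jdbc.source.standard" level="debug" additivity="false">
    <AppenderRef ref="File"/>
</Logger>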