elastic / logstash

Logstash - transport and process your logs, events, or other data
https://www.elastic.co/products/logstash

Logstash just gives me two logs #2630

Closed: moonlitdelight closed this issue 9 years ago

moonlitdelight commented 9 years ago

    Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones {:level=>:warn}
    Using milestone 2 filter plugin 'csv'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones {:level=>:warn}

My configuration:

    input {
      file {
        path => [ "e:\mycsvfile.csv" ]
        start_position => "beginning"
      }
    }
    filter {
      csv {
        columns => ["col1", "col2"]
        source => "csv_data"
        separator => ","
      }
    }

    output {
      elasticsearch {
        host => localhost
        port => 9200
        index => test
        index_type => test_type
        protocol => http
      }
      stdout {
        codec => rubydebug
      }
    }

My environment: Windows 8, Logstash 1.4.2

magnusbaeck commented 9 years ago

I suspect you have to specify the file path with forward slashes. Try starting Logstash with --verbose to get more logs; it will tell you if it has problems opening files.
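
For illustration only (not verbatim from the thread), the suggestion amounts to rewriting the input block with forward slashes in the path:

    input {
      file {
        # same file as in the original config, just with forward slashes
        path => [ "e:/mycsvfile.csv" ]
        start_position => "beginning"
      }
    }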

moonlitdelight commented 9 years ago

Thank you for your response. I tried:

    logstash.bat agent -f test.conf --verbose

    Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones {:level=>:warn}
    Using milestone 2 filter plugin 'csv'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones {:level=>:warn}
    Registering file input {:path=>["e:/temp.csv"], :level=>:info}
    No sincedb_path set, generating one based on the file path {:sincedb_path=>"C:\Users\gemini/.sincedb_d8e46c18292a898ea0b5b1cd94987f21", :path=>["e:/temp.csv"], :level=>:info}
    Pipeline started {:level=>:info}
    New Elasticsearch output {:cluster=>nil, :host=>"localhost", :port=>9200, :embedded=>false, :protocol=>"http", :level=>:info}
    Automatic template management enabled {:manage_template=>"true", :level=>:info}
    Using mapping template {:template=>"{ \"template\" : \"logstash-*\", \"settings\" : { \"index.refresh_interval\" : \"5s\" }, \"mappings\" : { \"_default_\" : { \"_all\" : {\"enabled\" : true}, \"dynamic_templates\" : [ { \"string_fields\" : { \"match\" : \"*\", \"match_mapping_type\" : \"string\", \"mapping\" : { \"type\" : \"string\", \"index\" : \"analyzed\", \"omit_norms\" : true, \"fields\" : { \"raw\" : {\"type\": \"string\", \"index\" : \"not_analyzed\", \"ignore_above\" : 256} } } } } ], \"properties\" : { \"@version\": { \"type\": \"string\", \"index\": \"not_analyzed\" }, \"geoip\" : { \"type\" : \"object\", \"dynamic\": true, \"path\": \"full\", \"properties\" : { \"location\" : { \"type\" : \"geo_point\" } } } } } }", :level=>:info}

It stays like this for a while and no new index is created in Elasticsearch.

moonlitdelight commented 9 years ago

I even tried specifying the cluster. I suspect that it is an issue between Logstash and Elasticsearch. Any ideas would be helpful. Thank you in advance.
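
(A hedged aside, not from the original thread: a quick sanity check at this point would be to confirm Elasticsearch is reachable on the configured host and port and to see whether any index has appeared. The commands below assume Elasticsearch is running on localhost:9200.)

    curl http://localhost:9200/                    # should return cluster name and version info
    curl "http://localhost:9200/_cat/indices?v"    # lists existing indices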

moonlitdelight commented 9 years ago

I had to add:

sincedb_path => "NIL"

and it worked.

http://logstash.net/docs/1.1.0/inputs/file#setting_sincedb_path

sincedb_path
Value type is string.
There is no default value for this setting.
Where to write the since database (keeps track of the current position of monitored log files). Defaults to the value of environment variable "$SINCEDB_PATH" or "$HOME/.sincedb".

I've had several sincedb files generated in my C:\Users\{user} directory.

Thank you.
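
(A hedged sketch, not taken from the thread: instead of relying on the $HOME default, sincedb_path can point at an explicit file so it is easy to find and delete. The path below is invented for the example.)

    input {
      file {
        path => [ "e:\mycsvfile.csv" ]
        start_position => "beginning"
        # hypothetical explicit location; keeps the sincedb file out of $HOME
        sincedb_path => "e:/logstash/mycsvfile.sincedb"
      }
    }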

moonlitdelight commented 9 years ago

So, it happened again. It turned out that when I added "NIL", Logstash generated a file named NIL in my bin directory, since that is where my configuration file is located. I guess the right approach would be to delete the generated sincedb* files whenever we want to reset Logstash.

In my case, I will remove sincedb_path => "NIL" and let it default to my $HOME path. Whenever I want to do a reset, I will just delete the sincedb file generated under $HOME.
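
(An aside not stated in the thread: the value "NIL" is treated as an ordinary file name, which is why a file called NIL appeared next to the config. A commonly documented alternative on Windows is to point sincedb_path at the null device, so nothing is persisted and the file is re-read from start_position on every run; a sketch assuming the same input as above.)

    input {
      file {
        path => [ "e:\mycsvfile.csv" ]
        start_position => "beginning"
        # "NUL" is the Windows null device ("/dev/null" on Unix-like systems);
        # no sincedb is written, so the file is re-read from start_position each run
        sincedb_path => "NUL"
      }
    }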