deviantony / docker-elk

The Elastic stack (ELK) powered by Docker and Compose.
MIT License
17.08k stars 6.74k forks

Input file in logstash #25

Closed disastrous-charly closed 8 years ago

disastrous-charly commented 8 years ago

Hello,

I'm trying to use your project with an input of type "file" in Logstash, instead of sending logs over TCP. I mount my log folder on my Logstash container and I can see the files inside the container, no problem.

But in Kibana, I can't create an index, so I guess Logstash doesn't push my log files to Elasticsearch. Do you know if it comes from your configuration?

Your work is really awesome BTW!

deviantony commented 8 years ago

My first thought was that maybe Elasticsearch is not ready yet when Logstash starts indexing.

But I've just tried it on my workstation at work and I'm having trouble inspecting the Docker containers using docker exec (I can't see the mounted Logstash configuration nor the mounted log files).

I'll try again on my personal laptop.

deviantony commented 8 years ago

I made it!

In order to use the file input in Logstash, you'll have to update the start_position option.

The default value is "end", which means the Logstash container will only tail new lines appended to the log file; see https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html#plugins-inputs-file-start_position

I've used the following container definition for logstash:

logstash:
  image: logstash:latest
  command: logstash -f /etc/logstash/conf.d/logstash.conf
  volumes:
    - ./logstash/config:/etc/logstash/conf.d
    - /tmp/logstash/:/logstash
  ports:
    - "5000:5000"
  links:
    - elasticsearch

And the following logstash configuration:

input {
    file {
        path => "/logstash/test.log"
        start_position => "beginning"
    }
}

output {
    elasticsearch {
        hosts => "elasticsearch:9200"
    }
}

With a log file at /tmp/logstash/test.log, it works perfectly!
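One gotcha worth noting: start_position => "beginning" only applies the first time Logstash sees a file. After that, the offset recorded in the sincedb file wins, so restarting the stack will not re-read the file. A sketch of the same input tweaked for testing (sincedb_path => "/dev/null" is the standard trick for forcing a full re-read on every start):

```conf
input {
    file {
        path => "/logstash/test.log"
        start_position => "beginning"
        # discard the recorded offset so every restart re-reads the whole file
        sincedb_path => "/dev/null"
    }
}
```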

deviantony commented 8 years ago

@disastrous-charly any news?

disastrous-charly commented 8 years ago

Hey, sorry, I forgot to respond.

Thanks for your solution. It's still not working for me... Maybe an issue with my input file? I don't have time to dig into the subject right now, but I'll keep an eye on your work!

deviantony commented 8 years ago

Ok, feel free to reopen the issue!

phocean commented 8 years ago

Hi, I have the same issue. After many attempts, I have no idea why it is not parsing any file...

Could you reopen this issue?

deviantony commented 8 years ago

@phocean could you give me more information? OS, docker/compose versions, and share your configuration please.

phocean commented 8 years ago

hello @deviantony ,

Sorry for the disturbance, I found my problem. The mounted folder (not the files) did not grant read access to "others" (hence, to the Logstash account), so Logstash could not read them.

Basic stuff that I should have found before coming here. Sorry. And thank you for your reply.
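For anyone hitting the same wall: the fix is to give "others" execute (traverse) access on every directory in the mounted path and read access on the file itself. A minimal sketch of the problem and the fix (the /tmp/logdemo path is made up for the demo):

```shell
# Simulate the problem: a directory the container user cannot traverse
mkdir -p /tmp/logdemo && echo "a log line" > /tmp/logdemo/app.log
chmod 750 /tmp/logdemo           # no permissions at all for "others"

# The fix: o+rx on the directory so any user can list and traverse it,
# o+r on the file so the Logstash user can read it
chmod o+rx /tmp/logdemo
chmod o+r  /tmp/logdemo/app.log

stat -c '%a' /tmp/logdemo        # prints 755
```

Every parent directory on the way to the mounted file needs that execute bit, not just the last one.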

deviantony commented 8 years ago

Glad you solved it :)

mrdotkg commented 6 years ago

How did you solve the access issue? I am stuck with the same.

gp15237125756 commented 5 years ago

I have successfully solved the problem after a few hours! Here is my docker-compose.yml:

version: '2'

services:
  elasticsearch:
    build:
      context: elasticsearch/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - ./elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml:ro
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk

  logstash:
    build:
      context: logstash/
      args:
        ELK_VERSION: $ELK_VERSION
    command: logstash -f /etc/logstash/conf.d/logstash.conf
    volumes:
      - ./logstash/config/logstash.conf:/etc/logstash/conf.d/logstash.conf:ro
      - /home/tms/tomcat/apache-tomcat-7.0.81/logs/schedule/web-gateway-controller.log:/tomcat/web-gateway-controller.log:ro
      - /home/tms/service/service-waybill/logs/schedule-service-waybill.log:/service/service-waybill.log:ro
      - /home/tms/service/service-system/logs/schedule-service-system.log:/service/service-system.log:ro
    ports:
      - "5000:5000"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk
    links:
      - elasticsearch
    depends_on:
      - elasticsearch

  kibana:
    build:
      context: kibana/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - ./kibana/config/:/usr/share/kibana/config:ro
    ports:
      - "5601:5601"
    networks:
      - elk
    depends_on:
      - elasticsearch
networks:
  elk:
    driver: bridge

Please note the volumes configuration: the path on the left side of the colon is the host path, and the path on the right side is where the file is mounted inside the container. The :ro suffix at the end mounts it read-only inside the container. I was stuck here for a few hours. Then here is my logstash.conf:

input {
    #tcp {
    #   port => 5000
    #}
    file {
        path => ["/tomcat/web-gateway-controller.log"]
        type => "web-gateway-controller"
        start_position => "beginning"
    }
}

output {
    elasticsearch {
        hosts => "elasticsearch:9200"
    }
}

Also copy logstash.conf to the /usr/local/bin/docker-elk-master/logstash/config directory. Hope this will help you all!

abhinavgurung commented 5 years ago

Hello, I have a very similar issue. I am trying to read a CSV file using Logstash; here is my logstash config file. There is no data stored in Elasticsearch, and nothing is printed out. I believe it has something to do with the mounting of the data.

input {
    file {
        path => ["/home/abhinavkumar.gurung/Applications/csv/devian/data/Admission_Predict.csv"]
        start_position => "beginning"    # read from the beginning of the file
        sincedb_path => "/dev/null"
        codec => plain {
            charset => "UTF-8"
        }
    }
}

filter {
    csv {
        columns => ["Serial No","GRE Score","TOEFL Score","University Rating","SOP","LOR","CGPA","Research","Chance of Admit"]
        separator => ","
    }

    mutate {
        convert => {
            "Serial No" => "integer"
            "GRE Score" => "integer"
            "TOEFL Score" => "integer"
            "University Rating" => "integer"
            "SOP" => "float"
            "LOR" => "float"
            "CGPA" => "float"
            "Research" => "integer"
            "Chance of Admit" => "float"
        }
    }
}

output {
    elasticsearch {
        action => "index"
        hosts => ["elasticsearch:9200"]
        document_type => "_doc"
        user => elastic
        password => changeme
        index => "cars"
    }
    stdout {
        codec => rubydebug
    }
}
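To sanity-check the filter itself, it can help to reproduce the parsing outside Logstash. A quick Python sketch of what the csv filter plus mutate/convert does to one row (column names are taken from the config above; the sample row is made up):

```python
import csv
import io

# Column names from the csv filter above; CONVERT mirrors the mutate/convert map
COLUMNS = ["Serial No", "GRE Score", "TOEFL Score", "University Rating",
           "SOP", "LOR", "CGPA", "Research", "Chance of Admit"]
CONVERT = {"Serial No": int, "GRE Score": int, "TOEFL Score": int,
           "University Rating": int, "SOP": float, "LOR": float,
           "CGPA": float, "Research": int, "Chance of Admit": float}

def parse_line(line):
    """Split one CSV line and cast each field like mutate/convert would."""
    values = next(csv.reader(io.StringIO(line)))
    return {col: CONVERT[col](val) for col, val in zip(COLUMNS, values)}

event = parse_line("1,337,118,4,4.5,4.5,9.65,1,0.92")
print(event["GRE Score"], event["CGPA"])  # 337 9.65
```

If this parses cleanly but nothing shows up in Elasticsearch, the problem is more likely the mount or the sincedb state than the filter.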

abhinavgurung commented 5 years ago

this is my docker-compose:

version: '2'

services:

  elasticsearch:
    build:
      context: elasticsearch/
      args:
        ELK_VERSION: 7.0.1
    volumes:

networks:
  elk:
    driver: bridge

HyuLeX commented 1 month ago

I have Logstash running in a Docker Compose setup containing the basic ELK stack: Elasticsearch, Kibana, and Logstash, which reads an input file from my Windows 11 host. When I run docker compose up -d, Logstash comes up for a few seconds and then goes down. I read the log inside the container and saw this ERROR: ArgumentError: File paths must be absolute, relative path specified: ...\logstash\logstash_ingest_data\access_log.log. Below is my logstash.conf; please tell me where I'm wrong if anyone knows. Thanks!

input {
    file {
        path => ["...\logstash\logstash_ingest_data\access_log.log"]
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter {
    grok {
        match => {"message" => "%{COMBINEDAPACHELOG}"}
    }
    date {
        match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
    }
}

output {
    elasticsearch {
        hosts => ["http://elasticsearch:9200"]
    }
    stdout {
        codec => rubydebug
    }
}
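The error points at the path option: Logstash runs inside a Linux container, so it never sees Windows-style paths; path must be the absolute POSIX path of the file as seen inside the container. A sketch of one way to wire it up, assuming the host folder is bind-mounted in docker-compose.yml (the container-side path /usr/share/logstash/ingest_data is my assumption, not something from this repo):

```conf
# docker-compose.yml, logstash service (hypothetical mount):
#   volumes:
#     - ./logstash_ingest_data:/usr/share/logstash/ingest_data:ro

input {
    file {
        # absolute path as seen *inside* the container, not on Windows
        path => ["/usr/share/logstash/ingest_data/access_log.log"]
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}
```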

antoineco commented 1 month ago

@HyuLeX please open a new issue describing your problem, and include all the details requested in the issue template. Thanks!