Closed: pragadeeshraju closed this issue 2 years ago
[2020/04/07 15:00:58] [ warn] net_tcp_fd_connect: getaddrinfo(host='kubernetes.default.svc.cluster.local'): No such host is known.
@pragadeeshraju Your error is probably due to this bug.
In short, Windows pods don't have a reliable network on boot. It takes 10-60 seconds before we can make requests reliably on k8s, and this breaks the initial API connection of filter_kubernetes.
As a workaround, try adding a 60-second sleep at startup. If that solves the issue, that bug is the root cause of your problem.
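For a Windows DaemonSet, one way to add that delay is to wrap the container entrypoint in the pod spec. A rough sketch (the image name and paths here are illustrative, not taken from this issue):

```yaml
containers:
  - name: fluent-bit
    image: fluent/fluent-bit:windows   # illustrative image name
    command:
      - powershell.exe
      - -Command
      # Wait for the pod network to come up before starting Fluent Bit.
      - sleep 60; C:\fluent-bit\bin\fluent-bit.exe -c C:\fluent-bit\conf\fluent-bit.conf
```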
Thanks for your reply @fujimotos,
let me try that now by adding a 60-second sleep.
@fujimotos would you please let me know how we can add the startup sleep on Windows?
I have the same problem; I will have to try adding a delay as you mentioned.
@fujimotos I tried using a delayed start, but I still have the same problem.
fluent-bit-rkbxk 0/1 Init:0/1 0 68s
fluent-bit-rkbxk 1/1 Running 0 79s
I even tried delaying 120 seconds.
Logs
Fluent Bit v1.4.2
* Copyright (C) 2019-2020 The Fluent Bit Authors
* Copyright (C) 2015-2018 Treasure Data
* Fluent Bit is a CNCF sub-project under the umbrella of Fluentd
* https://fluentbit.io
[2020/04/10 08:39:39] [ info] Configuration:
[2020/04/10 08:39:39] [ info] flush time | 1.000000 seconds
[2020/04/10 08:39:39] [ info] grace | 60 seconds
[2020/04/10 08:39:39] [ info] daemon | 0
[2020/04/10 08:39:39] [ info] ___________
[2020/04/10 08:39:39] [ info] inputs:
[2020/04/10 08:39:39] [ info] tail
[2020/04/10 08:39:39] [ info] ___________
[2020/04/10 08:39:39] [ info] filters:
[2020/04/10 08:39:39] [ info] kubernetes.0
[2020/04/10 08:39:39] [ info] ___________
[2020/04/10 08:39:39] [ info] outputs:
[2020/04/10 08:39:39] [ info] es.0
[2020/04/10 08:39:39] [ info] ___________
[2020/04/10 08:39:39] [ info] collectors:
[2020/04/10 08:39:39] [debug] [storage] [cio stream] new stream registered: tail.0
[2020/04/10 08:39:39] [ info] [storage] version=1.0.3, initializing...
[2020/04/10 08:39:39] [ info] [storage] in-memory
[2020/04/10 08:39:39] [ info] [storage] normal synchronization mode, checksum disabled, max_chunks_up=128
[2020/04/10 08:39:39] [ info] [engine] started (pid=10952)
[2020/04/10 08:39:39] [debug] [engine] coroutine stack size: 98302 bytes (96.0K)
[2020/04/10 08:39:39] [debug] [input:tail:tail.0] scanning path C:\\ProgramData\\Docker\\containers\\*\\*.log
[2020/04/10 08:39:39] [error] [sqldb] error=unrecognized token: "562949953460365��"
[2020/04/10 08:39:39] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\51f77a92839b87c3d4806e8fc0f07a6effa8f8c484d869a5787fb3b062b06155\51f77a92839b87c3d4806e8fc0f07a6effa8f8c484d869a5787fb3b062b06155-json.log, offset=0
[2020/04/10 08:39:39] [error] [sqldb] error=unrecognized token: "237001930390711��"
[2020/04/10 08:39:39] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\710d649b86322548cbeee7e1d09d787b280a86ea530390ed9f1b3c42055fcbd6\710d649b86322548cbeee7e1d09d787b280a86ea530390ed9f1b3c42055fcbd6-json.log, offset=0
[2020/04/10 08:39:39] [error] [sqldb] error=unrecognized token: "365917469762065��"
[2020/04/10 08:39:39] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\7fbc003699e6b2568f388dcd7963812f7d1cae210619d2c8e7f7b9aa8927e8b0\7fbc003699e6b2568f388dcd7963812f7d1cae210619d2c8e7f7b9aa8927e8b0-json.log, offset=0
[2020/04/10 08:39:39] [error] [sqldb] error=unrecognized token: "112027040731186��"
[2020/04/10 08:39:39] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\91b8c66256ff2324c637271094053c06cb38c69d8f60fce4e445842931961cbc\91b8c66256ff2324c637271094053c06cb38c69d8f60fce4e445842931961cbc-json.log, offset=0
[2020/04/10 08:39:39] [error] [sqldb] error=unrecognized token: "481322210179104��"
[2020/04/10 08:39:39] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\bcf32a1af43264904708843e1f101ac2018aaacdc968ff4ebb39509412371ff9\bcf32a1af43264904708843e1f101ac2018aaacdc968ff4ebb39509412371ff9-json.log, offset=0
[2020/04/10 08:39:39] [error] [sqldb] error=unrecognized token: "121034239989487��"
[2020/04/10 08:39:39] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\cd9fda41040faf3949d5035a1a47d3a2e8929b080c464045b8d1b80a858a00b3\cd9fda41040faf3949d5035a1a47d3a2e8929b080c464045b8d1b80a858a00b3-json.log, offset=0
[2020/04/10 08:39:39] [error] [sqldb] error=unrecognized token: "731834939486757��"
[2020/04/10 08:39:39] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\d7134094a4cfd999907588678a44124d4257b2c37b1fac5cfa97db8650894515\d7134094a4cfd999907588678a44124d4257b2c37b1fac5cfa97db8650894515-json.log, offset=0
[2020/04/10 08:39:39] [debug] [input:tail:tail.0] 7 files found for 'C:\\ProgramData\\Docker\\containers\\*\\*.log'
[2020/04/10 08:39:39] [ info] [filter:kubernetes:kubernetes.0] https=1 host=kubernetes.default.svc.cluster.local port=443
[2020/04/10 08:39:39] [ info] [filter:kubernetes:kubernetes.0] local POD info OK
[2020/04/10 08:39:39] [ info] [filter:kubernetes:kubernetes.0] testing connectivity with API server...
[2020/04/10 08:39:39] [ warn] net_tcp_fd_connect: getaddrinfo(host='kubernetes.default.svc.cluster.local'): No such host is known.
[2020/04/10 08:39:39] [error] [filter:kubernetes:kubernetes.0] upstream connection error
[2020/04/10 08:39:39] [ warn] [filter:kubernetes:kubernetes.0] could not get meta for POD fluent-bit-zms8m
[2020/04/10 08:39:39] [debug] [output:es:es.0] host=elasticsearch port=9200 uri=/_bulk index=fluent-bit type=flb_type
[2020/04/10 08:39:39] [debug] [router] match rule tail.0:es.0
[2020/04/10 08:39:39] [ info] [sp] stream processor started
[2020/04/10 08:39:39] [debug] [input:tail:tail.0] file=C:\ProgramData\Docker\containers\51f77a92839b87c3d4806e8fc0f07a6effa8f8c484d869a5787fb3b062b06155\51f77a92839b87c3d4806e8fc0f07a6effa8f8c484d869a5787fb3b062b06155-json.log promote to TAIL_EVENT
[2020/04/10 08:39:39] [ warn] [filter:kubernetes:kubernetes.0] invalid pattern for given tag kube.C:\ProgramData\Docker\containers\710d649b86322548cbeee7e1d09d787b280a86ea530390ed9f1b3c42055fcbd6\710d649b86322548cbeee7e1d09d787b280a86ea530390ed9f1b3c42055fcbd6-json.log
Is there something else I did wrong? I have used the same conf mentioned above.
I feel that's not due to the bug.
@pragadeeshraju Can you resolve the hostname when you log in to the VM?
PS C:\var\log\containers> docker exec -it fluent-bit-rkbxk powershell
PS C:\> ping kubernetes.default.svc.cluster.local
... or you can test the connectivity with:
containers:
  - command:
      - powershell.exe
      - -Command
      - sleep 60; ping kubernetes.default.svc.cluster.local
Since what Fluent Bit does here is call plain getaddrinfo() to resolve the given host name, it should work fine as long as the network is working.
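Fluent Bit's connectivity probe here boils down to that single getaddrinfo() call. A minimal Python sketch of the equivalent check (the function name is mine, not Fluent Bit's):

```python
import socket

def can_resolve(host, port=443):
    """Return True if `host` resolves, mimicking a bare getaddrinfo() call."""
    try:
        socket.getaddrinfo(host, port)
        return True
    except socket.gaierror:
        return False

# "localhost" always resolves; the cluster-internal name only resolves
# from inside a pod whose DNS is already up.
print(can_resolve("localhost"))
```

Running the same check against kubernetes.default.svc.cluster.local from inside the pod should mirror what the filter sees at startup.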
@fujimotos I tried pinging; it seems like it's not reachable.
k exec -it fluent-bit-rhsk6 powershell -n logging
Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.
PS C:\> ping kubernetes.default.svc.cluster.local
Pinging kubernetes.default.svc.cluster.local [10.0.0.1] with 32 bytes of data:
Request timed out.
Request timed out.
Request timed out.
Request timed out.
Ping statistics for 10.0.0.1:
Packets: Sent = 4, Received = 0, Lost = 4 (100% loss),
@pragadeeshraju This suggests that there is something wrong with your k8s network settings (rather than Fluent Bit).
You need to debug the network configuration of your pod first. Here is the relevant documentation from kubernetes.io.
https://kubernetes.io/docs/tasks/administer-cluster/dns-debugging-resolution/
@fujimotos but on Linux nodes it's still working fine, even though I couldn't ping the same host there either.
@pragadeeshraju While re-reading your comment, I just noticed that the ping requests on your Windows pods were resolving the host name successfully.
Look at the second line here:
PS C:\> ping kubernetes.default.svc.cluster.local
Pinging kubernetes.default.svc.cluster.local [10.0.0.1] with 32 bytes of data:
It was just 10.0.0.1 not responding to ICMP ping requests, so DNS was working properly when you executed the ping command.
What happens if you log into the pod and launch Fluent Bit manually?
PS> C:\path\to\fluent-bit.exe -c C:\path\to\fluent-bit.conf
(Tweak the executable and config paths to match your pod.)
@fujimotos here is what's happening.
PS C:\> c:\\fluent-bit\\bin\\fluent-bit.exe -c C:\\fluent-bit\\conf\\fluent-bit.conf
Fluent Bit v1.4.2
* Copyright (C) 2019-2020 The Fluent Bit Authors
* Copyright (C) 2015-2018 Treasure Data
* Fluent Bit is a CNCF sub-project under the umbrella of Fluentd
* https://fluentbit.io
[2020/04/14 15:26:02] [ info] Configuration:
[2020/04/14 15:26:02] [ info] flush time | 1.000000 seconds
[2020/04/14 15:26:02] [ info] grace | 5 seconds
[2020/04/14 15:26:02] [ info] daemon | 0
[2020/04/14 15:26:02] [ info] ___________
[2020/04/14 15:26:02] [ info] inputs:
[2020/04/14 15:26:02] [ info] tail
[2020/04/14 15:26:02] [ info] ___________
[2020/04/14 15:26:02] [ info] filters:
[2020/04/14 15:26:02] [ info] kubernetes.0
[2020/04/14 15:26:02] [ info] ___________
[2020/04/14 15:26:02] [ info] outputs:
[2020/04/14 15:26:02] [ info] es.0
[2020/04/14 15:26:02] [ info] ___________
[2020/04/14 15:26:02] [ info] collectors:
[2020/04/14 15:26:02] [debug] [storage] [cio stream] new stream registered: tail.0
[2020/04/14 15:26:02] [ info] [storage] version=1.0.3, initializing...
[2020/04/14 15:26:02] [ info] [storage] in-memory
[2020/04/14 15:26:02] [ info] [storage] normal synchronization mode, checksum disabled, max_chunks_up=128
[2020/04/14 15:26:02] [ info] [engine] started (pid=4736)
[2020/04/14 15:26:02] [debug] [engine] coroutine stack size: 98302 bytes (96.0K)
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] scanning path C:\\ProgramData\\Docker\\containers\\*\\*.log
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\0037a3aa958bd24ac41792d04b001e307e3630baddcd9cfd5b9185216f1f77c6\0037a3aa958bd24ac41792d04b001e307e3630baddcd9cfd5b9185216f1f77c6-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\0446ed93479c23552c7d4adba9ecd6137bd857359a6e23d4017995306b7225a5\0446ed93479c23552c7d4adba9ecd6137bd857359a6e23d4017995306b7225a5-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\098bfc1d806d00fc72e9f37e1a4a650fd68a95d3864b2483b03ccdfb32f6a3f5\098bfc1d806d00fc72e9f37e1a4a650fd68a95d3864b2483b03ccdfb32f6a3f5-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\09e37692583ca2ef4c27e3dfca8c247d36f44c16b654af00ae4cef904dbf23a4\09e37692583ca2ef4c27e3dfca8c247d36f44c16b654af00ae4cef904dbf23a4-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\0b3ec9eacc26087cea76ea6ccf3d461e79afbaa2c653b54772787fd11fbbd963\0b3ec9eacc26087cea76ea6ccf3d461e79afbaa2c653b54772787fd11fbbd963-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\349f9c91f0090f080ed747b161f5edf4d937a2661db14db5f4ba6c2dacf8762f\349f9c91f0090f080ed747b161f5edf4d937a2661db14db5f4ba6c2dacf8762f-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\6a49cb9707ca53fd138f39ec84911f4dba4cc95f8aec0ffb4d9d7b3352b8f750\6a49cb9707ca53fd138f39ec84911f4dba4cc95f8aec0ffb4d9d7b3352b8f750-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\710d649b86322548cbeee7e1d09d787b280a86ea530390ed9f1b3c42055fcbd6\710d649b86322548cbeee7e1d09d787b280a86ea530390ed9f1b3c42055fcbd6-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\7fbc003699e6b2568f388dcd7963812f7d1cae210619d2c8e7f7b9aa8927e8b0\7fbc003699e6b2568f388dcd7963812f7d1cae210619d2c8e7f7b9aa8927e8b0-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\a1b2ca7321b2b902006c848bb3dd00f0c69f4231f58bb6f31ccdaf7235aeca28\a1b2ca7321b2b902006c848bb3dd00f0c69f4231f58bb6f31ccdaf7235aeca28-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\aaa3ff855ba59c28b05c682ff156df7b640abc933b21c99014096dbed2206a05\aaa3ff855ba59c28b05c682ff156df7b640abc933b21c99014096dbed2206a05-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\b2bdf38fb6590232ad4275c1d7eb7b72c0a56b43fe2460b86b7836206acd780a\b2bdf38fb6590232ad4275c1d7eb7b72c0a56b43fe2460b86b7836206acd780a-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\b8fa96e4956fa925115705814ff1ecb818fae3b57eee7c16f81dce2a28463d38\b8fa96e4956fa925115705814ff1ecb818fae3b57eee7c16f81dce2a28463d38-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\bcf32a1af43264904708843e1f101ac2018aaacdc968ff4ebb39509412371ff9\bcf32a1af43264904708843e1f101ac2018aaacdc968ff4ebb39509412371ff9-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\bd2bb2d76d55578d214ab5e2251796c9e28bdb9e3edbb6fb2ed674254d96407f\bd2bb2d76d55578d214ab5e2251796c9e28bdb9e3edbb6fb2ed674254d96407f-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\d2c5e0dbd4d16c460d81d049eaed3b0934700a427623a734b4b93b6ac03ba2b5\d2c5e0dbd4d16c460d81d049eaed3b0934700a427623a734b4b93b6ac03ba2b5-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\d9a1f2b9023b6ad00e301faf3984e40a8bafe3c2ea6e08c1a9392aad06dada3d\d9a1f2b9023b6ad00e301faf3984e40a8bafe3c2ea6e08c1a9392aad06dada3d-json.log, offset=0
[2020/04/14 15:26:02] [error] [sqldb] error=near "<": syntax error
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\edd3709132ba1fb0b0a5163f8f5882610d8cccdf2e4302d3746461c4451a448b\edd3709132ba1fb0b0a5163f8f5882610d8cccdf2e4302d3746461c4451a448b-json.log, offset=0
[2020/04/14 15:26:02] [debug] [input:tail:tail.0] 18 files found for 'C:\\ProgramData\\Docker\\containers\\*\\*.log'
[2020/04/14 15:26:02] [ info] [filter:kubernetes:kubernetes.0] https=1 host=kubernetes.default.svc.cluster.local port=443
[2020/04/14 15:26:02] [ info] [filter:kubernetes:kubernetes.0] local POD info OK
[2020/04/14 15:26:02] [ info] [filter:kubernetes:kubernetes.0] testing connectivity with API server...
[2020/04/14 15:26:03] [debug] [filter:kubernetes:kubernetes.0] API Server (ns=logging, pod=fluent-bit-rhsk6) http_do=0, HTTP Status: 200
[2020/04/14 15:26:03] [ info] [filter:kubernetes:kubernetes.0] API server connectivity OK
[2020/04/14 15:26:03] [debug] [output:es:es.0] host=elasticsearch port=9200 uri=/_bulk index=fluent-bit type=flb_type
[2020/04/14 15:26:03] [debug] [router] match rule tail.0:es.0
[2020/04/14 15:26:03] [ info] [sp] stream processor started
[2020/04/14 15:26:03] [ warn] [filter:kubernetes:kubernetes.0] invalid pattern for given tag kube.C:\ProgramData\Docker\containers\0037a3aa958bd24ac41792d04b001e307e3630baddcd9cfd5b9185216f1f77c6\0037a3aa958bd24ac41792d04b001e307e3630baddcd9cfd5b9185216f1f77c6-json.log
[2020/04/14 15:26:03] [debug] [input:tail:tail.0] file=C:\ProgramData\Docker\containers\0037a3aa958bd24ac41792d04b001e307e3630baddcd9cfd5b9185216f1f77c6\0037a3aa958bd24ac41792d04b001e307e3630baddcd9cfd5b9185216f1f77c6-json.log read=32767 lines=324
According to the log, you have connected to the API server successfully (the third line says "connectivity OK"):
[2020/04/14 15:26:02] [ info] [filter:kubernetes:kubernetes.0] testing connectivity with API server...
[2020/04/14 15:26:03] [debug] [filter:kubernetes:kubernetes.0] API Server (ns=logging, pod=fluent-bit-rhsk6) http_do=0, HTTP Status: 200
[2020/04/14 15:26:03] [ info] [filter:kubernetes:kubernetes.0] API server connectivity OK
This suggests that the API connectivity itself is not the problem. I'm not entirely sure why host name resolution failed when you added a manual sleep, but I guess it depends on how you added the delay.
Try the following, and see if it resolves the DNS problem:
containers:
  - command:
      - powershell.exe
      - -Command
      - sleep 60; C:/fluent-bit/bin/fluent-bit.exe -c C:/fluent-bit/conf/fluent-bit.conf
@fujimotos you were right; before, I had tried adding the delay via initContainers. Now the connectivity with the API server seems fine.
But I'm still getting the invalid pattern error. Is it something to do with my config? I have used the same config given above.
Fluent Bit v1.4.2
* Copyright (C) 2019-2020 The Fluent Bit Authors
* Copyright (C) 2015-2018 Treasure Data
* Fluent Bit is a CNCF sub-project under the umbrella of Fluentd
* https://fluentbit.io
[2020/04/16 04:15:23] [ info] Configuration:
[2020/04/16 04:15:23] [ info] flush time | 1.000000 seconds
[2020/04/16 04:15:23] [ info] grace | 5 seconds
[2020/04/16 04:15:23] [ info] daemon | 0
[2020/04/16 04:15:23] [ info] ___________
[2020/04/16 04:15:23] [ info] inputs:
[2020/04/16 04:15:23] [ info] tail
[2020/04/16 04:15:23] [ info] ___________
[2020/04/16 04:15:23] [ info] filters:
[2020/04/16 04:15:23] [ info] kubernetes.0
[2020/04/16 04:15:23] [ info] ___________
[2020/04/16 04:15:23] [ info] outputs:
[2020/04/16 04:15:23] [ info] es.0
[2020/04/16 04:15:23] [ info] ___________
[2020/04/16 04:15:23] [ info] collectors:
[2020/04/16 04:15:23] [debug] [storage] [cio stream] new stream registered: tail.0
[2020/04/16 04:15:23] [ info] [storage] version=1.0.3, initializing...
[2020/04/16 04:15:23] [ info] [storage] in-memory
[2020/04/16 04:15:23] [ info] [storage] normal synchronization mode, checksum disabled, max_chunks_up=128
[2020/04/16 04:15:23] [ info] [engine] started (pid=12064)
[2020/04/16 04:15:23] [debug] [engine] coroutine stack size: 98302 bytes (96.0K)
[2020/04/16 04:15:23] [debug] [input:tail:tail.0] scanning path C:\\ProgramData\\Docker\\containers\\*\\*.log
[2020/04/16 04:15:23] [error] [sqldb] error=unrecognized token: "337769972125171��"
[2020/04/16 04:15:23] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\3558cbfa135376d6e447c9c12a21bb7c6c91c06ab152eb12d3bf7064e7166ee4\3558cbfa135376d6e447c9c12a21bb7c6c91c06ab152eb12d3bf7064e7166ee4-json.log, offset=0
[2020/04/16 04:15:23] [error] [sqldb] error=unrecognized token: "844424930204353��"
[2020/04/16 04:15:23] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\5fac48b972397451777d32845e6af4f139512c6d29dd121e8f64faad6bf900c6\5fac48b972397451777d32845e6af4f139512c6d29dd121e8f64faad6bf900c6-json.log, offset=0
[2020/04/16 04:15:23] [error] [sqldb] error=unrecognized token: "337769972125156��"
[2020/04/16 04:15:23] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\92eeb82ae070f56231181f67f46c8c013d1599c71020abd76519ca386951769f\92eeb82ae070f56231181f67f46c8c013d1599c71020abd76519ca386951769f-json.log, offset=0
[2020/04/16 04:15:23] [error] [sqldb] error=unrecognized token: "385620718097053��"
[2020/04/16 04:15:23] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\a0be0d1b66eddaf0597536cea0dd0aba9fcf6f3edd90ab1454c2e85e4889259c\a0be0d1b66eddaf0597536cea0dd0aba9fcf6f3edd90ab1454c2e85e4889259c-json.log, offset=0
[2020/04/16 04:15:23] [error] [sqldb] error=unrecognized token: "281474976783024��"
[2020/04/16 04:15:23] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\a0d2703a346775539bcaf20dc1f02f4ac7e9efc2487f9f708206a66ee2fe80a7\a0d2703a346775539bcaf20dc1f02f4ac7e9efc2487f9f708206a66ee2fe80a7-json.log, offset=0
[2020/04/16 04:15:23] [error] [sqldb] error=unrecognized token: "143552238126233��"
[2020/04/16 04:15:23] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\e35f4a4691582dfb1b15bcb72e2f83d998fea054abac221ef7a390050c1b806c\e35f4a4691582dfb1b15bcb72e2f83d998fea054abac221ef7a390050c1b806c-json.log, offset=0
[2020/04/16 04:15:23] [error] [sqldb] error=unrecognized token: "281474976783063��"
[2020/04/16 04:15:23] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\f45e3df47d33e53fe9f775923912ee52fbcd4c6b5ce91b1758273fd802a81540\f45e3df47d33e53fe9f775923912ee52fbcd4c6b5ce91b1758273fd802a81540-json.log, offset=0
[2020/04/16 04:15:23] [debug] [input:tail:tail.0] 7 files found for 'C:\\ProgramData\\Docker\\containers\\*\\*.log'
[2020/04/16 04:15:23] [ info] [filter:kubernetes:kubernetes.0] https=1 host=kubernetes.default.svc.cluster.local port=443
[2020/04/16 04:15:23] [ info] [filter:kubernetes:kubernetes.0] local POD info OK
[2020/04/16 04:15:23] [ info] [filter:kubernetes:kubernetes.0] testing connectivity with API server...
[2020/04/16 04:15:23] [debug] [filter:kubernetes:kubernetes.0] API Server (ns=logging, pod=fluent-bit-8r6gg) http_do=0, HTTP Status: 200
[2020/04/16 04:15:23] [ info] [filter:kubernetes:kubernetes.0] API server connectivity OK
[2020/04/16 04:15:23] [debug] [output:es:es.0] host=elasticsearch port=9200 uri=/_bulk index=fluent-bit type=flb_type
[2020/04/16 04:15:23] [debug] [router] match rule tail.0:es.0
[2020/04/16 04:15:23] [ info] [sp] stream processor started
[2020/04/16 04:15:23] [ warn] [filter:kubernetes:kubernetes.0] invalid pattern for given tag kube.C:\ProgramData\Docker\containers\3558cbfa135376d6e447c9c12a21bb7c6c91c06ab152eb12d3bf7064e7166ee4\3558cbfa135376d6e447c9c12a21bb7c6c91c06ab152eb12d3bf7064e7166ee4-json.log
[2020/04/16 04:15:23] [debug] [input:tail:tail.0] file=C:\ProgramData\Docker\containers\3558cbfa135376d6e447c9c12a21bb7c6c91c06ab152eb12d3bf7064e7166ee4\3558cbfa135376d6e447c9c12a21bb7c6c91c06ab152eb12d3bf7064e7166ee4-json.log read=32767 lines=324
[2020/04/16 04:15:23] [ warn] [filter:kubernetes:kubernetes.0] invalid pattern for given tag kube.C:\ProgramData\Docker\containers\5fac48b972397451777d32845e6af4f139512c6d29dd121e8f64faad6bf900c6\5fac48b972397451777d32845e6af4f139512c6d29dd121e8f64faad6bf900c6-json.log
@pragadeeshraju The regex normally expects a tag like below.
kube.service.var.log.containers.apache-logs-annotated_defaultns_apachelog-2367.log
=============================== --------------------- --------- --------- ----
TAG_PREFIX PODNAME NAMESPACE CONTAINER DOCKERID
On the other hand, your tag looks like
kube.C:\ProgramData\Docker\containers\5fac48b972397451777d32845e6af4f139512c6d29dd121e8f64faad6bf900c6\5fac48b972397451777d32845e6af4f139512c6d29dd121e8f64faad6bf900c6-json.log
The schema is very different, which means you need to define a custom regex like below.
[FILTER]
Name kubernetes
...
Kube_Tag_Prefix kube.
Regex_Parser .*[\\\.](?<docker_id>[a-z0-9]*)-json.log
(By the way, I noticed that we should escape \ to . in in_tail as we do on Linux. I'll post a patch on this.)
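To sanity-check that suggested pattern against the Windows tag, here is a quick Python sketch (Python spells named groups (?P<...>) where Fluent Bit's Oniguruma accepts (?<...>); the tag value is the one from the log above):

```python
import re

# [\\.] matches either a backslash or a dot before the container id.
parser = re.compile(r".*[\\.](?P<docker_id>[a-z0-9]*)-json\.log")

tag = (r"kube.C:\ProgramData\Docker\containers"
       r"\5fac48b972397451777d32845e6af4f139512c6d29dd121e8f64faad6bf900c6"
       r"\5fac48b972397451777d32845e6af4f139512c6d29dd121e8f64faad6bf900c6-json.log")

# Kube_Tag_Prefix strips the leading "kube." before the regex is applied.
m = parser.match(tag[len("kube."):])
print(m.group("docker_id"))
```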
@fujimotos
[2020/04/16 08:35:45] [debug] [input:tail:tail.0] 8 files found for 'C:\\ProgramData\\Docker\\containers\\*\\*.log'
[2020/04/16 08:35:45] [error] [filter:kubernetes:kubernetes.0] invalid parser '.*[\\\.](?<docker_id>[a-z0-9]*)-json.log'
[2020/04/16 08:35:45] [error] Failed initialize filter kubernetes.0
[2020/04/16 08:35:45] [ info] [input] pausing tail.0
[FILTER]
Name kubernetes
Match kube.*
Kube_URL https://kubernetes.default.svc.cluster.local:443
Kube_CA_File /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
Kube_Token_File /var/run/secrets/kubernetes.io/serviceaccount/token
Kube_Tag_Prefix kube.
Regex_Parser .*[\\\.](?<docker_id>[a-z0-9]*)-json.log
Merge_Log On
Merge_Log_Key log_processed
K8S-Logging.Parser On
K8S-Logging.Exclude Off
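(One likely cause of the invalid parser error above: Regex_Parser expects the name of a [PARSER] registered in a parsers file, not an inline regex. If that is right, the setup would look roughly like this, with kube-win-tag as an illustrative name:)

```
[PARSER]
    Name    kube-win-tag
    Format  regex
    Regex   .*[\\\.](?<docker_id>[a-z0-9]*)-json.log
```

and then in the filter: Regex_Parser kube-win-tag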
@fujimotos
Fluent Bit v1.4.2
* Copyright (C) 2019-2020 The Fluent Bit Authors
* Copyright (C) 2015-2018 Treasure Data
* Fluent Bit is a CNCF sub-project under the umbrella of Fluentd
* https://fluentbit.io
[2020/04/21 05:27:22] [ info] Configuration:
[2020/04/21 05:27:22] [ info] flush time | 1.000000 seconds
[2020/04/21 05:27:22] [ info] grace | 5 seconds
[2020/04/21 05:27:22] [ info] daemon | 0
[2020/04/21 05:27:22] [ info] ___________
[2020/04/21 05:27:22] [ info] inputs:
[2020/04/21 05:27:22] [ info] tail
[2020/04/21 05:27:22] [ info] ___________
[2020/04/21 05:27:22] [ info] filters:
[2020/04/21 05:27:22] [ info] kubernetes.0
[2020/04/21 05:27:22] [ info] ___________
[2020/04/21 05:27:22] [ info] outputs:
[2020/04/21 05:27:22] [ info] es.0
[2020/04/21 05:27:22] [ info] ___________
[2020/04/21 05:27:22] [ info] collectors:
[2020/04/21 05:27:22] [debug] [storage] [cio stream] new stream registered: tail.0
[2020/04/21 05:27:22] [ info] [storage] version=1.0.3, initializing...
[2020/04/21 05:27:22] [ info] [storage] in-memory
[2020/04/21 05:27:22] [ info] [storage] normal synchronization mode, checksum disabled, max_chunks_up=128
[2020/04/21 05:27:22] [ info] [engine] started (pid=12212)
[2020/04/21 05:27:22] [debug] [engine] coroutine stack size: 98302 bytes (96.0K)
[2020/04/21 05:27:22] [debug] [input:tail:tail.0] scanning path C:\\ProgramData\\Docker\\containers\\*\\*.log
[2020/04/21 05:27:22] [error] [sqldb] error=unrecognized token: "337769972125171��"
[2020/04/21 05:27:22] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\3558cbfa135376d6e447c9c12a21bb7c6c91c06ab152eb12d3bf7064e7166ee4\3558cbfa135376d6e447c9c12a21bb7c6c91c06ab152eb12d3bf7064e7166ee4-json.log, offset=0
[2020/04/21 05:27:22] [error] [sqldb] error=unrecognized token: "844424930204353��"
[2020/04/21 05:27:22] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\5fac48b972397451777d32845e6af4f139512c6d29dd121e8f64faad6bf900c6\5fac48b972397451777d32845e6af4f139512c6d29dd121e8f64faad6bf900c6-json.log, offset=0
[2020/04/21 05:27:22] [error] [sqldb] error=unrecognized token: "951385421285816��"
[2020/04/21 05:27:22] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\7bf63f1f6fc6c7f0e7d2aeca2e96b8a6f89742fb3f1d379e2e48e2cb5089d3e1\7bf63f1f6fc6c7f0e7d2aeca2e96b8a6f89742fb3f1d379e2e48e2cb5089d3e1-json.log, offset=0
[2020/04/21 05:27:22] [error] [sqldb] error=unrecognized token: "337769972125156��"
[2020/04/21 05:27:22] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\92eeb82ae070f56231181f67f46c8c013d1599c71020abd76519ca386951769f\92eeb82ae070f56231181f67f46c8c013d1599c71020abd76519ca386951769f-json.log, offset=0
[2020/04/21 05:27:22] [error] [sqldb] error=unrecognized token: "281474976783024��"
[2020/04/21 05:27:22] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\a0d2703a346775539bcaf20dc1f02f4ac7e9efc2487f9f708206a66ee2fe80a7\a0d2703a346775539bcaf20dc1f02f4ac7e9efc2487f9f708206a66ee2fe80a7-json.log, offset=0
[2020/04/21 05:27:22] [error] [sqldb] error=unrecognized token: "529172956222091��"
[2020/04/21 05:27:22] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\baacaa9f15fd1dd4386513bd32da4ad6f75ee0728b91437d28234ed0de6bb91e\baacaa9f15fd1dd4386513bd32da4ad6f75ee0728b91437d28234ed0de6bb91e-json.log, offset=0
[2020/04/21 05:27:22] [error] [sqldb] error=unrecognized token: "281474976783063��"
[2020/04/21 05:27:22] [debug] [input:tail:tail.0] add to scan queue C:\ProgramData\Docker\containers\f45e3df47d33e53fe9f775923912ee52fbcd4c6b5ce91b1758273fd802a81540\f45e3df47d33e53fe9f775923912ee52fbcd4c6b5ce91b1758273fd802a81540-json.log, offset=0
[2020/04/21 05:27:22] [debug] [input:tail:tail.0] 7 files found for 'C:\\ProgramData\\Docker\\containers\\*\\*.log'
[2020/04/21 05:27:22] [ info] [filter:kubernetes:kubernetes.0] https=1 host=kubernetes.default.svc.cluster.local port=443
[2020/04/21 05:27:22] [ info] [filter:kubernetes:kubernetes.0] local POD info OK
[2020/04/21 05:27:22] [ info] [filter:kubernetes:kubernetes.0] testing connectivity with API server...
[2020/04/21 05:27:22] [debug] [filter:kubernetes:kubernetes.0] API Server (ns=logging, pod=fluent-bit-6rd4z) http_do=0, HTTP Status: 200
[2020/04/21 05:27:22] [ info] [filter:kubernetes:kubernetes.0] API server connectivity OK
[2020/04/21 05:27:22] [debug] [output:es:es.0] host=elasticsearch port=9200 uri=/_bulk index=fluent-bit type=flb_type
[2020/04/21 05:27:22] [debug] [router] match rule tail.0:es.0
[2020/04/21 05:27:22] [ info] [sp] stream processor started
[2020/04/21 05:27:22] [debug] [filter:kubernetes:kubernetes.0] API Server (ns=(null), pod=(null)) http_do=0, HTTP Status: 404
[2020/04/21 05:27:22] [debug] [filter:kubernetes:kubernetes.0] API Server response
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"pods \"(null)\" not found","reason":"NotFound","details":{"name":"(null)","kind":"pods"},"code":404}
[2020/04/21 05:27:22] [debug] [filter:kubernetes:kubernetes.0] could not merge JSON log as requested
[2020/04/21 05:27:22] [debug] [filter:kubernetes:kubernetes.0] could not merge JSON log as requested
[2020/04/21 05:27:22] [debug] [filter:kubernetes:kubernetes.0] could not merge JSON log as requested
[2020/04/21 05:27:22] [debug] [filter:kubernetes:kubernetes.0] could not merge JSON log as requested
[2020/04/21 05:27:22] [debug] [filter:kubernetes:kubernetes.0] could not merge JSON log as requested
[2020/04/21 05:27:22] [debug] [filter:kubernetes:kubernetes.0] could not merge JSON log as requested
[2020/04/21 05:27:22] [debug] [filter:kubernetes:kubernetes.0] could not merge JSON log as requested
[2020/04/21 05:27:22] [debug] [input:tail:tail.0] file=C:\ProgramData\Docker\containers\3558cbfa135376d6e447c9c12a21bb7c6c91c06ab152eb12d3bf7064e7166ee4\3558cbfa135376d6e447c9c12a21bb7c6c91c06ab152eb12d3bf7064e7166ee4-json.log read=32767 lines=324
[2020/04/21 05:27:22] [debug] [filter:kubernetes:kubernetes.0] API Server (ns=(null), pod=(null)) http_do=0, HTTP Status: 404
[2020/04/21 05:27:22] [debug] [filter:kubernetes:kubernetes.0] API Server response
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"pods \"(null)\" not found","reason":"NotFound","details":{"name":"(null)","kind":"pods"},"code":404}
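The ns=(null), pod=(null) in the 404 above is consistent with the custom pattern capturing only docker_id: the filter fills the pod name and namespace from regex capture groups, and this pattern defines neither. A quick Python sketch of the mismatch (this is my reading of the log, not a confirmed diagnosis):

```python
import re

# The custom pattern from the workaround captures only the container id.
custom = re.compile(r".*[\\.](?P<docker_id>[a-z0-9]*)-json\.log")

m = custom.match(
    r"C:\ProgramData\Docker\containers"
    r"\3558cbfa135376d6e447c9c12a21bb7c6c91c06ab152eb12d3bf7064e7166ee4"
    r"\3558cbfa135376d6e447c9c12a21bb7c6c91c06ab152eb12d3bf7064e7166ee4-json.log")

groups = m.groupdict()
# No pod_name or namespace group exists, so the filter has nothing to
# query the API server with, hence pods "(null)" not found.
print("pod_name" in groups, "namespace" in groups)
```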
@fujimotos any insights here please?
@pragadeeshraju I think I can find time to support you in a few days.
Note: I'm working on some fixes for your issues, e.g. #2145 adds a retry mechanism to mitigate k8s's unstable network. A bit more will follow.
@fujimotos sure thanks. Please notify me.
Is there any ETA/version in which we can expect this to be fixed?
Any update on this issue, @pragadeeshraju @fujimotos? I am getting the same with v1.5.1.
@andrew-lozoya Can you post your log file here? I added some logging regarding this issue in v1.5.0 (8750159).
I think I can investigate further if you can provide that information.
@fujimotos
Sorry for the late response... I actually work at New Relic and I am working on getting our customers to standardize on Fluent-Bit for Windows Kubernetes deployments. Any help would be much appreciated.
Here is the container I built. I am using the out-of-the-box parsers.conf from the https://fluentbit.io/releases/1.5/td-agent-bit-1.5.1-win64.zip release.
fluent-bit.conf: |
    [SERVICE]
        Flush         1
        Log_Level     debug
        Daemon        off
        Parsers_File  C:\\fluent-bit\\conf\\parsers.conf
        HTTP_Server   Off
        HTTP_Listen   0.0.0.0
        HTTP_Port     2020

    [INPUT]
        Name              tail
        Tag               kube.*
        Path              C:\\ProgramData\\Docker\\containers\\*\\*-json.log
        Parser            docker
        DB                C:\\flb_kube.dbs
        Mem_Buf_Limit     7MB
        Skip_Long_Lines   On
        Refresh_Interval  10

    [INPUT]
        Name     dummy
        dummy    {"message":"test message", "dummy": "true"}
        samples  1

    [FILTER]
        Name    record_modifier
        Match   *
        Record  cluster_name ${CLUTSTER_NAME}

    [FILTER]
        Name             kubernetes
        Match            kube.*
        Kube_URL         https://kubernetes.default.svc.cluster.local:443
        Kube_Tag_Prefix  kube.ProgramData.Docker.containers.
        Merge_Log        On
        Merge_Log_Key    log_processed

    [OUTPUT]
        Name        newrelic
        Match       *
        licenseKey  ${LICENSE_KEY}
        endpoint    ${ENDPOINT}
{"log":"* Copyright (C) 2015-2018 Treasure Data\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8416839Z"}
{"log":"* Fluent Bit is a CNCF sub-project under the umbrella of Fluentd\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8416839Z"}
{"log":"* https://fluentbit.io\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8416839Z"}
{"log":"\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8456839Z"}
{"log":"[2020/07/31 01:14:22] [ info] Configuration:\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] flush time | 1.000000 seconds\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] grace | 5 seconds\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] daemon | 0\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] ___________\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] inputs:\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] tail\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] dummy\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] ___________\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] filters:\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] record_modifier.0\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] kubernetes.1\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] ___________\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] outputs:\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] newrelic.0\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] ___________\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] collectors:\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8486854Z"}
{"log":"[2020/07/31 01:14:22] [ info] [engine] started (pid=9316)\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8706857Z"}
{"log":"[2020/07/31 01:14:22] [debug] [engine] coroutine stack size: 98302 bytes (96.0K)\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8706857Z"}
{"log":"[2020/07/31 01:14:22] [debug] [storage] [cio stream] new stream registered: tail.0\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8706857Z"}
{"log":"[2020/07/31 01:14:22] [debug] [storage] [cio stream] new stream registered: dummy.1\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8706857Z"}
{"log":"[2020/07/31 01:14:22] [ info] [storage] version=1.0.4, initializing...\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8706857Z"}
{"log":"[2020/07/31 01:14:22] [ info] [storage] in-memory\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8706857Z"}
{"log":"[2020/07/31 01:14:22] [ info] [storage] normal synchronization mode, checksum disabled, max_chunks_up=128\r\n","stream":"stderr","time":"2020-07-31T01:14:22.8706857Z"}
{"log":"[2020/07/31 01:14:22] [error] [sqldb] error=unrecognized token: \"562949953718548\ufffd\ufffd\"\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [error] [input:tail:tail.0] cannot execute SQL: INSERT INTO in_tail_files (name, offset, inode, created) VALUES ('C:\\ProgramData\\Docker\\containers\\18f3c76424c1d65837842b0762c854a076c7cbdaa4bb87cc8c781b79a608c693\\18f3c76424c1d65837842b0762c854a076c7cbdaa4bb87cc8c781b79a608c693-json.log', 0, 562949953718548\ufffd\ufffd\u0001\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [debug] [input:tail:tail.0] inode=562949953718548 appended as C:\\ProgramData\\Docker\\containers\\18f3c76424c1d65837842b0762c854a076c7cbdaa4bb87cc8c781b79a608c693\\18f3c76424c1d65837842b0762c854a076c7cbdaa4bb87cc8c781b79a608c693-json.log\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [error] [sqldb] error=unrecognized token: \"506654958115754\ufffd\ufffd\"\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [error] [input:tail:tail.0] cannot execute SQL: INSERT INTO in_tail_files (name, offset, inode, created) VALUES ('C:\\ProgramData\\Docker\\containers\\21df022d53aa8fb3631c1e207313b0e4a13d6283489138d1f215227f6847b403\\21df022d53aa8fb3631c1e207313b0e4a13d6283489138d1f215227f6847b403-json.log', 0, 506654958115754\ufffd\ufffd\u0001\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [debug] [input:tail:tail.0] inode=5066549581157544 appended as C:\\ProgramData\\Docker\\containers\\21df022d53aa8fb3631c1e207313b0e4a13d6283489138d1f215227f6847b403\\21df022d53aa8fb3631c1e207313b0e4a13d6283489138d1f215227f6847b403-json.log\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [error] [sqldb] error=unrecognized token: \"182958734864219\ufffd\ufffd\"\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [error] [input:tail:tail.0] cannot execute SQL: INSERT INTO in_tail_files (name, offset, inode, created) VALUES ('C:\\ProgramData\\Docker\\containers\\2ab0af50735f741154af2935563f83efcb44d4a7ff802a36767b67ab90b5f62c\\2ab0af50735f741154af2935563f83efcb44d4a7ff802a36767b67ab90b5f62c-json.log', 0, 182958734864219\ufffd\ufffd\u0001\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [debug] [input:tail:tail.0] inode=18295873486421967 appended as C:\\ProgramData\\Docker\\containers\\2ab0af50735f741154af2935563f83efcb44d4a7ff802a36767b67ab90b5f62c\\2ab0af50735f741154af2935563f83efcb44d4a7ff802a36767b67ab90b5f62c-json.log\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [error] [sqldb] error=unrecognized token: \"591097451128946\ufffd\ufffd\"\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [error] [input:tail:tail.0] cannot execute SQL: INSERT INTO in_tail_files (name, offset, inode, created) VALUES ('C:\\ProgramData\\Docker\\containers\\2d013d24e9483c1635b2c1b8ec5823f0ee5dd6664a99bae8d7554b2025cf9ed4\\2d013d24e9483c1635b2c1b8ec5823f0ee5dd6664a99bae8d7554b2025cf9ed4-json.log', 0, 591097451128946\ufffd\ufffd\u0001\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [debug] [input:tail:tail.0] inode=5910974511289465 appended as C:\\ProgramData\\Docker\\containers\\2d013d24e9483c1635b2c1b8ec5823f0ee5dd6664a99bae8d7554b2025cf9ed4\\2d013d24e9483c1635b2c1b8ec5823f0ee5dd6664a99bae8d7554b2025cf9ed4-json.log\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [error] [sqldb] error=unrecognized token: \"844424930429199\ufffd\ufffd\"\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [error] [input:tail:tail.0] cannot execute SQL: INSERT INTO in_tail_files (name, offset, inode, created) VALUES ('C:\\ProgramData\\Docker\\containers\\5b8fe9ebeb3770c714a3ece9b3678d1e4eace2703f5725f86998e063fca6102d\\5b8fe9ebeb3770c714a3ece9b3678d1e4eace2703f5725f86998e063fca6102d-json.log', 0, 844424930429199\ufffd\ufffd\u0001\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [debug] [input:tail:tail.0] inode=844424930429199 appended as C:\\ProgramData\\Docker\\containers\\5b8fe9ebeb3770c714a3ece9b3678d1e4eace2703f5725f86998e063fca6102d\\5b8fe9ebeb3770c714a3ece9b3678d1e4eace2703f5725f86998e063fca6102d-json.log\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9226866Z"}
{"log":"[2020/07/31 01:14:22] [error] [sqldb] error=unrecognized token: \"562949953659185\ufffd\ufffd\"\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9406844Z"}
{"log":"[2020/07/31 01:14:22] [error] [input:tail:tail.0] cannot execute SQL: INSERT INTO in_tail_files (name, offset, inode, created) VALUES ('C:\\ProgramData\\Docker\\containers\\944c8c066cc28eb0afed5ab5bd6b7abc34c3a4d08a50d556fce0ca07c300be41\\944c8c066cc28eb0afed5ab5bd6b7abc34c3a4d08a50d556fce0ca07c300be41-json.log', 0, 562949953659185\ufffd\ufffd\u0001\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9406844Z"}
{"log":"[2020/07/31 01:14:22] [debug] [input:tail:tail.0] inode=562949953659185 appended as C:\\ProgramData\\Docker\\containers\\944c8c066cc28eb0afed5ab5bd6b7abc34c3a4d08a50d556fce0ca07c300be41\\944c8c066cc28eb0afed5ab5bd6b7abc34c3a4d08a50d556fce0ca07c300be41-json.log\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9406844Z"}
{"log":"[2020/07/31 01:14:22] [error] [sqldb] error=unrecognized token: \"562949953659210\ufffd\ufffd\"\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9406844Z"}
{"log":"[2020/07/31 01:14:22] [error] [input:tail:tail.0] cannot execute SQL: INSERT INTO in_tail_files (name, offset, inode, created) VALUES ('C:\\ProgramData\\Docker\\containers\\996631e48dba2141c6b6839567201aa81e7be8485eb124661713449a8b962caa\\996631e48dba2141c6b6839567201aa81e7be8485eb124661713449a8b962caa-json.log', 0, 562949953659210\ufffd\ufffd\u0001\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9406844Z"}
{"log":"[2020/07/31 01:14:22] [debug] [input:tail:tail.0] inode=562949953659210 appended as C:\\ProgramData\\Docker\\containers\\996631e48dba2141c6b6839567201aa81e7be8485eb124661713449a8b962caa\\996631e48dba2141c6b6839567201aa81e7be8485eb124661713449a8b962caa-json.log\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9406844Z"}
{"log":"[2020/07/31 01:14:22] [debug] [input:tail:tail.0] 7 new files found on path 'C:\\ProgramData\\Docker\\containers\\*\\*-json.log'\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9406844Z"}
{"log":"[2020/07/31 01:14:22] [error] [filter:record_modifier:record_modifier.0] invalid record parameters, expects 'KEY VALUE'\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9406844Z"}
{"log":"[2020/07/31 01:14:22] [ info] [filter:kubernetes:kubernetes.1] https=1 host=kubernetes.default.svc.cluster.local port=443\r\n","stream":"stderr","time":"2020-07-31T01:14:22.9406844Z"}
{"log":"[2020/07/31 01:14:22] [ info] [filter:kubernetes:kubernetes.1] local POD info OK\r\n","stream":"stderr","time":"2020-07-31T01:14:22.962683Z"}
{"log":"[2020/07/31 01:14:22] [ info] [filter:kubernetes:kubernetes.1] testing connectivity with API server...\r\n","stream":"stderr","time":"2020-07-31T01:14:22.962683Z"}
{"log":"[2020/07/31 01:14:23] [ info] [filter:kubernetes:kubernetes.1] Wait 30 secs until DNS starts up (1/6)\r\n","stream":"stderr","time":"2020-07-31T01:14:23.0636857Z"}
{"log":"[2020/07/31 01:15:05] [ info] [filter:kubernetes:kubernetes.1] Wait 30 secs until DNS starts up (2/6)\r\n","stream":"stderr","time":"2020-07-31T01:15:05.0532265Z"}
@andrew-lozoya (CC @pragadeeshraju) I noticed that the root issue here is the choice of log files to watch:
Path C:\\ProgramData\\Docker\\containers\\*\\*-json.log
The log files under C:\ProgramData\ are raw data produced by Docker, and Fluent Bit should almost never read them directly. Instead, you need to watch the files in C:\var\log. This directory is maintained by Kubernetes and contains additional metadata on top of the Docker logs.
So, in short, add the following volume mapping to your Kubernetes YAML.
spec:
  containers:
    - name: fluent-bit
      ..
      volumeMounts:
        - mountPath: C:\k
          name: k
        - mountPath: C:\var\log
          name: varlog
        - mountPath: C:\ProgramData
          name: progdata
  volumes:
    - name: k
      hostPath:
        path: C:\k
    - name: varlog
      hostPath:
        path: C:\var\log
    - name: progdata
      hostPath:
        path: C:\ProgramData
Then tweak fluent-bit.conf to change the log files to watch (you will also need to remove the Kube_Tag_Prefix option from the filter config):
[INPUT]
    Name  tail
    ...
    Path  C:\\var\\log\\*.log
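For context, here is a minimal sketch of how the adjusted input and filter sections might fit together after those changes. This is an illustration, not a confirmed working config: the `containers` subdirectory in the Path and the exact key values are assumptions you should adapt to your node's actual log layout.

```
[INPUT]
    Name             tail
    Tag              kube.*
    Path             C:\\var\\log\\containers\\*.log
    Parser           docker
    DB               C:\\flb_kube.db
    Skip_Long_Lines  On

[FILTER]
    Name           kubernetes
    Match          kube.*
    Kube_URL       https://kubernetes.default.svc.cluster.local:443
    Merge_Log      On
    Merge_Log_Key  log_processed
```

Note that Kube_Tag_Prefix is gone: with the default kube.* tagging over the Kubernetes-maintained log directory, the filter can derive pod metadata from the file names themselves.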
Regarding this issue, I posted a documentation patch that explains how to set up Fluent Bit on Kubernetes properly:
https://github.com/fluent/fluent-bit-docs/pull/353
This manual covers the basic logging concepts (such as the log file layout), so it should help you set things up.
Please feel free to tell me if anything is unclear.
@fujimotos thank you so much for the doc, it has helped! Can you elaborate more on how C:\var\log is used? I can't seem to find any documentation on that or on how the symbolic links work.
@andrew-lozoya Sadly, it is not well documented in the Kubernetes project. I don't know of any user documentation that contains a detailed description.
But this is the patch that implements this logging directory structure:
https://github.com/kubernetes/kubernetes/commit/fad4672e725877830407e2e4694fed7ceae5fe0a
In short, Kubernetes creates a symlink when it starts a container, and removes it when deallocating the container.
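To illustrate why those symlinks matter for log processors: the symlink names encode the pod name, namespace, container name, and container ID, so a tool like Fluent Bit can recover pod metadata from the file name alone. The sketch below parses that convention. The exact pattern is an assumption modeled on the common `<pod>_<namespace>_<container>-<id>.log` layout, not code taken from Kubernetes or Fluent Bit.

```python
import re

# Assumed symlink naming convention: <pod>_<namespace>_<container>-<64-hex-id>.log
LOG_NAME = re.compile(
    r"^(?P<pod_name>[^_]+)_"           # pod name (k8s names cannot contain '_')
    r"(?P<namespace>[^_]+)_"           # namespace
    r"(?P<container_name>.+)-"         # container name (may itself contain dashes)
    r"(?P<container_id>[a-f0-9]{64})"  # full Docker container ID
    r"\.log$"
)

def parse_log_name(filename: str) -> dict:
    """Extract pod metadata from a Kubernetes container-log symlink name."""
    m = LOG_NAME.match(filename)
    if m is None:
        raise ValueError(f"not a kubernetes container log name: {filename}")
    return m.groupdict()

meta = parse_log_name("fluent-bit-rkbxk_logging_fluent-bit-" + "a" * 64 + ".log")
print(meta["pod_name"], meta["namespace"], meta["container_name"])
```

This is also why reading the raw `*-json.log` files under C:\ProgramData directly fails: those names carry only the container ID, which is why the filter earlier queried the API server for pod "(null)" and got a 404.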
This issue is stale because it has been open 90 days with no activity. Remove the stale label or comment, or this will be closed in 5 days. Maintainers can add the exempt-stale label.
This issue was closed because it has been stalled for 5 days with no activity.
Fluent Bit is not loading the Kubernetes FILTER.
Config used
Guide Used
https://github.com/fluent/fluent-bit-kubernetes-logging
Expected behavior
It should filter the logs properly.
Logs for reference
Your Environment
k8s cluster v1.15.7 with a Windows node
Additional context
Let me know if more information is needed.
cc @fujimotos