Closed marono closed 4 years ago
Actually, just tried it without the copy plugin and I think adding the <buffer>
section gets it to generate that error.
@marono I'll take a look and get back to you soon
@marono
The copy plugin is included in fluentd core. You didn't get the error when you tried without the <buffer>
section?
Unfortunately the problem has not been reproduced in my environment. Could you please share the following info?
gem list
ruby -v
I've even tried:
<match system.**>
@type azure-loganalytics
customer_id XXX
shared_key XXX
log_type onedrive
<buffer time>
timekey 1m
timekey_wait 1m
</buffer>
</match>
with the same results. My debug info is:
fluentd 1.7.0
gem list
*** LOCAL GEMS ***
azure-loganalytics-datacollector-api (0.1.5)
bigdecimal (1.2.8)
concurrent-ruby (1.1.5)
cool.io (1.5.4)
did_you_mean (1.0.0)
dig_rb (1.0.1)
domain_name (0.5.20190701)
fluent-plugin-azure-loganalytics (0.4.1)
fluent-plugin-td (1.0.0)
fluentd (1.7.0, 0.12.43)
http-accept (1.7.0)
http-cookie (1.0.3)
http_parser.rb (0.6.0)
httpclient (2.8.3)
io-console (0.4.5)
json (1.8.3)
mime-types (3.2.2)
mime-types-data (3.2019.0331)
minitest (5.9.0)
msgpack (1.3.1)
net-telnet (0.1.1)
netrc (0.11.0)
power_assert (0.2.7)
psych (2.1.0)
rake (10.5.0)
rdoc (4.2.1)
rest-client (2.1.0)
serverengine (2.1.1)
sigdump (0.2.4)
string-scrub (0.0.5)
strptime (0.2.3)
td-client (1.0.7)
test-unit (3.1.7)
tzinfo (2.0.0)
tzinfo-data (1.2019.2)
unf (0.1.4)
unf_ext (0.0.7.6)
yajl-ruby (1.4.1)
ruby -v
ruby 2.3.3p222 (2016-11-21) [arm-linux-gnueabihf]
Running on a Raspberry Pi. To test, I run fluentd, generate some events in syslog, and then hit Ctrl+C; at this point it tries to ship the data points and hits the failure.
All I want to do with this configuration is to decrease the shipping interval to 1 min.
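As a side note, if the goal is simply a one-minute flush rather than time-keyed chunking, fluentd v1 buffers also support interval-based flushing. A minimal sketch (parameter names per fluentd's v1 buffer configuration):
<buffer>
  flush_mode interval
  flush_interval 1m
</buffer>
This flushes chunks on a fixed interval instead of partitioning them by timekey.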
@marono thanks for the info! Even with the same ruby & fluentd versions, the problem was not reproduced in my environment.
Could you please try this configuration and see if you still get the same error? The purpose is to clarify whether this is caused by fluent-plugin-azure-loganalytics.
<match system.**>
@type file
path /var/log/fluentd
<buffer time>
timekey 1m
timekey_wait 1m
</buffer>
</match>
Sorry for the big delay ...
Tried the file output config above and the issue doesn't reproduce. I also noticed the resulting folder contained lots of buffer.*
files, which I assume is expected ...
@marono Sorry for not updating this. I don't think I can reproduce the same issue, so let me close it. Please create a new issue if you observe the same problem again.
Hello,
Thanks for your work on integrating fluentd with Azure Log Analytics. I'm trying to configure the output to go to Azure Log Analytics (while adjusting the buffer settings) and to stdout (for debugging purposes), but shipping to Azure Log Analytics fails with this configuration:
The error I'm getting is:
I'm quite convinced it's my configuration, as I can see logs shipping without the copy plugin ...
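For reference, the copy-based setup described above would look roughly like the following sketch (the customer_id and shared_key values, the log_type, and the system.** tag are placeholders, not taken from the original report):
<match system.**>
  @type copy
  <store>
    @type azure-loganalytics
    customer_id XXX
    shared_key XXX
    log_type onedrive
    <buffer time>
      timekey 1m
      timekey_wait 1m
    </buffer>
  </store>
  <store>
    @type stdout
  </store>
</match>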