mozilla-services / heka

DEPRECATED: Data collection and processing made easy.
http://hekad.readthedocs.org/

Why can't I set maxprocs = 4? #1885

Closed: wangfeiping closed this issue 7 years ago

wangfeiping commented 8 years ago

When I set maxprocs = 4 I get the message "Diagnostics: 32 packs have been idle more than 120". Everything is fine when I set maxprocs = 1.

The log data arrives over UDP at roughly 16,000 msg/s with 512 bytes per message.

My configuration is below:

```toml
[hekad]
maxprocs = 4
base_dir = "."
share_dir = "."

[log_input]
type = "UdpInput"
address = ":514"

[log_filter]
type = "SandboxFilter"
filename = "log_filter.lua"
ticker_interval = 1
message_matcher = "Logger != 'log_filter'"
preserve_data = false

[log_filter.config]
topic_num = 4
balance_init_count = 3000
balance_release_timemillis = 60000

[PayloadEncoder]

[QianbaoKafkaOutput]
type = "KafkaOutput"
message_matcher = "Fields[payload_name]=='static'"
partitioner = "RoundRobin"
ticker_interval = 60
topic = "LOGS_SYS1"
addrs = ["10.19.4.45:9092", "10.19.4.46:9092"]
encoder = "ProtobufEncoder"

[QianbaoKafkaOutput-1]
type = "KafkaOutput"
message_matcher = "Fields[payload_name]=='topic_1'" # "TRUE"
partitioner = "RoundRobin"
topic = "LOGS_SYS1"
addrs = ["10.19.4.45:9092", "10.19.4.46:9092"]
encoder = "ProtobufEncoder"

[QianbaoKafkaOutput-2]
type = "KafkaOutput"
message_matcher = "Fields[payload_name]=='topic_2'" # "TRUE"
partitioner = "RoundRobin"
topic = "LOGS_BIZ1"
addrs = ["10.19.4.45:9092", "10.19.4.46:9092"]
encoder = "ProtobufEncoder"

[QianbaoKafkaOutput-3]
type = "KafkaOutput"
message_matcher = "Fields[payload_name]=='topic_3'" # "TRUE"
partitioner = "RoundRobin"
topic = "LOGS_SYS2"
addrs = ["10.19.4.45:9092", "10.19.4.46:9092"]
encoder = "ProtobufEncoder"

[QianbaoKafkaOutput-4]
type = "KafkaOutput"
message_matcher = "Fields[payload_name]=='topic_4'" # "TRUE"
partitioner = "RoundRobin"
topic = "LOGS_BIZ2"
addrs = ["10.19.4.45:9092", "10.19.4.46:9092"]
encoder = "ProtobufEncoder"
```
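The "packs have been idle" diagnostic generally indicates that message packs are not being recycled back to hekad's pool fast enough for the configured throughput. A knob that is sometimes tuned in this situation is the size of the pack pool and the per-plugin input channels in the [hekad] section. The sketch below is illustrative only, based on documented hekad options and their defaults (poolsize = 100, plugin_chansize = 30); it is not the fix recorded in this issue.

```toml
# Illustrative tuning sketch, not the resolution of this issue:
# give the pipeline more packs in flight and deeper plugin input
# queues so packs are less likely to sit idle at ~16k msg/s of UDP input.
[hekad]
maxprocs = 4
base_dir = "."
share_dir = "."
poolsize = 400        # documented default is 100
plugin_chansize = 100 # documented default is 30
```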

wangfeiping commented 7 years ago

See https://github.com/mozilla-services/heka/issues/1954. Thanks again!