Closed: deric closed this issue 2 months ago
Describe the bug

```
icingadb[3596742]: heartbeat: Lost Icinga heartbeat
icingadb[3596742]: high-availability: Lost heartbeat
icingadb[3596742]: history-sync: Synced 3 state history items
icingadb[3596742]: heartbeat: Received Icinga heartbeat
icingadb[3596742]: read tcp 127.0.0.1:38284->127.0.0.1:6380: i/o timeout
can't perform "[xread count 4096 block 1000 streams icinga:history:stream:notification 1725868186446-4]"
github.com/icinga/icingadb/pkg/icingaredis.WrapCmdErr
	github.com/icinga/icingadb/pkg/icingaredis/utils.go:121
github.com/icinga/icingadb/pkg/icingaredis.(*Client).XReadUntilResult
	github.com/icinga/icingadb/pkg/icingaredis/client.go:204
github.com/icinga/icingadb/pkg/icingadb/history.Sync.readFromRedis
	github.com/icinga/icingadb/pkg/icingadb/history/sync.go:114
github.com/icinga/icingadb/pkg/icingadb/history.Sync.Sync.func1
	github.com/icinga/icingadb/pkg/icingadb/history/sync.go:83
golang.org/x/sync/errgroup.(*Group).Go.func1
	golang.org/x/sync@v0.7.0/errgroup/errgroup.go:78
runtime.goexit
	runtime/asm_amd64.s:1695
can't read history
github.com/icinga/icingadb/pkg/icingadb/history.Sync.readFromRedis
	github.com/icinga/icingadb/pkg/icingadb/history/sync.go:116
github.com/icinga/icingadb/pkg/icingadb/history.Sync.Sync.func1
	github.com/icinga/icingadb/pkg/icingadb/history/sync.go:83
golang.org/x/sync/errgroup.(*Group).Go.func1
	golang.org/x/sync@v0.7.0/errgroup/errgroup.go:78
runtime.goexit
	runtime/asm_amd64.s:1695
```
To Reproduce

Basic config, HA setup, icinga2 is under load.

```yaml
redis:
  host: localhost
  port: 6380

logging:
  level: info
  interval: 20s
  options:

retention:
  history-days: 365
  sla-days: 365
  options:
```
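Given that the failure is an i/o timeout against the configured Redis endpoint (localhost:6380 above), a quick sanity check when triaging is whether that endpoint is reachable at all. A small generic sketch, not part of Icinga DB:

```python
import socket


def can_connect(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    # Check the Redis endpoint from the config above.
    print("redis reachable:", can_connect("localhost", 6380))
```

Note that reachability alone does not rule out the timeout: a reachable but overloaded Redis can still make a blocking XREAD exceed the client's read deadline.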
Expected behavior

Icinga DB should not throw an exception; a service restart might help in this case, so the daemon should be able to recover on its own.
Your Environment

* Icinga DB version: v1.2.0
* Icinga 2 version: r2.14.2-1
Hi @deric, thanks for reporting!
Duplicate of #786