ngosang closed this issue 5 years ago.
I'm giving this low priority... but I don't think we will implement this soon. Allowing users to change the frequencies may lead to many issues.
And to be honest, I don't see any issue running this task every minute. It's only a call to the Sonarr API. Nothing is read from disk.
You can give the actual configuration as default and let the user change the values. If you are worried about bad configurations you can put a min-max interval and check it before save.
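For illustration, that validation could be as small as the sketch below. This is a hypothetical example, not Bazarr's actual code; the bounds and the save_interval helper are assumptions.

```python
# Hypothetical sketch: validate a user-supplied task interval before saving it.
MIN_INTERVAL_MINUTES = 1      # assumed lower bound
MAX_INTERVAL_MINUTES = 1440   # assumed upper bound (24 hours)

def save_interval(minutes: int) -> int:
    """Return the value to persist, or raise if it is out of range."""
    if not MIN_INTERVAL_MINUTES <= minutes <= MAX_INTERVAL_MINUTES:
        raise ValueError(
            "Interval must be between %d and %d minutes, got %d"
            % (MIN_INTERVAL_MINUTES, MAX_INTERVAL_MINUTES, minutes)
        )
    return minutes
```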
I still don't see the impact of doing this versus the benefit of having series added to Bazarr in near real time.
I'm running several applications on very low-end hardware with 128 MB of RAM. Anyway, I agree, this is low priority for me as well.
I think this actually makes a lot of sense to implement. I run Bazarr in a Docker container, and it is by far the highest CPU consumer. It increases the power consumption of my box by 4 watts, which is roughly 8 EUR/year. Although that may sound small, it means Bazarr keeps the CPU awake: it can't enter its lowest idle state and consumes additional energy unnecessarily. Multiply that by the number of users :)
Looking at a powertop snippet:
PowerTOP 2.8 Overview Idle stats Frequency stats Device stats Tunables
Summary: 1287.7 wakeups/second, 0.0 GPU ops/seconds, 0.0 VFS ops/sec and 8.3% CPU use
Usage Events/s Category Description
41.6 ms/s 862.1 Process python /app/bazarr/bazarr.py --no-update --config /config
2.9 ms/s 206.0 Timer tick_sched_timer
3.9 ms/s 52.0 Process powertop
13.7 ms/s 24.5 Timer hrtimer_wakeup
293.2 µs/s 21.6 Process [rcu_sched]
1.9 ms/s 19.6 Process python
1.3 ms/s 19.6 Process /usr/bin/python -u /app/bazarr/bazarr/main.py --no-update --config /config
1.1 ms/s 19.6 Process mono --debug Radarr.exe -nobrowser -data=/config
78.2 µs/s 6.9 kWork gc_worker
6.8 ms/s 3.9 Interrupt [7] sched(softirq)
251.4 µs/s 5.9 Process /usr/bin/dockerd -H unix:///var/run/docker.sock
201.8 µs/s 5.9 Interrupt [31] i915
192.5 µs/s 5.9 Process nginx: worker process
285.3 µs/s 3.9 Process /usr/bin/containerd
142.2 µs/s 3.9 Interrupt [3] net_rx(softirq)
500.5 µs/s 2.9 Process /app/Jackett/jackett --NoUpdates
106.4 µs/s 2.9 Process mono --debug NzbDrone.exe -nobrowser -data=/config
74.4 µs/s 2.9 Process /usr/bin/rrdcached -B -F -f 3600 -w 900 -b /var/lib/rrdcached/db/ -j /var/lib/rrdcached/journal/ -p /var/run/rrdcached.pid -l u
92.6 µs/s 2.0 Process /usr/sbin/watchdog
4.6 µs/s 2.0 Timer sched_rt_period_timer
1.3 ms/s 1.0 Process /usr/sbin/snmpd -Lsd -Lf /dev/null -u Debian-snmp -g Debian-snmp -I -smux mteTrigger mteTriggerConf -f
3.7 ms/s 0.00 Process [kworker/1:1]
323.2 µs/s 1.0 Process jackett
159.0 µs/s 1.0 kWork acpi_os_execute_deferred
77.3 µs/s 1.0 Process sshd: root@pts/0
49.1 µs/s 1.0 Process omv-engined
46.6 µs/s 1.0 Process /usr/sbin/ntpd -p /var/run/ntpd.pid -g -c /run/ntp.conf.dhcp -u 108:114
38.1 µs/s 1.0 Process php-fpm: master process (/etc/php/7.0/fpm/php-fpm.conf)
37.1 µs/s 1.0 Interrupt [9] acpi
31.9 µs/s 1.0 Process /usr/bin/monit -c /etc/monit/monitrc
28.1 µs/s 1.0 Process /usr/sbin/minidlnad -f /etc/minidlna.conf -P /run/minidlna/minidlna.pid
16.5 µs/s 1.0 kWork pci_pme_list_scan
15.8 µs/s 1.0 kWork free_work
7.2 µs/s 1.0 kWork flush_to_ldisc
0.0 µs/s 1.0 kWork i915_hpd_poll_init_work
664.8 µs/s 0.00 Interrupt [1] timer(softirq)
487.3 µs/s 0.00 Interrupt [9] RCU(softirq)
122.5 µs/s 0.00 Timer process_timeout
68.9 µs/s 0.00 Timer delayed_work_timer_fn
66.7 µs/s 0.00 Process [kworker/3:1]
42.6 µs/s 0.00 Interrupt [26] eno1
21.2 µs/s 0.00 Timer clocksource_watchdog
19.1 µs/s 0.00 Timer neigh_timer_handler
14.1 µs/s 0.00 Timer it_real_fn
11.8 µs/s 0.00 Process [kworker/2:2]
7.4 µs/s 0.00 kWork vmstat_update
6.4 µs/s 0.00 Process [kworker/u8:0]
5.5 µs/s 0.00 Timer dev_watchdog
2.6 µs/s 0.00 Interrupt [2] net tx(softirq)
2.3 µs/s 0.00 Timer tcp_write_timer
1.6 µs/s 0.00 Process [kworker/0:0]
1.0 µs/s 0.00 Timer intel_uncore_fw_release_timer
If I compare Bazarr with other members of the ecosystem: while it accounts for CPU usage worth 4 watts of consumption, Jackett, Sonarr, and Radarr consume 0.3 watts altogether.
Couldn't we offer a list of multipliers, say 2x, 3x, 4x, or 5x the current periods, instead of having users specify times in minutes individually? It would be easier to implement and it would avoid extreme or faulty values.
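As an illustration only, a multiplier approach could look roughly like this; the default intervals and the apply_multiplier helper below are hypothetical, not Bazarr's real configuration.

```python
# Hypothetical sketch: scale default task intervals by a multiplier chosen from
# a short whitelist, instead of letting users type arbitrary values.
DEFAULT_INTERVALS_MINUTES = {      # assumed defaults, for illustration only
    "update_series": 1,
    "sync_episodes": 5,
    "search_wanted_subtitles": 180,
}
ALLOWED_MULTIPLIERS = (1, 2, 3, 4, 5)

def apply_multiplier(multiplier: int) -> dict:
    if multiplier not in ALLOWED_MULTIPLIERS:
        raise ValueError("Multiplier must be one of %s" % (ALLOWED_MULTIPLIERS,))
    return {task: minutes * multiplier
            for task, minutes in DEFAULT_INTERVALS_MINUTES.items()}

print(apply_multiplier(3))  # e.g. series every 3 min, wanted search every 9 hours
```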
I have a similar issue to what @gszigethy showed. Bazarr uses the most CPU cycles of all the applications I run on my server at home, which seems out of place for what the application essentially does. I personally would not mind it only checking for subs once a day, and I would prefer to have control over when or how often it updates. My CPU resources are limited, and being able to control how they are used is a big thing for me.
You can set the scan for existing subtitles to once per week or manual. Also, if you have low resources, I recommend you disable "search enabled providers simultaneously", so Bazarr will search subtitles provider by provider instead of hitting all providers at once.
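To illustrate the difference that setting makes, here is a rough sketch; the provider list handling and the search_provider function are hypothetical stand-ins, not Bazarr's code.

```python
# Hypothetical sketch: searching providers one by one keeps peak CPU and memory
# usage lower on constrained hardware than querying all of them in parallel.
from concurrent.futures import ThreadPoolExecutor

def search_provider(provider, episode):
    # placeholder for a real provider query
    return ["%s:%s.srt" % (provider, episode)]

def search_sequential(providers, episode):
    results = []
    for provider in providers:          # one provider at a time
        results.extend(search_provider(provider, episode))
    return results

def search_simultaneous(providers, episode):
    with ThreadPoolExecutor() as pool:  # all providers in parallel
        batches = pool.map(lambda p: search_provider(p, episode), providers)
    return [sub for batch in batches for sub in batch]
```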
First of all, thank you so much for your work on this. It's brilliant; however, I too would like more control, please. I've already disabled simultaneous searching. Once per week is not enough, and the tasks are too frequent. How about adding a simple scheduler? If I could set a range of hours, something like SABnzbd does, that would be perfect. I wouldn't need day-of-the-week control; every day would be fine if it were limited to, say, 1 am to 7 am, when I am not using the computer. Thank you again. You guys rock.
This definitely checks too often. Why does it check for shows every minute? People don't add new series every minute; once a day would be better. Episodes could be checked every half hour, since most shows are 30 minutes or 1 hour long. Searching wanted subtitles every 24 hours would also help with subtitle provider limits; right now it keeps checking only to find the limit exceeded. That could also be handled by deactivating a provider until its 24-hour reset, like NZBHydra does when an indexer's API limit is reached.
Just my suggestions. I should have no problems once my library is complete and it is only searching for new subtitles.
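As an illustration of the "deactivate until reset" idea, a per-provider throttle could look roughly like this; the throttled_until bookkeeping and function names are hypothetical, not Bazarr's implementation.

```python
# Hypothetical sketch: skip a provider until its daily quota resets instead of
# repeatedly hitting its API and logging "limit exceeded" errors.
from datetime import datetime, timedelta

throttled_until = {}  # provider name -> datetime when it becomes usable again

def mark_limit_reached(provider, hours=24):
    throttled_until[provider] = datetime.utcnow() + timedelta(hours=hours)

def is_available(provider):
    reset_at = throttled_until.get(provider)
    return reset_at is None or datetime.utcnow() >= reset_at
```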
If you search for new series once per day and for episodes once per hour, you will have big problems when you add a new series: Bazarr will have episodes pointing to a non-existing series, logs full of errors, and users will complain. Every user has their own use case. Bazarr checking for series every minute is equivalent to opening google.com every minute, only with less data.
I'm having the same problem as @gszigethy: Bazarr is always using 10% of the CPU.
It shouldn't be related to the 1-minute frequency check from Sonarr; could it be another check running inside a while(true) loop?
Since this results in increased heat & power consumption, I'll stop the container and start it manually when I need some subtitles.
Anyways, thanks for your work. 👍
@carlosflorencio What hardware are you running Bazarr on?
@morpheus65535
Mini PC Z83II — CPU: Intel Atom x5-Z8359, RAM: 2 GB
Bazarr has been using that much CPU for how long?
@morpheus65535 since I deployed it a few hours ago. It's always using 10% CPU.
This will help: #457. @carlosflorencio, test it on your machine if you can. Just edit the bazarr.py file with the PR changes.
Don't close this issue. I still think we need to increase the execution intervals (run the tasks less often).
We are planning to completely discard those frequently running tasks and move to a more event-oriented approach.
@carlosflorencio it is probably still scanning for existing subtitles. Give it a day or two and it should go down by itself.
@halali
If you search for new series once per day and for episodes once per hour, you will have big problems when you add a new series: Bazarr will have episodes pointing to a non-existing series, logs full of errors, and users will complain...
OK, that is a good reason for not increasing the task intervals with the current implementation. I'm proposing the following. For me, the annoying tasks are the ones in blue; the other tasks are fine. To fix the problems in the @halali quote, you can make the tasks in green call the tasks in blue before doing their work (sketched below). If you do that, you can remove the tasks in blue (or greatly increase their intervals). This will work without errors in the background. This approach has two drawbacks when the user interacts with the UI:
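For what it's worth, the chaining proposed above could look roughly like this; the task names below are hypothetical placeholders, not Bazarr's real jobs.

```python
# Hypothetical sketch: run the sync jobs (the "blue" tasks) as a prerequisite of
# the subtitle search (a "green" task), so the standalone sync jobs can run far
# less often without leaving episodes attached to unknown series.
def sync_series_from_sonarr():
    print("syncing series list from Sonarr")   # placeholder

def sync_episodes_from_sonarr():
    print("syncing episode list from Sonarr")  # placeholder

def search_wanted_subtitles():
    # refresh the library first so we never search for episodes of unknown series
    sync_series_from_sonarr()
    sync_episodes_from_sonarr()
    print("searching wanted subtitles")        # placeholder
```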
@ngosang if you have time, please come to Discord so we can talk about your ideas.
+1 for this request. I watch a few series and films, so checking every minute is just a waste of energy for me.
+1 for me. I'm going to shut the container down and only spin it up when I add new movies or series until this feature is implemented.
A killer option could be a webhook from Sonarr and Radarr that would trigger these events. Would that be possible?
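For context, such a webhook receiver could be as small as the sketch below. This is not an existing Bazarr endpoint; the /webhook/sonarr route, the refresh_series helper, and the use of Flask are all assumptions for illustration. Sonarr and Radarr can be pointed at a URL like this from their Connect/Webhook settings.

```python
# Hypothetical sketch of a webhook receiver: Sonarr/Radarr POST JSON here after
# an import, so Bazarr would only refresh when something actually changed
# instead of polling every minute.
from flask import Flask, request

app = Flask(__name__)

def refresh_series():
    print("refreshing series and searching subtitles")  # placeholder

@app.route("/webhook/sonarr", methods=["POST"])
def sonarr_webhook():
    payload = request.get_json(silent=True) or {}
    # "Download" is the event Sonarr typically sends on import (assumption here)
    if payload.get("eventType") == "Download":
        refresh_series()
    return "", 204

if __name__ == "__main__":
    app.run()
```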
It was one of the options we were testing. My first choice was having Bazarr listen to the Sonarr and Radarr SignalR feeds, but I never got it to work. A webhook was the best alternative, but it requires configuration outside of Bazarr, and I'm pretty sure it's going to cause me nightmares in support. :-/
Yep, I feel your pain ;) There are possible changes on the other projects to worry about, and while it could work for an expert audience, it would definitely be difficult for others. :/
The second point can be mitigated by a specific "expert mode" config, but the first is a risk. It seems, though, to be pretty stable for Plex and others; I don't know if they change these notification features very often on the other projects.
Take your time to evaluate it and let us know. I'm available for some tests if needed!
Thanks for the good work and your time :)
Feature requests have been moved to http://features.bazarr.media
Please go upvote so we can prioritize your feature request.
The current execution frequency is too high for me. I have few series, and refreshing every minute is too much.