AzuraCast / AzuraCast

A self-hosted web radio management suite, including turnkey installer tools for the full radio software stack and a modern, easy-to-use web app to manage your stations.
https://www.azuracast.com/
GNU Affero General Public License v3.0

What is the maximum number of stations that can be hosted on AzuraCast? #261

Closed rolradio closed 7 years ago

rolradio commented 7 years ago

A question: what is the maximum number of stations that can be hosted on AzuraCast? For the first 10 stations (ports 8000 to 8090) everything works well, but no SHOUTcast server is created for subsequent stations from port 8100 upward. I'm running AzuraCast via the Docker installation.

BusterNeece commented 7 years ago

@rolradio AzuraCast can, in theory, support as many stations as your hardware can handle. In practice, my experience has been that it's hard to push more than 10 unique stations (especially if each has its own unique mount points) from a single server without running into bottlenecks, particularly CPU usage for audio encoding.

The artificial cap of 10 stations on Docker installations is due to a long-running issue with Docker, where each proxied port (even one forwarded as part of a range like 8000-8999) gets its own process. Previously, the Docker Compose file forwarded that entire range, and the RAM used by those thousand proxy processes was 512MB by itself, making the Docker install unusable on smaller servers. The new cap significantly reduces that consumption and makes the system work on more commodity hardware again.
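
If you want to observe this on your own host, a rough check (assuming Docker's default userland proxy is enabled) is to count the docker-proxy processes, one per published port:

ps aux | grep '[d]ocker-proxy' | wc -l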

If you have the hardware and can handle the overhead of forwarding that many ports, I do ship a docker-compose.stationports.yml file along with AzuraCast. You can spin up your containers using:

docker-compose -f docker-compose.yml -f docker-compose.stationports.yml up -d

...and it will use a much broader port range, from 8000 to 8500.
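
One way to confirm the expanded mappings once the containers are up is to list them with the same set of compose files (shown here only as an illustrative check):

docker-compose -f docker-compose.yml -f docker-compose.stationports.yml ps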

rolradio commented 7 years ago

Okay, that worked. With 10 stations I had CPU usage of roughly 26% (4 cores). After opening the additional ports and creating the 2 extra radio stations, the average CPU goes to roughly 50%, with spikes to 100%, which is indeed a lot. Is it possible to switch back to the original configuration, for example by changing docker-compose.stationports.yml?

BusterNeece commented 7 years ago

@rolradio You can always change the port configuration just by doing:

docker-compose -f docker-compose.yml -f docker-compose.stationports.yml down
docker-compose up -d

Just make sure the "down" command uses the same files as the "up" command you previously used, and then you can spin the containers back up using any combination of files (or, if you don't specify files, it just uses docker-compose.yml automatically).

rolradio commented 7 years ago

@SlvrEagle23 Thank you for the explanation.

arnebakken commented 6 years ago

Is there any workaround to run more stations on a Docker installation without the intense CPU usage? I love AzuraCast and use it in a Docker installation on a VPS with 10 cores (Intel® Xeon® Gold 6140) and 32 GB RAM. We now have 24 stations and the average CPU sits around 50%. We've never had any problems with the streaming (no dropouts and lots of users), but the administration GUI (web interface) is really slow.

You wrote: "The artificial cap of 10 stations on Docker installations is due to a long-running issue with Docker, where each proxied port (even one forwarded as part of a range like 8000-8999) gets its own process."

My question: is Docker the problem for me, or is my hardware simply not strong enough to handle 25-30 stations? I would prefer not to run 2 separate server instances, and I love the Docker concept (easy to update, back up and restore). But maybe I have to do a Traditional Installation, or is there some other way to solve it?

I really love and appreciate your work on AzuraCast!

Kind regards, Arne

CodeSteele commented 6 years ago

Last time I eyeballed this, I think the main limitation is going to be transcoding audio for that many stations at once (I want to say LiquidSoap was doing a lot of CPU-intensive heavy lifting).

Not sure how much effort it would be to run multiple instances of LiquidSoap on multiple servers, outside of some really sketchy stuff that even a traditional install won't like (e.g. dealing with configuration files across multiple instances), or whether there are any configurations that would be more CPU friendly.

I honestly don't think the one-port-per-process thing is as much of a deal breaker as the other things going on (e.g. we won't reach a high enough port count before we're burying all cores in transcoding/multiplexing).

I may poke around at deploying a few instances of various sizes and seeing how they perform; I'm kind of curious.

BusterNeece commented 6 years ago

@arnebakken The limitation on the total number of stations a given VPS can support isn't imposed by AzuraCast itself or anything it runs, as it tends to scale up to handle multiple stations rather well, while still occupying a fairly flat amount of resources for PHP, the web server and databases.

The bottleneck in any server's capacity is going to be the combined total number of mount points and remote relays that an installation is operating through LiquidSoap. Each of these adds a not-insignificant, fairly constant CPU load, which is the primary reason machines get bogged down when running multiple stations with multiple mount points or relays.
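
If you want a rough look at that per-station cost, one generic check (run on the host of a traditional install, or inside the radio container on Docker; this is just an illustrative command, not an AzuraCast-specific tool) is:

ps -C liquidsoap -o pid,%cpu,args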

The limitation in our default docker-compose.yml file, where only the ports for the first 10 stations are exposed, is due to a long-standing issue with Docker's userland proxy consuming a small amount of RAM for every single port it handles; opening up the entire 8000-9000 range would consume almost 1GB of RAM just for this forwarding. You are entirely free to modify the existing docker-compose.yml file (or, better yet, create another file named docker-compose.override.yml that contains just the modifications) with an expanded set of ports if your system has the horsepower to handle it.
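
As a rough sketch of what such an override could look like (the service name stations and the version number below are assumptions; match them to whatever your own docker-compose.yml actually declares), Compose appends ports entries from an override file to those in the base file, so only the additional range needs to appear here:

version: '2.2'   # assumption: use the same version as your docker-compose.yml

services:
  stations:      # assumption: the service that publishes the radio ports
    ports:
      - '8100-8500:8100-8500'   # appended to the first-10-station ports from the base file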

tonimad commented 5 years ago

I would like to reopen this discussion about the number of radio stations AzuraCast can support.

In this thread, you are talking about a Docker installation. What happens if the installation is traditional (without Docker)?

I am thinking of the following production scenario: a dedicated server with a traditional AzuraCast installation, where each radio station uses a single mount point (the one set by default when the station is created).

In this case, does AzuraCast impose any limit on the number of radio stations?

How many radio stations is it possible to handle in this case?

BusterNeece commented 5 years ago

@tonimad There is no need for debate about the number of radio stations AzuraCast can support; my previous answer stands across the board, regardless of installation method.

Servers and their resources are so wildly diverse that we can't say with any reliability what CPU percentage each station (specifically, each mount point or remote relay on each station) uses on a given web host. However, it's definitely true that this station count is the bottleneck to performance on a multi-tenant AzuraCast installation, because once the total CPU capacity of the server starts to be exhausted, transcoding performance takes a hit and services will eventually stop working.

AzuraCast doesn't impose any limit on stations, so it's really a matter of finding the combination that works best within your resource constraints.
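
If you want to watch where that ceiling is on a Docker installation, one generic way (just a standard Docker tool, not an AzuraCast-specific feature) is to keep an eye on per-container CPU while you add stations:

docker stats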