Closed: Elliander closed this issue 6 years ago
Related to the above, when I try to set a world to 900x900 I get the following error on Generating terrain:
[48:52] Generating terrain...
Unhandled Exception: System.AggregateException: One or more errors occurred. ---> System.ArgumentException: Total world volume of 9000 x 26.5121437901235 y 9000 z = 2147483647 is larger than the max supported size of int.MaxValue (2147483647).
   at Eco.Shared.Math.WorldPosition3i.Initialize(Vector3i worldSize)
   at Eco.Shared.Voxel.ChunkGrid`1.AddChunk(TChunk chunk)
   at Eco.Shared.Voxel.ChunkGrid`1.GetOrAddChunk(Vector3i chunkPos)
   at Eco.World.WorldChunkGrid.SetBlock[T](Vector3i worldPos, Object[] args)
   at Eco.World.World.SetBlock[T](Vector3i worldPos, Object[] args)
   at Eco.WorldGenerator.TerrainGenerator.Generate(Int32 chunkPosX, Int32 chunkPosZ)
   at Eco.WorldGenerator.WorldGeneratorPlugin.<>c__DisplayClass27_1.b__0(Vector2i columnPos)
   at System.Threading.Tasks.Parallel.<>c__DisplayClass42_0`2.b__1()
   at System.Threading.Tasks.Task.InnerInvokeWithArg(Task childTask)
   at System.Threading.Tasks.Task.<>c__DisplayClass176_0.b__0(Object )
   --- End of inner exception stack trace ---
   at System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)
   at System.Threading.Tasks.Task.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)
   at System.Threading.Tasks.Parallel.PartitionerForEachWorker[TSource,TLocal](Partitioner`1 source, ParallelOptions parallelOptions, Action`1 simpleBody, Action`2 bodyWithState, Action`3 bodyWithStateAndIndex, Func`4 bodyWithStateAndLocal, Func`5 bodyWithEverything, Func`1 localInit, Action`1 localFinally)
   at System.Threading.Tasks.Parallel.ForEachWorker[TSource,TLocal](IEnumerable`1 source, ParallelOptions parallelOptions, Action`1 body, Action`2 bodyWithState, Action`3 bodyWithStateAndIndex, Func`4 bodyWithStateAndLocal, Func`5 bodyWithEverything, Func`1 localInit, Action`1 localFinally)
   at System.Threading.Tasks.Parallel.ForEach[TSource](IEnumerable`1 source, Action`1 body)
   at Eco.WorldGenerator.WorldGeneratorPlugin.CreateWorld()
   at Eco.WorldGenerator.WorldGeneratorPlugin.Initialize(TimedTask timer)
   at Eco.Server.PluginManager.InitializePlugins()
   at Eco.Server.PluginManager..ctor()
   at Eco.Server.Startup.Start(String[] args)
   at Eco.Server.MainClass.Main(String[] args)
However, it's not saying what the maximum supported size is. I'm assuming that since it lists 9000 rather than 900, it's referring to the total count of individual blocks. I'm also assuming that you are using int32 here, like for the world seed, so the maximum ends up being 2,147,483,647 - which is exactly what is shown as the result in the error, meaning it should work. Unless the message is about what it can support?
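For what it's worth, 2,147,483,647 / (9000 × 9000) ≈ 26.5121, which matches the y value in the message exactly, so the error may be printing the largest height that would still fit those x/z dimensions rather than the height that was actually requested. Here is a minimal sketch of the kind of check the message implies, under my own assumptions (the names are illustrative, not Eco's actual code):

```csharp
using System;

static class WorldVolumeCheckSketch
{
    // Illustrative only - not Eco's actual implementation. Assumes x/z
    // block counts are 10x the configured world size (the error reports
    // 9000 for a configured 900) and that the full x*y*z block count
    // must fit in a signed 32-bit integer.
    public static void CheckWorldVolume(long xBlocks, long heightBlocks, long zBlocks)
    {
        long volume = xBlocks * heightBlocks * zBlocks;
        if (volume > int.MaxValue)
            throw new ArgumentException(
                $"Total world volume {volume} is larger than int.MaxValue ({int.MaxValue}).");
    }
}
```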
I tested the volume idea by setting it to 900x900 with a height of 25 (since 9000x9000x25 is less than the max value of int32), but got exactly the same error.
860x860 - same error. 840x840 - same error. 820x820 - same error. 800x800 - same error. 700x700 - same error. 600x600 - same error.
600x600 with a height of 100 also gives the same error, even though 300x300 with a height of 200 works - I expected those to come out to the same total volume, though 600x600x100 is actually twice the volume of 300x300x200.
500x500 - same error. 400x400 - same error. 380x380 - same error. 360x360 - same error. 320x320 - WORKS! 340x340 - WORKS! so... that's the max possible world size? 11.56 km^2?
In any case, the IDE should warn players against certain combinations that would lead to going over the max size like it does when trying to set the seed too high.
EDIT: 340x340 doesn't work after all. At 8% generating terrain it triggers the same error again. So overall, there's uncertainty about the max values that can be set and even within that max there can be unhandled exceptions.
EDIT 2: So far, 320x320x200 is the largest world I was able to generate without error, and that at least works perfectly.
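If the rough model sketched earlier holds (x/z in blocks = 10 × the configured size, height counted directly), the numbers line up with these tests: 3200 × 3200 × 200 = 2,048,000,000 fits under 2,147,483,647, while 3400 × 3400 × 200 = 2,312,000,000 does not - which would explain why 320x320x200 generates but 340x340x200 dies partway through, and likewise why 300x300x200 works while 600x600x100 fails. The 900x900 failure at a height of 25 doesn't fit this model, though, so there may be extra vertical layers (water depth, for example) that also count toward the volume; treat the arithmetic as an approximation.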
FYI
zEverdeen - Today at 3:27 PM: A world size has to be divisible by 4 - not just 2, but 4 - even though 4 itself is divisible by 2. The following world sizes are known to be stable: 72x72, 100x100, 140x140, 172x172, 200x200, 224x224, 240x240, 248x248, 272x272, 296x296, 300x300, 340x340, and 372x372. The devs have tested up to 400x400, but as they say there is really no use for a world that large. Other sizes are at your own risk at this time. If your server crashes on building, uses too much RAM, or suffers from extreme drops in FPS, then you need to try a smaller world and/or stick with the known stable sizes. The bigger the world, the more problems you might find you have.
It's still an issue even if those sizes end up being the only ones ever supported, because the config allows the end user to try out-of-range combinations without any warning.
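Something like the following pre-flight check is all it would take - a hypothetical sketch on my part, where the divisible-by-4 rule comes from the quote above and the 10x block scaling plus the int.MaxValue volume cap are my own assumptions rather than confirmed behavior:

```csharp
using System;

static class WorldConfigPreflight
{
    // Hypothetical pre-flight validation, not Eco's actual code.
    // Assumed rules: configured size divisible by 4 (per the quote above),
    // x/z block counts are 10x the configured size, and the total
    // x*y*z block count must not exceed int.MaxValue.
    public static bool IsPlausible(int size, int height, out string reason)
    {
        if (size % 4 != 0)
        {
            reason = $"World size {size} is not divisible by 4.";
            return false;
        }

        long volume = (long)size * 10 * height * size * 10;
        if (volume > int.MaxValue)
        {
            reason = $"Estimated volume {volume} exceeds int.MaxValue ({int.MaxValue}).";
            return false;
        }

        reason = string.Empty;
        return true;
    }
}
```

Run against the sizes above with a height of 200, that would flag 340x340 and 372x372 before generation even starts, while 320x320 passes.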
Thanks for letting me know that 340x340 and 372x372 are supposed to work. However, neither actually works for me - I get the same error. With 340x340 it occurs at 8% of terrain generation, and with 372x372 it occurs immediately:
Unhandled Exception: System.AggregateException: One or more errors occurred. ---> System.ArgumentException: Total world volume of 3720 x 155.18294362065 y 3720 z = 2147483647 is larger than the max supported size of int.MaxValue (2147483647).
   at Eco.Shared.Math.WorldPosition3i.Initialize(Vector3i worldSize)
   at Eco.Shared.Voxel.ChunkGrid`1.AddChunk(TChunk chunk)
   at Eco.Shared.Voxel.ChunkGrid`1.GetOrAddChunk(Vector3i chunkPos)
   at Eco.World.WorldChunkGrid.SetBlock(Type blockType, Vector3i worldPos, Object[] args)
   at Eco.WorldGenerator.TerrainGenerator.Generate(Int32 chunkPosX, Int32 chunkPosZ)
   at Eco.WorldGenerator.WorldGeneratorPlugin.<>c__DisplayClass27_1.b__0(Vector2i columnPos)
   at System.Threading.Tasks.Parallel.<>c__DisplayClass42_0`2.b__1()
   at System.Threading.Tasks.Task.InnerInvokeWithArg(Task childTask)
   at System.Threading.Tasks.Task.<>c__DisplayClass176_0.b__0(Object )
   --- End of inner exception stack trace ---
   at System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)
   at System.Threading.Tasks.Task.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)
   at System.Threading.Tasks.Parallel.PartitionerForEachWorker[TSource,TLocal](Partitioner`1 source, ParallelOptions parallelOptions, Action`1 simpleBody, Action`2 bodyWithState, Action`3 bodyWithStateAndIndex, Func`4 bodyWithStateAndLocal, Func`5 bodyWithEverything, Func`1 localInit, Action`1 localFinally)
   at System.Threading.Tasks.Parallel.ForEachWorker[TSource,TLocal](IEnumerable`1 source, ParallelOptions parallelOptions, Action`1 body, Action`2 bodyWithState, Action`3 bodyWithStateAndIndex, Func`4 bodyWithStateAndLocal, Func`5 bodyWithEverything, Func`1 localInit, Action`1 localFinally)
   at System.Threading.Tasks.Parallel.ForEach[TSource](IEnumerable`1 source, Action`1 body)
   at Eco.WorldGenerator.WorldGeneratorPlugin.CreateWorld()
   at Eco.WorldGenerator.WorldGeneratorPlugin.Initialize(TimedTask timer)
   at Eco.Server.PluginManager.InitializePlugins()
   at Eco.Server.PluginManager..ctor()
   at Eco.Server.Startup.Start(String[] args)
   at Eco.Server.MainClass.Main(String[] args)
What factors would I need to consider when determining the max server size as well as the max user load? I mean, aside from just generating the world - actually running it reliably. It seems like there should be some way to calculate that without having to try and see. Also, if there is a practical maximum number of users that can be active in a world at one time, there should be a way to set max connections.
372x372 definitely works - we had it running on our server at one point. I bet there is no official statement regarding hardware usage, but I can share my own experience after over 4 months of hosting ECO.
RAM: It is not as RAM-heavy as you might think at first. 372x372 (currently the largest stable size) with height set to 100 and depth to 50 requires approx. 9 GB at start and then grows with player activity.
CPU: ECO appears to be CPU-heavy. We run on a 10-core Intel Xeon 2630 v4 (2.2 GHz with 25 MB cache). We have hosted world sizes from 72x72 to 372x372 and it always consumes at least 10% CPU, with an average of around 20%; I also haven't seen ECO occupy more than 30% CPU even at its heaviest times. I have also noticed that CPU consumption is not affected by the number of players connected (we have had at most 20 concurrent players so far), but rather by the amount of objects surrounding connected players.
You can set max connections in Config Window under Users Tab.
P.S. The hardware stats provided are from normal operation. We did have spikes of even 100% CPU that caused crashes, but that comes from software bugs that have been fixed or hopefully will be fixed in the future.
I ran some additional tests, and it looks like 372x372 with a height of 100 does work, but with a height of 200 it does not, so it is about volume. (Again, when the game leaves beta and goes to final release, all the possible good values will need to be checked, so tests like these are still important.) When I build a taller world with the same seed, the mountains end up spawning taller with it, which is very interesting - especially since the meteor will fly through the mountains.
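For what it's worth, the same rough arithmetic fits here too: assuming block counts are 10x the configured size, 3720 × 3720 × 100 = 1,383,840,000 stays comfortably under 2,147,483,647, while 3720 × 3720 × 200 = 2,767,680,000 goes well over it.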
When I run 340x340; height: 200 I use less than 5 GB of RAM for the server.
I have similar CPU results with my tests, running an Intel i7-5820k at 3.30 GHz overclocked to 4.60 GHz, with 6 cores and 2 threads per core, water cooled. I only see it go high when actually building a new world (then it's near 100%, but only for certain steps, after which it falls to less than half a percent). I'm surprised they don't seem to use the GPU for any of that, since my water-cooled and overclocked 980 Ti is sitting at 2% right now.
I haven't done tests with concurrent players, but those results are reassuring. What about bandwidth constraints? How much upload/download speed is needed to avoid lag at certain player counts? And if I ever decided to build a second, higher-powered machine to act as a dedicated Linux server, is there any way for me to host and play on different machines at the same time?
5 kb/s to 150 kb/s per player, with an average of around 10 kb/s - it depends on what the player is doing and the amount of objects surrounding them. The highest I've seen is 680 mb/s; I don't know if that was caused by a bug or happened naturally.
P.S. In your original post you said you have plans to upgrade to 128 GB of RAM. Your processor only supports up to 64 GB, so I wouldn't recommend wasting money on an additional 64 GB.
10 kb/s per player is not bad at all. Currently, I'm averaging 135+ Mbps download and around 12 Mbps upload. A single Megabit would likely support up to a hundred players, meaning my real bottleneck is going to be processing by the sounds of things, but with what you told me that shouldn't be a problem either.
680 Mbps would be a nightmare - especially if that's the upload speed. It probably was a bug, like I imagine the object duplication glitch going around is absolutely killing servers left and right.
Although the CPU specs claim that only 64 GB is supported, it will actually support 128 GB if the motherboard is designed to handle the situation:
https://vdr.one/does-intel-i7-5820k-support-128gb-ram/
I actually selected the parts with the intention of gradually upgrading to 128 GB from the start. Not that it would seem to matter in this case since it uses so little RAM to begin with.
At present, the largest world I have been able to generate is 372x372x140 - I've noticed that the landscape stretches along the Z axis the taller I make a world, although I can't be sure to what extent because when I generate a new world from the same configuration files things tend to look a little different each time. In my current test of a world that size it's using less than 4 GB of RAM and very low processing, but then I don't have active users in it to help me test yet.
Megabit (Mbps) ≠ Megabyte (MB/s). A 12 Mbps connection will yield ~1.5 MB/s in "user friendly" speed (MB/s) (https://www.gbmb.org/mbps-to-mbs). My guess is that you will be able to host around 30-50 concurrent players, but that's just a wild guess based on a 20-player server population. I wouldn't put too much trust in such low upload capability on a server.
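To put rough numbers on it (and assuming the per-player figures I gave earlier are in kilobytes per second): 12 Mbps is about 1.5 MB/s, so at the ~10 KB/s average that's roughly 150 players on paper, but a handful of players spiking toward 150 KB/s each would already saturate the link - which is why 30-50 feels like the safer estimate.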
Out of curiosity... if you built your machine with only one goal in mind - dedicated server hosting - why did you go for an i7 instead of a server-grade Xeon, and why did you install a GTX 980 Ti? For local use you could have installed any cheap video card capable of loading the Windows desktop, and that's pretty much all the capability that's needed, so why a 980 Ti?
ahh, I thought you meant megabits. OK, those are still good figures to work with.
Actually, this was built to be a development machine for school and work, specifically for C++/C# Unreal/Unity development with virtual reality and neural interface devices. As it stands, I barely meet the minimum requirements for the latest VR, but when it was built a few years ago it ranked in the top 1% of desktops according to benchmarks. Practically, though, as a full-time student I rarely have the time to do more than homework. It was meant to have its GPU and RAM counts increased gradually as needed, not to act as a dedicated server. I also run multiple concurrent virtual machines for alternative OS environments, sometimes simultaneously, each with their own input devices. I was also considering an augmented reality device to replace my monitors, but they don't seem ready for my needs yet to justify the cost.
That being said, I'd like to construct a dedicated server in the near future, which is why I was asking about running the server on one machine while being logged in as a user from another. If I did build one, I'd likely go with a network load balancer so I could run more than one ISP connection to the same machine. Do you have a suggested build list if its only goal would be to run Eco 24/7 with a 372x372x140 map for 100+ concurrent players? I'd probably prefer a Linux-based OS for something like that.
If I generate a server world in Windows, would it port well to Linux?
To be honest, I don't know why one would set up dedicated hardware just to run an ECO server... You can run it on a cheap laptop with enough RAM installed.
Don't expect your server to have a massive population to handle. All those third-party game server providers such as pingperfect, gameservers, multiplay, etc. have created an unfriendly environment for games like ECO. There are currently over 1000 online ECO servers with fewer than 2000 online players. Most such games die quickly due to this problem, which the devs seem unable to see: what was meant to be a multiplayer game turns into thousands of empty servers.
I have no experience operating Linux, but if it uses the same files, then I don't see a reason why it wouldn't work.
Well, the game is still in beta and it's already better than Minecraft, which is massively popular. I fully expect that when the game is fully fleshed out and in final public release we'll see a major influx of new players - most of whom won't want the expense of a dedicated server that's always online.
Although, in that regard, maybe servers could be ranked according to % uptime, or list whether they're only online during certain hours of the day.
It seems to me that worlds die due to the mindset that once the meteor is destroyed the game is over and it's time to move on. The fixation on the disaster event seems to keep people from sticking around. Then, on the flip side, a world that has been around for a while has more restrictions, so there's less of a reason to join - and people spreading out evenly across all the servers makes each one feel empty.
Closing, but to answer the original question: world size does have a cap, so this is expected behavior, and as of right now the world size must be divisible by 4.
I'm trying to generate a large world server because I have 64 GB of RAM and can easily upgrade to 128 GB, but I'm facing some odd troubles.
When I set the world size to 110 there is an unhandled exception because the TunaCapacity must evenly divide the world size for some reason, meaning only certain world sizes are valid.
(110 * 110) / 40 = 302.5, so clearly the issue is that at this world size it's trying to hit half a Tuna when it shouldn't.
I observed this behavior in both v7.3.3 and v0.7.4.0-beta-staging-b79d89de
So a world size of 1.21 km^2 is not allowed. After reading the error, I tested 120x120 for a 1.44 km^2 world (since 120*120 is evenly divisible by 40) and that worked without error.
With this in mind, I checked the next value divisible by 40, 140x140, and it worked with no problem; I then set it to 300x300 - a 9 km^2 world evenly divisible by 40 - and that worked as well.
What's weird about this is that the default world size is 72x72, which is not divisible by 40 either. Then again, I also set the world height to 200 and ocean height to 60, which might be a factor.
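In case it helps anyone else, here is the check I'm guessing the error implies - purely a guess on my part, since the default 72x72 doesn't satisfy it either (5184 / 40 = 129.6), so the height and ocean settings presumably change the effective capacity:

```csharp
// Rough guess at the constraint the TunaCapacity error implies, not
// confirmed behavior: the world's area must divide evenly by TunaCapacity.
// Note that the default 72x72 world breaks this rule, so other settings
// (height, ocean level) presumably factor in as well.
static class TunaCapacityGuess
{
    const int TunaCapacity = 40; // value from my config files

    public static bool AreaDividesEvenly(int size)
        => (size * size) % TunaCapacity == 0;
}
```

Against my tests, 110x110 fails this check while 120x120, 140x140, and 300x300 all pass, which at least matches what I observed.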
Additionally, I'm not seeing a "Regenerate" setting under world so I'm regenerating by extracting a new set of files and pasting in the settings files.
In all my tests I decided to use Pi as my seed - well, the 10 digits after the decimal point (1415926535), since the full value would otherwise be too large for an int.
A helpful feature would be a system to suggest recommended configurations based on system configuration. For example, if someone wants a huge world, it could tell them what their computer can handle rather than leaving them with guesswork, and if they want more it could tell them why more is not recommended - so they could buy better hardware. It should also be able to read the download/upload speeds while the server is running to tell them about how many players should be the max for the world.
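As a very rough illustration of the kind of logic I mean - every number below is an assumption drawn from my own tests, not anything official:

```csharp
static class ServerSizingSketch
{
    // Back-of-the-envelope sizing helper, purely illustrative. The
    // bytes-per-block figure is something you'd measure from one of
    // your own worlds (e.g. ~4 GB for a 372x372x140 world works out
    // to roughly 2 bytes per block); real usage varies with terrain,
    // objects, and player activity.
    public static long EstimateWorldRamBytes(int size, int height, double bytesPerBlock)
    {
        long blocks = (long)size * 10 * height * size * 10;
        return (long)(blocks * bytesPerBlock);
    }
}
```

Plugged in, 320x320x200 at ~2 bytes per block comes out around 4 GB, which is at least in the same ballpark as what I've measured - but it's guesswork, not a substitute for the kind of built-in guidance I'm describing.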
I'm going to keep trying a variety of options and if I find anything else I'll report that as well.