I've made three scripts that do the same amount of work, yet PBLimiter's counter values vary greatly between them. PBLimiter counts extra average time for any PB that does not execute every game frame, inflating the numbers for PBs that actually generate less stress. That may be good for microlag prevention, but it also encourages script writers to use the Update1 frequency where Update10 or Update100 would be appropriate, causing extra server load.
Scripts
// Update1
public Program()
{
    Runtime.UpdateFrequency = UpdateFrequency.Update1;
}

public void Main(string argument, UpdateType updateSource)
{
    Me.CustomData = "TEST";
}

// Update10
public Program()
{
    Runtime.UpdateFrequency = UpdateFrequency.Update10;
}

public void Main(string argument, UpdateType updateSource)
{
    for (int i = 0; i < 10; i++)
    {
        Me.CustomData = "TEST" + i;
    }
}

// Update1 working every 10th tick
public Program()
{
    Runtime.UpdateFrequency = UpdateFrequency.Update1;
}

int Tick = 0;

public void Main(string argument, UpdateType updateSource)
{
    Tick++;
    if (Tick % 10 != 0)
    {
        return;
    }
    for (int i = 0; i < 10; i++)
    {
        Me.CustomData = "TEST" + i;
    }
}
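A quick back-of-the-envelope calculation shows why per-run averaging penalizes less frequent PBs. This sketch assumes PBLimiter averages runtime per execution rather than per game tick, and the millisecond figure is an invented example, not a measurement:

```python
# Assumption: both scripts do the same total work per second,
# but one spreads it over 60 runs (Update1) and the other over
# 6 runs (Update10). The 0.6 ms/s figure is hypothetical.
runs_per_second = {"Update1": 60, "Update10": 6}
work_ms_per_second = 0.6

for freq, runs in runs_per_second.items():
    # A per-run average divides the same total work by fewer runs,
    # so the Update10 script shows a 10x larger counter value.
    per_run_ms = work_ms_per_second / runs
    print(f"{freq}: {per_run_ms:.3f} ms per run")
```

Same server load, but a per-run counter reports the Update10 script as ten times heavier, which is exactly the incentive problem described above.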
Averages