lucko / spark

A performance profiler for Minecraft clients, servers, and proxies.
https://spark.lucko.me/
GNU General Public License v3.0

Another profiler is already active - /sparkc profiler start #351

Open Treeways opened 1 year ago

Treeways commented 1 year ago

Yesterday, I ran a spark profile. It worked just fine, and I was able to get the resource stack.

Today, I can't run /sparkc profiler start at all - it tells me it is "starting a new profiler" and then throws the error below. Maybe I closed out of the game before stopping the profiler, I don't remember exactly. But that session from yesterday does show up in /sparkc activity. Running /sparkc profiler cancel just tells me in the game chat that there isn't an active profiler running.

There are no files in my tmp folder.

Here's my latest.log: https://mclo.gs/YkTGNjB All I did was load into a previously generated world, run the spark profiler, and exit. I did also add a couple of mods after entering the world today. Generating a new world does not stop the error.

Here's the error I get:

[15:06:45] [spark-worker-pool-1-thread-3/ERROR]: Exception occurred whilst executing a spark command
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]: java.lang.IllegalStateException: Another profiler is already active: me.lucko.spark.common.sampler.async.AsyncProfilerJob@5ac761d8
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at me.lucko.spark.common.sampler.async.AsyncProfilerJob.createNew(AsyncProfilerJob.java:64)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at me.lucko.spark.common.sampler.async.AsyncProfilerAccess.startNewProfilerJob(AsyncProfilerAccess.java:103)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at me.lucko.spark.common.sampler.async.AsyncSampler.start(AsyncSampler.java:94)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at me.lucko.spark.common.sampler.SamplerBuilder.start(SamplerBuilder.java:147)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at me.lucko.spark.common.command.modules.SamplerModule.profilerStart(SamplerModule.java:251)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at me.lucko.spark.common.command.modules.SamplerModule.profiler(SamplerModule.java:146)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at me.lucko.spark.common.SparkPlatform.executeCommand0(SparkPlatform.java:430)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at me.lucko.spark.common.SparkPlatform.lambda$executeCommand$2(SparkPlatform.java:339)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[15:06:45] [spark-worker-pool-1-thread-3/INFO]: [STDERR]:   at java.base/java.lang.Thread.run(Thread.java:833)
Treeways commented 1 year ago

As stated in the log file linked above, I'm running Quilt 1.20.1 with a variety of different mods. It's possible that one of them is interacting with spark in a weird way... hard to tell.

Treeways commented 1 year ago

I disabled all of my mods besides Spark and Fabric API - the bug still happens, same error. I seem to have gotten the mod into a state where I just can't profile again...

However, I did find the cause of the error! Disabling the background profiler in the config with "backgroundProfiler": false lets me start a /sparkc profiler again in async mode.
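
For reference, this is the change I made - a minimal sketch, assuming the config lives at the default Fabric/Quilt location of config/spark/config.json (the path may differ per loader); "backgroundProfiler" is the only key I touched:

{
  "backgroundProfiler": false
}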

(Why it worked the first time, and never again until disabling the background profiler, is beyond me.)

lucko commented 1 year ago

Ah yeah - because the default profiler is running on the server (/spark), you'll need to disable it there before starting a profile on the client.

e.g.

/spark profiler cancel
/sparkc profiler start

There are probably some things that can be done in spark to make this a bit more intuitive/automated.

Treeways commented 1 year ago

I tried that sequence of commands, but oddly enough the start command still didn't work even after that. I'm not sure how it happened, but it seemed like the background profiler just wasn't stopping, or I got it into some other weird state...