
Unable to use spark: permission not registered: spark.all #448

Open · CynicalBusiness opened this issue Aug 25, 2024 · 2 comments
Labels: bug (Something isn't working)

CynicalBusiness commented Aug 25, 2024

Description

We currently have a Forge-powered server running behind a Velocity proxy, but are unable to use Spark on the server for profiling.

We want to profile the Forge server itself, but running any spark command results in an exception (see below). We are using LuckPerms and have granted the relevant users both OP and every permission we could think of (spark.all, spark.*, spark, spark.profiler, etc.), but none of them work; an example of the grants we tried is below. It also does not work from the server console, for the same reason.
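
For reference, the grants were attempted with LuckPerms commands roughly like the following (the player name is a placeholder and this is illustrative of what we tried, not an exact transcript):

    /op SomePlayer
    /lp user SomePlayer permission set spark.all true
    /lp user SomePlayer permission set spark.* true
    /lp user SomePlayer permission set spark.profiler true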

While sparkv and sparkc do work fine, they profile the proxy server and client, respectively, which does not help us.

Reproduction Steps

  1. Execute spark profiler start
  2. See error in server console

Expected Behaviour

Spark's profiler starts

Platform Information

  • Minecraft Version: 1.20.1
  • Platform Type: Server/Proxy
  • Platform Brand: Forge
  • Platform Version: Forge 47.2.20 / Velocity 3.3.0

Spark Version

v1.10.53

Logs and Configs

The exception stack trace is as follows:

[21:34:40] [spark-worker-pool-1-thread-2/ERROR] [spark/]: Exception occurred whilst executing a spark command
java.lang.IllegalStateException: spark permission not registered: spark.all
	at TRANSFORMER/spark@1.10.53/me.lucko.spark.forge.plugin.ForgeServerSparkPlugin.hasPermission(ForgeServerSparkPlugin.java:204)
	at TRANSFORMER/spark@1.10.53/me.lucko.spark.forge.ForgeCommandSender.hasPermission(ForgeCommandSender.java:76)
	at TRANSFORMER/spark@1.10.53/me.lucko.spark.common.command.CommandResponseHandler.lambda$allSenders$0(CommandResponseHandler.java:74)
	at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:178)
	at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
	at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
	at java.base/java.util.stream.Streams$ConcatSpliterator.forEachRemaining(Streams.java:734)
	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
	at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682)
	at TRANSFORMER/spark@1.10.53/me.lucko.spark.common.command.CommandResponseHandler.allSenders(CommandResponseHandler.java:75)
	at TRANSFORMER/spark@1.10.53/me.lucko.spark.common.command.CommandResponseHandler.broadcast(CommandResponseHandler.java:92)
	at TRANSFORMER/spark@1.10.53/me.lucko.spark.common.command.CommandResponseHandler.broadcastPrefixed(CommandResponseHandler.java:112)
	at TRANSFORMER/spark@1.10.53/me.lucko.spark.common.command.modules.SamplerModule.profilerStart(SamplerModule.java:229)
	at TRANSFORMER/spark@1.10.53/me.lucko.spark.common.command.modules.SamplerModule.profiler(SamplerModule.java:144)
	at TRANSFORMER/spark@1.10.53/me.lucko.spark.common.SparkPlatform.executeCommand0(SparkPlatform.java:431)
	at TRANSFORMER/spark@1.10.53/me.lucko.spark.common.SparkPlatform.lambda$executeCommand$2(SparkPlatform.java:336)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:840)
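
For context, based on the exception message and the frame at ForgeServerSparkPlugin.java:204, the check that throws appears to look the permission up among the nodes spark registered with Forge's PermissionAPI and to fail when the node is missing. A rough sketch of that shape (the field and exact calls here are assumptions for illustration, not the actual spark source):

    // Illustrative sketch only; registeredPermissions and the PermissionAPI call are assumptions.
    public boolean hasPermission(CommandSource sender, String permission) {
        PermissionNode<Boolean> node = this.registeredPermissions.get(permission); // assumed lookup map
        if (node == null) {
            // the branch we appear to be hitting for "spark.all"
            throw new IllegalStateException("spark permission not registered: " + permission);
        }
        if (sender instanceof ServerPlayer) {
            return PermissionAPI.getPermission((ServerPlayer) sender, node);
        }
        return true; // console and other non-player senders
    }

If that is roughly right, the node for spark.all is never being registered on the Forge side on this setup, independent of anything we grant through LuckPerms.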

Spark's permission nodes as shown in the LuckPerms tree:

[image: LuckPerms permission tree showing spark's permission nodes]

Indeed, spark.all is not present.

Extra Details

All configs are at their defaults.
This occurred after we moved the server to new hardware; it had worked fine before that, AFAIK.

CynicalBusiness added the bug label Aug 25, 2024

lucko (Owner) commented Aug 26, 2024

You're using an old version of spark; this issue should be fixed in newer versions.

lucko closed this as completed Aug 26, 2024

CynicalBusiness (Author) commented Aug 28, 2024

You're using an old version of spark; this issue should be fixed in newer versions.

We are running the latest version of Spark available for this version of Minecraft. Even so, after some testing, the issue persists on the latest version with MC 1.21.

We were able to "solve" the issue by patching the offending method with a bit of a bodge, but it works well enough to let us use Spark again:

    public boolean hasPermission(CommandSource sender, String permission) {
        // Skip spark's permission-node lookup entirely: players just need vanilla
        // OP permission level 2, and non-player senders (e.g. the console) are always allowed.
        if (sender instanceof ServerPlayer) {
            return ((ServerPlayer) sender).hasPermissions(2);
        } else {
            return true;
        }
    }

lucko reopened this Aug 28, 2024