Adds Spark Support! #7035
Conversation
I'm sure you have enough comments to work on already so I won't add anything to that.
I think this is a nice idea, but I am a little concerned by the 'weirdness' of Spark's actual statistics collection & implementation. It feels like the stats that can be provided are pretty limited and I'm not sure how much a user will be able to do with this information (aside from printing out exactly what they're given).
Do you think this might be better for an addon?
I don't think this should be done as an addon, since Spark is built into most server jars (other than Spigot).
suggested changes + introduces rounding for better accuracy
A couple of strange things from a design standpoint, plus other small things.
public Double[] get(DoubleAverageInfo... infos) {
	if (this.type == MsptStatisticType.PERCENTILE)
		return Arrays.stream(infos).map(info -> type.getPercentileFunction().apply(info, value)).toArray(Double[]::new);
	return Arrays.stream(infos).map(type.getFunction()).toArray(Double[]::new);
Would it be better for this condition to be performed by MsptStatisticType? i.e. have a method in there that takes in the info and the value.
Here, that would simply be:
Arrays.stream(infos).map(info -> type.getProperty(info, value)).toArray(Double[]::new);
The enum, getProperty(...):
if (function != null) return function.apply(info)
return percentileFunction.apply(info)
Just an ownership concern (the enum should generally own and control how its own fields are used).
Also, while the enum is closed, it would still benefit from having null safety around percentileFunction and function. Especially in this scenario, it took me a while to realize there's a contractual reason it will never actually hit a null here.
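For illustration, a minimal sketch of that ownership change might look like the following. The enum name and the function/percentileFunction fields come from the diff above; the nested DoubleAverageInfo interface is only a stand-in for Spark's type, and the accessors it exposes (mean(), percentile(double)) are assumptions, not the PR's actual code.

import java.util.function.BiFunction;
import java.util.function.Function;

public enum MsptStatisticType {

	MEAN(info -> info.mean()),
	PERCENTILE((info, value) -> info.percentile(value));

	// Exactly one of these is non-null for any given constant.
	private final Function<DoubleAverageInfo, Double> function;
	private final BiFunction<DoubleAverageInfo, Double, Double> percentileFunction;

	MsptStatisticType(Function<DoubleAverageInfo, Double> function) {
		this.function = function;
		this.percentileFunction = null;
	}

	MsptStatisticType(BiFunction<DoubleAverageInfo, Double, Double> percentileFunction) {
		this.function = null;
		this.percentileFunction = percentileFunction;
	}

	// The enum decides which of its own functions applies, so callers
	// never have to special-case PERCENTILE themselves.
	public double getProperty(DoubleAverageInfo info, double value) {
		if (function != null)
			return function.apply(info);
		return percentileFunction.apply(info, value);
	}

	// Stand-in for Spark's DoubleAverageInfo, reduced to the two accessors used in this sketch.
	public interface DoubleAverageInfo {
		double mean();
		double percentile(double percentile);
	}
}

With something like that in place, the get(...) method above collapses to the single stream-and-map line the review suggests.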
case 1 -> "1 minute";
case 2 -> "15 minutes";
case 3 -> "all available periods";
default -> throw new IllegalStateException("Invalid CPU usage expression");
Maybe it's just me, but it's a little scary when a toString() could throw an exception.
Well, what should I put in the default then?
This may be something to quickly ask the team: whether they prefer keeping exceptions out of "special" methods like toString(). My reasoning is that it helps immensely with debugging if I can always view a quick representation of an object in a debugger.
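As a hedged sketch of the non-throwing alternative being discussed: only the three cases are taken from the snippet above; the pattern field, the class name, and the exact wording of the strings are invented here for illustration.

public class ExprCPUUsageToStringSketch {

	private final int pattern;

	public ExprCPUUsageToStringSketch(int pattern) {
		this.pattern = pattern;
	}

	@Override
	public String toString() {
		// Fall back to a descriptive string instead of throwing, so a debugger
		// can always render a quick representation of the object.
		return switch (pattern) {
			case 1 -> "cpu usage in the last 1 minute";
			case 2 -> "cpu usage in the last 15 minutes";
			case 3 -> "cpu usage in all available periods";
			default -> "cpu usage (unexpected pattern " + pattern + ")";
		};
	}
}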
/**
 * Utility class for accessing Spark's API.
 */
public class SparkUtils {
I think this class would benefit from having a better lifecycle, mainly to orchestrate its state a bit better.
What I recommend is adding and refactoring the following:
- Rename it to something else; this is hardly a utility class, it's a stateful singleton.
- Add an initialize() method that consumes the desired Statistic classes (process cpu usage, system cpu usage, mspt, tps)
- Add a dispose() method that de-initializes the state and prepares this singleton to be re-initialized at another time
- Manage this lifecycle from somewhere, probably SparkModule. Ultimately, onEnable should call initialize() and onDisable should call dispose()
This will clean up pretty much all of your synchronized blocks, which barely needed to be there in the first place.
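For what it's worth, a rough sketch of that lifecycle could look like the following. This is not the PR's implementation: the class name, the method names, and the assumption that Spark's documented SparkProvider.get() entry point is used are all illustrative, and a real version would presumably cache the individual Statistic handles rather than the bare Spark instance.

import me.lucko.spark.api.Spark;
import me.lucko.spark.api.SparkProvider;

public final class SparkStatistics {

	private static Spark spark; // null whenever the module is not initialized

	private SparkStatistics() {
		// static holder, no instances
	}

	// Called from the module's onEnable (e.g. SparkModule) once Spark is known to be present.
	public static void initialize() {
		spark = SparkProvider.get();
		// ...fetch and cache the desired Statistic handles (cpu, mspt, tps) here...
	}

	// Called from onDisable; clears state so the class can be initialized again later.
	public static void dispose() {
		spark = null;
	}

	// Accessor used by the expressions; assumes initialize() has already run.
	public static Spark spark() {
		if (spark == null)
			throw new IllegalStateException("Spark support has not been initialized");
		return spark;
	}
}

With the module owning initialize() and dispose(), the expressions can read from this holder directly and the lazy-init synchronized blocks become unnecessary.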
I don't even know how to do all of that, but it works fine as it is, so I will leave it.
part 1, waiting on Sovde to reply
I don't think you should be resolving conversations right after you reply when it seems a discussion should be had, rather than replying "I can't/won't do this" and then marking them resolved.
I only marked them as resolved because it works fine on its own, and with almost a year of Java experience I only know so much and figured it was fine the way it was. I left some unresolved because they need Sovde's input, as this PR was entirely recoded by Sovde and I don't really understand his thinking/thought process in some of this, even after looking at the code over and over again.
I am marking this PR as a draft in case someone else wants to take it over and improve it further. Once that happens, I will close this PR. You can view the entire conversation here.
Update:
Description
This PR simply adds features from Spark into Skript!
Target Minecraft Versions: 1.19+
Requirements: Spark plugin
Related Issues: none