
Continuous Difficulty Calculation #86

Open

tr3sleches opened this issue Feb 28, 2019 · 15 comments

Comments

@tr3sleches

tr3sleches commented Feb 28, 2019

(It's technically not continuous because of the Heaviside step functions involved when adding spacing weights, but whatever)
About 2 days ago someone on reddit mentioned to me that Xexxar was trying to do per-hit-object difficulty calculation. I was surprised that this hadn't been done before, so I decided to take a crack at it.

I'm going to try to keep this as simple as I can

Using the decay and strains, you can easily create and solve a diffEQ to model strain.

ds/dt = ln(0.15)/1000 * s + 26.25 * sum( (D_i^0.99 / ∆t_i) * δ(t - t_i) )
where δ is the Dirac delta,
∆t_i is the delta time associated with hit object i,
t_i is the time of hit object i,
D_i is the distance associated with hit object i,
and s is strain
(you can actually use this to create a slider rework but that is a discussion for another time)
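As a concrete sketch (not from the original post; D and t are hypothetical input vectors of distances and hit-object times in ms), the piecewise solution of that diffEQ is exponential decay between objects plus a jump at each Dirac delta:

```r
# Strain immediately after each hit object, from the ODE's piecewise solution:
# between objects s decays as s * 0.15^(elapsed/1000); each delta adds a jump.
strain_at_objects <- function(D, t) {
  dt <- c(t[1], diff(t))          # delta time per object (first measured from 0)
  s <- numeric(length(t))
  for (i in seq_along(t)) {
    decayed <- if (i == 1) 0 else s[i - 1] * 0.15^((t[i] - t[i - 1]) / 1000)
    s[i] <- decayed + 26.25 * D[i]^0.99 / dt[i]  # jump from the delta at t_i
  }
  s
}
```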

From there, I created sets of strains from each hit object until right before the next one. Note that these sets are continuous along the real number line. I took all the infimums and supremums of those sets and used them to calculate the frequency with which a certain strain occurs (this is probably the thing I'm most proud of in this entire thing). The frequency shows how many times the strain function hits a certain strain: it is the number of elements in the intersection between the collection of strain sets and a certain strain level. Then I inverted the homogeneous portion of the strain diffEQ to find the differential time at each strain.
ds/dt=ln(0.15)/1000*s => dt=-1000/ln(0.15) * ds/s
you have to add a negative because why would you want negative time?
The strain function spends dt time at strain s.
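Integrating that between two strain levels s_1 < s_2 gives the time the decay spends between them, which is where the logarithm in the CDF below comes from:
∆t = -1000/ln(0.15) * integral from s_1 to s_2 of ds/s = -1000/ln(0.15) * ln(s_2/s_1)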
To get the probability density function, you multiply the frequency of a strain by the differential time, then divide the whole thing by the total time of the beatmap in milliseconds. You integrate that to get the cumulative distribution function.
Something like:
CDF(s) = -1000/(T*ln(0.15)) * sum( frequency(s) * H(s - x_i) * ln(min(s, x_(i+1))/x_i) )
which is the probability that the strain in the beatmap is less than or equal to s
where T = total time of beatmap
s = strain
x_i & x_(i+1) are discrete strains from hit objects
H(x) is the Heaviside step function
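A minimal R sketch of that CDF (assuming hypothetical vectors infimum and supremum holding the endpoints of each strain set, and T = map length in ms):

```r
# P(strain <= s): each strain set [infimum[i], supremum[i]] contributes the
# time the decay spends below s within it, i.e. k * ln(min(s, sup_i)/inf_i).
cdf <- function(s, infimum, supremum, T) {
  k <- -1000 / (T * log(0.15))    # dt = k * ds/s, already divided by total time
  active <- infimum < s           # the Heaviside factor H(s - x_i)
  sum(k * log(pmin(s, supremum[active]) / infimum[active]))
}
```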
After this, you either space the probabilities out evenly and sum to get your weighted strain, or integrate:
weighted avg strain = integral( s * 0.9^((1-CDF(s))*N) ds ) / integral( 0.9^((1-CDF(s))*N) ds )
which is fairly easy because a lot cancels (and a lot is constant), and the exponential of a logarithm is a power.
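Numerically, the ratio can be evaluated on an even strain grid (a sketch reusing the hypothetical cdf() above; the grid spacing ds cancels in the ratio, so plain sums suffice):

```r
# Weighted average strain: both integrals on the same even grid, so ds cancels.
weighted_avg_strain <- function(infimum, supremum, T, N, steps = 10000) {
  s <- seq(min(infimum), max(supremum), length.out = steps)
  cdf_vals <- sapply(s, cdf, infimum = infimum, supremum = supremum, T = T)
  w <- 0.9^((1 - cdf_vals) * N)   # weight on each strain level
  sum(s * w) / sum(w)
}
```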
I already implemented this in R.
Edit: Note that all of the 0.15 can be replaced with 0.3 for speed or any other decay constant
Edit: fixed particular solution in first EQ
Edit 3: fixed weighted strain integral

@tr3sleches

tr3sleches commented Feb 28, 2019

Here is a sample of what a CDF would look like:
[image: sample CDF plot]
Here is an example of what the frequency function would look like:
[image: sample frequency plot]

Edit: Forgot to multiply the distances by the radius normalization multiplier. Doesn't change the concept though.

@ThereGoesMySanity

Here are some LaTeX versions of your equations for readability. Let me know if I got anything wrong.
[images eq1-eq4: LaTeX renderings of the four equations above]

@abraker95

Using the decay and strains, you can easily create and solve a diffEQ to model strain.
ds/dt = ln(0.15)/1000*s+ sum( (D_i^0.99/∆t_i) * H(t-t_i) )

Please explain how you got here.

@tr3sleches

Using the decay and strains, you can easily create and solve a diffEQ to model strain.
ds/dt = ln(0.15)/1000*s+ sum( (D_i^0.99/∆t_i) * H(t-t_i) )

Please explain how you got here.

...That’s literally just converting the strains into a function you can manipulate. It's kind of like the code in formulaic notation. I just turned the discrete equation into a continuous one.

@tr3sleches

tr3sleches commented Feb 28, 2019

Here are some LaTeX versions of your equations for readability. Let me know if I got anything wrong.

[images eq1-eq4: LaTeX renderings of the four equations above]

I forgot to put 26.25 in front of the sigma.
I also forgot that the Heaviside is supposed to be a Dirac delta in the first equation.
This doesn’t matter that much, but I wrote the arrow with the equal sign on purpose because I wanted it to be a mathematical implication. Doesn’t affect things too much.
I changed the weighted strain integral as well.

@tr3sleches

Repository with some of the files:
https://github.com/tr3sleches/Continuous-Difficulty-Calculation/
I'll organize it into folders later.
In R just enter the map name as the argument to strainAimFind or strainSpeedFind
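For example (hypothetical map name; the exact format is whatever the repo's scripts expect):

```r
strainAimFind("some map [Insane]")
strainSpeedFind("some map [Insane]")
```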

@tr3sleches

Here are some aim star changes
Note that the "after" star ratings do not include the angle bonus (because it's super broken):
These are all nomod
Fiery's Lonely Go: 3.79 --> 3.316
Yume Tourou Taki: 2.85 --> 2.325
Yume Tourou Mitsuha: 2.81 --> 2.323
Yume Chizu [My Dream]: 4.4 --> 3.684
Shiori [Apex]: 3.01 --> 2.647
Sotarks pp Compilation: 3.12 --> 2.607
Louder than Steel [ok this is epic]: 3.9 --> 3.016
Kimi no Bouken (Speed up Ver.) [pp adventure]: 5.5 --> 4.75
Sotarks Raise my Sword: 3.38 --> 2.771

@hi-mei

hi-mei commented Mar 6, 2019

SeemsGood

@tr3sleches

tr3sleches commented Mar 6, 2019

I wrote explanations for the process from start to finish. Hope this clears some things up.
[images: nine pages of step-by-step explanation of the process]

@tr3sleches

tr3sleches commented Mar 6, 2019

Length rework
Not sure if this should be its own separate issue, but since these are similar, I’ll post it here instead.
After I calculate the SR with the above, I will have all that I need to make a length bonus. I basically weighted normalized strain values.
a and b are the minimum and maximum multiplier, respectively.
[image: length bonus formula]

@joseph-ireland

I've gone through your post/R code and checked through all the maths. Amazing. BTW you had to add a negative in the ds formula because you swapped the limits of integration around :)

For those less mathematically inclined, he's very cleverly calculating what the current code would do with an infinitely small section_length, rather than a section_length of 400ms.

It'd actually be a good way to test if everything is working. In the original C# code, set the section_length to 1ms or so and fix the DifficultyValue function to be independent of section_length (just change the 0.9 to 0.9^(section_length/400) and rescale at the end). It'll be slower than what you've got, but it should give a very similar result.
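For illustration, an R analogue of that change (not the actual C#; peaks is a hypothetical vector of per-section strain peaks): the per-section weight becomes 0.9^(section_length/400), which reduces to 0.9 at the current 400ms, and the sum is rescaled at the end so shorter sections don't inflate it:

```r
# DifficultyValue analogue that is (approximately) independent of section_length.
difficulty_value <- function(peaks, section_length = 400) {
  w <- 0.9^(section_length / 400)               # per-section decay weight
  peaks <- sort(peaks, decreasing = TRUE)
  sum(peaks * w^(seq_along(peaks) - 1)) * section_length / 400  # rescale at the end
}
```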

BTW I think I've found a subtle error: you're using decay[i] to calculate infimum[i], but that's based on the delta time from the previous hit object to the current one, not the time from the current to the next. I think you should use decay[i+1]. You'd have to add a bit of time onto the end so that the last strain has time to decay and contribute to the integral (or you could take the limit to infinity for even more fun!).

There's also one weirdness in your implementation: when calculating the CDF you recalculate the total from scratch every time rather than reusing the partial sums to do it in one pass (see the sketch below). I also don't like the is.element() stuff; I'd rather have a separate column in the sorted data to distinguish inf and sup, but that doesn't matter as much, and I guess it's just a prototype, so neither of these is a big deal.
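Something like this is what I mean by one pass (a sketch, not the prototype's actual code; x = sorted strain breakpoints, freq = frequency on each interval [x_i, x_(i+1)], T = map length in ms):

```r
# CDF at every breakpoint in one pass: cumsum() the per-interval time
# contributions instead of recomputing the total from scratch for each strain.
cdf_one_pass <- function(x, freq, T) {
  k <- -1000 / (T * log(0.15))
  contrib <- k * freq * log(x[-1] / x[-length(x)])  # time mass on each interval
  c(0, cumsum(contrib))                             # running CDF at breakpoints
}
```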

Also there was some stuff on discord about removing the breaks to get a more accurate result. I don't think that should affect things, since the lowest strains should have negligible weight. I think you're forgetting to update the total time somewhere when you're manually truncating sections, so you're kind of stretching the map out and weighting the hard stuff more than you should.

Now for my personal opinion:
I don't like the length bonus for a few reasons. It still depends on object count, so long stream maps get more boost than long aim maps. Also, I don't feel like s_min should be there. It's probably going to be close to 0 anyway, but if it weren't, that wouldn't make the map easier; it would be better to leave it out.
That pretty much leaves new_bonus = old_bonus * (x + y*(star_rating/0.0675)^2/peak_strain), but that doesn't seem like a good measure of map length to me.

I'm also unsure if this whole thing is a good idea to begin with. It's insanely more complex than what we've currently got and will potentially make future changes much harder. There'll be work to do rebalancing to take into account the decreased star ratings, and there isn't even necessarily a concrete gain vs the old system. And even if there was, we could almost fully realise it by just decreasing section_length a bit. This is definitely more elegant, but in software engineering, simplicity trumps elegance every time.

I'd say it's more of a performance optimisation, if it is decided that decreasing section_length is even a good idea. We'd need to benchmark both and show that the speedup is worth the increased complexity.

@tr3sleches

tr3sleches commented Mar 8, 2019

joseph-ireland, thanks for the response.
I did switch the order of integration; I was not thinking about the fact that the homogeneous function is always decreasing, so the lower time would be on top and the higher time on the bottom.
I know my R code is iffy as hell (I'm still a beginner at R).
You are right about that error though: decay[i] does need to be changed to decay[i+1]. But 1) adding time at the end will mess up the integral, because maps are not played after the last object, and that would award star rating where star rating is not due, and 2) a limit to infinite time would literally span the map over infinite time and mess up the beatmap time I have for the sum of the weighted strains integral. Instead, to fix that, I will change decay[i] to decay[i+1] and add a strain equal to 1 at the time of the first object. I will then have strains[1] = 1, supremum[1] = 1, and infimum[1] = decay[1], and concatenate these values with their respective vectors, as in the sketch below.
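Roughly, in the prototype's vector names (a sketch of the fix, not the final code):

```r
# Use decay[i+1] for infimum[i] (gap to the *next* object), and prepend a unit
# strain at the first object so the opening decay still contributes.
apply_fix <- function(strains, supremum, decay) {
  infimum <- strains * c(decay[-1], 1)  # decay[i+1]; no time after the last object
  list(strains  = c(1, strains),
       supremum = c(1, supremum),
       infimum  = c(decay[1], infimum))
}
```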

Also there was some stuff on discord about removing the breaks to get a more accurate result. I don't think that should affect things, since the lowest strains should have negligible weight.
While the lower strains have negligible weight, it's the time, which affects everything else, that is the problem. I don't remember forgetting about the time somewhere, but it shouldn't be a problem. I'll look into it.

I took another look at the length bonus, and I admit it's not really all the way there and doesn't address all the problems length bonus currently has. Though I think the (x + y*(star_rating/0.0675)^2/peak_strain) part can be used as part of a future rework, as it balances well relative to the strains on the map.

I'm also unsure if this whole thing is a good idea to begin with. It's insanely more complex than what we've currently got and will potentially make future changes much harder. There'll be work to do rebalancing to take into account the decreased star ratings, and there isn't even necessarily a concrete gain vs the old system. And even if there was, we could almost fully realise it by just decreasing section length a bit. This is definitely more elegant, but in software engineering, simplicity trumps elegance every time.

I argue the opposite.

  1. The reworks right now are so simple they will fit in effortlessly. Since osu code is object-based, I'm going to have to assign those values to the function anyway, so I could add the bonus to the strains at the objects as necessary. Most (I won't say all) reworks others will come up with will change the strain value at an object in some way, which is a non-issue in my system, as you can still do the same.
  2. This allows fewer restrictions and limits on reworks, and more reworks derived from the ground up. For example, a little while ago I derived slider aim as infinitesimally small jumps. I can use the code they did to get travel distance, divide that by the slider duration to get a slider velocity, and convolve that with the decay exponential; then plug in the time elapsed (plus decay time after finishing, if it's there) and feed that into whatever object is next (https://www.desmos.com/calculator/spo0yz3k2g; hopefully this will let you see what I mean, and see also the sketch after this list).
    Boom, you've got slider aim. While you can do this with chunks, it is much more annoying. When things are more continuous like this, mathematical reasoning can be applied directly. I actually was discussing rhythmic complexity with abraker and came up with a solution. While it can be implemented in chunks, it's much more seamless when you go from object to object instead of finding max strain within intervals.
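Here is a sketch of that slider-aim idea in R (hypothetical function and names; lambda is the aim decay constant from the diffEQ). A constant slider velocity v driven into ds/dt = lambda*s + v gives s = (v/-lambda)*(1 - e^(lambda*L)) at the end of a slider of duration L ms, which then decays over the gap to the next object:

```r
# Slider aim as infinitesimally small jumps: convolve the slider velocity with
# the decay exponential, then let the result decay until the next object.
slider_strain_carry <- function(travel_dist, duration, gap) {
  lambda <- log(0.15) / 1000                      # aim decay constant per ms
  v <- travel_dist / duration                     # slider velocity
  s_end <- (v / -lambda) * (1 - exp(lambda * duration))  # convolution result
  s_end * exp(lambda * gap)                       # remaining strain at next object
}
```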

Thank you for your comment; I am so glad to get specific critiques, because it's my first time doing something like this. I would be glad to discuss this more in the future.

@joseph-ireland

joseph-ireland commented Mar 9, 2019

While the lower strains have negligible weight, it's the time, which affects everything else, that is the problem. I don't remember forgetting about the time somewhere, but it shouldn't be a problem. I'll look into it.

The total time shouldn't be a problem. Suppose you add a zero-strain section the length of the whole map. That would double the length, i.e. N = 2*prev_N, and the CDF will be squished up, i.e. CDF(0) = 0.5 and CDF(s) = 0.5*(1 + prev_CDF(s)).

If you plug both of those into integral( s * 0.9^(N*(1-CDF(s))) ds ), the exponent becomes
N*(1-CDF(s)) = 2*prev_N*(1 - 0.5*(1 + prev_CDF(s))) = prev_N*(1 - prev_CDF(s)),
so the integral is unchanged if you add blank sections. In reality breaks will have a very small strain, not zero, but the results will be extremely close.

You are right about that error though: decay[i] does need to be changed to decay[i+1]. But 1) adding time at the end will mess up the integral, because maps are not played after the last object, and that would award star rating where star rating is not due.

I feel like this is a bit disingenuous. It's a continuous function: you spend infinitely more time not pressing buttons in between pressing them, and you are counting all the gaps in between as star rating, so why not the gap after the last note? As per the previous point, adding empty sections doesn't affect the end result, so you're pretty much just clipping off the last note for no reason. Yes, a limit to infinity is a bit silly, but it's doable! Not worth the hassle though, since it'll hardly change the end result. I'm not sure about adding strains[1] = 1; it doesn't seem like it'd change anything significantly, but I don't see the reasoning either.

The reworks right now are so simple they will fit in effortlessly.

I'm showing some bias here, but my star rating changes here, which replace the length bonus, don't fit in effortlessly, because they get rid of the 0.9^n stuff entirely. I'm sure something similar could fit in, but again, I'm not sure if it's worth it. I've not had much time to work on it, but I've recently made some balancing tweaks that I'm close to pushing up, and I'm very pleased with the results so far.

For example, a little while ago, I derived slider aim as infinitesimally small jumps....

I can see where you're going with that, but I'll be honest, I clicked the link and I've got no idea what's going on there. I'm sure you could explain it to me in 15 minutes face-to-face with a blackboard, but I'm confident I could understand the current slider logic just by reading the code, and you're not always going to be there to explain stuff to future maintainers.

If I were you, I'd look to make some nicely typeset documents explaining your derivations, let's say at a level where a first/second-year maths undergrad could understand them. Don't skip any steps, and link to relevant web pages where people can find out more about the particular methods you are using. Otherwise most people aren't going to have a clue what's going on, and it's hard for the osu team to merge in code when they have no idea how it works and no way of verifying that it's working as intended. I guess I should also follow my own advice here :)

BTW, I'm more of an outsider than you here, so everything I've said is just my honest opinion, which doesn't really carry much weight on its own.

edit: formatting

@tr3sleches

tr3sleches commented Mar 9, 2019

I guess what I’m afraid of is something like this:
Suppose there is a diff spike at the end of a map such that the last object has the highest strain. If you take into account time after this, say to infinity, every single strain would have a frequency added to it despite the song not increasing in difficulty in any way.

Infinite time is not relevant, looking back at it. I just forgot that the time part cancels when you divide it by the weight, because I look at those two in separate areas of my code.

I'm showing some bias here, but my star rating changes here, which replace the length bonus, don't fit in effortlessly, because they get rid of the 0.9^n stuff entirely. I'm sure something similar could fit in, but again, I'm not sure if it's worth it. I've not had much time to work on it, but I've recently made some balancing tweaks that I'm close to pushing up, and I'm very pleased with the results so far.

I guess, but even so, that bias was not without merit, because it remains true (thus far). I just tried yours out. It's even simpler than mine, tbh. It was very straightforward and worth it. I really should make a step-by-step guide for transforming strain sums (and functions of them) into continuous functions. Since this is a difficulty calculation, naturally you would replace the one I had with yours (I just did the equivalent of the old system, after all). After following all of my steps up until finding dt in terms of d(strain) (or whatever variable you have for that), apply your formula.

[image: the resulting continuous formula]

Assuming it doesn’t overload, it’s very doable.

I can see where you're going with that, but I'll be honest, I clicked the link and I've got no idea what's going on there. I'm sure you could explain it to me in 15 minutes face-to-face with a blackboard, but I'm confident I could understand the current slider logic just by reading the code, and you're not always going to be there to explain stuff to future maintainers.

I know; that was just me graphing and testing it out, not an explanation of it at all. I just wanted to demonstrate how easy it is to solve problems such as slider strain by extrapolating the information we have when it is continuous.

If I were you, I'd look to make some nicely typeset documents explaining your derivations, let's say at a level where a first/second-year maths undergrad could understand them. Don't skip any steps, and link to relevant web pages where people can find out more about the particular methods you are using. Otherwise most people aren't going to have a clue what's going on, and it's hard for the osu team to merge in code when they have no idea how it works and no way of verifying that it's working as intended. I guess I should also follow my own advice here :)

I guess, though I’m a third-year undergrad myself, so I don’t think I have a good idea of what a 1st or 2nd year undergrad’s knowledge would be. A year ago, the only thing I really did not know in my original comment was the statistics and some of the topology stuff. Everything else is diffEQ or lower, which I think is 2nd year level (right?).
I might make a document of it once I finish everything on this (including the C# part).

@tr3sleches

tr3sleches commented Mar 9, 2019

The equation for your diff system in a continuous manner is:

[images: the continuous form of the equation]
