I need help with some math.....

Suppose I need to decrease a value from 255 to 0, in x amount of milliseconds. What amount of value would I need to decrease each millisecond so it is 0 when the x amount of milliseconds have passed?
Sorry I'm not that good at math

@AHK1221 Divide 255 by the total number of milliseconds and then multiply that number by the number of milliseconds that elapse each frame and subtract it from your running total.
That seems to work in my head.

@LeeC2202 Seems to work after doing some calcs on the calculator, will test in code. Thanks!!

I'm not good at math either, but I'm assuming you mean: what amount would you need to decrease per second, so it is zero when x milliseconds have passed. If you literally mean what value to decrease per millisecond, then you could end up dealing with microseconds, and there are 1,000 µs (microseconds) in a single millisecond; that's where it starts to get confusing for me. Reducing 255 to 0 would take 255 microseconds if decreased at a rate of 1 per microsecond, and half that time at a rate of 2 per microsecond. I'm pretty sure I'm correct, but like I said, I'm not good at math. 😂
So let's see here: we have an integer of 255. If we take 1 from it every 1000 milliseconds, it will reach 0 in 255,000 milliseconds, or 255,000,000 microseconds. If we take 1 from it every 500 milliseconds, it will reach 0 in half the time [now divide 255 by 2 and get 127.5].
255; 1/s (1000 ms) = [4.250 minutes] 255 s {255,000 ms}
255; 2/s (1000 ms) = [2.125 minutes] 127.5 s {127,500 ms}
255; 3/s (1000 ms) = [1.417 minutes] 85 s {85,000 ms}
255; 4/s (1000 ms) = [1.063 minutes] 63.75 s {63,750 ms}
255; 5/s (1000 ms) = [0.850 minutes] 51 s {51,000 ms}
So those are the times it would take to reach 0, depending on how much you reduce 255 by every 1000 ms:
Reducing 255 to 0 takes 255,000 milliseconds if reduced by 1 every 1000 milliseconds.
Reducing 255 to 0 takes 127,500 milliseconds if reduced by 2 every 1000 milliseconds.
Reducing 255 to 0 takes 85,000 milliseconds if reduced by 3 every 1000 milliseconds.
Reducing 255 to 0 takes 63,750 milliseconds if reduced by 4 every 1000 milliseconds.
Reducing 255 to 0 takes 51,000 milliseconds if reduced by 5 every 1000 milliseconds.
Do you have any Ibuprofen now?
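Those durations can be sanity-checked with a few lines of plain arithmetic (this snippet is just an illustration, not part of anyone's script):

```csharp
using System;

class TimeToZero
{
    static void Main()
    {
        // Seconds for 255 to reach 0 when reduced by `rate` once per second.
        for (int rate = 1; rate <= 5; rate++)
        {
            double seconds = 255.0 / rate;
            Console.WriteLine("{0}/s -> {1} s ({2} ms)", rate, seconds, seconds * 1000.0);
        }
    }
}
```

For example, a rate of 3 per second gives 85 s and a rate of 4 per second gives 63.75 s.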

My approach to this problem started like this.
We know that if we have 255 and decrease it by 1 every millisecond, it will take 255 milliseconds to get down to 0.
If we divide 255 by 2, it takes half the time, because we're now reducing it twice as fast: decreased by 2 every millisecond, it takes 127.5 milliseconds.
You could start cutting the milliseconds in half if you needed to; 1 millisecond is 1000 microseconds, so you could reduce by 1 every millisecond, or by 1 every half millisecond if you can only take 1 away at a time. There are many ways to do it; it depends on exactly what you need to do.
So from that point you just divide 255 by 3, 4 or 5, and that's how many milliseconds it would take to reduce 255 to 0.
255 / 5 every millisecond = 51 milliseconds.
So, what hellacious script are you working on now? It's not that car spawning script, is it?

@JZersche He isn't looking for the time it would take for 255 to get to zero.
Time is a variable factor, 255 is a fixed constant and what he was looking for was the amount to decrease 255 by with a variable time factor.
To break that down into an amount per millisecond, you simply divide 255 by the total number of milliseconds. On a 60Hz display, each frame lasts either 16 ms or 17 ms (1000 / 60 ≈ 16.67), so multiplying by the elapsed frame time gives you the amount to decrement per display update.
Basically:
AmountToDecrementBy = (255 / TotalTimeInMilliseconds) * elapsedMillisecondsPerFrame
So no matter what you change the time to, 255 will always end up at 0 at the end of that time period.
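A minimal sketch of that formula in C# (a standalone illustration; the names here are made up, not from any particular API):

```csharp
using System;

class FadeMath
{
    // Per-frame decrement: (255 / total fade time in ms) * elapsed ms this frame.
    static float DecrementFor(float totalMs, float elapsedMs)
    {
        return (255f / totalMs) * elapsedMs;
    }

    static void Main()
    {
        float alpha = 255f;
        float totalMs = 1000f;           // fade 255 -> 0 over one second
        float frameMs = 1000f / 60f;     // a steady 60Hz frame, ~16.67 ms

        // Simulate one second of 60Hz frames.
        for (int frame = 0; frame < 60; frame++)
            alpha -= DecrementFor(totalMs, frameMs);

        Console.WriteLine(alpha); // roughly 0 after 60 frames (~1000 ms)
    }
}
```

Changing `totalMs` changes how fast each step is, but the value always lands on 0 at the end of the chosen period.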

@LeeC2202
"What amount of value would I need to decrease each millisecond so it is 0 when the x amount of milliseconds have passed?"
You would need to decrease 1 each millisecond so it's 0 when 255 milliseconds have passed.
You would need to decrease 0.51 each millisecond so it's 0 when 500 milliseconds have passed.
You would need to decrease 0.333 each millisecond so it's 0 when 765 milliseconds have passed.
You would need to decrease 0.25 each millisecond so it's 0 when 1020 milliseconds have passed.
That's the way I read it.
The way you did it is more logical; that's just the way I would think of working it out. I couldn't condense the procedure into such a simple formula in my head like you could; I have to think about it.

@JZersche x is an unknown factor. Therefore x could be 1000 milliseconds or it could be 4000 milliseconds.
So what he is asking, is what percentage of 255 do I need to decrease by, to make sure that no matter how long x is, it is always 0 at the end of that time period.

@LeeC2202
Yeah, it's whatever; math gives me a headache. I was providing a way to figure it out by using 255 as a reference.
By starting at 255, you could divide 255, cut it in half or double it, and that's another way of working it out. I actually followed the same procedure you did, only I used seconds in the second place. The difference is that I got a headache while doing it and you likely didn't.
255; 2/s (1000 ms)
would be better interpreted as 255 / 2000 ms * (1000 ms).
Although I didn't realize that changing the second numbers to ms and multiplying by the third would give the answer to his question. But at least I did the math correctly. I barely knew what I was doing and still ended up with roughly the same order of events, sort of, lol. I think I got confused because the amount of time passing between each decrement was unclear when he said 'each millisecond': (255 / TotalTimeInMilliseconds) * elapsedMillisecondsPerFrame.

@JZersche I use a similar kind of process in my camera mod to ensure that a probe rotates through 360 degrees in a fixed period of time. One might rotate that amount every 2 seconds, another might rotate that amount in 5 seconds.
Using the elapsed time ensures that if the FPS fluctuates, the time calculations compensate to maintain a consistent movement value.
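The same pattern can be sketched like this (a standalone illustration; the numbers and names are not taken from the actual camera mod):

```csharp
using System;

class ProbeRotation
{
    // Degrees to rotate this frame so a full 360-degree sweep always takes
    // rotationPeriodMs, regardless of how long the frame took.
    static float StepFor(float rotationPeriodMs, float elapsedMs)
    {
        return (360f / rotationPeriodMs) * elapsedMs;
    }

    static void Main()
    {
        // A 2-second sweep at a steady 60Hz: 120 frames of ~16.67 ms each.
        float heading = 0f;
        for (int frame = 0; frame < 120; frame++)
            heading += StepFor(2000f, 1000f / 60f);

        Console.WriteLine(heading); // roughly 360 after 2 seconds of frames
    }
}
```

If a frame takes longer, `elapsedMs` grows and that frame's step grows to match, so the sweep still completes on time.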

@LeeC2202
I take it you're well versed in programming.
Edit: I just thought about the way you did it in more detail, and it makes perfect sense. This thread has made me 'slightly' better at math.

@JZersche I'm actually an artist by profession. I started working on games in 1985; it's something I have always been around. I learned to program to get more involved in game creation. I'm not particularly good at it, but I get by.

@LeeC2202 Well, there's been a lot of thought that went into this problem. I used your formula and it works for the most part, except it takes far longer than expected (I didn't measure, so...) when I input 1000 ms. To get the result I wanted, I had to input 20 as the timeout. Here's most of the code:
static float alpha = 255;

public static void Timer_Tick(object sender, EventArgs e, int interval)
{
    if (isShowing && timePassed <= timeout)
    {
        DrawRect(new Vector2(0, 0), 2, 2, Color.FromArgb((int)alpha, 255, 255, 255));
        float numToSubtract = (255f / timeout) * interval;
        alpha -= numToSubtract;
        timePassed += interval;
    }
    else if (timePassed >= timeout)
    {
        isShowing = false;
        timePassed = 0;
        timeout = 0;
        alpha = 255;
    }
}
timeout is the unknown factor (x) in my original question, and timePassed is... timePassed.

@AHK1221 Let me just copy this into a file so I can read it outside this scrolling box.
What values are getting passed in as interval? Are they something like 16 or 17?

@LeeC2202 For my sanity's sake, I've set the interval (in the other script) to 1, so the interval is always 1. Though I suppose Game.LastFrameTime would work fine?

@AHK1221 Interval must be the number of milliseconds between the last time it was called and this time. An interval of 0 (for the OnTick update rate) will update every 16 or 17 milliseconds on a 60Hz display with VSync on.
I always have this in my OnTick to calculate that:
CurrentGameTime = Game.GameTime;
ElapsedGameTime = CurrentGameTime - LastGameTime;
LastGameTime = CurrentGameTime;
Along with these variables.
private int CurrentGameTime; private int ElapsedGameTime; private int LastGameTime;
So with a timeout of 1000ms, you should get (255 / 1000) * 16 = 4.08

@LeeC2202 So.. I need to pass Game.GameTime as the Interval?

@AHK1221 You need to pass the difference between the last Game.GameTime and the current Game.GameTime, or calculate it in that function.
I think there is also Game.LastFrameTime, which usually reports back as something like 0.016, which you can also use as a multiplier. 255 * 0.016 = 4.08. So if you had the timeout in seconds, rather than milliseconds, for 4 seconds you could do something like this.
int NumSecs = 4;
var NumToDec = (255f * Game.LastFrameTime) / NumSecs;
I think that should work the same... I haven't done it that way though.
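Assuming a steady 60Hz frame (so LastFrameTime ≈ 0.01667 s and the elapsed time is ≈ 16.67 ms), both forms give the same per-frame decrement. A quick standalone check, with the frame values hard-coded rather than read from the game:

```csharp
using System;

class EquivalenceCheck
{
    static void Main()
    {
        float frameMs = 1000f / 60f;        // elapsed milliseconds per frame
        float frameSecs = frameMs / 1000f;  // the same interval in seconds

        // Millisecond form: (255 / TotalTimeInMilliseconds) * elapsedMs, 4000 ms total.
        float perFrameMs = (255f / 4000f) * frameMs;

        // Seconds form: (255 * LastFrameTime) / NumSecs, 4 s total.
        float perFrameSecs = (255f * frameSecs) / 4f;

        Console.WriteLine(perFrameMs);   // ~1.0625
        Console.WriteLine(perFrameSecs); // ~1.0625, same decrement per frame
    }
}
```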

@LeeC2202 I'm sorry, I really don't get the gist of this.
timeout is in milliseconds, so that doesn't work for me.
Passing in ElapsedGameTime gives the same result as Game.GameTime does... it shows for what appears to be a second? 500 ms? A quick flash, even if the value is increased to, say, 10 seconds.

@AHK1221 Give me a couple of ticks and I'll see if I can put it in a better format.
Going off my inability to explain things today, I could be adding to the confusion... will only be a couple of minutes.

If no one can figure this out, I have a buddy I could talk to; he'll know. He's a last resort though, since he's always busy and a response isn't promised, but he'll definitely know how to do whatever it is you're trying to do in your script. I'm guessing this is C++ or C? What is the script for, particularly? (Message me that information if you want)

@LeeC2202 ATM I'm using the same method, where I had to pass in 30 for it to stay for around 23 secs. At least it works... right?

@AHK1221 Okay, paste this into a .cs file.
using System;
using GTA;

namespace TimeOutTest
{
    public class cTimeOutTest : Script
    {
        private int CurrentGameTime;
        private int ElapsedGameTime;
        private int LastGameTime;
        float PseudoAlpha1;
        float PseudoAlpha2;
        const int TimeOut = 10000;
        const int TimeOutSecs = 5;

        public cTimeOutTest()
        {
            this.Tick += onTick;
            Interval = 0;
            PseudoAlpha1 = 255;
            PseudoAlpha2 = 255;
            LastGameTime = Game.GameTime;
        }

        private void onTick(object sender, EventArgs e)
        {
            CurrentGameTime = Game.GameTime;
            ElapsedGameTime = CurrentGameTime - LastGameTime;
            LastGameTime = CurrentGameTime;

            DoAlpha1(ElapsedGameTime);
            DoAlpha2();

            string display = string.Format("Alpha 1: {0}, Alpha 2: {1}", PseudoAlpha1, PseudoAlpha2);
            UI.ShowSubtitle(display);
        }

        private void DoAlpha1(int ElapsedGameTime)
        {
            float NumToDec = (255f / TimeOut) * ElapsedGameTime;
            PseudoAlpha1 -= NumToDec;
            if (PseudoAlpha1 < 0) PseudoAlpha1 = 255;
        }

        private void DoAlpha2()
        {
            float NumToDec = (255f * Game.LastFrameTime) / TimeOutSecs;
            PseudoAlpha2 -= NumToDec;
            if (PseudoAlpha2 < 0) PseudoAlpha2 = 255;
        }
    }
}
That shows both ways of decrementing the value: using the elapsed time derived from Game.GameTime, and using Game.LastFrameTime.

@LeeC2202 Wait, nvm, I'm stupid.

@AHK1221 Look at the consts, Alpha 2 is on a 5 second fade, Alpha 1 is on 10 seconds. I just made them different so you could see something different happening.