I have a loop whose growth slows as the iteration count goes up. I need to calculate how many iterations it will run (I explain why at the bottom of this question). The first 6 values are
0.50, 1.50, 1.83, 2.11, 2.34, 2.55, ...
[Graph: X-axis: iterations, Y-axis: count]
The count starts at 0.5 and grows at a decreasing rate until it reaches 20. The loop boils down to this:
var SCALE = 0.5; // Starting value, also affects increment
var MAX = 20; // Maximum value
var i = 0; // Just for counting
for (var count = SCALE; count < MAX; count += SCALE / count) {
console.log(count, i);
i++;
}
You can see the growth slows as the loop progresses because of count += SCALE / count: as count increases, the denominator increases too, so each increment gets smaller.
I thought the iteration count followed Math.pow(MAX, 1 / SCALE), but not quite:
MAX = 5 : 23 iterations Math.pow(5, 2) = 25
MAX = 10 : 97 iterations Math.pow(10, 2) = 100
MAX = 15 : 222 iterations Math.pow(15, 2) = 225
MAX = 20 : 397 iterations Math.pow(20, 2) = 400
Plus this approach falls apart when SCALE isn't 0.5.
Question
What equation, taking both SCALE and MAX into account, gives the iteration count?
Why do I need this?
I'm trying to convert the sample code at the bottom of this article into GLSL shader code. The problem is that in GLSL a for-loop's bound must be a constant known at compile time (an integer counting up to a fixed value), so I need to know how many iterations the loop will take before it starts.
I need something like this: for (int i = 0; i < MAX_COUNT; i++), but first I need to know what MAX_COUNT will be.
As several people suggested in the comments, this pattern doesn't follow a log curve, a harmonic series, or any other recognizable pattern. To solve it I went with a brute-force approach: run the loop once, count the iterations, and use that count as the result. There was no elegant math equation.
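The precompute-then-loop idea can be sketched in JavaScript like this (iterationCount is a hypothetical helper name; in the real setup the result would be baked into the GLSL source as the constant bound):

```javascript
// Run the loop once to count how many iterations it takes
// for `count` to grow from `scale` up to `max`.
function iterationCount(scale, max) {
  var i = 0;
  for (var count = scale; count < max; count += scale / count) {
    i++;
  }
  return i;
}

// Precompute the bound once on the CPU...
var MAX_COUNT = iterationCount(0.5, 20);

// ...then a GLSL-style loop can use a fixed integer bound.
// (This is the JS equivalent of the shader loop; in GLSL,
// MAX_COUNT would be spliced into the shader source string.)
var count = 0.5;
for (var i = 0; i < MAX_COUNT; i++) {
  count += 0.5 / count;
}
console.log(MAX_COUNT, count);
```

Depending on whether you read the counter before or after the final increment, the number can land one off from the figures in the table above, so it's worth checking the boundary behavior against your own loop.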