Is there a way to work around time fluctuations when testing the runtime of an algorithm?


I wanted to test the runtime of the ShellSort algorithm.

#include <iostream>
#include <vector>
#include <random>
#include <chrono>
#include <ctime>

using namespace std;

int main()
{
    int N = 2;
    vector<int> steps = {1, 2, 8, 32, 128, 512, 2048, 8192};
    mt19937 mt(time(nullptr)); // seed once; re-seeding every iteration reuses the same second

    while (N <= 5000){
        // fill A with N random values in [0, 999]
        vector<int> A(N, 0);
        for (int i = 0; i < N; ++i){
            A[i] = mt() % 1000;
        }

        // pick the largest usable gap; start at the last valid index,
        // not steps.size(), which read past the end of the vector
        int k = steps.size() - 1;
        while (k > 0 && N < steps[k]){k--;}

        auto t1 = std::chrono::high_resolution_clock::now();
        // ShellSort: gapped insertion sort with gaps steps[k] .. steps[0]
        for (; k >= 0; k--) {
            for (int i = steps[k]; i < N; i += 1) {
                int temp = A[i];
                int j;
                for (j = i; j >= steps[k] && A[j - steps[k]] > temp; j -= steps[k])
                    A[j] = A[j - steps[k]];
                A[j] = temp;
            }
        }
        auto t2 = std::chrono::high_resolution_clock::now();
        auto duration = std::chrono::duration_cast<std::chrono::nanoseconds>(t2 - t1);
        cout << N << " " << duration.count() << "\n";
        N++;
    }
    return 0;
}

It works fine, but when I plot the output data, the graph looks like this:

[Graph with fluctuations]

There are fluctuations. They are not caused by the particular array being sorted. You can see that the graph is lifted by some constant value in some regions.

I tried different gap sequences, but those "graph lifts" do not change. I really don't know what to do at this point.

I want to see the raw sorting time, without any extra overhead. Is that possible? What can I do?
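One idea I had is to repeat each measurement several times on fresh copies of the same array and keep only the minimum, so one-off interruptions from the scheduler or a cold cache get filtered out. Here is a minimal sketch of that idea (min_time_ns and the reps count are names I made up, and I switched to steady_clock since it is monotonic):

#include <algorithm>
#include <chrono>
#include <limits>
#include <vector>

// Run `sort` on a fresh copy of `input` `reps` times and return the
// smallest wall-clock duration in nanoseconds; the minimum discards
// runs that were interrupted by the OS or slowed by a cold cache.
template <typename Sort>
long long min_time_ns(const std::vector<int>& input, Sort sort, int reps = 15)
{
    long long best = std::numeric_limits<long long>::max();
    for (int r = 0; r < reps; ++r) {
        std::vector<int> A = input;   // fresh unsorted copy for every run
        auto t1 = std::chrono::steady_clock::now();
        sort(A);
        auto t2 = std::chrono::steady_clock::now();
        long long ns = std::chrono::duration_cast<std::chrono::nanoseconds>(t2 - t1).count();
        best = std::min(best, ns);    // keep only the fastest, least-disturbed run
    }
    return best;
}

With the inner sorting loops moved into a function or lambda, the print line would become cout << N << " " << min_time_ns(A, shellSort) << "\n" (shellSort being whatever the wrapper is called). I guess compiling with -O2 and keeping the machine idle matters too, since CPU frequency scaling and background load could lift whole regions of the graph by a roughly constant amount, which would match what I see.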
