I am working on encryption of real-time data. I have developed an encryption and decryption algorithm. Now I want to measure its execution time on a Linux platform in C. How can I correctly measure it? I have tried it as below:
#include <sys/time.h>

struct timeval tv1, tv2;
long Total_Runtime;

gettimeofday(&tv1, NULL);
/* Algorithm implementation code */
gettimeofday(&tv2, NULL);
Total_Runtime = (tv2.tv_usec - tv1.tv_usec) +
                (tv2.tv_sec - tv1.tv_sec) * 1000000L;
which gives me the time in microseconds. Is this the correct way to measure time, or should I use some other function? Any hint will be appreciated.
Measuring the execution time of proper encryption code is simple, although a bit tedious. The runtime of good encryption code is independent of its input: no matter what you throw at it, it always needs the same number of operations per chunk of input. If it doesn't, you have a problem, namely a vulnerability to timing attacks.
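To see what input-independent timing means in practice, compare a comparison routine that returns early (and so leaks timing) with a constant-time one. A minimal sketch (the function names are my own, for illustration only):

#include <stddef.h>
#include <stdint.h>

/* Variable-time: returns as soon as a byte differs, so the
 * runtime depends on where the inputs differ (a timing leak). */
int leaky_equal(const uint8_t *a, const uint8_t *b, size_t n)
{
    for (size_t i = 0; i < n; i++)
        if (a[i] != b[i])
            return 0;
    return 1;
}

/* Constant-time: always touches every byte, so the runtime
 * is independent of the input contents. */
int ct_equal(const uint8_t *a, const uint8_t *b, size_t n)
{
    uint8_t diff = 0;
    for (size_t i = 0; i < n; i++)
        diff |= a[i] ^ b[i];
    return diff == 0;
}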
So the only thing you need to do is unroll all loops, count the opcodes, and multiply each opcode by its number of clock ticks to get the exact runtime. There is one catch: some CPUs take a variable number of clock ticks for certain operations, and you may have to replace those with operations that take a fixed number of clock ticks. A pain in the behind, admittedly.
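As a toy illustration of the counting, here is a fully unrolled round with per-instruction cycle costs in the comments. The cycle numbers are made-up placeholders; take the real values from your CPU vendor's instruction timing tables:

#include <stdint.h>

/* Toy round: four XORs, fully unrolled. */
void toy_round(uint32_t state[4], const uint32_t key[4])
{
    state[0] ^= key[0];  /* XOR: e.g. 1 cycle */
    state[1] ^= key[1];  /* XOR: e.g. 1 cycle */
    state[2] ^= key[2];  /* XOR: e.g. 1 cycle */
    state[3] ^= key[3];  /* XOR: e.g. 1 cycle */
}
/* Estimated cost: 4 XORs x 1 cycle = 4 cycles per round.
 * Multiply by the number of rounds, then divide by the clock
 * frequency to get an exact per-block runtime. */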
If the only thing you want to know is whether the code runs fast enough to fit into a slot of your real-time OS, you can simply take the maximum (worst-case) time and pad the faster cases with NOPs (your RTOS might have a routine for that).
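A minimal sketch of that padding idea on Linux, busy-waiting on a monotonic clock instead of literal NOPs (the function name and the slot-length parameter are my own; your RTOS may already provide an equivalent routine):

#include <time.h>

/* Run the work, then spin until a fixed slot budget has
 * elapsed, so every invocation takes the same wall-clock
 * time regardless of how fast the work finished. */
static void run_in_fixed_slot(void (*work)(void), long slot_ns)
{
    struct timespec start, now;
    clock_gettime(CLOCK_MONOTONIC, &start);
    work();
    do {
        clock_gettime(CLOCK_MONOTONIC, &now);
    } while ((now.tv_sec - start.tv_sec) * 1000000000L +
             (now.tv_nsec - start.tv_nsec) < slot_ns);
}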