What is the impact of different JPEG types on the different JPEG decoding sub-processes?


As we all know, the JPEG decoding process is shown in the following:

  • VLD - Variable length decoding,
  • ZZ - Zigzag scan,
  • DQ - Dequantization,
  • IDCT - Inverse discrete cosine transform,
  • Color conversion (YUV to RGB) and reorder.

My question is: for JPEG images with different characteristics, which of the above decoding steps will take more time?

For example:

For decoding an image with a lot of noise, which of the above five processes will take relatively more time?

Another example:

For two versions of the same image saved at different quality settings, which of the above five processes will take more time when decoding the higher-quality one?

1 Answer

Answered by user3344003:

JPEG decoding time tends to scale linearly with file size. The major factor that affects decoding time is whether the file uses sequential or progressive scans. In sequential mode, each component is processed once. In progressive mode, it is processed at least twice and possibly as many as 500 times (but that would be absurd) for each component.

For your specific questions:

VLD - Variable length decoding,

It depends upon whether you do this once (sequential) or multiple times (progressive).
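To make the step concrete, here is a minimal sketch of variable-length (Huffman) decoding by longest-prefix matching. The code table below is a hypothetical toy table, not a real JPEG DHT table, and real decoders read packed bits rather than a character string:

```python
# Hypothetical prefix-free code table (a real JPEG table comes from a DHT segment).
CODES = {
    "0": 0,      # shortest code for the most frequent symbol
    "10": 1,
    "110": 2,
    "111": 3,
}

def vld_decode(bits):
    """Decode a bit string into symbols by accumulating bits until a codeword matches."""
    symbols, buf = [], ""
    for b in bits:
        buf += b
        if buf in CODES:          # a complete codeword was read
            symbols.append(CODES[buf])
            buf = ""
    return symbols

# "0|10|110|0" -> [0, 1, 2, 0]
print(vld_decode("0101100"))
```

Because the table is prefix-free, the greedy match is unambiguous; the cost is proportional to the number of coded bits, which is why progressive files (which re-code each component across multiple scans) spend more time here.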

ZZ - Zigzag scan

Easy to do. Array indices.
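"Array indices" can be shown directly: the sketch below generates the 8x8 zigzag scan order algorithmically (equivalent to the standard lookup table) and uses it to place a zigzag-ordered coefficient list back into a block. Per coefficient it is a single indexed store, which is why this step is cheap:

```python
def zigzag_order(n=8):
    """Return the list of (row, col) pairs in JPEG zigzag scan order."""
    order = []
    for s in range(2 * n - 1):                 # walk each anti-diagonal
        diag = [(i, s - i) for i in range(n) if 0 <= s - i < n]
        if s % 2 == 0:
            diag.reverse()                     # even diagonals run bottom-left to top-right
        order.extend(diag)
    return order

def dezigzag(coeffs, n=8):
    """Place a zigzag-ordered coefficient list back into an n x n block."""
    block = [[0] * n for _ in range(n)]
    for k, (r, c) in enumerate(zigzag_order(n)):
        block[r][c] = coeffs[k]
    return block
```

A real decoder precomputes `zigzag_order()` once (or hardcodes the 64-entry table), so the per-block work is just 64 array writes.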

DQ - Dequantization

Depends upon how many times you do it: once for sequential. It could be done multiple times for progressive (but does not need to be, unless you want continuous updates on the screen).
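Dequantization is one multiply per coefficient, as the sketch below shows. The flat quantization table is hypothetical (real tables come from the file's DQT segments); note that a higher-quality file simply carries smaller table entries, so the per-coefficient work is identical regardless of quality:

```python
def dequantize(block, qtable):
    """Multiply each quantized DCT coefficient by its quantization table entry."""
    return [[block[r][c] * qtable[r][c] for c in range(8)]
            for r in range(8)]

# Hypothetical flat table; a real DQT table has different entries per frequency.
qtable = [[16] * 8 for _ in range(8)]
quantized = [[1] * 8 for _ in range(8)]
restored = dequantize(quantized, qtable)   # every coefficient becomes 16
```

64 multiplies per block, independent of image content — which is why this step never dominates.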

IDCT - Inverse discrete cosine transform,

This depends a lot on the algorithm used, whether it is done using scaled integers or floating point, and whether it is done multiple times (as may (or may not) be done with a progressive JPEG).
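To illustrate why the algorithm choice matters: the sketch below is the naive O(N^4) floating-point 8x8 IDCT, straight from the textbook formula. Production decoders instead use fast separable or scaled-integer variants (e.g. AAN-style) that are many times cheaper, which is the point being made above:

```python
import math

def idct_2d(F):
    """Naive 8x8 inverse DCT, directly from the definition (floating point, O(N^4))."""
    def C(k):
        return 1 / math.sqrt(2) if k == 0 else 1.0

    out = [[0.0] * 8 for _ in range(8)]
    for x in range(8):
        for y in range(8):
            s = 0.0
            for u in range(8):
                for v in range(8):
                    s += (C(u) * C(v) * F[u][v]
                          * math.cos((2 * x + 1) * u * math.pi / 16)
                          * math.cos((2 * y + 1) * v * math.pi / 16))
            out[x][y] = s / 4
    return out

# Sanity check: a DC-only block decodes to a flat block
# (DC coefficient 8 -> every spatial sample 1.0).
```

This inner quadruple loop is 4096 multiply-accumulates per block; a separable implementation cuts that to 16 one-dimensional transforms, and fast integer variants reduce it further, so the same step can differ in cost by an order of magnitude between decoders.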

Color conversion (YUV to RGB) and reorder.

You only have to do this once. However, if there is chroma subsampling, it gets more complicated, because the chroma planes must also be upsampled to full resolution first.
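The per-pixel conversion itself is three multiply-adds plus a clamp. A sketch using the standard JFIF full-range YCbCr-to-RGB coefficients (this covers only the conversion, not the chroma upsampling that subsampled files also need):

```python
def ycbcr_to_rgb(y, cb, cr):
    """Convert one JFIF full-range YCbCr sample to RGB, clamped to [0, 255]."""
    def clamp(v):
        return max(0, min(255, round(v)))

    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return clamp(r), clamp(g), clamp(b)

# Neutral chroma (128, 128) passes luma straight through:
# ycbcr_to_rgb(200, 128, 128) -> (200, 200, 200)
```

Since the cost is fixed per output pixel, this step scales with image dimensions, not with image content or encoded file size.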

In other words, the decoding time is the same no matter what the image is. However, the decoding time depends upon how that image is encoded.

I qualify that by saying that a smaller file tends to decode faster than a larger one, simply because of the time it takes to read it off the disk. The more random the data is, the larger the file tends to be. It often takes more time to read and display a large BMP file than a JPEG of the same image because of the file size.