unable to do progressive rendering


Years ago, at previous jobs, I wrote CGI programs that relied on progressive rendering, since those programs could take a long time (minutes) to run, producing a line of output approximately every second. Today I find that I cannot get progressive rendering to occur even with the simplest example.

I've seen a lot of suggestions on this topic about where to put CSS, scripts, etc. However, the trivial example below has none of that.

I don't see any browser option that affects progressive rendering. I have tried this on several systems and devices with several browsers (Chrome, Firefox, Opera), all with the same result.

Below is a trivial example that I expect to produce some output every 2 seconds, but instead it renders when the whole document is complete. Am I missing something obvious?

#!/usr/bin/env perl

select(STDOUT); $| = 1;     # don't buffer stdout

print "Content-Type: text/html; charset=ISO-8859-1\n\n" ;
print "<html> <head> <title> Testing </title> </head> <body>\n" ;

my $message = "<code>" .
    "Why doesn't this render immediately? <br>\n" x 5 .
    "</code>\n" ;

for ( my $i=0 ; $i < 5 ; $i++ ) {
    print "$message\n" ;
    sleep(2) ;
}
print "</body></html>\n" ;

There are 3 answers

ccm (best answer, 1 vote)

Your web server is likely buffering the response. $| = 1; sets up STDOUT to be autoflushed each time you print, eliminating the effects of the buffering in your script, but you also need to consider buffering that's happening in your web server.

There's no command or character sequence that flushes the server's buffer, but you can send enough data to fill it so that it flushes on its own.

Just send inconsequential content, like a bunch of spaces:

print " " x (1024 * 8);

How much data you need to send depends on how large your web server's buffers are configured to be. Typical buffer sizes are 4 KiB or 8 KiB. Be aware that if your server gzips your script's response, you will need to print considerably more (perhaps around 8 MiB of space characters) to fill the server's buffers, because they will be filled with the compressed response.
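One Perl precedence trap in the padding trick: the repetition operator x binds tighter than *, so the size expression must be parenthesized or you print nothing useful. A minimal sketch demonstrating the difference:

```perl
#!/usr/bin/env perl
use strict;
use warnings;
no warnings 'numeric';

# " " x 1024 * 8 parses as (" " x 1024) * 8: the 1024-space string
# is used in numeric context and becomes 0. Parenthesize the size.
my $wrong   = " " x 1024 * 8;     # 0, not 8192 spaces
my $padding = " " x (1024 * 8);   # 8192 spaces

printf "wrong=%s, padding length=%d\n", $wrong, length $padding;
```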

Of course, you could also just disable the buffering in your server. How you do that depends on the web server. For nginx, take a look at X-Accel-Buffering.
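If nginx sits in front of your backend, one option is to send the X-Accel-Buffering header from the CGI script itself. This is a sketch, assuming your nginx proxy/FastCGI setup honors that header; it must appear before the blank line that ends the response headers:

```perl
#!/usr/bin/env perl
use strict;
use warnings;

$| = 1;   # autoflush STDOUT

# Assumption: nginx is buffering this response and honors
# X-Accel-Buffering. It must precede the blank line that
# terminates the header block.
my $headers = "X-Accel-Buffering: no\n"
            . "Content-Type: text/html; charset=ISO-8859-1\n"
            . "\n";
print $headers;
```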

simbabque (1 vote)

You also need your web server to keep the CGI script running long enough. A default Apache has a one-minute timeout.
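In Apache this is controlled by the server-wide Timeout directive; a sketch of raising it (the value 600 below is illustrative, not a recommendation):

```apache
# httpd.conf (or a VirtualHost block): seconds Apache waits for
# certain I/O events before giving up on the request.
Timeout 600
```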

You already turned off buffering with $|, which is fine; there is no further control you can exert from the script. The connection needs to stay open, even if you use chunked transfer, which isn't really needed in your example.

It's the server that shuts the connection down after a while. Once the connection is gone, the web server doesn't send the remainder of your response over the wire, and the CGI handle is detached, so nothing is reading your output to pass it along.

Conclusion: Set the timeout to a higher value.

Anecdote: I used to work on a system with a timeout setting of about an hour, where a CGI-based back-office application ran large queries against a huge MySQL database, and things took a long time. People who worked with that tool typically started it and went to grab coffee or have lunch.

RJ White (0 votes)

The response from ccm didn't work for me, but it set me on the right track to find the problem. The solution was to add the following to my Apache config:

SetEnvIfNoCase Request_URI \.cgi$ no-gzip dont-vary

which I found in "Prevent output buffering with PHP and Apache".

While trying out what @ccm suggested, I found that the buffer size appears to be 1 KiB, which is fine with me.

Thanks a huge amount to @ccm for setting me on the right path!