There is a small web page that requests an XML file from a web server. The web server updates this file twice a second with new data to be displayed on the web page.
After the page has been left running for a while, it slows down the entire browser whenever it fetches the new data from the server (twice a second). The problem is intermittent: sometimes it lasts a few seconds, sometimes longer.
The problem occurs in Firefox.
// Requires jQuery for the $("#data") selector below.
function getData() {
    // Parse at most the first 100 characters of the updated value.
    var dataVal = parseFloat($("#data").html().substring(0, 100));
    doSomethingWithVal(dataVal); // seemingly irrelevant, as it lags without this call too
}

var t = 0.5; // polling interval in seconds

function process_xml() {
    var xmlhttp = GetXmlHttpObject();
    if (xmlhttp === null) {
        alert("Your browser does not support AJAX!");
        return;
    }
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState === 4) {
            document.getElementById("data").innerHTML = xmlhttp.responseText;
            getData();
            setTimeout(process_xml, t * 1000); // moved from end of function to here based on a suggestion
        }
    };
    xmlhttp.open("GET", "data.shtml", true);
    xmlhttp.send(null);
}

// Cross-browser XMLHttpRequest factory (ActiveX fallback for old IE).
function GetXmlHttpObject() {
    if (window.XMLHttpRequest)
        return new XMLHttpRequest();
    if (window.ActiveXObject)
        return new ActiveXObject("Microsoft.XMLHTTP");
    return null;
}
data.shtml simply contains:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<!--#echo var="someVal.val" -->
Perhaps there is a different/better approach than using AJAX?
I don't know that this is the problem, but if the round-trip to the server, plus the parsing and everything else you're doing in process_xml, takes more than 500 ms, then you'll wind up with several requests in flight at the same time, which will eventually clog the pipes (browsers have a limit on how many they can handle concurrently). You're better off calling your setTimeout in the onreadystatechange handler, after all the work in there is complete.
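For illustration, here is a minimal sketch of that pattern, assuming the same data.shtml endpoint and a modern XMLHttpRequest (the POLL_INTERVAL_MS name is hypothetical): the next request is scheduled only once the current response has been fully handled, so at most one request is ever in flight.

var POLL_INTERVAL_MS = 500; // hypothetical constant; matches the 0.5 s interval above

function poll() {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function() {
        if (xhr.readyState === 4) { // request complete
            if (xhr.status === 200) {
                document.getElementById("data").innerHTML = xhr.responseText;
            }
            // Schedule the next poll only after this one has finished,
            // so slow responses delay the next request instead of stacking up.
            setTimeout(poll, POLL_INTERVAL_MS);
        }
    };
    xhr.open("GET", "data.shtml", true);
    xhr.send(null);
}

poll();

With this structure the effective interval is 500 ms plus the request time, rather than a fixed 500 ms tick, which is usually what you want for polling.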