I have over 2000 URL calls to make, and with the code below it takes almost 2 minutes to complete. Could someone help me speed the process up?
private void button4_Click(object sender, EventArgs e)
{
    WebRequest req;
    WebResponse res;
    string[] lines = File.ReadAllLines(@"c:\data\temp.txt");
    for (int i = 0; i < lines.Count(); i++)
    {
        req = WebRequest.Create(lines[i]);
        res = req.GetResponse();
        StreamReader rd = new StreamReader(res.GetResponseStream(), Encoding.ASCII);
        rd.Close();
        res.Close();
        textBox1.Text += ".";
    }
}
Many thanks
Since you don't specify a framework version I'll assume you are using at least 4.5.
You can use an ActionBlock to easily execute multiple calls concurrently. An ActionBlock runs its action delegate against each item posted to it, and it can be configured to process several items at the same time.
You could use something like this:
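Here is a rough sketch (not a drop-in solution): it assumes the URLs still come from c:\data\temp.txt, that the System.Threading.Tasks.Dataflow NuGet package is referenced, and it picks an arbitrary limit of 10 concurrent requests.

using System.IO;
using System.Net;
using System.Text;
using System.Threading.Tasks.Dataflow;

private async void button4_Click(object sender, EventArgs e)
{
    var block = new ActionBlock<string>(url =>
    {
        var req = WebRequest.Create(url);
        // using guarantees the response (and its connection) is released promptly
        using (var res = req.GetResponse())
        using (var rd = new StreamReader(res.GetResponseStream(), Encoding.ASCII))
        {
            rd.ReadToEnd();
        }
    },
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 10 });

    // If many of the URLs hit the same host you may also need to raise
    // ServicePointManager.DefaultConnectionLimit, which defaults to 2.
    foreach (var url in File.ReadAllLines(@"c:\data\temp.txt"))
        block.Post(url);

    block.Complete();          // no more URLs will be posted
    await block.Completion;    // wait until every posted URL has been processed
    textBox1.Text = "done";
}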
You can control how many requests are made concurrently by changing the MaxDegreeOfParallelism option.
You can also call GetResponseAsync to execute each request asynchronously. This won't make the requests themselves faster, but it reduces the number of ThreadPool threads needed to serve the same number of requests, so less CPU is wasted on blocking and context switching.

Disposing requests and responses is important. Unless you dispose the response, the connection to the server stays open. By default .NET allows only 2 concurrent connections per host, so orphaned responses can cause delays until the garbage collector runs and collects them. While you can override the limit, it's best to always dispose of the responses.
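For instance, the action passed to the ActionBlock above could be rewritten as an asynchronous delegate; this is only a sketch, using the same arbitrary degree of parallelism:

// GetResponseAsync (available from .NET 4.5) releases the thread while the
// request is in flight, and the using blocks ensure each response and its
// connection are disposed even if reading the body fails.
var block = new ActionBlock<string>(async url =>
{
    var req = WebRequest.Create(url);
    using (var res = await req.GetResponseAsync())
    using (var rd = new StreamReader(res.GetResponseStream(), Encoding.ASCII))
    {
        await rd.ReadToEndAsync();
    }
},
new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 10 });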