I'm trying to request multiple APIs as fast as possible, so I've tried curl_multi, but I get slower results than with a plain foreach loop and file_get_contents. What am I doing wrong?
With file_get_contents:
<?php
$start = microtime(true);
$urls = array("https://www.example1.com/", "https://www.example2.com/", "https://www.example3.com/");
foreach ($urls as $url) {
    $result = file_get_contents($url);
}
echo microtime(true) - $start;
?>
With curl_multi:
<?php
$start = microtime(true);
$urls = array("https://www.example1.com/", "https://www.example2.com/", "https://www.example3.com/");
$urls_count = count($urls);
$curl_arr = array();
$master = curl_multi_init();
for ($i = 0; $i < $urls_count; $i++)
{
    $curl_arr[$i] = curl_init($urls[$i]);
    curl_setopt($curl_arr[$i], CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($master, $curl_arr[$i]);
}
do {
    curl_multi_exec($master, $running);
} while($running > 0);
for ($i = 0; $i < $urls_count; $i++)
{
    $results = curl_multi_getcontent($curl_arr[$i]);
}
echo microtime(true) - $start;
?>
The issue is that curl_multi has a lot of overhead. I am assuming that it has to create a shell process for each request, execute curl in that process, and then return the contents to the calling script, whereas file_get_contents is optimized and native to the PHP language.

This is a good lesson in when to use a library versus the language's native functionality. There might also be an option for the library to run multithreaded and take advantage of multi-core processors, which could speed up the requests. That is something to look up and test for yourself.
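If you do test it, one variable worth isolating is the tight do/while loop in your version, which spins the CPU until all transfers finish. A rough sketch of the usual manual pattern (same placeholder URLs as above, not a measured result): block on curl_multi_select while waiting for network activity, then collect the responses and release the handles.

<?php
$start = microtime(true);
$urls = array("https://www.example1.com/", "https://www.example2.com/", "https://www.example3.com/");

$master = curl_multi_init();
$handles = array();
foreach ($urls as $i => $url) {
    $handles[$i] = curl_init($url);
    curl_setopt($handles[$i], CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($master, $handles[$i]);
}

// Drive the transfers, waiting on the sockets instead of busy-looping.
do {
    $status = curl_multi_exec($master, $running);
    if ($running) {
        curl_multi_select($master);
    }
} while ($running > 0 && $status === CURLM_OK);

// Collect every response (starting at index 0) and clean up the handles.
$results = array();
foreach ($handles as $i => $handle) {
    $results[$i] = curl_multi_getcontent($handle);
    curl_multi_remove_handle($master, $handle);
    curl_close($handle);
}
curl_multi_close($master);

echo microtime(true) - $start;
?>

Keep in mind that with only three fast URLs the fixed setup cost of either approach can dominate the timing, so comparing over more or slower endpoints will give a clearer picture.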