How can the PHP script be optimized to efficiently handle a large number of URL checks without experiencing timeouts or failures?
To handle a large number of URL checks without timeouts or failures, use PHP's cURL multi interface (curl_multi_init()) to process the URLs concurrently. Despite the common shorthand, this is not multi-threading: curl_multi multiplexes many non-blocking transfers within a single process, so the total run time approaches that of the slowest request rather than the sum of all of them. Setting per-handle connect and transfer timeouts also keeps a single unresponsive server from stalling the whole batch.
&lt;?php
// List of URLs to check
$urls = array("https://example.com", "https://google.com", "https://stackoverflow.com");

// Initialize the cURL multi handle that drives all transfers concurrently
$mh = curl_multi_init();
$curl_handles = array();

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects to the final URL
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);    // fail fast on unreachable hosts
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // cap total time per transfer
    curl_multi_add_handle($mh, $ch);
    $curl_handles[] = $ch;
}

// Execute the multi handle; curl_multi_select() blocks until there is
// network activity, so the loop does not busy-wait and burn CPU
$running = null;
do {
    $status = curl_multi_exec($mh, $running);
    if ($running > 0 && curl_multi_select($mh) === -1) {
        usleep(1000); // select failed; back off briefly before retrying
    }
} while ($running > 0 && $status === CURLM_OK);

// Get the results, reporting transport-level failures separately
foreach ($curl_handles as $ch) {
    $url = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
    $http_code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    if (curl_errno($ch) !== 0) {
        echo "URL: $url - FAILED: " . curl_error($ch) . PHP_EOL;
    } else {
        echo "URL: $url - HTTP Code: $http_code" . PHP_EOL;
    }
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}

// Close the multi handle
curl_multi_close($mh);
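
For very large lists, registering every URL on one multi handle at once can exhaust sockets, file descriptors, and memory. A common refinement is to process the list in fixed-size batches. The sketch below is a minimal illustration of that idea: the check_urls() helper name, the batch size of 50, and the $all_urls variable are assumptions chosen for this example, not part of the original script.

// A minimal batching sketch. The check_urls() helper, the batch size of 50,
// and $all_urls are illustrative names/values chosen for this example.
function check_urls(array $urls): void {
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }
    $running = null;
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running > 0 && curl_multi_select($mh) === -1) {
            usleep(1000);
        }
    } while ($running > 0 && $status === CURLM_OK);
    foreach ($handles as $ch) {
        $url  = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
        $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        echo "URL: $url - HTTP Code: $code" . PHP_EOL;
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
}

$all_urls = array(/* ... thousands of URLs ... */);

// Only 50 transfers are in flight at a time, so socket and memory usage
// stays bounded no matter how long the full list is.
foreach (array_chunk($all_urls, 50) as $batch) {
    check_urls($batch);
}

Batching trades a little wall-clock time for predictable resource usage; tune the batch size to what your server and the target hosts can tolerate.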