How can PHP developers optimize the retrieval and processing of data from multiple websites to avoid server timeouts or errors?

When retrieving and processing data from multiple websites in PHP, the main bottleneck is making the requests sequentially: each slow or unresponsive site adds its full latency to the total, which can push a script past its execution limit. Developers can avoid this by issuing the requests concurrently, using either the Guzzle HTTP client (with its promise-based async API) or PHP's built-in curl_multi functions, and by setting per-request timeouts so that one stalled site cannot block the whole batch.

&lt;?php

// Using the Guzzle library for concurrent (asynchronous) requests
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Promise\Utils;

// A per-request timeout keeps one slow site from stalling the whole batch
$client = new Client(['timeout' => 10]);

$urls = ['http://example.com', 'http://example2.com', 'http://example3.com'];

// Dispatch all requests without waiting for each to finish
$promises = [];
foreach ($urls as $url) {
    $promises[$url] = $client->getAsync($url);
}

// Utils::settle() waits for every promise to complete, whether it
// succeeds or fails (the older GuzzleHttp\Promise\settle() function
// was removed in guzzlehttp/promises v2)
$results = Utils::settle($promises)->wait();

foreach ($results as $url => $result) {
    if ($result['state'] === 'fulfilled') {
        $response = $result['value'];
        // Process the response data here, e.g. $response->getBody()
    } else {
        $exception = $result['reason'];
        // Handle errors or timeouts for $url here
    }
}
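If Guzzle is not available, the same fan-out pattern can be built directly on PHP's bundled curl_multi API, which drives several cURL handles over a single event loop. The sketch below assumes the same placeholder URLs as above and a 10-second per-request timeout; it is a minimal illustration, not production-hardened code.

```php
<?php

// Fetching several URLs in parallel with PHP's built-in curl_multi API
$urls = ['http://example.com', 'http://example2.com', 'http://example3.com'];

$multiHandle = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // per-request timeout in seconds
    curl_multi_add_handle($multiHandle, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers until none are still running
do {
    $status = curl_multi_exec($multiHandle, $running);
    if ($running) {
        curl_multi_select($multiHandle); // wait for socket activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);

// Collect results and clean up
foreach ($handles as $url => $ch) {
    if (curl_errno($ch) === CURLE_OK) {
        $body = curl_multi_getcontent($ch);
        // Process $body here
    } else {
        $error = curl_error($ch);
        // Handle the error or timeout for $url here
    }
    curl_multi_remove_handle($multiHandle, $ch);
    curl_close($ch);
}
curl_multi_close($multiHandle);
```

Because curl_multi starts every handle at once, very large URL lists should be processed in batches (or with Guzzle's Pool, which accepts a concurrency limit) to avoid exhausting sockets or overwhelming the remote servers.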