How can PHP be used to chunk large file downloads to prevent timeouts and ensure successful downloads for users?

When a PHP script serves a large file, the transfer can take long enough to hit the script's max_execution_time or an upstream timeout, leaving the user with a truncated download. Reading the file in small, fixed-size chunks with fread() and flushing each chunk to the browser keeps data moving steadily and keeps memory usage constant, which makes large downloads far more reliable.

$file = 'path/to/large/file.zip';
$chunkSize = 1024 * 1024; // read the file in 1 MB chunks

if (!is_readable($file)) {
    http_response_code(404);
    exit('File not found.');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file)); // lets the browser show progress and detect truncation

$handle = fopen($file, 'rb');
while (!feof($handle)) {
    // Read the next chunk and push it to the client immediately.
    echo fread($handle, $chunkSize);
    if (ob_get_level() > 0) {
        ob_flush(); // flush PHP's output buffer only if one is active
    }
    flush();        // flush the web server / SAPI buffer
}
fclose($handle);
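
Chunking alone does not change two settings that commonly break large downloads: an active output buffer will hold the echoed chunks in memory instead of sending them, and max_execution_time can still abort a very slow transfer. As a minimal sketch, the preparation below could be run before the header() calls above; the exact buffering setup depends on your framework and php.ini, so treat this as a starting point rather than a drop-in requirement.

// Discard any output buffers started by the framework or by output_buffering
// in php.ini, so each echoed chunk goes straight to the client.
while (ob_get_level() > 0) {
    ob_end_clean();
}

// Lift the script's execution time limit for the duration of the transfer.
// (This call has no effect if it is listed in disable_functions.)
set_time_limit(0);

A chunk size between a few hundred kilobytes and a few megabytes is a reasonable trade-off between the number of fread() calls and the memory held per iteration; the 1 MB value above is just a convenient default.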