What are the implications of using file_get_contents for downloading large files in the context of the provided PHP script?
Using file_get_contents to download large files can exhaust available memory, because the entire file is read into a string before anything is written to disk. A stream-based approach using fopen and fread instead copies the file in small chunks, so memory usage stays roughly constant regardless of the file's size.
<?php
$fileUrl = 'http://example.com/largefile.zip';
$destination = 'downloaded_file.zip';

// Open the remote file for reading and the local file for writing.
$remoteFile = fopen($fileUrl, 'rb');
$localFile  = fopen($destination, 'wb');

if ($remoteFile && $localFile) {
    // Copy in 8 KB chunks so only one chunk is ever held in memory.
    while (!feof($remoteFile)) {
        fwrite($localFile, fread($remoteFile, 8192));
    }
    echo 'File downloaded successfully.';
} else {
    echo 'Error downloading file.';
}

// Close whichever handles were opened, even if one fopen failed.
if ($remoteFile) {
    fclose($remoteFile);
}
if ($localFile) {
    fclose($localFile);
}
?>
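
If you prefer not to write the read/write loop yourself, PHP's built-in stream_copy_to_stream performs the same chunked copy internally. Here is a minimal sketch using the same hypothetical URL and destination as above:

<?php
// Sketch: same hypothetical URL and destination as the example above.
$remoteFile = fopen('http://example.com/largefile.zip', 'rb');
$localFile  = fopen('downloaded_file.zip', 'wb');

if ($remoteFile && $localFile) {
    // stream_copy_to_stream reads and writes in chunks internally,
    // so the whole file is never loaded into memory at once.
    $bytesCopied = stream_copy_to_stream($remoteFile, $localFile);
    echo "Copied $bytesCopied bytes.";
} else {
    echo 'Error downloading file.';
}

if ($remoteFile) {
    fclose($remoteFile);
}
if ($localFile) {
    fclose($localFile);
}
?>

Either version avoids the memory problem; stream_copy_to_stream is simply shorter and returns the number of bytes copied, which can be useful for verifying the transfer.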