What are best practices for efficiently handling memory usage when working with large files in PHP?

When working with large files in PHP, loading the whole file into memory at once (for example with `file_get_contents()` or `file()`) can exhaust the script's `memory_limit` and cause a fatal error. The standard approach is to process the file incrementally instead: read it line by line with `fgets()` or `SplFileObject`, or in fixed-size chunks with `fread()`, so that only a small buffer is held in memory at any time.

```php
<?php

$filename = 'large_file.txt';

// Open the file for reading, and fail early if it cannot be opened
$file = fopen($filename, 'r');
if ($file === false) {
    die("Unable to open {$filename}");
}

// Read the file line by line; only the current line is held in memory
while (($line = fgets($file)) !== false) {
    // Process the line here
}

// Close the file to release the handle
fclose($file);
```
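The `SplFileObject` alternative mentioned above wraps the same idea in an iterator, which is convenient with `foreach`. A minimal sketch follows; the temporary sample file is only there to make the snippet self-contained and stands in for a real large file:

```php
<?php

// Stand-in for a real large file, so the example runs on its own
$filename = tempnam(sys_get_temp_dir(), 'big');
file_put_contents($filename, "first\nsecond\nthird\n");

// Iterate line by line; SplFileObject keeps only the current line in memory.
// DROP_NEW_LINE strips trailing newlines, SKIP_EMPTY skips blank lines.
$file = new SplFileObject($filename, 'r');
$file->setFlags(SplFileObject::READ_AHEAD | SplFileObject::SKIP_EMPTY | SplFileObject::DROP_NEW_LINE);

$count = 0;
foreach ($file as $line) {
    $count++; // process the line here
}

echo $count, "\n"; // number of lines processed
```

Because the object is iterable, it also composes well with generator-based pipelines if you want to filter or transform lines lazily.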
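For binary files, or files without meaningful line breaks, reading line by line does not help; in that case you can read fixed-size chunks with `fread()` instead. This is a sketch under the same idea, with a generated sample file standing in for a real large file and 8192 as an arbitrary chunk size:

```php
<?php

// Stand-in for a real large binary file
$filename = tempnam(sys_get_temp_dir(), 'bin');
file_put_contents($filename, str_repeat('x', 20000));

$handle = fopen($filename, 'rb');
$bytes  = 0;

// Read 8 KB at a time; only one chunk is ever held in memory
while (!feof($handle)) {
    $chunk  = fread($handle, 8192);
    $bytes += strlen($chunk); // process the chunk here
}
fclose($handle);

echo $bytes, "\n"; // total bytes processed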