How can PHP scripts be optimized for efficiency when processing large text files?
Processing a large text file in PHP can exhaust memory if the whole file is loaded at once (for example with `file()` or `file_get_contents()`). The standard fix is to read the file line by line so memory usage stays roughly constant regardless of file size. This can be done with `fgets()` on a file handle, or with the object-oriented `SplFileObject`, as shown below.
```php
$file = new SplFileObject('large_text_file.txt', 'r');
while (!$file->eof()) {
    $line = $file->fgets();
    // Process each line of the file here
}
$file = null; // Release the file handle
```
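For comparison, here is a sketch of the same incremental read using the procedural `fgets()` API mentioned above; the filename is a placeholder, and the error check is a defensive addition not present in the original example:

```php
$handle = fopen('large_text_file.txt', 'r');
if ($handle === false) {
    throw new RuntimeException('Unable to open file');
}
while (($line = fgets($handle)) !== false) {
    // Process each line here; $line includes the trailing newline
}
fclose($handle); // Always release the handle when done
```

Checking `fgets()` against `false` (rather than relying on `feof()` alone) avoids processing a spurious final iteration when the read fails or the file ends exactly on a newline.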