How can PHP be optimized for processing large files, such as XML files with thousands of data records?
When processing large files such as XML documents with thousands of records in PHP, the code should be optimized for both memory usage and processing speed. `simplexml_load_file` parses the entire document into memory at once, so memory consumption grows with file size. A more scalable approach is `XMLReader`, a pull parser that streams through the document one node at a time, keeping memory usage roughly constant regardless of file size.
$reader = new XMLReader();
if (!$reader->open('large_file.xml')) {
    die('Failed to open large_file.xml');
}

while ($reader->read()) {
    // Each read() advances the cursor to the next node (element, text,
    // attribute, etc.) without loading the whole document into memory.
    // Process the current node here.
}

$reader->close();
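A common refinement is to combine both APIs: use `XMLReader` to stream to each record element, then hand just that one element to SimpleXML for convenient field access. The sketch below assumes the file's repeating element is named `<record>` with a child `<id>` (adjust to your schema):

```php
<?php
// Sketch: stream large_file.xml record by record, assuming a structure
// like <records><record><id>1</id>...</record>...</records>.
$reader = new XMLReader();
if (!$reader->open('large_file.xml')) {
    die('Failed to open large_file.xml');
}

// Advance the cursor to the first <record> element.
while ($reader->read() && $reader->name !== 'record') {
    // Skip nodes preceding the first record.
}

$doc = new DOMDocument();
$count = 0;
while ($reader->name === 'record') {
    // expand() materializes only the current node as a DOM subtree,
    // so memory is bounded by the size of one record, not the file.
    $record = simplexml_import_dom($reader->expand($doc));

    // Process individual fields, e.g.:
    $id = (string) $record->id;
    $count++;

    // next('record') jumps to the next sibling <record> without
    // re-parsing the children of the current one.
    $reader->next('record');
}
$reader->close();

echo "Processed $count records\n";
```

This pattern keeps the streaming behavior of `XMLReader` while letting you use SimpleXML's terse property syntax per record, which is usually easier to work with than raw node-by-node reads.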