How can PHP be optimized to handle CSV files with large amounts of data and extract only the necessary information efficiently?

When dealing with large CSV files in PHP, the key is to stream the file rather than load it all at once. The fgetcsv() function reads one record per call, so memory use stays roughly constant regardless of file size, whereas functions such as file() or file_get_contents() pull the entire file into memory. Beyond streaming, discarding unwanted rows and keeping only the needed columns as early as possible (filtering), or recording byte offsets of rows of interest for later random access (indexing), further reduces the work done downstream.

// Open the CSV file for reading; fopen() returns false on failure
$handle = fopen('data.csv', 'r');
if ($handle === false) {
    die('Unable to open data.csv');
}

// Read the file one record at a time so the whole file never sits in memory
while (($data = fgetcsv($handle)) !== false) {
    // Skip rows that do not contain the expected columns
    if (count($data) < 3) {
        continue;
    }

    // Extract only the necessary information (e.g., column 1 and column 3)
    $necessaryData = array($data[0], $data[2]);

    // Process the necessary data as needed
    // ...
}

// Close the file handle
fclose($handle);
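The filtering idea mentioned above can be combined with a generator so that matching rows are yielded lazily, one at a time, without building an intermediate array. The sketch below is illustrative: the filterCsv() helper, the data.csv filename, and the assumption that the third column holds a numeric value are all hypothetical.

```php
<?php
// Lazily yield matching rows from a CSV file using a generator,
// so memory use stays constant regardless of file size.
// filterCsv() and the column layout are illustrative assumptions.
function filterCsv(string $path, float $threshold): Generator
{
    $handle = fopen($path, 'r');
    if ($handle === false) {
        throw new RuntimeException("Cannot open $path");
    }
    try {
        while (($row = fgetcsv($handle)) !== false) {
            // Keep only rows whose third column exceeds the threshold
            if (isset($row[2]) && (float) $row[2] > $threshold) {
                yield array($row[0], $row[2]);
            }
        }
    } finally {
        // The file handle is released even if the caller stops iterating early
        fclose($handle);
    }
}

// Usage: iterate without ever holding the full file in memory
foreach (filterCsv('data.csv', 100.0) as $row) {
    // Process $row as needed
    // ...
}
```

Because the generator yields rows on demand, the caller can stop iterating as soon as it has what it needs, and the finally block still closes the file handle.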