How can the PHP script be optimized to handle larger tables for CSV export efficiently?
When exporting a large table to CSV in PHP, the key is to stream rows rather than build the whole file in memory. PHP's `fputcsv()` writes one row at a time directly to an output stream such as `php://output`, which keeps memory usage flat regardless of table size. With PDO and MySQL there is one extra wrinkle: queries are buffered by default, so the driver pulls the entire result set into PHP memory before the first `fetch()`. For genuinely large tables you should also disable result buffering, as in the example below.
// Connect to the database; disable MySQL result buffering so rows are
// streamed from the server instead of being loaded into PHP memory at once
$pdo = new PDO("mysql:host=localhost;dbname=mydatabase", "username", "password", [
    PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false,
]);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$statement = $pdo->query("SELECT * FROM mytable");
// Set headers for CSV file
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');
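// Large exports can exceed the default execution time limit, so lift it
set_time_limit(0);
// Flush any PHP output buffers so each row is streamed to the client as
// soon as it is written rather than accumulating server-side
while (ob_get_level() > 0) {
    ob_end_flush();
}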
// Open a file pointer for writing CSV data
$fp = fopen('php://output', 'w');
// Write the header row using the actual column names from the result set,
// so it always matches whatever SELECT * returns
$first = $statement->fetch(PDO::FETCH_ASSOC);
if ($first !== false) {
    fputcsv($fp, array_keys($first));
    fputcsv($fp, $first);
}
// Stream the remaining rows one at a time to keep memory usage flat
while ($row = $statement->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($fp, $row);
}
// Close the file pointer
fclose($fp);
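If unbuffered queries are not an option on your setup (for example, because other statements must run on the same connection while the export streams), an alternative is to fetch the table in fixed-size batches using keyset pagination. The following is a minimal sketch that would replace the single query and `while` loop above; it assumes the table has an indexed auto-increment primary key column named `id`, so adjust the table and column names to your schema. Each batch is small, so ordinary buffered queries are fine here.

// Export in batches, seeking past the last exported primary key each time
$statement = $pdo->prepare(
    "SELECT * FROM mytable WHERE id > :lastId ORDER BY id LIMIT :limit"
);
$batchSize = 1000;
$statement->bindValue(':limit', $batchSize, PDO::PARAM_INT);
$lastId = 0;
do {
    $statement->bindValue(':lastId', $lastId, PDO::PARAM_INT);
    $statement->execute();
    $rows = $statement->fetchAll(PDO::FETCH_ASSOC);
    foreach ($rows as $row) {
        fputcsv($fp, $row);
        $lastId = $row['id']; // remember where the next batch should start
    }
} while (count($rows) === $batchSize);

Keyset pagination (`WHERE id > :lastId`) scales better than `LIMIT ... OFFSET ...` because MySQL can seek straight to the next batch via the index, instead of scanning and discarding every row that has already been exported.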