How can PHP be used to efficiently handle large data sets from a database without exceeding memory limits?
When handling large data sets from a database in PHP, avoid loading everything into memory at once; fetching an unbounded result set in one go can easily exceed the configured memory_limit. One efficient approach is pagination: fetch and process the data in smaller chunks by using LIMIT and OFFSET clauses in the SQL query to retrieve one subset of rows at a time.
// Establish a database connection
$pdo = new PDO('mysql:host=localhost;dbname=database', 'username', 'password');
// Define pagination parameters
$limit = 100; // Number of records to fetch per page
$page = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1; // Current page number, cast to int and clamped to at least 1
$offset = ($page - 1) * $limit; // Calculate offset
// Fetch data using LIMIT and OFFSET
$stmt = $pdo->prepare("SELECT * FROM table_name LIMIT :limit OFFSET :offset");
$stmt->bindParam(':limit', $limit, PDO::PARAM_INT);
$stmt->bindParam(':offset', $offset, PDO::PARAM_INT);
$stmt->execute();
$data = $stmt->fetchAll(); // Only the current page of rows is held in memory
// Process and display fetched data
foreach ($data as $row) {
    // Process each row of data
    echo $row['column_name'] . "<br>";
}
// Display pagination links
// You can build pagination links from the total number of records in the database, as sketched below
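A minimal sketch of those links, assuming the same $pdo connection, $limit, and $page from above and the same illustrative table_name, counts the rows once and prints one link per page:

// Count the total rows once to work out how many pages exist
$totalRows  = (int) $pdo->query("SELECT COUNT(*) FROM table_name")->fetchColumn();
$totalPages = (int) ceil($totalRows / $limit);

// Print simple numbered links; the ?page= URL format is only an illustration
for ($p = 1; $p <= $totalPages; $p++) {
    if ($p === $page) {
        echo "<strong>$p</strong> ";
    } else {
        echo '<a href="?page=' . $p . '">' . $p . '</a> ';
    }
}

Note that on very large tables the COUNT(*) itself can be expensive, so its result is often cached rather than recomputed on every request.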