What are some best practices for optimizing PHP code to handle and display large datasets from a database efficiently?
When handling and displaying large datasets from a database in PHP, a few practices make a noticeable difference: paginate queries with LIMIT/OFFSET so only one page of rows is fetched per request, add indexes to the columns used in WHERE and ORDER BY clauses so the database can avoid full table scans, and cache expensive or frequently repeated query results to cut down on round trips to the database.
// Example code snippet for implementing pagination in PHP to handle large datasets efficiently
// Set the limit of records to fetch per page
$limit = 10;
// Calculate the offset based on the current page number
// (cast to int and clamp to 1 so user input can't inject SQL or go negative)
$page = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset = ($page - 1) * $limit;
// Query the database with the limit and offset bound as parameters
$stmt = mysqli_prepare($connection, "SELECT * FROM table_name LIMIT ? OFFSET ?");
mysqli_stmt_bind_param($stmt, "ii", $limit, $offset);
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt);
// Loop through the results and display them
while ($row = mysqli_fetch_assoc($result)) {
// Display data here
}
// Add pagination links for navigating through the dataset
$count_result = mysqli_query($connection, "SELECT COUNT(*) AS total FROM table_name");
$total_records = (int) mysqli_fetch_assoc($count_result)['total'];
$total_pages = ceil($total_records / $limit);
for ($i = 1; $i <= $total_pages; $i++) {
echo "<a href='?page=$i'>$i</a> ";
}
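The other two practices mentioned above can be sketched as well. The snippet below is a minimal illustration, not a drop-in implementation: it assumes the APCu extension is installed and enabled, reuses the hypothetical table_name from the example, and uses created_at as a placeholder column name.

// Indexing: add an index on the columns your queries filter or sort by, e.g.:
//   CREATE INDEX idx_created_at ON table_name (created_at);
// Run once as a migration, not on every request.

// Caching: store the expensive COUNT(*) in APCu so it is not re-run on every page load
$total_records = apcu_fetch('table_name_count', $hit);
if (!$hit) {
    $count_result = mysqli_query($connection, "SELECT COUNT(*) AS total FROM table_name");
    $total_records = (int) mysqli_fetch_assoc($count_result)['total'];
    // Cache for 60 seconds; pagination links may lag slightly behind inserts/deletes
    apcu_store('table_name_count', $total_records, 60);
}

A short TTL like this keeps the page-count query cheap while bounding how stale the link list can get; for datasets that change rarely, a longer TTL or explicit cache invalidation on write is reasonable.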