How can PHP developers optimize their code when fetching data from a MySQL database to prevent browser crashes?

PHP developers can optimize data fetching from MySQL by limiting how much data is retrieved at once, paginating the results, and writing SQL queries that select only the rows and columns the page actually needs. Rendering thousands of rows in a single response is what typically overwhelms the browser, so keeping each page of results small prevents the crash.

// Example of implementing pagination in PHP when fetching data from a MySQL database
// (assumes an existing mysqli connection in $conn)

// Set the number of results to display per page
$results_per_page = 10;

// Determine the current page from the query string (default to page 1);
// cast to int and clamp to at least 1 so the value is safe to use in the query
$page = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
$offset = ($page - 1) * $results_per_page;

// Query to fetch one page of rows; select only the needed column instead of *
// ($offset and $results_per_page are integers, so interpolating them here is safe)
$sql = "SELECT column_name FROM table_name LIMIT $offset, $results_per_page";
$result = mysqli_query($conn, $sql);

// Display the fetched data
while ($row = mysqli_fetch_assoc($result)) {
    echo $row['column_name'] . "<br>";
}

// Build pagination links: count the total rows to work out how many pages exist
$sql = "SELECT COUNT(*) AS total FROM table_name";
$result = mysqli_query($conn, $sql);
$row = mysqli_fetch_assoc($result);
$total_pages = ceil($row['total'] / $results_per_page);

for ($i = 1; $i <= $total_pages; $i++) {
    echo "<a href='?page=$i'>$i</a> ";
}