How can PHP developers effectively optimize database queries for large datasets with over 10 million entries?

When a table contains tens of millions of rows, PHP developers can keep queries fast by indexing the columns used for filtering and sorting, limiting the number of rows retrieved per query, and paginating results so data is fetched in smaller chunks. Queries can be optimized further by avoiding unnecessary joins, simplifying query structure, and caching results to reduce load on the database. The snippets below illustrate pagination, indexing, and caching in turn.

// Example of paginating a query over a large table, fetching rows in small chunks
$limit = 100; // Number of rows to retrieve per page
$page = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1; // Current page number, sanitized

$offset = ($page - 1) * $limit; // Calculate offset based on the current page

// Bind values through a prepared statement so user input never reaches the SQL string
$stmt = mysqli_prepare($connection, "SELECT * FROM large_table LIMIT ? OFFSET ?");
mysqli_stmt_bind_param($stmt, 'ii', $limit, $offset);
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt);

while ($row = mysqli_fetch_assoc($result)) {
    // Process each row
}

mysqli_stmt_close($stmt);
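
Casting the page number to an integer and binding the limit and offset through a prepared statement keeps user input out of the SQL string entirely. Indexes matter most when queries filter or sort on a particular column; without one, every request scans the full table. The sketch below assumes a created_at column (and an id column) on large_table purely for illustration, and the index statement would normally be run once, for example as part of a migration.

// Hypothetical indexing sketch: "created_at" and "id" are assumed column names
mysqli_query($connection, "CREATE INDEX idx_large_table_created_at ON large_table (created_at)");

// Queries that filter and sort on the indexed column can now avoid a full table scan
$stmt = mysqli_prepare(
    $connection,
    "SELECT id, created_at FROM large_table WHERE created_at >= ? ORDER BY created_at LIMIT 100"
);
$since = '2024-01-01 00:00:00'; // Illustrative cutoff value
mysqli_stmt_bind_param($stmt, 's', $since);
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt);

while ($row = mysqli_fetch_assoc($result)) {
    // Process each row
}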
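
Caching reduces how often the same page of results hits the database at all. A minimal sketch is shown below, assuming the APCu extension is available (any cache store such as Redis or Memcached works the same way); the cache key and the five-minute TTL are illustrative choices.

// Hypothetical caching sketch using APCu
$cacheKey = "large_table_page_" . $page;
$rows = apcu_fetch($cacheKey, $found);

if (!$found) {
    // Cache miss: run the paginated query and store the result set
    $stmt = mysqli_prepare($connection, "SELECT * FROM large_table LIMIT ? OFFSET ?");
    mysqli_stmt_bind_param($stmt, 'ii', $limit, $offset);
    mysqli_stmt_execute($stmt);
    $result = mysqli_stmt_get_result($stmt);

    $rows = mysqli_fetch_all($result, MYSQLI_ASSOC);
    apcu_store($cacheKey, $rows, 300); // Keep the cached page for 5 minutes
}

foreach ($rows as $row) {
    // Process each row
}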