What are some strategies for optimizing performance when dealing with large datasets in PHP?

When working with large datasets in PHP, optimizing performance is essential to prevent memory exhaustion and slow processing times. Common strategies include paginating queries so only a limited number of records is fetched at once, adding indexes on the columns your queries filter and sort by, caching frequently accessed data, and processing rows as they stream in rather than loading an entire result set into memory.

// Example of implementing pagination to limit the number of records fetched at once
$page   = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1; // cast to int; never trust raw user input
$limit  = 10;
$offset = ($page - 1) * $limit;

// Use a prepared statement so values are bound safely instead of interpolated into the SQL string
$stmt = mysqli_prepare($connection, "SELECT * FROM large_table LIMIT ? OFFSET ?");
mysqli_stmt_bind_param($stmt, 'ii', $limit, $offset);
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt);

while ($row = mysqli_fetch_assoc($result)) {
    // Process each row here
}