How can one efficiently handle large datasets in PHP when converting and encoding data for JSON output?

When handling large datasets in PHP for JSON output, it's important to convert and encode the data efficiently to avoid running out of memory. One approach is to use generators, which yield the data in chunks lazily instead of loading the entire dataset into memory at once. The savings are greatest when the source is also read incrementally (for example, row by row from a database or file) and the encoded JSON is written out chunk by chunk, rather than assembled into one large string with a single json_encode() call.

// Example of handling a large dataset in PHP for JSON output using a generator

// Generator that yields the dataset in fixed-size chunks instead of
// returning one large array
function getDataChunk(array $data, int $chunkSize): Generator {
    $total = count($data);
    for ($i = 0; $i < $total; $i += $chunkSize) {
        yield array_slice($data, $i, $chunkSize);
    }
}

// Sample large dataset
$largeDataset = range(1, 1000000);

// Set chunk size
$chunkSize = 1000;

// Stream the JSON output chunk by chunk instead of collecting every chunk
// in an array and encoding the whole result with a single json_encode() call
echo '[';
$first = true;
foreach (getDataChunk($largeDataset, $chunkSize) as $chunk) {
    foreach ($chunk as $item) {
        // Process each item here as needed before encoding it
        if (!$first) {
            echo ',';
        }
        echo json_encode($item);
        $first = false;
    }
}
echo ']';
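
The memory benefit is most visible when the data never exists as one big PHP array in the first place. Below is a minimal sketch of that idea: it assumes a hypothetical CSV file (data.csv) with a header row and reads it line by line with a generator, encoding each row as it is written out, so only one row is held in memory at a time. The file name, column layout, and readCsvRows() helper are illustrative assumptions, not part of any particular library.

// Sketch: stream rows from a (hypothetical) CSV file straight to JSON output,
// so the full dataset is never held in memory at once
function readCsvRows(string $path): Generator {
    $handle = fopen($path, 'r');
    if ($handle === false) {
        throw new RuntimeException("Cannot open $path");
    }
    // Assume the first line holds the column names
    $headers = fgetcsv($handle);
    while (($row = fgetcsv($handle)) !== false) {
        // Yield one associative row at a time
        yield array_combine($headers, $row);
    }
    fclose($handle);
}

header('Content-Type: application/json');

echo '[';
$first = true;
foreach (readCsvRows('data.csv') as $row) {
    if (!$first) {
        echo ',';
    }
    // Encode only the current row; earlier rows have already been output
    echo json_encode($row);
    $first = false;
}
echo ']';

If the response should reach the client as it is produced, calling flush() after every few chunks can help, though whether that actually streams depends on the web server and output buffering configuration.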