What are some potential pitfalls of using array_search in PHP when trying to filter duplicate entries in a database query result?

Using array_search in PHP to filter duplicate entries from a query result has several pitfalls. Each call scans the array linearly, so a deduplication loop becomes O(n²) and degrades quickly on large result sets, with extra memory spent on the second array of unique rows. By default array_search also uses loose (==) comparison, so values like "1" and 1 may be treated as duplicates; pass true as the third argument for strict comparison. Finally, its return value needs a strict check: it returns the matching key, which can legitimately be 0, so a plain truthiness test misbehaves — compare against false with === instead. In most cases the better approach is to let the database remove duplicates directly, e.g. with SELECT DISTINCT or GROUP BY.
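When duplicates can be eliminated at the source, the query-level approach is usually simplest. A minimal sketch using PDO — the DSN, credentials, `users` table, and `email` column are placeholders, not from the original question:

```php
<?php
// Connection details are hypothetical placeholders
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

// Let the database drop duplicates before PHP ever sees them
$stmt = $pdo->query('SELECT DISTINCT email FROM users');
$uniqueEmails = $stmt->fetchAll(PDO::FETCH_COLUMN);
```

This keeps the deduplication work in the database engine, which can use indexes and avoids transferring duplicate rows to PHP at all.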

// Assume $results is an array of database query results with potential duplicate entries

// Create an empty array to store unique entries
$uniqueResults = [];

// Iterate over the results and filter out duplicates
foreach ($results as $result) {
    // array_search scans $uniqueResults on every iteration (O(n) per lookup,
    // O(n^2) overall). The third argument enables strict comparison, and the
    // === false check is required because a matching key of 0 is falsy.
    if (array_search($result, $uniqueResults, true) === false) {
        $uniqueResults[] = $result;
    }
}

// $uniqueResults now contains only unique entries
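If the deduplication must happen in PHP, a keyed lookup avoids the repeated linear scans. A sketch, assuming each result row serializes to a stable string (i.e. duplicate rows have their keys in the same order):

```php
<?php
$seen = [];
$uniqueResults = [];

foreach ($results as $result) {
    // Use the serialized row as a hash key: isset() is an O(1) lookup,
    // versus array_search's O(n) scan per element
    $key = serialize($result);
    if (!isset($seen[$key])) {
        $seen[$key] = true;
        $uniqueResults[] = $result;
    }
}
```

For flat arrays of scalars, array_unique($results) achieves the same result in a single call.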