What is the purpose of using a robots.txt file in PHP web development?

A robots.txt file is not specific to PHP; it is a plain-text file served from the site root that implements the Robots Exclusion Protocol. It tells compliant search-engine crawlers which paths they should not crawl, which helps keep administrative pages out of search results and avoid duplicate-content issues. It is important to understand that the file is advisory only: well-behaved bots honor it, but it provides no access control, and disallowed URLs can still be discovered, so genuinely sensitive content should be protected with authentication rather than robots.txt. In PHP web development, the file is typically either placed in the web root by hand or generated programmatically, for example as part of a deployment script.

<?php
// Generate a robots.txt file in the web root asking crawlers
// to skip certain directories. A heredoc keeps the rules readable.
$robotsTxt = <<<TXT
User-agent: *
Disallow: /admin/
Disallow: /private/
TXT;

// file_put_contents() returns false on failure, so check the result.
if (file_put_contents(__DIR__ . '/robots.txt', $robotsTxt) === false) {
    error_log('Failed to write robots.txt');
}
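
For reference, the generated file itself is just plain text. A slightly fuller example might also re-allow a subpath and advertise a sitemap; `Allow` and `Sitemap` are widely supported directives, and the sitemap URL below is a placeholder, not a real address:

```
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /private/public-docs/

Sitemap: https://example.com/sitemap.xml
```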