Some site owners choose to block AI crawlers, such as OpenAI’s GPTBot (used by ChatGPT) and Google-Extended (used by Bard), to prevent these services from learning from or using their website content. You can block these AI user-agents much as you would block Google crawlers: replace the default robots.txt file with a new file that specifies disallow rules for the specific AI user-agents.
To block both the GPTBot (ChatGPT) and Google-Extended crawlers:
1. Create a new robots.txt file. We recommend following Google’s instructions on how to create a robots.txt file.
2. Add the following code to the new robots.txt file. Note that crawlers process robots.txt from top to bottom, so we do not recommend placing the wildcard (User-agent: *) group at the top.
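A minimal sketch, assuming you want to disallow both crawlers site-wide and leave all other crawlers unrestricted (replace www.example.com with your own domain; the Sitemap line is optional):

    # Block OpenAI's GPTBot
    User-agent: GPTBot
    Disallow: /

    # Block Google-Extended (used for Bard training)
    User-agent: Google-Extended
    Disallow: /

    # Wildcard group: all other crawlers may access everything
    User-agent: *
    Disallow:

    # Sitemap is also available on /sitemap.xml
    Sitemap: https://www.example.com/sitemap.xml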
3. (Optional) If you need to block additional crawlers, add a group for each one in the same format and place it before the wildcard group, as shown below.
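For example, to also block Common Crawl’s CCBot (one of several AI-related user-agents you might choose to block), you would add this group above User-agent: *:

    # Block Common Crawl's CCBot
    User-agent: CCBot
    Disallow: /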
4. Replace the default robots.txt file with the new file. To learn how, see Custom sitemap, robots.txt & other files. Note that to replace the default file, the Source URL must match the file name exactly.
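For example, for a file named robots.txt, the settings would look like this (the File name field is illustrative; the key requirement is the matching Source URL):

    File name:  robots.txt
    Source URL: robots.txt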