10. Robots.txt
Configure your site's robots rules to control how search engines crawl it.
Introduction
robots.txt tells search engine crawlers which parts of your site they may crawl and index. It is useful for keeping private pages out of search results, and for blocking indexing entirely while a site is still in development.
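As a quick illustration, a minimal robots.txt that lets every crawler in but hides one directory looks like this (the `/private` path here is just a placeholder):

```
User-agent: *
Disallow: /private
```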
Block Search Engines During Development
.env Configuration
```
# robots
NUXT_SITE_ENV=preview
```
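With the Nuxt SEO modules, any non-production `NUXT_SITE_ENV` value (such as `preview`) marks the site as non-indexable, so the generated robots.txt blocks all crawlers. The output should look roughly like this (the exact banner comments may vary by module version):

```
# START nuxt-robots (indexing disabled)
User-agent: *
Disallow: /
# END nuxt-robots
```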
Enable Indexing
- Remove the `NUXT_SITE_ENV` entry from `.env`.
- Configure `nuxt.config.ts` with `robots: { disallow: ['/api', '/account'] }` to block the `/api` and `/account` directories (see the sketch after this list).
- View the generated robots.txt at the live URL below.
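A minimal sketch of the `nuxt.config.ts` change, assuming the `@nuxtjs/robots` module (or the `@nuxtjs/seo` bundle that includes it) is installed:

```ts
// nuxt.config.ts — minimal sketch; the module list assumes @nuxtjs/robots
export default defineNuxtConfig({
  modules: ['@nuxtjs/robots'],
  robots: {
    // Keep crawlers out of private paths; everything else stays indexable.
    disallow: ['/api', '/account'],
  },
})
```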
https://saas-fast.ducafecat.com/robots.txt
```
# START nuxt-robots (indexable)
User-agent: *
Disallow: /api
Disallow: /account
Sitemap: https://saas-fast.ducafecat.com/sitemap.xml
# END nuxt-robots
```