Getting Started
10. Robots.txt
Configure your website's robots rules to control how search engines crawl it.
Introduction
robots.txt tells search engine crawlers which parts of your site they may index. It is useful for keeping private pages out of search results, and for blocking indexing entirely while the site is still in development.
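For example, a minimal robots.txt that blocks one directory but leaves the rest of the site indexable might look like this (the `/admin` path and domain are illustrative):

```txt
User-agent: *
Disallow: /admin
Sitemap: https://example.com/sitemap.xml
```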
Block Search During Development
Add the following to your `.env` file:

```txt
# robots
NUXT_SITE_ENV=preview
```
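When the site environment is set to a non-production value such as `preview`, the robots module should treat the site as non-indexable and serve a robots.txt that blocks all crawlers, roughly like the sketch below (exact comments vary by module version):

```txt
User-agent: *
Disallow: /
```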
Enable Indexing
- Remove the `NUXT_SITE_ENV` configuration from `.env`
- Configure the `nuxt.config.ts` file:

```ts
robots: { disallow: ['/api', '/account'] },
```

This blocks the `/api` and `/account` directories from being indexed.

- View the generated robots.txt at https://saas-fast.ducafecat.com/robots.txt
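Putting the steps above together, the relevant part of `nuxt.config.ts` might look like the following sketch (the surrounding `modules` entry is assumed from a typical `@nuxtjs/robots` setup):

```ts
export default defineNuxtConfig({
  modules: ['@nuxtjs/robots'],
  // Keep private directories out of search engine indexes
  robots: {
    disallow: ['/api', '/account'],
  },
})
```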
```txt
# START nuxt-robots (indexable)
User-agent: *
Disallow: /api
Disallow: /account
Sitemap: https://saas-fast.ducafecat.com/sitemap.xml
# END nuxt-robots
```
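As a quick sanity check, the `Disallow` prefixes above can be tested with a small helper. This is a minimal sketch of simple prefix matching for a generic crawler, not a full Robots Exclusion Protocol implementation; the sample paths are illustrative:

```typescript
// Disallow prefixes taken from the generated robots.txt above.
const disallowed: string[] = ['/api', '/account'];

// Returns true if a generic crawler may fetch the given path.
// (Real robots.txt matching has more rules, e.g. Allow and wildcards;
// this only covers plain prefix Disallow lines.)
function canFetch(path: string): boolean {
  return !disallowed.some((prefix) => path.startsWith(prefix));
}

console.log(canFetch('/api/users')); // false: blocked
console.log(canFetch('/account'));   // false: blocked
console.log(canFetch('/pricing'));   // true: indexable
```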