This builder creates a simplified `robots.txt` file for your Angular project. 🚀
Depending on the active configuration, the generated file contains either:

```txt
User-agent: *
Allow: /
Sitemap: https://www.mydomain.com/sitemap.xml
```

or

```txt
User-agent: *
Disallow: /
```
Register the builder as a target in your `angular.json`:

```json
{
  ...,
  "projects": {
    "your-project-name": {
      ...,
      "architect": {
        ...,
        "robots": {
          "builder": "ngx-devkit-builders:robots",
          "options": {
            "sitemap": "https://www.mydomain.com/sitemap.xml",
            "verbose": false
          },
          "configurations": {
            "production": {
              "allow": true
            },
            "development": {
              "allow": false
            }
          }
        }
      }
    }
  }
}
```

Then run the target with the desired configuration:

```shell
# Production configuration (allows crawling)
ng run your-project-name:robots:production

# Development configuration (disallows crawling)
ng run your-project-name:robots:development
```

| Option | Type | Default | Description |
|---|---|---|---|
| `sitemap` | string | - | URL to the sitemap.xml |
| `allow` | boolean | - | Whether to allow or disallow web crawling |
| `verbose` | boolean | `false` | Show detailed output |
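
For reference, the mapping from options to output can be sketched as a small function. This is an illustrative sketch only, not the builder's actual implementation; the `buildRobotsTxt` name and its options shape are assumptions modeled on the `allow` and `sitemap` options above:

```typescript
// Hypothetical sketch: how the builder's options could map to robots.txt content.
// Not the real ngx-devkit-builders source — for illustration only.
interface RobotsOptions {
  allow: boolean;     // true → "Allow: /", false → "Disallow: /"
  sitemap?: string;   // optional Sitemap line appended at the end
}

function buildRobotsTxt(options: RobotsOptions): string {
  const lines = [
    "User-agent: *",
    options.allow ? "Allow: /" : "Disallow: /",
  ];
  if (options.sitemap) {
    lines.push(`Sitemap: ${options.sitemap}`);
  }
  return lines.join("\n") + "\n";
}
```

With `{ allow: true, sitemap: "https://www.mydomain.com/sitemap.xml" }` this yields the production example shown above; with `{ allow: false }` it yields the development one.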