A simple Python CLI tool that checks the availability and validity of sitemap.xml and robots.txt — two crucial files for search engine crawling and indexing.
- Automatically constructs URLs for:
  - sitemap.xml
  - robots.txt
- Checks if the files are present and accessible (see the sketch after this feature list)
- Displays:
  - HTTP status codes
  - Content-Type
  - Errors (if any)
- CLI-based: Paste any domain and get results in seconds!
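Under the hood the workflow is straightforward: append each filename to the domain, request it over HTTP, and report what comes back. Below is a minimal sketch of that flow using `requests`; the function name, output layout, and error handling are illustrative assumptions, not necessarily the exact code in `sitemap_robots_validator.py`.

```python
# Sketch of the checking flow. Names and output format are assumptions
# for illustration, not a copy of the repository's script.
from urllib.parse import urljoin

import requests


def check_file(base_url: str, filename: str) -> None:
    """Build the file URL from the site root and report status, content type, or errors."""
    url = urljoin(base_url.rstrip("/") + "/", filename)
    print(f"\nChecking {url}")
    try:
        response = requests.get(url, timeout=10)
        print(f"  HTTP status: {response.status_code}")
        print(f"  Content-Type: {response.headers.get('Content-Type', 'unknown')}")
    except requests.RequestException as exc:
        print(f"  Error: {exc}")


if __name__ == "__main__":
    site = input("Enter the website URL (include https://): ").strip()
    for name in ("sitemap.xml", "robots.txt"):
        check_file(site, name)
```

A HEAD request would also be enough for a pure availability check, but GET keeps the sketch compatible with servers that reject HEAD.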
```bash
git clone https://github.com/Aditi-Rani/sitemap_robots_validator
cd sitemap_robots_validator
pip install -r requirements.txt
python sitemap_robots_validator.py
```

When prompted, enter the site to check:

```
Enter the website URL (include https://): https://example.com
```