Go robots.txt parser
Robots.txt parser and generator - Work in progress
Package robots implements robots.txt file parsing and matching based on Google's specification.
A simple and flexible web crawler that follows robots.txt policies and crawl delays.
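As a rough illustration of the crawl-delay side of such a crawler (a sketch, not any particular project's code), a polite fetch loop can simply sleep between successive requests to the same host; the delay value and URLs below are placeholders:

```go
// Minimal sketch of honouring a Crawl-delay directive between requests.
// The delay is assumed to have been read from the site's robots.txt.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	crawlDelay := 10 * time.Second // e.g. "Crawl-delay: 10" from robots.txt
	urls := []string{
		"https://www.example.com/a",
		"https://www.example.com/b",
	}
	for i, u := range urls {
		if i > 0 {
			time.Sleep(crawlDelay) // pause between requests to the same host
		}
		resp, err := http.Get(u)
		if err != nil {
			fmt.Println("fetch error:", err)
			continue
		}
		resp.Body.Close()
		fmt.Println("fetched", u, resp.Status)
	}
}
```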
grobotstxt is a native Go port of Google's robots.txt parser and matcher library.
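For matchers in this family, usage generally amounts to handing the raw robots.txt body, a user-agent string, and a candidate URL to a single check function. The sketch below assumes grobotstxt's `AgentAllowed` helper and the `github.com/jimsmart/grobotstxt` import path as described in the project's documentation; verify both against the current release:

```go
// Minimal sketch of checking a URL against a site's robots.txt with a
// Google-spec matcher such as grobotstxt. The function name, signature,
// and import path are assumptions taken from the project's README.
package main

import (
	"fmt"
	"io"
	"net/http"

	"github.com/jimsmart/grobotstxt"
)

func main() {
	// Fetch the robots.txt body for the target site.
	resp, err := http.Get("https://www.example.com/robots.txt")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}

	// Ask whether a given user agent may fetch a given URL.
	ok := grobotstxt.AgentAllowed(string(body), "FooBot/1.0",
		"https://www.example.com/private/page.html")
	fmt.Println("allowed:", ok)
}
```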
A robots.txt exclusion protocol implementation for the Go language.
Enumerate old versions of robots.txt paths using Wayback Machine for content discovery
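Enumeration of historical robots.txt files is typically done through the Wayback Machine's CDX API. The sketch below (not the tool's own code) lists snapshot URLs for a placeholder domain; the query parameters shown are the commonly documented ones:

```go
// Minimal sketch of enumerating archived robots.txt snapshots via the
// Wayback Machine CDX API. The target domain is a placeholder.
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"net/url"
)

func main() {
	q := url.Values{}
	q.Set("url", "example.com/robots.txt")
	q.Set("output", "json")
	q.Set("fl", "timestamp,original")
	q.Set("collapse", "digest") // one row per distinct content hash

	resp, err := http.Get("https://web.archive.org/cdx/search/cdx?" + q.Encode())
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}

	// The JSON output is an array of rows; the first row holds field names.
	var rows [][]string
	if err := json.Unmarshal(body, &rows); err != nil {
		panic(err)
	}
	for i, row := range rows {
		if i == 0 || len(row) < 2 {
			continue // skip the header row
		}
		// Each snapshot is viewable at /web/<timestamp>/<original>.
		fmt.Printf("https://web.archive.org/web/%s/%s\n", row[0], row[1])
	}
}
```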
Go language library for parsing Sitemaps
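A standard `<urlset>` sitemap can be decoded with nothing more than `encoding/xml`; real sitemap libraries additionally handle sitemap index files, gzip, and size limits. A minimal sketch, using a placeholder URL:

```go
// Minimal sketch of parsing a <urlset> sitemap with encoding/xml.
package main

import (
	"encoding/xml"
	"fmt"
	"net/http"
)

// urlset models the standard sitemap schema (sitemaps.org protocol).
type urlset struct {
	URLs []struct {
		Loc     string `xml:"loc"`
		LastMod string `xml:"lastmod"`
	} `xml:"url"`
}

func main() {
	resp, err := http.Get("https://www.example.com/sitemap.xml")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var s urlset
	if err := xml.NewDecoder(resp.Body).Decode(&s); err != nil {
		panic(err)
	}
	for _, u := range s.URLs {
		fmt.Println(u.Loc, u.LastMod)
	}
}
```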
Parsero is a free script written in Go that reads a web server's robots.txt file and examines the Disallow entries, which tell search engines which directories or files hosted on the server must not be indexed.
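The core idea can be sketched in a few lines of Go (this is an illustration of the technique, not Parsero's own code): fetch robots.txt, extract each Disallow path, and probe it against the live site; the base URL is a placeholder.

```go
// Minimal sketch of the Parsero idea: list Disallow entries from
// robots.txt and check whether each path actually responds.
package main

import (
	"bufio"
	"fmt"
	"net/http"
	"strings"
)

func main() {
	base := "https://www.example.com"

	resp, err := http.Get(base + "/robots.txt")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	sc := bufio.NewScanner(resp.Body)
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		if !strings.HasPrefix(strings.ToLower(line), "disallow:") {
			continue
		}
		path := strings.TrimSpace(line[len("disallow:"):])
		if path == "" || strings.ContainsAny(path, "*$") {
			continue // skip blanket or wildcard rules in this sketch
		}
		// Probe the disallowed path to see whether it is reachable.
		r, err := http.Get(base + path)
		if err != nil {
			fmt.Println(path, "error:", err)
			continue
		}
		r.Body.Close()
		fmt.Println(path, "->", r.Status)
	}
}
```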