Binary package “libwww-robotrules-perl” in Ubuntu Xenial
database of robots.txt-derived permissions
WWW::RobotRules parses /robots.txt files as specified in "A Standard for
Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters
can use the /robots.txt file to forbid conforming robots from accessing parts
of their web site.
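For reference, a minimal /robots.txt in the format that standard defines; the directory shown is purely illustrative:

    User-agent: *
    Disallow: /private/

This forbids every conforming robot from fetching anything under /private/ on the host that serves the file.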
The parsed files are kept in a WWW::RobotRules object, and this object
provides methods to check if access to a given URL is prohibited. The same
WWW::RobotRules object can be used for one or more parsed /robots.txt files
on any number of hosts.
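A minimal sketch of that workflow; the agent name, host, and rules below are illustrative (in practice the robots.txt body would be fetched over HTTP, e.g. with LWP::Simple):

    use strict;
    use warnings;
    use WWW::RobotRules;

    # Hypothetical User-Agent string identifying our robot.
    my $rules = WWW::RobotRules->new('MyBot/1.0');

    # Stand-in for a fetched /robots.txt body.
    my $robots_url = 'http://example.com/robots.txt';
    my $robots_txt = "User-agent: *\nDisallow: /private/\n";
    $rules->parse($robots_url, $robots_txt);

    # The same object answers queries for every host it has parsed.
    print $rules->allowed('http://example.com/index.html')
        ? "allowed\n" : "forbidden\n";      # allowed
    print $rules->allowed('http://example.com/private/a.html')
        ? "allowed\n" : "forbidden\n";      # forbidden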
Published versions
- libwww-robotrules-perl 6.01-1 in amd64 (Release)
- libwww-robotrules-perl 6.01-1 in arm64 (Release)
- libwww-robotrules-perl 6.01-1 in armhf (Release)
- libwww-robotrules-perl 6.01-1 in i386 (Release)
- libwww-robotrules-perl 6.01-1 in powerpc (Release)
- libwww-robotrules-perl 6.01-1 in ppc64el (Release)
- libwww-robotrules-perl 6.01-1 in s390x (Release)