libwww-robotrules-perl binary package in Ubuntu Trusty i386

 WWW::RobotRules parses /robots.txt files as specified in "A Standard for
 Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters
 can use the /robots.txt file to forbid conforming robots from accessing parts
 of their web site.
 The parsed files are kept in a WWW::RobotRules object, and this object
 provides methods to check if access to a given URL is prohibited. The same
 WWW::RobotRules object can be used for one or more parsed /robots.txt files
 on any number of hosts.
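
 A minimal usage sketch of the module, assuming LWP::Simple is available to
 fetch the file (the agent name and URLs below are illustrative, not from the
 package): new() takes the robot's User-Agent name, parse() loads a fetched
 /robots.txt, and allowed() checks a URL against the stored rules.

   use strict;
   use warnings;
   use WWW::RobotRules;
   use LWP::Simple qw(get);

   # Identify the robot by its User-Agent name.
   my $rules = WWW::RobotRules->new('MyCrawler/1.0');

   # Fetch and parse a site's /robots.txt (URL is illustrative).
   my $robots_url = 'http://www.example.com/robots.txt';
   my $robots_txt = get($robots_url);
   $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

   # Ask whether a given URL may be fetched under the parsed rules.
   my $url = 'http://www.example.com/private/page.html';
   if ($rules->allowed($url)) {
       print "Fetching $url is allowed\n";
   }
   else {
       print "$url is disallowed by robots.txt\n";
   }

 The same $rules object can then parse /robots.txt files from further hosts,
 and allowed() will apply whichever host's rules match the URL being checked.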

Publishing history

  Date:       2013-10-18 14:25:21 UTC
  Status:     Published
  Target:     Ubuntu Trusty i386
  Pocket:     release
  Component:  main
  Section:    perl
  Priority:   Optional
  Version:    6.01-1
  • Copied from ubuntu oneiric-release i386 in Primary Archive for Ubuntu