home:favogt:symbolictw (-Bsymbolic-functions)
  https://download.opensuse.org/repositories/home:/favogt:/symbolictw/standard/

openSUSE:Factory:Staging:A
  https://download.opensuse.org/repositories/openSUSE:/Factory:/Staging:/A/standard/
  https://download.opensuse.org/repositories/openSUSE:/Factory:/Staging:/A/bootstrap_copy/

openSUSE:Factory:Staging - openSUSE Factory Staging
  This is just a namespace for various staging projects.
  https://download.opensuse.org/repositories/openSUSE:/Factory:/Staging/standard/

openSUSE:Factory - The next openSUSE distribution
  Any user who wishes to have the newest packages, including but not limited to the
  Linux kernel, SAMBA, git, desktops, office applications and many other packages,
  will want Tumbleweed. Tumbleweed appeals to Power Users, Software Developers and
  openSUSE Contributors. If you require the latest software stacks and Integrated
  Development Environment, or need a stable platform closest to bleeding-edge Linux,
  Tumbleweed is the best choice for you.
  Staging dashboard is located at: https://build.opensuse.org/staging_workflows/openSUSE:Factory
  List of known devel projects: https://build.opensuse.org/package/view_file/openSUSE:Factory:Staging/dashboard/devel_projects
  Have a look at http://en.opensuse.org/Portal:Factory for more details.
  https://download.opensuse.org/repositories/openSUSE:/Factory/ports/

perl-WWW-RobotRules - database of robots.txt-derived permissions

This module parses _/robots.txt_ files as specified in "A Standard for Robot
Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use
the _/robots.txt_ file to forbid conforming robots from accessing parts of
their web site.

The parsed files are kept in a WWW::RobotRules object, and this object provides
methods to check if access to a given URL is prohibited. The same
WWW::RobotRules object can be used for one or more parsed _/robots.txt_ files
on any number of hosts.

The following methods are provided:

* $rules = WWW::RobotRules->new($robot_name)
  This is the constructor for WWW::RobotRules objects. The first argument given
  to new() is the name of the robot.

* $rules->parse($robot_txt_url, $content, $fresh_until)
  The parse() method takes as arguments the URL that was used to retrieve the
  _/robots.txt_ file, and the contents of the file.

* $rules->allowed($uri)
  Returns TRUE if this robot is allowed to retrieve this URL.

* $rules->agent([$name])
  Get/set the agent name. NOTE: Changing the agent name will clear the
  robots.txt rules and expire times out of the cache.
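
As a quick illustration of the methods above, here is a minimal usage sketch.
It assumes the _/robots.txt_ content is fetched with LWP::Simple; the robot
name and URLs are placeholders, not values taken from this package.

    use strict;
    use warnings;
    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Create a rules object for our robot (the name is a placeholder)
    my $rules = WWW::RobotRules->new('MyBot/1.0');

    # Fetch and parse the robots.txt file of a host (example URL)
    my $robots_url = 'http://www.example.com/robots.txt';
    my $robots_txt = get($robots_url);   # LWP::Simple::get returns undef on failure
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    # Check whether a page on that host may be retrieved before fetching it
    my $page_url = 'http://www.example.com/some/page.html';
    if ($rules->allowed($page_url)) {
        my $content = get($page_url);
        # ... process $content ...
    }
    else {
        print "robots.txt disallows $page_url for this robot\n";
    }

Because the same WWW::RobotRules object can hold rules for any number of
hosts, parse() can be called once per fetched _/robots.txt_ file and allowed()
consulted for every URL the robot intends to visit.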