<metapackage xmlns:os="http://opensuse.org/Standards/One_Click_Install" xmlns="http://opensuse.org/Standards/One_Click_Install">
  <group>
    <repositories>
      <repository recommended="true">
        <name>home:Levitating</name>
        <summary>Levitating repository</summary>
        <description>My personal repository.</description>
        <url>https://download.opensuse.org/repositories/home:/Levitating/16.0/</url>
      </repository>
      <repository recommended="true">
        <name>openSUSE:Leap:16.0</name>
        <summary>openSUSE Leap 16.0 based on SLFO</summary>
        <description>Leap 16.0 based on SLES 16.0 (specifically SLFO:1.2)</description>
        <url>https://download.opensuse.org/distribution/leap/16.0/repo/oss/</url>
      </repository>
      <repository recommended="true">
        <name>openSUSE:Backports:SLE-16.0</name>
        <summary>Community packages for SLE-16.0</summary>
        <description>Community packages for SLE-16.0</description>
        <url>https://download.opensuse.org/repositories/openSUSE:/Backports:/SLE-16.0/standard/</url>
      </repository>
      <repository recommended="false">
        <name>SUSE:SLFO:1.2</name>
        <summary>SLFO 1.2 (the base for openSUSE 16.0 and SLES 16.0)</summary>
        <description>SLFO 1.2, the common base for openSUSE Leap 16.0 and SLES 16.0</description>
        <url>https://download.opensuse.org/repositories/SUSE:/SLFO:/1.2/standard/</url>
      </repository>
    </repositories>
    <software>
      <item>
        <name>anubis</name>
        <summary>Weighs the soul of incoming HTTP requests using proof-of-work to stop AI crawlers</summary>
        <description>Anubis weighs the soul of your connection using a sha256 proof-of-work
challenge in order to protect upstream resources from scraper bots.

Installing and using this will likely result in your website not being indexed
by some search engines. This is considered a feature of Anubis, not a bug.

This is a bit of a nuclear response, but AI scraper bots have been scraping so
aggressively that they have forced my hand. I hate that I have to do this, but
this is what the modern Internet has come to, because bots don't conform to
standards like robots.txt, even when they claim to.

In most cases, you should not need this and can probably get by using
Cloudflare to protect a given origin. However, for circumstances where you
can't or won't use Cloudflare, Anubis is there for you.</description>
      </item>
    </software>
  </group>
</metapackage>
