aranea

https://www.bananas-playground.net/projekt/aranea

A small web crawler named aranea (Latin for spider). The aim is to gather unique domains to show what is out there.

Fetch

It starts with a given set of URLs and parses them for more URLs, which are stored and then fetched as well. -> fetch.pl
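
As a rough illustration of that loop, the sketch below fetches one queued URL and stores the response body. It is not the actual fetch.pl: the MySQL DSN, credentials, and the table and column names (url_to_fetch, fetched, body) are assumptions for illustration only.

```perl
# Minimal sketch of the fetch idea, not the real fetch.pl.
# Database name, credentials and table/column names are assumptions.
use strict;
use warnings;
use DBI;
use LWP::UserAgent;

my $dbh = DBI->connect('DBI:mysql:database=aranea', 'user', 'password',
                       { RaiseError => 1 });
my $ua  = LWP::UserAgent->new(timeout => 10, agent => 'aranea');

# Take one URL from the queue that has not been fetched yet.
my ($id, $url) = $dbh->selectrow_array(
    'SELECT id, url FROM url_to_fetch WHERE fetched = 0 LIMIT 1'
);
exit unless defined $url;

my $response = $ua->get($url);
if ($response->is_success) {
    # Store the raw body so the parse step can look for more URLs later.
    $dbh->do('UPDATE url_to_fetch SET body = ?, fetched = 1 WHERE id = ?',
             undef, $response->decoded_content, $id);
}
```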

Parse

Each stored fetch result is parsed for further URLs to follow. -> parse-results.pl
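
The link extraction could look roughly like the sketch below, which uses HTML::LinkExtor to pull href attributes out of a stored body and resolve them against the page URL. This is only an illustration, not the actual parse-results.pl.

```perl
# Sketch of the parse step: pull links out of a stored result.
use strict;
use warnings;
use HTML::LinkExtor;
use URI;

sub extract_links {
    my ($body, $base_url) = @_;
    my @found;
    my $extor = HTML::LinkExtor->new(sub {
        my ($tag, %attrs) = @_;
        return unless $tag eq 'a' && $attrs{href};
        # Resolve relative links against the page they came from.
        push @found, URI->new_abs($attrs{href}, $base_url)->as_string;
    });
    $extor->parse($body);
    return @found;
}

# Example: my @urls = extract_links($stored_body, 'https://example.com/');
```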

Cleanup

After a run, cleanup gathers all unique domains into a table and removes URLs from the fetch table for which enough have already been collected. -> cleanup.pl
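
The domain-gathering part of that step could look roughly like the sketch below. The table names (url_to_fetch, unique_domain) are assumptions, and the real cleanup.pl may work differently.

```perl
# Rough sketch of the cleanup idea, not the real cleanup.pl.
# Table names and the use of MySQL's INSERT IGNORE are assumptions.
use strict;
use warnings;
use DBI;
use URI;

my $dbh = DBI->connect('DBI:mysql:database=aranea', 'user', 'password',
                       { RaiseError => 1 });

# Collect every distinct host from the fetch table into a domain table.
my $urls = $dbh->selectcol_arrayref('SELECT url FROM url_to_fetch');
for my $url (@$urls) {
    my $uri = URI->new($url);
    next unless $uri->can('host');          # skip mailto:, javascript:, ...
    my $host = lc($uri->host // '');
    next unless length $host;
    # INSERT IGNORE keeps the domain list unique.
    $dbh->do('INSERT IGNORE INTO unique_domain (domain) VALUES (?)',
             undef, $host);
}

# Trimming the fetch table (e.g. keeping only a few URLs per domain)
# would follow here.
```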

Ignores

The table url_to_ignore holds a small number of domains and domain fragments that will be ignored. Adding a global spam list would be overkill.
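
One way the ignore list could be applied before queueing a URL is sketched below. The column name (searchfor) and the substring match are assumptions about the schema, not taken from setup.sql.

```perl
# Sketch of applying the ignore list; column name and matching rule are assumed.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('DBI:mysql:database=aranea', 'user', 'password',
                       { RaiseError => 1 });

sub is_ignored {
    my ($url) = @_;
    my $patterns = $dbh->selectcol_arrayref('SELECT searchfor FROM url_to_ignore');
    # A URL is skipped if it contains any of the stored domain fragments.
    for my $pattern (@$patterns) {
        return 1 if index(lc $url, lc $pattern) >= 0;
    }
    return 0;
}
```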

It is also a good idea to run the crawler behind a DNS filter with a solid blocklist.