Afuzz: automated web path fuzzing tool
Afuzz is an automated web path fuzzing tool for bug bounty projects.
- Automatically detects the development language used by the website and generates extensions accordingly
- Uses a blacklist to filter out invalid pages
- Uses a whitelist to find content that bug bounty hunters are interested in
- Filters random content on the page
- Judges 404 error pages in multiple ways
- Performs statistical analysis on the results after scanning to obtain the final result
- Supports HTTP/2
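One common way to judge soft-404 pages (a minimal sketch of the general technique, not Afuzz's exact implementation) is to request a deliberately random, nonexistent path first, then treat any later response that closely resembles that baseline as "not found":

```python
import difflib

def looks_like_404(baseline, response, threshold=0.9):
    """Heuristic soft-404 check (illustrative sketch, not Afuzz's exact logic).

    `baseline` is the response to a random, known-nonexistent path;
    `response` is the page under test. Each is a (status_code, body) tuple.
    """
    b_status, b_body = baseline
    r_status, r_body = response
    if r_status == 404:
        return True                      # hard 404
    if r_status != b_status:
        return False                     # different status than the missing page
    # Soft 404: same status and a body nearly identical to the missing-page template
    similarity = difflib.SequenceMatcher(None, b_body, r_body).ratio()
    return similarity > threshold
```

Comparing body similarity rather than exact equality handles pages that embed the requested path or other random content in the error template.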
git clone https://github.com/rapiddns/Afuzz.git
python setup.py install
- The wordlist is a text file with one path per line.
- For extensions, Afuzz replaces the %EXT% keyword with the extensions passed via the -e flag. If -e is not given, a default set is used.
- Afuzz can also generate a dictionary based on the domain name: it replaces %subdomain% with the full host, %rootdomain% with the root domain, %sub% with the subdomain, and %domain% with the domain name, and expands %ext% as above.
- Normal extensions
Passing the asp and aspx extensions will generate the following dictionary:
Passing https://test-www.hackerone.com and the php extension will generate the following dictionary:
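The placeholder substitution described above can be sketched roughly as follows (an illustrative sketch, not Afuzz's actual code; the naive root-domain split ignores multi-part TLDs such as .co.uk):

```python
from urllib.parse import urlparse

def expand_wordlist(lines, url, extensions):
    """Expand wordlist placeholders (illustrative sketch).

    %EXT%        -> one entry per extension passed via -e
    %subdomain%  -> the full host (e.g. test-www.hackerone.com)
    %rootdomain% -> the root domain (e.g. hackerone.com)
    %sub%        -> the subdomain part (e.g. test-www)
    %domain%     -> the domain name (e.g. hackerone)
    """
    host = urlparse(url).netloc
    parts = host.split(".")
    rootdomain = ".".join(parts[-2:])    # naive: assumes a single-label TLD
    sub = ".".join(parts[:-2])
    domain = parts[-2]

    out = []
    for line in lines:
        line = (line.replace("%subdomain%", host)
                    .replace("%rootdomain%", rootdomain)
                    .replace("%sub%", sub)
                    .replace("%domain%", domain))
        if "%EXT%" in line:
            # One output entry per extension
            out.extend(line.replace("%EXT%", ext) for ext in extensions)
        else:
            out.append(line)
    return out
```

For example, expanding `["index.%EXT%", "%domain%.zip"]` for `https://test-www.hackerone.com` with the php extension yields `["index.php", "hackerone.zip"]`.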
Below are some examples of the most common Afuzz arguments. For the full list, use the -h argument.
afuzz -u https://target
afuzz -e php,html,js,json -u https://target
afuzz -e php,html,js -u https://target -d 3
The thread number (-t | --threads) sets the number of concurrent brute-force workers, so a larger thread number makes afuzz run faster. By default it is 10, but you can increase it to speed up scanning.
That said, speed still depends largely on the server's response time. As a warning, keep the thread number reasonable, because too many threads can effectively cause a DoS.
afuzz -e aspx,jsp,php,htm,js,bak,zip,txt,xml -u https://target -t 50
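The worker model behind -t/--threads can be sketched with a thread pool (an illustrative sketch, not Afuzz's actual code; `check` stands in for a hypothetical per-path HTTP probe):

```python
from concurrent.futures import ThreadPoolExecutor

def brute_force(paths, check, threads=10):
    """Run check(path) across `threads` concurrent workers (sketch of the
    -t/--threads idea) and return the paths for which check() was truthy.

    Because each request spends most of its time waiting on the network,
    more workers means more requests in flight at once.
    """
    with ThreadPoolExecutor(max_workers=threads) as pool:
        results = pool.map(check, paths)     # preserves input order
    return [p for p, hit in zip(paths, results) if hit]
```

This also illustrates the caveat above: throughput is bounded by the server's response time, and a very large pool simply floods the target.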
The blacklist.txt and bad_string.txt files in the /db directory are blacklists used to filter out unwanted pages.
The blacklist.txt file uses the same format as dirsearch.
The bad_string.txt file is a text file with one rule per line in the format position==content, where == is the separator and position is one of: header, body, regex, title.
The language.txt file contains the language-detection rules, in the same format as bad_string.txt; Afuzz uses it to detect the development language of the website.
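The position==content rule format shared by bad_string.txt and language.txt could be parsed and applied roughly like this (an illustrative sketch, assuming the rules are matched as plain substrings except for the regex position):

```python
import re

def parse_rules(text):
    """Parse rules in the position==content format (illustrative sketch).

    Splits on the first '==' only, so the content itself may contain '=='.
    Lines with an unknown position, or no separator, are skipped.
    """
    rules = []
    for line in text.splitlines():
        line = line.strip()
        if not line or "==" not in line:
            continue
        position, content = line.split("==", 1)
        if position in ("header", "body", "regex", "title"):
            rules.append((position, content))
    return rules

def matches_any(rules, title, body, headers):
    """Check one response against the parsed rules (illustrative sketch)."""
    for position, content in rules:
        if position == "title" and content in title:
            return True
        if position == "body" and content in body:
            return True
        if position == "header" and any(content in v for v in headers.values()):
            return True
        if position == "regex" and re.search(content, body):
            return True
    return False
```

Used as a blacklist, a match means "discard this page"; used with language.txt, a match identifies the development language.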