Hakrawler – Simple, Fast Web Application Crawler
Hakrawler is a Go web crawler designed for easy, quick discovery of endpoints and assets within a web application. It can be used to discover:
- Forms
- Endpoints
- Subdomains
- Related domains
- JavaScript files

The tool is designed so that it can be easily chained with other tools, such as subdomain enumerators and vulnerability scanners: it reads plain domains in and writes plain results out.
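That stdin-to-stdout contract is what makes chaining work: one domain per line in, one URL per line out. The Go sketch below illustrates the idea; the `normalize` helper is a hypothetical illustration, not hakrawler's actual code.

```go
// Sketch of the chaining contract: read one domain per line from stdin,
// emit one crawlable URL per line on stdout.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// normalize turns a bare domain into a crawlable URL, leaving
// already-qualified URLs untouched. (Hypothetical helper for illustration.)
func normalize(domain string) string {
	domain = strings.TrimSpace(domain)
	if strings.HasPrefix(domain, "http://") || strings.HasPrefix(domain, "https://") {
		return domain
	}
	return "https://" + domain
}

func main() {
	sc := bufio.NewScanner(os.Stdin)
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		if line == "" {
			continue
		}
		// One plain URL per line keeps the output greppable and pipeable.
		fmt.Println(normalize(line))
	}
}
```

Because every output line is a plain URL, downstream tools can consume it directly without any parsing glue.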
Some of the features are:
- Unlimited, fast web crawling for endpoint discovery
- Fuzzy matching for domain discovery
- robots.txt parsing
- sitemap.xml parsing
- Plain output for easy parsing into other tools
- Accept domains from stdin for easier tool chaining
- SQLMap-friendly output format
- Link gathering from JavaScript files
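As an illustration of the robots.txt parsing feature: every Allow and Disallow rule names a path the site acknowledges, so each one is a useful crawl seed. The `robotsPaths` helper below is a simplified sketch for illustration, not hakrawler's real parser.

```go
// Minimal sketch of seeding a crawl from robots.txt: extract the path
// component of every Allow/Disallow rule.
package main

import (
	"fmt"
	"strings"
)

// robotsPaths returns the paths named by Allow and Disallow rules,
// in the order they appear. (Hypothetical helper for illustration.)
func robotsPaths(body string) []string {
	var paths []string
	for _, line := range strings.Split(body, "\n") {
		line = strings.TrimSpace(line)
		lower := strings.ToLower(line)
		for _, prefix := range []string{"allow:", "disallow:"} {
			if strings.HasPrefix(lower, prefix) {
				if p := strings.TrimSpace(line[len(prefix):]); p != "" {
					paths = append(paths, p)
				}
				break
			}
		}
	}
	return paths
}

func main() {
	robots := "User-agent: *\nDisallow: /admin\nAllow: /public"
	for _, p := range robotsPaths(robots) {
		fmt.Println(p) // each extracted path can seed the crawl queue
	}
}
```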
The tool supports a wide range of options to make scans run more smoothly, including:
- -all Include everything in output; this is the default, so the flag is superfluous (default true)
- -auth string The value of this will be included as an Authorization header
- -cookie string The value of this will be included as a Cookie header
- -depth int Maximum depth to crawl
- -domain string The domain that you wish to crawl
- -forms Include form actions in output
- -js Include links to utilised JavaScript files
- -linkfinder Run linkfinder on JavaScript files
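To make the -auth and -cookie options concrete, the sketch below shows the kind of request a crawler might build from them: the supplied values simply become the Authorization and Cookie headers on each crawl request. `newCrawlRequest` is a hypothetical helper, not hakrawler's implementation.

```go
// Sketch of how -auth and -cookie flag values map onto HTTP headers.
package main

import (
	"fmt"
	"net/http"
)

// newCrawlRequest builds a GET request carrying optional auth and cookie
// values as headers. (Hypothetical helper for illustration.)
func newCrawlRequest(url, auth, cookie string) (*http.Request, error) {
	req, err := http.NewRequest("GET", url, nil)
	if err != nil {
		return nil, err
	}
	if auth != "" {
		req.Header.Set("Authorization", auth)
	}
	if cookie != "" {
		req.Header.Set("Cookie", cookie)
	}
	return req, nil
}

func main() {
	req, err := newCrawlRequest("https://example.com", "Bearer token123", "session=abc")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Header.Get("Authorization"))
	fmt.Println(req.Header.Get("Cookie"))
}
```

Passing an authenticated session cookie this way lets the crawler reach pages that are invisible to an unauthenticated scan.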
You can read more and download this tool over here: https://github.com/hakluke/hakrawler