Hey folks, today in this tutorial we are going to discuss a web application security testing tool called “hakrawler”. Hakrawler is a web crawler written in Go, designed for easy, quick discovery of endpoints and assets within a web application. It can be used to discover:
- Forms
- Endpoints
- Subdomains
- Related domains
- JavaScript files
Let’s take a look 😛 !!
Install Golang
The tool is written in Go, so we first need to install the Go toolchain with the following command before we can use it.
apt install golang
Installation of Hakrawler
Now it's time to install the tool itself using the go utility. We only need to run the following command.
go get github.com/hakluke/hakrawler
The binary is automatically placed in the Go bin directory, which (assuming it is on your PATH in Kali Linux) means we can run it from anywhere.
hakrawler -h
Robots.txt Parsing
Basically, robots.txt is a standard used by websites to tell web crawlers and other web robots which paths they may or may not visit, and hakrawler can parse it to discover endpoints.
hakrawler -url <website> -robots
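To illustrate what parsing robots.txt involves, here is a minimal Python sketch using the standard library's urllib.robotparser. The sample rules below are made up for illustration; this is not hakrawler's actual implementation, but the Disallow entries are exactly the kind of hidden endpoints a tester looks for.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, for illustration only
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Disallowed paths often reveal interesting endpoints to a tester
print(rp.can_fetch("*", "/admin/login"))  # False
print(rp.can_fetch("*", "/index.html"))   # True
```

A crawler like hakrawler fetches the live file from `<website>/robots.txt` and reports each listed path as a discovered endpoint.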
Subdomains
Subdomain enumeration is a common feature in many tools, and hakrawler offers it as well.
hakrawler -url fintaxico.in -subs
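Conceptually, subdomain discovery while crawling boils down to scanning responses for hostnames that end in the target domain. A rough Python sketch of that idea follows; the sample HTML and hostnames are invented for illustration, and hakrawler's real logic differs.

```python
import re

ROOT = "fintaxico.in"  # target domain from the example above

# Hypothetical page source containing a few hostnames
html = """
<a href="https://blog.fintaxico.in/post">blog</a>
<script src="//cdn.fintaxico.in/app.js"></script>
<a href="https://example.com/">external</a>
"""

# Match hostnames like sub.fintaxico.in (nested subdomains also match)
pattern = re.compile(r"[a-z0-9.-]+\." + re.escape(ROOT))
subdomains = sorted(set(pattern.findall(html)))
print(subdomains)  # ['blog.fintaxico.in', 'cdn.fintaxico.in']
```

In practice a crawler applies this kind of matching to every page it fetches, accumulating unique subdomains as it goes.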
Depth Scan
If you want to crawl a website more thoroughly, you can set the crawl depth with the following command and increase it as needed.
hakrawler -url secnhack.in -depth 10
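Depth here means how many link-hops away from the start page the crawler will follow. A toy breadth-first sketch in Python shows the idea, using a hard-coded link graph as a stand-in for real HTTP requests and link extraction (the paths are invented for illustration).

```python
from collections import deque

# Stand-in for fetching a page and extracting its links
LINKS = {
    "/":            ["/about", "/blog"],
    "/about":       ["/team"],
    "/blog":        ["/blog/post-1"],
    "/team":        [],
    "/blog/post-1": ["/deep"],
    "/deep":        [],
}

def crawl(start, max_depth):
    """Breadth-first crawl that stops max_depth hops from start."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        url, depth = queue.popleft()
        if depth == max_depth:
            continue  # don't follow links beyond the depth limit
        for link in LINKS.get(url, []):
            if link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return seen

# Depth 2 reaches /team and /blog/post-1 but not /deep
print(sorted(crawl("/", 2)))
```

Raising `-depth` trades longer run time for broader coverage, which is why a large value like 10 is useful when you want the site crawled as completely as possible.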
Likewise, this tool has many other features that make crawling any web application a smooth experience.