Web App Enumeration

Enumerate using Google

Using Google or other search engines, we may be able to gather valuable information.
We can search for:

  • Config files
  • SQL files
  • Usernames, private keys, and even passwords
  • Error messages
  • Any other technical information

Mostly I use the following queries:

#Find pages
site:site.com

#Find subdomains
site:site.com -www

#Find files (php/jsp/aspx/asp/cfm/sql)
site:site.com filetype:php

#Find pages whose title matches keywords
site:site.com intitle:"admin login"

#If the title matches our keyword
site:site.com intitle:"index of backup.php"

#Find files containing passwords
intitle:"Index of ftp passwords"

#Find pages whose URL contains our keywords
site:site.com inurl:?id=

#Pages containing login
site:site.com inurl:admin/reset.php -github
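When testing many targets, the dork patterns above can be composed programmatically. A minimal sketch; `build_dork` is my own helper name, not any tool's API:

```python
# Compose Google dork strings from the patterns above.
# build_dork is a hypothetical helper for illustration only.
def build_dork(domain, filetype=None, intitle=None, inurl=None, exclude=None):
    parts = [f"site:{domain}"]
    if filetype:
        parts.append(f"filetype:{filetype}")
    if intitle:
        parts.append(f'intitle:"{intitle}"')
    if inurl:
        parts.append(f"inurl:{inurl}")
    if exclude:
        parts.append(f"-{exclude}")
    return " ".join(parts)

print(build_dork("site.com", filetype="php"))
print(build_dork("site.com", intitle="admin login"))
print(build_dork("site.com", exclude="www"))
```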

For more Google dorks, see the Google Hacking Database!

Gather Info from Social Sites

Basically, I would search for employee details, technical posts, and other information.

What can we do with that information?

  • Getting an idea about the company
  • Social Engineering
  • Generating username/password lists

If we find an employee’s name, we can search for that name on Google or in people directories to find out more about them.

Example query:

  • LinkedIn – site:linkedin.com intitle:"Employee Name"
  • Twitter – site:twitter.com intitle:"Employee Name"
  • Facebook – site:facebook.com intitle:"Employee Name"
  • Google – "Employee Name Company_name"
  • Get the Employee List of the company from LinkedIn
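Employee names feed directly into username/password list generation. A small sketch; the permutation patterns are a common illustrative subset, not an exhaustive list:

```python
# Generate common username permutations from an employee's full name,
# for use in password-attack wordlists (pattern set is illustrative).
def username_candidates(full_name):
    parts = full_name.lower().split()
    first, last = parts[0], parts[-1]
    return [
        first,
        last,
        f"{first}.{last}",
        f"{first}{last}",
        f"{first[0]}{last}",   # e.g. jdoe
        f"{first}_{last}",
        f"{first}{last[0]}",   # e.g. johnd
    ]

for u in username_candidates("John Doe"):
    print(u)
```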

Banner Grabbing

Banner grabbing is useful for identifying server software and versions that may have known vulnerabilities.


whatweb domain.com


nmap -v -p80,443 -sV domain.com


nc -vvv domain.com 80

Send a malformed request (for example, type an invalid request line such as GET / HTTP/3.0 after connecting) to provoke a verbose error banner:

nc -vvv domain.com 80

Explore the Target Site

DNS Enumeration

Retrieve Common information:

#Check if robots.txt exists
curl -O -Ss http://www.domain.com/robots.txt

#Get IP address
nslookup domain.com

#Get IP, NS, MX etc
nslookup -querytype=ANY domain.com

#Same thing as nslookup using $ host
host domain.com
host -t ns domain.com
host -t mx domain.com

#Zone Transfer
host -l www.domain.com ns1.domain.com

Reverse Lookup with Bash

for iplist in $(seq 190 255); do host x.x.x.$iplist; done | grep -v "not found"
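The same sweep can be written in Python. A sketch of the equivalent logic; replace the placeholder prefix with the target's real network:

```python
import socket

# Python version of the bash reverse-lookup loop: PTR-query every
# address in a range and keep only the ones that resolve.
def ip_range(prefix, start, end):
    return [f"{prefix}.{i}" for i in range(start, end + 1)]

def reverse_sweep(prefix, start, end):
    found = {}
    for ip in ip_range(prefix, start, end):
        try:
            found[ip] = socket.gethostbyaddr(ip)[0]  # PTR lookup
        except (socket.herror, socket.gaierror):
            pass  # skip unresolvable hosts, like grep -v "not found"
    return found

# Example (x.x.x stands for the target prefix, as above):
# print(reverse_sweep("x.x.x", 190, 255))
```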

DNS Enumeration Tools

Note: Any newly found virtual host is important. Even if the main domain is not vulnerable, another virtual host on the same server may be, which could allow us to pivot.

#Zone Transfer and Brute force subdomain
dnsenum rednode.com

#Zone Transfer and Brute force subdomain
dnsrecon -a -d rednode.com

#Test for zone transfer and brute force dns
fierce --domain rednode.com

#Search for hosts via Google, verify virtual hosts, and brute force DNS
theHarvester -d rednode.com -v -c -b google


Enumerate Applications

Scan port

nmap -v -Pn -p- -sV domain.com

Manually connect to every port for banner grabbing

nc -vvv target.com 80

If any non-standard HTTP port is open, explore it in the browser.

See how the URL is structured. For example:

#If we have this url

#Then Try
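One common pattern is walking up the path of a discovered URL to probe parent directories for listings or hidden content. A sketch with a hypothetical URL; `parent_paths` is my own helper name:

```python
from urllib.parse import urlsplit

# Given a discovered URL, list its parent directories to probe
# for listings or hidden content (example URL below is hypothetical).
def parent_paths(url):
    parts = urlsplit(url)
    base = f"{parts.scheme}://{parts.netloc}"
    segs = [s for s in parts.path.split("/") if s]
    paths = []
    for i in range(len(segs) - 1, -1, -1):
        if i == 0:
            paths.append(base + "/")
        else:
            paths.append(f"{base}/{'/'.join(segs[:i])}/")
    return paths

for p in parent_paths("http://site.com/app/admin/login.php"):
    print(p)
```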

Check digital certificates manually for information such as email addresses, or use sslyze:

sslyze rednode.com

Check other data on the site:

  • HTTP Headers – We may get some valuable information like the framework version
  • Review HTML Source Code – Check for comments and source code structure, which may reveal what is being used or even other sensitive info
  • Cookies – The cookie structure may tell us what is in use. For example, PHPSESSID clearly indicates PHP is there!
  • Known files and directories – How about trying some known files or directories? /wp-admin tells us it is WordPress
  • Error messages – These may reveal internal paths, usernames, or other sensitive info. Try requesting something like /config.php or config.php?id[]=449
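The cookie-based fingerprinting above amounts to a simple lookup table. A sketch with a small illustrative subset of well-known session cookie names:

```python
# Map well-known session cookie names to the platform they indicate
# (this table is a small illustrative subset, not exhaustive).
COOKIE_FINGERPRINTS = {
    "PHPSESSID": "PHP",
    "JSESSIONID": "Java (Servlet/JSP)",
    "ASP.NET_SessionId": "ASP.NET",
    "CFID": "ColdFusion",
    "laravel_session": "Laravel (PHP)",
    "wordpress_logged_in": "WordPress",
}

def fingerprint_cookies(cookie_names):
    return {name: COOKIE_FINGERPRINTS[name]
            for name in cookie_names if name in COOKIE_FINGERPRINTS}

print(fingerprint_cookies(["PHPSESSID", "tracking_id"]))
```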

Enumerate Files and Username

Crawling and file fuzzing are among the most important parts of web enumeration. What should we search for?

  • Find all GET/POST parameters
  • Brute-force directories and files


Nikto is a popular web server scanner. It searches for dangerous files and some common vulnerabilities:

nikto -h rednode.com

Burp Suite


Crawl Using Burp Suite Pro

  1. Intercept the target
  2. Right-click on the target address.
  3. Engagement Tools>Discover Content
  4. Click on “Session is not running” to start the discovery session

Now what?

  • Check all interesting links after crawling and find URL parameters
  • Manually visit the site, and submit forms to capture the parameters

Directory Brute Forcing

First, send a request for the target root directory / to Intruder and clear all attack points. Then create new attack points as below:

GET /§§ HTTP/1.1

Files Brute Forcing

GET /§name§.§extension§ HTTP/1.1

Select the attack type Cluster bomb.

Go to the Payloads tab, set the payload set to 1, and load the common directory wordlist by clicking the Load button in the Payload Options section.


Next, set the payload set to 2 and provide the file extensions:


Click on Start Attack
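A Cluster bomb attack tries every combination of the two payload sets. A quick sketch of the request lines it will generate for the GET /§name§.§extension§ template (`cluster_bomb` is my own helper name):

```python
from itertools import product

# Reproduce Intruder's "cluster bomb" behaviour for the
# GET /§name§.§extension§ template: every name x extension combination.
def cluster_bomb(names, extensions):
    return [f"GET /{n}.{e} HTTP/1.1" for n, e in product(names, extensions)]

for line in cluster_bomb(["config", "backup"], ["php", "txt"]):
    print(line)
```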



Another free tool I use is gobuster to find hidden files and folders:

gobuster dir -u https://host/ -t 15 -w /usr/share/dirb/wordlists/common.txt -x .php,.txt,.conf -k

If we get an error like:

Error: the server returns a status code that matches the provided options for non existing urls. => 200 (Length: 1960). To continue please exclude the status code, the length or use the --wildcard switch

Try with --exclude-length:

gobuster dir -u https://host/ -t 15 -w /usr/share/dirb/wordlists/common.txt -x .php,.txt,.conf -k --exclude-length 1960
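The --exclude-length idea (drop every response whose length matches the wildcard baseline) is easy to apply when post-processing results from any tool. A sketch; the helper and the sample data are my own, not gobuster's API:

```python
# Filter brute-force hits by response length: request a random,
# almost-certainly-absent path first to learn the wildcard baseline,
# then discard any hit with that same length.
# filter_by_length and the sample tuples are illustrative only.
def filter_by_length(results, baseline_length):
    """results: list of (path, status, length) tuples."""
    return [r for r in results if r[2] != baseline_length]

hits = [("/admin", 200, 3521), ("/nope", 200, 1960), ("/login", 200, 4100)]
print(filter_by_length(hits, 1960))
```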

Now what?

  1. Use this info to find authentication, misconfiguration, business logic, or injection vulnerabilities.
  2. Make an effective password attack plan.
  3. Plan a good social engineering attack.

Without information gathering and enumeration, an effective plan is never possible!

Reference: https://owasp.org/www-project-web-security-testing-guide/v42/4-Web_Application_Security_Testing/01-Information_Gathering/README.html