Reconnaissance
1.1 Subdomain Enumeration
Passive Subdomain Enumeration
Tools:
Intrusionz3r0@htb[/htb]$ dnsenum --enum hackerone.com -f /usr/share/seclists/Discovery/DNS/subdomains-top1million-110000.txt -r
Intrusionz3r0@htb[/htb]$ subfinder -d target.com -all -recursive -t 200 -silent -o subfinder-recursive.txt
Intrusionz3r0@htb[/htb]$ findomain --quiet -t target.com | tee findomain.txt
Intrusionz3r0@htb[/htb]$ amass enum -passive -d target.com -o amass.txt
Intrusionz3r0@htb[/htb]$ assetfinder -subs-only target.com | tee assetfinder.txt
Intrusionz3r0@htb[/htb]$ sublist3r -d target.com -t 50 -o sublist3r.txt
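To work from one deduplicated list in the next steps, the individual outputs can be merged. A minimal sketch, assuming the output filenames from the commands above and that anew is installed:
#Merge and deduplicate all passive results into a single list
Intrusionz3r0@htb[/htb]$ cat subfinder-recursive.txt findomain.txt amass.txt assetfinder.txt sublist3r.txt | sort -u | anew all_subdomains.txt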
Active Subdomain Enumeration
Tools:
Wordlist:
Intrusionz3r0@htb[/htb]$ python3 subbrute.py target.com -w wordlist.txt -o brute_subs.txt
Intrusionz3r0@htb[/htb]$ subbrute.py target.com /usr/share/wordlists/2m-subdomains.txt | massdns -r /usr/share/wordlists/resolvers.txt -t A -o S -w target.com.txt
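massdns writes results in its simple text format (-o S), so the resolved hostnames can be pulled back out with standard text tools. A rough sketch, assuming the target.com.txt output from the command above:
#Extract resolved hostnames from the massdns output and strip trailing dots
Intrusionz3r0@htb[/htb]$ awk '{print $1}' target.com.txt | sed 's/\.$//' | sort -u | anew all_subdomains.txt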
Subdomain brute-forcing
Tools:
Intrusionz3r0@htb[/htb]$ ffuf -u https://target.com -H "Host: FUZZ.target.com" -w /usr/share/wordlists/subdomains.txt -t 100 -fc 403 | tee ffuf_subs_output.txt
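Many targets return the same default page for every unknown Host header, so filtering only on status (-fc 403) may still leave noise. Filtering by the size of that default response (-fs) is a common complement; a sketch, where 4242 is a placeholder size you would first measure with a random hostname:
#Filter the wildcard/default response by its byte size (4242 is a placeholder)
Intrusionz3r0@htb[/htb]$ ffuf -u https://target.com -H "Host: FUZZ.target.com" -w /usr/share/wordlists/subdomains.txt -t 100 -fs 4242 | tee ffuf_vhosts.txt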
1.2 DNS Resolution and Probing
Check which domains are live by resolving their DNS records.
Tools:
Intrusionz3r0@htb[/htb]$ shuffledns -d target.com -list all_subdomains.txt -r resolvers.txt -o live_subs.txt
Intrusionz3r0@htb[/htb]$ dnsx -l all_subdomains.txt -r /usr/share/wordlists/resolvers.txt -o live_subs.txt
Reverse lookup:
#Collect the associated IP addresses for later port scanning and fingerprinting.
Intrusionz3r0@htb[/htb]$ dnsx -l live_subs.txt -a -resp-only -o live_with_ips.txt
#Reverse DNS Lookups
Intrusionz3r0@htb[/htb]$ dnsx -ptr -l live_with_ips.txt -r /usr/share/wordlists/resolvers.txt | massdns -r /usr/share/wordlists/resolvers.txt -q -o S -t PTR > reverse_dns_massdns.txt
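The PTR answers can then be reduced to plain hostnames and fed back into the subdomain list. A rough sketch, assuming massdns's -o S layout of "name TYPE answer" in the file above:
#Keep only the resolved hostnames from the PTR records
Intrusionz3r0@htb[/htb]$ awk '$2 == "PTR" {print $3}' reverse_dns_massdns.txt | sed 's/\.$//' | sort -u | anew all_subdomains.txt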
1.3 HTTP Probing (Identifying Live Web Services)
Identify which subdomains are serving websites.
Intrusionz3r0@htb[/htb]$ httpx -l live_subs.txt -title -sc -location -p 80,443,8000,8080,8443 -td -cl -probe -o httpx_output.txt
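Later steps (gowitness, katana) expect a plain list of live URLs in probed_domains.txt; one way to derive it, assuming the URL is the first whitespace-separated field of each httpx output line:
#Strip httpx's annotations, keeping only the probed URLs
Intrusionz3r0@htb[/htb]$ awk '{print $1}' httpx_output.txt | anew probed_domains.txt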
1.4 Screenshotting Web Services
Take screenshots of each live web server to quickly identify login portals or other points of interest.
Tools:
#Fast discovery of common web application ports
Intrusionz3r0@htb[/htb]$ sudo nmap -p 80,443,8000,8080,8180,8888,10000 --open -oA web_discovery -iL scope_list
#Fast enumeration with EyeWitness
Intrusionz3r0@htb[/htb]$ eyewitness --web -x web_discovery.xml -d inlanefreight_eyewitness
#Fast enumeration with Aquatone
Intrusionz3r0@htb[/htb]$ cat web_discovery.xml | ./aquatone -nmap
#Fast enumeration with gowitness
Intrusionz3r0@htb[/htb]$ gowitness scan file -f probed_domains.txt --threads 10 --screenshot-path screenshots/ --write-db
1.5 Content Discovery (Brute Forcing)
Identify hidden directories and files.
Intrusionz3r0@htb[/htb]$ feroxbuster -u https://10.10.10.60/ -x php,html,txt -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt -k -t 100
Intrusionz3r0@htb[/htb]$ ffuf -w /usr/share/wordlists/custom.txt -t 75 -ac -mc 200,405,401,415,302,301 -u http://assets.engage.tesla.com/FUZZ
Intrusionz3r0@htb[/htb]$ dirsearch -w /usr/share/wordlists/custom.txt --full-url --random-agent -x 404,400 -e php,html,js,json,ini -u https://target.com/
Intrusionz3r0@htb[/htb]$ dirsearch -e php,asp,aspx,jsp,py,txt,conf,config,bak,backup,swp,old,db,sql,asp,aspx,asp~,py~,rb,rb~,php~,bak,bkp,cache,cgi,conf,csv,html,inc,jar,js,json,jsp~,lock,log,rar,old,sql.gz,sql.zip,sql.tar.gz,sql~,swp~,tar,tar.bz2,tar.gz,txt,wadl,zip -i 200 --full-url --deep-recursive -w /usr/share/wordlists/custom.txt --exclude-subdirs .well-known/,wp-includes/,wp-json/,faq/,Company/,Blog/,Careers/,Contact/,About/,IMAGE/,Images/,Logos/,Videos/,feed/,resources/,banner/,assets/,css/,fonts/,img/,images/,js/,media/,static/,templates/,uploads/,vendor/ --exclude-sizes 0B --skip-on-status 429 --random-agent -u http://target.com/
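The /usr/share/wordlists/custom.txt used above can be seeded from the target itself. A minimal sketch with CeWL, assuming it is installed (spider depth 2, minimum word length 5):
#Build a target-specific wordlist by spidering the site
Intrusionz3r0@htb[/htb]$ cewl -d 2 -m 5 -w /usr/share/wordlists/custom.txt https://target.com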
1.6 Parameter Discovery
Tools:
Intrusionz3r0@htb[/htb]$ arjun -u "https://target.com" -m get --stable
Intrusionz3r0@htb[/htb]$ ffuf -u "https://target.com/page.php?FUZZ=test" -w param_wordlist.txt
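To confirm that a discovered parameter is actually read by the application, send a unique canary value and check whether it comes back in the response. A sketch with curl (param and canary1337 are placeholders):
#Send a unique marker and count how often it is reflected
Intrusionz3r0@htb[/htb]$ curl -s "https://target.com/page.php?param=canary1337" | grep -c "canary1337"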
1.7 Archived URLs
Retrieve older versions of the site from the Wayback Machine; they can expose endpoints or parameters that are no longer available on the live site.
Tools:
Intrusionz3r0@htb[/htb]$ gau target.com | anew gau_urls.txt
Intrusionz3r0@htb[/htb]$ waybackurls target.com | anew wayback_urls.txt
Intrusionz3r0@htb[/htb]$ katana -passive -pss waybackarchive,commoncrawl,alienvault -f qurl -u target.com | anew katana_urls.txt
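The three sources overlap heavily and many archived URLs are dead; merging them and re-probing keeps only what still responds. A sketch, assuming the output files above:
#Merge archive sources, deduplicate, and keep only URLs that still answer
Intrusionz3r0@htb[/htb]$ cat gau_urls.txt wayback_urls.txt katana_urls.txt | sort -u | httpx -silent -mc 200,301,302 -o archived_live.txt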
1.8 Filtering Interesting URLs
Tools:
Intrusionz3r0@htb[/htb]$ cat gau_urls.txt | gf xss | anew xss_candidates.txt
Intrusionz3r0@htb[/htb]$ cat gau_urls.txt | gf sqli | anew sqli_candidates.txt
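Archived URL sets are full of near-duplicates (same path, different parameter values). If uro is installed, it collapses those patterns before the gf filters run:
#Collapse duplicate URL patterns first, then apply the gf filters above
Intrusionz3r0@htb[/htb]$ cat gau_urls.txt | uro | anew unique_urls.txt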
1.9 Crawling and Spidering
Crawl the target to discover deeper endpoints, hidden forms or parameters.
💡 Crawling is useful to expand the attack surface by identifying all reachable URLs, forms, or parameters for fuzzing.
Intrusionz3r0@htb[/htb]$ katana -list probed_domains.txt -silent -o katana_crawl.txt
Intrusionz3r0@htb[/htb]$ gospider -s https://target.com -d 1 -o gospider_crawl.txt
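Crawl output can be narrowed to the URLs that actually take input, which are the ones worth fuzzing; a simple sketch over the katana output above:
#Keep only crawled URLs that carry query parameters
Intrusionz3r0@htb[/htb]$ grep "=" katana_crawl.txt | anew params_from_crawl.txt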
1.10 ASN and IP Range Enumeration
Identify the target's IP ranges and subnets.
Intrusionz3r0@htb[/htb]$ amass intel -asn <ASN_Number> -o asn_targets.txt
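The routes an ASN originates can also be pulled straight from the RADb whois server; a sketch, with AS12345 as a placeholder ASN:
#Query RADb for every prefix originated by the ASN (AS12345 is a placeholder)
Intrusionz3r0@htb[/htb]$ whois -h whois.radb.net -- '-i origin AS12345' | grep -Eo '([0-9]{1,3}\.){3}[0-9]{1,3}/[0-9]+' | sort -u > ip_ranges.txt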
1.11 Cloud Asset Enumeration
Tools:
Google Dorks:
site:amazonaws.com inurl:".s3.amazonaws.com/"
site:s3.amazonaws.com intitle:index.of.bucket
Intrusionz3r0@htb[/htb]$ cloud_enum -k tesla.com
Intrusionz3r0@htb[/htb]$ subfinder -d disney.com -all -silent | httpx -silent -webserver -threads 100 | grep -i AmazonS3
Intrusionz3r0@htb[/htb]$ subfinder -d disney.com -all -silent | httpx -silent -webserver -threads 100 -match-string "AccessDenied"
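Any bucket surfaced this way can be checked for anonymous access with the AWS CLI; a sketch, assuming the CLI is installed and target-bucket is a placeholder name:
#Attempt an unauthenticated listing of a discovered bucket
Intrusionz3r0@htb[/htb]$ aws s3 ls s3://target-bucket --no-sign-request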
1.12 Fingerprinting Web Technologies
Tools:
Intrusionz3r0@htb[/htb]$ whatweb target.com
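Response headers alone often give away the stack; a quick complement to whatweb using curl:
#Grab the headers and pick out common technology giveaways
Intrusionz3r0@htb[/htb]$ curl -sI https://target.com | grep -iE "server|x-powered-by|set-cookie"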
Mapping the Attack Surface
JS File Analysis
Tools:
Intrusionz3r0@htb[/htb]$ python3 linkfinder.py -i https://target.com/app.js -o cli
Intrusionz3r0@htb[/htb]$ echo https://target.com | subjs | anew js_endpoints.txt
Intrusionz3r0@htb[/htb]$ katana -list probed_domains.txt -jc | grep "\.js"
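Beyond endpoints, the downloaded JS files are worth grepping for hard-coded secrets. A rough sketch, assuming js_endpoints.txt from above; the regex is only a starting heuristic:
#Fetch each discovered JS file and flag likely secret material
Intrusionz3r0@htb[/htb]$ while read -r url; do echo "[*] $url"; curl -s "$url" | grep -oniE "api[_-]?key|secret|token|password"; done < js_endpoints.txt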
Resources: