My approach to recon on web applications, both vertical and horizontal reconnaissance, is to combine a handful of tools into one bash script. Instead of writing my own custom recon tooling and re-inventing the wheel, I just reuse existing tools.
Executing the script
Here are a few ways I run the script:
# This is the host file I am using.
~$ cat targets.txt
yahoo.com
paypal.com
# Basic run script with targets.txt
~$ rickjms-recon.sh -f targets.txt -o output-folder
# This will execute the script and write all output into: automation-output.07.23.21
~$ rickjms-recon.sh -f targets.txt -o automation-output.07.23.21
# This will execute the script as a dry run, showing all the commands that would be executed.
~$ rickjms-recon.sh -f targets.txt -o automation-output.07.23.21 -n
-o Output folder that everything will be saved into
-f File containing the list of domains to search on
Directory Structure
This script will build a directory structure to hold and separate all the information.
./post-scanning
./post-scanning/website-crawling
./post-scanning/waybackurls
./post-scanning/haktrails
./post-scanning/alive-urls
./post-scanning/dnmasscan
./maybe-out-scope
./scans
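For reference, the layout above comes down to a few mkdir -p calls. A minimal sketch (OUTPUT_DIR is a placeholder name, not necessarily the variable the script uses):
#!/usr/bin/env bash
# Recreate the output layout under the folder passed via -o
OUTPUT_DIR="${1:-.}"
mkdir -p "$OUTPUT_DIR"/post-scanning/{website-crawling,waybackurls,haktrails,alive-urls,dnmasscan}
mkdir -p "$OUTPUT_DIR"/maybe-out-scope "$OUTPUT_DIR"/scans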
scans/
This folder contains the output of the recon tools (assetfinder, crtsh, etc.). All of the information in this directory is then fed into the content discovery tools (which I still need to improve on).
output-folder$ ls scans/
assetfinder.host.out crtsh.host.out hosts-amass.host.out ips-amass.out subfinder.host.out sublister.host.out tls_bufferover.out webaddress_scan_data.host.out
post-scanning/
All the content discovery data is stored here. I know there are things I can add, and this portion of the script can be improved; I'll list the ideas I plan on adding below.
ls post-scanning/
alive-urls/ dnmasscan/ fff-output/ haktrails/ nmap.out.gnmap nmap.out.nmap nmap.out.xml waybackurls/ website-crawling/
Additional ideas
Add in some recon tricks I learned from the PentesterLab recon badge.
I highly recommend this learning service: not only does he teach you how to hack, he builds his own exploits for web application vulnerabilities, and keeps them abstract so you can reuse them when you are looking for bugs.
I will not disclose any techniques but they will eventually be added.
Add some JavaScript endpoint searching (GoLinkFinder, maybe?)
aquatone; however, I would like this script to run on a VPS, which makes it more annoying to pull all the screenshots down. Not to mention my goal is lightweight script output, and screenshots eat up space quickly.
Look for endpoints / parameters in the Wayback Machine (zseano)
Add some waybackurls robots.txt checks (zseano)
Recon tools and techniques used
assetfinder (tomnomnom)
Custom script (see the sketch after this list) that uses crt.sh (https://crt.sh/?q=domain-name-here). This pulls all the domains/subdomains associated with the domain. (@NahamSec)
Custom script (see the sketch after this list) that uses tls.bufferover.run/dns; this pulls IPs with their associated domains. (I forget who I got this from, I apologize; I think it was @Jhaddix or @NahamSec)
subfinder (projectdiscovery)
sublist3r (aboul3la)
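Here is a rough sketch of those two custom lookups, not the script's exact code. The crt.sh JSON endpoint and field names are public; tls.bufferover.run's response format has changed over time (the service now requires an API key), so treat that part as an assumption:
#!/usr/bin/env bash
domain="$1"
# crt.sh: pull certificate-transparency hostnames as JSON
curl -s "https://crt.sh/?q=%25.${domain}&output=json" \
  | jq -r '.[].name_value' | sed 's/^\*\.//' | sort -u > crtsh.host.out
# tls.bufferover.run: pull "IP,hostname" pairs (field name assumed)
curl -s "https://tls.bufferover.run/dns?q=.${domain}" \
  | jq -r '.Results[]?' | sort -u > tls_bufferover.out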
Content discovery phase (not complete, I want MOAR)
In this phase I run content discovery tools against the domains/subdomains gathered above. I make use of tomnomnom's `inscope` tool to ensure I do NOT go out of scope.
I first consolidate all the scan data into one file using two tools from tomnomnom (see the sketch after this list):
anew (to prevent duplicates from getting added)
inscope (to keep domains / subdomains in scope)
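A sketch of that consolidation step (file names taken from the scans/ listing above; inscope expects a .scope file of regexes, one per line, with ! prefixing exclusions):
# Example .scope file:
#   .*\.yahoo\.com$
#   !.*\.ads\.yahoo\.com$
# Consolidate all scan output into one deduplicated, in-scope list
cat scans/*.host.out | inscope | anew scans/webaddress_scan_data.host.out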
waybackurls (tomnomnom)
httprobe (tomnomnom) Finds alive hosts (used by other tools)
hakrawler (hakluke)
Fast golang web crawler for gathering URLs and JavaScript file locations. This is basically a simple implementation of the awesome Gocolly library.
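Roughly, these three tools chain together like this (a sketch, with paths matching the directory layout above; hakrawler's interface has changed between versions, recent ones read URLs from stdin):
# Probe the consolidated hosts for live web servers
cat scans/webaddress_scan_data.host.out | httprobe | anew post-scanning/alive-urls/alive.out
# Pull archived URLs for the same hosts
cat scans/webaddress_scan_data.host.out | waybackurls | anew post-scanning/waybackurls/waybackurls.out
# Crawl the live hosts for URLs and JavaScript files
cat post-scanning/alive-urls/alive.out | hakrawler > post-scanning/website-crawling/hakrawler.out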
haktrails (hakluke) Disabled: this can quickly burn through your API quota
This uses the SecurityTrails API. Would I recommend it? Yes, if and only if you can afford it; this service can be expensive, especially if you try to hit verizonmedia. I burnt my entire API quota in one amass search.
subjack (haccer) Not really needed, just here for an easy win
Subjack is a Subdomain Takeover tool written in Go designed to scan a list of subdomains concurrently and identify ones that are able to be hijacked.
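A typical invocation (flags from subjack's README; the output path here is just my convention):
subjack -w scans/webaddress_scan_data.host.out -t 100 -timeout 30 -ssl -o post-scanning/subjack-takeover.out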
nmap
nmap -iL ALIVE_HOSTS.txt -T4 -oA nmap.out
Debating putting in Naffy's recon trick
nmap -T4 -iL hosts.txt -Pn --script=http-title -p80,4443,4080,443 --open
fff (tomnomnom) The Fairly Fast Fetcher. Requests a bunch of URLs provided on stdin fairly quickly.
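For example, feeding the alive hosts into fff and saving the responses into the fff-output directory seen earlier (a sketch; the delay value is arbitrary):
cat post-scanning/alive-urls/alive.out | fff -d 100 -S -o post-scanning/fff-output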
I also output the command for amass; I opted out of running it since it's super slow, and you can easily run it in a tmux session later.
echo "amass enum -max-dns-queries 9 -o amass.out -src -ip -df $FINAL_TARGETS -brute -config ~/.config/amass/config.ini -active"
I also added something @Jhaddix suggested in one of his presentations: using dnmasscan.
echo "sudo dnmasscan $ALL_HOST_DATA $DNSCAN/dnmasscan.out -p$PORTS_TO_SCAN -oG $DNSCAN/masscan.out"
Note: $ALL_HOST_DATA = all the alive hosts (httprobe post-scan data); this makes sure we only scan hosts that are alive.
Other tools used within script
I plan on creating an install script that will auto-install all the tools used by my recon script; however, right now it's not complete. Below are the non-recon tools used within my script that need to be installed:
anew
inscope
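Both are Go tools, so something like the following should pull them in (module paths assumed; if the module install fails, cloning the repo and running go build in the tool's directory works too):
go install github.com/tomnomnom/anew@latest
go install github.com/tomnomnom/hacks/inscope@latest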
Script usage
The script was designed to automate my repetitive tasks, allowing me to spend less time running the commands I generally run and more time on other, more manual tasks.
rickjms@rickjms-20:~$ ./rickjms-recon.sh -h
Usage:
-h Display help menu
-f FILENAME Run recon with file of target domains
-n Execute a dry run listing all the commands executed and tools used
-o PATH/TO/OUTPUT Change the output directory default is current directory
-t USER_TARGET Run recon against single domain
-s Silent Mode do not post output
-d Enable Debugging mode
-l LIGHT SCAN Mode Only run the quick scans (assetfinder, crt.sh, tls.bufferover.run)
-w Skip the waybackurl lookup.
automation.07.11.21-2$ ls */
maybe-out-scope/:
post-scanning/:
alive-urls dnmasscan fff-output haktrails waybackurls website-crawling
scans/:
assetfinder.host.out subfinder.host.out tls_bufferover.out
crtsh.host.out sublister.host.out webaddress_scan_data.host.out
The script is far from perfect: it follows a happy path and does not take wildcard input. The argument for passing in a file (-f) is a list of domains (below, test.txt). This script is just something I put together over a weekend, and I add functionality whenever I feel something is missing.
test.txt
:~$ cat test.txt
example.com
yahoo.com
pentesterlabs.com
Running the -n flag will execute the script as a "dry run"; this will only display the sub-commands that would be executed if you were to run it with those parameters.
Example output (most is redacted, otherwise it would fill the entire page). Essentially, if you wanted to run the script manually, you could execute every command listed and it would produce the same output.
[...]
subfinder -dL automation/automation.07.14.21-3/25ea18dd-USERTARGETS.txt -t 100 -o automation/automation.07.14.21-3/scans/subfinder.host.out -nW
python3 ~/tools/recon/Sublist3r/sublist3r.py -d example.com
python3 ~/tools/recon/Sublist3r/sublist3r.py -d pentesterlabs.com
python3 ~/tools/recon/Sublist3r/sublist3r.py -d yahoo.com
[WARN] Please ensure you have .scope file to ensure youre within scope
[INFO] Running 'inscope' to ensure targets are inscope
[INFO] Running httprobe This might take awhile please be patient...
cat automation/automation.07.14.21-3/scans/webaddress_scan_data.host.out [...]
I need to clean up the script, and then it will be made public shortly on my GitHub page.
Installation (install-rickjms-tools.sh)
Obviously you need to install Python and Golang to get all the tools up and running. I did my best at creating the install script, but it has not been fully tested, especially around the git module installations; those might have to be a manual process. It is not hard, just follow the code.
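For reference, the manual fallback looks roughly like this (package and repo names assumed, shown for a Debian-based VPS; the real install script may differ):
# Base dependencies
sudo apt-get update && sudo apt-get install -y git python3 python3-pip golang jq nmap masscan
# Go-based recon tools
go install github.com/tomnomnom/assetfinder@latest
go install github.com/tomnomnom/httprobe@latest
go install github.com/tomnomnom/waybackurls@latest
go install github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
# Python tools cloned as git modules
git clone https://github.com/aboul3la/Sublist3r ~/tools/recon/Sublist3r
pip3 install -r ~/tools/recon/Sublist3r/requirements.txt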
Disclaimer
If you see any mistakes, leave a message and I will correct them. Thank you.