My Recon Process — DNS Enumeration

This is my first post in a series where I will go into detail on how I conduct my reconnaissance.

The idea spawned from a conversation I had with someone in the bug bounty community, and from my personal learning experience. There are a lot of articles out there, but I feel that most of them only touch parts of the subject. So I decided to write up, in detail, what I do, how I do it and why.

So without further ado, let’s get started.

The Workflow

The very first thing I always do is check whether the domain has a wildcard configuration. This is important information when deciding how to launch automated tools such as OWASP Amass.

$ dig @1.1.1.1 {test321123,testingforwildcard,plsdontgimmearesult}.<domain> +short | wc -l
NOTE: The values test321123, testingforwildcard and plsdontgimmearesult can be changed to anything. Just make sure it’s something that is very unlikely to exist as a valid subdomain. The braces are expanded by the shell, so dig receives three separate names to resolve (A is the default query type).

If the output of the command is greater than 0, it’s a good indicator that there is a wildcard configuration, since none of those three names should resolve at all. What this means for the next steps is that brute forcing will return a lot of false positives. The reason is that the wildcard matches any name that is not a registered subdomain and points all of them to the same place.

To learn more, check out the Wikipedia page for Wildcard DNS record.
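
To make the check repeatable across targets, it can be wrapped in a small shell function. This is just a convenience sketch of the same dig one-liner; the wildcard_check name is my own:

wildcard_check() {
  # Resolve a few labels that should never exist; any answer suggests a wildcard
  local count
  count=$(dig @1.1.1.1 {test321123,testingforwildcard,plsdontgimmearesult}."$1" +short | wc -l)
  [ "$count" -gt 0 ] && echo "possible wildcard on $1"
}

$ wildcard_check example.com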

There are a large number of tools developed to aid in the process of DNS enumeration and recon, but my favorites are OWASP Amass, MassDNS and masscan. So these will be the ones I focus on in this article.

OWASP Amass

This is always the first thing I do after the wildcard check. Amass can take a while in some cases, so it’s nice to have it running in the background while I do some manual work.

Earlier I wrote about the wildcard configuration, and this is one of the places where that information proves important. If I know that anything I enter as a subdomain will resolve, it doesn’t make sense to brute force. In that case, I remove the -brute argument when launching amass.

$ amass -src -ip -active -brute -d <domain>

If for some reason the value of remaining names keeps climbing into the millions and never stops, you should play around with the -active, -brute, -noalts and -norecursion arguments. Sometimes the only working option is to run:

$ amass -src -ip -d <domain>

I really wish I had a solid answer for why this happens on some domains, but so far I haven’t found any.
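
Putting the pieces together, a minimal sketch of how the wildcard check can drive the choice of amass flags (the domain variable is a stand-in for your target):

domain=example.com  # replace with your target
if [ "$(dig @1.1.1.1 {test321123,testingforwildcard,plsdontgimmearesult}.$domain +short | wc -l)" -gt 0 ]; then
  amass -src -ip -active -d "$domain"          # wildcard detected: skip brute forcing
else
  amass -src -ip -active -brute -d "$domain"   # no wildcard: brute forcing is safe
fi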

Parsing The Data

By default amass will create a folder amass_output where it will put any output, logs, etc. The default output filename is amass.txt and with either of the commands above it will look something like this:

[Entrust]         autodiscover.tesla.com 209.11.133.61
[Bing] mobile.tesla.com 209.133.79.82
[HackerTarget] comparison.tesla.com 64.125.183.133
[Crtsh] mfamobile-dev.tesla.com 205.234.27.209
[Forward DNS] tesla.com 209.133.79.61
....

The first part shows which source returned the subdomain, followed by the host and then a comma-separated list of IPs.

I extract the hosts from this file to create a file named hosts-amass.txt.

$ cat amass_output/amass.txt | cut -d']' -f 2 | awk '{print $1}' | sort -u > hosts-amass.txt

This sends the content of amass.txt to cut, which splits at the closing square bracket and outputs the second field, in this case the host and the IPs. cut is used here instead of a plain awk field because some source names, such as Forward DNS, contain a space. awk then prints the first field, the host; awk splits on whitespace by default. Finally sort with the -u argument filters out any duplicates. Now we’re left with an output like the one below:

autodiscover.tesla.com
xmail.tesla.com
mobile.tesla.com
...

To get the IPs I change the $1 to $2 in awk, and use tr before sort -u to replace commas with newlines, then write the output to a file named ips-amass.txt:

$ cat amass_output/amass.txt | cut -d']' -f2 | awk '{print $2}' | tr ',' '\n' | sort -u > ips-amass.txt
209.11.133.61
204.74.99.100
209.133.79.82
...

This output contains both IPv4 and IPv6 results. If I only want a list of IPv4 addresses, I add a final grep to the above command after sort -u:

grep -oE "\b([0-9]{1,3}\.){3}[0-9]{1,3}\b"
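
For completeness, the full pipeline then looks like this:

$ cat amass_output/amass.txt | cut -d']' -f2 | awk '{print $2}' | tr ',' '\n' | sort -u | grep -oE "\b([0-9]{1,3}\.){3}[0-9]{1,3}\b" > ips-amass.txt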

Manual work

While amass is doing its magic, I do some manual labor. This means building a few word lists that will all play a part in the final word list.

First I use two online services, crt.sh and Cert Spotter. These are services that monitor certificate transparency logs and provide us with a searchable database of hosts.

The following command will search crt.sh and return a list of hosts. jq is a command-line JSON processor, used here to extract the name_value field from the returned response. Then sed is used to remove double quotes and to strip the leading *. from wildcard entries. The output is then sorted and deduplicated before it’s written to a file named hosts-crtsh.txt.

$ curl -s "https://crt.sh/?q=%.<domain>&output=json" | jq '.[].name_value' | sed 's/\"//g' | sed 's/\*\.//g' | sort -u > hosts-crtsh.txt

Next, this will do the same as the above, but for Cert Spotter.

$ curl -s "https://certspotter.com/api/v0/certs?domain=<domain>" | jq '.[].dns_names[]' | sed 's/\"//g' | sed 's/\*\.//g' | sort -u > hosts-certspotter.txt

Finally I create a wordlist from a dictionary list. There are a lot of good lists out there, but I normally look at the DNS discovery lists in SecLists.

$ sed 's/$/.<domain>/' subdomains-top1mil-20000.txt > hosts-wordlist.txt

This command will append .<domain> to every line and write the output to hosts-wordlist.txt:

$ cat subdomains-top1mil-20000.txt
www
mail
ftp
...
$ sed 's/$/.example.com/' subdomains-top1mil-20000.txt > hosts-wordlist.txt
$ cat hosts-wordlist.txt
www.example.com
mail.example.com
ftp.example.com
...

The Final Wordlist

After amass has completed and the manual wordlists have been created, it’s time to create the final list, which will be the one I use for MassDNS.

I now have the following files:

  • hosts-amass.txt
  • hosts-crtsh.txt
  • hosts-certspotter.txt
  • hosts-wordlist.txt

These will be merged, sorted and deduplicated, and the result written to a file named hosts-all.txt.

$ cat hosts-amass.txt hosts-crtsh.txt hosts-certspotter.txt hosts-wordlist.txt | sort -u > hosts-all.txt

Staying in scope

Most of the time there are some domains and hosts that are considered out of scope, meaning that it’s against the rules to perform any type of testing on them. So to avoid bringing these with me further into the process, I remove them from the final list of hosts.

I start off by adding the out-of-scope domains to a file named hosts-ignore.txt. The entries are regular expressions, which is why the dots are escaped and each line is anchored with a trailing $.

foo\.example\.com$
bar\.example\.com$
...

Then I use grep to remove the out-of-scope hosts and create hosts-inscope.txt.

$ grep -vf hosts-ignore.txt hosts-all.txt > hosts-inscope.txt

A quick explanation of the -vf arguments

-v - Invert the match; show only the lines that do not match
-f - Read patterns from a file

In human words this means: read the patterns from hosts-ignore.txt and return every line from hosts-all.txt that does not match any of them.
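
A quick example with a dummy hosts-all.txt and the hosts-ignore.txt from above makes the behaviour clear:

$ cat hosts-all.txt
api.example.com
bar.example.com
foo.example.com
$ grep -vf hosts-ignore.txt hosts-all.txt
api.example.com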

MassDNS

Note: This step could be accomplished with masscan alone, but from my experience it will be slower. Another reason why I use MassDNS is that masscan is only needed if I want to look for additional ports.

After having created a final list to work with, I turn to MassDNS to determine which of the hosts are actually online.
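
MassDNS needs a list of resolvers to spread its queries across (the resolvers.txt in the command below). If you don’t already have one, the MassDNS repository ships a starter list, though public resolver lists go stale, so it’s worth validating them yourself:

$ curl -sO https://raw.githubusercontent.com/blechschmidt/massdns/master/lists/resolvers.txt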

$ massdns -r resolvers.txt -t A -o S -w massdns.out hosts-inscope.txt
Note: It’s important to keep in mind that things can get funky with automated tools, so they should never be trusted 100%. Manual verification is often highly recommended and sometimes required.

The MassDNS output looks like this:

inside-stg.tesla.com. A 209.10.208.14
events.tesla.com. A 13.111.47.195
lr.tesla.com. A 36.86.63.182
marketing.tesla.com. A 13.111.47.196
...

The first part is the host, the second part is the DNS record type, such as A or CNAME, and the final part is the IP or host which the host resolved to.
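
Since the record type sits in the second column, it’s easy to filter on it. For example, to list only the CNAME records:

$ awk '$2 == "CNAME"' massdns.out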

Then it’s time to parse some data again.

$ cat massdns.out | awk '{print $1}' | sed 's/.$//' | sort -u > hosts-online.txt

Here I take the output in massdns.out, use awk to get the first field, then use sed to remove the trailing dot, and finally sort and remove duplicates before the result is written to hosts-online.txt.

This marks the end of my current host enumeration workflow. What happens with the data retrieved up to this point will be documented in part two of this series, where I will focus on web reconnaissance.

Bonus Steps

Masscan

After determining which hosts are online, the next interesting part is to find open ports. I could use nmap, but it’s way too slow for this job, and masscan is perfect.

Before I get started I need to get all the IPs from the previous massdns.out:

$ cat massdns.out | awk '{print $3}' | sort -u | grep -oE "\b([0-9]{1,3}\.){3}[0-9]{1,3}\b" > ips-online.txt

Finally it’s time to run masscan, providing it with the list of online IPs. The following command will send 10,000 packets every second and check all 65,535 ports. It will report back only the ones that are seen as open, and the output will be written to a simple list file named masscan.out.

$ sudo masscan -iL ips-online.txt --rate 10000 -p1-65535 --only-open -oL masscan.out

The content of masscan.out will look something like this:

$ cat masscan.out 
#masscan
open tcp 25 13.111.18.27 1556466271
open tcp 80 209.133.79.61 1556466775
open tcp 443 209.133.79.66 1556467350
...

Once this is complete I can use nmap to see which services are running on each port:

$ nmap -sV -p[port,port,...] [ip]
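
Rather than running nmap one host and port at a time, the masscan output can be grouped per IP first. A minimal sketch, assuming the -oL format shown above (port in the third column, IP in the fourth):

$ awk '$1 == "open" { p[$4] = p[$4] "," $3 } END { for (ip in p) print ip, substr(p[ip], 2) }' masscan.out |
  while read -r ip ports; do nmap -sV -p"$ports" "$ip"; done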

Final Words

That’s it for my very first article, and the first post in my recon series.

I hope you enjoyed this article and that you maybe picked up an idea or two from it. If not, thanks for taking the time to read it anyway. I will continue this blog series and share what I learn.

If you have any questions or suggestions, don’t hesitate to drop a comment below!

