Advanced Reconnaissance: Compiling gathered information

In my previous posts we went over several reconnaissance tactics and tools of the trade. That was just a start: many of these techniques are manual and take a fair amount of time to execute, and the gathered information is now scattered across different files in different formats. It is time to take this to the next level and compile the data, so the next steps become faster and more efficient. There are many ways to compile it, and the right one depends on the intended use of the information. Here I’ll show you how to organize your information in a robust and flexible way, using as an example the data collected on SANS, namely the subdomains.

Gathering all the files

If you have been paying attention to my previous posts, you know that I’ve been collecting data on SANS and putting the individual files inside a single folder.

Files gathered under a single folder

Looking at the files, the most important ones are:

  • final-sans.org.txt – output from Osmedeus subdomain module
  • SANS-Maltego.csv – Exported results from Maltego
  • stash.sqlite – output from theHarvester
  • sublister.txt – output from Sublist3r

Apart from these files, we also have the results from recon-ng (data.db), still in its original folder.

Recon-ng output folder

Having an individual folder for each target is a matter of choice, or style if you prefer. I find it convenient because it saves me time.

Creating a database

Inside the folder dedicated to this target, I am now going to create a new SQLite database using DB Browser for SQLite, the tool shipped with Kali Linux.

Finding DB Browser for SQLite on Kali's menu

  • Open the tool and create a new database inside the target’s folder. I called it “SANS.db”.

Creating a new database

  • Create a table. I called mine “AllDomains”. Add two text fields to it (the equivalent SQL is sketched below):
    • Host
    • Origin

Creating a new table
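
If you prefer working without the GUI, the SQL behind this step is roughly the following. This is a minimal sketch; the table and column names simply mirror the ones chosen above.

-- a two-column table to hold the compiled subdomains
CREATE TABLE AllDomains (
    Host   TEXT,  -- the subdomain itself, e.g. www.sans.org
    Origin TEXT   -- where the record came from, for tracking only
);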

  • The database should look like this:

New database

  • Import the results from the text files into new tables (a command-line alternative is sketched a little further down)

Importing text files

  • For simplicity, name the new tables after the application that produced them

Naming the tables
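
If you would rather script this step than click through the import dialogs, the sqlite3 shell can do the same job. A sketch, assuming the table names above, that each text file holds one host per line, and that the Maltego CSV has a header row:

-- inside a sqlite3 session opened with: sqlite3 SANS.db
CREATE TABLE sublister (Host TEXT);
.mode csv
.import sublister.txt sublister
.import SANS-Maltego.csv maltego

When the target table already exists, .import loads every line as data; when it does not, the first line of the file becomes the column names.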

  • Now the database should look like this:

Database with the imported tables

  • Write Changes
  • Now attach the SQLite database from theHarvester

Attaching theHarvester's database

  • I always name things properly

Naming the attached database

  • Now you have an additional database to get results from

theHarvester database attached

  • Let’s attach recon-ng’s database too, shall we? (The equivalent ATTACH statements for both attachments are sketched below.)

Attaching recon-ng's database
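
DB Browser handles the attachment through its File menu, but under the hood these are plain ATTACH statements. A sketch, assuming theHarvester’s stash.sqlite sits in the target folder; adjust the path of data.db to wherever your recon-ng workspace lives:

ATTACH DATABASE 'stash.sqlite' AS theHarvester;  -- theHarvester results
ATTACH DATABASE 'data.db' AS reconng;            -- recon-ng workspace database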

  • And now we have 5 sources of data on subdomains of the sans.org domain:

New database with all external sources

Compiling the data into the new table

The goal now is to gather the relevant data from all the available sources and place it in a single location: the Host field of the AllDomains table.

Take the time to study your data sources and you will realize that theHarvester collects a lot of URLs and mixes them in with the hosts. Therefore, we must filter the data by selecting only the records ending in “sans.org”. In addition, we only want hosts, not the emails and other data.

  • This can be achieved by running a single SQL command (a sketch follows):

Using SQL to combine all useful records
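
I’m not reproducing the exact statement from the screenshot, but its general shape is an INSERT ... SELECT with UNION across the sources. A sketch, assuming the imported tables are named sublister, maltego and osmedeus with a Host column, that theHarvester stores its findings in a results table with a hostname column (check your stash.sqlite, the actual names may differ), and using the reconng alias from the previous step:

INSERT INTO AllDomains (Host)
SELECT Host FROM sublister WHERE Host LIKE '%sans.org'
UNION
SELECT Host FROM maltego WHERE Host LIKE '%sans.org'
UNION
SELECT Host FROM osmedeus WHERE Host LIKE '%sans.org'
UNION
SELECT host FROM reconng.hosts
UNION
-- assumed schema for theHarvester's stash.sqlite; verify before running
SELECT hostname FROM theHarvester.results WHERE hostname LIKE '%sans.org';

Using UNION instead of UNION ALL already drops exact duplicates between the sources, and the LIKE filter discards the records that do not end in sans.org; a few odd entries will still slip through, which is what the next step takes care of.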

  • We still have some strange domains that need to be expunged (an example clean-up query is sketched below).

Deleting bad records
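
What counts as “strange” depends on what your sources dragged in. A hedged example of the kind of clean-up I run; adjust the patterns to the junk you actually see in your own table:

-- drop anything that is not a plain host under sans.org
DELETE FROM AllDomains
WHERE (Host NOT LIKE '%.sans.org' AND Host <> 'sans.org')
   OR Host LIKE '%/%'   -- leftover URL fragments
   OR Host LIKE '%@%';  -- leftover e-mail addresses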

Now you have a table with all the subdomains, and that can be the embryo of some serious information gathering on your target.

  • If you want, just for tracking purposes, state the origin of your data:

UPDATE AllDomains SET Origin = 'Compiled';

Final table with all data compiled

This is obviously just a simple example to illustrate the basics of my compilation method. In a real-life scenario, I would add the IP addresses, open ports, etc.

Feel free to expand this method for emails, contacts, etc.

Compiling the data into recon-ng

Another possibility, and something I usually do, is to send all this data back to recon-ng in order to dig a bit deeper using the nice modules available in the tool. There are at least two distinct approaches:

Adding the data to recon-ng

Let’s start by checking how many duplicates we have in the hosts table (the query is sketched below).

Checking for duplicates
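
A minimal way to see them is a GROUP BY on the host column; run it against data.db directly, or prefix the table with reconng. if you still have it attached:

-- hosts that appear more than once in recon-ng's hosts table
SELECT host, COUNT(*) AS occurrences
FROM hosts
GROUP BY host
HAVING COUNT(*) > 1
ORDER BY occurrences DESC;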

  • Insert data from external sources directly into the recon-ng hosts table (sketched below)

Adding new records to recon-ng
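
Working on recon-ng’s data.db this time, with the compiled database attached (here as sans), the insert itself is a one-liner; recon-ng’s hosts table has more columns, but those can be left empty for now. A sketch:

ATTACH DATABASE 'SANS.db' AS sans;  -- adjust the path to the target's folder
INSERT INTO hosts (host)
SELECT Host FROM sans.AllDomains;   -- naive insert: hosts recon-ng already knows will now be duplicated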

  • How many duplicates do we have now?

Checking for new number of duplicates

This might look like a bad outcome, but you can easily remove the duplicates if you want to (one way is sketched below).
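
In SQLite one way is to keep a single row per host and delete the extra copies (if the duplicate rows carry different extra data, such as IP addresses, merge that first):

-- keep the first row for each host, drop the other copies
DELETE FROM hosts
WHERE rowid NOT IN (SELECT MIN(rowid) FROM hosts GROUP BY host);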

Replacing the data in recon-ng

But why not start with a fresh set of hosts, without duplicates and with no extra information?

Let’s imagine you don’t have a compiled results table yet. You can create a new one, compile all the available data there, delete everything from the hosts table, and copy everything back into the now-empty table.

  • This can be done sequentially in a single SQL run (sketched below)

Replacing all recon-ng hosts
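
Again working on data.db with the compiled database attached as sans, the whole replacement is a short sequence; wrapping it in a transaction lets you roll back if something goes wrong:

BEGIN TRANSACTION;
DELETE FROM hosts;                          -- wipe recon-ng's current hosts
INSERT INTO hosts (host)
SELECT DISTINCT Host FROM sans.AllDomains;  -- refill with the compiled, de-duplicated set
COMMIT;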

  • This is the result

New recon-ng hosts table

We had 494 hosts; now we have 770.

All we have to do now is go back to recon-ng and run some of the modules taking advantage of the new set of hosts found by the other footprinting tools.

This is the advanced way of doing reconnaissance: iteration after iteration, compiling, filtering and analyzing.


Next post: Introduction to Scanning
