Web application security - the fast guide 1.1 | Page 73
Chapter 4 - Be the attacker
http://theSiteName.com/old/en/about
…….
For example, adding robots.txt to your directory brute-force list might let you retrieve that file if it exists. It is a very good source of information, because an attacker can use the indexing rules it contains to map special folders or files. If the file contains a rule such as (Disallow: /something), that is a strong hint that (something) holds sensitive content or refers to an administrative page that the administrator does not want indexed.
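The mapping described above can be sketched as a small parser that pulls the Disallow rules out of a robots.txt file. The sample file content below is hypothetical; in a real engagement you would fetch the file over HTTP from the target first.

```python
# Sketch: extract Disallow rules from a robots.txt file to map folders
# the administrator does not want indexed. The sample content is made up;
# replace it with the body of the target's actual robots.txt.
sample = """User-agent: *
Disallow: /admin/
Disallow: /backup/
Allow: /public/
"""

def disallowed_paths(robots_txt: str) -> list[str]:
    """Return the paths listed under Disallow rules."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop inline comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow means "allow everything"
                paths.append(path)
    return paths

print(disallowed_paths(sample))  # → ['/admin/', '/backup/']
```

Each returned path is a candidate starting point for further enumeration.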
4.9 Other sources of public information:
Figure 25: Search results for archived copies of a website on archive.org
A great deal of useful information about a site's functionality and content is available publicly, outside the website itself. It can be reached through search engines and their cached copies, through posts on development forums, or through web archives such as the one at www.archive.org.
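Archived copies can be located programmatically. As a sketch, the Wayback Machine exposes a public availability API at archive.org/wayback/available; the helper below only builds the request URL (the hostname theSiteName.com is the placeholder from the earlier example, and no request is actually sent here).

```python
# Sketch: build a Wayback Machine availability-API request URL for a target.
# The endpoint is archive.org's public availability API; the target hostname
# is a placeholder. Fetching the URL returns JSON describing the closest snapshot.
from urllib.parse import urlencode

def wayback_lookup_url(site: str, timestamp: str = "") -> str:
    """Return the availability-API URL for `site` (timestamp is YYYYMMDD)."""
    params = {"url": site}
    if timestamp:
        params["timestamp"] = timestamp
    return "https://archive.org/wayback/available?" + urlencode(params)

print(wayback_lookup_url("theSiteName.com", "20100101"))
# → https://archive.org/wayback/available?url=theSiteName.com&timestamp=20100101
```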
To use search engines effectively, try the special search operators, such as the following, which work with Google:
site:www.theExploredSite returns all pages of the site indexed by Google.
site:www.theExploredSite login returns all indexed pages of the site that contain the word login.
link:www.theExploredSite returns pages on other websites that link to that specific site.
related:www.theExploredSite returns similar web pages.
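The operators above can be assembled mechanically for any target. A minimal sketch, reusing the placeholder hostname from the list:

```python
# Sketch: build the Google search-operator queries listed above for one target.
# "www.theExploredSite" is the placeholder from the text, not a real host.
target = "www.theExploredSite"

queries = {
    "all indexed pages": f"site:{target}",
    "login pages": f"site:{target} login",
    "inbound links": f"link:{target}",
    "similar sites": f"related:{target}",
}

for purpose, query in queries.items():
    print(f"{purpose}: {query}")
```

Each query string can then be pasted into the search engine, or fed to whatever search tooling the engagement allows.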
Another valuable source of information is special-purpose search engines that embed intelligence dedicated to retrieving a specific type of information. Melissa Data, for example, can help you freely gather information about people associated with a target web application; this kind of information sometimes matters more to the attacker than technical details. To enrich the retrieved