Dorks With DonJuji

The right way to make dorks by hand, by DonJuji

The dorks I see people making with tools are just... disgusting. This is a quick
tutorial on how to write decent dorks.
First of all, it's important to understand what a dork is, because most people
don't even seem to know what the fuck they are.
They are Google searches. That's all they are. Google allows special syntax for
advanced searching, and most dorks use that syntax. Not all of them, though: some
of the best dorks I have found are simply plain text.

Example: "All times are GMT" "This page was generated at"

This is a dork that will give you vBulletin 5 message boards.


It works because almost all vB 5.x message boards contain that text on every page,
and it is unique to 5.x because the wording of that message is slightly different
in previous versions.
Dorking is simply the process of identifying "constants": words or numbers that DO
NOT CHANGE from webpage to webpage.
Dorking can be used to find many things, not just vulnerable websites.
Dorking can be used to find many things, not just vulnerable websites.

Example: intitle:"index of" "slimshady LP" mp3

This will return index pages full of Eminem MP3s. Personally, I use dorking to find
all sorts of things.

To dork well, you must understand the different parameters search engines give us
to use.

Here is a basic explanation of those:

cache:[url] -- Shows the version of the web page from the search engine's cache.

related:[url] -- Finds web pages that are similar to the specified web page.

info:[url] -- Presents some information that Google has about a web page, including
similar pages, the cached version of the page, and sites linking to the page.

site:[url] -- Finds pages only within a particular domain and all its subdomains.

intitle:[text] or allintitle:[text] -- Finds pages that include a specific keyword
as part of the indexed title tag. In Bing you must include a space between the
colon and the query for the operator to work.

inurl:[text] or allinurl:[text] -- Finds pages that include a specific keyword as
part of their indexed URLs.

meta:[text] -- Finds pages that contain the specific keyword in their meta tags.

filetype:[file extension] -- Searches for specific file types.

intext:[text], allintext:[text], inbody:[text] -- Searches the text of the page.
For Bing and Yahoo the operator is inbody:[text]; for DuckDuckGo it is
intext:[text]; for Google either intext:[text] or allintext:[text] can be used.

inanchor:[text] -- Searches link anchor text.

location:[iso code], loc:[iso code], or region:[region code] -- Searches a specific
region. For Bing use location:[iso code] or loc:[iso code]; for DuckDuckGo use
region:[region code].

contains:[text] -- Identifies sites that contain links to the specified filetype
(e.g. contains:pdf).

altloc:[iso code] -- Searches for a location in addition to the one specified by
the language of the site (e.g. pt-us or en-us).

domain:[url] -- Wider than the site: operator; locates any subdomain containing the
"suffix" of the main website's URL.

feed:[feed type, e.g. rss] -- Finds RSS feeds related to the search term.

hasfeed:[url] -- Finds webpages that contain both the term or terms for which you
are querying and one or more RSS or Atom feeds.

imagesize:[digit, e.g. 600] -- Constrains the size of returned images.

ip:[ip address] -- Finds sites hosted at a specific IP address.

keyword:[text] -- A meta-operator; that is, an operator used with other operators.
It takes a simple list as a parameter, and all the elements in the list are
searched as and/or pairs together. For example, keyword:(intitle inbody)software
is equivalent to intitle:software OR inbody:software.

language:[language code] -- Returns websites that match the search term in a
specified language.

book:[title] -- Searches for book titles related to keywords.

maps:[location] -- Searches for maps related to keywords.

linkfromdomain:[url] -- Shows websites that the specified URL links out to (with
errors).

OK, now you probably aren't going to need all of those. Here are the ones I use
most:

filetype:
intitle:
intext:
inurl:
site:
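If you write a lot of dorks by hand, a tiny helper keeps the syntax consistent. A minimal sketch in Python (the `build_dork` function is my own, not any standard API; the operator names come from the list above):

```python
def build_dork(*phrases, **operators):
    """Assemble a dork string from exact phrases and operator:value pairs.

    Phrases are wrapped in double quotes so the engine treats them as
    exact matches; operators are emitted as name:value with no space
    after the colon, which is the syntax Google expects.
    """
    parts = [f'"{p}"' for p in phrases]
    parts += [f"{op}:{val}" for op, val in operators.items()]
    return " ".join(parts)

# e.g. the vBulletin 5 constant from earlier, narrowed to one site
print(build_dork("All times are GMT", site="example.com"))
# → "All times are GMT" site:example.com
```

Nothing magic here; a dork really is just a string, which is the whole point of this tutorial.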

On top of those you have modifiers like "-".

If you put - in front of one of these, like this: -intitle:cats

you will get no results that have "cats" in the title, because the - in front
means "remove this". Without it, you would get ONLY results with "cats" in the
title.

So, as a practical example, I will show you how to find some SQL databases that
bad admins have accidentally exposed and that got indexed by Google.

The file extension for SQL database backups is of course ".sql", so we can start with

filetype:sql

Type that in and you will get mostly example databases, a lot of them from
GitHub.

To filter those out, let's apply the - modifier to site:github.com and add it to
our dork:

filetype:sql -site:github.com

It still looks like a bunch of bullshit, so let's narrow it down even more by
adding something we know is a CONSTANT in SQL dumps. If you look through SQL
dumps, you will notice that a lot of them have

"Table structure for table `tablenamehere`"

or something similar for each table.

So I'm going to take this and use "Table structure for table `users`", because I
want to make sure my SQL database results have users tables.

So now, throwing it all together, we have:

filetype:sql -site:github.com "Table structure for table `users`"

Enter that into Google and you get several public-facing databases with users,
emails, passwords and more.

example: https://www.bioinformatics.org/phplabware/sourceer/OrderSys/ordersys.sql

which contains
INSERT INTO `users` (`ID_user`, `name`, `md5_password`, `status`, `group`,
`comment`, `username`) VALUES (1, 'Common Lab', 'c833584a58d05124ca69af49805e6c20',
'Current', 'Administrator', '', 'root');

That is the MD5 hash of the site administrator's password. If it hasn't been
changed since that dump, it can be dehashed, used to log in, and the site can be
fully owned.

And none of this so far required any tools. This is simply Google we are using:
no scanners, no SQLi Dumper, nothing but Google.

THIS is real dorking.
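Since a dork is nothing but a query string, you can turn one into a clickable search URL with the standard library alone. A minimal sketch (the `google_search_url` function is mine, not an official API; only the `q=` parameter of Google's search endpoint is assumed):

```python
from urllib.parse import quote_plus

def google_search_url(dork):
    """Return a Google search URL for the given dork string.

    quote_plus percent-encodes quotes, colons, and backticks, and
    turns spaces into '+', so the whole dork survives as one query.
    """
    return "https://www.google.com/search?q=" + quote_plus(dork)

dork = 'filetype:sql -site:github.com "Table structure for table `users`"'
print(google_search_url(dork))
```

This is as far as "tooling" needs to go; the search engine is still doing all the work.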

Now you can take these principles and apply them in bulk with scanners like the
one included in SQLi Dumper to get SQLi results, if you like.

Here are some practical examples for that:

Now, first of all, the main issue I see with people's dorks is that they are
entirely too specific, with way too many parameters, many of which totally
contradict each other.

For SQLi you usually want PHP files, so filetype:php goes a long way.
In fact, I have gotten many of my higher-quality SQLi results in the past simply
by taking lists in my niche from Wikipedia, like

LIST OF ALL PS4 VIDEO GAMES

LIST OF ALL ANIME

etc., and just adding filetype:php next to each entry in the list.

Simple as that.
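That list-based trick is trivial to script. A sketch that turns a plain list of titles into one dork per line (the `titles_to_dorks` helper and the example titles are mine; it does exactly what the paragraph above describes, nothing more):

```python
def titles_to_dorks(titles, suffix="filetype:php"):
    """Quote each title as an exact phrase and append a shared operator,
    producing one dork per list entry. Blank entries are skipped."""
    return [f'"{t.strip()}" {suffix}' for t in titles if t.strip()]

for dork in titles_to_dorks(["Ocarina of Time", "Final Fantasy VII"]):
    print(dork)
# → "Ocarina of Time" filetype:php
# → "Final Fantasy VII" filetype:php
```

One plain-text list in, one dork per line out; the simplicity is the point.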

The constant you are looking for with SQLi is basically PHP pages with parameters.

To ensure the results have params, you can add something like

inurl:?id=
or inurl:game_id=

You don't need 12 different complex parameters and random-as-fuck extensions
defined. KEEP IT SIMPLE OR YOU WILL GET FEWER RESULTS

MORE PARAMETERS IS NOT MORE HQ!!!

Example of a good dork:

inurl:".php?id=1" "ocarina of time"

That's going to give you 5k fucking results or something, and all of them will be
potentially vuln.

Example of a shit dork:

allintext:csgo + intitle:leagueofegends + inurl:gamingwebsite.aspx site:com

That's going to give you 1-2 results, if any, and probably not even vuln.
