1. Crawling
    Crawling the billions of documents, pages, files, news, videos, and media on the Web.
  2. Indexing
    Once the engines do their job, they have all this information about each site and they store it in massive databases. This information is then accessed when it is needed for a search query.
  3. Serving
    Google then takes all that information, runs it through a series of algorithms, and determines your ranking for a given search query.

1. Crawling

A robot, spider, or crawler is a program whose sole purpose is to scan your website and every other site in the world.

These spiders, crawlers, or robots begin by visiting a few web pages, reading and storing the content they find, and then following any links contained on those pages.

Imagine the Web as a network of stops in a city’s transportation grid.

Each stop is a unique document (usually a web page, but sometimes a PDF, JPG, or other file).

The search engines need a way to get from one destination to another throughout the city. They do this by following links.

The link structure of the web serves to bind all of the pages together.
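
To make that crawl-and-follow loop concrete, here is a minimal sketch in Python. The start URL is hypothetical, and the requests and beautifulsoup4 libraries are just one convenient choice; real search-engine crawlers are far more sophisticated (robots.txt, politeness delays, scheduling, and so on).

    # A toy crawler: fetch a page, store its text, then follow its links.
    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    def crawl(start_url, max_pages=10):
        to_visit = [start_url]  # "stops" we still need to reach
        seen = set()            # stops we have already visited
        stored = {}             # url -> page text (the stored content)
        while to_visit and len(stored) < max_pages:
            url = to_visit.pop(0)
            if url in seen:
                continue
            seen.add(url)
            response = requests.get(url, timeout=10)
            soup = BeautifulSoup(response.text, "html.parser")
            stored[url] = soup.get_text()  # read and store the content
            # Follow every link on the page to discover new stops.
            for anchor in soup.find_all("a", href=True):
                to_visit.append(urljoin(url, anchor["href"]))
        return stored

    pages = crawl("https://example.com")  # hypothetical start page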

2. Indexing

Once the engines do their job, they have all this information about each page and they store it in massive databases. This information is then accessed when it is needed for a search query.

When you perform a Google search, you aren’t actually searching the web.

You are searching Google’s index of the web.

  • Googlebot processes each of the pages it crawls in order to compile a massive index of all the words it sees and their location on each page.
  • In addition, Google will read and process information included in key content tags and attributes, such as Title tags and ALT attributes.
  • Googlebot can process many, but not all, content types. For example, it cannot process the content of some rich media files or dynamic pages.
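
As a rough sketch, that massive index of words and their locations behaves like the toy inverted index below. The tokenizer and sample pages are made up for illustration; Google’s real index is vastly larger and also folds in signals like title tags and ALT text.

    import re
    from collections import defaultdict

    # A toy inverted index: for each word, record every (url, position)
    # where it appears. Searching then means looking words up here,
    # not re-reading the web.
    def build_index(pages):
        index = defaultdict(list)
        for url, text in pages.items():
            for position, word in enumerate(re.findall(r"\w+", text.lower())):
                index[word].append((url, position))
        return index

    # Hypothetical crawled pages, keyed by URL.
    pages = {
        "https://example.com/a": "SEO basics: crawling and indexing",
        "https://example.com/b": "Indexing builds the searchable index",
    }
    index = build_index(pages)
    print(index["indexing"])  # [('https://example.com/a', 4), ('https://example.com/b', 0)]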

3. Serving

Google then takes all that indexed information and runs it through a series of algorithms.

Algorithm: a series of questions

  • How many times does this page contain your keywords?
  • Do the words appear in the title, URL, or content?
  • Does the page include synonyms for those words?
  • Is this page from a high-quality website or a low-quality one?
  • What is the page’s PageRank?
  • …and many more

So Google asks around 200 of these questions, and it does so in about 0.5 seconds. Wow!

All of these factors are combined into a score for each page, and that score determines the order in which the results are delivered to you.
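
A drastically simplified sketch of that scoring step might look like the function below. The factors and weights here are invented purely for illustration; Google’s real signals and their weights are not public.

    # A toy ranking function: combine a few made-up factors into one score.
    def score_page(page, query_words):
        words = page["text"].lower().split()
        keyword_hits = sum(words.count(w) for w in query_words)  # keyword count
        in_title = any(w in page["title"].lower() for w in query_words)
        return (
            1.0 * keyword_hits
            + 5.0 * in_title            # words in the title count for more
            + 10.0 * page["pagerank"]   # hypothetical link-based quality score
        )

    pages = [
        {"title": "SEO Guide", "text": "seo tips for ranking", "pagerank": 0.8},
        {"title": "Cooking", "text": "seo mentioned once", "pagerank": 0.3},
    ]
    query = ["seo", "ranking"]
    # Deliver results in descending score order, as described above.
    results = sorted(pages, key=lambda p: score_page(p, query), reverse=True)
    print([p["title"] for p in results])  # ['SEO Guide', 'Cooking']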

Organic: 85-90%
Search results produced by the process we just described.

PPC: 10-15%
Advertisers bid for placement in these positions via Google AdWords or Bing AdCenter.

The ONLY way to influence the organic rankings is with SEO.

Ready to get started?

Get in touch, or sign up for a service.