Creating A Google-friendly Site


Yes, in case you are doubting it: it is important and necessary to create a site that is friendly to the Google spider (also known as Googlebot), unless you do not want to receive any traffic from Google.

Google is the largest player in the search engine market, handling roughly 80% of all searches (including video, image, and blog searches). Any Internet marketer who feels they can survive online without traffic from Google must be insane.

If you are one of them, you had better change your thinking now!

According to the guidelines on the Google Webmaster blog, a site must meet the following requirements before it can be considered 'Google-friendly':

Give visitors the information they are looking for

If you think that content alone is king, you are wrong. Content is of no use if it is not valuable to your visitors. In actual fact, high-quality content is king, especially on your homepage. If your site contains useful information, that content will attract many visitors and entice webmasters to link to your site.

What you can do: Do keyword research and find the words users would type to find your page. Create a helpful, information-rich, keyword-focused site, and write pages that clearly and accurately describe your topic.

Make sure that other sites link to yours

Google gauges the quality of your site through the quality of the backlinks pointing to it. Quality backlinks help Googlebot find your site faster and can give your site greater visibility in search results.

When returning results for a search, Google combines PageRank with sophisticated text-matching techniques to display pages that are both important and relevant to each search. PageRank is a measure of the number and quality of backlinks to your site: the more quality backlinks you have, the higher your PageRank score.
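To make the backlink idea concrete, here is a minimal sketch of the core PageRank iteration: each page passes a share of its score to the pages it links to, so pages that attract more links accumulate a higher score. This is a simplified illustration only; the page names are made up and Google's actual algorithm is far more complex.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base score, then receives shares
        # from each page that links to it.
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical mini-web: "home" is linked to by both other pages,
# so it ends up with the highest score.
web = {"home": ["about"], "about": ["home"], "blog": ["home"]}
scores = pagerank(web)
```

In this toy graph, "home" receives links from two pages, "about" from one, and "blog" from none, and their scores rank in exactly that order.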

What you can do: Provide such high-quality content on your site that webmasters can't resist linking back to you. In addition, actively seek out high-quality sites in the same niche as yours and request a link exchange. (I strongly recommend SiteSell's Value Exchange.)

Make your site easily accessible

This refers to the structure of your site. There should not be any dead links on any page of your site, and every page should be reachable from at least one static text link.
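For example, a simple navigation list made of plain HTML text links guarantees every page is reachable without relying on scripts (the page names and paths below are hypothetical):

```html
<!-- Plain text links: crawlable by any spider, no JavaScript required. -->
<!-- The filenames here are made-up examples. -->
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/articles.html">Articles</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```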

What you can do:

  • Always use a text browser, such as Lynx, to examine your site. Most spiders, including Googlebot, see your site much as Lynx does.
  • Avoid JavaScript, cookies, session IDs, frames, DHTML, and Macromedia Flash for important content and links; they may be disregarded by the spiders.
  • If you are creating dynamic pages, consider creating static copies of them; this makes crawling easier. If you do create static copies, don't forget to add the dynamic pages to your robots.txt to prevent Googlebot from treating them as duplicates.
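As a sketch, a robots.txt file like the following would block the dynamic, parameterized URLs once static copies exist. The paths are hypothetical; the trailing `*?` wildcard pattern is a Googlebot extension for matching URLs that contain a query string:

```
# robots.txt at the site root (hypothetical paths)
User-agent: *
# Block dynamic script output that now has static copies
Disallow: /page.php
# Googlebot wildcard: block any URL containing a query string
Disallow: /*?
```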

What should you avoid?

  1. Avoid black hat SEO techniques, which include:
    • Keyword cloaking
    • "Crawler-only" pages
    • Invisible links
  2. Don't use images to display important names, content, or links; use text instead, and add descriptive alt attributes to any images you do use.
  3. Don't create multiple copies of a page under different URLs. If you need to create text-only or printer-friendly versions of a page, make sure you block the spider from indexing them via robots.txt or a meta "NOINDEX, NOFOLLOW" tag.
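The two fixes above can be sketched in HTML as follows; the filenames, image, and link text are made-up examples:

```html
<!-- In the <head> of a printer-friendly duplicate page, keep it out of the index: -->
<meta name="robots" content="noindex, nofollow">

<!-- Instead of an image-only link, use a text link with a descriptive alt attribute: -->
<a href="/products.html">
  <img src="/images/catalog.png" alt="Acme Widgets product catalog">
  Browse the product catalog
</a>
```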
Don’t forget to subscribe to the Internet Strategy Blog RSS Feed!
