Free SEO Techniques Simplified

Search Engine Optimization is a constantly evolving mix of art and science.  A genuine search engine optimization specialist is very much in demand and usually makes a lot of money.  (In my case, I’ll work for slightly less than a lot.) It’s all about doing what is necessary to get the pages of your website indexed (added to the search engines) and then ranked.

The way the search engines and SEO work…

In order to decide how they are going to index and rank websites and their individual pages, the major search engines send out a program known as a search engine robot (a ‘bot’), or ‘spider’, to work out what each of those pages is about.

What the search engines are actually looking for when they send their spider to a website is something of a mystery, with Google in particular being unwilling to give specific guidelines on how to create a website that it will like. Yahoo and MSN are more transparent, but many SEO experts would suggest that their indexing and ranking systems are considerably more old-fashioned than Google’s.

To complicate things even further, Google changes its search algorithms on a very regular basis, at least partially because the better SEO experts become far too good at ‘second-guessing’ what it is doing!

How to optimize

The first step in optimizing your website for the search engines is on-page optimization. Although Google does not pay a lot of attention to it, on-page optimization is still important because the other major search engines like Yahoo, MSN and Ask still focus on it. On-page optimization means making the underlying code used to create each of the pages of your site as search engine friendly as possible.

Each web page is written in code that your web browser converts into the information and images that show on the screen as a web page. Most of that code is there because it creates the information and images on the screen, but a portion of it exists solely to provide background information to the search engine spiders.

If you are unfamiliar with how this underlying code looks, open a page from any site in your favorite browser, right-click on the page, and choose ‘View Source’ if you’re using Internet Explorer or ‘View Page Source’ in Mozilla Firefox.  Pay particular attention to what is at the top of the page, between the <head> and </head> tags, and more specifically to the items labeled “meta”.  These are called meta tags.
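
For example, the head section of a hypothetical page about mousetraps might contain something like the following (the wording here is invented purely for illustration):

    <head>
      <!-- The title shown in the browser tab and in search results -->
      <title>Humane Mouse Traps | Mouse Traps by JR Smith</title>
      <!-- Meta description: a short summary the search engines can read -->
      <meta name="description" content="Humane mouse traps that catch mice without harming them.">
      <!-- Meta keywords: the search terms this page is targeting -->
      <meta name="keywords" content="humane mouse traps, mouse traps">
    </head>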

On-page optimization is about ensuring that this code is written so that it does everything possible to tell the search engines exactly what each of the pages of your site is about. The more effectively you do this, the more accurately they are going to index and then rank that page for the correct search term, improving the chances of that page being found by a searcher who is specifically looking for the kind of information you are providing.

As search engine gurus became increasingly savvy with “on-site” optimization, they learned to game the system, which in turn made search engine results increasingly unreliable.  When you build a webpage you include a few keywords and keyword phrases that you want to target in the search engines to get traffic to your page. At one time there was no limit to the number of keywords you could target, and you could even include keywords that had nothing at all to do with the content of your page.  Unscrupulous web developers and SEO experts would create extensive keyword lists stuffed with popular search terms, ensuring that individual pages showed up in searches for those fashionable terms.  This became a huge concern for Google, Yahoo and MSN.  It was annoying their advertisers, and those advertisers are the foundation of their business.  As a result they changed the rules.  These days, if there are more than 10-20 keywords and phrases, the search engine spiders simply ignore them.

Since the search engine algorithms are constantly evolving, it is difficult to stay ahead of the curve.  For this reason it is best to rely on a few tried-and-true basics and optimize your site in an honest, straightforward way:

  • Use a domain name that represents your business and describes what you do. For example, if your company is JR Smith, Inc. and you make mousetraps, then your domain name should be something like jrsmithmousetraps.com.  It should be something that is logical and simple for your customers to remember.
  • Build the site for your human visitors. Regardless of how well optimized your site is, remember that no spider or ‘bot’ is ever going to buy your products or services, so putting all of your efforts into impressing the search engines is a waste of time.  In addition, the search engines (and Google in particular) attach a lot of importance to the value and quality of the pages that you show to the visitors they send to your site. They consider how long people spend on your site, and how often they ‘bounce’ (open your page and immediately close it).
  • Ensure that certain things are included in the HTML code of your page. Pay attention to the meta tags. They tell the search engine spider what that page is all about by including the title (a keyword-rich title that can be different from the actual page title), a description and a list of keywords.  These are invisible to the average surfer, but are seen by the search engines. Avoid using the same title, description and keywords on every page.  (The page sketch after this list shows how all of these pieces fit together.)
  • Each page of your site should focus on a maximum of two keyword phrases: one major phrase and one minor phrase. Your title, description and keyword list should feature those keywords, preferably as near to the beginning as possible.
  • Meta tags: The title should be made up of your two keyword phrases divided by a pipe (|), and the description should only be a couple of sentences long. Do not include more than 10-20 targeted keyword phrases, because the search ‘bot’ might ignore your keyword list or even your entire page.
  • Add ‘alt-tags’ to any and all images on your site. These are a simple text description of what the picture shows. While the search engine spiders cannot see images themselves, they can read the text description, which gives them another indication of what your page is about.  These descriptions also show up in cases where a person has their browser set to not show pictures. Make sure that the alt-tag you use is appropriate to your page, because if not, you once again risk confusing the search spider.
  • HTML: Use h1 tags for the main headline of your page. Your sub-headline should use an h2 tag. Highlight bullet points on your page using h3 or h4, and so on.  Tagging important phrases this way indicates to the search robots that the tagged text is important and hints at what your page is about. Use <strong> keywords </strong> (or bold the text if using a WYSIWYG editor) in the body text of your page to highlight your primary keywords, but refrain from doing it every time you use your keyword terms.  Keep in mind that your page has to look attractive to your site visitors. If you use too many heading tags or “strong” tags, the page will look messy, so resist the urge to go crazy.
  • Content: The more of it you have, the better.  Content should be keyword rich, and targeted at your primary keywords for that page. Search engine optimization is about ensuring that your webpage appears close to the top of the natural search results. You need to build your pages and content around keyword phrases that match the terms people actually search for. You also need to pay attention to how much competition there is for a particular keyword phrase. Using a keyword research tool, you can determine not only what search terms are popular in your niche, but how much competition there is for each keyword or phrase. You need to find phrases that people are using to search where there is limited competition. Ideally, in the short term, you are looking for fewer than 30,000 competing pages.  As an example, I used a program called WEB CEO to research the keyword “mouse trap”. It had 125,700 monthly searches and 949,000 competing pages. Changing it to “mouse traps” gave 46,000 searches against 331,000 competing pages.  But if I target “humane mouse traps”, the number of monthly searches drops to 5,000, while the competition is only 19,700.  That means I have over 300,000 fewer competing pages to climb over in the search engines in order to reach a potential 5,000 customers…much better numbers IMHO.
  • You can create individual pages on your website for every low competition keyword phrase you find.
  • As a matter of fact, the more keyword-focused content pages you have, the higher the search engines will rank you, simply because of the numbers involved. Regardless of what your business is, use your keyword list and consistently add new content pages to your site.
  • The last necessary step of on-site optimization is to make it as easy as possible for both your human visitors and the search spiders to find everything that is on your site. To do this, create a site map (there are free tools that will generate one for you) and add it to the homepage of your site. Every time you add more content, update your site map.  In addition, create an XML site map using the same free resource, add it to a page on your site, and link from the homepage to that XML map; a minimal example follows below. You should also submit the XML map to the major search engines like Google, Yahoo and MSN.
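
To pull the points above together, here is a rough sketch of how a single optimized page for the hypothetical “humane mouse traps” example might be structured. All of the names, file names and wording are invented for illustration only:

    <html>
    <head>
      <!-- Title: the two target keyword phrases, divided by a pipe -->
      <title>Humane Mouse Traps | Mouse Traps That Work</title>
      <!-- Description: just a couple of sentences, keywords near the start -->
      <meta name="description" content="Humane mouse traps from JR Smith catch mice without harming them. Simple, reusable mouse traps for your home.">
      <!-- Keywords: well under the 10-20 limit -->
      <meta name="keywords" content="humane mouse traps, mouse traps">
    </head>
    <body>
      <!-- h1 for the main headline, h2 for the sub-headline -->
      <h1>Humane Mouse Traps</h1>
      <h2>Mouse Traps That Catch Without Killing</h2>
      <!-- The alt text tells the spider what the picture shows -->
      <img src="humane-mouse-trap.jpg" alt="Humane mouse trap with a live-catch door">
      <!-- strong tags highlight the primary keyword phrase, used sparingly -->
      <p>Our <strong>humane mouse traps</strong> let you catch and release mice unharmed.</p>
    </body>
    </html>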
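
And, as mentioned in the last point, here is a bare-bones sketch of what an XML site map looks like. The URLs are placeholders based on the earlier jrsmithmousetraps.com example; a free site map generator will produce a file in this same format for your real pages:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.jrsmithmousetraps.com/</loc>
      </url>
      <url>
        <loc>http://www.jrsmithmousetraps.com/humane-mouse-traps.html</loc>
      </url>
    </urlset>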

Easy site navigation is extremely important, so you should try to ensure that people (and the search spiders) can reach any internal page of your site in no more than three clicks and get back to the homepage in one.

This concludes on-site optimization.  In my next post I will discuss off-site optimization.

