Which side are you on in the Search Engine arms race?


Discusses the escalating arms race between search engines and the web marketers trying to find loopholes in how search engines rank sites.
Ever since the birth of the web and the introduction of search engines, there has been a constant battle between search engines and the marketers trying to circumvent their algorithms.
Search engines collect the text from your pages and use it to analyze your web site. If someone then searches for a subject covered on your site, your pages will show up on the results page.
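As a rough illustration of how this collection and lookup works, here is a minimal sketch of an inverted index in Python. The example URLs, page texts, and tokenization are invented for illustration; real engines are vastly more sophisticated:

```python
import re
from collections import defaultdict

# Hypothetical pages, standing in for crawled documents.
pages = {
    "example.com/fishing": "fly fishing tips for trout fishing",
    "example.com/cooking": "simple trout recipes for home cooking",
}

# Build an inverted index: each word maps to the pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in re.findall(r"[a-z]+", text.lower()):
        index[word].add(url)

# A search for "trout" returns every page that mentions it.
print(sorted(index["trout"]))
# ['example.com/cooking', 'example.com/fishing']
```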
However, if you have not optimized at least some of your pages, your site might show up far down the results, and in many cases not at all.
Because search engine rankings are computed by machines and not by humans, there have always been loopholes that people have been able to exploit.
In the beginning there were no limits on how many keywords you could put in the keywords meta tag. The keywords meta tag was introduced so that site owners could tell the search engines what a page was about. Many webmasters started to fill the tag with redundant words and words that were not relevant to the page.
Then people started putting more and more of their search terms into the body text. At the time, search engines ranked sites according to how many times the keywords appeared in the text. To hide this stuffing, people set the text in the same color as the background.
Soon the search engine algorithms changed: pages that used a keyword many times in the body text rose to the top, but were penalized if it appeared too often. This means that today you should use your keywords frequently, but not too frequently, in your text.
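A crude way to see where a page stands on the "often but not too often" spectrum is to compute keyword density, the share of the body text made up of the keyword. The sketch below uses an invented example and an invented threshold; search engines do not publish their real limits:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the words in `text` equal to `keyword`."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

body = "trout fishing is fun and trout fishing is very relaxing"
density = keyword_density(body, "trout")
print(f"{density:.0%}")  # 20% -- 2 of 10 words

# Purely illustrative threshold: a page far outside a sane range
# risks either obscurity or a keyword-stuffing penalty.
if density > 0.10:
    print("possibly over-optimized")
```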
Then the search engines started weighing the number of links into and out of a site when calculating page relevance.
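Google's PageRank is the best-known example of this link weighting. Below is a heavily simplified sketch of the idea, using an invented three-site link graph; real engines combine this with many other signals:

```python
# Simplified PageRank: a page is important if important pages link to it.
# The link graph below is invented for illustration.
links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}

damping = 0.85
rank = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for page in links:
        # Each linking page passes on its rank, split across its outlinks.
        inbound = sum(
            rank[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_rank[page] = (1 - damping) / len(links) + damping * inbound
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
# c.com ranks highest: it receives links from both other pages.
```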
Webmasters soon realized that if many links pointed to your site from external link-collection sites, your ranking would improve. So someone invented FFA pages. FFA stands for Free For All. These are pages with constantly rotating links. The idea is that if such a page is found by a search engine spider, the page will be indexed and your site's link popularity will increase by one link.
The search engines' countermeasure was to start looking at the quality of the links pointing to a site. They also began checking whether the links came from web sites dealing with the same subject as the site being linked to. FFA pages may have worked at some point in the past, but today they are totally useless.
You can link internally between the pages of your site, but you should not have too many such links. There are better ways: you can criss-cross links between different sites on different domains to improve link popularity.
Today these techniques are all banned by the search engines. If they discover abuse, meaning a method used for one purpose only, namely to increase ranking, you risk being deleted from their database.
One technique that has been abused by some shadowy search engine optimization companies is cloaking. One reason is that it is easy to do. Web optimization is often quite labor intensive, but with cloaking it can be done at once while presenting seemingly good results to clients. The clients get the traffic, but usually, after some time, they are suddenly kicked out of the search engines' databases when the abuse is discovered.
Cloaking means that the search engines see an artificially large site with the important search terms optimized, while in reality web surfers see a small site without any optimization.
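In practice, cloaking usually amounts to checking who is asking for the page. Here is a minimal sketch of the trick; the spider names, user-agent strings, and page bodies are placeholders, and this is shown to explain the technique, not to recommend it:

```python
# Sketch of user-agent cloaking: crawlers get keyword-stuffed copy,
# human visitors get the ordinary page. All strings are illustrative.
KNOWN_SPIDERS = ("googlebot", "bingbot", "slurp")

def serve_page(user_agent: str) -> str:
    if any(bot in user_agent.lower() for bot in KNOWN_SPIDERS):
        # Version shown only to search engine spiders.
        return "trout fishing trout fishing ... (stuffed page)"
    # Version shown to real visitors.
    return "Welcome to our fishing shop."

print(serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(serve_page("Mozilla/5.0 (Windows NT 10.0)"))
```

This also suggests why cloakers eventually get caught: a search engine can fetch the page again with a browser-like user agent and compare it against the version its spider was served.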
More and more, the search engines are winning this battle, but there will always be loopholes.
Seen from the search engines' point of view, what they want is to present the best possible results for the viewer: ideally, exactly what the web surfer wants to see.
Today it is more and more the content that determines whether a site shows up in the search results. A site needs to be popular, with many relevant inbound links. Its pages need to be optimized. And the more pages it contains, the better ranking it gets.
It is no coincidence that Google is today the leading free search engine company. They have been persistent in improving their relevance algorithms and in fighting spam attempts from web marketers.
Per Strandberg www.catch-traffic.com Web Optimization Made Easy!
About the Author
Per Strandberg has been working with software and web development since the beginning of the web. He is currently involved in web marketing and has created a keyword-distribution tool to help webmasters better optimize their sites. He has also published an e-book on search engine optimization, and he now applies his methods to affiliate marketing.