There are so many myths, legends, and plain rumors surrounding most search engine optimization practices, and they keep multiplying. It seems there is no consensus about anything in SEO these days. On plenty of SEO blogs you can read an article and then find a dozen entries in the same site's archives that contradict it.
And what about all those horror stories of the mythical Google Sandbox, which supposedly holds new sites back from any PageRank or rankings for a year, as if in some sort of virtual purgatory? What is a web site owner supposed to think after reading all this?
Putting all these rumors and gossip aside, there is one thing I can state with certainty. If you build a web site with the user's experience in mind and get a positive response from people who like what you do and/or what you offer them, you have nothing to be afraid of. Naturally, it helps if, in addition, your site is properly optimized and you have done your homework on keyword research and the rest of SEO 101.
If, on the other hand, you created a site solely to manipulate search engines and their spiders, that is a different story. Or if an amateur SEO promised a site owner great results and they are simply not happening, then that SEO has to invent a story to explain why he cannot deliver.
I am not here to judge or point fingers at anybody. I just remember the first half of the '90s, when a lot of people with practically no experience started applying for "webmaster" jobs. There was an explosion of web sites on the Internet then, and companies would hire just about anybody who could put together a page with a visual Netscape or FrontPage editor. Eventually, employers figured out who was who, but the lesson cost them dearly.
Monday, November 2, 2009
There are so many myths
Saturday, October 17, 2009
Some of the major problems
I cannot emphasize enough how important it is to stay in complete control of a web site's growth. Major problems usually begin when the site gets bigger and bigger as time goes by. This is probably when the first usability and optimization issues arise, as the site becomes less and less friendly and harder to navigate.
More than once I have seen evidence of site owners neglecting their Internet property. Even when the main navigation pages look fine, as soon as I dig a little deeper I encounter a large number of broken links, 404 errors, and poorly coded sections. This poor maintenance in turn makes indexing the pages impossible: search engine spiders cannot get through the broken pages, so they leave and the site never gets indexed.
So why do site owners so easily forget this basic rule? We have met plenty who thought it would be fun to maintain the site themselves, yet as the site grew, maintenance turned into a chore that was nowhere near the top of their priority list. Other owners, trying to save money, hired college students or amateurs with limited web development knowledge instead of professional webmasters. There were also cases where, even with a professional webmaster on board, owners felt overwhelmed and practically removed themselves from any serious decisions about their sites. They simply chose to make their webmaster a "scapegoat" for everything they did not do on time.
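The broken links and 404 errors described above are easy to catch with even a crude audit. Below is a minimal sketch of one, using only the Python standard library; the function names and the injected `fetch_status` callback are my own illustration, not any particular tool. The status-fetching step is passed in as a parameter so the logic can be exercised without touching the network (in real use you would pass a function that issues an HTTP request and returns the status code).

```python
# Sketch of a broken-link audit (illustrative names, not a real tool).
# find_links() collects every href on a page; audit() reports the links
# whose fetch does not return HTTP 200. fetch_status is injected so the
# checker can be run against canned data or a real HTTP client.
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def find_links(html):
    collector = LinkCollector()
    collector.feed(html)
    return collector.links


def audit(html, fetch_status):
    # fetch_status(url) -> HTTP status code; anything but 200 is "broken"
    return [url for url in find_links(html) if fetch_status(url) != 200]
```

Run periodically over the main navigation pages, even a script this small would surface the rot long before the spiders give up on the site.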
Labels: development, maintenance, navigation, optimization, usability, webmasters
Monday, September 28, 2009
We learned a very valuable lesson
Let me tell you an interesting SEO story. Several years ago a client subscribed to our SEO services; optimization for the organic listings was all he wanted. We did not have access to his site, but he assured us that he had his own web servers and a couple of very savvy system administrators.
We completed the keyword research and started a strong link building campaign. Our viral marketing strategy seemed fruitful as well. Yet all our efforts had little impact on the client's web site. As the guy was getting pretty restless, we threw in a freebie: remote monitoring of the site's health and periodic check-ups of its web traffic.
I don't really want to go into the small details, but here is what we found. Every week the servers timed out several times. Due to a faulty configuration and some other system-management problems, the web servers would choke easily and could not handle the incoming traffic.
The site simply could not get properly indexed by search engine spiders. All our efforts were nearly wasted because the site's landing pages kept dropping out of the index.
We told the client the truth, and he dealt with the problem by replacing his "savvy" guys with more professional ones. Within a very short time all our positive work started to pay off, as the landing pages were consistently indexed by the spiders.
We learned a very valuable lesson: you can be an SEO genius, but if the uptime of the client's machines is not at least 98 percent, no winning web strategy will work.
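The 98 percent figure is easy to put into practice: record a pass/fail result for each periodic probe of the server and compare the passing fraction against the threshold. A minimal sketch, assuming probes are stored as booleans (the function names and the 98.0 default are my own, taken from the rule of thumb above):

```python
# Back-of-the-envelope uptime check. Each probe records True (server
# answered) or False (timeout), matching the weekly timeouts described
# in the post. Names and the 98% default threshold are illustrative.
def uptime_percent(probes):
    if not probes:
        return 0.0
    return 100.0 * sum(probes) / len(probes)


def healthy_enough(probes, threshold=98.0):
    # The post's rule of thumb: below ~98% uptime, SEO work is wasted.
    return uptime_percent(probes) >= threshold
```

Had the client been tracking even this crude number, the misconfigured servers would have been caught well before months of link building went to waste.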
Sunday, February 8, 2009
The preferred format of the site map
Generally speaking, a site map is a page (or pages) that lists and links to every other document on the web site. Site maps have gained importance as an SEO factor; they point search engine spiders to all of the content pages. Spiders love site maps. On a site with many pages and a deep link structure, spiders would have to work hard to find everything; giving them a single page that maps to all the content makes their job easier and ensures that nothing gets missed.
The preferred site map format for Google's spiders (and more recently for Yahoo's and MSN's as well) is a list written in XML. Other formats are possible, but XML is the most effective one. Google states that an XML site map is highly scalable, so it can accommodate a site of any size.
Nevertheless, it is always a good idea to also have a regular HTML sitemap on the site. A standard sitemap improves the user's experience, especially if your web site is big and has a lot of pages. You don't have to put everything in the sitemap, because you don't want it to look like a link farm. Try to keep the number of links on the page under one hundred and give each link a short introduction, so the page is more content-rich. Overall, apply some common sense and a clean, SEO-friendly design to the sitemap page.
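The XML format in question is the standard sitemaps.org protocol: a `<urlset>` element containing one `<url>` entry per page. Here is a minimal sketch of generating one with the Python standard library; the example URL and date are placeholders, and the helper name is my own.

```python
# Sketch of generating an XML site map in the sitemaps.org format.
# Each entry is (page URL, last-modified date); both values below are
# placeholders for illustration.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(entries):
    ET.register_namespace("", NS)  # emit the namespace without a prefix
    urlset = ET.Element("{%s}urlset" % NS)
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = loc
        ET.SubElement(url, "{%s}lastmod" % NS).text = lastmod
    return ET.tostring(urlset, encoding="unicode")
```

The resulting string can be written to `sitemap.xml` at the site root and submitted to the search engines, which is precisely what makes the XML format scale to sites of any size: the file is generated from the page list, not maintained by hand.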
Labels: content, experience, format, links, map, pages, search engine spiders, site, structure