Thursday, December 9, 2010

An interesting issue for discussion

My friend, who works as a web analyst, called me today and told me a story about Google Analytics. It turns out that from April 30 to May 5 of this year, Google Analytics experienced processing errors and lost a lot of valuable data. Naturally, as all corporations do when they try to save face, Google claimed that almost all of the data (but not all of it) would be saved and “re-processed” by May 14.

Well, to put it mildly, my friend’s valuable data has not been recovered as of today. In spite of the great reports he presented to his clients previously, his GA shows a 100% bounce rate for three days in a row, although the numbers of visits and unique visitors were preserved and “re-processed”. Naturally, as a web analyst, my friend is frustrated, because he worked really hard to get good ratings and traffic. And now he will get a lot of questions from his clients and feel bad for them.

This unfortunate event raises an interesting issue for discussion. It also shows that web analytics is still growing up. Personally, I still think that Google Analytics is a great web analytics tool and, mind you, it is free, while other tools on the market are not. We will never find out the whole truth about what happened, but it was definitely something out of the ordinary, because I don’t remember any other time when Google Analytics let us down.

Tuesday, November 9, 2010

Even if you pay, this will not guarantee

There are a couple of places that are notoriously slow and problematic when it comes to submission. I would mention DMOZ, where submission is free but can take forever because your site needs to be reviewed by human editors. And, of course, Yahoo, if you try to register a site without paying their annual $299 submission fee. Yet even if you pay, this will not guarantee any great rankings or boost in traffic. So you might as well go and spend this money on yourself.

Anyways, there are plenty of sites that never paid for submission and never got registered at DMOZ, and, trust me, they are successful and doing fine.

If I still have not convinced anybody, and you still have an urge to start submitting and resubmitting your web site, please don’t do it on a weekly or monthly basis. This will not help your web pages. If your site got into the search engines’ databases, it won’t be thrown out even within a year. There is no reason on earth for search engines to delete hard-earned collected information on a whim.

Wednesday, August 4, 2010

Myth of search engine submission

There is a big myth in SEO that started a while ago but still persists today. This myth states that if you submit your site to search engines all over the world, it will boost your web site’s rankings and bring a lot of traffic. Some of you knowledgeable guys may shrug your shoulders. But you would not believe how deeply this myth sits in the minds of inexperienced web site owners.

Why does this myth still exist? Any web site owner receives a lot of useless spam daily from so-called SEO companies that perpetuate the myth of search engine submission. Owners usually don’t respond to that kind of junk email, but they do seem to read the spam letters. Thus, this myth is not dying yet.

In fact, nothing could be further from the truth. There is no reason to submit your site all over the world, and no positive effect has ever been witnessed in web analytics. All you are going to do is freely provide your business information to some unknown “search engine”, let’s say, in Poland, which, in turn, will very likely make you a great candidate for receiving more spam. Especially if you provided your e-mail address during submission.

I can count on my fingers the web directories and search engines that are worth submitting your web site to. There is one undeniable truth, though. If your site is any good and it is properly optimized, I guarantee that your web pages will be indexed by all major search engines. A couple of years ago I observed some problems with MSN indexing, but only for a couple of sites. Even after submission, there was a certain delay before the MSN spiders indexed the site. After adding a couple of lines to robots.txt, the situation improved. But that is another story about search engine spiders, which I will tell you in another blog entry someday. So, read on.

Thursday, July 8, 2010

Going after long tail keywords

If the web site owner has a large web site, a web analyst may count tens of thousands of these long tail keywords. He will have to weed out thousands of really weird keywords that don’t mean anything. His next step will probably be to exclude keywords with a 100 percent bounce rate.

Yet this is not all. He will have to process the remaining thousands of long tail keywords very carefully and compare them to the keywords already used in all PPC advertising campaigns. I think it may help if a web analyst not only knows how to use Excel, databases and word processing, but has a solid technical background as well. He will definitely have to be savvy with regular expressions, awk, sed and, probably, Perl. But even with this technical background, I can’t imagine how much time it would take to come up with some decent and proactive long tail keywords.
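
The weeding-out steps above can be sketched in a few lines of Python. This is just a minimal illustration: the report rows and the PPC keyword list are made up for the example, and a real report would, of course, have thousands of rows.

```python
import re

# Invented keyword report rows: (keyword, visits, bounce rate in percent).
report = [
    ("buy red running shoes size 11", 14, 40.0),
    ("asdkj qwe zzz", 2, 100.0),                       # gibberish, 100% bounce
    ("running shoes", 900, 35.0),                      # head term, already in PPC
    ("cheap trail running shoes for women", 8, 55.0),
    ("shoes!!!! free $$$", 3, 100.0),                  # junk characters
]

# Keywords already bid on in PPC campaigns (assumed for this sketch).
ppc_keywords = {"running shoes"}

# Keep only keywords made of plain words, drop keywords with a 100 percent
# bounce rate, and drop anything the PPC campaigns already cover.
word_re = re.compile(r"^[a-z0-9 ]+$")
candidates = [
    (kw, visits, bounce)
    for kw, visits, bounce in report
    if word_re.match(kw) and bounce < 100.0 and kw not in ppc_keywords
]

for kw, visits, bounce in candidates:
    print(f"{kw}: {visits} visits, {bounce}% bounce")
```

Even this toy version shows why regular expressions come up so often: the first filter alone will not catch plain-word gibberish, so you still need the bounce-rate and PPC filters behind it.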

Is it worth it? I would say that if the web site is of small or medium size, why not? But if you have one of those humongous web sites, going after long tail keywords might turn out to be a huge waste of time.

Friday, June 11, 2010

About long tail keywords

I described general keyword research in my previous blog entry and showed that the results can sometimes be pretty ineffective. It is getting harder to get into Google’s Top 10 when your keywords face competition from billions of other web pages that are already using them.

I think that is why a lot of SEO companies, united in one chorus, started talking about long tail keywords. After all, how long can one manipulate organic listings when everybody wants to be in the Top 10? Besides, now that Yahoo and MSN searches combined are way behind Google searches, it is natural to assume that web site owners want their keywords on the first page of Google’s organic listings. Most of them would not care for their keywords to pop up on Ask.com or Dogpile or a pile of other search engines that nobody has even heard of.

My problem with long tail keywords is related mostly to the combination of SEO and web analytics. We all know that the long tail keywords people search for don’t account for a great share of searches. My question is: can they really provide significant traffic?

Thursday, May 13, 2010

Extensive wide keyword research

If a web site owner tries to shop around on the Internet for a decent SEO contractor or a web analytics company, he will have a hard time finding out who is who. He will have to do a lot of unnecessary reading about web searches that will not really tell him much. There are a gazillion SEO sites on the Internet, and each of them boasts about its ability to easily manipulate the rankings and move your keywords into the top ten on leading search engines. Each will claim that the owner’s web site will become visible, but none of them will guarantee an increase in your profits.

Putting all moral issues aside, I must say that any web site owner will need a damn big magnifying glass. What if he sells Adidas or Nike shoes and needs to move keywords like “nike shoes” or “adidas sneakers” into Google’s top 10? How long, do you think, will it take to move these keywords to the first page of Google? There will be offers to move synonyms instead of these direct keywords, but it is damn hard to find them and confirm that users really search for these synonyms, and that they are as popular as the keywords mentioned above.

Well, naturally, there have to be keyword research tools, and there are plenty of them. Some are free and, as you may understand, pretty useless. Others, like Wordtracker and Keyword Discovery, are more productive, but they are damn expensive. An experienced SEO does not really need five hundred searches including misspellings and dubious stuff. Keyword research tools don’t really provide what you need: their results only give you food for thought. One should not rely upon them without analysis, particularly since some of their results tend to be pretty inconsistent. Only extensive, wide keyword research can bring the results an SEO is seeking.

Thursday, April 1, 2010

The beginning of our conversation about Bounce Rate

Over the years we have often encountered, and still encounter, web site owners who have no idea what Bounce Rate really means. For some unknown reason, many still think that a web site’s Bounce Rate is in fact nothing other than its Exit Rate. Well, close, but no cigar!

We find Bounce Rate quite an intriguing part of web analytics. It helps a lot when we conduct deep analysis of visitors’ behavioral patterns. Bounce Rate became more important when web analysts realized that they cannot really base their assumptions on the stats related to Top Exit pages. In fact, if you are using the Google Analytics tool or Urchin, you will find that Top Exit pages are not part of the default Dashboard.

Mind you, this is just the beginning of our conversation about Bounce Rate, without diving deep into details. After all, what does this stat tell you? Speaking in layman’s terms, it means that the visitor got stuck on one page without moving to other pages. That is when the so-called bounce occurs. This information would probably be of the greatest value to anybody dealing with web pages, except for one fact ... There is no industry-standard minimum or maximum time by which a visitor must leave in order for a bounce to occur. As long as there is no consensus on that stat, leading web analysts put forward all kinds of interesting and boring theories, trying to figure out the standard.
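
For illustration, here is a minimal Python sketch of the most common (but, as I said, non-standardized) working definition: a bounce is a session that viewed exactly one page, regardless of time spent. The pageview log is invented for the example.

```python
from collections import defaultdict

# Invented pageview log: (session_id, page) pairs.
pageviews = [
    ("s1", "/home"),
    ("s2", "/home"), ("s2", "/pricing"),
    ("s3", "/blog"),
    ("s4", "/home"), ("s4", "/blog"), ("s4", "/contact"),
]

# Count how many pages each session viewed.
pages_per_session = defaultdict(int)
for session_id, _page in pageviews:
    pages_per_session[session_id] += 1

# A bounce: the visitor got stuck on one page and left.
bounces = sum(1 for n in pages_per_session.values() if n == 1)
bounce_rate = bounces / len(pages_per_session)
print(f"Bounce rate: {bounce_rate:.0%}")  # prints "Bounce rate: 50%"
```

Note how much the result depends on the definition: add a time threshold (say, count a 10-minute single-page visit as engaged, not bounced) and the same log produces a different number, which is exactly why the lack of a standard matters.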

In our future blog entries we will introduce you to our web analytics approach to Bounce Rate. I plan to go through a comparative analysis of why there is so much difference between getting stuck on a page and leaving a web page. We will also explain the relationship between Bounce Rate and web searches. So, read on.

Wednesday, March 3, 2010

Web sites with dynamic content get great rankings

We have heard many similar stories from other people, including SEO specialists who were just starting out and desperately needed our advice on this issue. I don’t wanna waste your time and go into details about this gray area. Here is my answer, in brief.

If you have great content that does not duplicate articles and postings on other web sites, this can be a great boost for your web pages’ rankings. There is nothing wrong with frequent changes to your content. In fact, search engine spiders favor frequent changes to your web pages because they are not fond of stale web sites.

So, if you constantly update the content of your web site, it is actually a very positive thing. However, if you make a small update to your web site, leave it be for a while and then make another small update, this can lower your rankings.

I am not going to discuss whether this is right or wrong. I just want to point this out as a reason why web sites with dynamic content prosper and get great rankings, while static pages with a good cause to exist get zero rank on the Web.

Friday, February 5, 2010

Another interesting phenomenon

There is another interesting phenomenon that we have been observing for a while - content updates. Changes to your content can improve or worsen your rankings, even though you did not do anything bad on your web site.

Our web analysts encountered this phenomenon several years ago, when one of our clients told us a story that sounded hardly believable. He had a decent web site which had reached PageRank 3 with Google. One day he needed to introduce small updates to this web site because of some positive changes in his business. Dutifully, he opened his WYSIWYG editor and changed a couple of lines on each of his web pages. Within a week or so, when he opened his site in the browser, to his horror he found out that all those PR 3 web pages had become unranked!

He really did not know what to do. The only solution that came to his mind was to open his editor again and delete all the changes he had introduced on the web pages. A month or so passed, but the situation did not change. In those days, having a nice Google PR was considered an asset. That is why our future client felt as if he had been mugged in broad daylight in a public square.

Wednesday, January 13, 2010

The answer is simple

As I already mentioned in my previous blog entries, search engine spiders are not perfect. Like Agent Smith from the movie The Matrix, they try to index the whole Internet environment, but as of now, they cannot handle the job. That is why even the large leading search engines cover only a small portion of the Web. I read somewhere that this portion amounts to only sixteen percent.

So when web crawlers visit your site, they will, as a rule, not go deeper than the fifth level. This makes the location of files on your web site very important for proper optimization. Files located in the root directory, on the first level, will usually get better rankings. Deeply buried files will have to wait longer.
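
A quick way to audit this is to count path segments for each URL on your site. A small sketch (the URLs are invented for the example, and the five-level limit is the rule of thumb mentioned above, not a published crawler spec):

```python
from urllib.parse import urlparse

# Invented example URLs from one site.
urls = [
    "https://example.com/index.html",
    "https://example.com/products/shoes/running/mens/size11/page.html",
]

def depth(url: str) -> int:
    # Depth = number of non-empty path segments below the root.
    path = urlparse(url).path
    return len([seg for seg in path.split("/") if seg])

MAX_CRAWL_DEPTH = 5  # rule-of-thumb limit, not an official constant

for url in urls:
    d = depth(url)
    flag = "TOO DEEP" if d > MAX_CRAWL_DEPTH else "ok"
    print(f"{d}  {flag}  {url}")
```

Run over a full sitemap, this flags the deeply buried pages that, per the rule above, spiders may never reach.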

You probably observed the same thing I did. Sometimes, pretty dumb and ugly web sites with boring and not properly optimized content have a pretty decent PageRank. The owner created the site in the mid-nineties and has not really paid much attention to it since. So what creates this paradox?

The answer is simple. Somehow, leading search engines consider old sites more trustworthy. (This just proves again my point about the current imperfection of web crawlers and spiders.) This naive trust is based on the age of the site alone: search engines believe that an established website will not vanish unexpectedly one day. Naturally, the approach to new web sites is much more cautious, which possibly gave birth to all these legends and myths about the mysterious Sandbox.

This still does not mean that you should pay for your domain five years in advance. There is nothing wrong with renewing annually, because in the end this is not the decisive factor in the maturity of a web site.