Friday, December 12, 2008

Concept of keyword proximity

Mostly, it helps with keywords and their proximity. Most leading engines, such as Google, proactively use the concept of keyword proximity as part of their ranking formulas. Keyword proximity is the mirror image of proximity search: it measures how close together the keywords appear in the text. The best positioning is when the keyword combination acts as a single unit, with no other words between the terms. Keyword proximity applies only to keyword phrases that consist of two or more words.

We can use the keyword phrase “Melrose funeral home” as an example. If all three words stand one after the other, this is the best combination for optimization. If “Melrose” is in the first paragraph of the page, “funeral” in the second and “home” in the third, search engine spiders still consider them a match for the phrase, but this placement may not be as successful as the unbroken combination mentioned above.
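
Here is a minimal sketch, in Python, of how such a proximity measure can be computed. The function and scoring scheme are my own illustration, not any engine’s real formula: it finds the smallest window of words covering every term of the phrase, so a window equal to the phrase length means the keywords act as a unit.

    # Illustrative proximity measure; punctuation handling is deliberately naive.
    from itertools import product

    def proximity_window(text, phrase):
        """Smallest window (in words) that covers every term of the phrase."""
        words = text.lower().split()
        terms = phrase.lower().split()
        positions = [[i for i, w in enumerate(words) if w == t] for t in terms]
        if any(not p for p in positions):
            return None  # at least one term is missing from the text
        # brute force: try one occurrence of each term at a time
        return min(max(c) - min(c) + 1 for c in product(*positions))

    # A window equal to the phrase length means the phrase stands as a unit.
    print(proximity_window("Melrose funeral home of Melrose",
                           "Melrose funeral home"))  # prints 3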

Understanding of natural human languages

One cannot be considered a serious SEO specialist without at least some understanding of Natural Language Processing. NLP studies the problems of automated generation and understanding of natural human languages. All search engine spiders and robots use NLP to some degree to improve the processing and analysis of billions of web pages on the Internet. Of course, spiders empowered by artificial intelligence and text analytics capabilities are still growing up. There are many pros and cons of artificial intelligence that I am not going to discuss in this blog entry. I just want to point out that search engine spiders will eventually become more and more sophisticated. And I am talking not about some distant future but about something that may happen within a couple of years.

When a search engine visitor types in some words and expressions and clicks the submit button, the search engine runs a kind of proximity search, looking for Internet documents where two or more separately matching term occurrences fall within a specified distance, where distance is the number of intermediate words or characters. Now, how can this knowledge help search engine optimizers?
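
As a rough sketch of the proximity search itself, the snippet below keeps only the documents where occurrences of all query terms fit within a given number of intermediate words. The simplistic tokenizer and the sample documents are assumptions for illustration.

    from itertools import product

    def within_distance(doc, terms, max_gap):
        """True if some occurrence of every term fits inside max_gap words."""
        words = doc.lower().replace(".", " ").split()
        positions = [[i for i, w in enumerate(words) if w == t.lower()]
                     for t in terms]
        if any(not p for p in positions):
            return False
        return any(max(c) - min(c) - 1 <= max_gap for c in product(*positions))

    docs = ["Melrose funeral home offers full services",
            "Melrose has one funeral parlor and a retirement home"]
    print([d for d in docs
           if within_distance(d, ["Melrose", "funeral", "home"], 1)])
    # only the first document survives with one intermediate word allowed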


Meta tags mostly lost their relevancy

Although Meta tags have mostly lost their relevancy, for the time being they still get spidered by MSN and Yahoo. Some of them still matter a little, for reasons not directly related to search engine rankings. For example, the description Meta tag, where the webmaster usually puts a summary of the web page, helps to improve the user’s experience during searches. The contents of the description appear in the organic search engine listings. The description should not exceed 25-30 words; it must be brief and informative.
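
For instance, a webmaster can pull the description out of a page and check it against that length guideline using nothing but Python’s standard library; the sample page below is made up.

    from html.parser import HTMLParser

    class DescriptionParser(HTMLParser):
        """Collects the content of <meta name="description" ...>."""
        def __init__(self):
            super().__init__()
            self.description = ""

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "meta" and a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

    page = ('<head><meta name="description" content="Melrose funeral home '
            'offering dignified, affordable services since 1952."></head>')
    p = DescriptionParser()
    p.feed(page)
    n = len(p.description.split())
    print(n, "words -", "fine" if n <= 30 else "trim it down")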

In my opinion, the keywords Meta tag gets ignored almost completely by Google, although Yahoo! and MSN still notice it to some degree. Yet even the latter two may start ignoring this tag too, and without notice. There is no harm in keeping this Meta tag on the web page as long as the webmaster puts no more than 10-20 keywords in it. These keywords need to be represented on the web page, though, or this may lower the rankings.

Note ... Apparently, there is still no clarity or consensus about the usefulness of the keywords attribute. I read in Wikipedia that “37 leaders in search engine optimization concluded in April 2007 that the relevance of having your keywords in the meta-attribute keyword is little to none”.

Historically, the refresh Meta element was used to reload a web page after a certain time interval. The refresh attribute of a Meta tag is also one of the ways to redirect visitors from your site to another. When used as a redirect, the refresh attribute is regarded as an unethical SEO practice, and it can hurt your rankings.
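
A simple audit script can flag this pattern before it causes trouble. The regular expression below is a rough heuristic of mine, not a parser-grade solution:

    import re

    META_REFRESH = re.compile(r'<meta[^>]+http-equiv\s*=\s*["\']?refresh',
                              re.IGNORECASE)

    def has_meta_refresh(html):
        """Rough check for <meta http-equiv="refresh" ...> in raw HTML."""
        return bool(META_REFRESH.search(html))

    page = ('<head><meta http-equiv="refresh" '
            'content="0; url=http://other.example/"></head>')
    if has_meta_refresh(page):
        print("Warning: meta refresh redirect found on this page")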

Meta tag and its elements

I still have mixed feelings about Meta tags. As an old-school webmaster, I think that Meta tags can help when you need to provide some structured information about a web page. In the good old times, metadata helped search engines categorize web pages correctly. But the situation changed at the beginning of our century, when web consultants started to proactively manipulate Meta tags in order to achieve higher rankings on search engines.

Meta tag elements (also called attributes) are still supported by a very limited number of search engines. Yet most major search engines, like Google, don’t use them for relevance anymore. Our advice is not to take Meta tags and their attributes too seriously as a way of improving your search engine rankings. This does not mean that we should discard Meta elements completely, because they can still be important for proper web page structure. As an example, we can point to the language attribute of the Meta tag, which gains some importance if the web site uses specific language(s). Search engines may have become more sophisticated, but they still consider this tag.


The recommended standard density

The recommended standard density for body text is 3-7% for each keyword you optimize on a page. Basically, this means that every keyword or key phrase should appear 3-7 times for every 100 words. If keyword density goes over 10%, it looks suspicious to search engine spiders. Each web page should generally be optimized for no more than three keywords or phrases.
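
A back-of-the-envelope density check might look like this in Python; the thresholds simply encode the 3-7% and 10% figures above, and the sample copy is invented.

    def keyword_density(text, keyword):
        """Percentage of words in `text` equal to `keyword` (case-insensitive)."""
        words = [w.strip(".,;:!?").lower() for w in text.split()]
        return 100.0 * words.count(keyword.lower()) / len(words) if words else 0.0

    body = "Melrose funeral home " + "lorem ipsum filler copy " * 4
    d = keyword_density(body, "funeral")
    if d > 10:
        print("Suspiciously high: %.1f%%" % d)
    elif 3 <= d <= 7:
        print("Within the 3-7%% band: %.1f%%" % d)   # this branch fires: 5.3%
    else:
        print("Outside the band: %.1f%%" % d)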

When a keyword is located in HTML headings, for example between <h1></h1> or <h2></h2> tags, it is considered important by the search engine spiders. Yet putting the keyword in the heading tags is not good enough if the web page does not have actual text about that particular keyword.

Keywords located at the beginning of the web page text also matter, although not as much as keywords in the title or headings. The most important thing here is to understand that the beginning of the page does not mean the first paragraph the visitor sees in the browser. So check the source of the page in order to be on the safe side.

Spiders don’t read images; they just skip them, with one exception: they can read the text that describes an image in the ALT attribute of the <img> tag. If there are images on your page, fill in the ALT attributes with some keywords describing those images.
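
A quick audit of missing ALT text can be scripted with the standard library parser; the sample markup below is invented.

    from html.parser import HTMLParser

    class AltChecker(HTMLParser):
        """Collects the src of every <img> that has no ALT text."""
        def __init__(self):
            super().__init__()
            self.missing = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                a = dict(attrs)
                if not a.get("alt"):
                    self.missing.append(a.get("src", "?"))

    checker = AltChecker()
    checker.feed('<img src="chapel.jpg" alt="Melrose funeral home chapel">'
                 '<img src="logo.gif">')
    print("Images with no ALT text:", checker.missing)  # ['logo.gif']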

Tuesday, November 4, 2008

Strategic positioning of keywords

Only several years ago, strategic positioning of keywords on the page and in the source code was the major factor in successful search engine optimization. This situation has changed: in current SEO, remote votes of confidence for your web pages, that is, inbound links from other sites, mean more. I would say that sixty percent of optimization happens off-page and forty percent takes place on the web page.

When I talk about on-page optimization, I naturally need to start with keyword density, sometimes also called keyword weight. Keyword density is a measure of how often a keyword appears in a specific area of the web page, such as the title, a heading, an anchor name or the visible text, against all other words there. Since search engine spiders see a web page as HTML code rather than what visitors view through a browser, you must understand the structure of a typical HTML document.

The title of the page, which a webmaster adds between the <title></title> HTML tags, is considered one of the most prominent places on a web page. The title should not be longer than 8 words and should contain the most important keyword. It is also highly recommended to have keywords in the URL.
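
Those guidelines are easy to turn into a pre-flight check. The function below is a hypothetical helper that simply encodes them literally:

    def check_title_and_url(title, url, keyword):
        """Flags violations of the title/URL guidelines discussed above."""
        problems = []
        if len(title.split()) > 8:
            problems.append("title is longer than 8 words")
        if keyword.lower() not in title.lower():
            problems.append("main keyword missing from the title")
        if keyword.lower().replace(" ", "-") not in url.lower():
            problems.append("keyword missing from the URL")
        return problems or ["looks fine"]

    print(check_title_and_url("Melrose Funeral Home | Dignified Services",
                              "http://example.com/melrose-funeral-home/",
                              "Melrose funeral home"))  # ['looks fine']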


We check these forms and see problems

Another important thing to watch for in the forms on Contact, Services and other pages is the general look of the form and the logical flow of its text fields. Let me clarify this. We often hear clients complain that although their forms work fine, they can’t get a single lead out of them. During web usability analysis we check these forms and find problems related to the user’s experience.

For example, a dozen text fields that the visitor is supposed to fill in, or requests for somewhat private information that an Internet visitor would not want to reveal. Web site owners tend to forget that forms should not be huge. They should contain a delicate request that will, hopefully, get the visitor’s permission to open a line of communication and start a dialog. When the visitor completes his browsing and wants to contact the site owners for whatever reason, he should be guided tactfully to a contact form that does not try to collect all-encompassing information, but asks for general data and maybe offers a subscription to the site’s newsletter or a free electronic book or catalog to download.

Tactful dealings with the visitor will increase the conversion rates of the web site. The opposite is also true: if the visitor considers the forms too prying, he or she will be seriously turned off and can even spread negativity about your site, referencing this experience to other potential clients, on blogs and in forums.

Look with a fresh eye

After completing the web usability tasks described in the previous entry, start reviewing all presentation pages of the first level. Usually, when the information architecture of the web site has been built correctly, these pages are called Home, About Us, Contact Us, Services, Products, etc.

Look with a fresh eye at the layout of these pages and figure out whether they are too cluttered. Sometimes, even when the navigation is fine, web pages of the first level can be one big mess, and it is hard to figure out the usefulness of the various sections. Also search for obsolete pieces of information - you will find plenty of them, because owners always have problems updating textual material and images in an orderly manner. There will be all kinds of problems with the copy, and sometimes many “read more” links, which will hurt your next stage, the search engine optimization analysis.

Carefully go through all web forms and other means of interactive communication with potential clients on the first level. Web forms are very often prone to all kinds of errors, and if they don’t deliver information properly to the owner, this may be a big issue. We usually find a gazillion problems with the JavaScript validation of the forms or with the MySQL database connection behind the page.


Web site navigation, content and design

It is important to start with a general review of overall web site navigation, content and design. At this stage we encounter many recurring problems. Usually, any medium-sized web site with a decent number of pages has its own share of broken internal links, wrong redirects, unexpected jumps to folders located on the fifth level, and other such issues. There will also be either extremely heavy web pages that take forever to load on a visitor’s computer, or thin ones that don’t contain any textual material at all.
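
A first pass over broken internal links does not require fancy tooling. Here is a minimal sketch, using only the Python standard library, that checks the anchors on a single page (example.com stands in for a real site):

    from html.parser import HTMLParser
    from urllib.request import urlopen
    from urllib.error import URLError
    from urllib.parse import urljoin

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

    def report_broken_links(base_url):
        collector = LinkCollector()
        collector.feed(urlopen(base_url).read().decode("utf-8", "replace"))
        for link in collector.links:
            full = urljoin(base_url, link)
            if not full.startswith(base_url):
                continue  # only internal links interest us here
            try:
                urlopen(full, timeout=10)
            except URLError as err:  # HTTPError is a subclass of URLError
                print("Broken:", full, "-", err)

    report_broken_links("http://example.com/")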

So many times I encounter small business owners who have committed web site “suicide”: their site basically consists of one page with an embedded Flash application. This is when the client gets really upset, realizing that he wasted his money on this alleged web development. He gets especially nervous when we explain that search engine spiders don’t read Flash and his site needs to be redesigned completely (if not rebuilt from scratch).

Very positive sign

It seems that as time goes by, many web site owners become more savvy about Internet issues. More and more clients, especially those who invested heavily in the development of their web site, ask us to evaluate their creation and provide them with a detailed web usability report. I consider this a very positive sign and a big advantage for us as a web analytics company. It is much easier to deal with a decent web site that was evaluated and reworked according to our specs than to optimize something that does not create a great user experience and is hostile to search engines.

Recently I read several interesting articles on approaches to web usability evaluation of sites. I don’t agree with every single view on this matter, but I would still like to point out the steps that are a must when a client hires your company to conduct this type of work.


Monday, October 27, 2008

We don’t believe this is white hat stuff

Of course, there is another way to try to improve your mass media campaign. We are talking about the purchase of links, which we totally oppose. First of all, we don’t believe this is white hat stuff. Second, many companies that were selling links went belly up, because they were getting the links by adding meaningless comments to all those media sources we were talking about above. So, when these media sources added the “nofollow” attribute to the links, or simply deleted them, who do you think turned out to be the biggest loser? The customers, of course. We are not here to judge anybody; after all, everybody tries to earn a piece of bread with butter. But this still looks like a risky and unstable way to earn visibility, with no guarantees. Through our web analytics research we observed small businesses that lost thousands of those kinds of links overnight.

And last but not least: watch the changes that are going on with search engines right now. We are not saying that you guys need to sit and try to figure out search engine algorithms. But the recent offers to Yahoo from Microsoft and the re-positioning of ASK.com make us wonder. Right now there are sites out there that already got link recognition from Google and Yahoo but just don’t get noticed by MSN, or the other way around. Those who have found themselves in this situation know what we are talking about.

This is what our web analysts observe

We would not suggest that small business guys get involved with PR companies. Honestly, there is nothing here that you or your courageous employees can’t do yourselves, without spending thousands of dollars on outsourced PR. Just surf the net and pay 10 bucks here and 20 bucks there for participation in PR feeds. Personally, we think that this trend is on its way out and is not as powerful as it used to be. At least, this is what our web analysts observe. Let’s wait and see.

It seems that the old search engine preference for all those .edu, .gov and big non-profit .org sites is still out there. If the main theme of your site is somehow relevant to these sites, then you are in luck. All you have to do is convince them to add your link to their sites. These powerful sites can drastically change your marketing campaign for the better.

Google Blog Search and other tools that spring up like mushrooms after the rain can help you find influential blogs and bloggers. By using link baiting principles you can start some sort of discussion on these blogs, or find some controversy (but be extremely careful here). The popular blogs never have stale content; they are extremely dynamic and thus can bring hundreds of visitors to your site and/or add a lot of relevant links.


It is closer to the real world

There are a number of ways to get around massive spending on your media campaign. Big, rich corporations usually have a huge budget for all kinds of marketing campaigns, so let them be!

For small business guys, it is closer to the real world. For 2008, link building opportunities remain mostly the same, with some exceptions. Here are some observations from our web analytics company.

It is still good to participate in various social networks. Unfortunately, some of them followed Google guidelines and tagged every link with the “nofollow” attribute, which is not gonna help you. Most social networks also jealously watch for any type of “commercial” text or other such data in blog entries. So beware, small business guy: if they catch you, they will throw you out of the network without any mercy. Naturally, they want to keep each and every subscriber for themselves and sell him all their stuff, without anybody standing in their way.
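
You can at least see what you are getting: the snippet below lists the links on a page that carry the nofollow attribute and therefore pass no authority. The sample markup is invented.

    from html.parser import HTMLParser

    class NofollowChecker(HTMLParser):
        """Collects hrefs of anchors whose rel attribute contains nofollow."""
        def __init__(self):
            super().__init__()
            self.nofollowed = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                a = dict(attrs)
                if "nofollow" in a.get("rel", "").lower():
                    self.nofollowed.append(a.get("href"))

    checker = NofollowChecker()
    checker.feed('<a href="http://yoursite.example" rel="nofollow">me</a>'
                 '<a href="http://elsewhere.example">other</a>')
    print("Links that pass no authority:", checker.nofollowed)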

It is still a good idea to find your way into prestigious sites like the Washington Post and the like. The drawback is that you’ve got to be a really talented journalist with a highly interesting topic to offer them. As for attempts to add your comments to high-value media sources, well, it seems that editors watch people who add comments like hawks and clean everything out on a regular basis.

Good link-baiting applications are still OK and might help your viral marketing campaign. If your apps are really interesting, people can post links to them on their sites, and with each click you will become more and more visible on the Internet. You will still need to constantly run your web analytics tests to see whether this strategy is effective and brings you the desired Internet traffic.


Thursday, October 23, 2008

Unreliable Time on Site

So don’t be surprised if a visitor went through some 8 web pages on the site before leaving, and Google Analytics shows the time spent on the site as only two minutes. This is as much as we can get from this metric nowadays. We won’t know if the visitor walked away from the browser without shutting it down. We won’t know if the visitor left the site by simply typing a new URL into the address field. The client may in reality have spent much more time than the average time on site calculated by GA.

But what if the web site in question is a blog? Then it is tough luck for you, mostly because everything happens on the same page of the blog, and your visitor does not go deep through several web pages. It may take him a lot of time to write a comment or to read the latest blog entries located on the same first-level page. So the average Time on Site metric for blogs is usually extremely unreliable.

Time on Site metric

All business owners and CEOs constantly worry about the metric called Time on Site. They just love this stat, and they demand that the web analyst establish precisely how long visitors hang around their web sites. Just several years ago, old software would give web analysts peculiar and weird results, but as time went by, the technologies for calculating this metric definitely improved.

On the surface, the Time on Site metric itself is easy to understand, as are the number of visits and the number of unique visitors on the site. Maybe that’s why business owners like it so much. Yet there are still certain drawbacks to this metric that I am aware of.

Usually, web analytics tools like Google Analytics log each request of the Internet visitor who navigates to the web site. They apply a timestamp to each request for the entire time the prospect spends on the web pages, until he exits the site. At the end of the visit, the analytics tool computes the difference between the timestamps on the pages.
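
A toy reconstruction of that computation shows exactly where the metric breaks down; the timestamps below are invented.

    from datetime import datetime

    # Page-view timestamps logged for one visit (made-up values).
    pageviews = [datetime(2008, 10, 23, 14, 0, 0),   # landing page
                 datetime(2008, 10, 23, 14, 1, 0),   # second page
                 datetime(2008, 10, 23, 14, 2, 0)]   # last page

    # Time on site is the last timestamp minus the first. Whatever the
    # visitor does on the final page is invisible: no later request ever
    # arrives to close that interval.
    print(pageviews[-1] - pageviews[0])  # 0:02:00, however long the visit really was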


Customers need to trust site

In order to reach their target audience, website owners have to concentrate their efforts on the site’s content, copy and design. Naturally, not by building an excessive number of Flash applets and freaky animations. A website has to find out exactly what visitors want and respond to their needs. For example, if an Internet searcher clicks on a sponsored link, he should land at exactly the right place on the site, not on the home page. The majority of visitors simply will not work their way from the home page to the relevant page, and the attempt at conversion will fail.

All the factors mentioned above are impossible without trust, due to the simple fact that a website will not sell anything if it cannot gain trust. Customers need to trust a site enough to part with their money. So it certainly helps if a website is equipped with a privacy policy, solid SSL encryption, articles and tutorials, and true testimonials from satisfied and happy customers who have already purchased from the website. Conversion rates will also increase if the website’s shipping procedure does not cause headaches for the clients, and prospects can easily find contact information and get decent support in case of emergency.

So the question arises: how do you verify that this whole strategy is working? Or, in other words, how do you consistently improve the website’s conversion? You guessed it right. By measuring, testing and experimenting ...
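
As a trivial illustration of what “measuring” means here, comparing the conversion rates of two page variants takes only a few lines. The counts are invented, and a real experiment would also need a statistical significance test.

    def conversion_rate(conversions, visits):
        """Conversion rate in percent; zero visits yields zero."""
        return 100.0 * conversions / visits if visits else 0.0

    variants = {"A (long contact form)": (18, 1200),
                "B (short contact form)": (41, 1180)}
    for name, (conversions, visits) in variants.items():
        print("%s: %.2f%%" % (name, conversion_rate(conversions, visits)))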

One of the ways to improve conversion rates

When a website persuades the visitor to take the intended action, like submitting a form, subscribing to a newsletter or services, downloading your materials or buying some product, a full-blown conversion occurs. Most websites are looking for prospects who will eventually contact them over the phone, by e-mail or through some online form. The visitors hold all the cards. Why? First of all, it is due to the fact that a website needs to get their permission to obtain information and must respect their privacy.

If a website does not provide visitors with some sort of explanation of why they should continue the dialog, the attempt to achieve conversion will fail. A website must have a very convincing reason for visitors to grant it permission to communicate with them. The greater the percentage of people who continue the dialog with the website, the higher its conversion rates will eventually climb.

One of the ways to improve conversion rates is to build the prominence of the website. It all starts with the domain name and continues throughout the website’s entire short-term tactics and long-term strategy. Believe it or not, Internet visitors notice that certain websites stand out from their competition. They do notice useful sites, because so many of them tirelessly browse the Web in search of information. If the website manages to educate potential customers about its offer and even give them back something useful, Internet searchers will choose that website over the rest of the pack.


People often ask our advice

People often ask our advice on how to improve website conversion rates. Unfortunately, there is no short answer to this question. If we try to sum it all up, we would say that measuring, testing and experimenting are the primary keys to improving online conversion rates. Then we would probably have to rethink this answer and add website prominence, relevance to potential customers, target marketing and trust to the list.

Over the years, Internet marketing has been rapidly changing from a mixture of voodoo and educated guesses into a science. Web analysts have at their disposal more or less impressive measurement tools and tracking software. They know how to properly track visitors’ behavior with these tools. They constantly analyze and decode the results of the measurements and understand why things happen. Without tracking software, a web analyst may not realize how effective the website is, or the reasons why certain web pages have high bounce rates while others don’t.

When a web analyst measures the conversion of any website, he basically divides this process in two: there are certain small conversions that should be tested and measured repeatedly, and there are full-blown conversions. Small conversions are steps that website visitors take on the road to a full-blown conversion. They guide the visitor to the final website goal and can be anything from a simple call to action, like “click here”, to something more valuable, like “find out how to download our free eBook”.
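
Small conversions are natural to track as a funnel. The steps and counts below are hypothetical, just to show the arithmetic:

    funnel = [("visited the landing page", 1000),
              ("clicked the call to action", 220),
              ("filled in the short form", 90),
              ("downloaded the free eBook (full-blown conversion)", 75)]

    previous = None
    for step, count in funnel:
        if previous:
            print("%-50s %5d  (%.0f%% of previous step)"
                  % (step, count, 100.0 * count / previous))
        else:
            print("%-50s %5d" % (step, count))
        previous = count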
