Friday, December 12, 2008
Concept of keyword proximity
Take the keyword phrase “Melrose funeral home” as an example. If all three words appear one after the other, this is the best combination for optimization. If “Melrose” is in the first paragraph of the page, “funeral” in the second and “home” in the third, search engine spiders may still treat them as a keyword phrase, but this is unlikely to work as well as the adjacent combination mentioned above.
Understanding of natural human languages
When a search engine visitor types in some words or expressions and clicks the submit button, the search engine performs a proximity search: it looks for documents in which two or more separately matching terms occur within a specified distance, where distance is measured as the number of intermediate words or characters. Now, how can this knowledge help search engine optimizers?
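To make the proximity idea concrete, here is a minimal Python sketch of such a check, with distance counted in intervening words. This is an illustration, not any real engine's algorithm; the function name and the brute-force approach are my own simplifying assumptions.

```python
# Minimal sketch of a proximity check: are all query terms present in the
# text, with at most `max_gap` words between the outermost occurrences?
# Not a real search engine's algorithm; names here are illustrative only.
from itertools import product

def within_proximity(text, terms, max_gap):
    words = text.lower().split()
    positions = {t: [i for i, w in enumerate(words) if w == t.lower()]
                 for t in terms}
    if any(not p for p in positions.values()):
        return False  # at least one term is missing entirely
    # Brute force over one occurrence of each term (fine for short pages).
    for combo in product(*positions.values()):
        span = max(combo) - min(combo)
        # span - (len(terms) - 1) words sit between the matched terms
        if span - (len(terms) - 1) <= max_gap:
            return True
    return False

page = "Melrose funeral home services in the Melrose area"
print(within_proximity(page, ["melrose", "funeral", "home"], 0))  # → True
```

With `max_gap=0` only the adjacent combination from the example above passes; raising the gap models the looser "one word per paragraph" case, which the post notes is weaker.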
Read more ...
Meta tags mostly lost their relevancy
In my opinion, the keywords attribute of the Meta tag is ignored almost completely by Google, although Yahoo! and MSN still give it some weight. Yet even these two may start ignoring the tag as well, and without notice. There is no harm in keeping this Meta tag on a web page as long as the webmaster adds no more than 10–20 keywords. Those keywords need to be represented in the page copy, though, or the tag may actually lower the rankings.
Note: apparently, there is still no clarity or consensus about the usefulness of the keywords attribute. Wikipedia reports that “37 leaders in search engine optimization concluded in April 2007 that the relevance of having your keywords in the meta-attribute keyword is little to none”.
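The advice above (keep the meta keywords, but make sure each one is actually represented on the page) can be audited automatically. Here is a hedged sketch using only Python's standard library; the class name and the sample HTML are invented for illustration.

```python
# Sketch: verify that every phrase in a page's meta "keywords" attribute
# also appears in the visible text, as the post advises. Standard library
# only; the sample HTML below is made up.
from html.parser import HTMLParser

class KeywordAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords = []
        self.body_text = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "keywords":
            self.keywords = [k.strip().lower()
                             for k in (a.get("content") or "").split(",")]

    def handle_data(self, data):
        self.body_text.append(data.lower())

def unrepresented(html):
    """Return the meta keywords that never occur in the page text."""
    p = KeywordAudit()
    p.feed(html)
    text = " ".join(p.body_text)
    return [k for k in p.keywords if k and k not in text]

html = """<html><head>
<meta name="keywords" content="funeral home, melrose, obituaries">
<title>Melrose Funeral Home</title></head>
<body><p>Our Melrose funeral home serves local families.</p></body></html>"""
print(unrepresented(html))  # → ['obituaries']
```

Any keyword the audit flags is a candidate for removal, or for supporting copy on the page.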
Historically, the refresh Meta element was used to reload a web page after a certain time interval. The refresh attribute is also one way to redirect visitors from your site to another. When used that way for a long time, the refresh attribute is regarded as an unethical SEO practice, and it can hurt your rankings.
Meta tag and its elements
Meta tag elements (also called attributes) are still supported by a very limited number of search engines, and most major ones, like Google, no longer use them for relevance. Our advice is not to take Meta tags and their attributes too seriously as a way of improving your search engine rankings. That does not mean we should discard Meta elements completely, because they can still matter for proper web page structure. As an example, the language attribute of the Meta tag gains some importance if the web site uses specific language(s). Search engines may have become more sophisticated, but they still consider this tag.
Read more ...
The recommended standard density
When a keyword is located in HTML headings, for example between <h1></h1> or <h2></h2> tags, search engine spiders consider it important. Yet putting a keyword in the heading tags is not good enough if the web page has no actual text about that particular keyword.
Keywords located at the beginning of the web page text also matter, although not as much as keywords in the title or headings. The most important thing here is to understand that the beginning of the page is not necessarily the visually rendered first paragraph; it is the first text in the HTML source. So, check the source of the page to be on the safe side.
Spiders don’t read images; they skip them, with one exception: they can read the text that describes an image in the ALT attribute of the <img> tag. If there are images on your page, fill in their ALT attributes with a few keywords that describe each image.
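The three on-page signals above (headings, early text, ALT attributes) can be pulled out of a page's source with a few lines of standard-library Python. A rough sketch, with an invented class name and sample markup:

```python
# Illustrative sketch: collect the text a spider can weigh from <h1>/<h2>
# headings and from image ALT attributes. Standard library only.
from html.parser import HTMLParser

class OnPageScan(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_heading = False
        self.headings = []   # text found inside <h1>/<h2>
        self.alts = []       # ALT text found on <img> tags

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2"):
            self.in_heading = True
        elif tag == "img":
            alt = dict(attrs).get("alt") or ""
            if alt:
                self.alts.append(alt)

    def handle_endtag(self, tag):
        if tag in ("h1", "h2"):
            self.in_heading = False

    def handle_data(self, data):
        if self.in_heading:
            self.headings.append(data.strip())

page = '<h1>Melrose funeral home</h1><img src="chapel.jpg" alt="funeral chapel">'
s = OnPageScan()
s.feed(page)
print(s.headings, s.alts)  # → ['Melrose funeral home'] ['funeral chapel']
```

Checking whether your target keyword appears in both lists is then a simple membership test.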
Tuesday, November 4, 2008
Strategic positioning of keywords
When I talk about on-page optimization, I naturally need to start with keyword density, sometimes also called keyword weight. Keyword density is a measure of how often a keyword appears in a specific area of the web page (title, heading, anchor name, visible text, etc.) relative to all other words there. Since search engine spiders see the web page as HTML code rather than what visitors view through a browser, you must understand the structure of a typical HTML document.
The title of the page that a webmaster adds between the <title> and </title> HTML tags
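The density measure described above can be sketched in a few lines of Python. This is a simplified illustration of the idea, not any tool's actual formula: it handles a single-word keyword and splits on whitespace only.

```python
# Rough sketch of keyword density: occurrences of a (single-word) keyword
# divided by the total word count of a page area. Not any real tool's
# exact formula; whitespace splitting is a simplifying assumption.
def keyword_density(text, keyword):
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

body = "funeral home in Melrose offering funeral services"
print(round(keyword_density(body, "funeral"), 3))  # → 0.286  (2 of 7 words)
```

The same function can be applied separately to the title, a heading, or the visible text to compare densities across page areas.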
Read more ...
We check these forms and see problems
For example, a dozen text fields that the visitor is supposed to fill in, or requests for private information that an Internet visitor would not want to reveal. Web site owners tend to forget that forms should not be huge. A form should contain a delicate request that, ideally, wins the visitor’s permission to open a line of communication and start a dialog. When the visitor finishes browsing the site and wants to contact the owners for whatever reason, he should be guided tactfully to a contact form that does not try to collect all-encompassing information but asks for general data and maybe offers a subscription to the site’s newsletter or a free electronic book or catalog to download.
Tactful dealings with the visitor will increase the conversion rates of the web site. The opposite is also true: if the visitor finds the forms too prying, he or she will experience a major turn-off and may even spread negativity about your site, recounting the experience to other potential clients, on blogs and in forums.
Look with a fresh eye
Look with a fresh eye at the layout of these pages and decide whether they are too cluttered. Sometimes, even when the navigation is fine, the first-level web pages are one big mess and it is hard to figure out the usefulness of the various sections. Also search for obsolete pieces of information; you will find plenty, because owners always have trouble updating textual material and images in an orderly manner. There will be all kinds of problems with the copy, and sometimes many “read more” links, which will hurt your next-stage search engine optimization analysis.
Carefully go through all web forms and other means of interactive communication with potential clients on the first level. Web forms are very often prone to all kinds of errors, and if they don’t deliver information properly to the owner, this can be a big issue. We usually find a gazillion problems with JavaScript validation of the forms or with the MySQL database connection on the page.
Read more ...
Web site navigation, content and design
It is important to start with a general review of overall web site navigation, content and design. At this stage we encounter many recurring problems. Usually any medium-size web site with a decent number of pages has its own share of broken internal links, wrong redirects, unexpected jumps to folders buried on the fifth level, and so on. There will also be either extremely heavy web pages that take forever to load on a visitor’s computer, or thin ones that don’t contain any textual material at all.
So many times I encounter small business owners who commit web site “suicide”: their site basically consists of one page with an embedded Flash application. This is when the client gets really upset, realizing that he has wasted his money on this alleged web development. He gets especially nervous when we explain that search engine spiders don’t read Flash and his site needs to be redesigned completely (if not rebuilt from scratch).
Very positive sign
Recently I read several interesting articles on approaches to web usability evaluation. I don’t agree with every single view on this matter, but I would still like to point out the steps that are a must when a client hires your company to conduct this type of work.
Read more ...
Monday, October 27, 2008
We don’t believe in white hat stuff
And last but not least: watch the changes going on with search engines right now. We are not saying that you, guys, need to sit and try to figure out search engine algorithms. But Microsoft’s recent offers for Yahoo and the re-positioning of ASK.com just make us wonder. Right now there are sites out there that already have link recognition from Google and Yahoo but just don’t get noticed by MSN, or the other way around. Those who have found themselves in this situation know what we are talking about.
This is what our web analysts observe
It seems that the old search engine preference for all those .edu, .gov and big non-profit .org sites is still out there. If the main theme of your site is somehow relevant to these sites, you are in luck: all you have to do is convince them to add your link. These powerful sites can change your marketing campaign drastically for the better.
Google Blog Search and other tools that appear swiftly, like mushrooms after the rain, can help you find influential blogs and bloggers. Using link-baiting principles, you can start some sort of discussion on these blogs, or find some controversy (but be extremely careful here). Popular blogs never have stale content; they are extremely dynamic and thus can bring hundreds of visitors to your site and/or add a lot of relevant links.
Read more ...
It is closer to the real world
It is closer to the real world for small business guys. For 2008, link building opportunities remain mostly the same, with some exceptions. Here are the observations from our web analytics company.
It is still good to participate in various social networks. Unfortunately, some of them have followed Google guidelines and tagged every link with the “nofollow” attribute, which is not going to help you. Most social networks also jealously watch for any type of “commercial” text or other data in blog entries. So beware, small business guy: if they catch you, they will throw you out of the network without any mercy. Naturally, they want to keep each and every subscriber for themselves and sell him all the stuff without anybody standing in their way.
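Whether a profile or comment link actually carries the “nofollow” attribute is easy to check from the page source. A small standard-library sketch; the class name and sample snippet are invented for illustration.

```python
# Sketch: separate links that carry rel="nofollow" (and so pass no link
# credit) from ordinary followed links. Standard library only.
from html.parser import HTMLParser

class NofollowScan(HTMLParser):
    def __init__(self):
        super().__init__()
        self.nofollow = []
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href") or ""
        rel = (a.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rel else self.followed).append(href)

snippet = ('<a href="http://example.com/a" rel="nofollow">profile</a>'
           '<a href="http://example.com/b">article</a>')
scan = NofollowScan()
scan.feed(snippet)
print(scan.nofollow, scan.followed)
# → ['http://example.com/a'] ['http://example.com/b']
```

Running a check like this before investing time in a social network tells you whether its links can help your campaign at all.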
It is still a good idea to find a way into prestigious sites like the Washington Post and the like. The drawback is that you’ve got to be a really talented journalist with a highly interesting topic to offer them. As for attempts to add your comments to high-value media sources, it seems that editors watch commenters like hawks and clean everything out on a regular basis.
Good link-baiting applications are still OK and might help your viral marketing campaign. If your apps are really interesting, people will post links to them on their sites, and with each click you will become more and more visible on the Internet. You will still need to run your web analytics tests constantly to see whether this strategy is effective and brings you the desired Internet traffic.
Read more ...
Thursday, October 23, 2008
Unreliable Time on Site
But what if the web site in question is a blog? Then it is tough luck for you, mostly because everything happens on the same page of the blog and your visitor does not go deep through several web pages. It may take him a long time to write a comment or to read the latest entries located on the same first-level page. So the average Time on Site metric for blogs is usually extremely unreliable.
Time on Site metric
On the surface, the Time on Site metric is easy to understand, as are the number of visits and the number of unique visitors. Maybe that’s why business owners like it so much. Yet there are certain drawbacks to this metric that I am aware of.
Usually, web analytics tools like Google Analytics log each request of the Internet visitor who navigates the web site. They apply a timestamp to each request during the time the prospect spends on the pages, until he exits from the site. At the end of a given time frame, the analytics tool computes Time on Site as the difference between the timestamps of the requests.
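The timestamp arithmetic above can be sketched in a few lines, and the sketch also exposes the flaw discussed in this post: the time spent on the last page is invisible, so a one-page visit (typical for blogs) registers as zero. Timestamps and the function name are invented for illustration.

```python
# Sketch of Time on Site as analytics tools derive it: the difference
# between the first and last page-request timestamps of one visit.
from datetime import datetime

def time_on_site(timestamps):
    """Seconds between the first and last logged requests of a visit."""
    ts = sorted(datetime.fromisoformat(t) for t in timestamps)
    return (ts[-1] - ts[0]).total_seconds()

visit = ["2008-10-23 10:00:00", "2008-10-23 10:01:30", "2008-10-23 10:04:00"]
print(time_on_site(visit))        # → 240.0

blog_visit = ["2008-10-23 10:00:00"]   # single-page blog visit
print(time_on_site(blog_visit))        # → 0.0, however long they stayed
```

The second case is exactly why the metric is so unreliable for blogs: a reader who spends ten minutes on one page looks identical to one who bounced instantly.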
Read more ...
Customers need to trust site
All the factors mentioned above are impossible without trust, for the simple reason that a website will not sell anything if it cannot gain trust; customers need to trust the site enough to part with their money. So it certainly helps if a website is equipped with a privacy policy, solid SSL encryption, articles and tutorials, and true testimonials from satisfied customers who have already purchased from it. Conversion rates will increase if the website’s shipping procedure does not cause headaches for clients, and if prospects can easily find contact information and get decent support in case of emergency.
So the question arises: how do you verify that this whole strategy is working? Or, in other words, how do you consistently improve the website’s conversion? You guessed it: by measuring, testing and experimenting ...
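At its simplest, "measuring, testing and experimenting" means comparing conversion rates between page variants. A minimal sketch; the figures and variant descriptions are invented.

```python
# Minimal sketch of conversion measurement: compare two page variants.
# All numbers below are invented for illustration.
def conversion_rate(visitors, conversions):
    return conversions / visitors if visitors else 0.0

variant_a = conversion_rate(1000, 23)   # e.g. the current contact form
variant_b = conversion_rate(1000, 31)   # e.g. a shorter, less prying form
print(variant_a, variant_b, variant_b > variant_a)
```

In practice you would also want enough traffic for the difference to be statistically meaningful before declaring a winner.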
One of the ways to improve conversion rates
If a website does not provide visitors with some explanation of why they should continue a dialog, the attempt to achieve conversion will fail. A website must offer visitors a very convincing reason to grant their permission to communicate. The percentage of people who continue the dialog with the website is what eventually increases its conversion rates.
One of the ways to improve conversion rates is to build the prominence of the website. It all starts with the domain name and continues throughout the entire website’s short-term tactics and long-term strategy. Believe it or not, Internet visitors notice that certain websites stand out from their competition. They notice useful sites because so many of them tirelessly browse the Web in search of information. If a website manages to educate potential customers about its offer and even give them something useful in return, Internet searchers will choose that web site over the rest of the pack.
Read more ...
People often ask our advice
Over the years, Internet marketing has been rapidly changing from a mixture of voodoo and educated guesses into a science. Web analysts have at their disposal more or less impressive measurement tools and tracking software. They know how to properly track visitors’ behavior with these tools, and they constantly analyze and decode the results of the measurements to understand why things happen. Without tracking software, a web analyst may not realize how effective the website is, or why certain web pages have high bounce rates while others don’t.
When a web analyst measures the conversion of a web site, he basically divides the process in two. There are small conversions, which should be tested and measured repeatedly, and full-blown conversions. Small conversions are the steps that web site visitors take on the road to a full-blown conversion. They guide the visitor toward the final website goal and can be anything from a simple call to action, like “click here”, to something more valuable, like “find out how to download our free eBook”.
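The small-versus-full-blown distinction naturally becomes a funnel: each small conversion is a step, and you measure what fraction of visitors survives each one. A sketch with invented step names and counts:

```python
# Sketch of a conversion funnel: each small conversion is a step toward
# the full-blown one. Step names and counts are invented for illustration.
funnel = [
    ("landed on page", 1000),
    ("clicked 'find out how to download our free eBook'", 180),
    ("opened the signup form", 90),
    ("completed full-blown conversion", 40),
]

def step_rates(funnel):
    """Percentage of visitors surviving each step, relative to the previous."""
    rates = []
    for (_, prev_n), (name, n) in zip(funnel, funnel[1:]):
        rates.append((name, round(100.0 * n / prev_n, 1)))
    return rates

for name, pct in step_rates(funnel):
    print(f"{pct:5.1f}% -> {name}")
```

The step with the sharpest drop-off is where testing and measuring effort pays off first.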
Read more ...