Tuesday, 24 January 2012

Information About SEO (Search Engine Optimization)

Anurag Singh Rana
I. Introduction – What Is SEO
Whenever you enter a query in a search engine and hit 'enter', you get a list of web results that contain that query term. Users normally tend to visit websites that are at the top of this list, as they perceive those to be more relevant to the query. If you have ever wondered why some of these websites rank better than others, you must know that it is because of a powerful web marketing technique called Search Engine Optimization (SEO).
SEO is a technique which helps search engines find and rank your site higher than the millions of other sites in response to a search query. SEO thus helps you get traffic from search engines.
This SEO tutorial covers all the necessary information you need to know about Search Engine Optimization - what it is, how it works, and the differences in the ranking criteria of the major search engines.
2. How Search Engines Work
The first basic truth you need to know to learn SEO is that search engines are not humans. While this might be obvious for everybody, the differences between how humans and search engines view web pages aren't. Unlike humans, search engines are text-driven. Although technology advances rapidly, search engines are far from intelligent creatures that can feel the beauty of a cool design or enjoy the sounds and movement in movies. Instead, search engines crawl the Web, looking at particular site items (mainly text) to get an idea what a site is about. This brief explanation is not the most precise because as we will see next, search engines perform several activities in order to deliver search results – crawling, indexing, processing, calculating relevancy, and retrieving.
First, search engines crawl the Web to see what is there. This task is performed by a piece of software called a crawler or a spider (or Googlebot, as is the case with Google). Spiders follow links from one page to another and index everything they find on their way. Bearing in mind the number of pages on the Web (over 20 billion), it is impossible for a spider to visit a site daily just to see if a new page has appeared or if an existing page has been modified. Sometimes crawlers may not end up visiting your site for a month or two.
What you can do is to check what a crawler sees from your site. As already mentioned, crawlers are not humans and they do not see images, Flash movies, JavaScript, frames, password-protected pages and directories, so if you have tons of these on your site, you'd better run the Spider Simulator below to see if these goodies are viewable by the spider. If they are not viewable, they will not be spidered, not indexed, not processed, etc. - in a word they will be non-existent for search engines.
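To get a feel for what a text-driven spider is left with, here is a minimal sketch in Python using the standard-library HTML parser; the sample page markup is made up for the example. It keeps only the visible text, skipping scripts and styles (and, of course, images contribute nothing).

```python
# A minimal sketch of what a text-driven crawler "sees": only the text,
# none of the images, styling, or scripts.
from html.parser import HTMLParser

class TextOnlySpider(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.text = []
        self._skip = 0  # depth inside script/style tags

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

page = """<html><head><title>Adopt a Dog</title>
<script>var fancy = "invisible to spiders";</script></head>
<body><h1>Dog Adoption</h1><img src="dog.jpg"><p>Save a life.</p></body></html>"""

spider = TextOnlySpider()
spider.feed(page)
print(" | ".join(spider.text))
```

Everything buried in the image or the script never reaches the list, which is exactly why text navigation matters so much.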
3. Differences Between the Major Search Engines
Although the basic principle of operation of all search engines is the same, the minor differences between them lead to major changes in results relevancy. Different factors are important to different search engines. There were times when SEO experts joked that the algorithms of Bing were intentionally made just the opposite of those of Google. While this might have a grain of truth, it is a matter of fact that the major search engines like different stuff, and if you plan to conquer more than one of them, you need to optimize carefully.
There are many examples of the differences between search engines. For instance, for Yahoo! and Bing, on-page keyword factors are of primary importance, while for Google links are very, very important. Also, for Google sites are like wine – the older, the better – while Yahoo! generally has no expressed preference towards sites and domains with tradition (i.e. older ones). Thus you might need more time for your site to mature enough to be admitted to the top in Google than in Yahoo!.
II. Keywords – the Most Important Item in SEO
Keywords are the most important SEO element for every search engine; they are what search strings are matched against. Choosing the right keywords to optimize for is thus the first and most crucial step in a successful SEO campaign. If you fail at this very first step, the road ahead is very bumpy and most likely you will only waste your time and money. There are many ways to determine which keywords to optimize for, and usually the final list is made after a careful analysis of what the online population is searching for, which keywords your competitors have chosen, and above all - which keywords you feel describe your site best.
1. Choosing the Right Keywords to Optimize For
It seems that the time when you could easily top the results for a one-word search string is centuries ago. Now, when the Web is so densely populated with sites, it is next to impossible to achieve constant top ratings for a one-word search string. Achieving constant top ratings for two-word or three-word search strings is a more realistic goal.
For instance, if you have a site about dogs, do NOT try to optimize for the keyword "dog" or "dogs". Instead, you could focus on keywords like "dog obedience training", "small dog breeds", "homemade dog food", "dog food recipes", etc. Success with very popular one- or two-word keywords is very difficult and often not worth the trouble; it's best to focus on less competitive, highly specific keywords.
The first thing you need to do is come up with keywords that describe the content of your website. Ideally, you know your users well and can correctly guess what search strings they are likely to use to search for you. You can also try the Website Keyword Suggestions Tool below to come up with an initial list of keywords. Run your initial list of keywords through the Google Keyword Suggestion tool; you'll get a related list of keywords. Shortlist a couple of keywords that seem relevant and have a decent global search volume.
2. Keyword Density
After you have chosen the keywords that describe your site and are supposedly of interest to your users, the next step is to make your site keyword-rich and to have good keyword density for your target keywords. Keyword density, although no longer a very important factor in SEO, is a common measure of how relevant a page is. Generally, the idea is that the higher the keyword density, the more relevant to the search string a page is. The recommended density is 3-7% for the major 2 or 3 keywords and 1-2% for minor keywords. Try the Keyword Density Checker below to determine the keyword density of your website.
Although there are no strict rules, try optimizing for a reasonable number of keywords – 5 or 10 is OK. If you attempt to optimize for a list of 300, you will soon see that it is just not possible to have a good keyword density for more than a few keywords, without making the text sound artificial and stuffed with keywords. And what is worse, there are severe penalties (including ban from the search engine) for keyword stuffing because this is considered an unethical practice that tries to manipulate search results.
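There is no single official formula, but one common way checkers compute keyword density is: occurrences of the phrase, times the number of words in the phrase, divided by the total word count. A small sketch (the sample text is made up):

```python
# A rough keyword-density calculator, along the lines of the checkers
# mentioned above: phrase occurrences x phrase length / total words.
import re

def keyword_density(text, phrase):
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    # Count every word of each phrase occurrence against the total.
    return 100.0 * hits * n / len(words) if words else 0.0

text = ("Homemade dog food is cheaper than store-bought dog food. "
        "Many dog owners now cook dog food at home.")
print(round(keyword_density(text, "dog food"), 1))
```

On a real page, with far more surrounding text, the same calculation is what tells you whether you are in the 3-7% range or drifting into keyword-stuffing territory.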
3. Keywords in Special Places
Keywords are important not only in quantity but in quality as well – i.e. if you have more keywords in the page title, the headings, and the first paragraphs, this counts for more than if you have many keywords at the bottom of the page. The reason is that the URL (and especially the domain name), file names and directory names, the page title, and the headings for the separate sections are more important than ordinary text on the page. Therefore, all else being equal, if you have the same keyword density as your competitors but have keywords in the URL, this will boost your ranking incredibly, especially with Yahoo!.
a. Keywords in URLs and File Names
The domain name and the whole URL of a site tell a lot about it. The presumption is that if your site is about dogs, you will have “dog”, “dogs”, or “puppy” as part of your domain name. For instance, if your site is mainly about adopting dogs, it is much better to name your dog site “dog-adopt.net” than “animal-care.org”, for example, because in the first case you have two major keywords in the URL, while in the second one you have no more than one potential minor keyword.
When hunting for keyword-rich domain names, don't get greedy. While from an SEO point of view it is better to have 5 keywords in the URL, just imagine how long and difficult to memorize the URL will be. So you need to strike a balance between the keywords in the URL and site usability, which says that more than 3 words in the URL is way too much.
Probably you will not be able to come up with tons of good suggestions on your own. Additionally, even if you manage to think of a couple of good domain names, they might already be taken. In such cases, tools like the Tool below can come in very handy.
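As a rough illustration of keeping names short and keyword-rich, here is a small helper that turns a page topic into a slug of at most three words; the stopword list is just an assumption for the example:

```python
# A small helper for turning a page topic into a short, keyword-rich
# URL slug, capped at three words per the usability advice above.
import re

def make_slug(title, max_words=3):
    words = re.findall(r"[a-z0-9]+", title.lower())
    # Drop filler words that carry no keyword value (illustrative list).
    stopwords = {"a", "an", "the", "of", "for", "and", "to", "your"}
    keywords = [w for w in words if w not in stopwords]
    return "-".join(keywords[:max_words])

print(make_slug("Homemade Dog Food Recipes"))  # homemade-dog-food
```

The same idea applies to file and directory names: "dog-adoption-tips.html" tells a spider far more than "page7.html".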
b. Keywords in Page Titles
The page title is another special place, because the contents of the <title> tag usually get displayed by most search engines (including Google). While the HTML specification does not require you to write anything in the <title> tag (i.e. you can leave it empty and the title bar of the browser will read "Untitled Document" or similar), for SEO purposes you do not want to leave the <title> tag empty; instead, you'd better write the page title in it.
Unlike URLs, with page titles you can get wordy. If we go on with the dog example, the <title> tag of the home page of http://dog-adopt.net could include something like this: <title>Adopt a Dog – Save a Life and Bring Joy to Your Home</title>, <title>Everything You Need to Know About Adopting a Dog</title>, or even longer.
c. Keywords in Headings
Normally, headings separate paragraphs into related subtopics. From a literary point of view, it may be pointless to have a heading after every other paragraph, but from an SEO point of view it is extremely good to have as many headings on a page as possible, especially if they have keywords in them.
There are no technical length limits for the contents of the <h1>, <h2>, <h3>, ... <hn> tags, but common sense says that overly long headings are bad for page readability. So, as with URLs, you need to be wise about the length of headings. Another issue to consider is how the heading will be displayed. If it is Heading 1 (<h1>), this generally means a larger font size, and in this case it is advisable to have fewer than 7-8 words in the heading; otherwise it might spread over 2 or 3 lines, which is not good, and if you can avoid it – do it.
III. Backlinks – Another Important SEO Item
What are Backlinks?
In layman's terms, there are two types of links: inbound and outbound. Outbound links start from your site and lead to an external site, while inbound links, or backlinks, come from an external site to yours. For example, if cnn.com links to yourdomain.com, the link from cnn.com is a backlink (inbound) for yourdomain.com; however, the same link is an outbound link from cnn.com's perspective. Backlinks are among the main building blocks of good Search Engine Optimization (SEO).
Why Backlinks Are Important
The number of backlinks to a website is an indication of its popularity or importance. Backlinks are important for SEO because some search engines, like Google, give more credit to websites that have a large number of quality backlinks and consider those websites more relevant than others in their results pages for a search query.
Therefore, when search engines calculate the relevance of a site to a keyword, they not only consider the number of backlinks to that site but also their quality. In order to determine the quality, a search engine considers the content of the sites. When backlinks to your site come from other sites, and those sites have content related to your site, these backlinks are considered more relevant to your site. If backlinks are found on sites with unrelated content, they are considered less relevant. The higher the relevance of backlinks, the greater their quality.
For example, if a webmaster has a website about how to rescue orphaned dogs and receives a backlink from another website about dogs, then that would be more relevant in a search engine's assessment than, say, a link from a site about car racing. Therefore, the higher the relevance of the site linking back to your website, the better the quality of the backlink.
Search engines want websites to have a level playing field, and look for natural links built slowly over time. While it is fairly easy to modify your web pages to make them more SEO-friendly, it is a lot harder to influence other websites and get them to link to yours. This is the reason search engines regard backlinks as such an important factor. Further, search engines' criteria for quality backlinks have gotten even tougher, thanks to unscrupulous webmasters trying to achieve these backlinks through deceptive or sneaky techniques, such as hidden links or automatically generated pages whose sole purpose is to provide backlinks to websites. These pages are called link farms, and they are not only disregarded by search engines, but linking to a link farm could get your site banned entirely.
Anchor Text
When a link incorporates a keyword into the text of the hyperlink, we call this anchor text. A link's anchor text may be one of the most powerful resources a webmaster has. Backlinks from multiple websites with the anchor text "orphaned dogs" would help your website rank higher for the keyword "orphaned dogs". Using your keyword is a superior way to utilize a hyperlink, as opposed to having links with words like "click here", which do not relate to your website. The 'Backlink Anchor Text Analysis Tool' will help you find your backlinks and the text being used to link to your website. If you find that your site is being linked to from another website but the anchor text is not being utilized properly, you should request that the website change the anchor text to something that incorporates relevant keywords. This will also help boost your rankings.
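Under the hood, an anchor-text checker does something like the following sketch, written with Python's standard-library HTML parser; the sample links are hypothetical:

```python
# A sketch of what a backlink anchor-text checker might do: pull every
# link and its anchor text out of a page's HTML.
from html.parser import HTMLParser

class AnchorTextParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []          # (href, anchor text) pairs
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

page = ('<p>Read about <a href="http://dog-adopt.net">orphaned dogs</a> '
        'or just <a href="http://example.com">click here</a>.</p>')
parser = AnchorTextParser()
parser.feed(page)
for href, anchor in parser.links:
    print(f"{anchor!r} -> {href}")
```

Running this over the pages that link to you shows at a glance which backlinks carry keyword-rich anchors and which are wasted on "click here".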
Ways to Build Backlinks
Even if plenty of backlinks come to your site the natural way, additional quality backlinks are always welcome.
1 The Backlink Builder Tool
When you enter the keywords of your choice, the Backlink Builder tool gives you a list of relevant sites from which you might get some backlinks.
2 Getting Listed in Directories
If you are serious about your Web presence, getting listed in directories like DMOZ and Yahoo is a must, not only because this is a way to get some quality backlinks for free, but also because this way you are easily noticed by both search engines and potential visitors. Generally inclusion in search directories is free but the drawback is that sometimes you have to wait a couple of months before you get listed in the categories of your choice.
3 Forums and Article Directories
Generally search engines index forums so posting in forums and blogs is also a way to get quality backlinks with the anchor text you want. If the forum or blog is a respected one, a backlink is valuable. However, in some cases the forum or blog administrator can edit your post, or even delete it if it does not fit into the forum or blog policy. Also, sometimes administrators do not allow links in posts, unless they are relevant ones. In some rare cases (which are more an exception than a rule) the owner of a forum or a blog would have banned search engines from indexing it and in this case posting backlinks there is pointless.
4 RSS Feeds
You can offer RSS feeds to interested sites for free. When another site publishes your RSS feed, you will get a backlink to your site and potentially a lot of visitors, who will come to your site for more details about the headlines and abstracts they read on the other site.
5 Affiliate programs
Affiliate programs are also good for getting more visitors (and buyers) and for building quality backlinks, but they tend to be an expensive way, because generally the affiliate commission is in the range of 10 to 30%. But if you have an affiliate program anyway, why not use it to get some more quality backlinks?
6 News Announcements and Press Releases
Although this is hardly an everyday way to build backlinks, it is an approach that gives good results if handled properly. There are many sites that publish news announcements and press releases for free or for a small fee. A professionally written press release about an important event can bring you many visitors, and a backlink from a respected site to yours is a good boost to your SEO efforts. The tricky part is that you cannot release press releases if there is nothing newsworthy. That is why we say that news announcements and press releases are not a commodity way to build backlinks.
Link Practices That Are To Be Avoided
There has been much discussion in the last few months about reciprocal linking. In the past few Google updates, reciprocal links were one of the targets of the search engine's latest filter. Many webmasters had agreed upon reciprocal link exchanges in order to boost their sites' rankings. In a link exchange, one webmaster places a link on his website that points to another webmaster's website, and vice versa. Many of these links were simply not relevant and were just discounted. So while the irrelevant backlinks were ignored, the outbound links still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.
There is a Google patent in the works that will deal not only with the popularity of the sites being linked to, but also with how trustworthy the sites you link to from your own website are. This will mean that you could get into trouble with the search engine just for linking to a bad apple.
Many webmasters have more than one website. Sometimes these websites are related, sometimes they are not. You also have to be careful about interlinking multiple websites on the same IP. If you own seven related websites, then a link to each of those websites on a page could hurt you, as it may look to a search engine like you are trying to do something fishy. Many webmasters have tried to manipulate backlinks in this way, and too many links to sites with the same IP address is referred to as backlink bombing.
One thing is certain: interlinking sites doesn't help you from a search engine standpoint. The only reason you may want to interlink your sites in the first place might be to provide your visitors with extra resources to visit. In this case, it would probably be okay to provide visitors with a link to another of your websites, but try to keep instances of linking to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you.
IV. Metatags
What are Meta tags?
Meta tags are used to summarize information about a page for search engine crawlers. This information is not directly visible to humans visiting your website. The most popular are the meta keywords and description tags. These meta tags are to be inserted into the <head> area of your page.
A couple of years ago meta tags were the primary tool for search engine optimization and there was a direct correlation between keywords in the meta tags and your ranking in the search results. However, algorithms have got better and today the importance of metadata is decreasing day by day.
Meta Description
The meta description tag is one more way for you to write a description of your site, thus pointing search engines to the themes and topics your website is relevant to. Some search engines (including Google) use the meta description to display a summary of the listing on the search results page. So if your meta descriptions are well written, you might be able to attract more traffic to your website.
For instance, for the dog adoption site, the meta Description tag could be something like this:
<meta name="description" content="Adopting a dog saves a life and brings joy to your house. All you need to know when you consider adopting a dog.">
Meta Keywords
A potential use of the Meta Keywords tags is to include a list of keywords that you think are relevant to your pages. The major search engines will not take this into account but still it is a chance for you to emphasize your target keywords. You may consider including alternative spellings (or even common misspellings of your keywords) in the meta Keywords tag. It might be a very small boost to your search engine rankings but why miss the chance?
E.g.:
<meta name="keywords" content="adopt, adoption, dog, dogs, puppy, canine, save a life, homeless animals">
Meta Robots
In this tag you specify the pages that you do NOT want crawled and indexed. It happens that your site has content that you need to keep there but do not want indexed. Listing these pages in the meta robots tag is one way to exclude them from being indexed (the other way is by using a robots.txt file, and generally that is the better way to do it). E.g.:
<meta name="robots" content="noindex, nofollow">
V. Content Is King
If you are new to SEO, it might be a surprise to you that text is one of the driving forces behind higher rankings. But it is a fact. Search engines (and your readers) love fresh content, and providing them with regularly updated, relevant content is a recipe for success. Generally, when a site is frequently updated, this increases the probability that the spider will revisit the site sooner. You can't be sure that if you update your site daily the spider will visit it even once a week, but if you do not update your content regularly, this will certainly drop you from the top of the search results.
For company sites that are focused not on writing but on manufacturing, constantly adding text can be a problem, because generally company sites are not reading rooms or online magazines that update their content daily, weekly, or monthly. But even for company sites there are reasonable solutions. No matter what your business is, one thing is for sure – it is always relevant to include a news section on your site. It can be company news or RSS feeds, but it will keep the ball rolling.

1. Topical Themes or How to Frequently Add Content to Your Site
If you are doing the SEO for an online magazine, you can consider yourself lucky – fresh content is coming in all the time, and you just need to occasionally arrange a heading or two, or a couple of paragraphs, to make the site SEO-friendly. But even if you are doing SEO for an ordinary company site, it is not all that bad – there are ways to constantly get fresh content that fits the topic of the site.
One of the intricacies of optimizing a company site is that it has to be serious. Also, if your content smells like advertising and has no practical value for your visitors, it is not that valuable. For instance, if you are a trade company, you can have promotional texts about your products. But keep in mind that these texts must be informational, not just sales hype. And if you have a lot of products to sell, frequently get new products, or make periodic promotions of particular products and product groups, you can post all this to your site and you will have fresh, topical content.
Also, depending on what your business is about, you can include different kinds of self-updating information, like lists of hot new products, featured products, discounted items, even online calculators or order trackers. Unlike promotional pages, this might neither bring you many new visitors nor improve your ratings, but it is better than nothing.
One more potential traffic trigger for company sites is a news section. Here you can include news about past and coming events, post reports about various activities, announce new undertakings, etc. Some companies even go further – their CEO keeps a blog, where he or she writes in a more informal style about what is going on in the company, in the industry as a whole, or in the world in general. These blogs do attract readers, especially if the information is genuine rather than the official story.
An alternative way to get fresh content for free is RSS feeds. RSS feeds are gaining more and more popularity, and with a little bit of searching you can get free syndicated content for almost any topic you can think of.
2. Bold and Italic Text
When you have lots of text, the next question is how to make the important items stand out from the crowd – for both humans and search engines. While search engines (and their spiders – the programs that crawl the Web and index pages) cannot read text the way humans do, they do have ways of getting the meaning of a piece of text. Headings are one possibility, bold and italic are another way to emphasize a word or a couple of words that are important. Search engines read the <b> and <i> text and get the idea that what is in bold and/or italic is more important than the rest of the text. But do not use bold and italic too much – this will spoil the effect, rather than make the whole page a search engine favorite.
3. Duplicate Content
When you get new content, there is one important issue – is it original? Because if it is not, i.e. it is stolen from another site, this will get you into trouble. But even if it is not illegal, i.e. you obtained it for free from an article feed, keep in mind that you might not be the only one on the Web who has this particular content. If you have the rights to do it, you can change the text a little so it is not an exact copy of another page and cannot be labeled "duplicate content" by search engines. If you don't manage to escape the duplicate content filter that search engines have imposed recently in their attempts to filter stolen, scraped, or simply copied content, your pages could be removed from the search results!
Duplicate content became an issue when tricky webmasters started making multiple copies of the same page (under different names) in order to fool search engines into thinking they have more content than they actually do. As a result of this malpractice, search engines responded with a duplicate content filter that removes suspicious pages. Unfortunately, this filter sometimes removes quite legitimate pages, like product descriptions given by a manufacturer to all its resellers, which must be kept exactly the same.
You see, duplicate content can be a serious problem, but it is not an obstacle that cannot be overcome. First, you need to periodically check the Web for pages that are similar to yours. You can use http://copyscape.com. If you identify pages that are similar to yours (and it is not you who illegitimately copied them), you could notify the webmaster of the respective site(s) to remove them. Also, you could change the text on your site a little, hoping that this way you will avoid the duplicate content penalty. Even with product descriptions, you can add commentary or opinion on the same page, and this could be a way out.
Try the Similar Page Checker to check the similarity between two URLs.
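One common way such similarity checkers work is by comparing word "shingles" (short overlapping word sequences) between two pages. A minimal sketch, with made-up sample text:

```python
# A crude page-similarity measure in the spirit of the Similar Page
# Checker: Jaccard overlap of word "shingles" (short word sequences).
def shingles(text, k=3):
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(text_a, text_b, k=3):
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not a or not b:
        return 0.0
    return 100.0 * len(a & b) / len(a | b)

original = "Adopting a homeless dog is a noble deed that saves a life."
copied   = "Adopting a homeless dog is a noble deed that brings you joy."
print(round(similarity(original, copied), 1))
```

Identical pages score 100, unrelated pages score near 0, and a lightly reworded copy lands somewhere in between, which is exactly the gray zone that duplicate-content filters look at.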



VI. Visual Extras and SEO
As already mentioned, search engines have no means to directly index extras like images, sounds, Flash movies, or JavaScript. Instead, they rely on you to provide meaningful textual descriptions, and based on those they can index these files. In a sense, the situation is similar to that of text 10 or so years ago – you provide a description in the meta tag and search engines use this description to index and process your page. If technology advances further, one day it might be possible for search engines to index images, movies, etc., but for the time being this is just a dream.
1. Images
Images are an essential part of any web page, and from a designer's point of view they are not an extra but an absolutely mandatory item for every site. However, here designers and search engines are at opposite poles, because for search engines every piece of information that is buried in an image is lost. When working with designers, it sometimes takes a while to explain to them that having textual links (with proper anchor text) instead of shiny images is not a whim and that clear text navigation really is mandatory. Yes, it can be hard to find the right balance between artistic performance and SEO-friendliness, but since even the finest site is lost in cyberspace if it cannot be found by search engines, a compromise in its visual appearance cannot be avoided.
With all that said, the idea is not to skip images altogether. Sure, nowadays that is impossible, because the result would be a most ugly site. Rather, the idea is that images should be used for illustration and decoration, not for navigation or, even worse, for displaying text (in a fancy font, for example). And most important – in the alt attribute of the <img> tag, always provide a meaningful textual description of the image. The HTML specification does not require this, but search engines do. Also, it does not hurt to give meaningful names to the image files themselves, rather than naming them image1.jpg, image2.jpg, imageN.jpg. For instance, in the next example the image file has an informative name and the alt attribute provides enough additional information: <img src="one_month_Jim.jpg" alt="A picture of Jim when he was a one-month puppy">. But don't go to extremes like writing 20-word alt attributes for 1-pixel images, because this also looks suspicious and starts to smell like keyword stuffing.
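A simple way to audit your own pages for the missing alt texts discussed above is to walk the HTML with the standard-library parser; the sample markup is hypothetical:

```python
# A quick audit for <img> tags missing the alt text the section
# recommends, using Python's standard-library HTML parser.
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # src of every image without useful alt text

    def handle_startendtag(self, tag, attrs):
        # Treat self-closing <img ... /> the same as <img ...>.
        self.handle_starttag(tag, attrs)

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt", "").strip():
                self.missing.append(a.get("src", "?"))

page = ('<img src="one_month_jim.jpg" alt="Jim as a one-month-old puppy">'
        '<img src="spacer.gif">')
auditor = AltAuditor()
auditor.feed(page)
print(auditor.missing)
```

Anything that ends up in the `missing` list is an image that contributes nothing to your rankings and should get a short, honest description.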
2. Animation and Movies
The situation with animation and movies is similar to that of images – they are valuable from a designer's point of view but not loved by search engines. For instance, it is still pretty common to have an impressive Flash introduction on the home page. You just cannot imagine what a disadvantage with search engines this is – it is a number one rankings killer! And it gets even worse if you use Flash to tell a story that could be written in plain text and hence crawled and indexed by search engines. One workaround is to provide search engines with an HTML version of the Flash movie, but in this case make sure that you have excluded the original Flash movie from indexing (this is done in the robots.txt file, but the explanation of that file is not a beginner's topic, which is why it is excluded from this tutorial); otherwise you can be penalized for duplicate content.
There are rumors that Google is building new search technology that will allow searching inside animation and movies, and that the .swf format will contain new metadata that can be used by search engines. But until then, you'd better either refrain from using (too much) Flash, or at least provide a textual description of the movie (for instance, in an alt attribute).
3. Frames
It is good news that frames are slowly but surely disappearing from the Web. 5 or 10 years ago they were an absolute hit with designers, but never with search engines. Search engines have difficulties indexing framed pages because the URL of the page is the same no matter which of the separate frames is open. For search engines this was a shock, because there were actually 3 or 4 pages under only one URL, while for a search engine 1 URL is 1 page. Of course, search engines can follow the links to the pages in the frameset and index them, but this is a hurdle for them.
If you still insist on using frames, make sure that you provide a meaningful description of the site in the <noframes> tag. The following example is not for beginners but even if you do not understand everything in it, just remember that the <noframes> tag is the place to provide an alternative version (or at least a short description) of your site for search engines and users whose browsers do not support frames. If you decide to use the <noframes> tag, maybe you'd like to read more about it before you start using it.
Example: <noframes> <p> This site is best viewed in a browser that supports frames. </p><p> Welcome to our site for prospective dog adopters! Adopting a homeless dog is a most noble deed that will help save the life of the poor creature. </p></noframes>
4. JavaScript
This is another hot potato. Pure HTML by itself cannot produce the complex, highly functional sites today's Web users expect (HTML was never intended to be a programming language for building Web applications, so nobody expects you to use HTML to write to a database or even to store session information), which is why other languages, such as JavaScript or PHP, come in to enhance HTML. For now, search engines simply ignore the JavaScript they encounter on a page. As a result, first, if you have links inside JavaScript code, chances are they will not be spidered. Second, if the JavaScript is in the HTML file itself (rather than in an external .js file that is invoked when necessary), it clutters the HTML file, and spiders might just skip it and move to the next site. For your information, there is a <noscript> tag that lets you provide an alternative for browsers that do not run the script, but because most of its applications are pretty complicated, it is hardly suitable to explain here.
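As a simple illustration of the point about JavaScript links – the page name and link text here are made up for the example – a plain-HTML fallback can be provided alongside the scripted link:

```html
<!-- A link generated by JavaScript, which spiders may never see -->
<script type="text/javascript">
  document.write('<a href="products.html">Our products</a>');
</script>
<!-- A plain-HTML fallback that both spiders and script-less browsers can follow -->
<noscript>
  <a href="products.html">Our products</a>
</noscript>
```

The safest approach, of course, is to keep important navigation links in plain HTML in the first place.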
VII. Static Versus Dynamic URLs
Based on the previous section, you might have gotten the impression that search engine algorithms humiliate every effort designers make to create a gorgeous site. Well, we have already explained why search engines do not like images, movies, applets, and other extras. Now you might think that search engines are far too picky for disliking dynamic URLs as well. Honestly, though, users are not in love with URLs like http://domain.com/product.php?cid=1&pid=5 either, because such URLs do not say much about the contents of the page.
There are a couple of good reasons why static URLs score better than dynamic URLs. First, dynamic URLs are not always there – i.e. the page is generated on request after the user performs some kind of action (fills a form and submits it or performs a search using the site's search engine). In a sense, such pages are nonexistent for search engines, because they index the Web by crawling it, not by filling in forms.
Second, even if a dynamic page has already been generated by a previous user request and is stored on the server, search engines might just skip it if it has too many question marks and other special symbols in it. Once upon a time search engines did not index dynamic pages at all, while today they do index them but generally slower than they index static pages.
The idea is not to revert to static HTML only. Database-driven sites are great, but it is much better if you serve your pages to search engines and users in a format they can easily handle. One solution to the dynamic URLs problem is called URL rewriting. There are special tools (different for different platforms and servers) that rewrite URLs into a friendlier format, so they appear in the browser like normal HTML pages. A URL rewriting tool can convert the cryptic text from the previous example into something more readable, like http://mydomain.com/product-categoryid-1-productid-5.
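One common way to implement this – sketched here for Apache's mod_rewrite; the exact rule and paths are assumptions for illustration, and other servers use different mechanisms – is an .htaccess rule that maps the friendly URL back to the dynamic one:

```
# .htaccess – hypothetical mod_rewrite sketch (Apache)
RewriteEngine On
# Map /product-categoryid-1-productid-5 to /product.php?cid=1&pid=5
RewriteRule ^product-categoryid-([0-9]+)-productid-([0-9]+)$ product.php?cid=$1&pid=$2 [L]
```

Visitors and spiders see only the friendly URL; the server quietly serves the dynamic page behind it.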

VIII. Promoting Your Site to Increase Traffic
The main purpose of SEO is to make your site visible to search engines, leading to higher rankings in search results pages, which in turn brings more traffic to your site. And having more visitors (and above all buyers) is ultimately the goal of site promotion. For truth's sake, SEO is only one way to promote your site and increase traffic – there are many other online and offline ways to accomplish the goal of getting high traffic and reaching your target audience. We are not going to explore them in this tutorial, but keep in mind that search engines are not the only way to get visitors to your site, although they are a preferred choice and a relatively easy way to do it.
1. Submitting Your Site to Search Directories, forums and special sites
After you have finished optimizing your new site, the time comes to submit it to search engines. Generally, with search engines you don't have to do anything special to get your site included in their indices – they will come and find you. It cannot be said exactly when they will visit your site for the first time or at what intervals they will visit it later, but there is hardly anything you can do to invite them. Sure, you can go to their Submit a Site pages and submit the URL of your new site, but do not expect them to hop over right away. What is more, even if you submit your URL, most search engines reserve the right to judge whether to crawl your site or not. Anyway, here are the URLs of the submission pages of the three major search engines: Google, MSN, and Yahoo.
In addition to search engines, you may also want to have your site included in search directories. Although search directories also list sites that are relevant to a given topic, they differ from search engines in several aspects. First, search directories are usually maintained by humans, and submitted sites are reviewed for relevancy. Second, search directories do not use crawlers to get URLs, so you need to go to them and submit your site; but once you do this, you can stay there forever and no further effort on your side is necessary. Some of the most popular search directories are DMOZ and Yahoo! (the directory, not the search engine itself), and here are the URLs of their submission pages: DMOZ and Yahoo!.
Sometimes posting a link to your site in the right forums or on specialized sites can work miracles in terms of traffic. You need to find the forums and sites that are leaders in your fields of interest, but generally even a simple search in Google or the other major search engines will retrieve their names. For instance, if you are a hardware freak, type "hardware forums" in the search box and in a second you will have a list of sites that are favorites of other hardware freaks. Then you need to check the sites one by one, because some of them might not allow posting links to commercial sites. Posting in forums is more time-consuming than submitting to search engines, but it can also be pretty rewarding.
2. Specialized Search Engines
Google, Yahoo!, and MSN are not the only search engines on Earth, nor even the only general-purpose ones. There are many other general-purpose and specialized search engines, and some of them can be really helpful for reaching your target audience. You just can't imagine how many niches specialized search engines exist for – from law, to radio stations, to education! Some of them are actually huge sites that gather Web-wide resources on a particular topic, but almost all of them have sections for submitting links to external sites of interest. So after you find the specialized search engines in your niche, go to their sites and submit your URL – this could prove more traffic-worthy than striving to get to the top of Google.
3. Paid Ads and Submissions
We have already mentioned some alternatives to search engines – forums, specialized sites and search engines, search directories – but if you need to make sure that your site will be noticed, you can always resort to paid ads and submissions. Yes, paid listings are a fast and guaranteed way to appear in search results, and most of the major search engines accept payment to put your URL in the Paid Links section for keywords of interest to you. But keep in mind that users generally do not trust paid links as much as they trust normal ones – in a sense it looks like you are bribing the search engine to place you where you can't get on your own – so think twice about the pros and cons of paying to get listed.

SEO ARTICLE
How to Optimize for Baidu
Usually SEO efforts are directed towards achieving top rankings with Google, and sometimes with Yahoo and Bing. However, in addition to the Big Three, there are other search engines that might be of interest to you. In fact, some of them might prove a better option than Google, Yahoo, or Bing: if these search engines are used by your target audience, they will be more efficient, and it is worth spending some time optimizing for them.
If you haven't heard about Baidu, don't worry. It is a popular search engine, but its reach is not global, which is why many people don't even know about it. Still, Baidu is certainly not just one more search engine to waste your time with. Baidu is big in China, and since the population of China is more than a billion, ranking well with Baidu can make quite a difference. In fact, if you are operating globally – not to mention if your visitors are based mainly in China – you can't afford to miss this market. On the Chinese market, Baidu's share is around 60%, and it is the most popular Chinese-language search engine. Google is less popular in China than Baidu, so if your traffic comes from the Chinese market, it pays to optimize your site for Baidu.
The algorithm of Baidu is different from the algorithms of Google, Bing, and Yahoo. In a sense, it is less sophisticated and resembles the algorithms the other search engines used many years ago. Here are some tips on what you should do to get decent rankings with Baidu:
1. Find the right Chinese keywords
Of course, as with other search engines, keywords are important for good rankings. You need to find the right Chinese keywords to optimize for. This might be a challenge because Chinese has many dialects and the same words have different meanings in different dialects. However, Pinyin Chinese is preferred by Baidu and this is why Pinyin Chinese is your best choice. You should stick to it not only for your keywords but for your content as a whole.
2. You need LOTS of content in Chinese
Even if Chinese is not the main language of your site, you need many pages in Chinese. With Baidu, content is king – the same as with other search engines. When generating tons of content in Chinese, follow the official guidelines for what content is acceptable in China, because the rules there are strict, and if you don't obey them, it could cost you much more than your good rankings with Baidu.
3. Metatags weigh a lot
Similarly to the early days of the other search engines, with Baidu metatags are very important, so don't forget to make your metatags top-notch. However, don't abuse metatags and don't stuff them with keywords.
4. Get a Pinyin Chinese domain name and host your site on a Chinese host
Domain names are important with Baidu as well. In addition to having keywords in your domain name, you need to have a domain name in Pinyin Chinese. You can use a .com, .net, or .cn extension with it. For even better results, host your site on a Chinese host because this gives you an additional bonus with Baidu.
5. Use simple navigation structures
Simple navigation structures are a must with every search engine but for Baidu they matter even more. Baidu won't follow links that are deeply buried in all kinds of messy code or that go many levels deep in the site hierarchy.
6. Watch for duplicate content
Baidu is very strict about duplicate content. You might have problems with duplicate content with the other search engines too, but Baidu is even less tolerant. Use a robots.txt file to tell crawlers what not to index and you are safe.
7. No links to bad neighbors and no link farms
Linking to bad neighbors and getting links from link farms isn't a good idea with any search engine but penalties with Baidu are even more severe, so you need to consider this. Also, don't put too many outbound links on your site because this also affects your Baidu rankings in a negative way.
8. Plan in advance
With Google you can reach top rankings in a week (though this certainly isn't the norm, and we don't mean you should use blackhat strategies to achieve it), but with Baidu success doesn't come that fast. With Baidu it can take 6 months or more to achieve the rankings you might reach with Google overnight, and you need to take this into account. For instance, if you are promoting a summer-related site, you should start optimization no later than November, so that when the season comes, your site will have achieved the rankings you want.
9. Baidu doesn't deal with Flash and JavaScript
Flash and JavaScript aren't Google's favorites, but Baidu absolutely hates them. This is why you should use Flash and JavaScript only if you provide alternative HTML versions of the content you have incorporated in them. Baidu doesn't like iFrames either, so avoid them as well.
10. Make sure your site is spiderable
As any other search engine, Baidu uses crawlers, so make sure your site is spiderable. Use a spider simulator to check what is accessible from your site and what isn't.
As you see, optimization for Baidu isn't totally different from optimization for any other search engine, but it certainly has its specifics. Follow the rules, be patient, and sooner or later success will come to you. When you are done with your Baidu optimization, it won't be a surprise if your site starts ranking better with Google as well, especially for country-specific searches. When you have so much content in Chinese and a Chinese domain name, this will inevitably help you achieve better rankings for your Chinese search terms in any other search engine.
SEO Musts for Local Business
The Internet might be global in nature, but if your business is local, it makes no sense to concentrate on global reach, when your customers live in your city, or even in your neighborhood. For local businesses getting a global reach is a waste of resources. Instead, you should concentrate on the local community. You might be asking how you can do it, when the Web is global and Google doesn't classify sites according to their location. Here is how you can go local with SEO:
1. Use your location in your keywords.
The first trick is to use your location in your keywords. For example, if you are in London and you sell car insurance, your most important keyphrase should be “car insurance London” because this keyphrase contains your business and your location and will drive people who are looking for car insurance in London in particular.
2. Use your location in metatags
Metatags matter for search engines, and you shouldn't fail to include your location, together with your other keywords, in the metatags of the pages of your site. Of course, you must also have your location among the keywords in the body text, because it looks suspicious when your body text doesn't mention your location but your tags are stuffed with it.
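For illustration – reusing the London car-insurance example and with made-up wording, not a real site's tags – the head of such a page might look like this:

```html
<head>
  <!-- Location appears in the title and both metatags, matching the body text -->
  <title>Car Insurance London - Affordable Cover for London Drivers</title>
  <meta name="description" content="Affordable car insurance quotes for drivers in London." />
  <meta name="keywords" content="car insurance London, London car insurance" />
</head>
```

The same "car insurance London" phrase should then also appear naturally in the page's body copy.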
3. Use your location in your body text
Keywords in the body text count a lot and you can't afford to skip them. If your web copy is optimized for “car insurance” only, this won't help you rank well with “car insurance London”, so make sure that your location is part of your keywords.
4. Take advantage of Google Places and Yahoo Local
Google Places and Yahoo Local are great places to submit to because they will include you in their listings for a particular location.
5. Create backlinks with your location as anchor text
It could be a bit tricky to get organic backlinks with your location as anchor text because some keywords with location don't sound very natural – for instance, “car insurance London” isn't grammatically correct and you will hardly get an organic inline link with it but you can use it in the Name field to comment on blogs. If the blog is dofollow, you will still get a backlink with anchor text that helps for SEO.
6. Get included in local search engines
Global search engines, such as Google, Bing, or Yahoo, can bring you lots of traffic, but depending on your location, local search engines might be the real gold mine. A local search engine could mean a search engine for your area (though regional search engines are not very common) or, more likely, for your country. For instance, Baidu is a great option if you are selling on the Chinese market.
7. Get listed in local directories
In addition to local search engines, you need to try your luck with local directories, too. You might think that nobody reads directory listings but this isn't exactly so. For instance, Yellow Pages are one of the first places where people look when searching for a local vendor for a particular product.
8. Run locally-targeted ad campaigns
One of the most efficient ways to drive targeted, local traffic to your site is with the help of locally-targeted ad campaigns. PPC ads and classifieds are the two options that work best – at least for most webmasters.
9. Do occasional checks of your keywords
Occasionally checking the current search volume of your keywords is a good idea because shifts in search volumes are quite typical. Needless to say, if people no longer search for "car insurance London" because they have started using other search phrases, and you continue to optimize for "car insurance London", this is a waste of time and money. Also, keep an eye on the keywords your competitors use – this will give you a clue which keywords work and which don't.
10. Use social media
Social media can drive more traffic to a site than search engines and for local search this is also true. Facebook, Twitter, and the other social networking sites have a great sales potential because you can promote your business for free and reach exactly the people you need. Local groups on social sites are especially valuable because the participants there are mainly from the region you are interested in.
11. Ask for reviews and testimonials
Client reviews and testimonials are a classical business instrument and these are like letters of recommendation for your business. However, as far as SEO is concerned, they could have another role. There are review sites, where you can publish such reviews and testimonials (or ask your clients to do it) and this will drive business to you. Some of these sites are Yelp and Merchant Circle but it is quite probable that there are regional or national review sites you can also post at.
12. Create separate pages for your different locations
When you have business in several locations, the task becomes a bit more difficult because you can't possibly optimize for all of them – you can't have a keyphrase such as "car insurance London, Berlin, Paris, New York". In this case the solution is to create separate pages for your different locations. If your locations span the globe, you can also create different sites on country-specific domains (e.g. .co.uk for the UK, .de for Germany, etc.), but this is only reasonable if your business is truly multinational. Otherwise, a separate page for each of your locations will do.
These simple tips on how to optimize your site for local searches are a must if you rely on the local market. Maybe you are already doing some of them and you know what works for you and what doesn't. Anyway, if you haven't tried them all, try them now and see whether they have a positive impact on your rankings (and your business).


How to Optimize your Website for Mobile Search
It is not only web designers and developers who need to adapt to the mobile Web – SEOs also need to make changes to their strategies and tactics if they want to capture the lucrative mobile search market. Mobile search is a constantly growing segment of the market, which is good news. However, mobile search has its own rules, and they are somewhat different from the rules of traditional desktop search. This is why, if you don't want to miss mobile searchers, you need to adapt to their requirements. Here are some very important rules to consider when optimizing for mobile search:
1. Mobile Searchers Use Shorter Keyphrases/Keywords
Mobile users search for shorter keyphrases, or even just single keywords. Even mobile devices with QWERTY keyboards are awkward for typing long texts, which is why mobile searchers keep their queries very brief. Very often the search query is limited to only two words, or even one. As a result, if you don't rank well for shorter keyphrases (which, unfortunately, are also more competitive), you will miss a lot of mobile traffic.
2. Mobile Search Is Mainly Local Search
Mobile users search mostly for local stuff. In addition to shorter search keyphrases, mobile searchers are also locally targeted. It is easy to understand - when a user is standing in the street and is looking for a place to dine, he or she is most likely looking for things in the neighborhood, not in another corner of the world. Searches like “pizza 5th Avenue” are quite popular, which makes local search results even more important to concentrate on.
3. Current Data Rules in Mobile Search
Sports results, news, weather, financial information are among the most popular mobile search categories. The main topics and niches mobile users prefer are kind of limited but again, they revolve around places to eat or shop in the area, sports results, news, weather conditions, market information, and other similar topics where timing and location are key. If your site is in one of these niches, then you really need to optimize it because if your site is not mobile-friendly chances are you are losing visitors. You could even consider having two separate versions of your site – one for desktop searchers and one for mobile searchers.
4. In Mobile Search, Top 10 Is Actually Top 3
Users hate to scroll down long search pages or hit Next, Next, Next. Desktop searchers aren't fond of scrolling endless pages either but in mobile search the limitations are even more severe. A page with 10 search results fits on the screen of a desktop but on a mobile device it might be split into 2 or more screens. Therefore, in mobile search, it is not Top 10, it is more Top 4, or even Top 3 because only the first 3 or 4 positions are on the first page and have a higher chance to attract the user's attention without having to go to the next page.
5. Promote Your Mobile-Friendly Site
Submit your site to major mobile search engines, mobile portals, and directories. It is great if your visitors come from Google and the other major search engines but if you want to get even more traffic, mobile search engines, mobile portals, and directories are even better. For now these mobile resources work great to bring mobile traffic, so don't neglect them. Very often a mobile user doesn't search with Google, but goes to a portal he or she knows. If your site is listed with this portal, the user will come directly to you from there, not from a search engine. The case with directories is similar – i.e. if you are optimizing the site of a pizza restaurant, then you should submit it to all directories where pizza restaurants and restaurants in general for your location are listed.
6. Follow Mobile Standards
Mobile search standards are somewhat different, and if you want your site to be spiderable, you need to comply with them. Check the W3C guidelines to see what the mobile standards are. Even if your site doesn't comply with mobile standards, it will still be listed in search results, but it will be transcoded by the search engine, and the result can be pretty shocking. Transcoders convert sites to a mobile format, but this is not done in a sophisticated manner, and the output can be really unbelievable – and anything but mobile-friendly.
7. Don't Forget Meta.txt
Meta.txt is a special file where you briefly describe the contents of your site and point the user agent to the most appropriate version for it. Search engine spiders index the meta.txt file directly (provided it is located in the root directory), so even if the rest of your site is not accessible, you will still be included in search results. Meta.txt is similar to robots.txt in desktop search, but it also has some similarity with metatags because you can put content in it (as you do with the Description and Keywords metatags). The format of the meta.txt file is colon-delimited, as is the format of robots.txt: each field has the form <fieldname>:<value>. One of the advantages of meta.txt is that it is easily parsed by both humans and search engines.
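As a sketch of that colon-delimited <fieldname>:<value> format – the field names and values below are illustrative guesses for the example, not an authoritative specification – a meta.txt file might look like this:

```
# meta.txt (site root) – illustrative sketch of the <fieldname>:<value> format
Title: Pizza Express Mobile
Description: Menu and ordering for our pizza restaurant, optimized for mobile devices
Keywords: pizza, mobile ordering, restaurant
```

Check the current guidelines of the search engines you target before relying on any particular field name.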
8. No Long Pages for Mobile Searchers
Use shorter texts because mobile users don't have the time to read lengthy pages. We already mentioned that mobile searchers don't like lengthy keyphrases. Well, they like lengthy pages even less! This is why, if you can make a special, shorter mobile version of your site, this would be great. Short pages don't mean that you should skip your keywords, though. Keywords are really vital for mobile search, so don't exclude them but don't keyword stuff, either.
9. Predictive Search Is Popular With Mobile Searchers
Use phrases, which are common in predictive search. Predictive search is also popular with mobile searchers because it saves typing effort. This is why, if your keywords are among the common predictive search results, this seriously increases your chances to be found. It is true that predictive search keywords change from time to time and you can't always follow them but you should at least give it a try.
10. Preview Your Site on Mobile Devices
Always check how your site looks on a mobile device. With the plethora of devices and screen sizes, it is not possible to check your site on absolutely every device you can think of, but checking it on at least a couple of the most important ones is far better than nothing. Even if you manage to get visitors from mobile search engines, if your site is distorted on a mobile screen, those visitors will run away. Transcoding is one reason a site gets distorted, so it is really a good idea to make your site mobile-friendly instead of relying on search engines to transcode it and turn it into a design nightmare in the process.
Mobile search is relatively new but it is a safe bet that it will get a huge boost in the near future. If you are uncertain whether your particular site deserves to be optimized for mobile devices or not, use AdWords Keyword Research Tool to track mobile volumes for your particular keywords. If the volumes are high, or if a particular keyword is doing remarkably well in the mobile search segment, invest more time and effort to optimize for it.

How to Pick an SEO Friendly Designer
A Web designer is one of the people without whom it is not possible to create a site. However, where SEO is concerned, Web designers can be really painful to deal with. While there are many Web designers who are SEO-proficient, it is still not uncommon to stumble upon design geniuses who focus only on the graphic aspect of the site. For them, SEO is none of their business, and they couldn't care less about something as unimportant as good rankings with search engines. Needless to say, if you hire such a designer, don't expect your site to rank well with search engines.
If you do SEO on your own, you might not care a lot about the SEO skills of your Web designer, but there are still design issues, as we'll see next, that can affect your rankings very badly. If the designer builds the site against SEO rules, it is not possible to fix this later with SEO tricks.
When we say that you need to hire an SEO-friendly designer, we presume that you are an SEO pro and know SEO; if you aren't, have a look at the SEO Tutorial and the SEO Checklist. If you have no idea about SEO, you will hardly be able to select an SEO-friendly designer because you won't know what to look for.
One of the ultimate tests of whether a designer is SEO-friendly is to look at his or her past sites: are they done professionally, especially in the SEO department? If their past sites don't exhibit blatant SEO mistakes, such as the ones we'll list in a second, and they rank well, this is a recommendation that the person is worth hiring. After you look at past sites, ask the designer whether he or she did the SEO for them, because in some cases it might be the client who did a lot to optimize the site, and that is why it ranks well.
Here is a checklist of common web design sins that will make your site an SEO disaster. If you notice any of the following in the past sites your would-be designer has created, just move on to the next designer. These SEO-unfriendly design elements are absolute sins, and unless the client made them do it, no designer who uses the techniques below deserves your attention:
1. Rely heavily on Flash
Many designers still believe that Flash is the best thing since sliced bread. While Flash can be very artistic and make a site look cool (and load forever in the browser), heavily Flash-ed sites are a disaster in terms of SEO. Simple HTML sites rank better with search engines, and as we point out in Optimizing Flash Sites, if the use of Flash is a must, then an HTML version of the same page is more than mandatory.
2. No internal links, or very few links
Internal links count as backlinks too, and they are very important. Of course, this doesn't mean that all the text on a page must be hyperlinked to all the other pages on the site, but if there are only a couple of internal links per page, this is a missed chance to get backlinks.
3. Images, not text for anchors
This is another frequent mistake many designers make. Anchor text is vital in SEO and when your links lack anchor text, this is bad. It is true that for menu items and other page elements, it is much easier to use an image than text because with text you can never be sure it will display correctly on users' screens, but since this is impacting your site's rankings in a negative way, you should sacrifice beauty for functionality.
4. Messy code and tons of code
If you have no idea about HTML, then it might be impossible for you to judge if a site's code is messy and if the amount of code is excessive but cleanness of code is an important criterion for SEO. When the code is messy, it might not be spiderable at all and this can literally exclude your site from search engines because they won't be able to index it.
5. Excessive use of (SEO non-friendly) JavaScript
Similarly to Flash, search engines don't love JavaScript, especially tons of it. Actually, the worst with JavaScript is that if not coded properly, it is quite possible that because of the use of JavaScript your pages (or parts of them) are not spiderable, which automatically means that they won't be indexed.
6. Overoptimized sites
Overoptimized sites aren't better than under-optimized ones. In fact, they can be much worse, because when you stuff keywords and use other techniques (even ones that are not Black Hat SEO) to artificially inflate the rankings of the site, you can get banned from search engines – and that is the worst that can happen to a site.
7. Dynamic and other SEO non-friendly URLs
Well, maybe dynamic URLs are not exactly a design issue, but if you are getting a turnkey site – i.e. it is not up to you to upload and configure it or to create the internal links – then dynamic URLs are bad, and you have to ask the designer/developer not to use them. You can rewrite dynamic and other SEO-unfriendly URLs on your own, but that effectively means making dramatic changes to the site, which is hardly the point of hiring a designer.
These points are very important, and this is why you need to follow them when choosing an SEO-friendly designer. Some of the items on the list are so bad for SEO (e.g. Flash, JavaScript) that even if the site is a design masterpiece and you promote it heavily, you will still be unable to get decent rankings. SEO-friendliness of design is a necessity, not a whim, and you shouldn't settle for an SEO-unfriendly design – this can be really expensive!

How to Boost your SEO with Google Adwords
Many advertisers use Google AdWords as their major PPC network. However, in addition to using AdWords to get paid traffic to your site, it can also be used for SEO. Here are some ideas on how you can use AdWords for SEO.
1. For Keyword Research
The most valuable use of AdWords for SEO is to research keywords. Keywords are the basis of any SEO campaign and even if you are an expert in your niche, you should always research keywords simply because users frequently search for quite unexpected keywords and keyphrases you as an expert will never think of. Needless to say, what matters most for high rankings is which keywords your target audience is searching for, not which keywords you as an expert think are most popular in a particular niche.
In order to find what users are searching for, you need a keyword research tool. It is true that there are many special (free and paid) keyword research tools but Google AdWords Keyword Tool is light years ahead of them all.
It is simple to use AdWords to research keywords. You can either enter the URL of your site or put in some seed keywords, and the tool will automatically generate a whole bunch of suggested keywords. Look at the results and shortlist all the keywords that seem relevant and have a decent global search volume.
You may want to rank well for ALL the generated keywords, but it's best to focus your efforts on a selected few. The idea now is to find keywords that are relatively easy to optimize for and yet have a decent search volume. These would be the keywords with the least competition in Google. Go to Google.com and enter each of your shortlisted keywords (one at a time). It is best if you search for the exact phrase, so surround your keyword with double quotes. Note how many web results there are for each of the phrases. Now that you have collected the 'Number of web results' for each keyword, calculate the competition ratio by dividing its 'Global search volume' by the 'Number of web results'. The keywords with the higher ratios are the easier ones to optimize for.
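The competition-ratio arithmetic described above can be sketched in a few lines; the search volumes and result counts below are invented example figures, not real data:

```python
# Toy sketch of the competition-ratio calculation.
# keyword: (global_search_volume, number_of_web_results) - made-up numbers.
keywords = {
    "seo tips": (40500, 2_500_000),
    "seo guide": (27100, 1_800_000),
    "search engine optimization": (1_500_000, 300_000_000),
}

def competition_ratio(volume, results):
    """Higher ratio = more searches per competing page = easier target."""
    return volume / results

# Sort keywords from easiest (highest ratio) to hardest (lowest ratio).
ranked = sorted(keywords, key=lambda k: competition_ratio(*keywords[k]), reverse=True)
for kw in ranked:
    print(kw, round(competition_ratio(*keywords[kw]), 4))
```

With these sample numbers, "seo tips" comes out as the easiest target even though "search engine optimization" has by far the largest search volume.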
You can now start an SEO campaign for your keywords. However, as you'll see next, it might be much wiser to start an AdWords campaign instead.
2. To Ensure that the Keywords You Have Picked Convert Well

After you have picked your keywords, you need to verify if these keywords really work for you – i.e. if they convert properly. No matter how precise you've been when picking your keywords, if you don't test them in practice, you can never know for sure whether they work well. You can pick lucrative keywords with high global search volume and low levels of competition and still end up nowhere.
For instance, for this website - webconfs.com - we could try optimizing for the keyword "Search Engine Optimization". It could take a year or so, with a LOT of effort, to reach the first page on Google for "Search Engine Optimization", and still one can never be sure this will happen.
However, let's pretend that this happens – we manage to top Google for "Search Engine Optimization" after a year of hard SEO work. To our greatest disappointment, even the first place for "Search Engine Optimization" on Google didn't bring the expected results, because the bounce rate for this particular keyword turned out to be very high. Since we do not provide SEO services, a lot of people reaching us via "Search Engine Optimization" may NOT be getting what they're looking for. Instead, less popular keywords, such as "SEO Tips" or "SEO Guide", might have lower bounce rates and may actually perform better than "Search Engine Optimization" did for us.
The result is not surprising but the price paid is. If we had launched an AdWords campaign, it would have saved a lot of trouble. We could have spent $20-50 on AdWords for "Search Engine Optimization" and it would have taken us a week or less to figure out that the bounce rate for this keyword is very high and that it makes no sense to do organic SEO for it. Those $20-50 on AdWords would have spared us a year of wasted SEO effort.


3. For Getting a Better CTR with Your Existing Rankings
In addition to keyword research, AdWords is a valuable tool for getting a better CTR (Click Thru Rate) with your existing rankings. You might rank well for a given keyword, get a lot of traffic, and still be unable to monetize this traffic because your CTR is low. The reasons for this might be various but inadequate title and description could be a very possible reason.
AdWords can also help you get a better CTR with your existing rankings. For instance, if you run an AdWords campaign and you are satisfied with the conversion/performance, you might want to keep changing your ad title and description until you feel you have reached the maximum CTR for your keywords.
Sure, it might take you a couple of tries till you find the winning combination of a title and a description and you might even lower your CTR in the process but once you find this magical combination of a title and description, just copy them as the title and description for your page in order to maximize your organic search CTR as well.

4. For Geographic Targeting
One more good use of AdWords for SEO is geotargeting. If you bid on traffic from many geographic locations, you can use Google Analytics to compare how different locations convert. It is quite natural to have significant discrepancies in the conversions for the same keyword among the countries.
When you go to Google Analytics and see which countries are converting best, you can invest more effort in them. For instance, you can create local pages for these countries or target the geo-specific keywords with exceptionally good conversion rates.
AdWords is a really valuable tool, and not only for advertisers. It started as a tool for advertisers, but its use is not restricted to them alone. For many publishers and SEO experts AdWords is the most valuable tool of all, because even a moderate AdWords campaign can give you valuable insights and save you the time and money you would otherwise spend optimizing for keywords that don't work for you.

YouTube Traffic
YouTube is one of the most popular sites and, in addition to all the fun there, YouTube offers many opportunities for promotion and getting traffic to your site. Similarly to Facebook and Twitter, in order to use YouTube successfully for promotion and getting traffic, you need to know the rules. Here are some tips on how to promote yourself, your site, and your products and how to get free traffic from YouTube:
1. Post viral videos
There are millions of videos on YouTube. If you post a video nobody is interested in, it will go unnoticed, like millions of other videos. The key to getting traffic from YouTube is to post useful videos, or even better – viral videos. Viral videos are not only useful; they also tend to appeal to large groups of people. If your video goes viral, people will promote it for you and the only thing left for you to do is reap the benefits.
2. Create an interesting profile
Similarly to Facebook, Twitter, or any other social networking site, an interesting profile is a must. If people like your videos, they will check your profile to learn more about you. When they see that your profile is boring, they won't bother more with you. You can make your profile a bit informal but don't make it as if it were the profile of a crazy teenager – you are using YouTube for business, right?
3. Include your logo and website in the video
Your logo and your website URL are your major branding weapons. This is why you must include them in the video. You can include them in the beginning of the video or at the end. It is best to have your logo and URL throughout the whole video because this way you will be gaining lots of exposure but if you can't do it (for instance because of artistic considerations), the beginning and the end of the video will suffice.
4. Post quality videos
As already mentioned, there is no shortage of videos on YouTube. Unfortunately, this also means there is no shortage of videos with poor quality. These videos are not favored by viewers, so if you want viewers to watch your videos, make sure that your videos don't have crappy sound and/or blurred pictures. YouTube is not a board for professional videographers, so you can post amateur videos, but make sure their quality is decent.
5. Promote your videos
If your videos go viral, you are lucky, but you can't count on this. In order to get YouTube traffic, your videos need viewers. You can't rely solely on viewers finding your videos – you need to promote them. Even viral videos will benefit from promotion by you.
6. Make your videos search-friendly
One of the ways viewers find your videos is through search – both locally on YouTube and on search engines. This is why you need to make your videos search-friendly. To do this, include your major keywords in the title and in the descriptions. Also, pay special attention to the tags. List as many keywords as relevant in the tags, but beware that you don't get spammy.
7. Post in series
Standalone videos can become a hit but it is best if you create series of videos and post them once a day/week. This way viewers will know that there will be more and they will be coming to check. Even if you don't create series, at least try to post videos regularly – this builds audience loyalty.
8. Post video responses
Video responses are one of the unique things about YouTube and you should take full advantage of it. Search your niche, choose the most popular videos in your niche and post video responses to them. Just be careful that the response you post is related to the video you are responding to and don't make your video response a blatant self-promotion.
9. Choose the right time to post your videos
On YouTube, timing is very important because there are peaks in traffic and times when there are not so many viewers. Weekdays (especially Wednesdays and above all - Thursdays) morning or early afternoon US time is the best time to post a general interest video. In order to have your video uploaded in the prime time, you need to plan a bit. Have in mind that for large videos and/or slow Internet connections the upload could take you an hour, so start early.
10. Keep your videos short
YouTube doesn't impose limits on the length of the videos it publishes, but generally long videos are boring. 3 to 5 minutes is the best duration for a video, though if required you could go from 1 to 6 minutes. When a video is longer than 6 or 7 minutes, it gets boring and not many people will watch it to the end (where your logo and URL are to be found). 3 to 5 minutes is enough to lay out your idea, give some details AND tell viewers to visit your site for more.
11. Comment on other people's videos and include a link to your site in your comment
In addition to video responses, you can also use plain good comments. Again, search for popular videos in your niche and comment on them. If your comments are liked by viewers, they will check your profile and probably watch your videos.
YouTube is a valuable resource to drive traffic to your site and to promote it. The competition there might be fierce, but there is always room for a couple of good videos. Fill this room before your competitors do!
Make money from your website
For millions of people web sites are the major source of income. Thanks to the various ways to monetize a site, it is possible to make a living as a web master. However, this doesn't mean that every web master is a millionaire. Well, if you know how to monetize your web site, you might not get rich but at least chances are that under the worst possible scenario, you will be able to cover your hosting fees and the other expenses you make for your site.
Currently the most profitable ways to monetize your site are CPM ads, CPC ads, CPA ads, direct sales of ad space, and selling merchandise/goods/services. These ways are described in this article.
CPM Ads Can Bring In Cents
CPM ads (an abbreviation of Cost Per Mille, i.e. cost per thousand impressions) are the oldest type of site monetization. You put banners on your site and advertisers pay you based on the number of unique impressions/page views your site has.
CPM ads are the easiest form of site monetization because they require almost no effort on your side, but they are also the least feasible because as a rule (unless you have really high traffic and your audience is well-targeted), the rates are low. Some CPM ad networks pay as low as $.05 per 1,000 uniques, which means that you need millions of visitors in order to make $100-200 a month.
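The arithmetic behind these figures is straightforward; this small sketch just illustrates why low CPM rates require huge traffic:

```python
# Back-of-the-envelope CPM arithmetic.
def cpm_earnings(impressions, cpm_rate):
    """CPM rate is dollars per 1,000 impressions."""
    return impressions / 1000 * cpm_rate

# At $0.05 per 1,000 uniques, even 3 million visitors earn only $150.
print(cpm_earnings(3_000_000, 0.05))  # 150.0
# At a premium $2 CPM, the same traffic would earn $6,000.
print(cpm_earnings(3_000_000, 2.0))   # 6000.0
```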
Popups and layer ads pay a bit more ($.5-2 per 1,000 uniques/pageviews). They could bring you a few dollars a month but many visitors find them especially irritating and this is why many web masters are not willing to put popups and/or layer ads on their sites.
There are high-paying CPM ads – for instance ads paying $2-5 per 1,000 uniques but usually the networks that run them have very high traffic and quality requirements for the sites that are eligible. As a result, it is very hard to get into these networks and take advantage of these well-paid CPM ads.
There are many CPM networks we can recommend. For sites with lots of traffic the choice is better, and some of the good options are Advertising.com, TribalFusion, CasaleMedia, and ValueClick, because their CPM rates are good. AdBrite, AdToll, Right Media, and BurstMedia are also good. Some of these networks are CPM-only, while others have other types of ads as well. Google also has CPM ads, so you may want to try them.
CPC Ads Are a Profitable Way to Monetize a Site
CPC ads (an abbreviation of Cost Per Click) are different from CPM in that you get paid not when visitors view ads but when they click on them. The good news is that CPC rates are much, much better and, as a result, it is possible to make a decent income even with a small site without much traffic. The key to CPC success is to have a well-targeted site in a niche where there are a lot of advertisers.
The most popular CPC network is Google Adsense and even though there are other CPC networks, the income you can make from Adsense is much higher. The reason is that Google Adsense has many advertisers and if your site is in a profitable niche, CPC can be the best way to monetize your site. Additionally, unlike some of the other CPC ad networks, Google Adsense is open to publishers from all over the world.
Usually CPC ads are text ads and you publish them in blocks. However, there are also intext ads, where the keyword is underlined and when the visitor puts the mouse on it, a tooltip with CPC ads appears. Intext ads are less obtrusive but it takes much more effort for the user to notice them (and above all – to click them), which means that your chances to make money are further decreased. Kontera is one of the most popular networks for intext CPC ads.
The list of good CPC ad networks is not as long as the list of CPM or CPA ad networks but still there is a choice. For instance, Google Adsense, Yahoo! Publisher Network (YPN), BidVertiser, Chitika, and Clicksor are generally considered top choices but since many CPM ad networks pay decent rates for clicks on their CPM ads, you might want to try them as well.
CPA Ads Could Make You Rich
Currently CPA (an abbreviation of Cost Per Action) is the most profitable way to monetize a site in a profitable niche. CPA, also known as "affiliate programs", pays you a commission when your visitors perform an action. Most often this action is a purchase, but it could also be something else – e.g. downloading a free trial or signing up for the advertiser's service.
Affiliate programs can make you rich because there are many products with really fat commissions. For niches such as health, finance, travel, etc. affiliate programs are a real gold mine.
However, affiliate programs require a lot of effort on your side, and still there is no guarantee that the offers you pick to promote will convert well and make you money. Sometimes even the highest converting offers won't convert on your site, and the only thing you can do is replace them with other offers, hoping that they will convert better.
There might be hundreds of CPM ad networks, but for CPA there are thousands. It is practically impossible to try all of them personally, and this is why we would like to recommend some of the best to start with. Amazon, eBay, ShareASale, Commission Junction, Clickbank, Max Bounty, Azoogle, Never Blue Ads, LinkShare, and PepperJam are just a few of the great CPA networks we can recommend.
Direct Sales of Ads
If you are not happy with the CPM rates of ad networks, or CPC and CPA don't convert well with your site, you could try to find direct advertisers. However, have in mind that such an endeavor is not necessarily bound to succeed and if your site doesn't have audience advertisers can't reach elsewhere, you will hardly be able to negotiate good prices. Still, for some sites direct sales of ad space are a viable alternative and this is why you could also try it.
Openads Ad Server and OIO Publisher Ad Platform are two of the sites where you could try your luck. AdBrite also allows you to price your ad space. In fact, almost any major network gives you the chance to put a widget on your site to invite advertisers directly to advertise on your site.
Sell Merchandise/Goods/Services
In addition to CPA ads, where you are selling other merchants' products, you can try selling your own merchandise or products/services with your brand. This technique works well mainly for popular sites with loyal audience and is hardly the easiest way to monetize a site. You could try to sell merchandise/goods/services as a supplementary service and if you see that this monetization technique works, you can expand the business. CafePress is one of the best places where you can sell merchandise with your logo and the greatest thing is that they print on demand, which means you don't have to keep your merchandise in stock.
The monetization techniques we described here can be combined. You can run CPM ads together with CPA or CPC. You can also combine multiple ads from the same type (i.e. CPM, CPC, or CPA) from different ad networks, provided that this doesn't violate the terms of these networks. There isn't a universal prescription about the best way to monetize a site. The basic rule is that you need to try and see what works for you. The fact that a given monetization technique works for somebody else doesn't mean that it will work for you, so you need to try and see for yourself.

Top 10 Costly Link Building Mistakes
Link building is one of the most important SEO activities but this certainly doesn't mean that you should build links at any price – literally and figuratively. Link building can be very expensive in terms of time and money. There are many costly link building mistakes and here are some of the most common:
1 Check if backlinks have a “nofollow” attribute
Link exchanges are still one of the white hat ways to build backlinks, but unfortunately there are many unscrupulous webmasters who will cheat you. One common scam is that after you pay somebody for a backlink, it suddenly disappears or acquires the "nofollow" attribute. That is why you should check from time to time that the link is still there and that it doesn't have the "nofollow" attribute.
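Checking for the "nofollow" attribute can be automated. Here is a minimal sketch using only Python's standard library, with a made-up page and URL; in practice you would first fetch the partner's page, e.g. with urllib:

```python
# Scan a page's HTML for your backlink and report whether it
# carries rel="nofollow". Page fetching is omitted for brevity.
from html.parser import HTMLParser

class BacklinkChecker(HTMLParser):
    def __init__(self, target_url):
        super().__init__()
        self.target_url = target_url
        self.found = False      # did the link appear at all?
        self.nofollow = False   # does it carry rel="nofollow"?

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if attrs.get("href") == self.target_url:
            self.found = True
            rel = (attrs.get("rel") or "").lower()
            self.nofollow = "nofollow" in rel.split()

# Made-up example page containing a nofollowed backlink.
page = '<p>Partners: <a href="http://example.com" rel="nofollow">Example</a></p>'
checker = BacklinkChecker("http://example.com")
checker.feed(page)
print(checker.found, checker.nofollow)  # True True
```

Run periodically against the pages that are supposed to link to you, this catches both vanished links and silently added "nofollow" attributes.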
2 Getting good quality links but with useless anchor texts
It is great when the PR of the site you are getting links from is high, but when the anchor text is "Click here!" or something like that, such a link is barely useful. Keywords in the anchor text are vital, so if the backlink doesn't have them, it isn't a valuable one. Analyzing the anchor texts of links takes time, but the Backlink Anchor Text Analyzer tool can do the hard job for you.
3 Getting an image link (when a text link with keyword is possible)
Sometimes when web masters hurry to get backlinks, they skip minor details, such as anchor text. Yes, an image link is great and it could even bring you more visitors than a text link (if the image is attractive, of course and users click it) but for SEO purposes nothing beats a keyword in the anchor text.
4 Not using ALT text if image link is the only possibility
Image links might be a worse option than text links, but if an image link is the only way to get a backlink, don't reject it. However, make sure that the ALT text of the image link contains your keywords – this is better than nothing.
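For illustration, an image backlink with keyword-rich ALT text might look like this (the URL, file name, and keywords are made-up examples):

```
<!-- Hypothetical image backlink; the ALT text carries the keywords -->
<a href="http://www.example.com/">
  <img src="banner.gif" alt="dog training tips and tutorials">
</a>
```

Without the alt attribute, the link passes no keyword signal at all, so always ask for it when negotiating an image link.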
5 Getting backlinks from irrelevant websites
Now, this mistake is really a popular one! When hunting for backlinks, you should concentrate on relevant sites only. If you have a dating site, getting links from a finance one is not valuable. It is true that it is not easy to find relevant sites to get links from but unless your site is in a very narrow niche, chances are that there are hundreds or even thousands of relevant sites you can get a backlink from. If you need a list of such sites for your niche, try the Backlink Builder and see what suggestions it can give you.
6 Getting backlinks from sites/pages with tons of links
A backlink is more valuable if it comes from a page that is not cluttered with tons of other backlinks. Many pages have 200 or more links, and if your link is one of them, this isn't a great achievement. On the other hand, many directories put the "nofollow" attribute on nonpaid links, so even if there are 200 links on a page and most of them are "nofollow" (but yours isn't), your link still counts.
7 Links from pages spiders can't crawl
A link might look perfectly legitimate (i.e. keywords in the anchor text and no "nofollow" attribute) and still not count as a link. This is especially an issue with link exchanges, because you put a link to the other site but the reciprocal link to you sits on a page search engines never see. Links Google can't index may be placed on dynamic pages or simply on pages that are not indexed by Google because robots.txt bans it. That is why it doesn't hurt to check from time to time whether the pages your links are placed on are accessible to spiders. The Search Engine Spider Simulator tool can help you do this in no time at all.
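A quick local check of robots.txt rules is also easy to script; this sketch uses Python's standard-library robots.txt parser with an invented set of rules:

```python
# Check whether pages are blocked from crawlers by robots.txt.
from urllib import robotparser

# Made-up robots.txt content; in practice you would fetch it
# from http://thepartnersite.com/robots.txt.
robots_txt = """
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A page in an allowed directory is crawlable...
print(rp.can_fetch("*", "http://example.com/links.html"))      # True
# ...while a page under /private/ is blocked for all spiders.
print(rp.can_fetch("*", "http://example.com/private/x.html"))  # False
```

If your backlink sits on a page that comes back False, it is effectively invisible to search engines, no matter how good it looks in a browser.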
8 Explicitly selling links
There is hardly a web master who hasn't heard that paid links can hurt your rankings, but still many web masters don't miss the chance to make a few bucks. If you really want to sell links, you'd better use specialized link selling services, such as Backlinks.com, because they are more discreet. However, have in mind that while some of the paid links networks try to hide the fact that the links are paid, the rest are not that discreet. Also, maybe the worst gaffe you can make is to include phrases on your website like "Buy 5 PR links for $10" or any other hint that you are selling links. You can include "Advertise here!" or similar messages and still de facto sell paid links, but this is not as explicit as listing your prices for links.
9 Linking to sites with poor reputation
Linking to sites with poor reputation, also known as “bad neighbors” is one of the worst mistakes you can make. When you link to such sites, for Google this means that you endorse them and this results in penalties for you. That is why you must absolutely always check the sites (and their reputation) first before you link to them. Even if you are offered a lot of money to link to a site with poor reputation, you'd better decline the offer because otherwise your rating with search engines will suffer and this will cause you a lot of problems.
10 Linking to good sites gone bad
Even if you check carefully the sites you link to, sometimes it happens that a site, which used to be more or less decent all of a sudden starts publishing porn ads or other objectionable content. That is why it doesn't hurt if you check not only that the outbound links you have are not broken but also where they lead to.
Links are very important and that is why you should pay attention to what links you are getting. It is not a waste of time to monitor what's going on with your links and in addition to the tools listed in the article, you can also try the Backlink Summary tool.
10 Ways to Get Traffic for Free
Getting traffic is one of the most important tasks for any web master. What is more, you can't get traffic once and then just reap the benefits. Getting traffic is an ongoing task and you must be constantly doing it, if you want to get traffic and keep it. Here are some of the ways to get traffic for free:
1 Optimize your site for search engines.
Search engines have always been a major way to get traffic for free. That is why you need to do your homework and optimize your site so that it ranks well for the keywords you target. SEO is still the most powerful way to get traffic for free and you really need to invest some time and efforts in the optimization of your site. SEO is not that difficult and if you want to get familiar with it in a nutshell, check our SEO Tutorial. If you are too busy for that, you can start with the 15 Minute SEO article.
2 Frequently update the contents of your site.
If you expected some shocking secrets revealed, you might be a bit disappointed. One of the first steps in getting traffic for free is trivial but vital – get great content and frequently update it. In terms of SEO, content is king. If your content is good and frequently updated you will not only build a loyal audience of recurring visitors, who will often come to see what is new, but search engines will also love your site.
3 Take advantage of social bookmarking sites.
Social bookmarking sites (especially the most popular among them) are another powerful way to get traffic for free. If you want to learn how to do it, check the How to get Traffic from Social Bookmarking sites article, where we have explained what to do if you want to get free traffic from sites such as Digg, Delicious, etc.
4 Use your Twitter and Facebook accounts.
Social networks are also a way to get traffic for free. If you are popular on networks such as Twitter or Facebook, the traffic you get from there can easily surpass the traffic from Google and the other search engines. It is true that building a large network of targeted followers on Twitter and supporters on Facebook takes a lot of time and effort, but generally the result is worth it.
5 Get links with other sites in your niche.
Another way to get traffic for free is from other sites in your niche. Getting links with other sites in your niche is also good for SEO, especially if you manage to get links without the famous nofollow attribute. But even if the links are nofollow (i.e. they are useless for SEO), they still help to get traffic to your site. If you manage to put your link in a visible place on a site with high volumes of traffic, you can get thousands of hits from this link alone. If you need a list of sites within your niche where you could get backlinks from, check the Backlink Builder tool. However, be careful if you exchange links, because linking to bad neighbors can do you a lot of harm.
6 Use any chance to promote your site for free.
Free promotion is always welcome, so don't neglect it. There are many ways to promote your site for free and some of the most popular ones include free classified ads, submissions to directories, inclusion in various listings, etc. It is true that not all free ways to promote your site work well but if you select the right places to promote your site for free, this can also result in tons of traffic.
7 Create a free product or service.
Content drives most traffic when you offer something useful. There are many types of useful content you can create and they largely depend on the niche of your site. You can have articles with tons of advice, or short tips but one of the most powerful ways to get traffic is to create a free product or service. When this product or service gets popular and people start visiting your site, chances are that they will visit the other sections of the site as well.
8 Use viral content.
Free products and services are great for getting free traffic to your site and one of the best varieties in this aspect is viral content. Viral content is called so because it distributes like a virus – i.e. when users like your content, they send it to their friends, post it on various sites, and promote it for free in many different ways. Viral content distributes on its own and your only task is to create it and submit it to a couple of popular sites. After that users pick it and distribute it for you. Viral content can be a hot video or a presentation but it can also be a good old article or an image.
9 Use offline promotion.
Offline promotion is frequently forgotten but it is also a way to get traffic for free. Yes, computers are everywhere and many people spend more time online than offline but still life hasn't moved completely on the Web. Offline promotion is also very powerful and if you know how to use it, this can also bring you many visitors. Some of the traditional offline ways to promote your site include printing its URL on your company's business cards and souvenirs or sticking it on your company vehicles. You can also start selling T-shirts and other merchandise with your logo and this way make your brand more popular.
10 Include your URL in your signature.
URLs in forum signatures are also a way to get traffic for free. There are forums which get millions of visitors a day, and if you are a popular user on such a forum, you can use this to get traffic to your site. When you post on forums and people like your posts, they tend to click the link to your site in your signature to learn more about you. In rare cases you might be able to post a deep link (i.e. a link to an internal page of the site) rather than a link to your homepage, and this is also a way to focus attention on a particular page. Unfortunately, deep links are rarely allowed.
Getting traffic for free is a vast topic and it is not possible to list all the ways to do it. However, if you know the most important ways – i.e. the ways we discussed in this article – and you apply them properly, you will be able to get lots of traffic for free.

How to get traffic from Facebook
Facebook is not the first social network but it is the most popular one. There have been many other social networks before Facebook and while some of them were popular at some point in time, none could reach the popularity of Facebook. In addition to keeping in touch with your friends, Facebook can be (and is) used for business. You can use it to promote your products and services, to acquire new clients, or to get traffic to your site.
Like Twitter, Facebook is just one of the many ways to get some traffic to your site. Many marketers believe that it is just a matter of time for the traffic from Facebook, Twitter and the other major social networking sites to surpass the traffic their sites get from Google.
While this time might come, don't take this as a promise that even if you do everything right, Facebook, Twitter, or any other similar site will do traffic miracles for you. For some people Facebook works like a charm, for others it doesn't work at all. The same applies to Twitter. You can't know in advance if Facebook and/or Twitter will crash your server with traffic. Just try both and see which one (if any) works for you.
Unlike Twitter, which is very simplistic, Facebook offers more possibilities. Yes, you might need more time in order to explore all the possibilities and take advantage of them but hopefully these efforts will have a great return in terms of traffic. Here are some tips that can help you turn Facebook into a traffic monster:
1 Your profile is your major weapon
As with Twitter and any other social network, if you don't make your profile interesting, you will hardly become popular. Give enough background information about yourself and don't forget to make your profile public, so that even people who don't know you might become interested when they encounter your profile and become your supporters.
2 Include information about your site on your Wall and in the photo gallery
Facebook gives you the opportunity to write a lot about yourself and your endeavors, as well as to include pictures, so use all these opportunities to build interest in you and your products. It is even better to post videos and fill in the other tabs, so if you have something meaningful to put there, just do it.
3. Build your network
As with other social networking sites, your network is your major capital. That is why you need to invite your friends, acquaintances, and partners and ask them to join as your supporters. You should also search for people with interests similar to yours. However, don't be pushy and don't spam, because this is not the way to convince people to join your network.
4. Post regularly
No matter how interesting the content in your Facebook profile is, if you don't publish new content regularly, the traffic to your Facebook profile (and respectively the Facebook traffic to your site) will slow down. Posting daily is ideal, but even if you can't manage that, try to post as frequently as you can – if nothing else, at least update your status regularly.
5. Be active
A great profile, an impressive network, and regular posting are just part of the recipe for success on Facebook. You also need to be active – visit the profiles of your supporters, take part in their groups and other initiatives, visit their sites. True, all this takes a lot of time and you might soon discover that Facebooking is a full-time occupation, but if you notice an increase in traffic to your site, then all this is worth it.
6. Arrange your page
Unlike other social networks, Facebook gives you more flexibility and you can move around many of the boxes. If you put the RSS feed with the links to your blog in a visible space, this alone can generate lots of traffic for you.
7. Check what Facebook apps are available
Facebook apps are numerous and new ones are released all the time. While many of these apps are not exactly what you need, there are apps that can work really well for you. For instance, the MarketPlace widget/plugin or the Blog Friends widget are very useful and you should take advantage of them. You can also use widgets for crossposting (i.e. posting directly to Twitter from Facebook) because this saves you time.
8. Use Facebook Social Ads
If you can't get traffic the natural way, you might consider using Facebook Social Ads. These are PPC ads, and starting a campaign is similar to starting an AdWords campaign.
9. Start a group
There are many groups on Facebook, but it is quite probable that there is a free niche for you. Start a group about something related to your business and invite people to join it. The advantage of this approach is that you get targeted users – i.e. people who are interested in you, your products, your ideas, etc.
10. Write your own Facebook extensions
While this step is certainly not for everybody, if you can write Facebook extensions, this is one more way to make your Facebook profile popular and get some traffic to your site.
11. Use separate profiles
Social networks do expose a lot of personal information, and you are not paranoid if you don't want so much publicity. Many people are rightfully worried about their privacy on social networking sites, which is why it is not uncommon to keep one personal profile for friends and a separate business profile for promoting a business. You can have a single profile for both purposes, but if you have privacy concerns, consider splitting it in two – better safe than sorry.
Facebook is changing all the time and no matter how hard you try to follow these changes, there will always be new possibilities for you to explore. That is why it is not possible to compile a complete list of all the tactics you can use to drive traffic from Facebook to your site. Anyway, if you try just the basics for Facebook success we listed here, chances are that you will see a considerable traffic increase.
How to get traffic from Twitter
Twitter is one of the latest and greatest Web 2.0 apps and it gets tons of traffic. However, from the point of view of an SEO expert, it is more important that Twitter can get you tons of traffic as well. So, if you still don't have a Twitter account, you'd better open one.
Twitter is simple to use and this is what made it so popular. Twitter is fashionable right now, so enjoy the moment. Even the creators of Twitter admit that as with MySpace and other Web 2.0 sensations, Twitter will inevitably go out of fashion some day, so hurry up and get some traffic for free now, when it is still all the rage of the season.
Twitter is simple to use, yet it is really powerful. You might need a couple of hours to get familiar with the basic functionality of Twitter and of some of the extras it has but you can harness its power, even if you don't know it very well.
Unlike most of the other places you can get traffic for free, Twitter is a microblogging platform, which means that there are restrictions on the number of characters in a message. Therefore, you need to be concise in your Tweets and use your space wisely. In addition to being concise, here are some more tips to help you get traffic from Twitter:
1. Make your Twitter profile interesting
Your profile and your username are the first two things your visitors will see when they go to your Twitter page. If your profile looks boring, people won't bother to read your tweets, not to mention visit the links you post in them. You can't write a very long bio, but you can enter a few words about yourself – i.e. your occupation, your interests, etc. You can also include a couple of keywords in your bio.
2. Pick a niche-targeted username
Your username is also very important. You need to pick a username that is targeted at your niche. For instance, if you are promoting your SEO services and want to drive traffic to your SEO site, you can choose something like SEOmaster, SEOguru, SEOservices, etc. Your username will show in searches other users make and this is why you must pay attention to what you choose.
3. Put your site/blog URL in your profile
According to some statistics, 80% of tweeters don't provide a URL in their bio! Well, maybe these people are not SEO experts or Internet marketers and don't need this traffic, but as an SEO expert you can't afford to miss it. So, don't forget to include your URL in your profile!
4. Send the link to your profile to your friends, coworkers, and acquaintances
Your friends, coworkers, and acquaintances will be your most loyal audience, so if they don't know about your Twitter page, make them aware of what they are missing. If you have their emails, or know their accounts on other networks, you can send a mass invite.
5. Search for Twitter users with similar interests
You might have millions of friends, but more followers are always welcome. That is why you can use the search functions on Twitter and find people with similar interests. Find as many as you can and invite them all. These people might not be as loyal as your friends, coworkers, and acquaintances but still you will get hits from them as well. Some Twitter users report that about 1-2% of their followers visit their site a day, which means that if you have 1,000 followers, you might expect to get at least 10 or 20 visits a day to your site. This response rate might seem low but there are ways to increase it.
6. Socialize on Twitter as much as you can
When you are active on Twitter – responding to your followers' posts and visiting their links – you seriously increase your chances of getting the same in return. In a word, actively follow those who follow you.
7. Tweet regularly
As with all other kinds of media, if you want to keep your audience, you need to feed it regularly. Writing a short tweet takes just seconds, but it is enough to keep your followers happy. It goes without saying that you should tweet about useful things, so if you don't have something meaningful to post about yourself or your sites, it is quite OK to post a link to an article, a video, a blog post, etc. that you found on the Net and liked.
8. Don't spam
You might feel that every single user on Twitter is interested in you and your blog/site, but this is not exactly so. You might be tempted to make as many users as possible aware of your Twitter page and your latest tweets, but you'd better refrain from doing this, unless you want to risk a ban.
9. Take advantage of Twitterfeed
Twitterfeed is one more useful service you can take advantage of in order to increase your reach. Go to twitterfeed.com and configure your feeds.
10. Make Twitter Search love you
Twitter has a great search function and its main advantage is that it offers real-time results. Google might be fast in indexing pages but its indexing is not real-time. Users are hungry for hot news and nothing beats a real-time search. Many bloggers report that they are getting more traffic from Twitter than from Google and partially this is due to the fact that their tweets are popular and users find them with ease.
11. Add Twitter gadgets to your site
There are tons of Twitter gadgets and new ones are being released every day. The cool thing about Twitter gadgets is that your blog visitors can become your Twitter followers. If your Twitter followers have many followers, chances are that some of these followers will notice you and will join your network. As we already mentioned, building a large and targeted network is key to getting more Twitter traffic to your site.
These are some of the main ways in which you can get traffic from Twitter. If you are creative and if you monitor what's going on on Twitter and what new Twitter gadgets are released, you will certainly find more ways to drive traffic from Twitter to your site.
HTML 5 and SEO
HTML 5 is still in the making, but for any SEO expert who tries to look ahead, some knowledge about HTML 5 and how it will impact SEO is valuable. It is true that the changes and the new concepts in HTML 5 will affect Web developers and designers much more than SEO experts, but it would be far from the truth to say that HTML 5 will not mean changes in organic SEO practice.
What's New in HTML 5?
HTML 5 follows the way the Net has evolved in the last years and includes many useful tags and elements. At first glance, it might look as if HTML 5 is going in the direction of a programming language (e.g. PHP), but actually this is not so – it is still a markup language for structuring and presenting content. The new tags and elements might make HTML 5 look more complex, but this is only at first glance.
HTML 5 is not very different from HTML 4. One of the basic ideas in the development of HTML 5 was to ensure backward compatibility and because of that HTML 5 is not a complete revamp of the HTML specification. So, if you had worries that you will have to start learning it from scratch, these worries are groundless.

How Will the Changes in HTML 5 Affect SEO?
As an SEO expert, you are most likely interested mainly in those changes in the HTML 5 specification that will affect your work. Here are some of them:
•    Improved page segmentation. Search engines are getting smarter and there are many reasons to believe that even now they are applying page segmentation. Basically, page segmentation means that a page is divided into several separate parts (i.e. main content, menus, headers, footers, links sections, etc.) and these parts are treated as separate entries. At present, there is no way for a Web master to tell search engines how to segment a page but this is bound to change in HTML 5.
•    A new <article> tag. The new <article> tag is probably the best addition from an SEO point of view. The <article> tag allows you to mark separate entries in an online publication, such as a blog or a magazine. It is expected that when articles are marked with the <article> tag, this will make the HTML code cleaner because it will reduce the need for <div> tags. Also, search engines will probably put more weight on the text inside the <article> tag as compared to the content in the other parts of the page.
•    A new <section> tag. The new <section> tag can be used to identify separate sections of a page, a chapter, or a book. The advantage is that each section can have its own HTML heading. As with the <article> tag, it can be presumed that search engines will pay more attention to the contents of separate sections. For instance, if the words of a search string are all found in one section, this implies higher relevance than when these words are scattered across the page or found in separate sections.
•    A new <header> tag. The new <header> tag (which is different from the head element) is a blessing for SEO experts because it gives a lot of flexibility. The <header> tag is very similar to the <H1> tag but the difference is that it can contain a lot of stuff, such as H1, H2, H3 elements, whole paragraphs of text, hard–coded links (and this is really precious for SEO), and any other kind of info you feel relevant to include.
•    A new <footer> tag. The <footer> tag might not be as useful as the <header> one, but it still allows you to include important information there and it can be used for SEO purposes as well. The <header> and <footer> tags can be used many times on one page – i.e. you can have a separate header/footer for each section – and this gives really a lot of flexibility.
•    A new <nav> tag. Navigation is one of the important factors for SEO and everything that eases navigation is welcome. The new <nav> tag can be used to identify a collection of links to other pages.
As you see, the new tags follow the common structure of a standard page, and each of the parts (i.e. header, footer, main section) has a separate tag. The tags we described here are just some (but certainly not all) of the new tags in HTML 5 that will affect SEO in some way. For instance, the <audio>, <video> and <dialog> tags are also part of the HTML 5 standard and they will allow content to be further separated into the adequate categories. There are many other tags, but they are of relatively lower importance and that is why they are not discussed.
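To make the structure above concrete, here is a minimal sketch of how a blog page could be marked up with the new HTML 5 tags (the page title, link targets, and text are invented for the example):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example SEO Blog</title>
</head>
<body>
  <header>
    <!-- Site-wide header: can hold headings, text, and hard-coded links -->
    <h1>Example SEO Blog</h1>
  </header>
  <nav>
    <!-- A collection of links to other pages -->
    <a href="/">Home</a>
    <a href="/archive">Archive</a>
    <a href="/about">About</a>
  </nav>
  <article>
    <!-- One separate entry in the publication -->
    <header><h2>HTML 5 and SEO</h2></header>
    <section>
      <h3>What's New</h3>
      <p>The new tags describe which part of the page is which.</p>
    </section>
    <footer><p>Posted in: SEO basics</p></footer>
  </article>
  <footer>
    <!-- Site-wide footer: copyright, secondary links, etc. -->
    <p>&copy; Example SEO Blog</p>
  </footer>
</body>
</html>
```

Note how <header> and <footer> appear both for the whole page and inside the <article> – each section can carry its own header and footer.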
For now HTML 5 is still far in the future. When more pages become HTML 5–compliant, search engines will pay more attention to HTML 5. Only then it will be possible to know how exactly search engines will treat HTML 5 pages. The mass adoption of HTML 5 won't happen soon and it is a safe bet to say that for now you can keep to HTML 4 and have no concerns. Additionally, it will take some time for browsers to adjust to HTML 5, which further delays the moment when HTML 5 will be everywhere.
However, once HTML 5 is accepted and put to use, it will be the dominating standard for years to come, and that is why you might want to keep an eye on what other web masters are doing, just to make sure that you will not miss the moment when HTML 5 becomes the de facto standard.
SEO Careers during a Recession
I don't know whether most people became SEO experts because they planned ahead and figured that SEO careers are relatively stable in the long run, especially when compared to other business areas, or whether their reasons for making a career in SEO were completely different. Either way, my feeling is that SEO experts are lucky now. Why? Because while the recession makes many industries wrench in pain, many SEO professionals are in top financial shape and full of optimism for the future.
It would be an exaggeration to say that the SEO industry doesn't feel the recession. This is not exactly so, but when compared to industries such as automobiles, newspapers, banking, real estate, etc., SEO looks like a coveted island of financial security. This doesn't mean that there is no drop in volumes or that everybody in SEO is working for top dollar, but as a whole the SEO industry and the separate individuals who make their living in SEO are much better off than many other employees and entrepreneurs.
What Can You Expect from Your SEO Career During a Recession?
The question about what realistic expectations are is fundamental. I bet there are people in SEO, who are not very happy with their current situation and blame the recession for that. Well, if most of your clients were from troubled industries (cars, real estate, financial services, etc.), then you do have a reason to complain. In such cases you should be happy if you can pay the bills. What you can do (if you haven't already done it) is to look for new customers from other industries.
Another factor that influences your expectations about your SEO career during the recession is your position on the career ladder. It makes a big difference whether you work for a company or are your own boss. Being an employee has always been the more vulnerable position, so if you expect job security, this is easier to achieve as an independent SEO contractor. Mass layoffs might not be common for SEO companies, but hired workers are never immune to them.
Additionally, your skill level also affects how your SEO career will be influenced by the recession. The recession is not the right time for novices to enter SEO. Many people from other industries rush to SEO as a life belt. When these people don't have the right skills and expertise but expect rivers of gold, this inevitably leads to disappointment.

What Makes SEO Careers Recession-Proof?
So, if you are a seasoned SEO practitioner and you don't dream of rivers of gold, you can feel safe with SEO because unlike careers in many other industries SEO careers are relatively recession-proof. Here are some of the reasons why SEO careers are recession-proof:
•    The SEO market is an established market. If you remember the previous recession from the beginning of the century, when the IT industry was among the most heavily stricken, you might be skeptical a bit that now it won't be the same story. No, it is not the same now. SEO is not a new service anymore and the SEO market itself is more established than it was a couple of years ago. This is what makes the present recession different from the previous one – the difference is fundamental and it can't be neglected.
•    SEO is one of the last expenses companies cut. SEO has already become a necessity for companies of any size. Unlike hardware, cars, not to mention entertainment and even business trips, SEO expenses are usually not that big, but they help a company stay afloat. That is why when a company decides to make cuts in the budget, SEO expenses are usually not among the things that get the largest cut (or any cut at all).
•    SEO has great ROI. The Return On Investment (ROI) for money spent on SEO is much higher than the ROI for other types of investments. SEO brings companies money and this is what makes it such a great investment. Stop SEO and the money stops coming as well.
•    Many clients start aggressive SEO campaigns in an attempt to get better results fast. During a recession SEO is even more important. That is why many clients decide that an aggressive SEO campaign will help them get more clients and as a result these clients double their pre-recession budgets.
•    SEO is cheaper than PPC. SEO is just one of the many ways for a site to get traffic. However, it is also one of the most effective ways to drive tons of traffic. For instance, if you consider PPC, the cost advantages of SEO are obvious. PPC is very expensive and as a rule, ranking high in organic search results even for competitive keywords is cheaper than PPC.
•    Cheaper than traditional promotion methods. Traditional promotion methods (i.e. offline marketing) are still an option, but their costs are higher than even PPC and the other forms of online promotion. Besides, many companies have given up offline marketing completely and have turned to SEO as their major way to promote their business and attract new clients.
•    SEO is a recurring expense. Many businesses build their business model around memberships and other forms of recurring payments. For you, memberships and other types of recurring payments are presold campaigns – i.e. more or less you know that if a client is happy with a campaign you did, he or she will return. Acquiring recurring clients is very beneficial because it costs you less than acquiring clients one by one.
The outlook for SEO careers during times of recession is pretty positive. As we already mentioned, it is possible to experience drops in volume or to see some of your clients go bankrupt, but as a whole SEO offers more stability than many other careers. If you manage to take advantage of the above-mentioned recession-proof specifics of SEO and you are a real professional, you won't feel the recession in all its bitterness.

Bing Optimization
Ever since Microsoft launched its Bing search engine, it has drawn a lot of interest (and speculation) from the SEO community. On one hand, this is quite logical because Bing is intended to be one more heavy-weight player and it is expected to cut some share from Google. On the other hand, this is hardly the first time a new heavy-weight player comes to the ring, so maybe the expectations that Bing will put an end to Google's monopoly are groundless. Still, Bing is quite different (in a positive way) from the other search engines and this is its major strength.
Is Bing Really That Different?
The first impression you get when you go to Bing.com is that it is different – the background makes it cute – but then, there have been many other search engines that used tons of graphical frills to disguise their irrelevant search algorithms. However, when you type a search term, the results you get are a pleasant surprise because they are relevant.
It is this relevance of search results that worries SEO experts. The results you get when you search with Bing are relevant, yet they are very different from Google's. Actually, no matter if you search with Google or with Bing (or if you go to Bingle, you can compare the result sets side by side), you get relevant results and the two sets are very different from one another.
One of the most important things SEO experts are curious to know about Bing is its algorithm. Obviously, Bing's algorithm is different from Google's because when the search term is the same but the set of results is different, a difference in the algorithm is the obvious answer. Actually, the question is exactly what is different between the two algorithms and if the difference is so drastic that it makes it mandatory to reoptimize a site for Bing.
What Do I Need to Do In Order to Optimize My Site for Bing?
Wait. This is the first thing you need to do. Right now it is too early to say what steps (if any) are required in order to optimize your site for Bing.
Additionally, no matter how promising Bing looks, it is still early to predict if it will become a real competitor to Google or if it will become one more failed attempt to dethrone Google. Let's see how users react – will they start Binging more or will they stick to Google. When it becomes clear that Bing will be able to make it, then it will make sense to optimize for it as well. So for now the best you can do is wait.
Which Factors Make a Site Rank Well With Bing?
As you probably guess, the exact algorithm of Bing is not publicly available and because of that there is a lot of speculation about what weighs more for Bing (in comparison to Google) and what weighs less. Many SEO experts test different search queries, analyze the results, and based on that try to figure out what of the known SEO tactics works with Bing. For instance, these tests are quite interesting.
Some SEO experts even think that Bing is actually Live Search in new clothes (i.e. user interface), while others say that there are noticeable differences between Live Search and Bing. But there is no doubt that for now Bing is a significant improvement over Live Search in terms of relevance of search results.
Bing is hardly the first time when there is no agreement in the SEO community about the intricacies of the algorithm but if we can summarize, here are some factors, which are (or at least are strongly believed to be) of importance when Bing optimization is concerned:
•    Backlinks are of less importance. If you compare the first 10 results in Bing and Google, it is noticeable that, all else being equal, the winners in Bing have fewer backlinks than the winners in Google. It is unclear if nofollow matters with Bing.
•    Inbound anchor text matters more. The quantity of quality inbound links might be of less importance for Bing but the anchor text certainly matters more. Actually, since anchor text is one of the measurements of the quality of inbound links, it isn't much different. Get quality anchor text and you will do well in both Bing and Google.
•    Link spamming won't do much for you on Bing. Since the quantity of backlinks (even if they are of supreme quality) seems to be of less importance to Bing, link spamming will be even less effective than with Google.
•    Onpage factors matter more than with Google. This is one of the most controversial points. Many SEO experts disagree but many also think that onpage factors matter more with Bing than with Google. Still, it has nothing to do with the 90s, when onpage factors were definitive.
•    Bing pays more attention to the authority of the site. If this is true, this is bad news for bloggers and small sites because it means that search results are distorted in favor of older sites and/or sites of authoritative organizations. Age of domain is also very important with Bing – even more than with Google.
•    PR matters less. When you perform a search for a competitive keyword and you see a couple of PR2 or even PR1 sites among the top 10 results, this might make you wonder. On Google this is hardly possible but on Bing it looks quite normal.
•    Fresh content matters less. Bing looks a bit conservative – or maybe it just can't index sites that quickly – but it seems that fresh content is not so vital as with Google. This is related to the age of domain specifics and as a result you will see ancient pages rank high (but these ancient pages are relevant to the search query).
•    Bing is more Flash-friendly. Optimizing a Flash site for Google is a bit of a SEO nightmare. It is too early to say but it looks like Bing is more Flash-friendly, which is good news to all sites where Flash is (still) heavily employed.
For now it is too early to say which factors are of primary importance with Bing. But the fact that its search results are relevant means that its algorithm is really precise. Well, maybe the relevant results in Bing are due to the fact that web masters were taken by surprise and haven't had the time to optimize for Bing. As a result, the content is authentic, with no SEO gimmicks and artificial pumping. We'll see if this stays so in the future, when web masters learn how to optimize for Bing as well!

Top 10 SEO Mistakes
1. Targeting the wrong keywords
This is a mistake many people make and, what is worse, even experienced SEO experts make it. People choose keywords that in their mind are descriptive of their website, but the average user just may not search for them. For instance, if you have a relationship site, you might discover that “relationship guide” does not work for you, even though it has the “relationship” keyword, while “dating advice” works like a charm. Choosing the right keywords can make or break your SEO campaign. Even if you are very resourceful, you can't think of all the great keywords on your own, but a good keyword suggestion tool, for instance the Website Keyword Suggestion tool, will help you find keywords that are good for your site.
2. Ignoring the Title tag
Leaving the <title> tag empty is also very common. This is one of the most important places to have a keyword, because not only does it help you in optimization but the text in your <title> tag shows in the search results as your page title.
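For example, for the hypothetical dating-advice site mentioned earlier, putting the main keyword in the title takes one line (the keyword phrase is invented for the illustration):

```html
<head>
  <!-- The text in <title> helps rankings AND is shown as the
       clickable headline of your page in the search results -->
  <title>Dating Advice - Tips for a Successful First Date</title>
</head>
```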
3. A Flash website without an HTML alternative
Flash might be attractive, but not to search engines and users. If you really insist that your site be Flash-based and you want search engines to love it, provide an HTML version. Here are some more tips for optimizing Flash sites. Search engines don't like Flash sites for a reason – a spider can't read Flash content and therefore can't index it.
4. JavaScript Menus
Using JavaScript for navigation is not bad as long as you understand that search engines do not read JavaScript and build your web pages accordingly. So if you have JavaScript menus you can't do without, you should consider building a sitemap (or putting the links in a noscript tag) so that all your links will be crawlable.
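A minimal sketch of the noscript approach (the menu items and file names are made up): the spider ignores the script block but can read and follow the plain links.

```html
<!-- JavaScript-driven menu: invisible to search engine spiders -->
<script type="text/javascript">
  document.write('<a href="products.html">Products<\/a> ');
  document.write('<a href="services.html">Services<\/a>');
</script>

<!-- Plain-HTML fallback that crawlers (and visitors with
     JavaScript turned off) can read and follow -->
<noscript>
  <a href="products.html">Products</a>
  <a href="services.html">Services</a>
</noscript>
```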
5. Lack of consistency and maintenance
Our friend Rob from Blackwood Productions often encounters clients who believe that once you optimize a site, it is done forever. If you want to be successful, you need to permanently optimize your site and keep an eye on the competition and on changes in the ranking algorithms of search engines.
6. Concentrating too much on meta tags
A lot of people seem to think SEO is about getting your meta keywords and description correct! In fact, meta tags are becoming (if not already) a thing of the past. You can create your meta keywords and descriptions, but don't expect to rank well only because of this.
7. Using only images for headings
Many people think that an image looks better than text for headings and menus. Yes, an image can make your site look more distinctive, but in terms of SEO, images for headings and menus are a big mistake because h1, h2, etc. tags and menu links are important SEO items. If you are afraid that your h1, h2, etc. tags look horrible, try modifying them in a stylesheet or consider this approach: http://www.stopdesign.com/articles/replace_text.
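As a sketch of the stylesheet approach: keep a real h1 with the keyword in the markup and let CSS replace the text with your image (the class name and image file are invented for the example):

```html
<h1 class="site-heading">Dating Advice</h1>

<style type="text/css">
  /* Spiders index the h1 text, while visitors
     see the decorative background image instead */
  h1.site-heading {
    width: 300px;
    height: 80px;
    background: url("heading.gif") no-repeat;
    text-indent: -9999px; /* pushes the text off-screen */
  }
</style>
```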
8. Ignoring URLs
Many people underestimate how important a good URL is. Dynamic page names are still very frequent and no keywords in the URL is more a rule than an exception. Yes, it is possible to rank high even without keywords in the URL but all being equal, if you have keywords in the URL (the domain itself, or file names, which are part of the URL), this gives you additional advantage over your competitors. Keywords in URLs are more important for MSN and Yahoo! but even with Google their relative weight is high, so there is no excuse for having keywordless URLs.
9. Backlink spamming
It is a common delusion that more backlinks are ALWAYS better, and because of this web masters resort to link farms, forum/newsgroup spam, etc., which ultimately could lead to getting their site banned. In fact, what you need are quality backlinks. Here is some more information on the importance of backlinks.
10. Lack of keywords in the content
Once you have chosen your keywords, modify your content and put the keywords in wherever it makes sense. It is even better to make them bold or highlight them.
Choosing SEO as Your Career
It's always better to know in advance what you can expect from a career in SEO.
Some Good Reasons to Choose SEO as Your Career
1. High demand for SEO services
Once SEO was not a separate profession – Web masters performed some basic SEO for the sites they managed and that was all. But as sites began to grow and make money, it became more reasonable to hire a dedicated SEO specialist than to have the Web master do it. The demand for good SEO experts is high and is constantly on the rise.
2. A LOT of people have made a successful SEO career
There is plenty of living proof that SEO is a viable business. The list is too long to be quoted here, but some of the names include Rob from Blackwood Productions, Jill Whalen from High Rankings, Rand Fishkin from SEOmoz and many others.
3. Search Engine Optimizers make good money!
SEO is a profession that can be practiced while working for a company or as a solo practitioner. There are many job boards, such as Dice and Craigslist, that publish SEO job advertisements. It is worth noting that the compensation for SEO employees is equal to or even higher than that of developers, designers and marketers. Salaries over $80K per annum are not an exception for SEO jobs.
As a solo SEO practitioner you can make even more money. Almost all freelance sites have sections for SEO services, and offers for $50 an hour or more are quite common. If you are still not confident that you can work on your own, you can take an SEO job, learn a bit and then start your own company.
If you already feel confident that you know a lot about SEO, you can take this quiz and see how you score. Well, don't get depressed if you didn't pass – here is a great checklist that will teach you a lot, even if you are already familiar with SEO.
4. Web design alone may not be enough
Many companies offer turn-key solutions that include Web design, Web development AND SEO optimization. In fact, many clients expect that when they hire somebody to build their site, the site will be SEO friendly, so if you are good both as a designer and as an SEO expert, you will be a truly valuable professional.
On the other hand, many other companies deal with SEO only, because they feel that this way they can concentrate their efforts on their major strength – SEO, so you can consider this possibility as well.
5. A logical step ahead if you come from marketing or advertising
The Web has changed the way companies do business, so to some extent today's marketers and advertisers need to have at least some SEO knowledge if they want to be successful. SEO is also a great career for linguists.
6. Lots of learning
For somebody who comes from design, development or web administration, SEO might not look technical enough, and you might feel that moving to SEO is a downgrade. Don't worry – you can learn a LOT from SEO, so if you are a talented techie, you are not downgrading; you are actually upgrading your skill set.
7. SEO is already recognized as a career
Finally, if you need some more proof that SEO is a great career, have a look at the available SEO courses and exams for SEO practitioners. They might not be a Cisco certification, but they still help to institutionalize the SEO profession.
Some Ugly Aspects of SEO
1. Dependence on search engines
It is true that in any career there are many things outside of your control, but for SEO this is rule number one. Search engines frequently change their algorithms and, what is worse, these changes are not made public, so even the greatest SEO gurus admit that they make a lot of educated guesses about how things work. It is very discouraging to make everything perfect and then learn that, due to a change in the algorithm, your site has dropped 100 positions. But the worst part is that you need to communicate this to clients, who are not satisfied with their sinking ratings.
2. No fixed rules
Probably this will change over time, but for now the rule is that there are no rules – or at least no written ones. You can work very hard, follow everything that looks like a rule, and still success does not come. Currently you can't even take a search engine to court for the damage done to your business, because search engines are not obliged to rank highly the sites that have made efforts to get optimized.
3. Rapid changes in rankings
But even if you somehow manage to get to the top for a particular keyword, keeping the position requires constant efforts. Well, many other businesses are like that, so this is hardly a reason to complain – except when an angry customer starts shouting at you that this week their ratings are sinking and of course this is all your fault.
4. SEO requires patience
The SEO professional and customers both need to understand that SEO takes constant effort and time. It could take months to move ahead in the ratings, or to build tens of links. Additionally, if you stop optimizing for some time, most likely you will experience a considerable drop in ratings. You need lots of motivation and patience not to give up when things are not going your way.
5. Black hat SEO
Black hat SEO is probably one of the biggest concerns for the would-be SEO practitioner. Fraud and unfair competition are present in any industry, and those who are good and ethical suffer from them, but in SEO black hat practices are still pretty widespread. It is true that search engines penalize black hat practices, yet they remain a major concern for the industry.
So, let's hope that by telling you about the pros and cons of choosing SEO as your career we have helped you make an informed decision about your future.

How to get Traffic from Social Bookmarking sites
Sites like digg.com, reddit.com, stumbleupon.com, etc. can bring you a LOT of traffic. How about getting 20,000 or more visitors a day when your listing hits the front page?
Getting to the front page of these sites is not as difficult as it seems. I have been successful with digg and del.icio.us (and not so much with Reddit though the same steps should apply to it as well) multiple times and have thus compiled a list of steps that have helped me succeed:
1. Pay attention to your headlines
Many great articles go unnoticed on social bookmarking sites because their headline is not catchy enough. Your headline is the first (and very often the only) thing users will see from your article, so if you don't make the effort to provide a catchy headline, your chances of getting to the front page are small.
Here are some examples to start with:
Original headline: The Two Types of Cognition
Modified headline: Learn to Understand Your Own Intelligence
Original headline: Neat way to organize and find anything in your purse instantly!
Modified headline: How to Instantly Find Anything in Your Purse
Here is a good blog post that should help you with your headlines.
2. Write a meaningful, short description
The headline is very important to draw attention, but if you want to keep that attention, a meaningful description is vital. The description can be slightly provocative because this draws more attention, but never use lies and false facts to provoke interest. For instance, if you write “This article will reveal to you the 10 sure ways to deal with stress once and forever and live like a king from now on,” visitors will hardly think that your story is true and fact-based.
You also might be tempted to use a long tell-it-all paragraph to describe your great masterpiece but have in mind that many users will not bother to read anything over 100-150 characters. Additionally, some of the social bookmarking sites limit descriptions, so you'd better think in advance how to describe your article as briefly as possible.
3. Have a great first paragraph
This is a rule that is always true, but for successful social bookmarking it is even more important. If you have successfully passed Level 1 (headlines) and Level 2 (description) in the Catch the User's Attention game, don't let a bad first paragraph make them leave your site.
4. Content is king
However, the first paragraph is not everything. Going further along the chain of drawing (and retaining) users' attention, we reach the Content is King Level. If your articles are just trash, bookmarking them is useless. You might cheat users once but don't count on repetitive visits. What is more, you can get your site banned from social bookmarking sites, when you persistently post junk.
5. Make it easy for others to vote for / bookmark your site
It is best when other people, not you, bookmark your site. Therefore, you must do your best to make it easy for them. You can put a bookmarking button at the end of the article, so if users like your content, they can easily post it. If you are using a CMS, check if there is an extension that allows you to add Digg, Del.icio.us, and other buttons, but if you are using static HTML, you can always go to the social bookmarking site and copy the code that will add their button to your pages.
Here is a link that should help you add Links for Del.icio.us, Digg, and More to your pages.
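For a static HTML page, those buttons boil down to one submission link per service. The Python sketch below assembles such links; the exact submit URL patterns for del.icio.us and Digg are assumptions based on how the services worked at the time of writing and may have changed:

```python
from urllib.parse import quote

# Submission endpoints as commonly used at the time; treat the exact
# URL patterns as assumptions, not authoritative documentation.
BUTTON_TEMPLATES = {
    "del.icio.us": "http://del.icio.us/post?url={url}&title={title}",
    "Digg": "http://digg.com/submit?url={url}&title={title}",
}

def bookmark_links(page_url, page_title):
    """Build the anchor tags to paste at the end of an article."""
    url, title = quote(page_url, safe=""), quote(page_title, safe="")
    return [
        '<a href="{0}">Add to {1}</a>'.format(tpl.format(url=url, title=title), name)
        for name, tpl in BUTTON_TEMPLATES.items()
    ]

for link in bookmark_links("http://example.com/article", "My Article"):
    print(link)
```

The point of URL-encoding the page address and title is that they are passed as query-string parameters; forgetting this is the usual reason hand-rolled buttons break.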
6. Know when to submit
The time when you submit can be crucial for your attempts to get to the front page. On most social bookmarking sites you have only 24 hours to get to the front page and stay there. So, if you post when most users (and especially your supporters) are still sleeping, you are wasting valuable time. By the time they get up, you might have gone to the tenth page. You'd better try it for yourself and see if it works for you but generally posting earlier than 10 a.m. US Central Time is not good. Many people say that they get more traffic around 3 p.m. US Central Time. Also, workdays are generally better in terms of traffic but the downside is that you have more competitors for the front page than on weekends.
7. Submit to the right category
Sometimes a site might not work for you because there is no right category for you. Or because you don't submit to the right category – technology, health, whatever – but to categories like General, Miscellaneous, etc. where all unclassified stuff goes. And since these categories fill very fast, your chance to get noticed decreases.
8. Build a top profile
Not all users are equal on social bookmarking sites. If you are an old and respected user who has posted tons of interesting stuff, this increases the probability that what you submit will get noticed. Posting links to interesting articles on other sites is vital for building a top profile. Additionally, it is suspicious when your profile has links to only one site. Many social bookmarking sites frown upon users submitting their own content because it feels like self-promotion.
9. Cooperate with other social bookmarkers
The lonely wolf approach is a suicidal strategy on sites like StumbleUpon, Digg, and Netscape. Many stories make it to the front page not only because they are great but because they are backed by your network of friends. If in the first hours after your submission you get at least 15 votes from your friends and supporters, it is more likely that other users will vote for you. 50 votes can get you to the top page of Digg.
10. Submit in English
Linguistic diversity is great, but the majority of users are from English-speaking countries and they don't understand exotic languages. So, for most social bookmarking sites, submitting anything in a language other than English is not recommended. The languages at a particular disadvantage are Chinese, Arabic, Slavic languages and all others that use a non-Latin alphabet. German, Spanish and French are more understandable, but they are still not English. If you really must submit your story (e.g. because you need the backlink), include an English translation of at least the title. But the best way to proceed with non-English stories is to post them where they belong. Check this link for a list of non-English sites.
11. Never submit old news
Submitting old news will not help you in becoming a respected user. Yesterday's news is history. But if you still need to submit old stuff, consider feature articles, howtos and similar pieces that are up-to-date for a long time.
12. Check your facts
You must be flattered that users read your postings, but you will hardly be flattered when users prove that you haven't got the facts right. In addition to sarcastic comments, you might also receive negative votes for your story, so if you want to avoid this, check your facts – or your readers will do it for you.
13. Check your spelling
Some sites do not allow you to edit your posts later, so if you misspell the title, the URL, or a keyword, it will stay that way forever.
14. Not all topics do well
But sometimes even great content and submitting to the right category do not push you to the top. One possible reason could be that your stories are about unpopular topics. Many sites have topics that their users love and topics that don't sell that well. For instance, Apple sells well on Digg and The War in Iraq on Netscape. Negative stories - about George Bush, Microsoft, evil multinational companies, corruption and crime also have a chance to make it to the front page. You can't know these things in advance but some research on how many stories tagged with keywords like yours have made the front page in the last year or so can give you a clue.
15. Have Related Articles / Popular Articles sections
Traffic gurus joke that traffic from social bookmarking sites is like an invasion – the crowds pour in and in a day or two they are gone. Unfortunately this is true – after your listing rolls from the front page (provided that you reached the front page), the drop in traffic is considerable. Besides, many users come just following the link to your article, have a look at it and then they are gone. One of the ways to keep them longer on your site is to have links to Related Articles / Popular Articles or something similar that can draw their attention to other stuff on the site and make them read more than one article.
16. RSS feeds, newsletter subscriptions, affiliate marketing
RSS feeds, newsletter subscriptions, affiliate marketing are all areas in which the traffic from social bookmarking sites can help you a lot. Many people who come to your site and like it, will subscribe to RSS feeds and/or your newsletter. So, you need to put these in visible places and then you will be astonished at the number of new subscriptions you got on the day when you were on the front page of a major social bookmarking site.
17. Do not use automated submitters
After some time of active social bookmarking, you will discover that you are spending hours on end posting links. Yes, this is a lot of time and using automated submitters might look like the solution but it isn't. Automated submitters often have malware in them or are used for stealing passwords, so unless you don't care about the fate of your profile and don't mind being banned, automated submitters are not the way to go.
18. Respond to comments on your stories
Social bookmarking sites are not newsgroups, but interesting articles can trigger pretty heated discussions with hundreds of comments. If your article gets comments, you should be proud. Always respond to comments on your stories and, even better, post comments on other stories you find interesting. This is a way to make friends and build a top profile.
19. Prepare your server for the expected traffic
This is hardly a point of minor importance, but we take for granted that you are hosting your site on a reliable server that does not crash twice a day. Have in mind that your presence on the front page of a major social bookmarking site can drive a lot of traffic to you, which can cause your server to crash – literally!
I remember one of the times I was on the front page on Digg, I kept restarting Apache on my dedicated server because it was unable to cope with the massive traffic. I have many tools on my site and when the visitors tried them, this loaded the server additionally.
Well, for an articles site, getting so much traffic is not so devastating, but if you are hosting on a so-so server, you'd better migrate your site to a machine that can handle many simultaneous hits. Also, check if your monthly traffic allowance is enough to handle 200,000-500,000 or even more visitors. It is very amateurish to attract a lot of visitors and not be able to serve them because your server crashed or you exceeded your bandwidth!
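A back-of-the-envelope calculation helps here. The figures below (two page views per visitor, 200 KB per page view) are illustrative assumptions, not measurements:

```python
def traffic_estimate_gb(visitors, pages_per_visit=2, avg_page_kb=200):
    """Rough bandwidth needed to serve a given number of visitors.

    The per-page size and pages-per-visit defaults are assumptions;
    plug in your own numbers from your server logs.
    """
    total_kb = visitors * pages_per_visit * avg_page_kb
    return total_kb / (1024 * 1024)  # KB -> GB

# A front-page spike of 300,000 visitors would need roughly:
print(round(traffic_estimate_gb(300_000), 1), "GB")
```

That works out to roughly 114 GB for a single spike – compare that against your monthly allowance before you chase front pages.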
20. The snowball effect
But despite the differences in the likes of the different social bookmarking communities, there are striking similarities. You will soon discover that if a post is popular on one of the major sites, this usually drives it up on the other big and smaller sites. Usually it is Digg posts that become popular on StumbleUpon and Reddit but there are many other examples. To use this fact to your best advantage, you may want to concentrate your efforts on getting to the front page of the major players only and bet on the snowball effect to drive you to the top on other sites.
An additional benefit of the snowball effect is that if your posting is interesting and people start blogging about it, you can get tons of backlinks from their blogs. This happened to me and the result was that my PR jumped to 6 on the next update.
Choosing an SEO Company
After you have been dealing with SEO on your own for some time, you may discover that no matter how hard you try, your site does not rank well – or that it ranks well but optimizing it for search engines takes all your time and all your other tasks lag behind. If this is the case, maybe it is better to consider hiring an SEO company to do the work for you. With so many SEO companies out there, you can't complain that you have no choice. Or is it just the opposite – so many companies, but few reliable ones?
It is stretching the truth to say that there are no reliable SEO companies. Yes, there might be many scam SEO companies, but if you know what to look for when selecting one, the risk of hiring fraudsters is reduced. It is much better if you yourself have substantial knowledge of SEO and can easily decide whether they are promising you the stars in the sky or their goals are realistic, but even if you are not quite familiar with SEO practices, here is a list of points to watch for when choosing an SEO company:
•    Do they promise to guarantee a #1 ranking? If they do, you have a serious reason to doubt their competence. As the Google SEO selection tips say, no one can guarantee a #1 ranking in Google. This is true even for not-so-competitive words.
•    Get recommendation from friends, business partners, etc. Word of mouth is very important for the credibility of a company.
•    Ask in forums. There are many reputable webmaster forums, so if you can't find somebody who can recommend an SEO company right away, consider asking there. However, beware that not all forum posters are honest people, so take their opinions (positive or negative) with a grain of salt. Forums are not as reliable a source of information as in-person contact.
•    Google the company name. If the company is a known fraudster, chances are that you will find a lot of information about it on the Web. However, lack of negative publicity does not mean automatically that the company is great, nor do some subjective negative opinions mean that the company is a scammer.
•    Ask for examples of sites they have optimized. Happy customers are the best form of promotion, so feel free to ask your potential SEO company about sites they have optimized and references from clients. If you get a rejection because of confidentiality reasons, this must ring a bell about the credibility of the SEO company - former customers are not supposed to be a secret.
•    Check the PR of their own site. If they can't optimize their site well enough to get a good PR (over 4-5), they are not worth hiring.
•    Ask them what keywords their site ranks for. Similarly to the PR factor, if they don't rank well for the keywords of their choice, they are hardly as professional as they pretend to be.
•    Do they use automated submissions? If they do, stay away from them. Automated submissions can get you banned from search engines.
•    Do they use any black hat SEO tricks? You need to know in advance what black hat SEO is in order to judge them, so getting familiar with the most important black hat SEO tricks is worthwhile before you start cross-examining them.
•    Where do they collect backlinks from? Backlinks are very, very important for SEO success but if they come from link farms and other similar sites, this can cause a lot of trouble. So, make sure the SEO firm collects links from reputable sites only.
•    Get some personal impressions, if possible. Gut instinct and impressions from meetings are also a way to judge a company, though sometimes it is easy to be misled, so use this approach with caution.
•    High price does not guarantee high quality. If you are eager to pay more, this does not mean that you will get more. Just because a firm charges more DOES NOT make them better SEOs. There are many reasons for high prices, and high quality is only one of them. For instance, the company might work inefficiently, and that, not the quality of their work, is the reason for their ridiculously high costs.
•    Cheap is more expensive. This is also true. If you think you can pay peanuts for a professional SEO campaign, then you need to think again. Professional SEO companies offer realistic prices.
•    Use tricky questions. Using tricky questions is a double-edged sword, especially if you are not an expert. But there are several easy questions that can help you.
For instance, you might ask them how many search engines they will automatically submit your site to. If they are scammers, they will try to impress you with big numbers. But in this case, the best answer would be "no automatic submissions".
Another tricky question is to ask if they will place you in the top 10 for some competitive keywords of your choice. The trap here is that it is they, not you, who choose the words that are best for your site. It is not likely that they will choose exactly the same words you suggest, so if they tell you that you just give them the words and they will push you to the top, tell them “Goodbye.”
•    Do they offer subscription services? SEO is a constant process, and if you want to rank well and stay there, ongoing effort is necessary. Because of this, it is better to select a company that includes post-optimization maintenance than one that pushes your site to the top and then leaves you out in the wild on your own.
We tried to mention some of the most important issues in selecting an SEO company. Of course, there are many other factors to consider and each case is different, so give it some thought before you sign a contract with an SEO company.
Keyword Difficulty
The wise choice of the keywords you will optimize for is the first and most crucial step to a successful SEO campaign. If you fail at this very first step, the road ahead is very bumpy and most likely you will only waste your (or your client's) money and time. There are many ways to determine which keywords to optimize for, and usually the final list is made after a careful analysis of what the online population is searching for, which keywords your competitors have chosen and, above all, which keywords you feel describe your site best. All of this is great, and certainly this is the way to go, but if you want to increase your chances of success, additional research is never too much, especially when its results will spare you some shots in the dark.
Dreaming High – Shooting for the Top-Notch Keywords?
After you have made a long and detailed list of all the lucrative keywords that are searched by tens of thousands a day, do not hurry yet. It is great that you have chosen popular keywords but it would be even greater if you have chosen keywords for which top positioning is achievable with reasonable effort. If you have many competitors for the keywords you have chosen, chances are, no matter how hard you try, that you will hardly be able to overtake them and place your site amongst the top ten results. And as every SEO knows, if you can't be on the first page (or on the second and in the worst case on the third one) of the organic search results, you'd better think again if the potential gain from optimization for those particular words is worth the effort. It is true that sometimes even sites that are after the first 50 results get decent traffic from search engines but it is certain that you can't count on that. And even if you somehow manage to get to the top, do you have any idea what it will take to keep the good results?
You can feel discouraged that all lucrative keywords are already occupied but it is too early to give up. Low-volume search keywords can be as lucrative as the high-volume ones and their main advantage is that you will have less competition. The SEO experts from Blackwood Productions confirm that it is possible with less effort and within budget to achieve much better results with low-volume search keywords than if you targeted the high-volume search ones. In order to do this, you need to make an estimate about how difficult it would be to rank well for a particular keyword.
Get Down to Earth
The best way to estimate how difficult it would be to rank well for a particular keyword is by using the appropriate tools. If you search the Web, you will find several keyword difficulty tools. Choose a couple of them, for instance Seochat's Keyword Difficulty Tool, Cached's Keyword Difficulty Tool and Seomoz's Keyword Difficulty Tool, and off we go. The idea behind choosing multiple tools is not that you have so much free time that you need a way to waste it. If you choose only one tool, you will finish your research faster, but having in mind the different results each tool gives, you'd better double-check before you start the optimization itself. Seomoz's tool is kind of complicated, and if you want to use it you need to make several registrations, but it is worth the trouble (and the patience, while you wait for the results to be calculated).
You may also want to check several keywords or keyword phrases. You will be surprised to see how different the estimated difficulty for similar keywords is! For instance, if you are optimizing a financial site, which deals mainly with credits and loans, and some of your keywords are finance, money, credit, loan, and mortgage, running a check with Seochat's Keyword Difficulty Tool produces results like these (the percentages are rounded but you get the idea): finance – 89%, money – 76%, credit – 74%, loan – 66%, mortgage – 65%.
It seems that the keyword finance is very tough and since your site is targeted at credits and loans and not on stock exchange or insurance, which are also branches of finance, there is no need to cry over the fact that it is very difficult to compete for the finance keyword.
The results were similar with the second tool, though it does not give percentages but uses a scale from Very Easy to Very Difficult. I did not check all the results with the third tool, because the Seomoz report on keyword difficulty for a particular word seems to take ages to compile, but the results were similar, so it becomes clear that it is more feasible to optimize for mortgage and loan than for the broader term finance.
You may want to bookmark some of these tools for future use as well. They will be very useful for monitoring possible changes in the keyword difficulty landscape. After you have optimized your site for the keywords you have selected, occasionally recheck their difficulty, because the percentages change over time, and if you discover that the competition for your keywords has increased, make some more effort to retain the positions you have gained.
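If you keep the scores from each tool in a simple table, combining them is trivial. The numbers below are made up, echoing the finance-site example above; only the average-and-sort logic matters:

```python
# Hypothetical difficulty scores (percent) from two tools; real numbers
# come from the keyword difficulty tools themselves.
tool_a = {"finance": 89, "money": 76, "credit": 74, "loan": 66, "mortgage": 65}
tool_b = {"finance": 92, "money": 80, "credit": 70, "loan": 60, "mortgage": 62}

def rank_by_difficulty(*reports):
    """Average each keyword's difficulty across tools, easiest first."""
    keywords = set().union(*reports)
    averaged = {kw: sum(r.get(kw, 0) for r in reports) / len(reports) for kw in keywords}
    return sorted(averaged.items(), key=lambda item: item[1])

for kw, score in rank_by_difficulty(tool_a, tool_b):
    print(f"{kw}: {score:.1f}%")
```

On these numbers, loan and mortgage come out as the easiest targets and finance as the hardest, matching the conclusion above.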
Optimizing for MSN
SEO experts often forget that there are three major search engines. While there is no doubt that Google is number one with the most searches, and Yahoo! manages about a quarter of the market, MSN has not retired yet. It holds about 10-15 percent of searches (according to some sources even less – about 5%), but it has a loyal audience that can't be reached through the other two major search engines, so if you plan a professional SEO campaign, you can't afford to skip MSN. In a sense, getting high rankings in MSN is similar to getting high rankings for less popular keywords – because the competition is not that tough, you might be able to get enough visitors from MSN alone, with less effort than optimizing for a more popular search engine would take.
Although optimizing for MSN is different from optimizing for Google and Yahoo!, there are still common rules that will help you rank high in any search engine. As a rule, if you rank well in Google, chances are that you will rank well in Yahoo! (if you are interested in the tips and tricks for optimizing for Yahoo!, you may want to have a look at the Optimizing for Yahoo! article) and MSN as well. The opposite is not true, however. If you rank well in MSN, there is no guarantee that you'll do the same in Google. So, when you optimize for MSN, keep an eye on your Google ranking as well. It's no good to top MSN and be nowhere in Google (the opposite is more acceptable, if you must make the choice).
But why is this so? The answer is simple - the MSN algorithm is different and that is why, even if the same pages were indexed, the search results will vary.
The MSN Algorithm
As already mentioned, it is the different MSN algorithm that leads to such drastic results in ranking. Otherwise, MSN, like all search engines, first spiders the pages on the Web, then indexes them in its database and after that applies the algorithm to generate the pages with the search results. So, the first step in optimizing for MSN is the same as for the other search engines – to have a spiderable site. (Have a look at Search Engine Spider Simulator to see how spiders see your site). If your site is not spiderable, then you don't have even a hypothetical chance to top the search results.
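To see the crawling step in miniature, here is a toy Python link extractor built on the standard library. Real spiders also fetch pages over HTTP, respect robots.txt and queue the discovered links for later visits; none of that is shown here:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets the way a spider discovers pages to follow."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

sample = '<p>Intro</p><a href="/about.html">About</a> <a href="/contact.html">Contact</a>'
parser = LinkExtractor()
parser.feed(sample)
print(parser.links)  # ['/about.html', '/contact.html']
```

If a page's links are only reachable through scripts or Flash menus, a parser like this finds nothing to follow – which is exactly what "not spiderable" means.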
There is quite a lot of speculation about the MSN algorithm. Looking at the search results MSN delivers, it is obvious that its search algorithm is not as sophisticated as Google's, or even Yahoo!'s, and many SEO experts agree that the MSN search algorithm is years behind its competitors. So, what can you do in this case? Optimize as you did for Google a couple of years ago? You are not far from the truth, though actually it is not that simple.
One of the most important differences is that MSN still relies heavily on metatags, as explained below. None of the other major search engines uses metatags that heavily anymore. It is obvious that metatags give SEO experts a great opportunity for manipulating search results. Maybe metatags are the main reason for the inaccurate search results that MSN often produces.
The second most important difference between MSN and the other major search engines is their approach to keywords. For MSN, keywords are very, very important too, but unlike Google, for MSN on-page factors are dominant, while off-page factors (like backlinks, for example) are still of minor importance. It is a safe bet that the importance of backlinks will change in the future, but for now they are not a primary factor for high rankings.
Keywords, Keywords, Keywords
It is hardly surprising that keywords are the most important item for MSN. What is surprising is that MSN relies on them too much. It is very easy to fool MSN – just artificially inflate your keyword density, put a couple of keywords in file names (and even better, in domain names) and near the top of the page, and you are almost done for MSN. But if you use the above-mentioned black hat practices, your joy of topping MSN will not last long because, unless you provide separate pages optimized for Google, your stuffed pages might well get you banned from Google. And if you decide to have separate pages for Google and MSN, first, it is hardly worth the trouble, and second, the risk of a duplicate content penalty can't be ignored.
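Keyword density itself is just arithmetic: occurrences of the keyword divided by the total word count. Here is a minimal Python sketch for single-word keywords (multi-word phrases would need a different counting scheme):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in the text that equal the keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

copy = "Cheap loans online. Compare loans and apply for loans today."
print(keyword_density(copy, "loans"))  # 30.0
```

A density of 30% like the example above is exactly the kind of artificial inflation that may fool MSN but risks a penalty from Google.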
So, what is the catch? The catch is that if you try to polish your site for MSN and stuff it with keywords, this might get you into trouble with Google, which certainly is worse than not ranking well in MSN. But if you optimize wisely, it is more likely than not that you will rank decently in Google and perform well in Yahoo! and MSN as well.
Metatags
Having meaningful metatags never hurts but with MSN this is even more important because its algorithm still uses them as a primary factor in calculating search results. Having well-written (not stuffed) metatags will help you with MSN and some other minor search engines, while at the same time well-written metatags will not get you banned from Google.
The Description metatag is very important:
<META NAME="Description" CONTENT="Place your description here" />
MSNBot reads its content and, based on that (in addition to the keywords found on the page), judges how to classify your site. So if you leave this tag empty (i.e. CONTENT=""), you have missed a vital chance to be noticed by MSN. There is no evidence that MSN uses the other metatags in its algorithm, which is why leaving the Description metatag empty is even more unforgivable.
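A quick pre-flight check that the Description metatag is present and non-empty is easy to script. This is a hypothetical helper using Python's standard HTML parser, not a description of how MSNBot itself works:

```python
from html.parser import HTMLParser

class DescriptionChecker(HTMLParser):
    """Record the content of the Description metatag, if any."""

    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us;
        # attribute *values* keep their original case.
        if tag == "meta":
            attr_map = dict(attrs)
            if attr_map.get("name", "").lower() == "description":
                self.description = attr_map.get("content", "")

page = '<html><head><meta name="Description" content="Loans and mortgage advice" /></head></html>'
checker = DescriptionChecker()
checker.feed(page)
print(checker.description)  # Loans and mortgage advice
```

Run something like this over your pages and flag any where the result is None or an empty string.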
 Web Directories and Specialized Search Engines
SEO experts spend most of their time optimizing for Google and occasionally one or two other search engines. There is nothing wrong with that, and it is most logical, having in mind that topping Google is the lion's share of Web popularity. But very often, no matter what you do, topping Google does not happen. Or sometimes the price you need to pay (not literally, but in terms of effort and time) to top Google and stay there is too high. And there is the ultimate SEO nightmare – being banned from Google, when you simply can't use Google (at least not until you are readmitted to the club) and, like it or not, you need to look at possible alternatives.
What are Google Alternatives
The first alternative to Google is obvious – optimize for the other major search engines, if you have not done it already. Yahoo! and MSN (to a lesser degree) can bring you enough visitors, though sometimes it is virtually impossible to optimize for the three of them at the same time because of the differences in their algorithms. You could also optimize your site for (or at least submit to) some of the other search engines (Lycos, Excite, Netscape, etc.) but having in mind that they altogether hardly have over 3-5% of the Web search traffic, do not expect much.
Another alternative is to submit to search directories (also known as Web directories) and specialized search engines. Search directories might sound so pre-Google but submitting to the right directories might prove better than optimizing for MSN, for example. Specialized search engines and portals have the advantage that the audience they attract consists of people who are interested in a particular topic and if this is your topic, you can get to your target audience directly. It is true that specialized search engines will not bring you as many visitors, as if you were topping Google but the quality of these visitors is extremely high.
Naming all Google alternatives would be a long list and it is outside the scope of this article but just to be a little more precise about what alternatives exist, we cannot skip SEO instruments like posting to blogs and forums or paid advertisements.
Web Directories
What is a Web Directory?
Web directories (or as they are better known – search directories) existed before the search engines, especially Google, became popular. As the name implies, web directories are directories where different resources are gathered. Similarly to desktop directories, where you gather files in a directory based on some criterion, Web directories are just enormous collections of links to sites, arranged in different categories. The sites in a Web directory are listed in some order (most often alphabetic but it is not necessarily so) and users browse through them.
Although many Web directories offer search functionality of some kind (otherwise it would be impossible to browse thousands of pages for, let's say, Computers), search directories are fundamentally different from search engines in two ways – most directories are edited by humans, and URLs are not gathered automatically by spiders but submitted by site owners. The main advantage of Web directories is that no matter how clever spiders become, when there is a human to view and check the pages, there is less chance that pages will be classified in the wrong categories. The disadvantages of the first difference are that the lists in Web directories are sometimes outdated, if no human was available to do the editing and checking for some time (though this is not that bad, because search engines also deliver pages that do not exist anymore), and that sometimes you might have to wait half a year before being included in a search directory.
The second difference – no spiders – means that you must go and submit your URL to the search directory, rather than sit and wait for the spider to come to your site. Fortunately, this is done only once for each directory, so it is not that bad.
Once you are included in a particular directory, in most cases you can stay there as long as you wish to and wait for people (and search engines) to find you. The fact that a link to your site appears in a respectable Web directory is good because first, it is a backlink and second, you increase your visibility for spiders, which in turn raises your chance to be indexed by them.
Examples of Web Directories
There are hundreds, if not thousands, of search directories, but undoubtedly the most popular one is DMOZ. It is a general-purpose search directory and it accepts links to all kinds of sites. Other popular general-purpose search directories are Google Directory and Yahoo! Directory. The Best of the Web is one of the oldest Web directories and it still keeps to high standards in selecting sites.
Besides general-purpose Web directories, there are incredibly many topical ones. For instance, The Environment Directory lists links to environmental sites only, while The Radio Directory lists thousands of radio stations worldwide, arranged by country, format, etc. There are also many local and national Web directories, which accept links only to sites about a particular region or country and which can be great if your site is targeted at a local or national audience. It is not possible even to list the topics of specialized search directories, because the list would get incredibly long. Using Google and specialized search resources like The Search Engines Directory, you can find on your own many directories that are related to your area of interest.
Specialized Search Engines
What is a Specialized Search Engine?
Specialized search engines are one more tool to include in your SEO arsenal. Unlike general-purpose search engines, specialized search engines index pages on particular topics only, and very often there are many pages that cannot be found in general-purpose search engines but only in specialized ones. Some specialized search engines are huge sites that actually host the resources they link to, or used to be search directories but have evolved to include links beyond the sites that were submitted to them. There are specialized search engines for every imaginable topic, and it is always wise to be aware of the ones for your niche. The examples in the next section are by no means a full list but are aimed at giving you an idea of what is available. If you search harder on the Web, you will find many more resources.
Examples of Specialized Search Engines
Specialized search engines are probably not as numerous as Web directories, but there is certainly no shortage of them either, especially if one counts as a specialized search engine every password-protected site with a database accessible only from within the site. As with Web directories, a list of specialized search engines would be really, really long (and constantly changing), so instead here are some links to lists of search engines: Pandia Powersearch, Webquest, Virtual Search Engines, the already mentioned The Search Engines Directory, etc. What these lists have in common is that they offer a selection of specialized search engines, arranged by topic, so they are a good starting point for the hunt for specialized search engines.
Importance of Sitemaps
There are many SEO tips and tricks that help in optimizing a site but one of those, the importance of which is sometimes underestimated is sitemaps. Sitemaps, as the name implies, are just a map of your site - i.e. on one single page you show the structure of your site, its sections, the links between them, etc. Sitemaps make navigating your site easier and having an updated sitemap on your site is good both for your users and for search engines. Sitemaps are an important way of communication with search engines. While in robots.txt you tell search engines which parts of your site to exclude from indexing, in your site map you tell search engines where you'd like them to go.
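The robots.txt exclusion mentioned above is just a plain-text file in the site root; a minimal example (the directory names below are placeholders) might look like:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
```

Everything not disallowed is left for spiders to crawl; the sitemap then tells them where you would especially like them to go.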
Sitemaps are not a novelty. They have always been part of best Web design practices, but with the adoption of sitemaps by search engines they have become even more important. However, it is necessary to clarify that if you are interested in sitemaps mainly from an SEO point of view, you can't get by with the conventional sitemap alone (though currently Yahoo! and MSN still keep to the standard HTML format). For instance, Google Sitemaps uses a special (XML) format that is different from the ordinary HTML sitemap for human visitors.
One might ask why two sitemaps are necessary. The answer is obvious - one is for humans, the other is for spiders (for now mainly Googlebot but it is reasonable to expect that other crawlers will join the club shortly). In that relation it is necessary to clarify that having two sitemaps is not regarded as duplicate content. In 'Introduction to Sitemaps', Google explicitly states that using a sitemap will never lead to penalty for your site.
Why Use a Sitemap
Using sitemaps has many benefits, not only easier navigation and better visibility by search engines. Sitemaps offer the opportunity to inform search engines immediately about any changes on your site. Of course, you cannot expect that search engines will rush right away to index your changed pages but certainly the changes will be indexed faster, compared to when you don't have a sitemap.
Also, when you have a sitemap and submit it to the search engines, you rely less on external links to bring search engines to your site. Sitemaps can even help with messy internal links – for instance, if you accidentally have broken internal links or orphaned pages that cannot be reached in any other way (though there is no doubt that it is much better to fix your errors than to rely on a sitemap).
If your site is new, or if you have a significant number of new (or recently updated pages), then using a sitemap can be vital to your success. Although you can still go without a sitemap, it is likely that soon sitemaps will become the standard way of submitting a site to search engines. Though it is certain that spiders will continue to index the Web and sitemaps will not make the standard crawling procedures obsolete, it is logical to say that the importance of sitemaps will continue to increase.
Sitemaps also help in classifying your site content, though search engines are by no means obliged to classify a page as belonging to a particular category or as matching a particular keyword only because you have told them so.
Having in mind that the sitemap programs of major search engines (and especially Google) are still in beta, using a sitemap might not generate huge advantages right away but as search engines improve their sitemap indexing algorithms, it is expected that more and more sites will be indexed fast via sitemaps.
Generating and Submitting the Sitemap
The steps you need to perform in order to have a sitemap for your site are simple. First, you need to generate it, then you upload it to your site, and finally you notify Google about it.
Depending on your technical skills, there are two ways to generate a sitemap – download and install a sitemap generator, or use an online sitemap generation tool. The first is more difficult but gives you more control over the output. You can download the Google sitemap generator from here. After you download the package, follow the installation and configuration instructions in it. This generator is a Python script, so your Web server must have Python 2.2 or later installed in order to run it.
The second way to generate a sitemap is easier. There are many free online tools that can do the job for you. For instance, have a look at this collection of Third-party Sitemap tools. Although Google says explicitly that it has neither tested, nor verified them, this list will be useful because it includes links to online generators, downloadable sitemap generators, sitemap plugins for popular content-management systems, etc., so you will be able to find exactly what you need.
After you have created the sitemap, you need to upload it to your site (if it is not already there) and notify Google about its existence. Notifying Google includes adding the site to your Google Sitemaps account, so if you do not have an account with Google, it is high time to open one. Another detail that is useful to know in advance is that in order to add the sitemap to your account, you need to verify that you are the legitimate owner of the site.
Currently Yahoo! and MSN do not support sitemaps, or at least not in the XML format used by Google. Yahoo! allows webmasters to submit “a text file with a list of URLs” (which can actually be a stripped-down version of a sitemap), while MSN does not offer even that, though there are rumors that it indexes sitemaps when they are available on site. Most likely this situation will change in the near future and both Yahoo! and MSN will catch up with Google, because user-submitted sitemaps are simply too powerful an SEO tool to be ignored.

How to Build Backlinks
It is beyond question that quality backlinks are crucial to SEO success. The question, rather, is how to get them. While on-page content optimization seems easier, because everything is up to you to do and decide, with backlinks it looks like you have to rely on others to work for your success. This is only partially true: while backlinks are links that start on another site and point to yours, you can discuss details with the webmaster of the other site – the anchor text, for example. It is not the same as administering your own site – i.e. you do not have total control over backlinks – but still there are many aspects that can be negotiated.
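The anchor text being negotiated is simply the clickable part of the link markup on the other site; a sketch (the domain and the keyword phrase below are made-up examples):

```html
<!-- A backlink as it would appear on the other site's page; the anchor
     text ("handmade silver jewelry" here, a made-up keyword phrase) is
     what search engines associate with the target page -->
<a href="http://www.example.com/">handmade silver jewelry</a>
```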
Getting Backlinks the Natural Way
The idea behind including backlinks as part of the page rank algorithm is that if a page is good, people will start linking to it. And the more backlinks a page has, the better. But in practice it is not exactly like this. Or at least you cannot always rely on the fact that your content is good and people will link to you. Yes, if your content is good and relevant you can get a lot of quality backlinks, including from sites with a similar topic to yours (and these are the most valuable kind of backlinks, especially if the anchor text contains your keywords), but what you get without effort could be less than what you need to successfully promote your site. So you will have to resort to other ways of acquiring quality backlinks, as described next.
Ways to Build Backlinks
Even if plenty of backlinks come to your site the natural way, additional quality backlinks are always welcome and the time you spend building them is not wasted. Among the acceptable ways of link building are getting listed in directories, posting in forums, blogs and article directories. The unacceptable ways include inter-linking (linking from one site to another site, which is owned by the same owner or exists mainly for the purpose to be a link farm), linking to spam sites or sites that host any kind of illegal content, purchasing links in bulk, linking to link farms, etc.
The first step in building backlinks is to find the places from which you can get quality backlinks. A valuable assistant in this process is the Backlink Builder tool. When you enter the keywords of your choice, the Backlink Builder tool gives you a list of sites where you can post an article, message, posting, or simply a backlink to your site. After you have the list of potential backlink partners, it is up to you to visit each of the sites and post your content with the backlink to your site in it.
You might wonder why sites like those listed by the Backlink Builder tool provide such a precious asset as backlinks for free. The answer is simple – they need content for their site. When you post an article or submit a link to your site, you do not get paid for it. You provide them for free with something they need – content – and in return they provide you for free with something you need – quality backlinks. It is a fair trade, as long as the sites where you post your content or links are respectable and you don't post fake links or content.
Getting Listed in Directories
If you are serious about your Web presence, getting listed in directories like DMOZ and Yahoo is a must – not only because this is a way to get some quality backlinks for free, but also because this way you are easily noticed by both search engines and potential visitors. Generally inclusion in search directories is free but the drawback is that sometimes you have to wait a couple of months before you get listed in the categories of your choice.
Forums and Article Directories
Generally search engines index forums so posting in forums and blogs is also a way to get quality backlinks with the anchor text you want. If the forum or blog is a respected one, a backlink is valuable. However, in some cases the forum or blog administrator can edit your post, or even delete it if it does not fit into the forum or blog policy. Also, sometimes administrators do not allow links in posts, unless they are relevant ones. In some rare cases (which are more an exception than a rule) the owner of a forum or a blog would have banned search engines from indexing it and in this case posting backlinks there is pointless.
While forum postings can be short and do not require much effort, submitting articles to directories can be more time-consuming, because articles are generally longer than posts and need careful thought while writing them. But it is also worth it, and it is not so difficult to do.
Content Exchange and Affiliate Programs
Content exchange and affiliate programs are similar to the previous method of getting quality backlinks. For instance, you can offer to interested sites RSS feeds for free. When the other site publishes your RSS feed, you will get a backlink to your site and potentially a lot of visitors, who will come to your site for more details about the headline and the abstract they read on the other site.
Affiliate programs are also good for getting more visitors (and buyers) and for building quality backlinks but they tend to be an expensive way because generally the affiliate commission is in the range of 10 to 30 %. But if you have an affiliate program anyway, why not use it to get some more quality backlinks?
News Announcements and Press Releases
Although this is hardly an everyday way to build backlinks, it is an approach that gives good results if handled properly. There are many sites (for instance, here is a list of some of them) that publish news announcements and press releases for free or for a fee. A professionally written press release about an important event can bring you many, many visitors, and a backlink from a respected site to yours is a good boost to your SEO efforts. The tricky part is that you cannot issue press releases if there is nothing newsworthy. That is why we say that news announcements and press releases are not a routine way to build backlinks.
Backlink Building Practices to Avoid
One of the practices to avoid is link exchange. There are many programs which offer to barter links. The principle is simple – you put a link to a site, they put a backlink to your site. There are a couple of important things to consider with link exchange programs. First, take care about the ratio between outbound and inbound links. If your outbound links greatly outnumber your inbound links, this is bad. Second (and more important) is the risk that your link exchange partners are link farms. If this is the case, you could even be banned from search engines, so it is too risky to indulge in link exchange programs.
Linking to suspicious places is something else that you must avoid. While it is true that search engines do not punish you if you have backlinks from such places because it is supposed that you have no control over what bad guys link to, if you enter a link exchange program with the so called bad neighbors and you link to them, this can be disastrous to your SEO efforts. For more details about bad neighbors, check the Bad Neighborhood article. Also, beware of getting tons of links in a short period of time because this still looks artificial and suspicious.
Getting Your Website from Google Banned to Google Unbanned
Even if you are not looking for trouble and do not violate any known Google SEO rule, you still might have to experience the ultimate SEO nightmare - being excluded from Google’s index. Although Google is a kind of a monopolist among search engines, it is not a bully company that excludes innocent victims for pure pleasure. Google keeps rigorously to SEO best practices and excludes sites that misbehave.
If you own and run a blog or website then being listed by Google is a very important step so it is read by as many people as possible; but what if your website gets Google banned? If this has happened to you, then you know that it hurts your site because you won’t show up in the Google search engine and that means less traffic to your site. Getting unbanned from Google is a long and drawn out process. And sometimes Google won’t even tell you the reason they banned your website in the first place, which doesn’t make things any easier.
Some of the ways a site can be Google banned include having spam on it, stuffing in too many keywords that clog up your site, making your own URLs redirect to each other, improperly inserting a robots.txt file, duplicating your own pages and sending people to them over and over, and linking to bad sites such as those with adult content, gambling or other unauthorized areas. There are multiple other reasons, so it’s a good idea to try to get them to let you know the reason for being Google banned. That will make it much simpler to fix the problem. Over-optimization has many faces, and you can have a look at the Optimization, Over-Optimization or SEO Overkill? article to get some ideas of practices that you should avoid.
Here are the necessary steps that you need to follow in order to get Google reconsideration for getting unbanned. Be sure to follow Google reconsideration request process precisely and correctly if you want to get your website unbanned and get your site back in business providing whatever products or services that it has:
1 Send a Google Reconsideration Request for Getting Unbanned
Getting Google reinclusion of your website requires putting in a Google reconsideration request. First, the way you know your site is Google banned is that it suddenly doesn’t have a page rank. Then, to determine for sure that this is the case, search Google for site:www.yoursite.com, using the name of your own site instead of yoursite. If you don’t see any of your pages there, then it’s likely you were Google banned.
Another way to tell if you are truly Google banned is to see if your pages show up in Google’s index at all. Or, if it is a news blog, you can go to news.google.com; if you don’t see your articles there, you will also know you were probably banned from Google and now need to send a Google reconsideration request.
2 Be Polite to Google
Next, remember that you are sending your Google reconsideration request to a real person who works for Google; someone at a Google office will actually read your request to be unbanned. Therefore you want to be polite and go into as much detail as possible, as it is better to give too much information than not enough in this situation. Being nice counts here, and if you act like a jerk, it’s likely no one will want to help you.
3 Provide Information about the Domain
Mention things such as whether it was a brand-new domain name, give them some background about your website, and tell them which rules you think you may have broken. If there have been spam clicks on your account, gather proof of them and write to them about it. This shows them you are serious about resolving the problem when sending the Google reconsideration request. Put down everything that you think someone would need to know in order to know who you are and to jog their memory on why you were banned in the first place. Be sure to do your research so you will understand what is going on and can fully explain it to the Google representatives when sending the reconsideration request to Google.
4 Explain the Solution to the Past Problem
When sending the reconsideration request to Google, tell the representative what you have already done to fix the problem that caused you to be banned. Spell it out in detail and give them your actual page URLs to prove it. It’s best to give as much information and data as you can so they will understand what you did to solve the issue. For example, if your site linked to bad sites, then you must make sure that you remove every one of those links. Be sure to have removed all spam, or anything else that Google doesn’t approve of or like. Then prove to Google that you did this by showing them the evidence. Or, if you had invalid clicks, which is one of the common reasons to get Google banned, show why the clicks were valid. It takes this sort of detailed information to make them understand the situation and help you to resolve it. Also, ensure that the changes now made to your website meet the requirements for Google reinclusion. Don’t do a single thing on your website that might annoy them.
5 Verify the Website
Next, log in to your Google webmaster account and add and verify your site. Then go to http://www.google.com/webmasters/tools/reconsideration. This is the area that you use to put in your reconsideration request to Google to be unbanned. You can also send the information in an email to help@google.com, where Google representatives give support to customers. You may also have to sign up for Google webmaster tools once you are logged into your account, if you don’t already have it.
6 Provide Proof
It’s never a good idea to try to blame Google, or to claim you had no idea what you did wrong. You need real proof for Google reinclusion, not just blame or playing dumb. Show them the proof of the changes you made for Google reconsideration. And lastly, always be considerate and thank them for the time and effort they are taking to look into your reconsideration request, to help you solve the problems, and to get you unbanned from Google so your site can be relisted and you can keep getting the traffic you need to run your business, blog or news site.
7 Be Patient
It can take several weeks for a Google representative to get back to you and answer your Google reinclusion request. They have a lot of other things to handle, and you need to understand that you aren’t the only one who may be having issues. While you are waiting, continue to look over your site and try to make sure all the alleged violations are fixed and good to go.
8 Send Follow Up Email for Google reconsideration
Be sure to send follow-up emails to Google to ask how the request is going and if they know when the situation will be resolved. You probably shouldn’t send one every day, because this could be regarded as you being a pest, but be sure to send one in periodically until you get an answer that you understand and can deal with to solve the Google banned problem.
All in all, it can be a time-consuming and complicated process to get your site from Google banned to Google unbanned, but with the proper preparation and information you should be well on your way to being in their good graces again. It’s well worth your effort, so just follow these steps and Google should get back to you and fix your situation and your site.
Optimizing Flash Sites
If there is one really hot potato that divides SEO experts and Web designers, it is Flash. Undoubtedly a great technology for including sound and pictures on a Web site, Flash movies are a real nightmare for SEO experts. The reason is pretty prosaic – search engines cannot index (or at least not easily) the contents inside a Flash file, and unless you feed them the text inside a Flash movie, you can simply count this text as lost for boosting your rankings. Of course, there are workarounds, but until search engines start indexing Flash movies as if they were plain text, these workarounds are just a clumsy way to optimize Flash sites, although they are certainly better than nothing.
Why Do Search Engines Dislike Flash Sites?
Search engines dislike Flash Web sites not because of their artistic qualities (or the lack of these) but because Flash movies are too complex for a spider to understand. Spiders cannot index a Flash movie directly, as they do with a plain page of text. Spiders index filenames (and you can find tons of these on the Web), but not the contents inside.
Flash movies come in a proprietary binary format (.swf), and spiders cannot read the insides of a Flash file, at least not without assistance. And even with assistance, do not count on spiders crawling and indexing all your Flash content. This is true for all search engines. There might be differences in how search engines weigh page relevancy, but in their approach to Flash, at least for the time being, search engines are really united – they hate it, but they index portions of it.
What (Not) to Use Flash For?
Despite the fact that Flash movies are not spider favorites, there are cases when a Flash movie is worth the SEO efforts. But as a general rule, keep Flash movies at a minimum. In this case less is definitely better and search engines are not the only reason. First, Flash movies, especially banners and other kinds of advertisement, distract users and they generally tend to skip them. Second, Flash movies are fat. They consume a lot of bandwidth, and although dialup days are over for the majority of users, a 1 Mbit connection or better is still not the standard one.
Basically, designers should keep to the principle that Flash is good for enhancing a story, but not for telling it – i.e. you have some text with the main points of the story (and the keywords that you optimize for), and then you have the Flash movie to add further detail or just a visual representation of the story. In that connection, the greatest SEO sin is to have the whole site made in Flash! This is simply unforgivable, and do not even dream of high rankings!
Another “No” is using Flash for navigation. This applies not only to the starting page, where it was once fashionable to splash a gorgeous Flash movie, but to the links between pages as well. Although it is a more common mistake to use images and/or JavaScript for navigation, Flash banners and movies must not be used to lead users from one page to another. Text links are the only SEO-approved way to build site navigation.
Workarounds for Optimizing Flash Sites
Although a workaround is not a solution, Flash sites still can be optimized. There are several approaches to this:
•    Input metadata
This is a very important approach, although it is often underestimated and misunderstood. Although metadata is not as important to search engines as it used to be, Flash development tools allow easily to add metadata to your movies, so there is no excuse to leave the metadata fields empty.
•    Provide alternative pages
For a good site it is a must to provide html only pages that do not force the user to watch the Flash movie. Preparing these pages requires more work but the reward is worth because not only users, but search engines as well will see the html only pages.
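Besides separate HTML-only pages, a related and widely used approach is to put the fallback text inside the <object> element that embeds the movie: browsers with the Flash plugin show the movie, while other user agents (including spiders) see the text. A sketch (the filename and text below are placeholders):

```html
<object type="application/x-shockwave-flash" data="story.swf"
        width="400" height="300">
  <param name="movie" value="story.swf" />
  <!-- Fallback content, shown when the Flash plugin is unavailable -->
  <p>The main points of the story, with your keywords, go here as plain text.</p>
</object>
```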
•    Flash Search Engine SDK
This is the life-belt – the most advanced tool for extracting text from a Flash movie. One of the handiest applications in the Flash Search Engine SDK is the tool named swf2html. As its name implies, this tool extracts text and links from a Macromedia Flash file and writes the output to a standard HTML document, thus saving you the tedious job of doing it manually.
However, you still need to have a look at the extracted contents and correct it, if necessary. For example, the order in which the text and links is arranged might need a little restructuring in order to put the keyword-rich content in the title and headings or in the beginning of the page.
Also, you need to check if there is no duplicate content among the extracted sentences and paragraphs. The font color of the extracted text is also another issue. If the font color of the extracted text is the same as the background color, you will run into hidden text territory.

•    SE-Flash.com
Here is a tool that visually shows which parts of your Flash files are visible to search engines and which are not. This tool is very useful, even if you already have the Flash Search Engine SDK installed, because it provides one more check of the accuracy of the extracted text. Besides, it is not certain that Google and the other search engines use the Flash Search Engine SDK to get contents from a Flash file, so this tool might give completely different results from those the SDK produces.
These are just some of the most important approaches to optimizing Flash sites. There are many others as well, but not all of them are brilliant and clear, and some sit on the boundary of ethical SEO – e.g. creating invisible layers of text that are delivered to spiders instead of the Flash movie itself. Although this technique is not wrong as such – i.e. there is no duplicate or fake content – it is very similar to cloaking and doorway pages, and it is better to avoid it.

Bad Neighborhood
Have you ever had a perfectly optimized site with lots of links and content and the right keyword density that still does not rank high in search engines? Probably every SEO has experienced this. The reasons for such a failure can be really diverse – from the sandbox effect (your site just needs time to mature), to overoptimization, to inappropriate online relations (i.e. the so-called “bad neighborhood” effect).
While there is not much you can do about the sandbox effect but wait, in most other cases it is up to you to counteract the negative effects you are suffering from. You just need to figure out what is stopping you from achieving the deserved rankings. Careful analysis of your site and the sites that link to you can give you ideas where to look for the source of trouble and deal with it. If it is overoptimization – remove excessive stuffing; if it is bad neighbors – say “goodbye” to them. We have already dealt with overoptimization as SEO overkill, and in this article we will have a look at another frequent rankings killer.
Link Wisely, Avoid Bad Neighbors
It is a known fact that one of the most important factors for high rankings, especially with Google, is links. The Web is woven out of links, and inbound and outbound links are entirely natural. Generally, the more inbound links (i.e. other sites linking to you) you have, the better. Conversely, having many outbound links is not very good – and what is worse, it can be disastrous if you link to improper places, i.e. bad neighbors. The concept is hardly difficult to comprehend, since it is so similar to real life: if you choose outlaws or bad guys for friends, you are considered to be one of them.
It might look unfair to be penalized for things that you have not done, but linking to sites with a bad reputation is treated as a crime by search engines, and by linking to such a site you can expect to be penalized as well. And yes, it is fair, because search engines do penalize sites that use various tricks to manipulate search results. In order to guarantee the integrity of search results, search engines cannot afford to tolerate unethical practices.
However, search engines tend to be fair and do not punish you for things that are out of your control. If you have many inbound links from suspicious sites, this will not be regarded as malpractice on your side, because generally it is their webmaster, not you, who has put those links there. So inbound links, no matter where they come from, cannot harm you. But if, in addition to inbound links, you have a considerable number of outbound links to such sites, you are in a sense voting for them. Search engines consider this malpractice and you will get punished.

Why Do Some Sites Get Labelled as Bad Neighbors?
We have already mentioned in this article some of the practices that lead search engines to ban particular sites. But the “sins” are not limited to being a spam domain. Generally, companies get blacklisted because they try to boost their ranking by using illegal techniques such as keyword stuffing, duplicate content (or lack of any original content), hidden text and links, doorway pages, deceptive titles, and machine-generated pages, or because they violate copyright. Search engines also tend to dislike meaningless link directories that create the impression of being topically arranged, so if you have a fat links section on your site, double-check what you link to.
Figuring Out Who's Good, Who's Not
Probably the question popping up is: “But since the Web is so vast and constantly changing, how can I know who is good and who is bad?” Well, you don't have to know each site on the black list, even if that were possible. The black list itself changes all the time, but it looks like there will always be companies and individuals eager to earn some cash by spamming, disseminating viruses and porn, or simply performing fraudulent activities.
The first check you need to perform, when you suspect that some of the sites you link to are bad neighbors, is to see if they are included in the indices of Google and the other search engines. Type “site:siteX.com”, where “siteX.com” is the site you are checking, and see if Google returns any results from it. If it does not return any results, chances are that this site is banned from Google and you should immediately remove any outbound links to siteX.com.
If you have outbound links to many different sites, such checks might take a lot of time. Fortunately, there are tools that can help you in performing this task. The CEO of Blackwood Productions has recommended http://www.bad-neighborhood.com/ as one of the reliable tools that reports links to and from suspicious sites and sites that are missing in Google's index.
What is Robots.txt
It is great when search engines frequently visit your site and index your content but often there are cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling, otherwise you risk being imposed a duplicate content penalty. Also, if you happen to have sensitive data on your site that you do not want the world to see, you will also prefer that search engines do not index these pages (although in this case the only sure way for not indexing sensitive data is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets and javascript from indexing, you also need a way to tell spiders to keep away from these items.
One way to tell search engines which files and folders on your Web site to avoid is with the use of the Robots metatag. But since not all search engines read metatags, the Robots metatag can simply go unnoticed. A better way to inform search engines about your wishes is to use a robots.txt file.
What Is Robots.txt?
Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall or a kind of password protection); putting up a robots.txt file is something like putting a “Please do not enter” note on an unlocked door – you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is too naïve to rely on robots.txt to protect it from being indexed and displayed in search results.
The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.
The concept and structure of robots.txt was developed more than a decade ago. If you are interested in learning more about it, visit http://www.robotstxt.org/ or go straight to the Standard for Robot Exclusion, because in this article we will deal only with the most important aspects of a robots.txt file. Next we will look at the structure of a robots.txt file.
Structure of a Robots.txt File
The structure of a robots.txt file is pretty simple (and hardly flexible) – it is essentially a list of user agents and disallowed files and directories. Basically, the syntax is as follows:
User-agent:
Disallow:
“User-agent:” names the search engine crawler a record applies to, and “Disallow:” lists the files and directories to be excluded from indexing. In addition to “User-agent:” and “Disallow:” entries, you can include comment lines – just put the # sign at the beginning of the line:
# All user agents are disallowed to see the /temp directory.
User-agent: *
Disallow: /temp/
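In Python, such a rule can be checked programmatically with the standard library's robots.txt parser. The sketch below parses the example above and asks whether a crawler may fetch a given path (the domain is hypothetical):

```python
from urllib.robotparser import RobotFileParser

# The example rules from above: every agent is kept out of /temp/.
rules = """User-agent: *
Disallow: /temp/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# /temp/ is off-limits for every user agent; everything else is allowed.
print(parser.can_fetch("Googlebot", "http://mydomain.com/temp/report.html"))  # False
print(parser.can_fetch("Googlebot", "http://mydomain.com/index.html"))        # True
```

This is the same logic a well-behaved spider applies before requesting a page.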
The Traps of a Robots.txt File
When you start making complicated files – i.e. you decide to allow different user agents access to different directories – problems can start if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradictory directives. Typos are misspelled user agents or directories, missing colons after User-agent and Disallow, etc. Typos can be tricky to find, but in some cases validation tools help.
The more serious problem is with logical errors. For instance:
User-agent: *
Disallow: /temp/
User-agent: Googlebot
Disallow: /images/
Disallow: /temp/
Disallow: /cgi-bin/
The above robots.txt first allows all agents to access everything on the site except the /temp directory, and then adds a more restrictive record for Googlebot. Note how the records interact: according to the Standard for Robot Exclusion, a crawler obeys the record that names it specifically and ignores the generic “*” record entirely. This is where mistakes creep in – if you had forgotten to repeat “Disallow: /temp/” inside the Googlebot record, Googlebot would have been free to crawl /temp/ even though the “*” record forbids it, because the “*” rules do not apply on top of a named record. You see, the structure of a robots.txt file is simple, yet serious mistakes can be made easily.
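The precedence rule can be seen in action with Python's standard robots.txt parser: per the robot exclusion standard, a crawler obeys only the record that names it, so a rule present in the “*” record but missing from the named record silently stops applying to that crawler. A minimal sketch with made-up paths:

```python
from urllib.robotparser import RobotFileParser

# A variant of the file above in which the Googlebot record
# forgets to repeat the /temp/ rule from the generic record.
rules = """User-agent: *
Disallow: /temp/

User-agent: Googlebot
Disallow: /images/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Googlebot obeys only its own record, so /temp/ is open to it ...
print(parser.can_fetch("Googlebot", "/temp/data.html"))     # True
# ... while a crawler without a named record falls back to "*".
print(parser.can_fetch("SomeOtherBot", "/temp/data.html"))  # False
```

To keep a directory closed to everyone, repeat its Disallow line in every named record, as the example in the article does.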
Tools to Generate and Validate a Robots.txt File
Having in mind the simple syntax of a robots.txt file, you can always read it to see if everything is OK but it is much easier to use a validator, like this one: http://tool.motoricerca.info/robots-checker.phtml. These tools report about common mistakes like missing slashes or colons, which if not detected compromise your efforts. For instance, if you have typed:
User agent: *
Disallow: /temp/
this is wrong because the hyphen between “User” and “agent” is missing and the syntax is incorrect.
In those cases when you have a complex robots.txt file – i.e. you give different instructions to different user agents, or you have a long list of directories and subdirectories to exclude – writing the file manually can be a real pain. But do not worry – there are tools that will generate the file for you. What is more, there are visual tools that allow you to point and select the files and folders to be excluded. But even if you do not feel like buying a graphical tool for robots.txt generation, there are online tools to assist you. For instance, the Server-Side Robots Generator offers a dropdown list of user agents and a text box for you to list the files you don't want indexed. Honestly, it is not much help unless you want to set specific rules for different search engines, because in any case it is up to you to type the list of directories, but it is better than nothing.
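For simple cases, generating the file is easy to script yourself. The sketch below is a hypothetical helper (not one of the tools mentioned above) that builds robots.txt text from a mapping of user agents to the directories they should not crawl:

```python
def generate_robots_txt(rules):
    """Build robots.txt text from {user_agent: [disallowed paths]}."""
    lines = []
    for agent, disallowed in rules.items():
        lines.append(f"User-agent: {agent}")
        for path in disallowed:
            lines.append(f"Disallow: {path}")
        lines.append("")  # a blank line separates records
    return "\n".join(lines)

print(generate_robots_txt({
    "*": ["/temp/"],
    "Googlebot": ["/images/", "/temp/", "/cgi-bin/"],
}))
```

Running the generated text through a validator afterwards is still a good idea, since a script like this will happily emit a misspelled user agent if you feed it one.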
Jumping Over the Google Sandbox
It's never easy for newcomers to enter a market, and there are barriers of different kinds. For newcomers to the world of search engines, the barrier is called the sandbox – your site stays there until it gets mature enough to be admitted to the top-positions club. Although there is no direct confirmation of the existence of a sandbox, Google employees have implied it, and SEO experts have seen in practice that new sites, no matter how well optimized, don't rank high on Google, while on MSN and Yahoo! they catch on quickly. For Google, the stay in the sandbox for new sites with new domains is 6 months on average, although it can vary from less than a month to over 8 months.
Sandbox and Aging Delay
While it might be considered unfair to stop new sites by artificial means like keeping them at the bottom of search results, there is a fair amount of reasoning why search engines, and above all Google, have resorted to such measures. With blackhat practices like bulk buying of links, creation of duplicate content or simply keyword stuffing to get to the coveted top, it is no surprise that Google chose to penalize new sites, which overnight get tons of backlinks, or which are used as a source of backlinks to support an older site (possibly owned by the same company). Needless to say, when such fake sites are indexed and admitted to top positions, this deteriorates search results, so Google had to take measures for ensuring that such practices will not be tolerated. The sandbox effect works like a probation period for new sites and by making the practice of farming fake sites a long-term, rather than a short-term payoff for site owners, it is supposed to decrease its use.
Sandbox and aging delay are similar in meaning, and many SEO experts use the terms interchangeably. Aging delay is more self-explanatory – sites are “delayed” till they come of age. Unlike in legislation, though, with search engines this age is not defined and it differs from case to case. There are cases when several sites were launched on the same day and indexed within a week of each other, yet the aging delay for each of them expired in a different month. As you see, the sandbox is something beyond your control and you cannot avoid it, but there are still steps you can take to minimize the damage for new sites with new domains.
Minimizing Sandbox Damages
While the Google sandbox is not something you can control, there are certain steps you can take to make the sandbox effect less destructive for your new site. As with many aspects of SEO, there are ethical and unethical tips and tricks, and the unethical ones can get you additional penalties or a complete ban from Google, so think twice before resorting to them. The unethical approaches will not be discussed in this article because they do not comply with our policy.
Before we delve into more detail about particular techniques to minimize sandbox damage, it is necessary to clarify the general rule: you cannot fight the sandbox. The only thing you can do is adapt to it and patiently wait for time to pass. Any attempts to fool Google – from writing melodramatic letters to Google, to using “sandbox tools” to bypass the filter – can only make your situation worse. Still, there are many initiatives you can take while in the sandbox, for example:
•    Actively gather content and good links – as time passes by, relevant and fresh content and good links will take you to the top. When getting links, have in mind that they need to be from trusted sources – like DMOZ, CNN, Fortune 500 sites, or other reputable places. Also, links from .edu, .gov, and .mil domains might help because these domains are usually exempt from the sandbox filter. Don't get 500 links a month – this will kill your site! Instead, build links slowly and steadily.
•    Plan ahead – contrary to the general practice of launching a site only when it is absolutely complete, launch a couple of pages as soon as you have them. This will start the clock, and time will run parallel to your site development efforts.
•    Buy old or expired domains – the sandbox effect is more serious for new sites on new domains, so if you buy an old or expired domain and launch your new site there, you'll experience fewer problems.
•    Host with a well-established host – another solution is to host your new site on a subdomain of a well-established host (however, free hosts are generally not a good idea in terms of SEO ranking). The sandbox effect is not so severe for new subdomains (unless the domain itself is blacklisted). You can also host the main site on a subdomain and host just some content, linked with the main site, on a separate domain. You can also use redirects from the subdomained site to the new one, although the effect of this practice is questionable because it can also be viewed as an attempt to fool Google.
•    Concentrate on less popular keywords – the fact that your site is sandboxed does not mean that it is not indexed by Google at all. On the contrary, you might be able to top the search results from the very beginning! Does this look like a contradiction with the rest of the article? Not at all – you could top the results for less popular keywords, which is certainly better than nothing. And while you wait to get to the top for the most lucrative keywords, you may discover that even less popular keywords are enough to keep the ball rolling, so it is worth doing some optimization for them.
•    Rely more on non-Google ways to increase traffic – it is worth remembering that Google is not the only search engine or marketing tool out there. If you plan your SEO efforts to include other search engines, which either have no sandbox at all or keep sites there for a relatively short time, this will also minimize the damage of the sandbox effect.
Optimizing for Yahoo!
Back in the dawn of the Internet, Yahoo! was the most popular search engine. When Google arrived, its indisputably precise search results made it the preferred search engine. However, Google is not the only search engine, and it is estimated that about 20-25% of searches are conducted on Yahoo!. Another major player on the market is MSN, which means that SEO professionals cannot afford to optimize only for Google but need to take into account the specifics of the other two engines (Yahoo! and MSN) as well.
Optimizing for three search engines at the same time is not an easy task. There were times when the SEO community was inclined to think that the algorithm of Yahoo! was deliberately just the opposite of the Google algorithm, because pages that ranked high in Google did not do so well in Yahoo! and vice versa. Attempts to optimize a site to appeal to both search engines usually led to it being kicked out of the top of both of them.
Although there is no doubt that the algorithms of the two search engines are different, it is not possible to say for certain what exactly differs: both are constantly changing, neither is made publicly available by its authors, and the details about how each algorithm functions are obtained by speculation based on trial-and-error tests for particular keywords. What is more, given the frequency with which algorithms change, it is not possible to react to every slight change, even if the algorithms' details were known officially. But knowing some basic differences between the two does help to get better rankings. The Yahoo vs Google tool gives a nice visual representation of the differences in positioning between Yahoo! and Google.
The Yahoo! Algorithm - Differences With Google
Like all search engines, Yahoo! spiders the pages on the Web, indexes them in its database and later performs various mathematical operations to produce the pages with the search results. Yahoo! Slurp (the Yahoo! spider) is the second most active crawler on the Web. Yahoo! Slurp is not different from the other bots, and if your page misses important elements of the SEO mix and is not spiderable, it hardly matters which algorithm is used, because you will never get to a top position. (You may want to try the Search Engine Spider Simulator and check which parts of your pages are spiderable.)
Yahoo! Slurp might be even more active than Googlebot, because occasionally there are more pages in the Yahoo! index than in Google's. Another alleged difference between Yahoo! and Google is the sandbox (putting sites “on hold” for some time before they appear in search results). Google's sandbox is deeper, so if you have made recent changes to your site, you might have to wait a month or two (less for Yahoo!, longer for Google) till these changes are reflected in the search results.
With new major changes in the Google algorithm under way (the so-called “BigDaddy” Infrastructure expected to be fully launched in March-April 2006) it's hard to tell if the same SEO tactics will be hot on Google in two months' time. One of the supposed changes is the decrease in weight of links. If this happens, a major difference between Yahoo! and Google will be eliminated because as of today Google places more importance on factors such as backlinks, while Yahoo! sticks more to onpage factors, like keyword density in the title, the URL, and the headings.
Of all the differences between Yahoo! and Google, the way keywords in the title and in the URL are treated is the most important. If you have the keyword in these two places, you can expect a top 10 place in Yahoo!. But beware – a title and a URL cannot be unlimited in length, and technically you can place no more than 3 or 4 keywords there. It also matters whether the keyword in the title and in the URL is in its basic form or a derivative – e.g. when searching for “cat”, URLs with “catwalk” will also be displayed in Yahoo!, but most likely in the second hundred results, while URLs with just “cat” are quite near the top.
Since Yahoo! is first a directory for submissions and then a search engine (with Google it's just the opposite), a site which has the keyword in the category it is listed under stands a better chance of being at the beginning of the search results. With Google this is not that important. For Yahoo!, keywords in filenames also score well, while for Google this is not a factor of exceptional importance.
But the major difference is keyword density. The higher the density, the higher the positioning with Yahoo!. But beware – some keyword-rich sites that do well on Yahoo! can easily fall into the keyword-stuffed category for Google, so if you attempt to score well on Yahoo! (with keyword density above 7-8%), you risk being banned by Google!
Yahoo! WebRank
Following Google's example, Yahoo! introduced a Web toolbar that collects anonymous statistics about which sites users browse, thus obtaining an aggregated value (from 0 to 10) of how popular a given site is. The higher the value, the more popular the site and the more valuable the backlinks from it.
Although WebRank and positioning in the search results are not directly correlated, there is a dependency between them – sites with a high WebRank tend to rank higher than comparable sites with a lower WebRank, and the WebRanks of the top 20-30 results for a given keyword are most often above 5.00 on average.
The practical value of WebRank as a measure of success is often discussed in SEO communities, and the general opinion is that it is not the most relevant metric. However, one of the benefits of WebRank is that it alerts Yahoo! Slurp that a new page has appeared, thus inviting it to spider the page if it is not already in the Yahoo! Search index.
When the Yahoo! toolbar was launched in 2004, it had an icon that showed the WebRank of the page currently open in the browser. This feature was later removed, but there are still tools on the Web that allow you to check the WebRank of a particular page. For instance, this tool allows you to check the WebRanks of a whole bunch of pages at a time.
See Your Site With the Eyes of a Spider
Making efforts to optimize a site is great, but what counts is how search engines see those efforts. While even the most careful optimization does not guarantee top positions in search results, if your site does not follow basic search engine optimization truths, it is more than certain that it will not score well with search engines. One way to check in advance how your SEO efforts are seen by search engines is to use a search engine spider simulator.
Spiders Explained
Basically all search engine spiders function on the same principle – they crawl the Web and index pages, which are stored in a database, and later use various algorithms to determine the ranking, relevancy, etc. of the collected pages. While the algorithms for calculating ranking and relevancy differ widely among search engines, the way they index sites is more or less uniform, and it is very important that you know what spiders are interested in and what they neglect.
Search engine spiders are robots and they do not read your pages the way a human does. Instead, they see only particular items and are blind to many extras (Flash, JavaScript) that are intended for humans. Since spiders determine whether humans will find your site, it is worth considering what spiders like and what they don't.
Flash, JavaScript, Image Text or Frames?!
Flash, JavaScript and image text are NOT visible to search engines, and frames are a real disaster in terms of SEO ranking. All of them might be great in terms of design and usability, but for search engines they are absolutely wrong. A grave mistake is to have a Flash intro page (frames or no frames – this will hardly make the situation worse) with the keywords buried in the animation. Run a page with Flash and images (and preferably no text or inbound or outbound hyperlinks) through the Search Engine Spider Simulator tool and you will see that to search engines this page appears almost blank.
Running your site through this simulator will show you more than the fact that Flash and JavaScript are not SEO favorites. In a way, spiders are like text browsers and they don't see anything that is not a piece of text. So an image with text in it means nothing to a spider and it will ignore it. A workaround (recommended as an SEO best practice) is to include a meaningful description of the image in the ALT attribute of the <IMG> tag, but be careful not to use too many keywords in it, because you risk penalties for keyword stuffing. The ALT attribute is especially essential when you use images rather than text for links. You can also use ALT text to describe what a Flash movie is about, but again, be careful not to cross the line between optimization and over-optimization.
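A bare-bones version of such a spider simulator can be written with Python's standard HTML parser – it keeps plain text and ALT descriptions and drops scripts entirely, in line with the behavior described above (a rough sketch, not how any real spider is implemented):

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Collects only what a text-driven spider would see on a page."""
    def __init__(self):
        super().__init__()
        self.text = []
        self._skip = 0  # depth inside <script>/<style> blocks

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "img":
            alt = dict(attrs).get("alt")  # ALT text is the only image content kept
            if alt:
                self.text.append(alt)

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

page = '<p>Welcome</p><script>var x=1;</script><img src="cat.jpg" alt="black cat">'
viewer = SpiderView()
viewer.feed(page)
print(" ".join(viewer.text))  # Welcome black cat
```

Note how the script's code vanishes and the image survives only through its ALT text – exactly the effect the article warns about.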
Are Your Hyperlinks Spiderable?
The search engine spider simulator can be of great help when trying to figure out if hyperlinks lead to the right place. For instance, link exchange websites often put fake links to your site with JavaScript (using mouseover events and similar tricks to make the link look genuine), but this is not actually a link that search engines will see and follow. Since the spider simulator will not display such links, you'll know that something about the link is wrong.
It is highly recommended to use the <noscript> tag as a complement to JavaScript-based menus. The reason is that JavaScript-based menus are not spiderable and all the links in them will be ignored. The solution to this problem is to duplicate all menu item links inside the <noscript> tag. The <noscript> tag can hold a lot, but please avoid using it for link stuffing or any other kind of SEO manipulation.
If you happen to have tons of hyperlinks on your pages (although it is highly recommended to have fewer than 100 hyperlinks on a page), you might have a hard time checking if they are OK. For instance, if some links lead to pages that return “403 Forbidden”, “404 Page Not Found” or similar errors that prevent the spider from accessing the page, it is certain that those pages will not be indexed. It is necessary to mention that a spider simulator does not deal with 403 and 404 errors, because it checks where links lead, not whether the target of the link is in place, so you need to use other tools to check that the targets of hyperlinks are the intended ones.
Looking for Your Keywords
While there are specific tools, like the Keyword Playground or the Website Keyword Suggestions, which deal with keywords in more detail, search engine spider simulators also help you see with the eyes of a spider where keywords are located within the text of the page. Why is this important? Because keywords in the first paragraphs of a page weigh more than keywords in the middle or at the end. And even if keywords visually appear to us to be at the top, this may not be the way spiders see them. Consider a standard Web page built with tables. In this case the code that describes the page layout (like navigation links or separate cells with text that are the same site-wide) might come first in the source, and what is worse, it can be so long that the actual page-specific content ends up screens away from the top of the page. When we look at the page in a browser, everything seems fine – the page-specific content is on top – but since in the HTML code it is just the opposite, the page may not be recognized as keyword-rich.
Are Dynamic Pages Too Dynamic to Be Seen At All?
Dynamic pages (especially ones with question marks in the URL) are also an extra that spiders do not love, although many search engines do index dynamic pages as well. Running the spider simulator will give you an idea how well your dynamic pages are accepted by search engines. Useful suggestions how to deal with search engines and dynamic URLs can be found in the Dynamic URLs vs. Static URLs article.
Meta Keywords and Meta Description
Meta keywords and meta description, as the names imply, are found in the <META> tags of an HTML page. Meta keywords and meta descriptions were once the single most important criteria for determining the relevance of a page, but search engines now employ alternative mechanisms for determining relevancy, so you can safely skip listing keywords and a description in meta tags (unless you want to add instructions there for the spider about what to index and what not – apart from that, meta tags are not very useful anymore).
Optimization, Over-Optimization or SEO Overkill?
The fight to top search engines' results knows no limits – neither ethical, nor technical. There are often reports of sites that have been temporarily or permanently excluded from Google and the other search engines because of malpractice and the use of “black hat” SEO techniques. The reaction of search engines is easy to understand – with so many tricks and cheats that SEO experts include in their arsenal, the relevancy of returned results is seriously compromised, to the point where search engines start to deliver completely irrelevant and manipulated search results. And even if search engines do not discover your scams right away, your competitors might report you.

Keyword Density or Keyword Stuffing?
Sometimes SEO experts go too far in their desire to push their clients' sites to top positions and resort to questionable practices, like keyword stuffing. Keyword stuffing is considered an unethical practice because it amounts to using the keyword in question suspiciously often throughout the text. Having in mind that the recommended keyword density is 3 to 7%, anything above that – say 10% – starts to look very much like keyword stuffing, and it is likely that it will not go unnoticed by search engines. A text with 10% keyword density can hardly make sense when read by a human. Some time ago Google implemented the so-called “Florida Update”, essentially imposing a penalty on pages that are keyword-stuffed and over-optimized in general.
Generally, keyword density in the title, the headings, and the first paragraphs matters more. Needless to say, you should be especially careful not to stuff these areas. Try the Keyword Density Cloud tool to check if your keyword density is within acceptable limits, especially in the above-mentioned places. If you have a high density percentage for a frequently used keyword, consider replacing some occurrences of the keyword with synonyms. Also, words in bold and/or italic are generally considered important by search engines, but if every occurrence of the target keywords is in bold and italic, this too looks unnatural, and in the best case it will not push your page up.
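The density check described above is simple enough to script yourself. Below is a minimal sketch in Python; the function name and sample text are invented for illustration, and the 3 to 7% guideline is this article's rule of thumb, not an official search engine figure:

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of all words in the text, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

text = ("Cats are great pets. A cat needs care, "
        "and cats repay it with affection.")
density = keyword_density(text, "cats")
print(f"{density:.1f}%")  # 2 of 14 words, so roughly 14.3%
```

Note that this naive sketch counts exact matches only, so "cat" and "cats" are treated as different words; a real tool would also account for synonyms and word stems.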
Doorway Pages and Hidden Text
Another common keyword scam is doorway pages. Before Google introduced the PageRank algorithm, doorways were a common practice, and there were times when they were not even considered an illegitimate optimization technique. A doorway page is a page made especially for the search engines: it has no meaning for humans but is used to get high positions in search results and to trick users into coming to the site. Although keywords are still very important, today keywords alone have less effect in determining the position of a site in search results, so doorway pages no longer bring much traffic to a site, and if you use them, don't be surprised when Google penalizes you.
Very similar to doorway pages is a scam called hidden text. This is text that is invisible to humans (e.g. the text color is the same as the page background) but is included in the HTML source of the page, trying to fool search engines into thinking the page is keyword-rich. Needless to say, neither doorway pages nor hidden text can be qualified as optimization techniques; they are more manipulation than anything else.
Duplicate Content
It is a basic SEO rule that content is king. But not duplicate content. In Google's terms, duplicate content means text that is the same as the text on a different page on the SAME site (or on a sister site, or on a site so heavily linked to the site in question that the two can be presumed related). In other words, when you copy and paste the same paragraphs from one page on your site to another, you might expect to see your site's rank drop. Most SEO experts believe that syndicated content is not treated as duplicate content, and there are many examples of this: if syndicated content were duplicate content, the sites of news agencies would have been the first to drop out of search results. Still, it does not hurt to check from time to time whether your site shares duplicate content with another, if only because somebody might be illegally copying your content without your knowledge. The Similar Page Checker tool will help you see if you have grounds to worry about duplicate content.
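A rough way to compare two pages for near-duplicate text is a word-sequence similarity score. The sketch below uses Python's standard difflib module; the sample strings are invented, and the Similar Page Checker mentioned above presumably uses a more elaborate method:

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Return a 0-100 similarity score between two blocks of text."""
    ratio = SequenceMatcher(None, text_a.split(), text_b.split()).ratio()
    return 100.0 * ratio

page_a = "It is a basic SEO rule that content is king."
page_b = "It is a basic SEO rule that content is king, but not duplicate content."
page_c = "Apache rewrites URLs using rules in the .htaccess file."

print(round(similarity(page_a, page_b)))  # high score: near-duplicate text
print(round(similarity(page_a, page_c)))  # low score: unrelated text
```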
Link Spam
Links are another major SEO tool and, like the other SEO tools, they can be used or misused. While backlinks are certainly important (for Yahoo the quantity of backlinks matters most, while for Google it is more important which sites the backlinks come from), getting tons of backlinks from a link farm or a blacklisted site is begging to be penalized. Also, if outbound links (links from your site to other sites) considerably outnumber your inbound links (links from other sites to your site), then you have put too much effort into creating useless links, because this will not improve your ranking. You can use the Domain Stats Tool to see the number of backlinks (inbound links) to your site and the Site Link Analyzer to see how many outbound links you have.
Using keywords in links (the anchor text), domain names, and folder and file names does boost your search engine rankings, but again, precise measure is the boundary between topping the search results and being kicked out of them. For instance, suppose you are optimizing for the keyword “cat”, a frequently chosen keyword, and as with all popular keywords and phrases, competition is fierce. You might see no alternative for reaching the top but to get a domain name like http://www.cat-cats-kittens-kitty.com, which no doubt is packed with keywords to the maximum. But such a name is, first, difficult to remember, and second, if the contents do not correspond to the plenitude of cats in the domain name, you will never top the search results.
Although file and folder names are less important than domain names, now and then (but definitely not all the time) you can include “cat” (and synonyms) in them and in the anchor text of links. This counts in your favor, provided the anchors are not artificially stuffed (for instance, using “cat_cats_kitten” as the anchor for internal site links is certainly stuffing). While you have no control over the third parties that link to you and the anchors they use, it is up to you to perform periodic checks of what anchor text other sites use to link to you. A handy tool for this task is the Backlink Anchor Text Analysis, where you enter the URL and get a listing of the sites that link to you and the anchor text they use.
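Checking the anchor text on a given page is also easy to automate. Here is a minimal sketch using Python's standard html.parser with an invented sample snippet; the Backlink Anchor Text Analysis tool mentioned above does this kind of check across whole sites:

```python
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    """Collect (href, anchor text) pairs from an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

html = '<p>See <a href="http://example.com/kittens">orphaned kitten care tips</a>.</p>'
parser = AnchorCollector()
parser.feed(html)
print(parser.links)  # [('http://example.com/kittens', 'orphaned kitten care tips')]
```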
Finally, to Google and the other search engines it makes no difference whether a site is intentionally over-optimized to cheat them or the over-optimization is the result of good intentions. So no matter what your motives are, always keep to reasonable practices and remember not to overstep the line.

Ranking in Country Specific Search Engines
In the world of Search Engine Optimization, location is important. Search engines like to bring relevant results to a user, not only in the area of keywords and sites that give the user exactly what they are looking for, but in the correct language as well. It doesn't do a lot of good for a Russian-speaking individual to continually get websites returned in a search query that are written in Arabic or in Chinese. So a search engine has to have some way to return the results the user is looking for in the right language, and its goal is also to get the user as close to home as possible in their search results.
Many people wonder why their websites don't rank well in some search engines, especially if they are trying to get ranked in a search engine based in another country. Perhaps they don't even know their site is in another country. You say that is impossible: how could one not know what country one's own site is in? It might surprise that individual to find that their website might in fact be hosted in a completely different country, perhaps even on another continent!
Consider that many search engines, including Google, will determine country not only based on the domain name (like .co.uk or .com.au), but also the country of a website's physical location based upon IP address. Search engines are programmed with information that tells them which IP addresses belong to which particular country, as well as which domain suffixes are assigned to which countries.
Let's say, for instance, that you are wishing to rank highly in Google based in the United States. It would not do well, then, for you to have your website hosted in Japan or Australia. You might have to switch your web host to one whose servers reside in the United States.
There is a tool we like to use called the Website to Country Tool, which shows you which country your website is hosted in. Knowing this can also help you determine a possible reason why your website may not be ranking as highly as you might like in a particular search engine.
It might be disheartening to learn that your website has been hosted in another country, but it is better to understand why your site might not be ranking as highly as you'd like it to be, especially when there is something you can definitely do about it.
The Age of a Domain Name
One of the many factors in Google's search engine algorithm is the age of a domain name. In a small way, the age of a domain gives the appearance of longevity and therefore a higher relevancy score in Google.
Driven by spam sites which pop up and die off quickly, the age of a domain is usually a sign of whether a site is yesterday's news or tomorrow's popular site. We see this in the world of business, for example. While the novelty of a new store in town may bring a short burst of initial business, people tend to trust a business that has been around for a long time over one that is brand new. The same is true for websites. Or, as Rob from BlackwoodProductions.com says, "Rent the store (i.e. register the domain) before you open for business".
Two things that are considered in the age of a domain name are:
•    The age of the website
•    The length of time a domain has been registered
The age of the website is built up of how long the content has been actually on the web, how long the site has been in promotion, and even the last time content was updated. The length of time a domain has been registered is measured by not only the actual date the domain was registered, but also how long it is registered for. Some domains only register for a year at a time, while others are registered for two, five, or even ten years.
In the latest Google update, which SEOs call the Jagger Update, some of the big changes seen were the importance given to age: the age of incoming links, the age of web content, and the date the domain was registered. There were many things, in reality, that were changed in this last update, but since we're talking about the age of a domain, we'll only deal with those issues specifically. We'll talk more in other articles about other factors Google changed in its evaluation criteria for websites on the Internet.
One of the ways Google minimizes search engine spam is by giving new websites a waiting period of three to four months before assigning them any kind of PageRank. This is referred to as the "sandbox effect". It's called that because it has been said that Google wants to see if those sites are serious about staying around on the web. The sandbox analogy comes from the idea that Google throws all of the new sites into a sandbox and lets them play together, away from all the adults. Then, when those new sites "grow up", so to speak, they are allowed to be categorized with the "adults", the websites that aren't considered new.
What does this mean to you? For those of you with new websites, you may be disappointed in this news, but don't worry. There are some things you can do while waiting for the sandbox period to expire, such as concentrating on your backlink strategies, promoting your site through Pay-per-click, articles, RSS feeds, or in other ways. Many times, if you spend this sandbox period wisely, you'll be ready for Google when it does finally assign you a PageRank, and you could find yourself starting out with a great PageRank!
Even though the domain's age is a factor, critics believe it gets only a little weight in the algorithm. Since the age of your domain is something you have no control over, it doesn't necessarily mean that your site isn't going to rank well in the Search Engine Results Pages (SERPs). It does mean, however, that you will have to work harder to build up your site's popularity and concentrate on factors that you can control, like inbound links and the type of content you present on your website.
So what happens if you change your domain name? Does this mean you're going to get a low grade with a search engine if you have a new site? No, not necessarily. There are a few things you can do to help ensure that your site won't get lost in the SERPs because of the age of the domain.
1. Make sure you register your domain name for the longest amount of time possible. Many registrars allow you to register a domain name for as long as five years, and some even longer. Registering your domain for a longer period of time gives an indication that your site intends to be around for a long time, and isn't going to just disappear after a few months. This will help boost your score with regards to your domain's age.
2. Consider registering a domain name even before you are sure you're going to need it. We see many domains out there that, even while they are registered, don't have a website to go with them. This could mean that the site is in development, or simply that someone saw the use of that particular domain name and wanted to snatch it up before someone else did. There doesn't seem to be any problem with this method so far, so it certainly can't hurt you to buy a domain name you think could be catchy, even if you end up just selling it later on.
3. Think about purchasing a domain name that was already pre-owned. Not only will this allow you to avoid the "sandbox effect" of a new website in Google, but it also allows you to keep whatever PageRank may have already been attributed to the domain. Be aware that most pre-owned domains with PageRank aren't as cheaply had as a new domain, but it might be well worth it to you to invest a bit more money right at the start.
4. Keep track of your domain's age. One of the ways you can determine the age of a domain is with the handy Domain Age Tool, which shows you the approximate age of a website on the Internet. This can be very helpful in determining what kind of edge your competitors might have over you, and even what a site might have looked like when it first started.
To use it, simply type in the URL of your domain and the URLs of your competitors, and click submit. This will give you the age of the domains and other interesting information, like anything that had been cached from the site initially. This could be especially helpful if you are purchasing a pre-owned domain.
Because trustworthy sites are going to be the wave of the future, factoring in the age of a domain is a good idea. Age is not a full measure of how trustworthy a site is or will be: a site that has been around for years may suddenly go belly-up, and the next big eBay or Yahoo! just might be getting its start. This is why many other factors weigh into a search engine's algorithm, and never a single factor alone. What we do know is that age has become more important than it had been previously, and there are only good things to be said about having a site that's been around for a while.

The Importance of Backlinks
If you've read anything about or studied Search Engine Optimization, you've come across the term "backlink" at least once. For those of you new to SEO, you may be wondering what a backlink is, and why they are important. Backlinks have become so important to the scope of Search Engine Optimization, that they have become some of the main building blocks to good SEO. In this article, we will explain to you what a backlink is, why they are important, and what you can do to help gain them while avoiding getting into trouble with the Search Engines.
What are "backlinks"? Backlinks are links that are directed towards your website. Also knows as Inbound links (IBL's). The number of backlinks is an indication of the popularity or importance of that website. Backlinks are important for SEO because some search engines, especially Google, will give more credit to websites that have a good number of quality backlinks, and consider those websites more relevant than others in their results pages for a search query.
When search engines calculate the relevance of a site to a keyword, they consider the number of QUALITY inbound links to that site. So we should not be satisfied with merely getting inbound links; it is the quality of each inbound link that matters. A search engine considers the content of the linking sites to determine the QUALITY of a link. When inbound links to your site come from other sites whose content is related to yours, these inbound links are considered more relevant to your site. If inbound links are found on sites with unrelated content, they are considered less relevant. The higher the relevance of inbound links, the greater their quality.
For example, if a webmaster has a website about how to rescue orphaned kittens, and received a backlink from another website about kittens, then that would be more relevant in a search engine's assessment than say a link from a site about car racing. The more relevant the site is that is linking back to your website, the better the quality of the backlink.
Search engines want websites to have a level playing field, and look for natural links built slowly over time. While it is fairly easy to manipulate links on a web page to try to achieve a higher ranking, it is a lot harder to influence a search engine with external backlinks from other websites. This is also a reason why backlinks factor so highly into a search engine's algorithm. Lately, however, search engines' criteria for quality inbound links have gotten even tougher, thanks to unscrupulous webmasters trying to achieve these inbound links by deceptive or sneaky techniques, such as hidden links, or automatically generated pages whose sole purpose is to provide inbound links to websites. These pages are called link farms, and they are not only disregarded by search engines, but linking to a link farm could get your site banned entirely.
Another reason to achieve quality backlinks is to entice visitors to come to your website. You can't build a website, and then expect that people will find your website without pointing the way. You will probably have to get the word out there about your site. One way webmasters got the word out used to be through reciprocal linking. Let's talk about reciprocal linking for a moment.
There is much discussion in these last few months about reciprocal linking. In the last Google update, reciprocal links were one of the targets of the search engine's latest filter. Many webmasters had agreed upon reciprocal link exchanges, in order to boost their site's rankings with the sheer number of inbound links. In a link exchange, one webmaster places a link on his website that points to another webmasters website, and vice versa. Many of these links were simply not relevant, and were just discounted. So while the irrelevant inbound link was ignored, the outbound links still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.
We must be careful with our reciprocal links. There is a Google patent in the works that will deal not only with the popularity of the sites being linked to, but also with how trustworthy the sites you link to from your own website are. This will mean that you could get into trouble with the search engine just for linking to a bad apple. We can begin preparing for this future change in the search engine algorithm by being choosier right now about which sites we exchange links with. By choosing only relevant sites to link with, sites that don't have tons of outbound links on a page, and sites that don't practice black-hat SEO techniques, we have a better chance that our reciprocal links won't be discounted.
Many webmasters have more than one website. Sometimes these websites are related, sometimes they are not. You also have to be careful about interlinking multiple websites on the same IP. If you own seven related websites, a link to each of them on a page could hurt you, as it may look to a search engine like you are trying to do something fishy. Many webmasters have tried to manipulate backlinks in this way, and placing too many links to sites with the same IP address is referred to as backlink bombing.
One thing is certain: interlinking sites doesn't help you from a search engine standpoint. The only reason you may want to interlink your sites in the first place might be to provide your visitors with extra resources to visit. In this case, it would probably be okay to provide visitors with a link to another of your websites, but try to keep many instances of linking to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you.
There are a few things to consider when beginning your backlink building campaign. It is helpful to keep track of your backlinks, to know which sites are linking back to you, and to know how the anchor text of each backlink incorporates keywords relating to your site. A tool to help you keep track of your backlinks is the Domain Stats Tool. This tool displays the backlinks of a domain in Google, Yahoo, and MSN. It will also tell you a few other details about your website, like your listing in the Open Directory (DMOZ), from which Google regards backlinks as highly important; your Alexa traffic rank; and how many pages from your site have been indexed, to name just a few.
Another tool to help you with your link building campaign is the Backlink Builder Tool. It is not enough just to have a large number of inbound links pointing to your site. Rather, you need to have a large number of QUALITY inbound links. This tool searches for websites that have a related theme to your website which are likely to add your link to their website. You specify a particular keyword or keyword phrase, and then the tool seeks out related sites for you. This helps to simplify your backlink building efforts by helping you create quality, relevant backlinks to your site, and making the job easier in the process.
There is another way to gain quality backlinks to your site, in addition to related site themes: anchor text. When a link incorporates a keyword into the text of the hyperlink, we call this quality anchor text. A link's anchor text may be one of the most under-estimated resources a webmaster has. Instead of using words like "click here", which probably won't relate in any way to your website, using the words "Please visit our tips page for how to nurse an orphaned kitten" is a far better way to utilize a hyperlink. A good tool for finding your backlinks and the text being used to link to your site is the Backlink Anchor Text Analysis Tool. If you find that your site is being linked to from another website but the anchor text is not being utilized properly, you should request that the website change the anchor text to something incorporating relevant keywords. This will also help boost your quality backlinks score.
Building quality backlinks is extremely important to Search Engine Optimization, and because of their importance, it should be very high on your priority list in your SEO efforts. We hope you have a better understanding of why you need good quality inbound links to your site, and have a handle on a few helpful tools to gain those links.
Dynamic URLs vs. Static URLs
The Issue at Hand
Websites that use databases to insert content into a webpage by way of a dynamic script like PHP or JavaScript are increasingly popular. This type of site is considered dynamic, and many websites choose dynamic content over static content. This is because if a website has thousands of products or pages, writing or updating each static page by hand is a monumental task.
There are two types of URLs: dynamic and static. A dynamic URL is a page address that results from the search of a database-driven web site or the URL of a web site that runs a script. In contrast to static URLs, in which the contents of the web page stay the same unless the changes are hard-coded into the HTML, dynamic URLs are generated from specific queries to a site's database. The dynamic page is basically only a template in which to display the results of the database query. Instead of changing information in the HTML code, the data is changed in the database.
But there is a risk when using dynamic URLs: search engines don't like them. Those most at risk of losing search engine positioning due to dynamic URLs are e-commerce stores, forums, sites utilizing content management systems like Mambo, blogs like WordPress, and any other database-driven website. Many times the URL that is generated for the content in a dynamic site looks something like this:
http://www.somesites.com/forums/thread.php?threadid=12345&sort=date

A static URL on the other hand, is a URL that doesn't change, and doesn't have variable strings. It looks like this:
http://www.somesites.com/forums/the-challenges-of-dynamic-urls.htm
Static URLs are typically ranked better in search engine results pages, and they are indexed more quickly than dynamic URLs, if dynamic URLs get indexed at all. Static URLs are also easier for the end-user to view and understand what the page is about. If a user sees a URL in a search engine query that matches the title and description, they are more likely to click on that URL than one that doesn't make sense to them.
A search engine wants to list only unique pages in its index. To combat duplicate listings, some search engines cut off URLs after a specific number of variable-string characters (e.g.: ?, &, =).
For example, let's look at three URLs:
   http://www.somesites.com/forums/thread.php?threadid=12345&sort=date
   http://www.somesites.com/forums/thread.php?threadid=67890&sort=date
   http://www.somesites.com/forums/thread.php?threadid=13579&sort=date

All three of these URLs point to three different pages. But if the search engine purges the information after the first offending character, the question mark (?), then all three pages look the same:

   http://www.somesites.com/forums/thread.php
   http://www.somesites.com/forums/thread.php
   http://www.somesites.com/forums/thread.php

Now, you don't have unique pages, and consequently, the duplicate URLs won't be indexed.
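The collapse described above is easy to demonstrate: strip each URL at the first question mark, and the three distinct threads become indistinguishable. A quick sketch:

```python
urls = [
    "http://www.somesites.com/forums/thread.php?threadid=12345&sort=date",
    "http://www.somesites.com/forums/thread.php?threadid=67890&sort=date",
    "http://www.somesites.com/forums/thread.php?threadid=13579&sort=date",
]

# Drop everything from the query string onward, as described above:
# three distinct pages collapse into a single URL.
stripped = {url.split("?")[0] for url in urls}
print(stripped)  # {'http://www.somesites.com/forums/thread.php'}
```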
Another issue is that dynamic pages generally do not have any keywords in the URL. It is very important to have keyword rich URLs. Highly relevant keywords should appear in the domain name or the page URL. This became clear in a recent study on how the top three search engines, Google, Yahoo, and MSN, rank websites.
The study involved taking hundreds of highly competitive keyword queries, like travel, cars, and computer software, and comparing factors involving the top ten results. The statistics show that of those top ten, Google has 40-50% of those with the keyword either in the URL or the domain; Yahoo shows 60%; and MSN has an astonishing 85%! What that means is that to these search engines, having your keywords in your URL or domain name could mean the difference between a top ten ranking, and a ranking far down in the results pages.
The Solution
So what can you do about this difficult problem? You certainly don't want to have to go back and recode every single dynamic URL into a static URL. This would be too much work for any website owner.
If you are hosted on a Linux server, then you will want to make the most of the Apache Mod Rewrite Rule, which gives you the ability to inconspicuously redirect one URL to another, without the user's (or a search engine's) knowledge. You will need to have this module installed in Apache; for more information, you can view the documentation for this module here. This module saves you from having to rewrite your static URLs manually.
How does this module work? When a request comes in to a server for the new static URL, the Apache module redirects the URL internally to the old, dynamic URL, while still looking like the new static URL. The web server compares the URL requested by the client with the search pattern in the individual rules.
For example, when someone requests this URL:
http://www.somesites.com/forums/thread-threadid-12345.htm

The server looks for and compares this static-looking URL to what information is listed in the .htaccess file, such as:
RewriteEngine on
RewriteRule thread-threadid-(.*)\.htm$ thread.php?threadid=$1
It then converts the static URL to the old dynamic URL that looks like this, with no one the wiser:
   http://www.somesites.com/forums/thread.php?threadid=12345
You now have a URL that not only will rank better in the search engines, but that your end-users can understand at a glance, while Apache's Mod Rewrite Rule handles the conversion for you and the dynamic URL keeps working behind the scenes.
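The RewriteRule shown above is an ordinary regular-expression substitution, and you can sketch what Apache does internally in a few lines of Python (this mirrors the rule for illustration only; it is not how you would deploy it):

```python
import re

# Mirrors: RewriteRule thread-threadid-(.*)\.htm$ thread.php?threadid=$1
rule = re.compile(r"thread-threadid-(.*)\.htm$")

def rewrite(path):
    """Map a static-looking path to the underlying dynamic URL, if the rule matches."""
    match = rule.search(path)
    if match:
        return f"thread.php?threadid={match.group(1)}"
    return path  # no rule matched; serve the path as requested

print(rewrite("thread-threadid-12345.htm"))  # thread.php?threadid=12345
print(rewrite("index.htm"))                  # unchanged
```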
If you are not particularly technical, you may not wish to attempt to figure out the complex Mod Rewrite code and how to use it, or you simply may not have the time to embark upon a new learning curve. Therefore, it would be extremely beneficial to have something to do it for you. This URL Rewriting Tool can definitely help you. What this tool does is implement the Mod Rewrite Rule in your .htaccess file to secretly convert a URL to another, such as with dynamic and static ones.
With the URL Rewriting Tool, you can opt to rewrite single pages or entire directories. Simply enter the URL into the box, press submit, and copy and paste the generated code into your .htaccess file on the root of your website. You must remember to place any additional rewrite commands in your .htaccess file for each dynamic URL you want Apache to rewrite. Now, you can give out the static URL links on your website without having to alter all of your dynamic URLs manually because you are letting the Mod Rewrite Rule do the conversion for you, without JavaScript, cloaking, or any sneaky tactics.
Another thing you must remember to do is to change all of your links in your website to the static URLs in order to avoid penalties by search engines due to having duplicate URLs. You could even add your dynamic URLs to your Robots Exclusion Standard File (robots.txt) to keep the search engines from spidering the duplicate URLs. Regardless of your methods, after using the URL Rewrite Tool, you should ideally have no links pointing to any of your old dynamic URLs.
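For example, a robots.txt entry that keeps spiders away from the old dynamic forum URLs in this article's example could look like this (the path is illustrative):

```
User-agent: *
Disallow: /forums/thread.php
```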
You have multiple reasons to utilize static URLs in your website whenever possible. When that's not possible, and you need to keep your database-driven content at those old dynamic URLs, you can still give end-users and search engines a static URL to navigate, while underneath they are still your dynamic URLs in disguise. When a search engine engineer was asked if this method was considered "cloaking", he responded that it was not, and that in fact search engines prefer you do it this way. The URL Rewrite Tool not only saves you time and energy by helping you use static URLs, converting them transparently to your dynamic URLs, but it will also save your rankings in the search engines.

Duplicate Content Filter: What it is and how it works
Duplicate Content has become a huge topic of discussion lately, thanks to the new filters that search engines have implemented. This article will help you understand why you might be caught in the filter, and ways to avoid it. We'll also show you how you can determine if your pages have duplicate content, and what to do to fix it.
Search engine spam is any deceitful attempt to deliberately trick the search engine into returning inappropriate, redundant, or poor-quality search results. Many times this behavior is seen in pages that are exact replicas of other pages, created to receive better results in the search engine. Many people assume that creating multiple or similar copies of the same page will either increase their chances of getting listed in search engines or help them get multiple listings, due to the presence of more keywords.
In order to make a search more relevant to a user, search engines use a filter that removes duplicate content pages from the search results, and the spam along with it. Unfortunately, good, hardworking webmasters have fallen prey to the filters imposed by the search engines that remove duplicate content. These webmasters often spam the search engines unknowingly, even though there are things they can do to avoid being filtered out. For you to truly understand the concepts you can implement to avoid the duplicate content filter, you need to know how this filter works.
First, we must understand that the term "duplicate content penalty" is actually a misnomer. When we refer to penalties in search engine rankings, we are talking about points that are deducted from a page to arrive at an overall relevancy score. In reality, duplicate content pages are not penalized. Rather, they are simply filtered, the way you would use a sieve to remove unwanted particles. Sometimes, "good particles" are accidentally filtered out.
Knowing the difference between the filter and the penalty, you can now understand how a search engine determines what duplicate content is. There are basically four types of duplicate content that are filtered out:
1.    Websites with Identical Pages - Pages that are identical to one another are considered duplicate content, and websites that are identical to another website on the Internet are considered spam. Affiliate sites with the same look and feel that contain identical content, for example, are especially vulnerable to a duplicate content filter. Another example is a website with doorway pages. Many times, these doorways are skewed versions of landing pages that are otherwise identical to other landing pages. Generally, doorway pages are intended to spam the search engines in order to manipulate search engine results.
2.    Scraped Content - Scraped content takes content from a website and repackages it to look different, but in essence it is nothing more than a duplicate page. With the popularity of blogs on the Internet and the syndication of those blogs, scraping is becoming more of a problem for search engines.
3.    E-Commerce Product Descriptions - Many eCommerce sites use the manufacturer's descriptions for their products, the same descriptions that hundreds or thousands of other eCommerce stores in the same competitive markets are using too. This duplicate content, while harder to spot, is still considered spam.
4.    Distribution of Articles - If you publish an article and it gets copied and put all over the Internet, this is good, right? Not necessarily for all the sites that feature the same article. This type of duplicate content can be tricky, because even though Yahoo and MSN determine the source of the original article and deem it most relevant in search results, other search engines, like Google, may not, according to some experts.
So, how does a search engine's duplicate content filter work? Essentially, when a search engine robot crawls a website, it reads the pages and stores the information in its database. Then it compares its findings to the other information in its database. Depending on a few factors, such as the overall relevancy score of a website, it determines which pages are duplicate content, and then filters out the pages or the websites that qualify as spam. Unfortunately, if your pages are not spam but contain enough similar content, they may still be regarded as spam.
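One common way to quantify "enough similar content" is w-shingling: break each page's text into overlapping runs of words and compare the two sets with the Jaccard coefficient. The sketch below is purely illustrative of the idea, not how any particular search engine actually works:

```python
def shingles(text, w=3):
    """Return the set of overlapping w-word runs (shingles) in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Similarity of two shingle sets: |A & B| / |A | B|, from 0.0 to 1.0."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

page1 = "buy cheap widgets online today at our widget store"
page2 = "buy cheap widgets online now at our widget shop"
score = jaccard(shingles(page1), shingles(page2))

# A filter might treat scores above some threshold as duplicates.
print(round(score, 2))  # → 0.27
```

Two near-identical pages would score close to 1.0; a filter only has to pick a cutoff above which one of the pages is dropped from the results.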
There are several things you can do to avoid the duplicate content filter. First, you must be able to check your pages for duplicate content. Using our Similar Page Checker, you will be able to determine the similarity between two pages and make them as unique as possible. By entering the URLs of two pages, this tool will compare them and point out where they are similar, so that you can make them unique.
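You can get a quick local approximation of what such a checker does with Python's standard difflib module; this is only a rough stand-in for a real similarity tool, and the sample strings are made up:

```python
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Rough percentage similarity between two pages' visible text."""
    return SequenceMatcher(None, text_a, text_b).ratio() * 100

a = "Our widgets are the best widgets money can buy."
b = "Our widgets are the finest widgets money can buy."
print(f"{similarity(a, b):.0f}% similar")
```

Running a comparison like this against your own pages before publishing them gives you an early warning that two pages may be too close for comfort.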
Since you need to know which sites might have copied your site or pages, you will need some help. We recommend using a tool that searches for copies of your page on the Internet: www.copyscape.com. Here, you can put in your web page URL to find replicas of your page on the Internet. This can help you create unique content, or even address the issue of someone "borrowing" your content without your permission.
Let's look at the issue of some search engines possibly not considering the source of the original content from distributed articles. Remember, some search engines, like Google, use link popularity to determine the most relevant results. Continue to build your link popularity, while using tools like www.copyscape.com to find how many other sites have the same article; if the author allows it, you may be able to alter the article so as to make the content unique.
If you use distributed articles for your content, consider how relevant the article is to your overall web page and then to the site as a whole. Sometimes, simply adding your own commentary to the articles can be enough to avoid the duplicate content filter; the Similar Page Checker can help you make your content unique. Further, the more relevant articles you can add to complement the first article, the better. Search engines look at the entire web page and its relationship to the whole site, so as long as you aren't exactly copying someone's pages, you should be fine.
If you have an eCommerce site, you should write original descriptions for your products. This can be hard to do if you have many products, but it really is necessary if you wish to avoid the duplicate content filter. Here is another example of why using the Similar Page Checker is a great idea: it can tell you how to change your descriptions so as to have unique and original content for your site. This approach works well for scraped content too. Many scraped content sites offer news. With the Similar Page Checker, you can easily determine where the news content is similar, and then change it to make it unique.
Do not rely on an affiliate site that is identical to other sites, and do not create identical doorway pages. These behaviors are not only filtered out immediately as spam; when another site or page is found to be a duplicate, there is generally no comparison of the page to the site as a whole, which can get your entire site in trouble.
The duplicate content filter is sometimes hard on sites that don't intend to spam the search engines. But it is ultimately up to you to help the search engines determine that your site is as unique as possible. By using the tools in this article to eliminate as much duplicate content as you can, you'll help keep your site original and fresh.
                                                                                                   

                                                            अनुराग सिंह राणा

@$Rana
