Jonathan Leger – SEO And Internet Marketing Blog

29 Nov 2011

Why Google Is Ignoring Your Content And Links

What is the goal of a search engine? More specifically, what is Google's goal?

To deliver high quality, relevant results to searchers. If they can't do that, the traffic dries up, along with the revenue.

So when you're working on your website, whether it's writing content or getting back links, it's important to ask yourself whether what you're doing is what Google is looking for.

If you get it wrong, Google will ignore your content and links, and all of your time and effort will be wasted. After all, if Google doesn't think your site will add to their search quality (and, thereby, their bottom line), they're not going to be interested in your work at all.

On the other hand, give Google what Google is looking for and the traffic will come a runnin'. So let's talk about what Google wants from your website.

For starters, let's talk about the content you're submitting to article directories, blogs and blog networks, and the various web 2.0 properties that Google indexes. It's true that you can generate traffic directly from the content you post to those sites (more on that in a bit), but your primary goal is probably to get links to your site that will get it ranked in Google.

Those are good, valuable practices, to be sure. But are you submitting the same article to multiple sites? If so, ask yourself: how does that contribute to the quality of search results? A little, perhaps, but only when there's nothing else for Google to show.

Google has explicitly stated that, when possible, for any given keyword search they will try to show unique content in the results. That is, unless there's nothing else to show, Google is not going to show the same article from multiple sites in the top 10 results for any keyword search you perform.

That makes sense, right? Would you want to read a book about nutrition, for instance, where every chapter was just a copy of the chapter before it? What good would that do anybody?

In the same way, when people search Google, they expect to get a variety of information back -- not duplicate copies of the same article. Google is designed to give people what they want and expect, and people do not want or expect duplicate content.
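Roughly speaking, that filtering amounts to de-duplicating the result list before showing it. Here's a minimal sketch of the idea in Python -- a crude fingerprint comparison for illustration only, not Google's actual duplicate detection, which is far more sophisticated:

```python
import hashlib

def fingerprint(text):
    """Crude content fingerprint: a hash of the normalized words."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode()).hexdigest()

def dedupe(results):
    """Keep only the first result for each distinct piece of content."""
    seen = set()
    unique = []
    for url, content in results:
        fp = fingerprint(content)
        if fp not in seen:
            seen.add(fp)
            unique.append((url, content))
    return unique

# The same article on two different sites collapses to a single result.
results = [
    ("site-a.com/dog-tips", "Ten tips for a healthier dog ..."),
    ("site-b.com/dog-tips", "Ten tips for a healthier dog ..."),  # duplicate content
    ("site-c.com/dog-care", "A complete guide to caring for senior dogs ..."),
]
print(dedupe(results))
```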

So it makes sense that Google will only show one copy of an article in the search results, which means the content on your site should not appear anywhere else. But have you ever thought about unique content in terms of the links you get to your site?

You probably know that to rank well in Google you need to get links aimed at your site from other web sites. But are you making sure that the links you are getting appear on unique pages of content? When you post content to the sites mentioned earlier, is the content you're submitting unique? It should be. Let's talk about why.

Think about this: let's say you are reading the reviews for a product on Amazon. You want to see what other people think about it before you buy.

What would you think if all 100 reviews were posted under different user names but said exactly the same thing? Would you believe those reviews were fair and honest? No way! You would assume, and rightly so, that the "reviewers" were up to no good. In fact, you would ignore those reviews completely and try to find ones that were unique. True reviews might have some thoughts in common, but they would not be word for word the same.

Well, to Google your web site is the product, and the pages that link to your site are the reviews. One big way that Google makes sure that the pages it displays in the top search results are "quality" pages is by judging the "honesty" of the "reviews" that link to those pages. A BIG factor in that is whether or not those linking pages are unique.

Google will count links from duplicate pages, but it won't apply nearly as much importance to them. Why should it? They are obviously "reviews" written by the same person, even if they appear in different sites (under different "user names", as it were). Google wants to rank sites that get good "reviews" from a lot of different people. That is a better indication of real quality.

So when you're getting links to your site, you need to make sure that the content those links appear on is high quality, unique content. That just makes good sense.

There's another really good reason to make sure that every page linking to yours is unique, though. You see, Google is what's called a "full text search engine." What that means is that Google breaks down content into the words and phrases that it's made up of, and catalogs all of those keywords in an index.

Then, when somebody performs a search, it looks through that index and finds pages that are related to the keywords being searched for. If your page is one of them, and the other quality factors are in place (like links to the page), then Google shows that page in the search results for the keywords.
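If the phrase "full text search engine" is new to you, here's a minimal sketch of the idea in Python -- an illustration of an inverted index, not Google's actual implementation:

```python
from collections import defaultdict

def build_index(pages):
    """Inverted index: each keyword maps to the set of pages containing it."""
    index = defaultdict(set)
    for page_id, text in pages.items():
        for word in text.lower().split():
            index[word].add(page_id)
    return index

def search(index, query):
    """Return the pages that contain every word in the query."""
    words = query.lower().split()
    if not words:
        return set()
    matches = set(index.get(words[0], set()))
    for word in words[1:]:
        matches &= index.get(word, set())
    return matches

pages = {
    "page-a": "healthy dog food recipes",
    "page-b": "healthy dog food recipes",          # duplicate of page-a
    "page-c": "homemade treats for senior dogs",
}

index = build_index(pages)
print(search(index, "dog food"))  # page-a and page-b contribute the exact same keywords
```

Notice that the duplicate page adds nothing new to the index -- which is exactly the problem with duplicate linking pages.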

That means that if all of the pages that link to your site are the same, then all of the keywords that get put into Google's index for those pages will also be the same, which seriously limits the probability of your linking pages popping up in the search results.

For instance, let's say you post the same article to 100 sites, and that article contains 10 phrases that Google considers important enough to put into its index. That gives you 10 keyword phrases that all 100 of those linking pages have a chance of showing up for in the search results.

On the other hand, let's say you distribute a unique article to each of those 100 sites. If each one of those articles contains 10 different phrases that make it into Google's index, that gives you 1,000 keyword phrases that your linking pages have a chance of showing up for.

If each unique article can deliver only one new visitor per day, then all 100 duplicate articles would only deliver 30 visitors a month (because remember, Google will only show one of those pages in the search results). On the other hand, if you have 100 unique articles each delivering one new visitor per day, that's 3,000 new visitors per month.
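Here is that back-of-the-envelope arithmetic spelled out, using the article's illustrative assumptions of 10 indexed phrases per article and one visitor per ranking page per day:

```python
phrases_per_article = 10   # indexed phrases per article (illustrative assumption)
sites = 100                # number of sites you submit to
visitors_per_day = 1       # visitors per ranking page per day (illustrative assumption)
days_per_month = 30

# Same article everywhere: 10 keyword phrases total, and Google shows only one copy.
duplicate_keywords = phrases_per_article                      # 10
duplicate_visitors = 1 * visitors_per_day * days_per_month    # 30 per month

# A unique article on each site: every page adds its own phrases and its own traffic.
unique_keywords = phrases_per_article * sites                 # 1,000
unique_visitors = sites * visitors_per_day * days_per_month   # 3,000 per month

print(duplicate_keywords, duplicate_visitors)  # 10 30
print(unique_keywords, unique_visitors)        # 1000 3000
```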

Start multiplying that out by 1,000 articles, or 10,000 articles, or even 100,000 articles, and you begin to see why unique content is so valuable.

The problem with this inescapable fact is that to rank in Google, even for keywords that don't bring a ton of traffic, you have to get hundreds, even thousands of links. For really high traffic keywords you need hundreds of thousands of links!

To write hundreds or thousands of unique articles would take an enormous amount of time. And you would have to repeat that process for every set of keywords you want to rank for. That's not a problem for huge web properties with massive budgets and hundreds of writers, but for the lone webmaster that presents a real challenge.

You could hire other people to write the articles, but even at a few dollars an article you would quickly run out of money! Not to mention the fact that it would still take a long time for somebody else to write them.

So what's the individual web marketer to do? Can you compete with those big dog web properties that have buckets of cash and hundreds of full time writers?

Yes, you can.

You can compete by using what I'm calling a "Super Spun Article." This is a high quality document of around 50,000 words of content that is capable of generating a virtually limitless number of article variations.

All you have to do is drop the Super Spun Article into a content spinner (or use the one that's built into the Super Spun Articles web site) and click a button. Every time you click that button, you get a new, high quality, human edited article that is guaranteed to be at least 75% unique from every other article generated by that same document (and it's usually much more unique than that).
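If you've never used a content spinner, the underlying "spin syntax" idea is simple: a template full of {option one|option two} choices gets resolved randomly into a different article each time. Here's a minimal, generic sketch in Python -- just an illustration of spin syntax, not the Super Spun Articles tool itself:

```python
import random
import re

SPIN = re.compile(r"\{([^{}]*)\}")

def spin(template, rng=random):
    """Resolve {a|b|c} choices one at a time until none remain."""
    text = template
    while True:
        match = SPIN.search(text)
        if not match:
            return text
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

template = ("{Unique|Original|One-of-a-kind} content is {valuable|worth the effort} "
            "because Google {rewards|prefers|favors} variety in its results.")

for _ in range(3):
    print(spin(template))  # a different variation each time
```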

In my experience, any content that is at least 70% different from other content is considered unique in Google's eyes. The odds of two articles generated by a Super Spun Article containing 30% duplicate content are one in ten billion. So that document would have to be used to generate ten billion articles, on average, before that ever happened. (Can you tell I did my homework when designing these documents?)
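If you want a rough way to sanity-check how different two generated articles really are, comparing overlapping word "shingles" with Jaccard similarity is one common approach -- my own assumption for illustration, not how Google or the Super Spun Articles site measure uniqueness:

```python
def shingles(text, n=3):
    """Break text into overlapping n-word chunks ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 0))}

def similarity(a, b, n=3):
    """Jaccard similarity of the shingle sets: 0.0 = no overlap, 1.0 = identical."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

a = "Unique content is valuable because Google rewards variety in its results."
b = "Original content is worth the effort because Google favors variety in what it shows."
print(f"Overlap: {similarity(a, b):.0%}")  # aim for well under 30% overlap
```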

Once you've added a link to your website into the unique articles you generate, you then submit them to article directories, blogs (or blog networks) and other web 2.0 properties, making sure you get a unique article onto each site.

When Google sees that unique content with your link in it, it doesn't see it as a duplicated, low quality "review" of questionable value. Rather, it sees it for what it is: a high quality, unique page linking to your site. It's those kinds of links that will get your site ranked in the search engines, and it's having a huge number of those unique pages spread out across the web that will generate a snowball traffic effect.

One or two new visitors per page doesn't seem like much, but if you get that many new visitors per unique page and have thousands of unique pages, it adds up in a big way.

So if you're a lone webmaster trying to rank in Google, get traffic and earn revenue from your products or advertising, I strongly recommend you go take a look at my Super Spun Articles. There's a video on the home page demonstrating how they work.

Please post your thoughts in a comment below.