Web Content, SEO, LSI, and the New Google Patent
So you've optimized your web content for your niche keyword terms, made sure you use those terms in the meta title, meta keywords, meta description, H tags, at the top of the page, at the bottom of the page, and scattered throughout the Web page. You've spent weeks getting links with your keywords in the anchor text. You've blogged and pinged, tagged and pinged, posted to forums, and prayed, "Please Google, list my site as #1 for my main keywords!"
And then somebody just had to mention the new Google patent and latent semantic indexing (LSI) and how it will affect your search engine optimization (SEO).
Like many, you're probably just starting to hear about LSI. So what the heck is it?
Latent semantic indexing, or LSI, is a statistical technique for working out which words and concepts tend to appear together in real documents about a topic -- in other words, a way for Google to tell whether a web page is really about the keywords you've stuffed into it.
If you're into black hat SEO, this is probably making you nervous. It used to be that you could rank pretty well with just on-page search engine optimization. Sprinkle the right keywords in the right places in your Web content and bam! You're at the top of the search engine results.
Then it started to get more difficult. You had to get many other sites to link to you with your keywords in the anchor text (the link text). And if the sites or pages linking to yours were on the same theme, so much the better.
But then the spam sites took over -- millions of junk blogs (splogs) stuffed full of scraped keyword-rich Web content that often wasn't even readable for humans. It got so that the top search engine results for any keyword were often these junk sites. Imagine if you were Google. Wouldn't you start to worry that people would stop using your search engine if you couldn't deliver the Web content they were really looking for?
The new Google patent indicates that Google is now using (or intends to use) new algorithms to try to weed out the black hat spam sites. Latent semantic indexing is one way of doing that. For any given topic, you can calculate what other words, phrases, and even ideas you would expect to see in the content on the web page and even across the web site, and roughly how often they should appear.
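To make that a little more concrete, here is a rough sketch of that kind of analysis in Python with scikit-learn. Everything in it is an assumption for illustration -- the tiny "coffee" corpus, the choice of TF-IDF weighting and truncated SVD, the number of concepts -- and it is certainly not Google's actual implementation, but it shows how a pile of real documents on a topic can be boiled down into latent concepts and the words you would expect to see alongside your keyword.

```python
# A rough, illustrative sketch of latent semantic indexing with scikit-learn.
# The tiny "coffee" corpus below is hypothetical and stands in for the
# "real content on the topic" mentioned above; Google's actual system is not
# public and certainly looks nothing like a ten-line script.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

corpus = [
    "Arabica beans are roasted and ground before brewing espresso.",
    "A good grinder and fresh beans make better coffee than any machine.",
    "Cold brew coffee steeps coarsely ground beans in water overnight.",
]

# Build a term-document matrix weighted by TF-IDF.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(corpus)

# SVD collapses that matrix into a handful of latent "concepts".
svd = TruncatedSVD(n_components=2, random_state=0)
svd.fit(X)

# For each concept, list the terms that load on it most strongly --
# roughly the words LSI would expect to see alongside the keyword.
terms = vectorizer.get_feature_names_out()
for i, concept in enumerate(svd.components_):
    top_terms = [terms[j] for j in concept.argsort()[::-1][:5]]
    print(f"concept {i}: {top_terms}")
```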
If a keyword appears many times on a page, but an LSI analysis indicates the page isn't naturally about that topic, it is not going to rank high in Google's search engine results any more. And if the keyword-rich links leading to a page suggest the Web content should be about something other than what LSI says it is actually about, it is not going to rank high in Google's results either.
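Continuing that same toy setup, here is one crude way to score a new page against the topic: project it into the latent space and compare it to the reference documents. Again, the corpus, the centroid comparison, and the scoring function are all illustrative assumptions on my part, not anything taken from the patent.

```python
# A toy "topicality" score: project a page into the latent space built from
# the hypothetical reference corpus and measure cosine similarity to the
# reference centroid. This is an assumption-laden sketch, not how Google ranks.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

reference = [
    "Arabica beans are roasted and ground before brewing espresso.",
    "A good grinder and fresh beans make better coffee than any machine.",
    "Cold brew coffee steeps coarsely ground beans in water overnight.",
]
stuffed = "coffee coffee best coffee cheap coffee buy coffee coffee deals"
natural = "We roast small batches of arabica beans and grind them fresh for every espresso."

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(reference)
svd = TruncatedSVD(n_components=2, random_state=0)
topic_space = svd.fit_transform(X)                  # reference docs in the latent space
centroid = topic_space.mean(axis=0, keepdims=True)  # "what this topic looks like"

def topical_score(page: str) -> float:
    """Cosine similarity between a page and the topic centroid in LSI space."""
    page_vector = svd.transform(vectorizer.transform([page]))
    return float(cosine_similarity(page_vector, centroid)[0, 0])

print("keyword-stuffed page:", round(topical_score(stuffed), 2))
print("naturally written page:", round(topical_score(natural), 2))
```

The stuffed page repeats the keyword far more often, yet it shares almost nothing else with genuine writing on the subject, which is exactly the kind of mismatch LSI is meant to expose.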
So what can you do in an age when it's hard to fool your way to the top of Google's listings for your keywords?
This may sound overly simplistic, but maybe you should actually just write normal Web content about the topic! After all, that's what Google wants to see, and it's what the people who surf to your Web site want to see. Maybe you should give it to them, if you're not already. Instead of trying to trick Google, provide great Web content that is naturally written about the topic. Without even trying, the words that Google would expect to find alongside your keywords will just naturally be there. After all, Google is comparing your page to its analysis of other real content on the topic.
When the Web first came into being, people wrote about whatever they wanted to write about, and people looking for that topic found those sites. But then people started playing tricks, almost ruining the search engines. Now things have come full circle. The pages that are really most about a given topic will start to be the ones that show up at the top of the search engine results. Search engine users will be happy, Google will be happy, and you just might find it's easier and faster to produce natural Web content than to spend all of your time trying to trick Google.
Article Source: http://www.article-outlet.com/