Copywriting for search engines is one of the more important aspects of the SEO industry. Ensuring your content is relevant and keyword rich is an effective way to “impress” the search engines. By impress, I mean that the majority of search engines reward optimized, keyword-rich content. However, if your keyword density climbs too high, Google and others can consider this spam and may penalize your site.

With this in mind, a question was asked on WebProWorld: how much keyword density is too much? A quick glance around at the various SEO experts shows there is no exact answer. However, by drawing on the advice and knowledge of these industry gurus, I will attempt to clarify the issue a little.

According to an article written by Karon Thackston that appeared on HighRankings.com, called The Magical Keyword Density Formula, keyword density is only part of the total SEO package. Karon says, “Copywriting, in my opinion and the opinions of respected search engine optimizers, is 1/3 of the puzzle; but there are other pieces to the puzzle, too.” Because of this, Karon states that there is no magical keyword density formula for determining how much is too much.

Which brings us back to the WebProWorld discussion. Jade456 posed this request: “I’d like to get some opinions on the best keyword density percentage. I’ve heard some people say to keep it under 10% and others say the higher the better. I’m thinking that it should stay under 10, for fear of being dropped for spamming. Anyone have any thoughts?”

Jade’s request was almost immediately answered by cbp, who said, “I have some pages that (accidentally) have keyword densities of >50% and still manage to rank high. I have seen all the different advice all over the place. My approach is to make sure that the keyword(s) I am targeting have the highest density on the page, but kept as low as possible (under 10%, preferably) – but most importantly, the use of language should be natural. Google is getting smarter.”

cbp’s point illustrates what Karon was trying to say. There is no set standard to adhere to when it comes to creating search engine-friendly text. However, Webnauts offers some information that serves as a caution:

“I learned that the average keyword density should be between 3 and 7 to every 100 words.” The reason Webnauts doesn’t go past this mark is that he knows of other sites that were banned because they were considered spammers. This indicates that while there is no standard, there is definitely a point where the search engines will treat too many keywords as a spam technique. However, determining what constitutes spam is where the trick lies.
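
To put Webnauts’ figure in concrete terms, keyword density is generally computed as the number of times a keyword appears divided by the total word count, multiplied by 100. Here is a minimal Python sketch of that calculation; the function name and the simple tokenization are my own illustration, not something from the thread:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword density as a percentage of total words.

    A simple illustration: occurrences of the keyword divided by
    the total word count, times 100. Real SEO tools vary in how
    they tokenize and in whether they count phrase matches.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    count = words.count(keyword.lower())
    return count / len(words) * 100

# Example: 3 occurrences in 100 words -> 3.0%, the low end of
# Webnauts' 3-7 per 100 words range.
sample = ("widgets " + "filler " * 32) * 3 + "filler"
print(f"{keyword_density(sample, 'widgets'):.1f}%")
```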

Spam can occur when your density exceeds the amount necessary to gain a respectable ranking. WPW moderator bhartzer explains this thought a little further: “Every keyword phrase is different. One might do well with a 3.2 percent keyword density, whereas the average density for another keyword phrase is closer to 6 percent. And if you were to use the 6 percent keyword density on your page to try to rank for the 3.2 percent phrase, then you’d be way too high, on the border of spamming.”
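
bhartzer’s benchmark idea can be sketched the same way: measure your page’s density for each phrase and compare it against whatever typical figure you have observed for that phrase. In this hypothetical Python sketch, the benchmark numbers simply echo his 3.2 and 6 percent examples and are not measured values:

```python
def phrase_density(text: str, phrase: str) -> float:
    """Density of a multi-word phrase: phrase occurrences over total words, as a percent."""
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return hits / len(words) * 100 if words else 0.0

# Hypothetical per-phrase benchmarks, echoing bhartzer's 3.2% vs 6% example.
benchmarks = {"blue widgets": 3.2, "widget repair": 6.0}

page = "blue widgets " * 5 + "filler " * 90  # 100 words, 5 phrase hits -> 5.0%
for phrase, typical in benchmarks.items():
    d = phrase_density(page, phrase)
    flag = "over the norm - risky" if d > typical else "within the norm"
    print(f"{phrase}: {d:.1f}% vs typical {typical}% ({flag})")
```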

With keyword density, there don’t appear to be any rules that are set in stone. Most advice surrounding the topic centers on keeping your web text as natural as possible and not stuffing keywords into the text. Karon finishes this thought by saying:

“I focus on natural language. If the copy sounds forced after including keyphrases, I scrap it and start over. Read your copy out loud. If it sounds stupid or redundant to you, it will sound stupid and redundant to your site visitor. Don’t compromise the flow of natural language for the sake of search engines.”
