Keyword stuffing is a practice in which webmasters load their pages with instances of commonly searched keywords. It is designed to drive traffic to a site by raising its ranking in search results, in hopes of increasing the site’s revenue. However, keyword stuffing can be an unwise choice; many search engines penalize the practice and use complex algorithms to detect it, even when it is done intelligently.
The most obvious form of the practice is repeating a keyword throughout a block of text. Often the author takes the time to make the text reasonably plausible, in the hope of avoiding penalization by a search engine. For example, if a webmaster knew that the keyword “evil goat” was commonly searched for, he or she would try to work it into a page as many times as possible.
More commonly, keyword stuffing is accomplished by hiding the keyword, so that visitors don’t see it but search engines do. Keywords can be hidden in meta tags, or embedded in the page itself in text the same color as the background, invisible to the user, or concealed in subtler ways. In many cases, the keywords aren’t even related to the site’s content; they are designed only to elevate search rankings so that users visit, generating advertising revenue. While a certain amount of keyword use is considered legitimate and even smart for search engine optimization (SEO), webmasters who go overboard give the practice a bad name.
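The same-color-as-background trick can be spotted mechanically. The sketch below is a toy heuristic in Python, not how any real search engine works: it uses a simple regular expression to find elements whose inline text color matches an assumed page background, and all names and the sample page are illustrative.

```python
import re

def find_hidden_text(html, background="#ffffff"):
    """Toy heuristic: collect text from elements whose inline color
    matches the page background (one common keyword-hiding trick)."""
    hidden = []
    # Match tags with an inline "color:" style, capturing the color
    # value and the enclosed text.
    pattern = re.compile(
        r'<(\w+)[^>]*style="[^"]*color:\s*(#[0-9a-fA-F]{3,6})[^"]*"[^>]*>(.*?)</\1>',
        re.DOTALL,
    )
    for tag, color, text in pattern.findall(html):
        if color.lower() == background.lower():
            hidden.append(text.strip())
    return hidden

page = '<p>Welcome!</p><span style="color: #ffffff">evil goat evil goat</span>'
print(find_hidden_text(page))  # ['evil goat evil goat']
```

A production crawler would render the page and compute effective styles rather than pattern-match raw HTML, but the underlying idea is the same: text a human cannot see is suspicious.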
Before search engines caught on to the practice, stuffed sites would surface when users searched for common keywords. Users would assume a site was relevant because it appeared on the first page of results, and click the link. At best, such a site might be a pile of nonsense interspersed with ads; at worst, it might load malicious software onto a visitor’s computer. The practice could be extremely frustrating for people trying to surf the web for information, and as a result, search engines have fine-tuned their algorithms to use a variety of indicators to determine how relevant a site is.
Most search engines will flag sites with an unusually large number of keywords as spam rather than as legitimate websites, leading to the alternative slang term “spamdexing” for keyword stuffing. In the case of “evil goat” above, there are really only so many times such a phrase could be mentioned in a legitimate web article; if a search engine found 40 instances of the term on a single page, it might assume the page was stuffed.
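The “40 instances is implausible” intuition can be expressed as a small heuristic. The Python sketch below is purely illustrative; the function names and the thresholds (a raw count cap and a density cap) are assumptions for the example, not values any real search engine publishes.

```python
import re

def keyword_density(text, phrase):
    """Fraction of the page's words taken up by repetitions of a phrase."""
    words = re.findall(r"\w+", text.lower())
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    phrase_len = len(phrase.split())
    return (hits * phrase_len) / max(len(words), 1)

def looks_stuffed(text, phrase, max_count=15, max_density=0.05):
    """Flag a page when a phrase appears implausibly often, either in
    absolute count or (for repeated phrases) relative to page length."""
    count = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return count > max_count or (count >= 5 and keyword_density(text, phrase) > max_density)

page = " ".join(["evil goat"] * 40) + " some filler text about farms"
print(looks_stuffed(page, "evil goat"))  # True
```

A normal article that mentions “evil goat” once or twice passes, while a page that repeats it 40 times trips the count threshold. Real ranking systems weigh many more signals than raw frequency, which is exactly why crude stuffing stopped working.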
In a variation of keyword stuffing, some webmasters publish the same site under multiple domains. While mirroring a site is a perfectly reasonable way to handle high volumes of web traffic, 20 sites advertising sports equipment in exactly the same words aren’t necessarily legitimate. Webmasters can also use these sites to build links to one another, which can further boost search engine rankings. As a result of this practice, many search engines examine the source of links to a site to determine whether they are valid or merely spamming tools.
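Detecting identical content spread across domains can be as simple as fingerprinting normalized page text. The Python sketch below illustrates the idea with a plain hash; the domain names and page text are made up, and real systems use fuzzier techniques (such as shingling) to catch near-duplicates as well as exact copies.

```python
import hashlib

def content_fingerprint(text):
    """Hash of normalized page text; an identical copy hosted on
    another domain collapses to the same fingerprint."""
    normalized = " ".join(text.lower().split())  # case- and whitespace-insensitive
    return hashlib.sha256(normalized.encode()).hexdigest()

# Hypothetical crawl results: two domains serving the same copy.
pages = {
    "cheap-skis.example": "Great deals on   Sports Equipment!",
    "best-skis.example": "great deals on sports equipment!",
    "reviews.example": "Independent ski reviews and buying guides.",
}

seen = {}
for domain, text in pages.items():
    seen.setdefault(content_fingerprint(text), []).append(domain)

duplicates = [domains for domains in seen.values() if len(domains) > 1]
print(duplicates)  # [['cheap-skis.example', 'best-skis.example']]
```

Once duplicate groups are identified, links passed between members of a group can be discounted, which undercuts the link-building motive described above.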