Poison Words On Google

Apple Roof Cleaning

Roof Cleaning Instructor
Here is an interesting read for you frontrunners in the web site game.



What Are Poison Words? Do They Matter?
Feb 19

SEO Question: I'm researching poison or forbidden words and I've only found a few vague or older posts from 2000 in a few SEO forums. Supposedly if a site uses poison words in the title etc. it is pushed way down in the SERPs. Any idea if this is fact or fiction? I'd love a complete list of poison words, although right now I'm specifically trying to find out if sale, best, about, contact us, website, or free shipping are poison, because I have a retail product site with those words in the home page title, description, and body text.

SEO Answer:
Poison words were a way to deweight low quality content pages:
I have actually never put much effort into researching poison words, but I will try to give my opinion on the subject.

The initial research and information about poison words came out well before I jumped into the SEO market. This page talks about the idea of poison words:

Poison words are words that are known to decrease your page's rankings if a search engine finds them in the title, description, or URL. They don't kill pages, they just bury them in the rankings.

Generally, people think of adult words first. Adult (obscene) words often put your page into an adult category, where it is filtered out by the various filters search engines apply.

Newer non-adult poison words are being uncovered. These words don't throw you into a different category, they just decrease your rankings. Poison words signal to a search engine that the page is of low value.
Forums are Bad?
That same page goes on to note how "forum" may have been a bad word around that time:

The worst of the lot would probably be the word "forum". Chat and BBS forum systems have taken body shots from all the major search engines this year. Two well-known search engines now specifically look for links to BBS software makers and kill the pages in the index outright - possibly the whole domain.

Other possible poison title/URL words and phrases that come to mind: UBB, BBS, Ebay, and all variations on the paid-to-surf program keywords.
Why Would Forums Have Been Bad?
As stated above, I was not around on the web during that time period, so I can only guess as to why forum would have been such a bad word.

Largely I think it would have come down to two factors:

overweighting of forums in the search results

how easy it was (and still is) to spam forums
In early 2000 there were far fewer pages on the web than there are today. Because of the textual nature of forums and how many pages forum conversations generate, it would not surprise me if forums tended to make up too large a percentage of the search results, and search engines had to offset that by deweighting them.

Things which show either a lack of content moderation or page contents that are not vetted by the site publisher may give search engines a reason to deweight a page. Imagine a page with few inbound links from outside sites and 100 external links on the page, all 100 of them using the nofollow attribute. If you were an engine, would you want to trust that page much? I wouldn't.
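To make that concrete, here is a minimal sketch of that kind of trust heuristic. It is purely hypothetical - not any engine's documented algorithm - and the thresholds and scoring function are assumptions for illustration only.

```
# Hypothetical sketch (not Google's actual algorithm): a crude trust score
# that penalizes pages whose outbound links are almost entirely nofollow
# while the page itself has few inbound links from other sites.

def page_trust_score(inbound_links: int, external_links: int, nofollow_links: int) -> float:
    """Return a rough 0.0-1.0 trust score for a page (made-up scale)."""
    if external_links == 0:
        return 1.0  # nothing suspicious to judge by
    nofollow_ratio = nofollow_links / external_links
    # Few inbound links plus nearly all outbound links nofollowed looks like
    # unvetted, user-submitted content the publisher does not stand behind.
    if inbound_links < 5 and nofollow_ratio > 0.9:
        return 0.1
    return 1.0 - (0.5 * nofollow_ratio)

# Example: the page described above - few inbound links, 100 external
# links, every one of them nofollow.
print(page_trust_score(inbound_links=2, external_links=100, nofollow_links=100))  # 0.1
```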

The Web Was Much Smaller:
To put it in perspective, back in early 2000 Google was still pushing people toward the Google Directory on their home page, had a link to their awards page pushing Yahoo!, and did not yet display the indexed-document count on the home page that they showed for the next 4 or 5 years. On June 26th of 2000 Google announced that they had 560 million full-text indexed web pages and 500 million partially indexed URLs. Right now WebmasterWorld has over 2 million pages in Google's index, so you can see how a few large forum sites would be able to dominate a search index that small. Combine that with many forums being hit by internet marketers aggressively spamming them, and the content seems less desirable.

Deweighting User Interaction:
As far as deweighting pages that allow user interaction, that makes sense as well. Why? Because for most sites the page and site gain authority primarily from the actions of the site owner or paid editors. If third parties can add content to a page they can influence the relevancy of that document, and thus leverage the authority of the original author without much expense. That is why search engineers pushed the nofollow attribute so hard.

Plus, if pages and sites are legitimate and allow useful, value-added community interaction, those sites will typically get more links and authority, so knocking them down a bit for allowing interactivity and third party publishing does not really hurt them - the legitimate sites make that right back by gaining more citations.

Turning a Page Into Spam:
I don't search as much as I would like to because I spend too much time reading and writing (and not enough time researching), but on occasion I search around. I have seen totally unrelated blog posts rank #1 on Google for certain types of niche pornography because someone came by and left a comment that made the document relevant to the uber gross porn query.

Blog Comment and RSS Spam:
In a recent post on SEO Buzz Box, DaveN hinted that comments may make a page look less clean, and thus give a search engine a reason to deweight it. Combine that with the rapidly growing field of citation spam, and it makes sense that Google would not want to promote similar content that is only differentiated by ad placement and a few third party comments.

Ebb and Flow:
So given that forums were a type of content that may have been overrepresented and undesirable, I think it is worth noting that right now they may be considered better than they once were. Perhaps contextual advertising programs and the rebound of online advertising have given forum owners more revenue, which allows them to run better forums. Also, algorithms are more link focused, and most forum pages tend to score poorly naturally, because there are so many pages compared to the quantity and quality of inbound links most forums get.

Search engines constantly battle with marketers for what types of sites to rank in the search results.

Sometimes you will notice Amazon and large vertical sites ranking for almost everything under the sun. At other times directories are given more weight than would seem logical.

In late 2003, around the time of the Google Florida update, directories started showing up way too much in the search results. People took advantage of the opportunity, and thousands of vertically focused or general PageRank-selling directories sprang up.

Since then many of those directories seem to be packing less punch in the SERPs - in direct rankings and with how much their links help other sites.

Closing Holes Opens New Ones:
So what you see is wave after wave of favored content types. As search engines close some holes they open up others. When WG and Oilman interviewed Matt Cutts they also spoke about how the face of spam has - at least for now - moved from blog spam sites to subdomains off established sites. Right now Google is putting too much weight on old established sites.

Blogs Getting Away With a Bit Much:
With all of the blog networks springing up right now I wouldn't be surprised if some search engineers were starting to get sick of blogs, and looking for ways to deweight some of those networks as well. That is another example of why forums may become more desirable...if blogs are so hot that everyone and their dog has 5 of them maybe the people who are looking to make a quick buck are going to be more inclined to run blogs than forums.

Poison Words No Longer Needed?
That sorta leads me into my next point. I don't think poison words in their old traditional sense are as important as they may have been.

I still think the concept of poison words has a role, but it is likely minimal outside of how much search engines can trust citations. I.e., pages that flag for poison words may not pass as much outbound link authority.

The inverse rule of link quality states that the effect of a link is going to be inversely proportional to how easy it is for a competing site to gain that same link.

So if the words "add URL" and "buy PageRank" are on the page, those links may not count as much as other types of links (a rough sketch of that idea follows after the list of mechanisms below). On this page Ciml noted how guestbook pages were not passing PageRank, but then Google undid that, at least to some extent. Stop words may not be necessary to deweight low quality links though. Deweighting may occur fairly naturally via other algorithmic mechanisms that generally parallel the effect of stop words:

Far more people practice SEO today than did in 2000, so many of the loopholes that are exploited are hit faster and harder. (see The Tragedy of the Commons).

Most people selling links do it in a manner that is painfully obvious to search engines.
Here is an example of how the average low quality, greed driven general directory is run.
Search engines collect more data and have far better technology as well. If pages are not found useful by searchers then they will eventually rank lower in the search results.
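As a rough sketch of the flag-phrase idea mentioned above: a hypothetical engine could scale down how much authority a page's outbound links pass when the page contains link-selling phrases. The phrase list, the halving factor, and the function itself are made up for illustration; this is not a documented Google mechanism.

```
# Hypothetical illustration: pages whose text contains link-selling flag
# phrases pass less outbound link authority.

FLAG_PHRASES = ["add url", "buy pagerank", "submit your link", "link exchange"]

def outbound_link_weight(page_text: str, base_weight: float = 1.0) -> float:
    """Scale down the authority each outbound link passes if the page
    looks like a link-selling or free-for-all page."""
    text = page_text.lower()
    hits = sum(1 for phrase in FLAG_PHRASES if phrase in text)
    # Each flagged phrase halves the weight the page's links pass on.
    return base_weight * (0.5 ** hits)

print(outbound_link_weight("Add URL here to buy PageRank for your site"))  # 0.25
```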

Establishing Trust:
So right now - and going forward - search relevancy will be about establishing trust. How trust is established will continue to evolve. Those who have more trust will also be able to get away with more aggressive marketing. Some new sites that use the DP coop network do not do that well with it, but sites that are either old and/or have built up significant usage data via email or viral marketing seem to be able to do more with it.

Google's Informational Bias:
Also note that Google tends to be a bit biased toward sites they believe to be informational in nature. Yahoo! Mindset shows how easy it is for search engines to adjust that sort of bias. You could think of words like shopping cart and checkout as being treated as poison words, but odds are high that if a merchant site provides a useful, feature-rich page, search engines want that content. Most merchant sites that are getting whacked in Google are likely getting whacked for having thin sites with near-duplicate content on most pages or for having unnatural linkage profiles.

Many thin affiliate sites are also getting hit for having no original content and outbound affiliate links on nearly every page.
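One common way (in general information-retrieval practice, not necessarily what Google does) to flag the kind of thin, near-duplicate catalog pages described above is to compare word shingles across pages. The helper below is a hypothetical sketch with toy data.

```
# A sketch of one common near-duplicate check: word shingles plus Jaccard
# similarity. Hypothetical illustration, not Google's actual method.

def shingles(text: str, k: int = 4) -> set:
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def similarity(page_a: str, page_b: str) -> float:
    a, b = shingles(page_a), shingles(page_b)
    return len(a & b) / len(a | b) if a | b else 0.0

boilerplate = "Buy widgets online. Free shipping on all orders. Add to cart."
page_one = boilerplate + " Red widget, model 100."
page_two = boilerplate + " Blue widget, model 200."

# A catalog where every page is mostly shared boilerplate scores high here,
# which is the kind of thin, near-duplicate site described above.
print(round(similarity(page_one, page_two), 2))
```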

Improving Content Quality:
With all the informational databases Google builds, they first push to collect as much data as possible, then as time passes they learn to better understand it (ultimately looking at human interaction), and then try to push for the creation of higher quality content. Most web-based publishers will face a huge struggle with balancing content quality and content cost.

The only way their business model works is if others allow them to give people free access to high quality content. I don't think that poison words are necessarily needed to do that though...at least not for most natural created-for-human pages in their general search database.

Vertical Search:
Some vertical search engines may use certain words for inclusion or exclusion in their database. For example look at Edgeio or NFFC's post on Become.com.

Alternate Meaning for Poison Words:
Some people have also called certain terms poison words because they throw off contextual ad targeting.

Google allows you to use section targeting to help steer your AdSense ads away from common generic words like "blog".
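For reference, section targeting works by wrapping parts of the page in HTML comment markers that the AdSense crawler reads to emphasize or ignore a section. The marker strings below are the ones Google documented for this feature as I recall them; the small Python helper wrapped around them is just an illustrative assumption, not an official API.

```
# Sketch of AdSense "section targeting": HTML comment markers the AdSense
# crawler reads to emphasize or ignore parts of a page. The helper functions
# are hypothetical; only the comment strings come from Google's documentation.

def emphasize(html: str) -> str:
    """Mark a block as content the ad crawler should weight toward."""
    return f"<!-- google_ad_section_start -->\n{html}\n<!-- google_ad_section_end -->"

def ignore(html: str) -> str:
    """Mark a block (e.g. navigation full of generic words like 'blog') to be ignored."""
    return f"<!-- google_ad_section_start(weight=ignore) -->\n{html}\n<!-- google_ad_section_end -->"

page = ignore("<div id='nav'>Blog | About | Contact</div>") + "\n" + \
       emphasize("<div id='post'>A detailed review of roof cleaning chemicals...</div>")
print(page)
```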
 

Here is some more SEO Thought on this subject.

Clickability is the click-me-now factor of any listing. Being listed is just the beginning. Top five rankings won't produce anything without the click-me-now factor. I'm sure we've all seen top 10 rankings that just didn't seem to produce. We've also seen top 10 rankings that seem to outperform themselves.

Title makeup and descriptions are such an art form that most SEOs all but refuse to talk about it. Some of the most productive titles and descriptions would make you scratch your head as to why they work.

Those that produce have a few features in common:
- no hype. Powerful "call to action" statements just don't work like they used to. They are counterproductive. Four examples follow; I pulled descriptions similar to these straight from AltaVista searches.

a) WebmasterWorld, the best and greatest webmaster forum on the net.
b) WebmasterWorld, click here to visit now.
c) New and improved with daily updates. 100% fat free.
d) WebmasterWorld, news and discussion groups for the independent web professional.

Which one will produce the most clicks?
Which one will produce the most targeted traffic?

Let's look at them in order.
A: has the word "best" in the description. Adjectives and adverbs are getting "sub-ranked" when engines analyze the description. These types of "hype" words, which try to make something sound better than it is, are getting beaten down in some engines. They don't want hype in the descriptions. Go compare top ten rankings in AltaVista with, say, positions 200-250. You'll start to form a list of words that hurt you there (a rough sketch of how you might automate that comparison follows at the end of this post).

B: almost the same as A. Descriptions with the word "click" in them are history. Users generally ignore them now, and engines also push them down.

Some commonly used description words you want to avoid: new, fresh, updated, click, best, largest, cool, awesome, top... I'm sure you can think of more, plus their derivatives.

C: It's interesting. It has curiosity and a touch of humor (humor is always good if you can slip it by). However, it contains no keywords. It would actually have a fairly high click rate if you could get the page listed in a decent position (highly unlikely without keywords in the title or description, and because of the word "new"). The targeting would also be blown, since users would have no idea what they were clicking through to see.

D: Will outproduce the other three by a factor of five to ten. Low hype, no poison words, and it pre-filters the users. Those types of descriptions and low-fuss titles are really coming into vogue. They really produce and get you up in the rankings via click-factor engines like Direct Hit.
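Here is the comparison hinted at under example A, sketched in Python: gather listing descriptions from top positions and from positions 200-250, then look for words that show up far more often in the deep results. The data, the threshold, and the function names are all made up for illustration.

```
# Hypothetical sketch: find words overrepresented in deep-ranked descriptions
# compared to top-ranked ones - a rough way to build a "poison word" list.
from collections import Counter

def word_counts(descriptions: list[str]) -> Counter:
    counts = Counter()
    for d in descriptions:
        counts.update(d.lower().split())
    return counts

def likely_poison_words(top: list[str], deep: list[str], min_ratio: float = 3.0) -> list[str]:
    top_counts, deep_counts = word_counts(top), word_counts(deep)
    suspects = []
    for word, deep_count in deep_counts.items():
        ratio = deep_count / (top_counts.get(word, 0) + 1)  # +1 smoothing
        if ratio >= min_ratio:
            suspects.append(word)
    return suspects

# Toy data only - real lists would come from scraped result pages.
top_10 = ["news and discussion for web professionals",
          "independent webmaster resources and discussion"]
rank_200_250 = ["best site ever click here now",
                "click now for the best new free stuff",
                "awesome free site click to see the best"]
print(likely_poison_words(top_10, rank_200_250))  # ['best', 'click']
```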
 