Dan Thies on Links...Great Free Video!

Dan Thies has a great free video covering link strategy. It is from week two of his last link training class.

People look for concrete yes or no answers to link questions, but link strategy shifts as your market position shifts. Anyone new to link building who wants a long view of how the dynamics shift, and how to weigh the associated risks and techniques, would do well to watch that free video.

If you like that video and want more, Dan's next 8 week link building Teleclass starts March 22nd and costs $795. Money well spent if you can afford it and are new to linking. This message is totally unsponsored. Although Dan gave me a coupon code, I did not use it because I wanted readers to know this was not a sponsored recommendation.

What are Google Supplemental Results?

SEO Question: Much of my website is in Google's Supplemental index. What is their supplemental index? How does it work?

SEO Answer: What a timely question...where to start...well, if the supplemental problem has only hit your site recently (as compared to the date of this post), it may be a Google-specific glitch that has just dumped thousands of sites into supplemental results.

Believe it or not, other than the home page, most of this site is in supplemental results as of this writing, and with the current Google glitch sites can be thrown into supplemental hell within 72 hours.

Matt Cutts, a well known Google engineer, recently asked for feedback on the widespread supplemental indexing issue in this thread. As noted by Barry, in comment 195 Matt said:

Based on the specifics everyone has sent (thank you, by the way), I'm pretty sure what the issue is. I'll check with the crawl/indexing team to be sure though. Folks don't need to send any more emails unless they really want to. It may take a week or so to sort this out and be sure, but I do expect these pages to come back to the main index.

In this thread SEM4U points out that 72.14.207.104 was showing fewer supplemental sites than he saw on others like 64.233.179.104.

Some people are theorizing that lots of listed pages were dropped across the board and only the longstanding supplemental pages remain, but that theory is garbage on my site...I still see a strong PageRank 6 page that was ranking in the SERPs for competitive phrases until it recently went supplemental.

I did a site redesign just after this supplemental issue occurred, but that was coincidental. One good thing about the MovableType upgrade: the last version of MovableType I was using created these awful nuclear waste redirect pages...version 3.2 does not do that.

As far as other reasons this site could have possibly been hit supplemental:

  • too much similar text on each page - but it is normal to have the same sales elements on many pages of a site, so I doubt that is it
  • redirect links - affiliate links via Clickbank and the direct affiliate program might have tripped some sort of filter if Google was trying to work on 301 & 302 issues... but whatever they did I don't think they did it better ;)
  • Google is a bit hosed right now

What are supplemental results?

Supplemental results usually only show up in the search results after the normal results. They are a way for Google to extend their search database while also preventing questionable pages from getting massive exposure.

How does a page go supplemental?

In my experience, pages have typically gone supplemental when they became isolated doorway-type pages (lost their inbound link popularity) or when they are deemed to be duplicate content. For example, if Google indexes both the www. version and the non-www. version of your site, then most of one of those versions will likely end up in supplemental results.

If you put a ton of DMOZ content and Wikipedia content on your site that sort of stuff may go supplemental as well. If too much of your site is considered to be useless or duplicate junk then Google may start trusting other portions of your site less.

Negative side effects of supplemental:
Since supplemental results are not trusted much and rarely rank, they are not crawled often either. Because they are rarely trusted or crawled, odds are pretty good that links from supplemental pages do not pull much - if any - weight in Google.

How to get out of Google Supplemental results?
If you were recently thrown into them the problem may be on Google's end. You may just want to wait it out, but also check that you are not making errors like serving both www and non-www versions, content management errors delivering the same content at multiple URLs (such as rotating product URLs), or too much duplicate content for other reasons. You may also want to check that nobody outside your domain shows up in Google when you search for site:mysite.com, and you can look for duplicate content with Copyscape.
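As a rough illustration of the kind of duplicate check described above, here is a Python sketch of my own (the fingerprinting approach is just an illustration, not anything Google has published) that reduces a page to a normalized text hash, so that the same article served at two URLs - say the www and non-www versions - fingerprints identically:

```python
import hashlib
import re

def content_fingerprint(html):
    """Reduce a page to a hash of its visible text so that
    near-identical copies (e.g. www vs non-www) hash the same."""
    text = re.sub(r"<[^>]+>", " ", html)          # crude tag stripping
    text = re.sub(r"\s+", " ", text).strip().lower()  # normalize whitespace/case
    return hashlib.md5(text.encode("utf-8")).hexdigest()

def looks_duplicate(page_a, page_b):
    """True if two pages carry the same visible text."""
    return content_fingerprint(page_a) == content_fingerprint(page_b)

# Example: the same article served at two URLs fingerprints identically
a = "<html><body><h1>Widgets</h1><p>Buy our widgets.</p></body></html>"
b = "<HTML><body>  <h1>Widgets</h1>\n<p>Buy our widgets.</p></body></HTML>"
print(looks_duplicate(a, b))  # True
```

In practice you would fetch both URL variants of a sample of your pages and compare fingerprints; if they match, a 301 redirect to one canonical version is the usual fix.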

If you have pages that have been orphaned, or if your site's authority has gone down, Google may not crawl as deep through your site. If you have a section that needs more link popularity to get indexed, don't be afraid to point link popularity at that section instead of trying to point more at the home page. If you add thousands and thousands of pages you may need more link popularity to get it all indexed.

After you solve the problem it still may take a while for many of the supplementals to go away. As long as the number of supplementals is not growing, your content is unique, and Google is ranking your site well across a broad set of keywords then supplementals are probably nothing big to worry about.

SEO Tools

These are SEO Tools that I use and recommend to others. Most of these SEO tools are free. Warning on SEO Tools:
Many SEO tool vendors sell software which has been outdated and rendered useless by improving search technology. Worse yet, some of these people intentionally lie to get you to buy their software, even if it will get your site banned from the search engines.

Some of them will tell you that keyword density is the key to SEO. That's a lie. Some of them will tell you that trading links off topic and lots of low quality link trades are all you need to rank in Google. That typically doesn't work well either.

Before buying any SEO software check to see if similar or better free software is available here or here.

Keyword Suggestion Tools:

  • Keyword research review - I pretty much review most of the best keyword research / keyword suggestion software on the market. Most of the tools worth using are free. My two favorite tools are listed below

  • Google Keyword Tool - shows 12 month trending data. Can offer keyword suggestions based on page content or a word you enter.
  • SEO Book keyword research tool - driven off of Yahoo!'s keyword research tool, it makes it easy to cross reference the various keyword research techniques. It is a bit feature rich, but if you like lots of data you will love this tool. You can give it a test drive by searching in the box below:

Keyword Suggestions for:



Expand Your Keyword List:

Formatting Your Keyword List for Google AdWords:
Formatting can be done inside AdWords. Google is also beta testing desktop software for managing your AdWords account. The keyword list generator above also makes it easy to set matching types on your keywords when you generate your keyword list.
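To show what setting match types on a keyword list means in practice, here is a small Python sketch (the function name is mine; the broad / "phrase" / [exact] wrapping is AdWords' standard match-type syntax) that expands a keyword list into all three match types:

```python
def with_match_types(keywords):
    """Expand each keyword into AdWords broad, "phrase", and [exact] match rows."""
    rows = []
    for kw in keywords:
        rows.append(kw)          # broad match: no punctuation
        rows.append(f'"{kw}"')   # phrase match: wrapped in quotes
        rows.append(f"[{kw}]")   # exact match: wrapped in brackets
    return rows

for row in with_match_types(["seo book", "link building"]):
    print(row)
```

Paste the output straight into an AdWords ad group to test all three match types for the same keyword side by side.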

Google AdWords also has a function called dynamic keyword insertion which inserts search queries into your AdWords ad copy.

Keyword Density Analysis:
Generally aiming for a keyword density is a waste of time for the following reasons:

  • each engine and query type is unique. there is no such thing as a universal perfect keyword density.

  • getting people to like your content and link to it is more important than what algorithms think of it
    • link popularity is weighted more heavily than page copy for competitive queries (I have even ranked pages that did not exist)

    • and readable, useful-to-humans pages are important for conversion
  • most pages made with bots in mind read like they are meant to be read by bots
  • this article covers keyword density from a more scientific standpoint, stating why it is useless

Having said all of that, if you want to look at keyword density here is a free keyword density analyzer.
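If you do want to eyeball keyword density, the calculation itself is trivial - which is part of why it is a poor ranking signal. Here is a minimal Python sketch (my own illustration of the standard occurrences-over-total-words formula):

```python
import re

def keyword_density(text, keyword):
    """Return occurrences of `keyword` as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return round(100.0 * hits / len(words), 2)

sample = "Cheap widgets. Our widgets are the best widgets for the price."
print(keyword_density(sample, "widgets"))  # 27.27
```

Note that any competitor can match whatever number this produces in seconds, which is exactly why engines lean on harder-to-fake signals like links instead.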

Keyword Rank Checking Tools:
Digital Point Keyword Ranking Monitor - (free) Offers a quick way to check the backlinks and keyword rankings of a site on Google, Yahoo!, and MSN. Takes a bit to set up, but provides free graphs of ranking vs time and works within the Google terms of service.

Link Analysis Tools:

  • Hub Finder - free open source tool that looks for co-occurring backlinks. If you read about hubs and authorities (and why they are important) you will find this tool exceptionally useful.

  • Link Harvester - free open source tool that looks at unique linking domains, .edu & .gov backlinks, and unique C block IP addresses. The tool has a CSV export option. Both Link Harvester and Hub Finder work with the Yahoo! TOS.
  • Back Link Analyzer - like OptiLink and SEO Elite, but free. It is downloadable software that does anchor text analysis on links.
  • I would recommend steering clear of link exchange hubs. Typically the easier it is for the average person to get a link, the less value it has. Many people using link exchange networks employ underpaid third world workers to make automated sites they know will eventually get banned.

Other Tools:

If you have a cool SEO tool you would like me to review just leave a comment below or send me an email :)

Why Do Search Engines Favor Informational Sites Over Commercial Sites?

SEO Question: I have noticed many more content heavy websites in Google's search results over the last year or two. Why does it seem it is getting harder for commercial sites to rank?

SEO Answer: Within the commercial realm there are more and more competing sites. Building content, at one time primarily a hobby project, has become far more lucrative in recent years. Not only have content management systems like Movable Type and WordPress become cheaply or freely available, but AdSense and affiliate marketing have vastly increased the number of real and fake content sites on the market over the last couple of years.

Duplicate content filters have improved, and many shell product catalogs have been filtered out of Google's search results. It seems like some older sites are getting away with some rather shoddy stuff in Google, but as Google gathers more user data and more people create quality content, you can expect the search engine to shift away from that loophole.

Search algorithms prefer informational websites over commercial ones for many reasons:

  • they want commercial sites to buy their ads

  • the search ads provide commercial results. they prefer to have some informational results to help balance out the search results.
  • in competitive marketplaces there tends to be many more commercial sites than informational sites
  • if multiple merchants have similar product databases it does not drastically improve the user experience to show hollow shell pages over and over again from a wide variety of merchants
  • many quality informational sites link to related resources that lead searchers to more abstract answers that search engines are not yet advanced enough to answer
  • many informational sites are monetized using contextual ads provided by search engines. those give engines a second chance at revenue after the search

Also keep in mind that most merchant sites focus on the same small core group of keywords. Anything involving big business can take weeks or months to do...or longer if the company or its content management system is highly complex.

For a content based website it takes no time at all to do keyword research using some of the keyword research tools on the market, and then quickly create pages around common customer questions, concerns and buying points. If few sites cover those topics with specific pages then it is low hanging fruit waiting to be claimed. I think it was Peter D who said the key to making money on search was to dig where other people were not digging.

Yahoo! currently offers a paid inclusion program (sidenote: which I generally recommend avoiding) which ensures sites are indexed in Yahoo!. Yahoo! charges those sites a flat rate per click for traffic Yahoo! delivers. That per click fee means that for many search queries it may make sense for them to allow many commercial sites to rank in the search results.

As the largest content site, Yahoo! also offers quick links to many of their internal content channels in their search results, which lessens their need for content from other sources. Make no mistake though, Yahoo! has the ability to try to determine how commercial a website is. See their Yahoo! Mindset tool for an example of how results can be weighted toward either commercial or informational resources.

If you look at the Mindset dial and use it to compare the default search results from Yahoo! and Google think of Google as being turned much further toward research. If Yahoo! drops their paid inclusion program you can bet that they will dial their results more toward the research angle, just like Google is.

Some commercial websites, like Amazon, offer rich interactive features that make them easy to reference (both from a webmaster perspective and a search engine perspective), but generally most commercial sites are not highly interactive and most webmasters would typically be far more inclined to link to quality content sites than overtly commercial sites.

If you are in a competitive field it may make sense to look at Librarians' Internet Index or read this newsletter to see what sort of content sites librarians prefer and trust.

The average person on the web may not be as information savvy as librarians are, so it may also help to look at ideas that go viral by looking at sites like Digg, Memeorandum, or Del.icio.us.

You can also get a lot of content ideas by looking at some of the top ranked content sites in your vertical and in related verticals you are interested in and knowledgeable about.

Even commercial sites can still be highly linkable if they are feature rich or offer quality answers to relevant topical questions that competing sites typically ignore.

How do I do Search Engine Optimization for a Small Site?

SEO question: How do I do SEO for a small commercial website? Adding more pages would make it look less professional, so that is not something I really want to do.

SEO Answer: Sometimes small sites can be easier to do SEO for than big sites.

Faults of big commercial sites:
Some big sites that are product catalogs may require significant link popularity to get indexed. Also, if you are dealing with many thousands of pages it can be hard to make them unique enough to stay indexed as search algorithms continue to advance. Search engines are getting better at comparing pages and sites. If the only difference across most pages of your many-thousand-page site is a few part numbers, then many pages may be considered duplicate content.

Benefits of a small site:
If a site is small that makes it easy to concentrate your internal link popularity on the most important issues, ideas, and keywords. Small hyper targeted sites also work well at being able to dominate those niche markets. You can create a site name based on the vertical and use the domain name to your advantage.

If you are trying to tackle insurance then a small site is not going to get you anywhere unless you are targeting niche types of insurance.

I tend to be a bit verbose (which is perhaps why I wrote an ebook ;) but I also do not buy that adding pages to a commercial site makes a site less professional. Web pages are just a bunch of bits, but those bits are your salesmen.

Which site would YOU trust more:

  1. Get the lead or sale or the prospective client can screw off. If they want anything they must pay first.

  2. Offers substantial information about the products they sell. Also builds credibility with FAQ section, answering common questions along the buying cycle with content focused on the issues that people tend to think are important before making a purchase.

If you hype it enough, have a high price point and get affiliates pushing it hard enough #1 may win, but in most markets most of the time site #2 will win.

If their site is exceptionally small then adding a few pages with about us and frequently asked questions should allow you to build credibility and target new traffic streams.

If competing sites have a huge brand that you can't afford to compete with one of the best ways to chip away at them is to create useful content, tools, and ideas that solve market problems that have not yet been solved.

If your content is great then it may garner some natural citations, but you need to build at least a few links for search engines to trust your site enough to where others will find it.

Some webmasters are also afraid to link out to relevant resources. I think that most good websites link out to at least a few decent resources. Don't be afraid to link at relevant .gov or .edu pages, industry trade organizations, local chamber of commerce sites and other sites that make sense to reference.

Yahoo! to Ban Comparative Search Ads

Danny points at a SEW thread noting that starting next month Yahoo! will no longer allow competing businesses to bid on trademark phrases:

"On March 1, 2006, Yahoo! Search Marketing will modify its editorial guidelines regarding the use of keywords containing trademarks. Previously, we allowed competitive advertising by allowing advertisers to bid on third-party trademarks if those advertisers offered detailed comparative information about the trademark owner's products or services in comparison to the competitive products and services that were offered or promoted on the advertiser's site.

In order to more easily deliver quality user experiences when users search on terms that are trademarks, Yahoo! Search Marketing has determined that we will no longer allow bidding on keywords containing competitor trademarks."

Trademark terms are some of the most valuable words in the search space. While this move may not be surprising given Yahoo!'s past activities, will it cause other engines to change their policies? How will this policy affect comparison sites which offer many brands on the landing page? Is Yahoo! trying to commoditize the search marketplace to help them make more money away from search?

They still support typosquatting and cracking sites away from search, but may that be coming to an end too? The recent Perfect 10 vs Google lawsuit points to network quality becoming a more important issue.

Conversational Advertising

There has been buzz about conversational marketing recently, including exposure on Poynter and Performancing.

I think conversational advertising works primarily for the following groups:

  • those who can give away their entire product free because they realize that the viral buzz around it will cause many more follow on customers...this works especially well if the product is informational related or downloadable software that has negligible per unit cost

  • network based companies that can offer a free trial (perhaps even lifetime free trial) of a high value product which increases in value through subscriber growth. Think VoIP companies, etc.

When CashKeywords sponsored Threadwatch it was a hit, largely because they offered the option of getting their entire product free of charge. Typically, though, marketers are greedier and/or more short sighted.

While conversational marketing should work great in an ideal world, it has many fundamental problems.

  1. People are skeptical of advertising.

  2. By default the group of people asked to comment on an ad are going to be more inclined to offer negative feedback.
  3. The people who buy and like your product and comment on it would likely give you more useful feedback directly.
  4. Threads often run on tangents. If it is a paid ad the odds of the tangent being a negative one are much higher.
  5. Most legitimate companies have made a few mistakes and/or have a few skeletons in their closet. If they have not made any mistakes then they probably are not interesting enough to be comment worthy.

The problem that makes conversational marketing sound appealing is that many of the best content providers do not make nearly enough off their content, due to limited ad sales resources and to picking hypersaturated, low value content topics.

As an ad buyer, when I am buying ad space in hyper-saturated markets I respect the fact that there is going to be under-priced ad inventory. Marketers market on spyware because it has a positive ROI. Marketers market on stolen or garbage content funded by Google AdWords because it is profitable.

With the pay per influence model you don't buy the influence of those with reach. If they sell that influence they lose their credibility...and eventually their reach. All you are doing is overpaying for ad space near their content.

Look at the Superbowl. Those ads are likely overpriced largely because they give advertisers such large exposure. Now some of them may have viral follow up elements that add value in other ways, but most ads do not do that.

I sell conversational ads on Threadwatch and get about one inquiry a month. Not much considering that it is one of the 10 or so most powerful sites in this industry.

I have also cut back most of my ad spend for this site outside of AdWords because most of it offers a net negative ROI...whereas I might make a slight profit with AdWords.

Do I get some love from conversational marketing? You bet your ass I do. In the last couple days I stumbled across this SEO Book mention and another great thread at Digital Point. That is conversational marketing. You commenting on this thread is likely going to be good conversational marketing. Sites that make you feel you know the owner will continue to grow in reach.

You can't make happy customers want to give positive feedback on someone else's site by advertising there. They have to already want to do it. And you can't pay for it or some people will question it for being fake.

Is It Worth Creating a Site Broader than My Niche?

SEO Question: I am interested in a topic, but am not sure if I should create a niche site within that topic or create a site about that topic?

SEO Answer: As long as there is a functional business model, it is almost always worth niching down a site. Having said that, sometimes it makes sense to create a second site of a slightly broader nature, as it will teach you more about how your niche fits into the broader category.

For a while I gave the advice that it might be a good idea to create a directory site one level above your category. For example:

  • if you did link building you could create a directory of SEO resources.

  • if you focused on SEO you could create a meta search engine, search rating system, or a site about search
  • If you focused on currency trading or currency collecting you could make a site about currency or the history of currency.

The broader sites need not be directories specifically, just informational sites you can use to help learn about your market. Other advantages of creating a site that relates to your end business are:

  • social networking: learn who the players in your market are. give them another way to find out who you are.

  • learn more about your business: if your portal becomes popular you may be able to sell ad space on it. The categories with the most interest or the highest paying advertisers may be good businesses to jump into. I can't tell you how many SEO companies create sites based on ideas from blowhard prospective clients.
  • drive leads: The guy who owns StateCollege.com also uses that site to sell internet marketing services to local companies. You can target locally or topically.
  • nepotistic links: while listing other good resources you can list your site near the top of your category to help build your brand. If you keep the site fairly non commercial and make it useful (to where you often find yourself going back to it to use it) then odds are you should be able to pick up some good links.

After you get established and know what niche you want to work in it is probably best to focus in on the main site, but off the start it does not hurt to have a foot in a few different ponds until you figure out what you really want to do.

It is also worth noting that it is easy to get discouraged, because sometimes the only thing separating you from success is time, and there is only so much you can force it. After a year or so the compounding, profitable growth really kicks in though.

What Are Poison Words? Do They Matter?

SEO Question: I'm researching poison or forbidden words and I've only found a few vague or older posts from 2000 in a few SEO forums. Supposedly if a site uses poison words in the title etc. it is pushed way down in the SERPs. Any idea if this is fact or fiction? I'd love a complete list of poison words, although right now I'm specifically trying to find out if sale, best, about, contact us, website, or free shipping are poison because I have a retail product site with those words in the home page title, description, and body text.

SEO Answer:
Poison words were a way to deweight low quality content pages:
I have actually never put much effort into researching poison words, but I will try to give my opinion on the subject.

The initial research and information about poison words came out well before I jumped into the SEO market. This page talks about the idea of poison words:

Poison words, are words that are known to decrease your pages rankings if a search engine finds them in the title, description or in the url. They don't kill, they just bury pages in rankings.

Generally, people think of adult words first. Adult words (obscene) often put your page in an adult category where it is filtered out by various filters at search engines.

Newer non-adult Poison Words are being uncovered. These words don't throw you into a different category, then just decrease your rankings. Poison Words signal to a search engine, that this page is of low value.

Forums are Bad?
That same page goes on to note how "forum" may have been a bad word around that time:

The worst of the lot would probably be the word "forum". Chat and BBS forum systems have taking body shots by all the major search engines this year. Two well know search engines now specifically look for links to BBS software makers and kill the pages in the index outright - possibly the whole domain.

Other possible poison title/url words and phrases that come to mind: UBB, BBS, Ebay, and all variations on the pa-id to surf program keywords.

Why Would Forums Have Been Bad?
As stated above, I was not around on the web during that time period, so I can only guess as to why forum would have been such a bad word.

Largely I think it would have come down to two factors:

  • overweighting of forums in the search results

  • how easy it was (and still is) to spam forums

In early 2000 there were far fewer pages on the web than there are today. Because of the textual nature of forums and how many pages forum conversations generate, it would not surprise me if forums tended to make up too large a percentage of the search results, and thus engines had to offset that by deweighting forums.

Things which show either a lack of moderation of content or page contents that are not vetted by the site publisher may make search engines want to consider deweighting a page. Imagine a page with few inbound links from outside sites and 100 external links on the page, and all 100 links used the nofollow attribute. If you were an engine would you want to trust that page much? I wouldn't.
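As a sketch of how you (or an engine) might measure that, here is a small Python example using the standard library's html.parser to count how many of a page's links carry rel="nofollow" (the class name and sample HTML are my own illustration):

```python
from html.parser import HTMLParser

class NofollowCounter(HTMLParser):
    """Count links on a page and how many carry rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += 1
            rel = dict(attrs).get("rel") or ""
            if "nofollow" in rel.lower():
                self.nofollow += 1

html = '<a href="/x" rel="nofollow">x</a><a href="/y">y</a>'
p = NofollowCounter()
p.feed(html)
print(p.links, p.nofollow)  # 2 1
```

A page where the nofollow count approaches the link count looks exactly like the untrusted, unvetted page described above.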

The Web Was Much Smaller:
To put it in perspective, back in early 2000 Google was still pushing people toward the Google Directory on their home page, had a link to their awards page pushing Yahoo!, and did not yet display the indexed page count that they would later show on their home page for about 4 or 5 years. On June 26th of 2000 Google announced that they had 560 million full-text indexed web pages and 500 million partially indexed URLs. Right now WebmasterWorld has over 2 million pages in Google's index, so you can see how a few large forum sites would be able to dominate a search index that small. Combine that with many forums being aggressively spammed by internet marketers and the content seems less desirable.

Deweighting User Interaction:
As far as deweighting pages that allow user interaction that makes sense as well. Why? Because for most sites the page and site gain authority primarily for the actions of the site owner or paid editors. If third parties can add content to a page they can influence the relevancy of that document, and thus leverage the authority of the original author without much expense. That is why search engineers pushed the nofollow attribute so hard.

Plus if pages and sites are legitimate and allow value added useful community interaction typically those sites will get more links and authority, so knocking them down a bit for allowing interactivity and third party publishing does not really hurt them - since the legitimate sites would make that right back through gaining more citations.

Turning a Page Into Spam:
I don't search as much as I would like to because I spend too much time reading and writing stuff (and not enough time researching), but on occasion I search around. I have seen totally unrelated blog posts rank #1 on Google for certain types of niche pornography because someone came by and left a comment that made that document become relevant to the uber gross porn query.

Blog Comment and RSS Spam:
In a recent post on SEO Buzz Box DaveN hinted that comments may make a page be seen as less clean, and thus give a search engine a reason to deweight it. Combine that with the vastly growing field of citation spam and it makes sense that Google would not want to promote similar content that is only differentiated by ad placement and a few third party comments.

Ebb and Flow:
So given that forums were a type of content that may have been overrepresented and undesirable, it is worth noting that right now they may be considered better than they once were. Perhaps contextual advertising programs and the rebound of online advertising have given forum owners more compensation, which allows them to run better forums. Also, algorithms are more link focused, and most forum pages tend to score naturally poorly because there are so many pages compared to the quantity and quality of inbound links to most forums.

Search engines constantly battle with marketers for what types of sites to rank in the search results.

Sometimes you will notice Amazon and large vertical sites ranking for almost everything under the sun. At other times directories are given more weight than would seem logical.

In late 2003, around the time of the Google Update Florida directories started showing up way too much in the search results. People took advantage of the opportunity and thousands of vertically focused or general PageRank selling directories sprung up.

Since then many of those directories seem to be packing less punch in the SERPs - in direct rankings and with how much their links help other sites.

Closing Holes Opens New Ones:
So what you see is wave after wave of content type. As search engines close some holes they open up others. When WG and Oilman interviewed Matt Cutts they also spoke about how the face of spam has - at least for now - moved from blog spam sites to subdomains off established sites. Right now Google is putting too much weight on old established sites.

Blogs Getting Away With a Bit Much:
With all of the blog networks springing up right now I wouldn't be surprised if some search engineers were starting to get sick of blogs, and looking for ways to deweight some of those networks as well. That is another example of why forums may become more desirable...if blogs are so hot that everyone and their dog has 5 of them maybe the people who are looking to make a quick buck are going to be more inclined to run blogs than forums.

Poison Words No Longer Needed?
That sorta leads me into my next point. I don't think poison words in their old traditional sense are as important as they may have been.

I still think the concept of poison words has a role, but it is likely minimal beyond affecting how much search engines can trust citations; i.e., pages that flag for poison words may not pass as much outbound link authority.

The inverse rule of link quality states that the effect of a link is going to be inversely proportional to how easy it is for a competing site to gain that same link.

So if the words "add URL" and "buy PageRank" are on the page those links may not count as much as other types of links. On this page Ciml noted how guestbook pages were not passing PageRank, but then Google undid that, at least to some extent. Poison words may not be necessary to deweight low quality links though. De-weighting may occur fairly naturally via other algorithmic mechanisms that generally parallel the effect of poison words:

Search engines collect more data and have far better technology as well. If pages are not found useful by searchers then they will eventually rank lower in the search results.
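To make the idea concrete, here is a minimal sketch of how a scorer might discount links found on pages containing poison phrases, applying the inverse rule of link quality. This is purely illustrative, not Google's actual algorithm; the phrase list and halving scheme are my own invented assumptions.

```python
# Illustrative sketch only -- NOT Google's actual algorithm.
# The phrase list and the halving penalty are invented assumptions,
# chosen to show the inverse rule of link quality: the easier a link
# is to obtain, the less authority it should pass.

POISON_PHRASES = ["add url", "buy pagerank", "submit your site", "guestbook"]

def link_weight(page_text: str, base_weight: float = 1.0) -> float:
    """Return a discounted weight for outbound links found on this page."""
    text = page_text.lower()
    hits = sum(1 for phrase in POISON_PHRASES if phrase in text)
    # Each matched phrase halves the authority the page can pass on.
    return base_weight / (2 ** hits)

print(link_weight("Welcome to our directory. Add URL here, buy PageRank!"))
# two phrases match -> 1.0 / 4 = 0.25
print(link_weight("An editorial article with no directory phrases."))
# no matches -> full weight of 1.0
```

A real engine would of course lean on many more signals (link growth patterns, usage data, page age), but the shape of the discount is the point.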

Establishing Trust:
So right now - and going forward - search relevancy will be about establishing trust. How trust is established will continue to evolve. Those who have more trust will also be able to get away with more aggressive marketing. Some new sites that use the DP coop network do not do that well with it, but sites that are either old and/or have built up significant usage data via email or viral marketing seem to be able to do more with it.

Google's Informational Bias:
Also note that Google tends to be a bit biased toward sites they believe to be informational in nature. Yahoo! Mindset shows how easy it is for search engines to adjust that sort of bias. You could think of words like "shopping cart" and "checkout" as being treated as poison words, but it is highly likely that if a merchant site provides a useful, feature-rich page, search engines want that content. Most merchant sites that are getting whacked in Google are likely getting whacked for having thin sites with near duplicate content on most pages or for having unnatural linkage profiles.

Many thin affiliate sites are also getting hit for having no original content and outbound affiliate links on nearly every page.

Improving Content Quality:
With every informational database Google pushes, they first focus on collecting as much of it as possible; then, as time passes, they learn to better understand it (looking ultimately at human interaction) and push for the creation of higher quality content. Most web based publishers will face a huge struggle with balancing content quality and content cost.

The only way their business model works is if others allow them to give people free access to high quality content. I don't think that poison words are necessarily needed to do that though...at least not for most natural created-for-human pages in their general search database.

Vertical Search:
Some vertical search engines may use certain words for inclusion or exclusion in their database. For example look at Edgeio or NFFC's post on Become.com.
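As a rough sketch of that inclusion/exclusion idea, a vertical listings engine might use trigger words to decide whether a page even belongs in its index. The signal list and threshold below are hypothetical, invented for illustration; they are not how Edgeio or Become.com actually work.

```python
# Hypothetical sketch of word-based *inclusion* for a vertical search
# engine (e.g. a shopping or listings crawler). The signal phrases and
# threshold are invented for illustration -- not any real engine's rules.

LISTING_SIGNALS = ["for sale", "price", "add to cart", "checkout", "shipping"]

def looks_like_listing(page_text: str, threshold: int = 2) -> bool:
    """Include a page in the vertical index if enough signals match."""
    text = page_text.lower()
    return sum(phrase in text for phrase in LISTING_SIGNALS) >= threshold

print(looks_like_listing("Vintage guitar for sale. Price: $450, free shipping."))
# three signals match -> True, page enters the listings index
print(looks_like_listing("A blog post about search engine history."))
# no signals match -> False, page is excluded
```

The same words that might hurt a page in a general informational index can be exactly what qualifies it for a vertical one.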

Alternate Meaning for Poison Words:
Some people have also called terms poison words because some of them throw off contextual ad targeting.

Google allows you to use section targeting to help target your AdSense ads away from common generic words like blog.
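For reference, section targeting works by wrapping the content you want AdSense to emphasize (or ignore) in special HTML comments. This is the documented markup format; the surrounding sidebar content is just an example:

```html
<!-- google_ad_section_start -->
<p>The main article content you want the ads targeted against.</p>
<!-- google_ad_section_end -->

<!-- google_ad_section_start(weight=ignore) -->
<p>Sidebar, blogroll, or other generic text (e.g. the word "blog"
   repeated everywhere) that should not influence ad targeting.</p>
<!-- google_ad_section_end -->
</p -->
```

The `(weight=ignore)` variant tells the contextual targeting system to skip that section entirely.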

Flash Designer Marketing Idea...

I still want this blog to primarily be about SEO, but I am going to start posting a lot more about other web aspects and other marketing ideas I have, because as the algorithms advance those who have great holistic or viral ideas will be the ones who win. Those who chase the algorithms will need smarts and resources beyond what the average person has. Almost anyone can be creative, and if you tune in to culture the marketing ideas tend to throw themselves at you.

Recently, on a hunting expedition, Dick Cheney shot a 78-year-old man.

I am not a flash designer, but if I were I would love to create a flash game called Hunting With Dick.

If someone does it and does it well they should easily get a PageRank 7, a higher PageRank than this site has. Some source material:

I really would love to see this game. Anyone think I should hold a prize giving contest?

Won't It Piss Some People Off?
Of course it would, but recently a ringtone company created a fake sexual ringtone site called Pheretones. It spread like wildfire.

"You run the risk in any campaign like this that you might offend somebody," he said. "But even if you offend somebody, it seems to spread the gospel of the campaign."

Conversation is the key to traffic.

Ultimately most people working on the web are going to get squeezed as marketing inefficiencies get taken care of.

Why not create many doorways to your personality so people with similar interests can find you? Why not work for clients that you can be passionate about? Imagine if every new client was your favorite person to work with.

When I interviewed NFFC he stated:

I think the best brands, the best sites have a large portion of their founders personality in them. Never be afraid to be yourself, after all there are 1/2 billion people on the www, not all of them have to agree with you. Concentrate on the ones that share your views, concentrate on making their experience the very best it can be, the rest forget them.

Or to put it another way, the best sites say - this is what we do, this is how we do it, if you don't like it go somewhere else.
