eBay Subdomain Spam

23 out of the top 32 Google search results for titleist provx golf balls are ebay.com, subdomain.ebay.com, spam.subdomain.ebay.com, popular.spam.subdomain.ebay.com, etc.

Maybe they didn't intend to use the subdomains so aggressively. Maybe it is not search spam. But even if that is the case, it is a display of pathetic relevancy algorithms by Google. Time to move away from core domain authority or put a cap on the subdomains, Google.

Google is the Ad Agency

Google is further commoditizing the role of the ad agency. They are signing up advertisers interested in trying their new Ad Creation Marketplace:

In the Ad Creation Marketplace, you'll find industry professionals who can provide script writing, editing, production, and voice-over talent at an affordable package cost. It's free to search for and send project bids to specialists, and you aren't under any obligation to work with them until you accept a bid.

Google's great value has been in targeting and tracking, but text and image ads are not as emotionally appealing as video ads. As they try to move up the value chain to capture branding ad dollars, they are trying to create a support ecosystem that helps the market grow quickly and keeps the market as efficient as possible.

How much does Google expect ad production to cost? Crumbs:

Once you accept a specialist's bid, you might expect to spend anywhere between $100 and $1000 for your ad.

Warning: This Site May Distribute Spyware & Adware

Nick Carr posted about Google's plans to police the web. Imagine that if you give Google your data, they certify you with some symbol of trust. And if you don't, you are less likely to be certified unless you have a preponderance of other quality signals. Guilty until proven innocent is the way of the relevancy algorithms. Why would the safety algorithms be any different?

What happens to your site's rankings and sales if it is unrated or deemed potentially risky? Fear is a compelling marketing mechanism. AOL has used it for how many years?

Associated Press Considers Selling News A La Carte

From a story about AP by AP:

As newspapers focus increasingly on locally relevant news, Curley said the AP is proposing changes that would allow members to subscribe to a core package of breaking news and then add other news packages. Currently, it offers broader packages of news defined mainly by the volume of news delivered -- small, medium or large.

So a near monopoly is breaking up how it sells content to other news agencies? I think that more than anything else shows the effect search and the internet are having on news agencies.
The media is addicted to search, and Google is keeping them addicted by giving them a bit more traffic. The NYT is already republishing old stories to spam Google. Eventually I wouldn't be surprised to see the AP sell chunks of stories that local papers can choose to wrap their own content around, to get past duplicate content filters.

Microsoft just launched AdWriter (a free tool to write ad copy), and Thomson Financial already admits to using robots to automatically write some of their stories:

Thomson Financial has been using automatic computer programs to generate news stories for almost six months. The machines can spit out wire-ready copy based on financial reports a mere 0.3 seconds after receiving the data.

This movement toward efficiency and recycling is the exact opposite of what the papers need to do if they want to stay relevant, but the machines are already in motion, doing everything from writing the news to trading stocks:

Quants seek to strip human emotions such as fear and greed out of investing. Today, their brand of computer-guided trading has reached levels undreamed of a decade ago. A third of all U.S. stock trades in 2006 were driven by automatic programs, or algorithms, according to Boston-based consulting firm Aite Group LLC. By 2010, that figure will reach 50 percent, according to Aite.

As established trusted authorities and rich power sources move toward automation and efficiency, who could beat them? Probably Google, but then what's left to trust but robots?

Search Engines Giving You the Tools to Kill Yourself

Many publishers hide additional information sections that readers can choose to expand if they are interested in the topic. For example, each of Think Progress's navigational sections is expandable, and other publishers use "more information" links or similar cues to make additional page content visible. These can be used deceptively, but if you have a strong brand and are trying to use them with the end user in mind, I doubt search engines will think the intent is bad.
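For what it's worth, the pattern itself is simple: the extra content already sits in the page markup, and a click just toggles whether it is shown. A minimal sketch (the element ids here are made up for illustration):

```ts
// Minimal expandable-section sketch: content stays in the markup, a click toggles visibility.
// The ids "#more-info-toggle" and "#more-info-panel" are hypothetical.
const toggle = document.querySelector<HTMLElement>('#more-info-toggle');
const panel = document.querySelector<HTMLElement>('#more-info-panel');

if (toggle && panel) {
  panel.hidden = true; // collapsed until the reader shows interest
  toggle.addEventListener('click', () => {
    panel.hidden = !panel.hidden;
  });
}
```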

AdSense Section Targeting:

As search has taken a larger and larger piece of the web, search engines have given us ways to mark up our pages to suit their needs. AdSense section targeting made it easier for Google to target content ads to your site. That sounds like a good idea, but they also offer tags that provide publishers no value.
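As I recall, section targeting works through HTML comments wrapped around the content you want AdSense to weight (with a weight=ignore variant to de-emphasize a block); check Google's AdSense help for the exact markers before relying on this sketch:

```ts
// A sketch of wrapping content with the AdSense section targeting markers (verify the exact
// comment syntax against Google's documentation; this is from memory).
function emphasizeForAdSense(contentHtml: string): string {
  return [
    '<!-- google_ad_section_start -->',
    contentHtml,
    '<!-- google_ad_section_end -->',
  ].join('\n');
  // There is also, as I recall, a "google_ad_section_start(weight=ignore)" variant
  // for telling AdSense to pay less attention to a block.
}
```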

Google's NoFollow:

Nofollow was originally recommended to stop blog comment spam, but it has morphed into a tag that Matt Cutts wants you to use on any paid or unnatural link. What makes a link unnatural? In one form or another almost everything is paid for, whether by giving away value, exchanging currency, or nepotism.
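For reference, applying it is just a rel value on the anchor. A rough sketch of tagging links to known sponsors (the sponsor hostname below is made up for illustration):

```ts
// Sketch: add rel="nofollow" to links pointing at known sponsors.
// "sponsor.example.com" is a hypothetical placeholder.
const sponsorHosts = new Set(['sponsor.example.com']);

document.querySelectorAll<HTMLAnchorElement>('a[href]').forEach((a) => {
  const host = new URL(a.href, location.href).hostname;
  if (sponsorHosts.has(host) && !/\bnofollow\b/i.test(a.rel)) {
    a.rel = (a.rel + ' nofollow').trim(); // renders as <a href="..." rel="nofollow">
  }
});
```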

Do You Trust Yourself?

If a page has many nofollow tags on it, isn't that another way of saying that the publisher does not trust their own content? If a publisher says that they don't trust their own content or their own advertisers, then why would search engines (or savvy webmasters) want to trust them?
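If you want to put a number on that, a quick audit is to look at what share of a page's links the publisher actually vouches for:

```ts
// Quick audit: what fraction of the links on the current page carry nofollow?
const allLinks = Array.from(document.querySelectorAll<HTMLAnchorElement>('a[href]'));
const nofollowed = allLinks.filter((a) => /\bnofollow\b/i.test(a.rel)).length;
console.log(`${nofollowed} of ${allLinks.length} links on this page are nofollowed`);
```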

The Machine is Broken:

Bob Massa recently highlighted how absurd the current use of the nofollow attribute is:

Mr. Cutts, speaking on behalf of Google presumably, made the comment, "if you want to buy links just for traffic, totally fine just don’t do it so they affect search engines".

This concept is completely flawed. This self serving philosophy is also at the very core of the problem. When the machine attempts to modify the behavior of people to satisfy it’s own ends, the machine is broken. What people do should not be seen as affecting the search engine. What people do should be the very reason for the engine to exist in the first place. If the search engine is being affected by the actions of people, is any logical person going to honestly assume that it is the people that are broken? That is exactly what is happening here.

Yahoo!'s Robots-Nocontent Attribute:

Search engines have got better at identifying duplicate content. Some search engines may strip obvious boilerplate navigational elements from pages. Some may place pages with too much duplicate content in supplemental results. Some may put sites with too much duplicate content in reduced crawling status.

There are all of these ways to fight off content duplication, and now Yahoo! offers a robots-nocontent attribute. One of the first people to comment on the news was Google's Matt Cutts, who said:

Danny, can you ask how Yahoo intends to treat links in the "robots-nocontent" section?
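For reference, Yahoo!'s announced syntax is just a class value on the elements you want excluded, so applying it is trivial. A sketch (the selectors here are made up for illustration):

```ts
// Sketch: flag obvious boilerplate so Yahoo! would ignore it for relevancy purposes.
// "#sidebar" and "#footer" are hypothetical selectors.
document.querySelectorAll<HTMLElement>('#sidebar, #footer').forEach((el) => {
  el.classList.add('robots-nocontent'); // e.g. <div id="sidebar" class="robots-nocontent">
});
```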

Don't Use the Robots-Nocontent Attribute:

It might be easy to add class="robots-nocontent" to some of your divs, but should you? I think it has little value. Sure, you could use it in a sneaky way, as suggested by Jay Westerdal, but the problems with that are:

  • it looks sneaky

  • you are removing content from your pages (and will thus rank for fewer phrases)
  • there are easier and more effective ways of changing the meaning of a page without looking so sneaky...like just rewriting an article, adding a spammy comment that looks like it came from a third party, or adding a few additional words here or there.

Yahoo! is the top network of sites on the web. Internally they have publishing teams and an SEO team. If their search engineers can't figure out how to use their own internal traffic stats and other relevancy measurements to refine their duplicate detection algorithms, they deserve to bleed market share until they no longer have relevancy in the marketplace.

How to Change the Focus of a Page Without Using Robots-Nocontent:

If you want to change the focus of your pages, here are some of the best ways to do it:

  • Ensure your page title and meta description are unique. Do not place the same words at the start of every page title on all the pages of a new website (a quick way to check for duplication is sketched after this list).

  • Make your h1 headings and subheadings target a slightly different word set than your page title.
  • If your page is thin on content, add more relevant unique content to the page. The solution to not getting killed by duplicate content filters is adding more unique content, not stripping out obvious required duplication (such as navigation and advertisements) that search engines should be able to figure out.
  • If your site has comments or consumer feedback you can post or encourage feedback that targets other keywords. Comments offer free text. A 500 word page with an additional 1,000 words in the comment section may rank for 2 or 3 times as many search queries. Don't throw away the free content.
  • For those who are really aggressive and have crusty links that will never be removed, consider placing your commercial messages on one of your highly trusted, high-ranking pages. People buy and sell websites; who is to say that the contents of a URL can't change?
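Here is the duplication check referenced in the first point, as a rough sketch; the page data is hypothetical, and in practice you would feed it from your own crawl or CMS export:

```ts
// Sketch: flag pages that share the same title or meta description.
interface PageMeta {
  url: string;
  title: string;
  description: string;
}

function findDuplicates(pages: PageMeta[], field: 'title' | 'description'): Map<string, string[]> {
  const groups = new Map<string, string[]>();
  for (const page of pages) {
    const key = page[field].trim().toLowerCase();
    groups.set(key, [...(groups.get(key) ?? []), page.url]);
  }
  // Keep only titles/descriptions that appear on more than one URL.
  return new Map([...groups].filter(([, urls]) => urls.length > 1));
}
```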

Conversion Opportunity Pie

When looking at the potential upside of conversion improvements, we tend to overestimate the opportunity. Avinash Kaushik looks at filtering out bounces, bots, and mismatched user intent to see your true conversion opportunity.
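A rough sketch of that kind of filtering, with hypothetical numbers: strip out bots, bounces, and visitors whose intent you cannot serve before judging the upside.

```ts
// All numbers below are made up, purely to show the shape of the calculation.
interface TrafficSnapshot {
  sessions: number;
  botSessions: number;
  bounces: number;
  wrongIntentSessions: number;
  conversions: number;
}

function conversionOpportunity(t: TrafficSnapshot) {
  const addressable = t.sessions - t.botSessions - t.bounces - t.wrongIntentSessions;
  return {
    rawConversionRate: t.conversions / t.sessions,
    trueConversionRate: t.conversions / addressable,
    addressableSessions: addressable,
  };
}

console.log(conversionOpportunity({
  sessions: 10000, botSessions: 1000, bounces: 4000, wrongIntentSessions: 2000, conversions: 150,
}));
// Raw rate 1.5%, but only 3,000 sessions were ever realistically winnable (true rate 5%).
```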

Avinash recently did a podcast interview with Simon Chen.

Castles Made of Sand

The .TV relaunch was not very successful because the premium domain name prices are yearly recurring fees (which may increase beyond that price by some unknown amount). People who would create great content and later stumble into a business model are not likely to do so on a premium .tv name...which means most of those domain names won't have high quality content on them. Those that do may see thin profit margins because the registrants have no control over their domain names...as they make them more valuable the registry can increase prices without mercy, and when the registrant can no longer afford the domain names the registry gets to keep or sell any brand value the registrant built up.

John Scott had a great post about people feeling guilty for buying links:

If you feel guilty about buying links, you’re probably feeling guilty for a reason. Perhaps your guilt comes from the fact that you are trying to rank a site for a keyword it doesn’t deserve ranking for. If that’s the case, stop trying to rank it for that keyword, and get you’re business in order. Get the site and your business to the place where you can honestly say that your site deserves to rank #1 for that keyword, and your business deserves to be #1 based on the merits of your business.

Yesterday at a market my girlfriend and I bought some terrific soap. The problem is that the packaging has the supplier's URL on it, so there is no need to go back to the vendor again. It is hard for the end vendor to get any traction when the supplier sells directly, has pricing control, and is more convenient to order from.

If you are too dependent on any one supplier or any one source of leads, then you need to re-evaluate your position to decrease your risk profile and come up with ways to build your brand value.

The Overton Window & Shifting Trends in Public Policy

Social policy (and profitable business models) shift over time. Swords Crossed describes the Overton Window:

The mission of a think tank is to introduce ideas into public discourse and normalize them within the public discourse. ...

One useful tool is the Overton window. Named after the former vice president of the Mackinac Center for Public Policy who developed the model, it's a means of visualizing where to go, and how to assess progress. ...

If you're of an analytic bent, and want to figure out where a legislative or policy strategy is heading, try constructing the scale of possibilities and the Overton window for the subject at hand. Change can happen by accident, true: but it is just as often the product of deliberation and intent, and it does all of us well to understand the mechanisms by which it occurs.

Increasingly, bogus advertisements are being packaged and distributed as content. If you watch how people like Frank Luntz or Glenn Beck aim to manipulate public perception and language, you can predict trends quicker than competitors do. Even if you do not agree with their message, you can still profit from the trend while undermining their goals.

Why I Love Google's Supplemental Index

Forbes recently wrote an article about Google's supplemental results, painting it as webpage hell. The article states that pages in Google's supplemental index are trusted less than pages in the regular index:

Google's programmers appear to have created the supplemental index with the best intentions. It's designed to lighten the workload of Google's "spider," the algorithm that constantly combs and categorizes the Web's pages. Google uses the index as a holding pen for pages it deems to be of low quality or designed to appear artificially high in search results.

Matt Cutts was quick to state that supplemental results are not a big deal, as Rand did here too, but supplemental results ARE a big deal. They are an indication of the health of a website.

I have worked on some of the largest sites and networks of sites on the web (hundreds of millions of pages or more). When looking for duplicate content or information architecture related issues, the search engines do not let you view deep enough to see all indexing problems, so one of the first things I do is use this search to find low quality pages (i.e., things that suck PageRank and do not add much unique content to the site). After you find some of the major issues you can dig deeper by filtering out some of the core issues that showed up on your first supplemental searches. For example, here are threadwatch.org supplemental results that do not contain the word node in the URL.

If you have duplicate content issues, at best you are splitting your PageRank, but you might also affect your crawl priorities. If Google thinks 90% of a site is garbage (or not worth trusting much), I am willing to bet that they also trust anything else on that domain a bit less than they otherwise would, and are more restrictive with their willingness to crawl the rest of the site. As noted in Wasting Link Authority on Ineffective Internal Link Structure, ShoeMoney increased his search traffic 1400% after blocking some of his supplemental pages.
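To make the splitting point concrete with made-up numbers (this is nothing like the real PageRank math, just the intuition):

```ts
// Crude illustration: links pointing at the same content get divided among duplicate URLs
// instead of consolidating on one page. Numbers are hypothetical.
function equityPerUrl(inboundLinks: number, duplicateUrls: number): number {
  return inboundLinks / duplicateUrls;
}

console.log(equityPerUrl(90, 1)); // 90 — one canonical URL keeps all of the link equity
console.log(equityPerUrl(90, 3)); // 30 — the same links spread across three duplicate URLs
```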

Google Promoting the Hell Out of YouTube

YouTube is taking over the organic search results, and they are buying a ton of AdWords ads. Google doesn't even take the time to write relevant ads when promoting YouTube. Their second ranked ad for the word music doesn't even contain the word music.

Clearly, with an ad so irrelevant, they are not factoring quality score into YouTube's ad position, or if they are, they are paying themselves over $5 a click for the word music. How is a competing service supposed to compete when they are forced to write relevant ads to be able to afford the click, and Google serves itself sloppily targeted, broad-reaching branded ads?
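One way to see how an irrelevant ad implies an effective cost like that, using the simplified ad rank model Google described at the time (ad rank ≈ max CPC bid × quality score) and made-up numbers:

```ts
// Simplified ad rank model; all bids and quality scores below are hypothetical.
function bidNeededToOutrank(rivalBid: number, rivalQualityScore: number, yourQualityScore: number): number {
  return (rivalBid * rivalQualityScore) / yourQualityScore;
}

// A rival bidding $1.50 with a quality score of 7, against an ad with a quality score of 2:
console.log(bidNeededToOutrank(1.5, 7, 2).toFixed(2)); // "5.25" — roughly the >$5-a-click scenario
```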

As Google arbitrages itself one vertical at a time, the only businesses that are safe are:

  • those that get traffic AROUND Google

  • those that provide Google with the high quality content that Google is forced to rely on to keep spam out of their SERPs
  • those that Google does not have enough data or brand strength to compete with, or that have some important offline component
  • those that are so small that Google doesn't want to compete
