The Power of a Generic Domain Name

Recently I have been getting A LOT of queries asking how my book compares to a newsletter service from a competing company. I guess it is because they formatted their sales letter in a way that ends up promoting my site. I love my domain name! :)


The SEO Bubble

Last week I was interviewed about SEO for articles by Forbes and the Wall Street Journal. This week Forbes, which hosts doorway mesothelioma pages, has another one titled "Should You Hire a Search Engine Consultant?" Due to Google's push of their news vertical, the Forbes article quickly ranked #4 in Google for "seo", which pushes me down another spot. Arg ;)

The WSJ also published another article about SEO, which includes news of a mom naming her child after a toilet bowl company because the name is rare.

As if the news coverage wasn't bad enough for heating up market competition, some SEO firms are investing heavily in automation technology and sharing that story publicly, while Google is emphasizing old authority sites (killing small new sites both in organic search and in ad quality scores).

By the time something is widely talked about, the easy ROI is already on the downward slope. Buying domains was really profitable about 5 years ago, when the first web bubble burst, but now some of the sharpest domainers are paying 140 years' worth of revenue for domains. If you are new to a market, how can you compete with that?

And since search is almost as old as the web, and search engines collect so much usage data, it is hard to compete without a serious budget or an original marketing angle. Many of the sharpest minds in SEO have moved beyond just doing SEO, because if you only do SEO you will only make a fraction of what you would make spending that same time on things that are becoming relatively easier for real SEOs, like folding SEO into a holistic marketing mix and creating real brands. But if one's core profession were not SEO, how would there be enough time? Who has time to be a subject matter expert and provide customer service while learning branding, marketing, monetization, etc. on the side?

Worse yet, the window of opportunity for each new opportunity gets shorter and shorter. Social media is already too hyped to be of much value for most webmasters. People buy votes from top contributors, and PR firms are sending out iPods that publishers can keep if they are willing to review them and associate them with a specific merchant. Everyone is buying links one way or another, and if you don't have a budget or some serious creativity you are screwed as a would-be SEO.

eBay Subdomain Spam

23 out of the top 32 Google search results for titleist provx golf balls are ebay.com, subdomain.ebay.com, spam.subdomain.ebay.com, popular.spam.subdomain.ebay.com, etc.

Maybe they didn't intend to use the subdomains so aggressively. Maybe it is not search spam. But even if that is the case, it is a display of pathetic relevancy algorithms by Google. Time to move away from core domain authority, or put a cap on the subdomains, Google.

Google is the Ad Agency

Google is further commoditizing the role of the ad agency. They are signing up advertisers interested in trying their new Ad Creation Marketplace:

In the Ad Creation Marketplace, you'll find industry professionals who can provide script writing, editing, production, and voice-over talent at an affordable package cost. It's free to search for and send project bids to specialists, and you aren't under any obligation to work with them until you accept a bid.

Google's great value has been in targeting and tracking, but text and image ads are not as emotionally appealing as video ads. As they try to move up the value chain to capture branding ad dollars, they are trying to create a support ecosystem that helps the market grow quickly and keeps the market as efficient as possible.

How much does Google expect ad production to cost? Crumbs:

Once you accept a specialist's bid, you might expect to spend anywhere between $100 and $1000 for your ad.

Warning: This Site May Distribute Spyware & Adware

Nick Carr posted about Google's plans to police the web. Imagine that if you give Google your data, they certify you with some symbol of trust, and if you don't, you are less likely to be certified unless you have a preponderance of other quality signals. Guilty until proven innocent is the way of the relevancy algorithms. Why would the safety algorithms be any different?

What happens to your site's rankings and sales if it is unrated or deemed potentially risky? Fear is a compelling marketing mechanism. AOL has used it for how many years?

Associated Press Considers Selling News A La Carte

From a story about AP by AP:

As newspapers focus increasingly on locally relevant news, Curley said the AP is proposing changes that would allow members to subscribe to a core package of breaking news and then add other news packages. Currently, it offers broader packages of news defined mainly by the volume of news delivered -- small, medium or large.

So a near monopoly is breaking up how it sells content to other news agencies? I think that more than anything else shows the effect search and the internet are having on news agencies.

The media is addicted to search, and Google is keeping them addicted by giving them a bit more traffic. The NYT is already republishing old stories to spam Google. Eventually I wouldn't be surprised to see the AP sell chunks of stories that local papers can choose to wrap their own content around, to get past duplicate content filters.

Microsoft just launched AdWriter (a free tool to write ad copy), and Thomson Financial already admits to using robots to automatically write some of their stories:

Thomson Financial has been using automatic computer programs to generate news stories for almost six months. The machines can spit out wire-ready copy based on financial reports a mere 0.3 seconds after receiving the data.

This movement toward efficiency and recycling is the exact opposite of what the papers need to do if they want to stay relevant, but the machines are already in motion, doing everything from writing the news to trading stocks:

Quants seek to strip human emotions such as fear and greed out of investing. Today, their brand of computer-guided trading has reached levels undreamed of a decade ago. A third of all U.S. stock trades in 2006 were driven by automatic programs, or algorithms, according to Boston-based consulting firm Aite Group LLC. By 2010, that figure will reach 50 percent, according to Aite.

As established trusted authorities and rich power sources move toward automation and efficiency, who could beat them? Probably Google, but then what's left to trust but robots?

Search Engines Giving You the Tools to Kill Yourself

Many publishers hide additional information sections that readers can choose to view if they show interest in the topic. For example, each of Think Progress's navigational sections is expandable, and some publishers use "more information" links or other informational cues to make additional page content visible. These can be used deceptively, but if you have a strong brand and are using them with the end user in mind, I doubt search engines will think the intent is bad.
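To make that concrete, here is a minimal sketch of how such an expandable section is typically built (the IDs and copy here are invented for illustration, not Think Progress's actual markup):

    <script type="text/javascript">
    // Toggle the visibility of a collapsed section.
    function toggleSection(id) {
      var el = document.getElementById(id);
      el.style.display = (el.style.display == 'block') ? 'none' : 'block';
    }
    </script>

    <a href="#" onclick="toggleSection('more-info'); return false;">More information</a>
    <div id="more-info" style="display: none;">
      Extra detail that interested readers can expand. The text is still
      in the HTML source, so search engines can read it either way.
    </div>

Because the content stays in the source and is only hidden with CSS, intent is what separates a usability feature from cloaking.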

AdSense Section Targeting:

As search has taken a larger and larger piece of the web, search engines have given us ways to mark up our pages to suit their needs. AdSense section targeting made it easier for Google to target content ads to your site. That sounds like a good idea, but search engines also offer tags that give publishers no value.
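For reference, the section targeting markup is a pair of HTML comments wrapped around the content you want Google's ad targeting to emphasize, with a weight=ignore variant for content it should discount (the paragraph text here is invented):

    <!-- google_ad_section_start -->
    <p>The main article content that should drive contextual ad targeting.</p>
    <!-- google_ad_section_end -->

    <!-- google_ad_section_start(weight=ignore) -->
    <p>Navigation or boilerplate that should not influence ad targeting.</p>
    <!-- google_ad_section_end -->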

Google's NoFollow:

Nofollow was originally recommended as a way to stop blog comment spam, but it has morphed into an attribute that Matt Cutts wants you to use on any paid or unnatural link. What makes a link unnatural? In one form or another almost everything is paid for, whether by giving away value, exchanging currency, or through nepotism.
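The attribute itself is a one-word addition to a link (example.com used as a placeholder):

    <!-- An editorially given link that passes trust: -->
    <a href="http://www.example.com/">a resource I vouch for</a>

    <!-- A paid or untrusted link, flagged so engines discount it: -->
    <a href="http://www.example.com/" rel="nofollow">a sponsor</a>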

Do You Trust Yourself?

If a page has many nofollowed links on it, isn't that another way of saying that the publisher does not trust their own content? If a publisher says they don't trust their own content or their own advertisers, then why would search engines (or savvy webmasters) want to trust them?

The Machine is Broken:

Bob Massa recently highlighted how absurd the current use of the nofollow attribute is:

Mr. Cutts, speaking on behalf of Google presumably, made the comment, "if you want to buy links just for traffic, totally fine just don’t do it so they affect search engines".

This concept is completely flawed. This self serving philosophy is also at the very core of the problem. When the machine attempts to modify the behavior of people to satisfy it’s own ends, the machine is broken. What people do should not be seen as affecting the search engine. What people do should be the very reason for the engine to exist in the first place. If the search engine is being affected by the actions of people, is any logical person going to honestly assume that it is the people that are broken? That is exactly what is happening here.

Yahoo!'s Robots-Nocontent Attribute:

Search engines have gotten better at identifying duplicate content. Some search engines may strip obvious boilerplate navigational elements from pages. Some may place pages with too much duplicate content in supplemental results. Some may put sites with too much duplicate content in a reduced crawling status.

There are all of these ways to fight off content duplication, and yet Yahoo! offers a robots-nocontent attribute. One of the first people to comment on the news was Google's Matt Cutts, who said:

Danny, can you ask how Yahoo intends to treat links in the "robots-nocontent" section?

Don't Use the Robots-Nocontent Attribute:

It might be easy to add class="robots-nocontent" to some of your divs, but should you? I think it has little value. Sure, you could use it in a sneaky way, as suggested by Jay Westerdal (the basic markup is sketched after this list), but the problems with that are:

  • it looks sneaky

  • you are removing content from your pages (and will thus rank for fewer phrases)
  • there are easier and more effective ways of changing the meaning of a page without looking so sneaky...like just rewriting an article, adding a spammy comment that looks like it came from a third party, or adding a few additional words here or there.
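For reference, Yahoo!'s attribute is just a class value applied to whatever block you want their crawler to discount when scoring the page (the div contents here are invented):

    <div class="robots-nocontent">
      Navigation, advertisements, or other boilerplate that Yahoo!
      should ignore when judging this page's relevancy.
    </div>

    <p>The main content is left unmarked and fully indexed.</p>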

Yahoo! is the top network of sites on the web. Internally they have publishing teams and an SEO team. If their search engineers can't figure out how to use their own internal traffic stats and other relevancy measurements to refine their duplicate detection algorithms, they deserve to bleed market share until they no longer have relevancy in the marketplace.

How to Change the Focus of a Page Without Using Robots-Nocontent:

If you want to change the focus of your pages, here are some of the best ways to do it (a markup sketch follows the list):

  • Ensure your page title and meta description are unique. Do not place the same words at the start of every page title on all the pages of a new website.

  • Make your h1 headings and subheadings target a slightly different word set than your page title.
  • If your page is thin on content, add more additional relevant unique content to the page. The solution to not getting killed by duplicate content filters is adding more unique content, not stripping out obvious required duplication (such as navigation and advertisements) that search engines should be able to figure out.
  • If your site has comments or consumer feedback, you can post or encourage feedback that targets other keywords. Comments offer free text. A 500-word page with an additional 1,000 words in the comment section may rank for 2 or 3 times as many search queries. Don't throw away the free content.
  • For those who are really aggressive and have crusty links that will never be removed, consider placing your commercial messages on one of your highly trusted, high ranking pages. People buy and sell websites; who is to say that the contents of a URL can't change?
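Here is the markup sketch promised above, covering the first two points (the title, description, and heading are invented examples):

    <head>
      <!-- A unique title; don't lead every page with the same words. -->
      <title>Titleist Pro V1 Review: Distance and Spin Tested</title>
      <meta name="description" content="Hands-on review of the Titleist
        Pro V1 golf ball, comparing distance, spin, and feel.">
    </head>
    <body>
      <!-- The heading targets a slightly different word set than the title. -->
      <h1>Is the Pro V1 Golf Ball Worth the Price?</h1>
      ...
    </body>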

Conversion Opportunity Pie

When looking at the potential upside of conversion improvements, we tend to overestimate the opportunity. Avinash Kaushik looks at filtering out bounces, bots, and mismatched user intent to see your true conversion opportunity.
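As a made-up illustration of why the raw numbers overstate the upside (all figures hypothetical):

    10,000 total visits
    -  4,000 single-page bounces
    -  1,000 bot and scraper hits
    -  2,000 visits with non-buying intent (research, support, etc.)
    =  3,000 qualified visits

    150 sales / 10,000 visits = 1.5% raw conversion rate
    150 sales /  3,000 visits = 5.0% true conversion rate

The raw view suggests 9,850 lost customers; the filtered view says the real pool you can work on is closer to 2,850.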

Avinash recently did a podcast interview with Simon Chen.

Castles Made of Sand

The .TV relaunch was not very successful because the premium domain name prices are yearly recurring fees (which may increase beyond that price by some unknown amount). People who would create great content and later stumble into a business model are not likely to do so on a premium .tv name, which means most of those domain names won't have high quality content on them. Those that do may see thin profit margins because the owners have no control over their domain names: as they make them more valuable, the registry can increase prices without mercy, and when the registrant can no longer afford the domain names the registry gets to keep or sell any brand value the registrant built up.

John Scott had a great post about people feeling guilty for buying links:

If you feel guilty about buying links, you’re probably feeling guilty for a reason. Perhaps your guilt comes from the fact that you are trying to rank a site for a keyword it doesn’t deserve ranking for. If that’s the case, stop trying to rank it for that keyword, and get you’re business in order. Get the site and your business to the place where you can honestly say that your site deserves to rank #1 for that keyword, and your business deserves to be #1 based on the merits of your business.

Yesterday at a market my girlfriend and I bought some terrific soap. The problem is that the packaging has the supplier's URL on it, so there is no need to go back to the vendor again. It is hard for the end vendor to get any traction when the supplier sells directly, has pricing control, and is more convenient to order from.

If you are too dependent on any one supplier or any one source of leads, then you need to re-evaluate your position to decrease your risk profile and come up with ways to build your brand value.

The Overton Window & Shifting Trends in Public Policy

Social policy (and profitable business models) shift over time. Swords Crossed describes the Overton window:

The mission of a think tank is to introduce ideas into public discourse and normalize them within the public discourse. ...

One useful tool is the Overton window. Named after the former vice president of the Mackinac Center for Public Policy who developed the model, it's a means of visualizing where to go, and how to assess progress. ...

If you're of an analytic bent, and want to figure out where a legislative or policy strategy is heading, try constructing the scale of possibilities and the Overton window for the subject at hand. Change can happen by accident, true: but it is just as often the product of deliberation and intent, and it does all of us well to understand the mechanisms by which it occurs.

Increasingly, bogus advertisements are being packaged and distributed as content. If you watch how people like Frank Luntz or Glenn Beck aim to manipulate public perception and language, you can predict trends more quickly than your competitors do. Even if you do not agree with their message, you can still profit from the trend while undermining their goals.
