The Yahoo! Story

Worth a click, and full of cautionary reminders :D

The Yahoo! Story.

Money is an Arbitrary Value System*

* If you think otherwise, do you care to debate the following video?

"Perhaps the only lessons learned by the banks in the crisis is that the rules don't apply to them." Ain't that the truth. The video is completely accurate:

Richard M. Bowen, former chief underwriter for Citigroup’s consumer-lending group, said he warned his superiors of concerns that some types of loans in securities didn’t conform with representations and warranties in 2006 and 2007.

“In mid-2006, I discovered that over 60 percent of these mortgages purchased and sold were defective,” Bowen testified on April 7 before the Financial Crisis Inquiry Commission created by Congress. “Defective mortgages increased during 2007 to over 80 percent of production.”

Read that last sentence again.

The bankers KNEW they were committing fraud that would create trillions of dollars in losses.

Rather than suing them for appraisal fraud, accounting control fraud, securities fraud, mortgage fraud & foreclosure fraud, we have a government that assists them in the fraud by papering over their mistakes and forcing other institutions to waive the right to sue over the fraud.

Everyone else is forced to eat the socialized losses that mirror the "profits" generated by the banks in years past. We have a central bank holding rates at 0 while running the printing presses in overdrive. That is destroying pension accounts and the living standard of the elderly, as their life savings are literally stolen from them to prop up a bunch of criminal thugs on Wall Street.

The goal is not to create value or stability, but rather to engineer instability, with sharp market moves which can be profited from on the way up AND on the way down:

Bubble, Crash, Bubble, Crash, Bubble ...

We will continue this cycle until we catch on. The problem isn't only that the Fed is treating the symptoms instead of the disease. Rather, by irresponsibly promoting reckless speculation, misallocation of capital, moral hazard (careless lending without repercussions), and illusory "wealth effects," the Fed has become the disease.

The average person is fed backwards looking misinformation by the media, which is sponsored by those selling the bag to others.

If you are in debt for consumption-driven reasons, it is worth noting that the game is rigged. These people have acted, are acting, and will continue to act as criminals to rip you off. There is no reason to go out of your way to make yourself easy prey.

Are you anti-American? Do you have an axe to grind? Did you somehow fail at life? These are questions we are trained to think when anyone complains about scams embedded in the political economy.

Speaking for myself, I have never been as successful as I am today, and much of that success comes from the free market ideals that have been preached (and meeting my wonderful wife). The issue I have right now is that we still preach those free market ideals, but are doing nothing to follow them (and are actively undermining them). If I had been born 5 or 10 years later I might be a street beggar, simply because my opportunity to succeed would have been marginalized by the market corruption of the bankers. Maybe I still would have done OK, but a lot of people are being driven to the margins and off a cliff. Many of them are even sadly conned into believing it is their own fault.

Certainly consumers are to blame for some of this, but they had no idea how much fraud was embedded in the financial engineering going on beyond the headlines. They didn't know that the central banker believed there was no reason to police fraud. Most people are too busy to be macro-economic experts on the side, and yet if you made the mistake of trusting the bankers, in some cases they may have destroyed your life savings or left you living a meager life as a debt serf.

Back when the FBI warned that there was an epidemic of mortgage fraud (in 2004), the US federal government shifted some of those resources to policing other areas like homeland "security." When states tried to protect their citizens against predatory loans, the federal government used preemption to block them, allowing the fraud to continue. As Alan Greenspan stated: there was fraud, it was intentional, and it was indeed a series of criminal acts.

None of that is a matter of debate.

And none of the banking executives are in jail.

The federal government sided with the criminals on the way up. And it is siding with the criminals on the way down. They would rather debase the currency and steal your savings than let their criminal buddies on Wall Street go bankrupt.

Society is nothing more than a system of laws and the culture they promote. Betray someone's trust and they become less trusting toward everyone else. That creates friction in the marketplace, which shrinks the economy and living standards. Make heroes out of criminals & at some point a lot of people are going to say "screw it, the laws do not apply to me either." When that happens (as it will), at some point there will be a sharp increase in violence.

This is not about promoting communism, or some other such label. What we have now in the capital markets is far worse than communism, as they have privatized profits AND socialized losses. If you believe in capitalism OR communism then what is happening now is broken.

And yet the people who just privatized the profits and socialized the losses are to be worshiped and followed. You should be thankful that the government bailed them out & you should just suck it up. The role of government is to protect the wealth of the opulent from the stupid majority. It has always been.

The stupid thing about it is that all (or at least most) of that pain & suffering is easily avoidable by letting the people who preach free market values eat their losses and fail. By pushing those losses off onto everyone else, the government ultimately just creates uncertainty and makes people less trusting. And as society breaks down, the government won't be able to sort out the issues, as they already gave away all the money to a bunch of brigands on Wall Street.

They will dance while your family starves.

"I believe that banking institutions are more dangerous to our liberties than standing armies . . . If the American people ever allow private banks to control the issue of their currency, first by inflation, then by deflation, the banks and corporations that will grow up around [the banks] . . . will deprive the people of all property until their children wake-up homeless on the continent their fathers conquered . . . The issuing power should be taken from the banks and restored to the people, to whom it properly belongs." —Thomas Jefferson

Google Ranking Internal Pages Rather Than Home Pages

When you get to *really* competitive keywords the results tend to be fairly stable, because the cost of entry into the game is so high & many of the top players keep building additional signals of quality. You might get minor fluctuations from time to time, but large fluctuations on highly competitive keywords are fairly rare.

Over the last day or 2 Google has done yet another algorithm change (the 3rd or 4th noticeable one in 2 weeks), where on some searches they are ranking an internal page over the homepage. It is almost as if the best mental model for the algorithm that is doing this is...

  • find the top SITES that deserve to rank well & rank them based on that criteria
  • however, rather than ranking THOSE PAGES, instead do internal site searches & back in other relevancy factors to look for other popular & relevant pages on those sites
  • test to see how well searchers respond to them
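The mental model above can be sketched in code. This is purely illustrative: the data fields, scoring functions, and thresholds below are invented placeholders, since Google's actual signals are unknown.

```python
# Hypothetical sketch of the "rank sites first, then pick a page" model
# described above. All field names and the relevance function are
# assumptions for illustration only.

def page_relevance(query, page):
    """Toy relevance: count query-term occurrences in the page text."""
    terms = query.lower().split()
    text = page["text"].lower()
    return sum(text.count(t) for t in terms)

def rank_results(query, sites, top_n=10):
    """Stage 1: order whole SITES by authority-style signals.
    Stage 2: within each chosen site, run an internal "site search"
    and surface the most relevant page, which may be an internal
    page rather than the homepage."""
    scored_sites = sorted(sites, key=lambda s: s["site_authority"], reverse=True)
    results = []
    for site in scored_sites[:top_n]:
        best_page = max(site["pages"], key=lambda p: page_relevance(query, p))
        results.append((site["domain"], best_page["url"]))
    return results
```

Under this model, a site could hold its position in the results while the URL shown for it swaps from the homepage to an internal page as relevance signals shift.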

Here is a pretty overt example, where Google changed 2 of the listings for "SEO" to internal pages.

I have seen other examples, some where Google highlighted a new information-less blog post with only a couple of automated backlinks pointing at it. I don't want to "out" that site, but this is the type of image Google was showing beneath that entry.

Google could conceivably use this sort of process to further adjust the search results based on demographics, searcher location, recent searches, searcher interests, and so on. Add in the ability to send searchers down a known path optimized for profitability, to select vertical databases on the fly, and to change titles on the fly, and whoever has the most search market share can keep refining the results to make them more appealing to users at an ever increasing level of granularity & greater profitability.

I have no problems keeping up with the increasing complexity of search, but Google is setting up some serious barriers to entry for new players. It is hard to explain in a straightforward manner that page A might be ranking due to relevancy signals pointing into page B, but these are the SERPs through which we make a living. And it is only going to keep growing more complex. ;)

Depending on how far Google pushes with this, it can have major implications in terms of rank tracking, SEO strategy, site architecture & conversion optimization. More on that stuff in the community forums.

Google Replacing Page Titles in Search Results With On-Page Headings

When Bing launched, one of the interesting things they did to make the organic search results appear more relevant was to use link anchor text to augment page titles (where relevant). This would mean if people searched for a phrase that was mostly in your title (but maybe your page title was missing a word or 2 from the search) then Bing might insert those words into the page title area of your listing if they were in some of the link anchor text pointing into your page.

Before being switched over to Bing, Yahoo! would sometimes display the H1 heading as the clickable link to your site (rather than the page title). Bing also uses on page headings to augment page titles.

Historically, if Google thought it would appear more relevant to searchers, they have sometimes shown a relevant machine-generated piece of your page displaying those keywords in context in the snippet, rather than the meta description, but typically Google has been far more conservative with page titles. Sometimes Google would list the ODP title as the title of a page, but until recently they have generally just listed the page title as the clickable link to your site.

Recently Google has grown more experimental on this front, being willing to use link anchor text and on-page headings as part of the listing. In addition, if the page title is short, Google may add the site's name at the end of the title.
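The behavior described above can be sketched as a simple heuristic. Everything here is an assumption: the fallback order, the length threshold, and the separator are invented for illustration, not how any engine actually does it.

```python
# Hypothetical sketch of a SERP title-rewriting heuristic: if the page
# title misses query terms, fall back to an on-page heading or link
# anchor text that covers them; if the result is short, append the
# site's name. All thresholds are invented.

def display_title(page_title, h1, anchor_texts, query, site_name, min_len=25):
    terms = set(query.lower().split())
    title = page_title
    # Fall back to a heading or anchor text that covers the full query.
    if not terms <= set(page_title.lower().split()):
        for candidate in [h1] + anchor_texts:
            if terms <= set(candidate.lower().split()):
                title = candidate
                break
    # Short titles get the site name appended for context.
    if len(title) < min_len and site_name.lower() not in title.lower():
        title = f"{title} - {site_name}"
    return title
```

The practical takeaway is the same either way: the clickable link searchers see is increasingly a composite of your title, headings, and inbound anchor text rather than the title tag alone.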

Here is an example in Google of the page title being replaced by part of an on-page heading & also showing the site's name being added to the end of the link

And here is the associated on-page heading for the above

I have also seen a few examples of link anchor text being added to the page title in Google, however it was on a client project & the client would prefer that I didn't share his new site on an SEO blog with tens of thousands of readers. :D

Last November Matt Cutts did a video on the topic of Google editing page titles for relevancy & how it was a fairly new thing for Google. Even back then Google was quite conservative in editing the clickable link ... I think they have only grown more aggressive on that front in the past month or so.

Interview of Rich Skrenta

Rich Skrenta has been at the core of search longer than I have been in the SEO market. He is famous for launching sites like DMOZ and Topix. His most recent project is a search engine called blekko, and I recently had a chance to chat with him about blekko, the web, and marketing.

Blekko search engine.

Blekko just launched publicly today. Be sure to check out their search engine, all their SEO features, and the Blekko toolbar.

Most start ups fail. And yet you have multiple successes under your belt and are going at it again. If you could boil success down to a few points, what really separates what you have done from the statistics?

Paul Graham said that the most important thing for a startup is to not die each day. If you can keep existing, that's survival for a company. Generally I like to keep costs low and hire carefully. Also, the first idea doesn't always work. We had to pivot Topix several times to find the right model. For blekko, we just want to make a site that a segment of people will find useful. If we can do that we'll be happy.

It seems openness is a great marketing angle to use online. Why do you feel that it is so under-utilized by most companies?

It feels counter-intuitive to take all of your company's IP and secrets and just put them out there. Little companies also tend to be insecure and want to appear larger and more successful. They want to put on a big company face to the world, but being honest and transparent about who they are and letting the public see "behind the curtain" can often win people over better than a facade of success.

From my perspective, it seems your approach to marketing is heavily reliant on organic, viral & word of mouth strategies. What is broken with the old model of marketing? Is its death happening slower or quicker than you expect?

The internet and social media have made word-of-mouth stronger and stronger, and in many ways they eclipse traditional marketing channels now. This started with blogging and has accelerated with Twitter and Facebook. Everybody is media now. You used to fly around and do a 2 week media tour to launch a product. The aperture to get in the trade press was small, there was a handful of reporters you had to go pitch. Now there are thousands of people who have audience for every trade niche, so it's easier to get the word out about something new. But it has to be genuinely interesting, or your message won't get pickup.

A lot of people who are good at programming make ugly designs. Likewise many people are either programmers or marketers. What formal training or experiences have you had that have allowed an engineer to become such a sophisticated marketer? What strengths do you have that allow you to bridge the disciplines so well?

We joke that we have always made ugly web sites. Fortunately I was able to hire a good designer for blekko and he's been doing a great job taking our early ugly versions and making them a lot more attractive and workable.

I read a lot of stuff about marketing and positioning that we're trying to apply at blekko. I'm a big fan of Trout & Ries. I loved Kathy Sierra's stuff when she was writing. There is some fantastic material also in Kellogg on Branding. We also worked with some great positioning consultants that tested various ideas on focus groups to see what would resonate with users best as a message. Every product has a bunch of features, but you want to find the one to talk about that's going to stick in people's heads the best.

I noticed you baked many social elements into your marketing strategy (friend us on Facebook, follow us on Twitter) as well as baking many social elements into your product (personal slashtags, allowing people to share their slashtags, etc.). There is some talk on the web of apps or social stuff replacing search as the center of the web, however from a marketing perspective I see much higher traffic value in search traffic. Do you think that one day social and apps will largely replace global search? Or do you feel it will generally continue to play a secondary role to search?

Social media can drive tons of attention, awareness and traffic. But the search box is the best way to navigate to stuff you want. Now what will drive those results - if I type in "pizza", what should I get? The answer can be very different depending on whether the results are coming from the web, Yelp, or Facebook. So I guess my answer is that I still see search being the core way to navigate, but I think what gets searched is going to get a lot more structured and move away from simple keyword matches against unstructured web pages.

A good number of the social sites are doing redirects for security purposes & to some degree are cannibalizing the link graph. Do you feel that links from the social graph represent an important signal, or that most of that signal still gets represented well on the remaining link graph?

There is very definitely signal in social graph links - potentially more than in the web graph. In 2000, a hyperlink was a social vote. Most links were created by humans and represented an editorial vote. That's no longer true - the web today is inundated with bulk-generated links. To the extent that humans can be separated from bots, there's more true signal in social graphs. The challenge is to get enough coverage to rank everything you need to rank. Delicious had great search results for the corpus of links they knew about, but it wasn't nearly big enough to be comprehensive. Facebook and Twitter are certainly a lot bigger, it will be interesting to see if they start to apply their data to ranking and recommending material from outside of their own sites.

When Google was young, Sergey Brin stated at an SEO conference that there was no such thing as spam, only bad relevancy algorithms. When I saw some of your talks announcing Blekko you mentioned that you never want to see eHow in your personal search results. Do you feel that spam is largely down to personal opinion? If you had to draw a line in the sand between spam & not spam, how would you characterize the differences?

Search must serve an editorial function. You can call this editorial position "relevancy", but that's hiding behind the algorithm. Of course someone wrote the algorithm, and tinkered with it to make some sites come up and others not to come up.

The web has grown 100-fold since 2000. There is most definitely spam out there. Let's take a clear-cut example, like pharma links being injected via exploits into unpatched WordPress blogs. Then there is gray-area stuff, like eHow.com. Some people like eHow. Some don't. That's why we let users develop their own /spam filters.

Eric Schmidt mentioned that sharing their ranking variables would be disclosing trade secrets that could harm Google. Yet you guys are sharing your web graph publicly. Are you worried about doing this impacting your relevancy in a negative way? Or do you feel the additional usage caused by that level of awareness will give you more inputs into your search relevancy algorithms?

When I first moved to Silicon Valley I worked in computer security. In security there's an idea that "security through obscurity" isn't very good. What this means is that if you have some new encryption algorithm, but don't let anyone see the details of how it works, it probably is full of holes. The only way to get a strong encryption algorithm is to publish all of the details about how it works and have public review. Once the researchers can't punch any more holes in your algorithm, only then is it good enough to trust.

We see search the same way. If this magic 200-variable equation is so sensitive that if it leaked out the results would be completely overrun with spam, well then the algorithm doesn't actually sound that strong to me. We'd rather work towards a place where there can be public review of the mechanisms driving ranking, and where many eyes can make the spam problem shallow.

Certainly the big search engines have hundreds of human raters that help identify spam and train their algorithms. These are contractors that are the knowledge workers behind the scenes. As a little startup, we asked ourselves how we could get many more people helping us to make our results better, and also be a lot more open about the process. Previously we had experience running a big crowdsourced search site with the Open Directory, where we had 80,000 editors classifying URLs. What if we could get 80,000 people to help us curate search verticals, identify spam, and train classifiers? That would be cool.

You had a blog post comparing pornographers to SEOs. Do you feel the SEO game is mostly adversarial? Or do you feel that paying attention to the SEO industry is a great way to quickly improve the quality of a search product? Or both? :)

I think my comparison noted that pornographers have often been early adopters of new technology. :-)

There is aggressive seo, and then there is what I call appropriate discoverability. Aggressive seo can go over the line - if someone hacks your server to add links, that's borderline criminal activity. But if you have great content and it's not showing up, that's a shame. After we sold topix to the newspapers, we spent some time evangelizing seo within their organizations. Think of all of the movie reviews and restaurant reviews the US newspaper sites collectively have. Wonderfully written material by well-paid professional journalists. But you don't see their content anywhere for a restaurant or movie search. That's a shame.

Recently Ask sorta rebranded away from search & towards more of a QnA format, and Yahoo! bowed out of search through a Bing partnership. Are the cost scales that drive such changes just a legitimate piece of the business model, or were those organizations highly inefficient? How were you able to bring a competitive product to market for so much less?

I was a fan of Ask's Teoma technology, and what Jim Lanzone had been doing with the site. And Yahoo was delivering very high quality results, and had interesting initiatives like the BOSS apis and SearchMonkey. This was all great stuff. I'm disappointed that they lost heart. Running a big company that has been around for a long time is not an easy job.

From an SEO perspective I think that Google tends to have a large index, but crawling so deeply likely allows a lot of junk into their index. Bing seems to be a bit more selective with their crawling strategy. How would you compare Blekko against the other major search engines in terms of depth? Do you feel that relevancy boosts offered through vertical search (via your Slashtags) allows you guys to provide a similar or better experience without needing as large of an index?

Our crawler tends to go into highly ranked sites more deeply than poorly ranked sites. We have a 3 billion page crawl, and so we need to choose the best content to include. This starts at crawl time - should we crawl this url or that url? There are a whole set of heuristics which drive what crawl budget an individual site gets.
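One simple way to picture the kind of budgeting Rich describes is a rank-weighted split of a fixed crawl budget. This is a sketch under assumptions of my own, not blekko's actual heuristics, which he notes are a whole set of signals.

```python
# A minimal sketch of rank-weighted crawl budget allocation: highly
# ranked sites get crawled more deeply than poorly ranked ones. The
# proportional scheme and the minimum floor are assumptions.

def allocate_crawl_budget(site_ranks, total_pages, min_budget=1):
    """Split a fixed page budget across sites in proportion to rank."""
    total_rank = sum(site_ranks.values())
    budgets = {}
    for site, rank in site_ranks.items():
        share = rank / total_rank
        # Every known site gets at least a token crawl so it can be
        # re-evaluated later.
        budgets[site] = max(min_budget, int(total_pages * share))
    return budgets
```

With a 3 billion page ceiling, a scheme like this is what makes "should we crawl this url or that url?" a per-site question rather than a per-page one.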

The web keeps getting deeper and deeper - the challenge is how to return the good stuff and not sink. This is why we believe human curation needs to be brought back to search. Only by curating the best content in every vertical can the most relevant results be returned.

Amongst SEOs the issue of "brand" as a relevancy signal has been a topic of heated debates. How important do you feel brand is as a signal of relevancy & authority?

One of the things we look at is how natural the pattern of mentions of a site looks. Real brands tend to have a natural pattern of mentions on the web.

You had a blog post a few years back titled "PageRank wrecked the web." How do you feel about paid links? What editorial actions do you guys take when you find paid links?

If links have an economic value, they're going to be bought and sold. It's that simple. What happens in our ranker is that we classify different sources of signals, and then let the machine learning figure out what the signal is telling us. Is this a good source of anchortext? Or maybe a certain class of links even has a negative contribution to rank, if what the links are telling us doesn't correlate with the direction we want the ranker to be going.
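The idea of classifying link sources and letting learned weights (including negative ones) decide their contribution can be sketched like this. The classes and weight values here are invented for illustration; in practice the weights would come out of a machine learning pipeline, as Rich describes.

```python
# Toy illustration of class-weighted link scoring: bucket inbound links
# by source class, then sum learned per-class weights. A class whose
# links don't correlate with good rankings can carry a negative weight.

LEARNED_WEIGHTS = {          # placeholder values; learned in practice
    "editorial": 1.0,
    "directory": 0.2,
    "paid": -0.5,            # paid links contributing negatively
    "comment_spam": -1.0,
}

def link_score(links):
    """Sum the learned weight of each inbound link's source class.
    Unknown classes contribute nothing until the model learns them."""
    return sum(LEARNED_WEIGHTS.get(link["class"], 0.0) for link in links)
```

The design point is that nobody has to hand-police paid links: if a class of links is a bad predictor, the ranker simply learns to discount or penalize it.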

How hard is it to detect paid links? What has been the most challenging part of launching a world class search engine?

The whole thing has been hard. Search has so many sub-components, and even things that sound trivial like DNS turn into big projects when you need to scale them up to billions of web pages.

---

Thanks Rich! Be sure to check out blekko. You can follow them on Twitter & read Rich's musings on the web, search, and marketing at Skrentablog.

Localization, Unique Data Sets & the Future of Search

Local is Huge

Google's US ad revenue is roughly $15 billion & the size of the US Yellow Pages market is roughly $14 billion. Most of that money is still in print, but the shift is only accelerating with Google's push into local.

Further, cell phones are location aware, can incorporate location into search suggest, and on the last quarterly conference call Google's Jonathan Rosenberg highlighted that mobile ads were already a billion-dollar market for Google.

Google has been working on localization for years, and as a top priority. When asked "Anything you’ve focused on more recently than freshness?" Amit Singhal stated:

Localization. We were not local enough in multiple countries, especially in countries where there are multiple languages or in countries whose language is the same as the majority country.

So in Austria, where they speak German, they were getting many more German results because the German Web is bigger, the German linkage is bigger. Or in the U.K., they were getting American results, or in India or New Zealand. So we built a team around it and we have made great strides in localization. And we have had a lot of success internationally.

The Big Shift

I have been saving some notes on the push toward local for a while now, and with Google's launch of the new localized search results it is about time to do an overview. First here is Google's official announcement, and some great reviews from many top blogs.

Some of the localized results not only appear for things like Chicago pizza but also for single word searches in some cases, like pizza or flowers.

Promoting local businesses via the new formats has many strategic business benefits for Google:

  • assuming they track user interactions, then eventually the relevancy is better for the end users
  • allows local businesses to begin to see more value from search, so they are more likely to invest into a search strategy
  • creates a direct relationship with business owners which can later be leveraged (in the past Google has marketed AdWords coupons to Google Analytics users)
  • if a nationwide brand can't dominate everywhere just because they are the brand, it means that they will have to pony up on the AdWords front if they want to keep 100% exposure
  • if Google manages to put more diversity into the local results then they can put more weight on domain authority on the global results (for instance, they have: looked at query chains, recommended brands in the search results, shown many results from the lead brand on a branded search query, listed the official site for searches for a brand + a location where that brand has no office, etc.)
  • it puts eye candy in the right rail that can make searchers more inclined to look over there
  • it makes SEO more complex & expensive
  • it allows Google to begin monetizing the organic results (rather than hiding them)
  • it puts in place an infrastructure which can be used in other markets outside of local

Data Data Data

Off the start it is hard to know what to make of this unless one draws historical parallels. At first one might be inclined to say the yellow page directories are screwed, but the transition could be a bit more subtle. The important thing to remember is that now that the results are in place, Google can test and collect data.

More data typically beats better algorithms, and Google has highlighted that one of their richest sources of data is tracking searcher behavior on their own websites.

Pardon Me, While I Steal Your Lunch

There are 2 strong ways to build a competitive advantage on the data front:

  • make your data better
  • starve competing business models to make them worse

Off the start yellow page sites might get a fair shake, but ultimately they are headed in a direction where they are increasingly squeezed. In a mobile connected world with Google owning 97% search market share, while offering localized search auto-complete, ads that map to physical locations, and a mobile coupon offers network, the yellow page companies are a man without a country. Or perhaps a country without a plot of land. ;)

They are so desperate that they are cross licensing amongst leading competitors. But that just turns their data into more of a commodity.

Last December I cringed when I read David Swanson, the CEO of R.H. Donnelley, state: "People relate to us as a product company -- the yellow-pages -- but we don't get paid by people who use the yellow-pages, we get paid by small businesses for helping them create ad messages, build websites, and show up in search engine results. ... Most of the time today, you are not even realizing that you are interacting with us."

After seeing their high level of churn & reading the above comment, at that point I felt someone should have sent him the memo about the fate of thin affiliates on AdWords. Not to worry, truth would come out in time. ;)

Making things worse, not only is local heavily integrated into core search (with search suggest being localized), but Google is also dialing for Dollars, offering flat-rate map ads (with a free trial) and testing fully automated flat-rate local AdWords ads again.

Basic Economics

How does a business maximize yield? Externalize costs & internalize profits. Pretty straightforward. To do this effectively, Google wants to cut as many middlemen out of the game as possible. This means Google might decide to feed off your data while driving no traffic to your business, instead driving you into bankruptcy.

Ultimately, what is being commoditized? Labor. More specifically:

  • the affiliate who took the risk to connect keywords and products
  • the labor that went into collecting & verifying local data
  • the labor that went into creating the editorial content on the web graph and the links which search engines rely on as their backbone.
  • the labor that went into manually creating local AdWords accounts, tracking their results, & optimizing them (which Google tracks & uses as the basis for their automated campaigns)
  • the labor that went into structuring content with the likes of micro-formats
  • the labor that went into policing and formatting user reviews
  • many other pieces of labor that the above labor ties into

Of course Google squirms out of any complaints by highlighting the seedy ends of the market and/or by highlighting how they only use such data "in aggregate" ... but if you are the one losing your job & having your labor used against you, "the aggregate" still blows as an excuse.

But if Google drives a business they are relying on into bankruptcy, won't that make their own search results worse?

Nope.

For 2 big reasons:

  • you are only judged on your *relative* performance against existing competitors
  • after Google drives some other players out of the marketplace and/or makes their data sets less complete, the end result is Google having the direct relationships with the advertisers and the most complete data set

The reason many Google changes come with limited monetization off the start is so that people won't question their motives.

Basically I think they look at it this way: "We don't care if we kill off a signal of relevancy because we will always be able to create more. If we poison the well for everyone else while giving ourselves a unique competitive advantage it is a double win. It is just like the murky gray area book deal which makes start up innovation prohibitively expensive while locking in a lasting competitive advantage for Google."

You would never hear Google state that sort of stuff publicly, but when you look at their private internal slides you see those sorts of thoughts are part of their strategy.

What is Spam?

The real Google guidelines should read something like this:

Fundamentally, the way to think about Google's perception of spam is that if Google can offer a similar quality service without much cost & without much effort then your site is spam.

Google doesn't come right out and say that (for anti-trust reasons), but they have mentioned the problem of search results in search results. And their remote rater documents did state this:

After typing a query, the search engine user sees a result page. You can think of the results on the result page as a list. Sometimes, the best results for "queries that ask for a list" are the best individual examples from that list. The page of search results itself is a nice list for the user.

...But This is Only Local...

After reading the above some SEOs might have a sigh of relief thinking "well at least this is only local."

To me that mindset is folly though.

Think back to the unveiling of Universal search. At first it was a limited beta test with some news sites, then Google bought Youtube, and then the search landscape changed...everyone wanted videos and all the other stuff all the time. :D

Anyone who thinks this rich content SERP which promotes Google is only about local is going to be sorely disappointed as it moves to:

  • travel search (Google doesn't need to sell airline tickets so long as they can show you who is cheapest & then book you on a high margin hotel)
  • any form of paid media (ebooks, music, magazines, newspapers, videos, anything taking micro-payments)
  • real estate
  • large lead generation markets (like insurance, mortgage, credit cards, .edu)
  • ecommerce search
  • perhaps eventually even markets like live ticketing for events

Google does query classification and can shape search traffic in ways that most people do not understand. If enough publishers provide the same sorts of data and use the same types of tags, they are creating new sets of navigation for Google to offer end users.

No need to navigate through a publisher's website until *after* you have passed the click toll booth.

Try #3 at Reviews

Google SearchWiki failed in large part because it confused users. Google launched SideWiki about a year ago, but my guess is it isn't faring much better. When SideWiki launched Danny Sullivan wrote:

Sidewiki feels like another swing at something Google seems to desperately desire — a community of experts offering high quality comments. Google says that’s something that its cofounders Larry Page and Sergey Brin wanted more than a system for ranking web pages. They really wanted a system to annotate pages across the web.

The only way they are going to get that critical mass is by putting that stuff right in the search results. It starts with local (& scrape + mash in other areas like ecommerce), but you know what they want & they are nothing if not determined to get what they want! ;)

Long Term Implications

Scrape / mash / redirect may be within the legal limits of fair use, but it falls short in spirit. At some point publishers who recognize what is going on will align with better partners. We are already seeing an angry reaction to Google from within the travel vertical and from companies in the TV market.

Ultimately it is webmasters, web designers & web developers who market and promote search engines. If at some point it becomes consensus that Google is capturing more value than they create, or that perhaps Google search results have too much miscellaneous junk in them, they could push a lot more searchers over to search services which are more minimalistic + publisher friendly. Blekko launches Monday, and their approach to search is much like Google's early approach was. :)

Google's Profits: to Infinity & Beyond

Marin Software manages about 5% of Google AdWords spend for clients, and they noticed that since Google Instant was unveiled, AdWords ad clicks are up 5%. Since the launch Google's Jonathan Rosenberg has mentioned that the impact on AdWords was "not material."

I found the repeated use of those exact words suspicious and diversionary, and, as it turned out, with good reason! When Google Instant launched I highlighted what Google was doing to screen real estate & predicted this shift.

Turns out that the "tin foil hat wearing SEOs" were right once again.

And that 5% lift in AdWords clicks is on top of the lift Google has seen from:

  • creating a 4th ad slot for comparison ads (in high paying verticals like "credit cards" and "mortgage")
  • sitelinks, merchant ratings, and other ad extensions. On the last quarterly call Jonathan Rosenberg stated: "These ads appear on more than 10% of the queries where we show ads and people like them. We see this because click-through rates are up for some formats as much as 10% and up more than 30% on some others."

It is thus no surprise that Google's move into other verticals is met with resistance. The travel industry recently put together the Fair Search site to oppose Google's purchase of ITA Software.

The Google as Monopoly meme continues to grow.

Is Google a Monopoly? (graphic by Scores.org)

As Google continues to make enemies this is a great time for the launch of a back to the basics approach to core algorithmic search. Blekko is launching publicly on November 1st.

WordTracker's Free SEO Videos

Mike Mindel from Wordtracker has put together some nice free SEO overview videos for beginners in the industry. Check them out:

Exact Match Domains

Are exact match domains "too" powerful?

Not in my humble opinion. :)

Sure across the entire web exact match domains can rank for a wide variety of keywords, but there are a couple things to think about when stating that...

  • those rankings are spread across many different domains
  • the bonus any domain gets is only relevant to its 1 exact match phrase
  • Many domains are seen as exact match, but the keyword is popular precisely because the keyword is a brand (like eBay, Amazon.com, Monster.com, Google, Yahoo!, or Bing).
  • Many brand owners (especially small & local ones, where there are few signals of quality) are not heavily engaged in SEO. If Google doesn't show the official site on a brand search they look bad (in 2005 there was a brief period of time when Paypal.com wasn't ranking for "Paypal" due to botched aggressive Google link anchor text filters), whereas if they rank an exact match domain where it is relevant it doesn't really significantly detract from the searcher's experience.

At SES San Jose 2009, Nick Fox stated that Google has about 30 million words in the AdWords advertiser database. In spite of their database being that large, they keep trying to push advertisers toward broad match (and searchers down a well worn path with Google Instant) because roughly 25% of searches are unique.

Adam Lewis highlighted how advertisers can get a glimpse into the endless sea of words searchers use & how impractical it is to presume they can know everything in advance:

One of the most impactful new features lies within the keywords tab and is called "see search terms". This option allows advertisers to choose one or more keywords and see the search term users typed in to trigger that keyword. It also shows which ones are being clicked most often and which are not being clicked.

Often the exact keyword is not what users are actually typing in. *Guessing all the possible variations that a user might enter to find your product is essentially impossible.* "See search terms" gives you the most popular user queries that triggered your ads. Not only does it help people learn about their user, but it can also potentially save money on SEM by exposing highly specific keywords with less competition and better quality scores.

Note the sentence that I bolded...guessing everything that is searched for that is relevant is roughly impossible. In SEO there are a variety of implications associated with that, but one of the most important ones is this: when you pick an exact match domain it is mainly only helping you with that 1 main keyword that you chose.

Yes there are implications in terms of perceived credibility and such, but those impacts can be created through brand building. With an EMD you pay thousands of Dollars (sometimes tens or hundreds of thousands) to target that one keyword. If a person were to buy MyKeywordStore.com (or similar) for $8 & spend the $10,000 they saved on marketing, then in many cases that $10,000 would more than make up for any advantage MyKeyword.com gets.

Much like often overstated type-in traffic, when you look beyond brands, there are not many individual keywords that represent a huge market by themselves.

We have built a database of 10 million+ keywords & few of them (fewer than 10,000) have a combined CPC * estimated search volume of $1,000 or more per month (presuming you captured 100% of the search traffic for that keyword & monetized it as well as Google does).

However, those numbers overstate the market ...

  • many of those valuable keywords *are* brands (seo book wasn't much of a keyword until *after* it was a brand, which is why the domain name was available to me for $8)
  • brands that are created on keywords can be forced to change due to market conditions (FreeCreditReport.com ---> FreeCreditScore.com, legislation whacked the student loan consolidation market, Google Instant promotes some keywords at the expense of others, the US government has launched names like Cars.gov, StudentLoans.gov, Change.gov, etc. ... who wants to compete directly against the government when they control legislation, can create an EMD on the fly, and can cross-link new sites in their network ... allowing them to outrank you in a couple weeks)
  • If you are not a brand & rank #1 in the organic search results (with 3 AdWords ads above you) then you might only get about 25% to 40% of the search traffic. Worse yet, in some of the largest markets Google puts a 4th "Google comparison" ad above the organic search results, further driving down the organic search results.
  • Google's search volume data & suggested bid prices have typically overstated the market (because they want to create bidding wars on core keywords & drive bids upward)
  • Almost nobody monetizes as well as Google does. In many cases when their number shows $100,000+ per month the actual publisher earnings for that keyword might only be a few thousand Dollars.

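To make the gap between Google's headline numbers and realistic publisher earnings concrete, here is a back-of-the-envelope sketch. All figures are hypothetical, and the discount factors are rough assumptions drawn from the points above (a #1 organic listing capturing ~25% to 40% of searches, and publishers monetizing a small fraction as well as Google does):

```python
# Hypothetical keyword data: monthly search volume & Google-suggested CPC.
keywords = {
    "credit cards": {"volume": 300_000, "cpc": 5.00},
    "blue widgets": {"volume": 2_000, "cpc": 0.40},
}

ORGANIC_CAPTURE = 0.30   # assume a #1 organic ranking captures ~30% of searches
MONETIZATION_GAP = 0.05  # assume a publisher monetizes ~5% as well as Google does

def nominal_value(volume, cpc):
    """Headline value: 100% of the traffic, monetized as well as Google."""
    return volume * cpc

def realistic_value(volume, cpc):
    """Discounted value after organic capture & monetization gaps."""
    return nominal_value(volume, cpc) * ORGANIC_CAPTURE * MONETIZATION_GAP

for kw, d in keywords.items():
    print(f"{kw}: nominal ${nominal_value(**d):,.0f}/mo, "
          f"realistic ${realistic_value(**d):,.0f}/mo")
```

With these assumed discounts, a keyword that "shows" $1,500,000 per month pencils out closer to $22,500 — the same order-of-magnitude haircut described above.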
There are at most a few hundred exceptionally potent keywords where the single word will build a business for a generalist webmaster. That number would be higher if you combined them with professional training & significant industry knowledge, but if you know your industry well, have access to capital, & are investing in a premium domain name, then odds are good you are also investing heavily in quality work elsewhere. The idea that there are tons of lucrative exact match domains on the market which anyone can use to build thriving businesses, available at a discount, is somewhat (perhaps completely?) inaccurate.

Exact match only gives you that bonus on exact match. Not a collection of keywords - just that 1 word. And tying your business to 1 keyword can be risky. Just ask anyone who is on a singular version of a domain name where Google Instant promotes the plural version of that keyword. Some of those folks likely had chunks of cinder block falling out of their pants the day that launched.

Whereas brand allows you to keep spreading ... but it can take a lot of work to turn a generic keyword into a brand. And by the time you do, your business model and/or the market may have already moved elsewhere. An exact match domain name can sorta box you in and make your business less flexible. SEO Book is a bit of a weird fit for a private SEO community & training website, and Oakland Pizza will *never* become Domino's or Pizza Hut.

And (when compared against generic keywords) brands are not only more flexible, but also more memorable; they make it easier for you to differentiate, allow you to engage at a deeper emotional level, & let you charge more for your products or services.

I don't regret choosing SeoBook.com in 2003 (it certainly worked out awesome in the short run), however if I had more foresight I would have shifted to a different domain in the 2004 to 2005 timeframe. So often when people join our community they are amazed by the depth and breadth of discussion outside of SEO, but a rebrand at this point would be brutal. ;)

Owning SearchEngine.com doesn't really do much for you when there is a Google or a Bing in your market. Owning Auction.com (or maybe Auctions.com) doesn't do much against eBay. Owning Portal.com (or maybe WebPortal.com) isn't going to compete against Yahoo!. Microblogging.com is no Twitter, SocialNetwork.com is no Facebook, VideoHosting.com is no YouTube.

It is basically a choice of short-term vs long-term goals:

  • do you want to pick a specific keyword & try to sell something relevant today (with less flexibility going forward)?
  • or do you have the assets available to build a brand that will remain flexible under changing market conditions?

While exact match domains can box you in, an EMD is a sign of relevancy for that specific keyword, as you have tied your business to it!

Either you got to the market early, or you shelled out thousands of Dollars. OnlineKredit.org just went for $36,400! Whoever bought it is probably not going to be signing guestbooks / comment spamming / auto-generating content / etc. And the guy who paid $1 million for Poker.org wouldn't have paid that unless he planned on building something sustainable there.

Even Matt Cutts recommends buying relevant domain names as gifts :)

The one area of exact match domains where I think Google has been tightening up (and will continue to) is some of the longtail cybersquatting, but...

  • tightening up can be tricky because the same word can have different meanings in different markets (perhaps continued efforts into localizing results will solve some of these issues)
  • earlier this year Google did whack some longtail EMDs that had few other signals of quality
  • more recently, Google has been showing far more results from 1 domain on navigational queries, and has been ranking official sites for related queries even if they didn't have some of the keywords in their content or link anchor text
  • even for generic search queries (like "cameras") Google sometimes lists suggested related brand navigation in the search results
  • trademarks protect usage & legislation is moving in the direction of making it easier / cheaper / faster for brand owners to whack cybersquatting

Ho Ho Ho, Go Google Go

Some sites have seen pretty drastic drops in Google search traffic recently, related to indexing issues. Google maintains that it is a glitch:

Just to be clear, the issues from this thread, which I have reviewed in detail, are not due to changes in our policies or changes in our algorithms; it is due to a technical issue on our side that will be visibly resolved as soon as possible (it may take up to a few days to be visible for all sites though). You do not need to change anything on your side and we will continue to crawl and index your content (perhaps not as quickly at the moment, but we hope that will be resolved for all sites soon). I would not recommend changing anything significantly at this moment (unless you spot obvious problems on your side), as these may result in other issues once this problem is resolved on our side.

For an example of one site's search traffic that was butchered by this glitch, see the images below. Note that before, Google traffic was ~10x what Yahoo! or Bing drove, and after the bug the traffic is roughly even.

Not that long ago I saw another site with over 500 unique linking domains which simply disappeared from the index for a few days, then came right back 3 days later. Google's push to become faster and more comprehensive has perhaps made them less stable, as digging into social media highlights a lot of false signals & often promotes a copy over the original. Add in any sort of indexing issues and things get really ugly really fast.

Now this may just be a glitch, but as Tedster points out, many such "glitches" often precede or coincide with major index updates. For as long as I have been in the SEO field, Google has done a major algorithmic change just before the holidays every year except last year.

I think the reasons they do it are likely 3 or 4 fold:

  • they want to make SEO unpredictable & unreliable (which ultimately means less resources are spent on SEO & the results are overall less manipulated)
  • they want to force businesses (who just stocked up on inventory) to enter the AdWords game in a big way
  • by making changes to the core relevancy algorithms (and having the market discuss those) they can slide in more self promotion via their vertical search services without it drawing much anti-trust scrutiny
  • the holidays are when conversion rates are the highest, so if they want to make changes to seek additional yield it is the best time to do it, and the holidays give them an excuse to offer specials or beta tests of various sorts

As an SEO with clients, the unpredictability is a bad thing, because it makes it harder to manage expectations. Sharp drops in rankings from Google "glitches" erode customer trust in the SEO provider. Sometimes Google will admit to major issues happening, and other times they won't until well *after* the fact. Being proven right after the fact still doesn't take back 100% of the uncertainty unleashed into the marketplace weeks later.

Even if half your clients double their business while 1/3 lose half their search traffic, as an SEO business you typically don't get to capture much of the additional upside...whereas you certainly capture the complaints from those who just fell behind. Ultimately this is one of the reasons why I think being a diversified web publisher is better than being an SEO consultant... if something takes off & something else drops then you can just pour additional resources into whatever is working and capture the lift from those changes.

If you haven't been tracking rankings now would be a great time to get on it. It is worth tracking a variety of keywords (at various levels of competition) daily while there is major flux going on, because that gives you another lens through which to view the relevancy algorithms, and where they might be headed.
