The Perpetual Time, Err, Hype Machine

Sometimes when reading TechCrunch I feel like I am watching the History Channel, and that there is a regularly scheduled show called WWII Wednesdays (or something like that). Ironically there was even a junk TechCrunch post about killing the hype cycle. But they know the truth about manipulation - they wouldn't have a profitable product without it.

So often marketers highlight how *this changes everything*

I just saw an article about how the Facebook like button kills SEO, promoting some white paper.

Perhaps it gets the white paper opened, and pre-qualifies prospective customers as ignorant corporate types who are willing to pay big bucks for misinformation, but it seems online marketing is so saturated that we have to act comical or absurd to pull in attention. And then make revisions a week later stating "my bad."

Promptly followed by "...but this changes everything..."

(or hope that the sea of noise is so loud that people forget yesterday by the time tomorrow happens)

"The most important idea in advertising is new. It creates an itch. You simply put your product in there as a kind of calomine lotion." - Don Draper

Part of the reason I have decided to post less is that I decided I would rather not participate in that. At some point it helps to believe in what you sell, even if another path is more profitable. :)

I want to make sure we highlight what is new and interesting, but a lot of online marketing is just blocking and tackling ... the basics. A few months back I wrote "it's the boring stuff that makes the money" largely because that which is boring has been refined, is predictable, and can scale.

New Twitter might change everything. But then next week Facebook will. And the following week the sun will come up from the west. And everything changes once more!

But the pendulum never swings in 1 direction forever. Blockbuster was once a sure win, today Google is a monopoly, but things will change.

Though those changes will likely be slow and gradual. In aggregate, the impact of Google Instant is not as extreme as many thought (outside of a few edge cases). IE9 also puts some interesting wrinkles in the search space - particularly for domainers. When the address bar becomes a search box with search suggestions in it, how many people will type in undeveloped URLs? Though I suspect that change will also be gradual. After all, Yahoo! is still doing well in spite of all their blunders, brands creating their own media, the tectonic economic decline, and the US being functionally bankrupt.

How Google Instant Changes the SEO Landscape

Google Instant launched. It is a new always-on search experience where Google tries to complete your keyword search by predicting what keyword you are searching for. As you type more letters the search results change.

Not seeing it yet? You can probably turn it on here (though in some countries you may also need to be logged into a Google account). In time Google intends to make this a default feature, turned on for almost everyone (other than those with slow ISPs and older web browsers). And if you don't like it, the feature is easy to turn off at the right of the search box, but the off setting is stored in a cookie. If you clear cookies the feature turns right back on.

Here is an image using Google's browser size tool, showing that when Google includes 4 AdWords ads only 50% of web browsers get to see the full 2nd organic listing, while only 20% get to see the full 4th organic listing.

Its implications for SEO are easy to understate. However, they can also be overstated: I already saw one public relations hack stating that it "makes SEO irrelevant."

Nothing could be further from the truth. If anything, Google Instant only increases the value of a well-thought-out SEO strategy. Why? Well...

  • it consolidates search volume into a smaller basket of keywords
  • it further promotes the localization of results
  • it makes it easier to change between queries, so it's easier to type one more letter than to scroll down the page
  • it further pollutes AdWords impression testing, which was a great source of data

Let's dig into these, shall we?

Rand Fishkin Interview

It is no secret that in the past Rand and I have had some minor differences of opinion (mainly on outing). ;)

But in spite of those, there is no denying that he is an astute marketer. So I thought it would be fun to ask him about his background in SEO and to articulate his take on where some of our differences of opinion lie. Interestingly, it turns out we shared far more views than I thought! Hope you enjoy the interview. :)

Throughout your history in the SEO field, what are some of your biggest personal achievements?

The first one would have to be digging myself (and my Mom) out of bankruptcy when we were still a small, sole proprietorship. Since then, there have been a lot of amazing times:

  • The first time I spoke at a conference (SES Toronto in 2004)
  • Transitioning from a consulting to a software business
  • Taking venture capital
  • Building a team (not just making hires)
  • Having dinner with the UN Secretary General (Ban Ki Moon) and presenting to their CTO on SEO - it was amazing to hear stories about how people in conflict-ridden parts of the world used search to find safe havens, escape and transmit information and the UN's missed opportunities around SEO. I'd never really thought of our profession as having life-or-death consequences until then.
  • Making the Inc 500 list for Fastest Growing Companies in the US (during a nasty recession)
  • Probably my biggest personal achievement, though, is my relationship with my wife. I know that no matter what happens to me in any other part of my life, I have her support and love forever. That gets a guy like me through a lot of tough times.

Geraldine & Rand in San Francisco

My wife and I in San Francisco (via her blog)

What are the biggest counter-intuitive things you have learned in SEO (eg: things that theoretically shouldn't work, but wow they do - or the opposite: should work but don't)?

The most obvious one I think about regularly is that the "best content rarely wins." The content that best leverages (intentionally or not) the system's pulleys and levers will rise up much faster than the material the search engines "intended" to rank first.

Another big one includes the success of very aggressive sales tactics and very negative, hateful content and personalities. Perhaps because of the way I grew up or my perspective on the world, I always thought of those things as being impediments to financial success, but that's not really the case. They do, however, seem to have a low correlation with self-satisfaction and happiness, and I suppose, for the people/organizations with those issues, that's even worse.

A very specific, technical tactic that I'm always surprised to see work is the placement of very obvious paid text links. We realized a few months back that with Linkscape's index, we could ID 90%+ of paid link spam with a fairly simple process:

  1. Grab the top 10K or 100K monetizable query terms/phrases (via something like a "top AdSense payout" list)
  2. Find any page on the web that contains 2+ external anchor text links pointing to separate websites (e.g. Page A has a link that says "office supplies" linking to 123.com and another link that says "student credit card" linking to 456.com)
  3. Remove the value passed by those links in any link metric calculation (which won't hurt the relevancy/ranking of any pages, but will remove the effects of nearly all paid links)

We've not done the work to implement this, so perhaps there's some peculiar reason why applying it is harder than we think. But, it strikes me that even if you could only do it for pages with 3 or 4+ links in this fashion, you'd still eliminate a ton of the web's "paid" link graph. The fact that Google clearly hasn't done this makes me think it must not work, but I'm still struggling to understand why.

BTW - I asked some SEOs about making this a metric available through Linkscape/Open Site Explorer (like a "likelihood this page contains paid links" metric) and they all said "don't build it!" so we probably won't in the near term.
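
(To make that heuristic concrete, here is a minimal sketch in Python. The phrase list, link data and thresholds are invented placeholders for illustration - this is not SEOmoz's or Google's actual pipeline.)

```python
# Minimal sketch of the paid-link heuristic described above (steps 2 and 3).
# The phrase list and link data are hypothetical placeholders.

MONETIZABLE_PHRASES = {"office supplies", "student credit card", "payday loans"}

def looks_like_paid_links(external_links, min_domains=2):
    """Flag a page whose external links use monetizable anchor text
    pointing at 2+ separate websites (step 2 of the process above)."""
    flagged_domains = set()
    for anchor, domain in external_links:
        if anchor.lower().strip() in MONETIZABLE_PHRASES:
            flagged_domains.add(domain)
    return len(flagged_domains) >= min_domains

def link_value_passed(base_value, external_links):
    """Step 3: remove the value such a page passes in any link metric calculation."""
    return 0.0 if looks_like_paid_links(external_links) else base_value

# The example from step 2: two monetizable anchors pointing at separate sites.
page_links = [("office supplies", "123.com"), ("student credit card", "456.com")]
print(looks_like_paid_links(page_links))   # True
print(link_value_passed(1.0, page_links))  # 0.0 - this page passes no link value
```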

One of the big marketing angles you guys tried to push hard on was the concept of transparency. Because of that you got some pretty bad blowback when Linkscape launched (& perhaps on a few other occasions). Do you feel pushing on the transparency angle has helped or hurt you overall?

I think those inside the SEO community often perceive a conflict or tiff internally as having a much broader reach than it really does. I'd agree that folks like you and I, and maybe even a few hundred or even a thousand industry insiders are aware of and take something away from those types of events, but SEOmoz as a software company with thousands of paying subscribers and hundreds of thousands of members seems to be far less impacted than I am personally.



Re: Linkscape controversy - there have been a few - but honestly, the worst reputation/brand problems we ever had have always been with regards to personal issues or disputes (a comment on someone's blog or something we wrote or allowed to be published on YOUmoz). I don't have a good explanation for why they crop up, but I can say that they seem to have a nearly predictable pattern at this point (I'm sure you recognize this as well - think I've seen you write fairly eloquently on the subject). That does make it easier to handle - it's the unpredictable that's scary.

We certainly maintain transparency as a core value and we're always trying to do more to promote it. To me, core value means "things we value more than revenue or profits" and so even if it's had some hard-to-measure, adverse impact, we'd maintain it. We've actually got a poster hanging up in the office that our design team made:
The "T" in TAGFEE

An excerpt from our TAGFEE poster

There's a quote I love on this topic that explains it more eloquently than I can:

"(Our) core values might become a competive advantage, but that is not why we have them. We have them because they define for us what we stand for, and we would hold them even if they became a competitive disadvantage." - Ralph Larson, CEO of Johnson and Johnson

What type of businesses do you think do well with transparency? What type of businesses do you feel do poorly with it?

Hmm... Not something I've tried to apply to every type of business, but my feeling is that nearly every company can benefit from it, though it also exposes you to new risk. Even being the transparency-loving type, I'd probably say that military contractors, patent trolls and sausage manufacturers wouldn't do so well.

How have you been able to manage the transparency angle while having investors?

I thought it would be tougher after taking investment, but they've actually been very supportive in nearly every case (some parts of Linkscape, particularly those re: our patent filings being exceptions). I don't know if that would be true had we taken on different backers, but that's why the startup advice to choose your investors like you choose your husband/wife is so wise.

When you took investment money did you mainly just get capital? What other intangibles came with it? How have your investors helped shape your business model?

It certainly made us much more focused on the software model. As you noted, we dropped consulting in 2010 entirely, and we've generally limited any form of non-scalable revenue to help fit with the goals of a VC-backed business. We did gain some great advisors and a lot more respect in many technology and startup circles that would have been tough without the presence of venture funds (although I think that's shifting somewhat given the changes of the past 2-3 years in the startup world).

Have you guys ever considered buying out your investors? Are you worried what might happen to your company if/when it gets sold?

While we'd love to, I doubt that would ever be possible (barring some sort of massive personal windfall outside of SEOmoz). Every dollar we make gets our investors more excited about the future of the company and less likely to want to sell their shares before we reach our full potential. Remember that with VC, the idea is high risk, high reward, so technically, they'd rather we go for broke and fall to pieces than do a mid-size, but profitable deal. Adding $5 or $10 million back to a $300+ million fund is largely useless to a VC, so a bankruptcy while trying to return $50 or $100 million is a very tolerated, sometimes preferable result.

VC Chart of Returns

I wrote about this more in my Venture Capital Process post (where I talked about failing to raise money in summer 2009).

Now that you are already well known & well funded you are taking a fairly low risk strategy to SEO, but if you were brand new to the space & had limited capital would you spam to generate some starting capital? At what point would you consider spamming being a smaller risk than obscurity?

You ask great questions. :-)

While I don't think spam has any moral or ethical problems, I don't know that I'd ever be able to convince myself that spam would be a more worthwhile endeavor than brand building for a white hat property. Overnight successes take years of hard work, and I'd much rather get started as a scrappy, bootstrapping company than build up a reserve with spam dollars and waste that time. However, I certainly don't think that applies to everyone. As you know, I've got lots of friends who've done plenty of shady stuff (probably a lot I don't even want to know about!), but that doesn't mean I respect them any less.

Speaking of low risk SEO, why do you think neither of our sites has hit the #1 slot yet in Google for "seo"? And do you think that ranking would have much business impact?

We've looked at the query in our ranking models and I think it's unlikely we could ever beat out the Wikipedia result, Google or SEO.com (unless GG pulls back on their exact-match domain biasing preference). That said, we should both be overtaking SEOchat.com fairly soon (and some of the spammier results that temporarily pop in and out). Some of our engineers think that more LDA work might help us to better understand these super-high competitive queries.

Analysis of "SEO" SERPs in Google

SERPs analysis of "SEO" in Google.com w/ Linkscape Metrics + LDA (click for larger)

In terms of business impact - yeah, I think for either of us it would be quite a boon actually (and I rarely feel that way about any particular single term/phrase). It would really be less the traffic than the associated perception.

As an SEO selling something unique (eg: not selling a commodity that can be found elsewhere & not as an affiliate) I have found word of mouth marketing is a much more effective sales channel than SEO. Do you think the search results are overblown as a concern within the SEO industry? Do you find most of your sales come from word of mouth?

I see where you're coming from, but in our analyses, it's always been a combination of things that leads to a sale. People search and find us, then browse around. Or they hear of us and search for information about us. Then they'll find us through social media or a referring site and maybe they'll sign up for a free account. They'll get a few emails from us, have a look at PRO and go away. Then a couple months later they'll be more serious about SEO and search for a tool or answer and come across us again and finally decide, "OK, these guys are clearly a good choice."



This is what makes last touch attribution so dangerous, but it also speaks to the importance of having a marketing/brand presence across multiple channels. I think you could certainly make the case that many of us in the SEO field see every problem as a nail and our profession as the hammer.

What business models do you feel search fits well with, and what business models do you feel search is a poor fit for?

I think it's terrific for a business that has content or products they can monetize over the web that also relate to things people are already searching for. It's much less ideal for a product/service/business that's "inventing" something new that's yet to be in demand by a searching population. If you're solving a problem that people already have an identified pain point around, whether that's informational, transactional or entertainment-driven, search is fantastic. If that pain point isn't sharp enough or old enough to have generated an existing search audience, branding, outreach, PR and classic advertising may actually do better to move the needle.

Have you ever told a business that you felt SEO would offer too low of a yield to be worth doing?

Actually yes! I was advising a local startup in Seattle a couple years ago called Gist and told them that SEO couldn't really do much for them until people started realizing the need for social plugins for email and searching for them. This is the case with a lot of startups I think.

In an interview on Mixergy you mentioned racking up a good bit of debt when you got started in search. If a person is new to the web, when would you recommend they use debt leverage to grow?

Never, if you're smart. Or, at least, never in the quantities I did. The web is so much less costly to build on nowadays and the lean startup movement has produced so many great companies (many of them only small successes, but still profitable) from $10K or less that it just doesn't make sense, especially with the horror that is today's debt market, to go too far down that route. If you can get a low-cost loan from a family member or a startup grant through a government-backed, low interest program, sure, but credit card debt (which is where I started) is really not an option anymore.

How were you able to maintain presence and generally seem so happy publicly when you first got started, even with the stress of that debt?

To be honest, I really just didn't think about it much. If you have $30K in debt, you're constantly thinking about how to pay it off month by month and day by day. When you're $450K in debt with collectors coming after you and your wife paying the rent, you think about how to make a success big enough to pay it all off or declare bankruptcy - might as well go with the former until life runs you into the latter. There's just not much else to do.



As Bob Dylan says - "when you got nothing, you got nothing to lose."

Many people new to the field are afraid to speak publicly, but you were fairly well received right off the start. What prepared you for speaking & what are keys to making a good presentation?

Oh man - I sucked pretty hard my first few presentations. I think everyone does. The only reason I was well received, at least in my opinion, is because I'd already built a following on the web and had a positive reputation that carried over from that. The only thing that really prepared me for big presentations (things like the talk to Google's webspam/search quality team or keynotes at conferences) was lots and lots of experience and for that I'll always be grateful to Danny Sullivan for giving me a shot.

I'd say to others - start small, get as many gigs as you can, use video to help (if you're great on camera, you'll be good in front of a live audience) and try to emulate speakers and presentations you've loved.

When large companies violate Google's guidelines repeatedly usually nothing happens. To cite a random example...I don't know...hmm Mahalo. And yet smaller companies when outed often get crushed due to Google's huge marketshare. Because of the delta between those 2 responses, I believe that outing smaller businesses is generally bogus because it strips freedoms away from individuals while promoting large corporations that foist ugly externalities onto society. Do you disagree with any of that? :D

I think I agree with nearly all of that statement, though I'd still say it's no more "bogus" to out small spammers than it is to spam. I would agree it's not cool that Google applies its standards unfairly, but it's hard to imagine a world where they didn't. If mikeyspaydayloans.info isn't in Google's index, no one thinks worse of Google. If Disney.com isn't in Google (even if they bought every link in the blogosphere), searchers are going to lose faith and switch engines. The sensible response from any player in such an environment is to only violate guidelines if you're big enough to get away with it or diversified enough to not care.

I'm unhappy with how Google treats these issues, but I'm equally unhappy with how spam distorts the perception of the SEO field. Barely a day goes by without a thought leader in the technology field maligning our industry - and 9 times out of 10 that's because of the "small" spammers. If we protect them by saying SEOs shouldn't "out" one another, we bolster that terrible impression. I don't think most web spam should even have the distinction of being classified as "SEO" and I don't think any SEO professionals who want our field to be taken seriously by marketing and engineering departments should protect those who foist their ugly externalities onto us.

I know we disagree on this, but it's always an interesting discussion :-)

One of the most remarkable things about the SEO industry is the gap in earnings potential between practicing it (as a publisher) and teaching it / consulting. Why do you think such a large gap exists today?

Teaching has always been an altruist's pursuit. Look at teachers in nearly every other field - they earn dramatically less than their production/publishing oriented peers. Those who teach computer science never earn what computer scientists who work at Google or Microsoft make. Those who teach math are far less well compensated than their compatriots working as "quants" on Wall Street. It's a sad reality, but it's why I have so much respect for people like Market Motive, Third Door Media and Online Marketing Connect, who are trying to both teach and build profitable businesses. I love the alignment of noble pursuits with profitable ones.

You guys exited the consulting area in spite of being able to charge top rates due to brand recognition. Do you think lots of consultants will follow suit and move into other areas? How do you see SEO business models evolving over the next 3 to 5 years?

I don't think so - our consulting business was going very well and I've heard and seen a lot of growth from my friends who run SEO consulting firms. The margins and exit price valuations wouldn't have made sense for VCs, but I don't think it was a bad business at all and others are clearly doing remarkable things. Just look at iCrossing's recent sale to Hearst for $325 million. You can build an amazing company with consulting - it's just not the route we took.

In regards to the evolution of the SEO business model, I'd say we're likely to see more sophistication, more automation, more scalability (and hopefully, more software to help with those) over the next few years from both in-house SEOs and external agencies/consultants. It's sometimes surprising to me how little SEO consulting has progressed from 2002 vs. things like email marketing or analytics, where software has become standard and tons of great companies compete (well, Google's actually made competition a bit more challenging in the analytics space, but creative companies like KissMetrics and Unbounce are still doing cool, interesting things).

Small businesses in many ways seem like the most under-served market, but also the hardest to serve (since they have limited time AND small budgets). Do you think the rise of maps & other verticals gives them a big opportunity, or is it just more layers of complexity they need to learn?

Probably more the former than the latter. The small business owners I know and interact with in my area (and wherever I seem to visit) are only barely getting savvy to the web as a major driver of revenue. I think it might take another 10 years or more before we see true maturity and savvy from local businesses. Of course, that gives a huge competitive advantage to those who are willing to invest the time and resources into doing it right, but it means a less "complete" map of the local world in the online one, which as a consumer (or a search engine) is less than ideal.

When does the delta between paid search & SEO investment begin to shrink (if ever)?

I think it's probably shrinking right now. Paid search is so heavily invested in that I think it's fair to call it a mature market (at least in global web search, though, re: your previous question, probably not in local). SEO is ramping up with a higher CAGR (Compound Annual Growth Rate) according to Forrester, so that delta should be shrinking.

Forrester Growth of SEO vs. Paid Search

via Forrester Research's Interactive Marketing Forecast 2009-2014

Often times a Google policy sounds like something coming out of a conflicted government economist's mouth. But even Google has invested in an affiliate network which suggests controlling your HTML links based on payment. How much further do you think Google can grow before they collapse under complexity or draw enough regulatory attention to be forced to change?

I think if they tread carefully and invest heavily in political donations and public relations, they can likely maintain another very positive 5-10 years. What the web looks like at that time is anyone's guess, and the unpredictable nature and wild shifts probably help them avoid most regulation. Certainly the rise of Facebook has been a boon in reducing their risk exposure to government intervention, even if they may not be entirely happy with their inability to compete in the social web.

I remember you once posted about getting lots of traffic from Facebook & Twitter, but almost 0 sales from it. Does there become a point where search is not the center of the web (in terms of monetization), or are most of these networks sorta only worthwhile from a branding perspective?

As direct traffic portals, it's hard to imagine a Facebook/Twitter user being as engaged in the buying/researching process as a Google searcher. Those companies may launch products that compete with Google's model or intent, but as they exist today, I don't foresee them being a direct sales channel. They're great for traffic, branding, recognition and ad-revenue model sites, but they're of little threat to marketers concerned with the relevance or value of search disappearing.

What are the major differences between LDA & LSI?

They're both methodologies for building a vector space model of terms/phrases and measuring the distance between them as a way to find more "relevant" content. My understanding is that LSI, which was first developed in 1988, has lots of scaling issues. Its cousin, PLSI (probabilistic LSI) attempted to address some of those when it came out in 1999, but still has scaling problems (the Internet is really big!) and often will bias to more complex solutions when a basic one is the right choice.

LDA (Latent Dirichlet Allocation), which started in 2002, is a more scalable (though still imperfect) system with the same intuition and goals - it attempts to mathematically show distances between concepts and words. All of the major search engines have lots of employees who've studied this in university and many folks at Google have written papers and publications on LDA. Our understanding is that it's almost universally preferred to LSI/PLSI as a methodology for vector space models, but it's also very likely that Google's gone above and beyond this work, perhaps substantially.
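
(To make the vector-space idea concrete, here is a toy sketch using scikit-learn's LatentDirichletAllocation on a three-document corpus. Real topic models at search-engine or Linkscape scale are far larger and almost certainly differ in the details.)

```python
# Toy LDA illustration: documents become topic distributions, and "relevance"
# is just distance in that topic space. Three documents, two topics.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "link building and anchor text strategies for search engine optimization",
    "how to optimize internal links and page titles for seo rankings",
    "chicken marsala recipe with mushrooms garlic and marsala wine",
]

counts = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)      # one topic distribution per document

# The two SEO documents should land close together; the recipe lands far away.
print(cosine_similarity(doc_topics).round(2))
```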

The "brand" update was subsequently described as being due to looking at search query chains. In a Wired article Amit Singhal also highlighted how Google looks for entities in their bi-gram breakage process & how search query sequences often help them figure out such relationships. How were you guys able to build a similar database without access to the search sessions, or were you able to purchase search data?

In a vector space model for a search function, the distances and datasets leverage the corpus rather than query logs. Essentially, with LDA (or LSI or even TF*IDF), you want to be able to calculate relevance before you ever serve up your first search query. Our LDA work and the LDA tool in labs today use a corpus of about 8 million documents (from Wikipedia). Google's would almost certainly use their web index (or portions of it).

It's certainly possible that query data is also leveraged for a similar purpose (though due to how people search - with short terms and phrases rather than long, connected groups of words - it's probably in a different way). This might even be something that helps extend their competitive advantage (given their domination of market share).
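
(The point about scoring relevance from the corpus rather than from query logs can be shown with plain TF*IDF as well; the toy corpus below is invented and has nothing to do with the actual Linkscape index.)

```python
# Relevance computed purely from the corpus, before the first query ever arrives:
# vectorize the documents once, then score any incoming query against them.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "a beginner's guide to seo and link building",
    "chicken marsala recipe for a weeknight dinner",
    "student credit card comparison and reviews",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(corpus)      # built from the corpus alone

query_vector = vectorizer.transform(["seo link building guide"])
print(cosine_similarity(query_vector, doc_vectors)) # highest score for the SEO doc
```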

Sometimes one can see Google's ontology change over time (based on sharp ranking increases and drops for outlier pages which target related keywords but not the core keyword, or when search results for 2 similar keywords keep bouncing between showing the exact same results to showing vastly different results). How do you guys account for these sorts of changes?

Thus far, we haven't been changing the model - it just launched last week. However, one nice thing we get to do consistently is to run our models against Google's search results. Thus, if Google does change, our scores (and eventually, the recommendations we hope to make) should change as well. This is the nice part about not having to "beat" Google in relevance (as a competing search engine might want to do) but simply to determine where Google's at today.

For a long time one of the things I have loathed most in the SEO space was clunky all-in-one desktop tools that often misguide you into trying to change your keyword density on the word "the" and other such idiocy. Part of the reason we have spent thousands of dollars offering free Firefox extensions was my disgust toward a lot of those all-in-one tools. A lot of the best SEOs tend to prefer a roll-your-own mix and match approach to SEO. Recently you launched a web application which aims to sorta do all-in-one. What were the key things you felt you had to get right with it to make it better than the desktop software so many loathe?

I think our impetus for building the web app was taken from the way software has evolved in nearly every other web marketing vertical. In online surveys, you had one-time, self built systems and folks like Wufoo and SurveyMonkey have done a great job making that a consolidated, simple, powerful software experience. That goes for lots of others like:

  • PPC - Google has really taken the cake here with Adwords integration and the launch of Optimizer and even GA
  • CRM - Salesforce, of course, was the original "all-in-one" web marketing software, and they've shown what a remarkable company you can build with that model. InfusionSoft and other players are now quickly building great businesses, too.
  • Email Marketing - Exact Target, Constant Contact, Mailchimp, MyEmma, iContact and many more have built tens-hundreds of millions of dollar/year businesses with "all-in-one" software for handling email marketing.
  • Banner Ads - platforms like Aquantive, DoubleClick, AdReady, etc. have and are building scalable solutions that drive billions in online advertising
  • Analytics - remember when we had one-off, log file analysis tools and analytics consultants who built their own tools to dig into your data? Those consultants are still here, but they're now armed with much more powerful tools - Google Analytics, Omniture, Webtrends, etc. (and new players like KISS Metrics, too)

You're likely spot-on in thinking that power players will continue to mash up and hack their own solutions, build their own tools and protect their secret processes to make them more exclusive in the market and (hopefully) competitive. But, these folks are on the far edge of the bell curve. In every one of the industries above (and many others), it looks like the way to build a scalable software product that many, many people adopt, use and love is to optimize for the middle to upper end of the bell curve (what we'd probably call "intermediate" to "advanced" SEOs, rather than the outlier experts).

When you gather ranking data do you use APIs to do so? If not, how hard has it been on the technical front scaling up to that level of data extraction?

Some data we can get through APIs, but most isn't available in that fashion, so relatively robust networks are required to effectively get the information. Luckily, we've got a pretty terrific team of engineers and a VP of Engineering who's done data extraction work previously for Amazon, Microsoft and others. I'd certainly say that it ranks in the top 10 technical challenges we've faced, but probably not the top 3.

What do you gain by doing the all-in-one approach that a roll your own type misses out on?

Convenience, consistency, UI/UX, user-friendliness and scalability are all big gains. However, the compromise is that you may lose some of that "secret-sauce" feeling and the power that comes from handling any weird situation or result in a hands-on, one-to-one fashion. Plenty of folks using our web app have already pointed out edge-case scenarios where we're probably not taking the ideal approach, and those kinks will take time to be ironed out.

Some firms use predictive analytics to automatically change page titles & other attributes on the fly. Do you see much risk to that approach? Do you eventually see SEO companies offering CMS tools as part of their packages to lock in customers, while integrating the SEO process at a much deeper level?

When we were out pitching to take venture capital last summer, a lot of VCs felt that this was the way to go and that we should have products on this front.

Personally, I don't like it, and I'd be surprised if it worked. Here's why:

  • Editors/writers should be responsible for content, not machine-generated systems built to optimize for search engines. Yes, those machine systems can and should make recommendations, but I fear for the future of your content and usability should "perfect SEO" be the driving force behind every word and phrase on your site.
  • With links being such a powerful signal, it's far better to have a slightly less well-targeted page that people actually want to link to than a "perfect" page that reads like machine-generated content.
  • I think content creators who take pride in their work are the ones who'll be better rewarded by the engines (at least in the long term - hopefully your crusade against Demand Media, et al. will help with that), and those are the same type of creators who won't permit a system like this to automatically change their content based on algorithmic evaluation.

There are cases I could see where something like this would be pretty awesome, though - e.g. a 404 detector that automatically 301s pages it sees earning real links back to the page it thinks was the most likely intended target.
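
(That 404-to-301 idea is easy to picture in code. Here is a rough sketch, assuming you can already see which missing URLs are earning links and which live URLs exist; the thresholds and data are invented for illustration.)

```python
# Rough sketch of the 404 detector idea: when a missing URL keeps earning real
# links, 301 it to the live page that looks most like the intended target.
from difflib import SequenceMatcher

def best_redirect_target(missing_url, live_urls, inbound_link_counts,
                         min_links=3, min_score=0.6):
    """Return the most similar live URL for a well-linked 404, or None."""
    if inbound_link_counts.get(missing_url, 0) < min_links:
        return None                                   # not enough links to bother
    scored = [(SequenceMatcher(None, missing_url, url).ratio(), url)
              for url in live_urls]
    score, target = max(scored)
    return target if score >= min_score else None     # only redirect on a close match

live_urls = ["/seo-guide/", "/link-building/", "/contact/"]
link_counts = {"/seo-gude/": 12}                       # typo'd URL that earned links
print(best_redirect_target("/seo-gude/", live_urls, link_counts))  # -> "/seo-guide/"
```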

On your blog recently there was a big fuss after you changed your domain authority modeling scores. Were you surprised by that backlash? What caused such a drastic change to your scores?

We were surprised only until we realized that somehow, our internal testing missed some pretty obvious boneheaded scores.

Basically, we calculate DA and PA using machine learning models. When those models find better "correlated" results, we put them in the system and build new scores. Unfortunately, in the late August release, the models had much better average correlation but some really terrifically bad outliers (lots of junky single-page keyword-match domains got DAs of 100 for example).

We just rolled out updated scores (far ahead of our expected schedule - we thought it would take weeks), and they look much better. We're always open to feedback, though!
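
(A much-simplified sketch of that "fit a model, check correlation, then sanity-check the outliers" loop is below; the features and numbers are made up purely for illustration and are not SEOmoz's actual DA/PA models.)

```python
# Simplified sketch: fit a score from link features, check rank correlation
# against observed ranking strength, then inspect the worst individual outliers
# (the failure mode described above). All numbers are invented.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LinearRegression

features = np.array([[120, 30, 5],          # hypothetical link counts per domain
                     [900, 210, 60],
                     [15, 2, 1],
                     [5000, 900, 400]])
observed = np.array([2.0, 5.5, 0.8, 8.9])   # observed ranking strength per domain

model = LinearRegression().fit(features, observed)
scores = model.predict(features)

rho, _ = spearmanr(scores, observed)
print(f"rank correlation: {rho:.2f}")        # good average correlation...

residuals = np.abs(scores - observed)        # ...can still hide boneheaded scores
print("largest individual error at index:", int(residuals.argmax()))
```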

When I got into SEO (and for the first couple years) it seemed like you could analyze a person's top backlinks and then literally just go out and duplicate most of them fairly easily. Since then people have become more aware of SEO, Google has cracked down on paid links, etc. etc. etc. Based on that, a lot of my approach to SEO has moved away from analysis and more toward just trying to do creative marketing & hope some % of it sticks. Do you view data as being a bit of a sacred cow, or more of just a rough starting point to build from? How has your perception as to the value of data & approach to SEO changed over time?

I think your approach is almost exactly the same as mine. The data about links, on-page, social stats, topic models, etc. is great for the analysis process, but it's much harder to simply say "OK, I'll just do what they did and then get one more link," than it was when we started out.

That analysis and ongoing metrics tracking is still super-valuable, IMO, because it helps define the distance between you and the leaders and gives critical insight into making the right strategic/tactical decisions. It's also great to determine whether you're making progress or not. But, yes, I'd agree that it's nowhere near as cut-and-dried as it once was.

The frustrating part for us at SEOmoz is we feel like we're only now producing/providing enough data to be good at these. I wish that 6-7 years ago, we'd been able to do it (of course, it would have cost a lot more back then, and the market probably wasn't mature enough to support our current business model).

How much time do you suggest people should spend analyzing data vs implementing strategies? What are some of the biggest & easiest wins often found in the data?

I think that's actually the big win with the web app (or with competitive software products like Raven, Conductor, Brightedge, etc). You can spend a lot less time on the collection/analysis of data and a lot more on taking the problems/opportunities identified and doing the real work of solving those issues.

Big wins in our new web app for me have been ID'ing pages through the weekly crawl that need obvious fixing (404s and 500s are included, like Google Webmaster Tools, but so are 20+ other data points they don't show like 302s, incorrect rel canonicals, etc.).
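
(For anyone rolling their own checks, a bare-bones version of that kind of crawl audit - status codes plus obviously wrong canonicals - might look like the sketch below. A real crawler handles many more cases and parses HTML properly.)

```python
# Bare-bones crawl audit: flag non-200 responses (including bare 302s) and
# canonical tags that point somewhere other than the URL requested.
import re
import requests

def audit(urls):
    issues = []
    for url in urls:
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code != 200:
            issues.append((url, f"HTTP {resp.status_code}"))
            continue
        # naive canonical check; a real crawler would use an HTML parser
        match = re.search(
            r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)',
            resp.text, re.IGNORECASE)
        if match and match.group(1).rstrip("/") != url.rstrip("/"):
            issues.append((url, f"canonical points to {match.group(1)}"))
    return issues

for url, problem in audit(["https://example.com/", "https://example.com/old-page"]):
    print(url, "->", problem)
```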

Blekko has got a lot of good press by sharing their ranking models & link data. Their biggest downside so far in their beta is the limited size of their index, which is perhaps due to a cost-benefit analysis & they may expand their index size before they publicly launch. In some areas of the web Google crawls & indexes more than I would expect, while not going too deeply into others. Do you try to track Google's crawls in any way? How do you manage your crawl to try to get the deep stuff Google has while not getting the deep stuff that Google doesn't have?

Yeah - we definitely map our crawls against Google, Bing and Majestic on a semi-regular basis. I can give you a general sense of how we see ourselves performing against these:

  • Google - the freshest and most "complete" (without including much spam/junk) of the indices. A given Linkscape index is likely around 40-60% of the Google index in a similar timeframe, but we tend to do pretty well on coverage of domains and well-linked-to pages, though worse on deep crawling in big sites.
  • Bing - they've got a large index like Google, but we actually seem to beat them in freshness for many of the less popular corners of the web (though they're still much faster about catching popular news/blogs/etc from trusted sources since they update multiple times daily vs. our once-per-month updates).
  • Majestic - dramatically larger in number of URLs than Google, Bing or Linkscape, but not as good as any of those about freshness or canonicalization (we'll often see hundreds of URLs in the index that are essentially the same page with weird URL parameters). We like a lot of their features and certainly their size is enviable, but we're probably not going to move to a model of continuous additions rather than set updates (unless we get a lot more bandwidth/processing power at dramatically lower rates).


The problem with maintaining old URLs became clearer when we analyzed decay on the WWW

In terms of reaching the deep corners of the web, we've generally found that limiting spam and "thin" content is the big problem at those ends of the spectrum. Just as email traffic is estimated to be 90%+ spam, it's quite possible that the web, if every page were truly crawled and included, would have similar proportions. Our big steps to help this are using metrics like mozTrust, mozRank and some of our PA/DA work to help guide the crawl. As we scale up index size (probably December/January of this year), that will likely become a bigger challenge.
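
(One way to picture a metric-guided crawl is a priority queue over the frontier, where URLs with higher trust-style scores get fetched first. A toy sketch follows; the scores stand in for whatever metric you use, mozTrust-like or otherwise.)

```python
# Toy metric-guided crawl frontier: fetch the most trusted URLs first so that
# spam and thin content at the deep end of the web is crawled last, or never.
import heapq

class Frontier:
    def __init__(self):
        self._heap = []

    def add(self, url, trust_score):
        # heapq is a min-heap, so push the negated score to pop high trust first
        heapq.heappush(self._heap, (-trust_score, url))

    def pop(self):
        _, url = heapq.heappop(self._heap)
        return url

frontier = Frontier()
frontier.add("https://example.edu/research", 0.92)
frontier.add("http://thin-spam.example/page?id=9999", 0.03)
frontier.add("https://well-known-blog.example/post", 0.71)

print(frontier.pop())   # the trusted page comes off the queue first
```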

---

Thanks Rand. You can read his latest thoughts on the SEOmoz blog and follow him on Twitter at @randfish.

Universal Truth Of Selling On The Web: Easy & Simple Wins

The following is a guest post by Jim Kukral.

Google knows this. Now you do as well. Easy always wins. Take a moment and picture your website or your blog or your product or service in your head right now. Now, think of Google's. Which one is easier? No, you're not a search engine; you're probably a small business owner with a variety of products and services, an entrepreneur with a business idea, or a blogger. But the comparison remains because, regardless of what it is you do, easy will always win.

So keep thinking about your Web business. Is what you're selling easy to buy? By that I mean: when somebody comes to buy from you, or to simply get information from you like a phone number or to download a white paper... is it easy to do? Or are you making it too hard?

Picture Google.com again in your head. It's pretty darn easy, no? There's a logo and a big input box underneath it. You put in what you're looking to find, and hit search and boom, you find it. Easy. Google understands that customers use them for one reason, to have a problem solved, and therefore, that’s what they deliver, without all the frills that other search portals like Aol or Yahoo! try to offer.

Your opportunity right now is to figure out the main one or two reasons people visit your website, because despite what you might think, your customers probably have only those one or two things on their mind when they visit you.

If you visit the home page of Orbitz.com, you're probably there to do one of a few things only. Book a flight, find a car, or make a hotel reservation. Possibly all three at once. But honestly, that's pretty much it, right? I would bet that 99% of their traffic is trying to do one of those things. The same goes for you and your website, blog, membership site or anything you produce online.

What exactly are your customers looking for? You need to find out and find out right now! Check your analytics (I recommend Google Analytics, it's free! www.Google.com/Analytics) to find out things like the most viewed pages of your website, as well as the most exited pages. You may find out that 90% of your visitors are focusing on the free white paper download page and ignoring the other pages you thought were important. That's great news! Now, you at least know what your customers want. And now you can make it easier for them to get it. You may also find out that a large percentage of your visitors always leave your website on one specific page, giving you the insight that perhaps they aren't finding what they're looking for, getting frustrated, and surfing away. That's bad.
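
(If you can export raw pageview data, the "most viewed vs. most exited" check takes only a few lines. Here is a sketch assuming a simple per-session pageview log; the column names are invented.)

```python
# Sketch of the "most viewed vs. most exited" check from a per-session
# pageview log. Column names are invented for illustration.
import pandas as pd

log = pd.DataFrame({
    "session": [1, 1, 1, 2, 2, 3],
    "order":   [1, 2, 3, 1, 2, 1],
    "page":    ["/", "/whitepaper", "/pricing", "/", "/whitepaper", "/whitepaper"],
})

views = log["page"].value_counts()
exits = log.sort_values("order").groupby("session").last()["page"].value_counts()

report = pd.DataFrame({"views": views, "exits": exits}).fillna(0)
report["exit_rate"] = report["exits"] / report["views"]
print(report.sort_values("exit_rate", ascending=False))
```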

So what should you do with that knowledge to make things easier for your visitor, and better for your business? If you're getting a lot of traffic to your free white paper download, go ahead and take that download information and make it stand out on your home page. If done right, you'll make it as easy as possible for your visitors to get what they were looking for, and you’ll see even more downloads, and happier visitors because you didn’t make them work so hard.

Now, you may also find out that the page you really wanted your visitors to see is not being viewed enough. This could be the specials page on your e-commerce site, or the packages page on your consulting site, or maybe your customer support contact information page. Whatever it may be, once you know what it is, that page obviously needs to be viewed more, and while you can't force it down your visitors' digital throats, you can redesign your page so that it limits the other choices that can distract your visitor.

Make it easy and simple, then win!

For over 15 years, Jim Kukral has helped small businesses and large companies like Fedex, Sherwin Williams, Ernst & Young and Progressive Auto Insurance understand how to find success on the Web. Jim is the author of the book, "Attention! This Book Will Make You Money", as well as a professional speaker, blogger and Web business consultant. Find out more by visiting www.JimKukral.com. You can also follow Jim on Twitter @JimKukral.

Labor Day = Yeah

When you think of Labor Day what comes to mind? For me it is these 2 thoughts:

  • lower earnings because few people are online today
  • since almost nobody is online, any hours worked today are me getting ahead of the market ;)

Working hard & working long hours can almost be a disease...the web makes it easy to be addicted.

But for every person who is putting in hard work trying to help people there is another person selling image.

The big issue with the image game is the risks. As the lies pile up they corner people into a bad situation, to where they can (and do) lose everything.

If I had to take a single point of reference to help a stranger judge the difference between a hack and someone who wants to honestly help people, I would say it is this: do they encourage you to take on debt?

  • If they do then there is a good chance they are the type of person who will go out of their way to screw you.
  • If they do not then they are likely not a maximizer type (because if they were then they would be encouraging you to go into debt to sell you more stuff).

It is not that all debt is evil (when I got started online I was naive enough to start on a credit card), but life and markets are unpredictable. If I hadn't been smart enough to get a job to cover my 6 or so months of education before going full time online, who knows where I would be now. What seems like a short term gain can lead to long-term failure. We are human, and so we are flawed. When you have debt/leverage you have no spare parts. So if something goes wrong you are done. Nassim Taleb spoke about the importance of savings and diversity of revenues as keys to survival, while noting that the very structure of our public markets encourages risk + leverage (options encourage short term performance & volatility rather than sustained growth, and you hope the guy on the next watch is stuck holding the Madoff Ponzi bag).

The falls of past empires have typically been preceded by rapid inflation in food costs. Our food supply, like most other aspects of modern day life, has been so extended as to be poisonous. Fish soaked in chemicals literally change sex back and forth, and shrimp in the ocean (with traces of Prozac) swim toward the light - where they get eaten.

It's not about fixing the conversation. It's about filling in the blanks. If people are prone to click on something, that is exactly what they will get, even if it is not something they want.

We misinform kids about sex in a way that can screw up the rest of their lives. Data is collected against people's will so that they may be stalked and harassed. If you once thought you were fat, long after becoming anorexic there will still be ads reminding you how fat you are, following you around the web.

When bits of culture die the life lessons wrapped in them fade as well. Sure there may be HTML codes for emotions, but (beyond ad targeting) it is hard to reduce people to numbers.

Is the push toward homogenization to increase yield and chasing the lowest common denominator making people happier or more miserable?

I realize that reading the above can quickly make me sound like some ultra left-wing hippie, but the point of this post is not a political one ... rather one on the basic rule of law.

We justify (or downplay) harming ourselves, our environments, and the environments of other animals so we can have more and better. But to do this we often take on debt and leverage and put ourselves in precarious situations. Worse yet, we often have *others* decide to take on leverage for us, without our desire or permission.

Why is it that the government is giving Google tax credits to build more low income housing while the Federal Reserve is sitting on over $1 trillion in bad mortgage paper? How can the government want to make housing cheaper / more affordable while simultaneously propping up (and thus ensuring overvaluation of) virtually the whole of the market? How can the government taking both sides of the same bet lead to anything but waste, fraud & abuse?

If you believe in efficient market theory then banking should represent a small portion of the profit pool (since banks are all dealing in the same commodity of cash). And yet the banking class keeps representing a growing portion of the profits, while the bad sides of their trades (the losses) are passed on to taxpayers.

I don't mind someone else levering up with risk so long as they have to pay the consequences of their failures. But capitalism without failure is like religion without sin.

These banks threatened tanks in the street if they didn't get their bailouts.

They went so far as to say even auditing the Federal Reserve would threaten the financial system. Sorry, um, but that is exactly what the banking class did. If they are not punished for committing crimes then the lawlessness will only grow more extreme, as it has.

When the bubble popped some of these scammers, charlatans, shysters, swindlers, and tricksters claimed that "nobody saw it coming," but in fact as things started to go wrong these folks leaned into it and made it worse.

Rather than having CDOs go unsold they engaged in self-dealing & kept mixing the bad chunks in, sorta like making new sausage out of old sausage. They knew what they were doing. They intended to commit fraud:

"On paper, the risky stuff was gone, held by new independent CDOs. In reality, however, the banks were buying their own otherwise unsellable assets."
...
"One rival investment banker says Merrill treated CDO managers the way Henry Ford treated his Model T customers: You can have any color you want, as long as it's black."

It's Labor Day. The criminal bankers who ripped you off in the past, who are currently ripping you off with more crimes, and who will rip your children off are stealing your labor. And since neither political party cares to stop it, it's up to you how much you want to give...there is no end to how much they would love to take. Time for me to take a break. ;)

Are You Thinking Like Google?

No, not like that, but in the good way! :D

The following is a guest post by Jim Kukral highlighting one of the most fundamental tips to succeeding online.

Have you ever really taken a step back from all the technical SEO stuff and thought about why Google wins? The real reasons why they have mass-market share and why they continue to dominate? It's time you did, because once you understand how to start thinking like Google, you can finally go beyond just ranking better and become a master Internet marketer who gets more sales, leads and publicity.

After all, once you've been found, you now have to convert. Otherwise, it's a waste of time.

So why does Google win? Because Google is the world's biggest, and best, problem solver. The truth is that there are only two reasons why we all go online, using Google or not. Those two reasons are:

1. To have a problem solved
2. To be entertained

That's it. Everything, and I mean everything you do online falls under one of those categories. For example, let's say you're planning on cooking your wife her favorite chicken marsala dish for your anniversary. You go online and do a search for "chicken marsala recipes". Boom, you now have recipes, and videos, and images and cookbooks and all kinds of information to help you solve your problem.

As another example, let's say you wanted to relax after work and watch your favorite musician play some of your favorite songs. You go to YouTube and do a search for "Rolling Stones Videos" and boom, you're now watching video content that entertains you.

YouTube, which is owned by Google, is already the number two search engine on the Internet (behind Google of course). That means that today billions of people are actively searching the Internet for video content. That also means that, because of the public's fast-growing, massive hunger for content in video form, regular people and businesses alike are now able to profit from creating that video content.

The truth is, Google (and your business) has to solve problems for their (your) customers, the Internet searcher. If they (you) can't do that, they (you) lose customers. It's that black and white.

So I'll ask you again. Are you thinking like Google? Have you sat down and figured out what your target audience's biggest problems are? If you haven't done that you need to do it now. Anticipate what they need. Figure out their pain and then create products/services that take that pain away.

Just like Google.

For over 15 years, Jim Kukral has helped small businesses and large companies like Fedex, Sherwin Williams, Ernst & Young and Progressive Auto Insurance understand how to find success on the Web. Jim is the author of the book, "Attention! This Book Will Make You Money", as well as a professional speaker, blogger and Web business consultant. Find out more by visiting www.JimKukral.com. You can also follow Jim on Twitter @JimKukral.

How Many Companies Has Google Bought?

One of the best ways to track Google's strategies is through visualizing & analyzing their acquisitions. Which is what the following image helps you do. Click on it for the full enlarged version :)


via Scores

Your Favorite Eric Schmidt Quotes?

Do you want Google to tell you what you should be doing? Mr. Schmidt thinks so:

"More and more searches are done on your behalf without you needing to type. I actually think most people don't want Google to answer their questions," he elaborates. "They want Google to tell them what they should be doing next. ... serendipity—can be calculated now. We can actually produce it electronically."

Of course the problem with algorithms is they rely on prior experience to guide you. They won't tell you to do something unique & original that can change the world; rather they will lead you down a well-worn path.

What are some of the most bland and most well-worn paths in the world? Established brands:

The internet is fast becoming a "cesspool" where false information thrives, Google CEO Eric Schmidt said yesterday. Speaking with an audience of magazine executives visiting the Google campus here as part of their annual industry conference, he said their brands were increasingly important signals that content can be trusted.

"Brands are the solution, not the problem," Mr. Schmidt said. "Brands are how you sort out the cesspool."

"Brand affinity is clearly hard wired," he said. "It is so fundamental to human existence that it's not going away. It must have a genetic component."

If Google is so smart then why the lazy reliance on brand? Why not show me something unique & original & world-changing?

Does brand affinity actually have a hard wired genetic component? Or is it that computers are stupid & brands have many obvious signals associated with them: one of which typically being a large ad budget. And why has Google's leading search engineer complained about the problem of "brand recognition" recently?

While Google is collecting your data and selling it off to marketers, they have also thought of other ways to monetize that data and deliver serendipity:

"One day we had a conversation where we figured we could just try and predict the stock market..." Eric Schmidt continues, "and then we decided it was illegal. So we stopped doing that."

Any guess how that product might have added value to the world? On down days (or days when you search for "debt help") would Google deliver more negatively biased ads & play off fears more, while on up days selling more euphoric ads? Might that serendipity put you on the wrong side of almost every trade you make? After all, that is how the big names in that space make money - telling you to take the losing side of a trade with bogus "research."

Eric Schmidt asks who you would rather give access to this data:

“All this information that you have about us: where does it go? Who has access to that?” (Google servers and Google employees, under careful rules, Schmidt said.) “Does that scare everyone in this room?” The questioner asked, to applause. “Would you prefer someone else?” Schmidt shot back – to laughter and even greater applause. “Is there a government that you would prefer to be in charge of this?”

That exchange helped John Gruber give Eric Schmidt the label Creep Executive Officer, while asking: "Maybe the question isn’t who should hold this information, but rather should anyone hold this information."

But Google has a moral conscience. They think quality score (AKA bid rigging) is illegal, except for when they are the ones doing it!

"I think judgement matters. If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place," - Eric Schmidt

Which is why the blog of a certain mistress disappeared from the web. And, of course, since this post is on a blog, it doesn't matter:

If you're ever confused as to the value of newspaper editors, look at the blog world. That's all you need to see. - Eric Schmidt

Here is the thing I don't get about Google's rhetorical position on serendipity & moral authority: if they are to be trusted to recommend what you do, then why do they recommend illegal activities like pirating copyrighted works via warez, keygens, cracks & torrents?

Serendipity ho!

Yahoo! Search Now Powered by Bing

Pretty exciting day in search, seeing Bing results live on Yahoo! Search.

Yahoo! Search Powered by Bing.

There were some questions as to what might transfer and what might stay. It seems that, algorithmically, the transfer was roughly 1 to 1.

Same Rankings.

Yahoo! is still showing fewer characters in their page titles than Bing does. Site links (listed below some sites) may also use different anchor text. But the core results are the same. The big exceptions to the concept of the 1:1 representation would be vertical search results, left rail navigation customizations & the inline search suggestions Bing does in their search results for popular search queries.

The vertical search results & left rail navigation being homegrown is no surprise, as many of those features aim to keep you on the parent portal, and that is Yahoo!'s bread and butter. Here is an example of the inline suggestions Bing does (in this example, for "loans"):

Inline Suggest.

Instead of inline suggestions like that, you might see the following kinds of navigational cues from Yahoo!

Also Try.

There has been some speculation as to whether any Yahoo! penalties will get rolled into Bing (or Yahoo!'s version of Bing) & so far it seems like that is generally a no. Of course, that could change over time. There has also been speculation about Yahoo! Site Explorer going away, but it seems it will remain through early 2012.

The Yahoo! Site Explorer team is planning tighter integration between Site Explorer and Bing Webmaster Center to make the transition as smooth as possible for webmasters. At this stage in the transition, it is important for webmasters to continue using Yahoo! Site Explorer to inform us about your website and its structure so you keep getting high quality traffic from searches originating on Yahoo! and our partner sites – even from markets outside the US and Canada that haven’t yet transitioned to Microsoft systems. To keep things simple, we will share site information you provide on Site Explorer with Microsoft during this transition period.

When Microsoft fully powers the Yahoo! Search back-end globally, expected in 2012, it will be important for webmasters to use Bing Webmaster Center as well. The Bing tool will manage site, webpage and feed submissions. Yahoo! Site Explorer will shift to focus on new features for webmasters that provide richer analysis of the organic search traffic you get from the Yahoo! network and our partner sites.

Unfortunately some of Yahoo!'s advanced link query operators seem to no longer work (say you wanted to find links to a domain from .gov pages). But you can get such link data (or at least a piece of it) from Majestic SEO or SEOmoz's Linkscape (also in OSE's export feature & eventually their online interface).
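For reference, the sort of query that used to be possible combined the link operator with a site restriction. The example below is reconstructed from memory, so treat the exact syntax (and the example.com domain) as approximate rather than official documentation:

linkdomain:example.com -site:example.com site:.gov

That single query asked for pages linking at example.com, excluded example.com's own internal links, and restricted the results to .gov sites - the kind of slicing you now have to approximate by exporting a full backlink list from one of the above tools and filtering it yourself.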

Some smaller search companies, like Exalead, still offer advanced filters while performing link searches. The ability to search a full web index allows you to do cool stuff you can't do with just a link graph. I haven't looked at InfluenceFinder yet, but I have heard good things & owe the folks there a review soon. When Blekko launches they will have a boatload of free SEO features to share as well. Members of our community have been giving it rave reviews for the past month or so.

Why 'Spam' is Everywhere & Why That Means Nothing!

Sigh, not this again. ;)

Recently Rand highlighted his surprise at how prevalent search spam is. But the big issue with search today is not the existence of spam, but how it is dealt with. For a long period of time Google spent much of their resources fighting spam manually. That worked when spammers were sorta poor one-hit wonders fighting on the edge of the web & few people knew how search worked. But as technology advances & "spammers" keep building a bigger stock of capital, eventually Google loses the manual game.

Search engines concede the importance of SEO. It is now officially mainstream.

  • Both Google and Microsoft offer SEO guides.
  • Microsoft and Yahoo! have in-house SEO teams.
  • Yahoo! purchased a content mill.
  • Microsoft's update email about powering Yahoo! search results later this week contained "After this organic transition is complete, Bing will power 5.2 billion monthly searches, which is 31.6 percent of the search market share in the United States and 8.6 percent share in Canada. You can take advantage of this traffic by using search engine optimization (SEO) to complement your search campaigns and boost the visibility of your business."

Sure you will still see some media reports about the "dark arts" of SEO, but that is mainly because they prefer publishing ignorant pablum to drive more page views, as self-survival is their first objective. Some of the same media companies alerting us to the horrors of SEOs have in-house SEO teams that call me for SEO consultations.

A Google engineer highlighted this piece by submitting it to Hacker News, using this as the title: "sufficiently advanced spam is indistinguishable from content." We tend to over-estimate end users. If most people don't realize something is spam then to them it isn't. If the search engineers have a hard time telling whether a blog is ESL or auto-generated, how is a typical web user going to tell the difference?

Some SEO professionals have huge networks of websites and are flush with 8 or 9 figures of capital. They can afford to simply buy marketshare in any market they want to enter. Burn one of their sites and they get better at covering their tracks as they buy 5 more. At the same time the media companies are partnering with content mills & the leading content mill filed for an IPO where they are hoping for a $1.5 billion valuation.

Why does one form of garbage deserve to rank when another doesn't? If link buying is bad, then why did Google invest in Viglink? If link buying is so bad then is lying for links any better? If so, how?

How exactly can Google stop the move toward spam in a capitalistic market where domains can be registered with privacy and marketers can always rent an expert to speak for the brand? Is a celebrity endorsement which yields publicity spam? How can Google speak out against spam when they beta test search results that are 100% Google ads?

Wherever possible, Google is trying to replace part of the "organic" search results with another set of Google vertical results. If Google can roughly match relevancy while gaining further control over the traffic they will. Just look at how hard it is to get to the publisher site if you use Google image search. And Google is rumored to be buying Like.com, which will make image search far more profitable for Google.

As Google continues to try to suck additional yield out of the search results, I believe they are moving away from demoting spam (due to diminishing returns & the anti-trust risk that comes with demoting what they themselves do). Instead of looking for what to demote, they are now shifting toward trying to find more data & signals from which to promote quality.

The issue with manual intervention (rather than algorithmic advancements) is that it warps the web to promote large bureaucratic enterprises that are highly inefficient. That is ok in the short run, but in the long run it leaves search as a watered down experience. One lacking in flavor and variety. One which is boring.

Google is going to get x% of online ad revenues and y% of GDP. In the long run, them promoting inefficient organizations doesn't make the web (or search) any more stable. They need to push toward the creation of more efficient and more profitable media enterprises. Purchases of ITA Software and Metaweb allow Google to attack some of the broader queries and gain more influence over the second click in the traffic stream. Business models which are efficient grow, whereas inefficient ones are driven into bankruptcy.

As Paul Graham has highlighted, we might be moving away from a society dominated by large organizations to one where more individuals are self-employed (or work for smaller organizations). We hire about a dozen people, but they are sorta bucketed into separate clusters. Some work on SEO Book, some blog, some help create featured content, some help with marketing, etc. etc. etc. The net result of our efficient little enterprise is pushing terabytes of web traffic each month. Would you describe the site you are currently reading as being "spam" simply because it is efficient & profitable? Would a site that took VC capital and was less efficient be any more proper? How much less interesting is the average big media article on the field of SEO?

If a search engine gets too aggressive with penalizing "spam" then tanking competitors becomes a quite profitable business model. If they are to focus on what to demote, search engineers need to figure out not only what is being done, but who actually did it. Thus the role of SEO today is not to remain "spam free" (whatever that is) but to create enough signals of quality that you earn the benefit of the doubt. This protects you from the whims of search engineers, algorithmic updates, and attempts at competitive sabotage.

You can future-proof your SEO strategy to the point where your site never loses traffic because it never ranked! Or you can get in the game and keep building in terms of quantity and quality. If lower quality stuff is all that is typically profitable in a particular market then it isn't hard to stand out by starting out with a small high-quality website. That attempt to stand out might not be profitable, but it might give you a platform to test from. After all, Demand Media purchased eHow.com to throw up their "quality content" on.

Online the concept of meritocracy is largely a farce. Which is precisely why large search companies are willing to buy content mills. If search engines want to promote meritocracy they should focus more on rewarding individual efforts, though that might have a lower yield. And some people prefer to stay anonymous, given competitive threats from outing AND some of the creepy ways online ad networks harvest their data to target them.

What does the lack of meritocracy mean for marketers? If you are a marketer you need to be aggressive at marketing your wares or someone with inferior product will out-market you and steal marketshare from you.

Will someone consider your site spam?

Sure.

But they will have worse rankings than you do!
