Rand Fishkin Interview

It is no secret that in the past Rand and I have had some minor differences of opinion (mainly on outing). ;)

But in spite of those, there is no denying that he is an astute marketer. So I thought it would be fun to ask him about his background in SEO and to get his take on where some of our differences of opinion lie. Interestingly, it turns out we share far more views than I thought! Hope you enjoy the interview. :)

Throughout your history in the SEO field, what are some of your biggest personal achievements?

The first one would have to be digging myself (and my Mom) out of bankruptcy when we were still a small sole proprietorship. Since then, there have been a lot of amazing times:

  • The first time I spoke at a conference (SES Toronto in 2004)
  • Transitioning from a consulting to a software business
  • Taking venture capital
  • Building a team (not just making hires)
  • Having dinner with the UN Secretary General (Ban Ki-moon) and presenting to their CTO on SEO - it was amazing to hear stories about how people in conflict-ridden parts of the world used search to find safe havens, escape and transmit information, and about the UN's missed opportunities around SEO. I'd never really thought of our profession as having life-or-death consequences until then.
  • Making the Inc 500 list for Fastest Growing Companies in the US (during a nasty recession)
  • Probably my biggest personal achievement, though, is my relationship with my wife. I know that no matter what happens to me in any other part of my life, I have her support and love forever. That gets a guy like me through a lot of tough times.

Geraldine & Rand in San Francisco

My wife and me in San Francisco (via her blog)

What are the biggest counter-intuitive things you have learned in SEO (e.g. things that theoretically shouldn't work, but wow they do - or the opposite, should work but don't)?

The most obvious one I think about regularly is that the "best content rarely wins." The content that best leverages (intentionally or not) the system's pulleys and levers will rise up much faster than the material the search engines "intended" to rank first.

Another big one is the success of very aggressive sales tactics and very negative, hateful content and personalities. Perhaps because of the way I grew up or my perspective on the world, I always thought of those things as being impediments to financial success, but that's not really the case. They do, however, seem to have a low correlation with self-satisfaction and happiness, and I suppose, for the people/organizations with those issues, that's even worse.

A very specific, technical tactic that I'm always surprised to see work is the placement of very obvious paid text links. We realized a few months back that with Linkscape's index, we could ID 90%+ of paid link spam with a fairly simple process:

  1. Grab the top 10K or 100K monetizable query terms/phrases (via something like a "top AdSense payout" list)
  2. Find any page on the web that contains 2+ external anchor text links pointing to separate websites (e.g. Page A has a link that says "office supplies" linking to 123.com and another link that says "student credit card" linking to 456.com)
  3. Remove the value passed by those links in any link metric calculation (which won't hurt the relevancy/ranking of any pages, but will remove the effects of nearly all paid links)

We've not done the work to implement this, so perhaps there's some peculiar reason why applying it is harder than we think. But, it strikes me that even if you could only do it for pages with 3 or 4+ links in this fashion, you'd still eliminate a ton of the web's "paid" link graph. The fact that Google clearly hasn't done this makes me think it must not work, but I'm still struggling to understand why.
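A rough sketch of that heuristic in Python - purely an illustration of the process described above, not SEOmoz's actual implementation; the phrase list and page data are invented:

```python
# Hypothetical illustration of the paid-link heuristic described above.
# Not SEOmoz's code; phrase list and link data are made-up examples.

MONETIZABLE_PHRASES = {"office supplies", "student credit card", "payday loans"}

def looks_like_paid_links(external_links, min_hits=2):
    """external_links: list of (anchor_text, target_domain) tuples for one page.

    Flags the page when 2+ monetizable anchors point at separate domains."""
    hits = [(anchor.lower(), domain)
            for anchor, domain in external_links
            if anchor.lower() in MONETIZABLE_PHRASES]
    domains = {domain for _, domain in hits}
    return len(hits) >= min_hits and len(domains) >= 2

# Example page from step 2 of the process above:
page_links = [("office supplies", "123.com"),
              ("student credit card", "456.com"),
              ("about us", "example.org")]
print(looks_like_paid_links(page_links))  # True -> discount these links in metrics
```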

BTW - I asked some SEOs about making this a metric available through Linkscape/Open Site Explorer (like a "likelihood this page contains paid links" metric) and they all said "don't build it!" so we probably won't in the near term.

One of the big marketing angles you guys tried to push hard on was the concept of transparency. Because of that you got some pretty bad blowback when Linkscape launched (& perhaps on a few other occasions). Do you feel pushing on the transparency angle has helped or hurt you overall?

I think those inside the SEO community often perceive a conflict or tiff internally as having a much broader reach than it really does. I'd agree that folks like you and me, and maybe even a few hundred or a thousand industry insiders, are aware of and take something away from those types of events, but SEOmoz as a software company with thousands of paying subscribers and hundreds of thousands of members seems to be far less impacted than I am personally.



Re: Linkscape controversy - there have been a few - but honestly, the worst reputation/brand problems we ever had have always been with regard to personal issues or disputes (a comment on someone's blog or something we wrote or allowed to be published on YOUmoz). I don't have a good explanation for why they crop up, but I can say that they seem to have a nearly predictable pattern at this point (I'm sure you recognize this as well - I think I've seen you write fairly eloquently on the subject). That does make it easier to handle - it's the unpredictable that's scary.

We certainly maintain transparency as a core value and we're always trying to do more to promote it. To me, core value means "things we value more than revenue or profits" and so even if it's had some hard-to-measure, adverse impact, we'd maintain it. We've actually got a poster hanging up in the office that our design team made:
The "T" in TAGFEE

An excerpt from our TAGFEE poster

There's a quote I love on this topic that explains it more eloquently than I can:

"(Our) core values might become a competive advantage, but that is not why we have them. We have them because they define for us what we stand for, and we would hold them even if they became a competitive disadvantage." - Ralph Larson, CEO of Johnson and Johnson

What type of businesses do you think do well with transparency? What type of businesses do you feel do poorly with it?

Hmm... Not something I've tried to apply to every type of business, but my feeling is that nearly every company can benefit from it, though it also exposes you to new risk. Even being the transparency-loving type, I'd probably say that military contractors, patent trolls and sausage manufacturers wouldn't do so well.

How have you been able to manage the transparency angle while having investors?

I thought it would be tougher after taking investment, but they've actually been very supportive in nearly every case (some parts of Linkscape, particularly those re: our patent filings being exceptions). I don't know if that would be true had we taken on different backers, but that's why the startup advice to choose your investors like you choose your husband/wife is so wise.

When you took investment money did you mainly just get capital? What other intangibles came with it? How have your investors helped shape your business model?

It certainly made us much more focused on the software model. As you noted, we dropped consulting in 2010 entirely, and we've generally limited any form of non-scalable revenue to help fit with the goals of a VC-backed business. We did gain some great advisors and a lot more respect in many technology and startup circles that would have been tough without the presence of venture funds (although I think that's shifting somewhat given the changes of the past 2-3 years in the startup world).

Have you guys ever considered buying out your investors? Are you worried what might happen to your company if/when it gets sold?

While we'd love to, I doubt that would ever be possible (barring some sort of massive personal windfall outside of SEOmoz). Every dollar we make gets our investors more excited about the future of the company and less likely to want to sell their shares before we reach our full potential. Remember that with VC, the idea is high risk, high reward, so technically, they'd rather we go for broke and fall to pieces than do a mid-size, but profitable deal. Adding $5 or $10 million back to a $300+ million fund is largely useless to a VC, so a bankruptcy while trying to return $50 or $100 million is a very tolerated, sometimes preferable result.

VC Chart of Returns

I wrote about this more in my Venture Capital Process post (where I talked about failing to raise money in summer 2009)

Now that you are already well known & well funded, you are taking a fairly low risk strategy to SEO, but if you were brand new to the space & had limited capital would you spam to generate some starting capital? At what point would you consider spamming a smaller risk than obscurity?

You ask great questions. :-)

While I don't think spam has any moral or ethical problems, I don't know that I'd ever be able to convince myself that spam would be a more worthwhile endeavor than brand building for a white hat property. Overnight successes take years of hard work, and I'd much rather get started as a scrappy, bootstrapping company than build up a reserve with spam dollars and waste that time. However, I certainly don't think that applies to everyone. As you know, I've got lots of friends who've done plenty of shady stuff (probably a lot I don't even want to know about!), but that doesn't mean I respect them any less.

Speaking of low risk SEO, why do you think neither of our sites has hit the #1 slot yet in Google for "seo"? And do you think that ranking would have much business impact?

We've looked at the query in our ranking models and I think it's unlikely we could ever beat out the Wikipedia result, Google or SEO.com (unless GG pulls back on their exact-match domain biasing preference). That said, we should both be overtaking SEOchat.com fairly soon (and some of the spammier results that temporarily pop in and out). Some of our engineers think that more LDA work might help us to better understand these super-high competitive queries.

Analysis of "SEO" SERPs in Google

SERPs analysis of "SEO" in Google.com w/ Linkscape Metrics + LDA (click for larger)

In terms of business impact - yeah, I think for either of us it would be quite a boon actually (and I rarely feel that way about any particular single term/phrase). It would really be less the traffic than the associated perception.

As an SEO selling something unique (eg: not selling a commodity that can be found elsewhere & not as an affiliate) I have found word of mouth marketing is a much more effective sales channel than SEO. Do you think the search results are overblown as a concern within the SEO industry? Do you find most of your sales come from word of mouth?

I see where you're coming from, but in our analyses, it's always been a combination of things that leads to a sale. People search and find us, then browse around. Or they hear of us and search for information about us. Then they'll find us through social media or a referring site and maybe they'll sign up for a free account. They'll get a few emails from us, have a look at PRO and go away. Then a couple months later they'll be more serious about SEO and search for a tool or answer and come across us again and finally decide, "OK, these guys are clearly a good choice."



This is what makes last touch attribution so dangerous, but it also speaks to the importance of having a marketing/brand presence across multiple channels. I think you could certainly make the case that many of us in the SEO field see every problem as a nail and our profession as the hammer.
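To make the attribution point concrete, here's a small hedged sketch (the conversion paths are invented) comparing last-touch credit with a simple linear split across channels:

```python
# Hypothetical conversion paths illustrating why last-touch attribution
# over-credits the final channel in a multi-step journey. Data is invented.
from collections import defaultdict

conversion_paths = [
    ["organic search", "email", "direct"],
    ["social", "organic search", "branded search"],
    ["referral", "email", "branded search"],
]

last_touch = defaultdict(float)
linear = defaultdict(float)

for path in conversion_paths:
    last_touch[path[-1]] += 1.0          # all credit to the final touch
    for channel in path:                 # equal credit to every touch
        linear[channel] += 1.0 / len(path)

print(dict(last_touch))  # the final channels look dominant
print(dict(linear))      # credit spreads across the channels that assisted
```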

What business models do you feel search fits well with, and what business models do you feel search is a poor fit for?

I think it's terrific for a business that has content or products they can monetize over the web that also relate to things people are already searching for. It's much less ideal for a product/service/business that's "inventing" something new that's yet to be in demand by a searching population. If you're solving a problem that people already have an identified pain point around, whether that's informational, transactional or entertainment-driven, search is fantastic. If that pain point isn't sharp enough or old enough to have generated an existing search audience, branding, outreach, PR and classic advertising may actually do better to move the needle.

Have you ever told a business that you felt SEO would offer too low of a yield to be worth doing?

Actually yes! I was advising a local startup in Seattle a couple years ago called Gist and told them that SEO couldn't really do much for them until people started realizing the need for social plugins for email and searching for them. This is the case with a lot of startups I think.

In an interview on Mixergy you mentioned racking up a good bit of debt when you got started in search. If a person is new to the web, when would you recommend they use debt leverage to grow?

Never, if you're smart. Or, at least, never in the quantities I did. The web is so much less costly to build on nowadays and the lean startup movement has produced so many great companies (many of them only small successes, but still profitable) from $10K or less that it just doesn't make sense, especially with the horror that is today's debt market, to go too far down that route. If you can get a low-cost loan from a family member or a startup grant through a government-backed, low interest program, sure, but credit card debt (which is where I started) is really not an option anymore.

How were you able to maintain presence and generally seem so happy publicly when you first got started, even with the stress of that debt?

To be honest, I really just didn't think about it much. If you have $30K in debt, you're constantly thinking about how to pay it off month by month and day by day. When you're $450K in debt, with collectors coming after you and your wife paying the rent, you think about how to make a success big enough to pay it all off or declare bankruptcy - might as well go with the former until life runs you into the latter. There's just not much else to do.



As Bob Dylan says - "when you got nothing, you got nothing to lose."

Many people new to the field are afraid to speak publicly, but you were fairly well received right off the start. What prepared you for speaking & what are keys to making a good presentation?

Oh man - I sucked pretty hard my first few presentations. I think everyone does. The only reason I was well received, at least in my opinion, is because I'd already built a following on the web and had a positive reputation that carried over from that. The only thing that really prepared me for big presentations (things like the talk to Google's webspam/search quality team or keynotes at conferences) was lots and lots of experience and for that I'll always be grateful to Danny Sullivan for giving me a shot.

I'd say to others - start small, get as many gigs as you can, use video to help (if you're great on camera, you'll be good in front of a live audience) and try to emulate speakers and presentations you've loved.

When large companies violate Google's guidelines repeatedly usually nothing happens. To cite a random example...I don't know...hmm Mahalo. And yet smaller companies when outed often get crushed due to Google's huge marketshare. Because of the delta between those 2 responses, I believe that outing smaller businesses is generally bogus because it strips freedoms away from individuals while promoting large corporations that foist ugly externalities onto society. Do you disagree with any of that? :D

I think I agree with nearly all of that statement, though I'd still say it's no more "bogus" to out small spammers than it is to spam. I would agree it's not cool that Google applies its standards unfairly, but it's hard to imagine a world where they didn't. If mikeyspaydayloans.info isn't in Google's index, no one thinks worse of Google. If Disney.com isn't in Google (even if they bought every link in the blogosphere), searchers are going to lose faith and switch engines. The sensible response from any player in such an environment is to only violate guidelines if you're big enough to get away with it or diversified enough to not care.

I'm unhappy with how Google treats these issues, but I'm equally unhappy with how spam distorts the perception of the SEO field. Barely a day goes by without a thought leader in the technology field maligning our industry - and 9 times out of 10 that's because of the "small" spammers. If we protect them by saying SEOs shouldn't "out" one another, we bolster that terrible impression. I don't think most web spam should even have the distinction of being classified as "SEO" and I don't think any SEO professionals who want our field to be taken seriously by marketing and engineering departments should protect those who foist their ugly externalities onto us.

I know we disagree on this, but it's always an interesting discussion :-)

One of the most remarkable things about the SEO industry is the gap in earnings potential between practicing it (as a publisher) and teaching it / consulting. Why do you think such a large gap exists today?

Teaching has always been an altruist's pursuit. Look at teachers in nearly every other field - they earn dramatically less than their production/publishing oriented peers. Those who teach computer science never earn what computer scientists who work at Google or Microsoft make. Those who teach math are far less well compensated than their compatriots working as "quants" on Wall Street. It's a sad reality, but it's why I have so much respect for people like Market Motive, Third Door Media and Online Marketing Connect, who are trying to both teach and build profitable businesses. I love the alignment of noble pursuits with profitable ones.

You guys exited the consulting area in spite of being able to charge top rates due to brand recognition. Do you think lots of consultants will follow suit and move into other areas? How do you see SEO business models evolving over the next 3 to 5 years?

I don't think so - our consulting business was going very well and I've heard and seen a lot of growth from my friends who run SEO consulting firms. The margins and exit price valuations wouldn't have made sense for VCs, but I don't think it was a bad business at all and others are clearly doing remarkable things. Just look at iCrossing's recent sale to Hearst for $325 million. You can build an amazing company with consulting - it's just not the route we took.

In regards to the evolution of the SEO business model, I'd say we're likely to see more sophistication, more automation, more scalability (and hopefully, more software to help with those) over the next few years from both in-house SEOs and external agencies/consultants. It's sometimes surprising to me how little SEO consulting has progressed from 2002 vs. things like email marketing or analytics, where software has become standard and tons of great companies compete (well, Google's actually made competition a bit more challenging in the analytics space, but creative companies like KissMetrics and Unbounce are still doing cool, interesting things).

Small businesses in many ways seem like the most under-served market, but also the hardest to serve (since they have limited time AND small budgets). Do you think the rise of maps & other verticals gives them a big opportunity, or is it just more layers of complexity they need to learn?

Probably more the former than the latter. The small business owners I know and interact with in my area (and wherever I seem to visit) are only barely getting savvy to the web as a major driver of revenue. I think it might take another 10 years or more before we see true maturity and savvy from local businesses. Of course, that gives a huge competitive advantage to those who are willing to invest the time and resources into doing it right, but it means a less "complete" map of the local world in the online one, which as a consumer (or a search engine) is less than ideal.

When does the delta between paid search & SEO investment begin to shrink (if ever)?

I think it's probably shrinking right now. Paid search is so heavily invested in that I think it's fair to call it a mature market (at least in global web search, though, re: your previous question, probably not in local). SEO is ramping up with a higher CAGR (Compound Annual Growth Rate) according to Forrester, so that delta should be shrinking.

Forrester Growth of SEO vs. Paid Search

via Forrester Research's Interactive Marketing Forecast 2009-2014
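For reference, CAGR is just the growth rate that, compounded annually, carries a starting value to an ending value; a quick sketch with made-up numbers:

```python
# CAGR = (ending / starting) ** (1 / years) - 1, with illustrative numbers only.
def cagr(start_value, end_value, years):
    return (end_value / start_value) ** (1 / years) - 1

# e.g. hypothetical spend growing from $1.0B to $1.6B over 5 years:
print(f"{cagr(1.0, 1.6, 5):.1%}")  # roughly 9.9% per year
```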

Oftentimes a Google policy sounds like something coming out of a conflicted government economist's mouth. But even Google has invested in an affiliate network, which suggests controlling your HTML links based on payment. How much further do you think Google can grow before they collapse under complexity or draw enough regulatory attention to be forced to change?

I think if they tread carefully and invest heavily in political donations and public relations, they can likely maintain another very positive 5-10 years. What the web looks like at that time is anyone's guess, and the unpredictable nature and wild shifts probably help them avoid most regulation. Certainly the rise of Facebook has helped reduce their risk exposure to government intervention, even if they may not be entirely happy with their inability to compete in the social web.

I remember you once posted about getting lots of traffic from Facebook & Twitter, but almost 0 sales from it. Does there become a point where search is not the center of the web (in terms of monetization), or are most of these networks sorta only worthwhile from a branding perspective?

As direct traffic portals, it's hard to imagine a Facebook/Twitter user being as engaged in the buying/researching process as a Google searcher. Those companies may launch products that compete with Google's model or intent, but as they exist today, I don't foresee them being a direct sales channel. They're great for traffic, branding, recognition and ad-revenue model sites, but they're of little threat to marketers concerned with the relevance or value of search disappearing.

What are the major differences between LDA & LSI?

They're both methodologies for building a vector space model of terms/phrases and measuring the distance between them as a way to find more "relevant" content. My understanding is that LSI, which was first developed in 1988, has lots of scaling issues. Its cousin, PLSI (probabilistic LSI), attempted to address some of those when it came out in 1999, but still has scaling problems (the Internet is really big!) and often will bias to more complex solutions when a basic one is the right choice.

LDA (Latent Dirichlet Allocation), which started in 2002, is a more scalable (though still imperfect) system with the same intuition and goals - it attempts to mathematically show distances between concepts and words. All of the major search engines have lots of employees who've studied this in university and many folks at Google have written papers and publications on LDA. Our understanding is that it's almost universally preferred to LSI/PLSI as a methodology for vector space models, but it's also very likely that Google's gone above and beyond this work, perhaps substantially.
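For readers who want to poke at the difference themselves, here's a minimal sketch using scikit-learn on a tiny toy corpus (nothing like the scale Rand describes): LSA via truncated SVD and LDA side by side.

```python
# Toy comparison of LSA (truncated SVD) and LDA on a tiny corpus.
# Illustrative only - real topic models are trained on millions of documents.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import TruncatedSVD, LatentDirichletAllocation

docs = [
    "office supplies printer paper staples",
    "student credit card interest rates",
    "cheap printer ink and paper deals",
    "credit card rewards and low rates",
]

counts = CountVectorizer().fit(docs)
X = counts.transform(docs)

lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(X)

print(lsa)  # documents as points in a 2-dimensional latent space
print(lda)  # documents as probability distributions over 2 topics
```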

The "brand" update was subsequently described as being due to looking at search query chains. In a Wired article Amit Singhal also highlighted how Google looks for entities in their bi-gram breakage process & how search query sequences often help them figure out such relationships. How were you guys able to build a similar database without access to the search sessions, or were you able to purchase search data?

In a vector space model for a search function, the distances and datasets leverage the corpus rather than query logs. Essentially, with LDA (or LSI or even TF*IDF), you want to be able to calculate relevance before you ever serve up your first search query. Our LDA work and the LDA tool in labs today use a corpus of about 8 million documents (from Wikipedia). Google's would almost certainly use their web index (or portions of it).

It's certainly possible that query data is also leveraged for a similar purpose (though due to how people search - with short terms and phrases rather than long, connected groups of words - it's probably in a different way). This might even be something that helps extend their competitive advantage (given their domination of market share).
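A hedged sketch of the corpus-first idea: score document relevance to a query with TF-IDF cosine similarity, computed entirely from the documents with no query logs (toy data, not the Linkscape/LDA tool itself):

```python
# Corpus-based relevance scoring: TF-IDF vectors built from documents alone,
# then a query scored by cosine similarity. Toy example, not SEOmoz's tool.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "guide to search engine optimization basics",
    "chicken marsala recipe with mushrooms",
    "link building strategies for seo campaigns",
]

vectorizer = TfidfVectorizer().fit(corpus)
doc_vectors = vectorizer.transform(corpus)

query_vector = vectorizer.transform(["seo link building"])
scores = cosine_similarity(query_vector, doc_vectors)[0]

for doc, score in sorted(zip(corpus, scores), key=lambda x: -x[1]):
    print(f"{score:.2f}  {doc}")
```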

Sometimes one can see Google's ontology change over time (based on sharp ranking increases and drops for outlier pages which target related keywords but not the core keyword, or when search results for 2 similar keywords keep bouncing between showing the exact same results to showing vastly different results). How do you guys account for these sorts of changes?

Thus far, we haven't been changing the model - it just launched last week. However, one nice thing we get to do consistently is to run our models against Google's search results. Thus, if Google does change, our scores (and eventually, the recommendations we hope to make) should change as well. This is the nice part about not having to "beat" Google in relevance (as a competing search engine might want to do) but simply to determine where Google's at today.

For a long time one of the things I have loathed most in the SEO space was clunky all-in-one desktop tools that often misguide you into trying to change your keyword density on the word "the" and other such idiocy. Part of the reason we have spent thousands of dollars offering free Firefox extensions was my disgust toward a lot of those all-in-one tools. A lot of the best SEOs tend to prefer a roll-your-own mix and match approach to SEO. Recently you launched a web application which aims to sorta do all-in-one. What were the key things you felt you had to get right with it to make it better than the desktop software so many loathe?

I think our impetus for building the web app was taken from the way software has evolved in nearly every other web marketing vertical. In online surveys, you had one-off, self-built systems, and folks like Wufoo and SurveyMonkey have done a great job making that a consolidated, simple, powerful software experience. That goes for lots of others like:

  • PPC - Google has really taken the cake here with Adwords integration and the launch of Optimizer and even GA
  • CRM - Salesforce, of course, was the original "all-in-one" web marketing software, and they've shown what a remarkable company you can build with that model. InfusionSoft and other players are now quickly building great businesses, too.
  • Email Marketing - Exact Target, Constant Contact, Mailchimp, MyEmma, iContact and many more have built tens-hundreds of millions of dollar/year businesses with "all-in-one" software for handling email marketing.
  • Banner Ads - platforms like Aquantive, DoubleClick, AdReady, etc. have and are building scalable solutions that drive billions in online advertising
  • Analytics - remember when we had one-off, log file analysis tools and analytics consultants who built their own tools to dig into your data? Those consultants are still here, but they're now armed with much more powerful tools - Google Analytics, Omniture, Webtrends, etc. (and new players like KISS Metrics, too)

You're likely spot-on in thinking that power players will continue to mash up and hack their own solutions, build their own tools and protect their secret processes to make them more exclusive in the market and (hopefully) competitive. But, these folks are on the far edge of the bell curve. In every one of the industries above (and many others), it looks like the way to build a scalable software product that many, many people adopt, use and love is to optimize for the middle to upper end of the bell curve (what we'd probably call "intermediate" to "advanced" SEOs, rather than the outlier experts).

When you gather ranking data do you use APIs to do so? If not, how hard has it been on the technical front scaling up to that level of data extraction?

Some data we can get through APIs, but most isn't available in that fashion, so relatively robust networks are required to effectively get the information. Luckily, we've got a pretty terrific team of engineers and a VP of Engineering who's done data extraction work previously for Amazon, Microsoft and others. I'd certainly say that it ranks in the top 10 technical challenges we've faced, but probably not the top 3.

What do you gain by doing the all-in-one approach that a roll your own type misses out on?

Convenience, consistency, UI/UX, user-friendliness and scalability are all big gains. However, the compromise is that you may lose some of that "secret-sauce" feeling and the power that comes from handling any weird situation or result in a hands-on, one-to-one fashion. Plenty of folks using our web app have already pointed out edge-case scenarios where we're probably not taking the ideal approach, and those kinks will take time to be ironed out.

Some firms use predictive analytics to automatically change page titles & other attributes on the fly. Do you see much risk to that approach? Do you eventually see SEO companies offering CMS tools as part of their packages to lock in customers, while integrating the SEO process at a much deeper level?

When we were out pitching to take venture capital last summer, a lot of VCs felt that this was the way to go and that we should have products on this front.

Personally, I don't like it, and I'd be surprised if it worked. Here's why:

  • Editors/writers should be responsible for content, not machine-generated systems built to optimize for search engines. Yes, those machine systems can and should make recommendations, but I fear for the future of your content and usability should "perfect SEO" be the driving force behind every word and phrase on your site.
  • With links being such a powerful signal, it's far better to have a slightly less well-targeted page that people actually want to link to than a "perfect" page that reads like machine-generated content.
  • I think content creators who take pride in their work are the ones who'll be better rewarded by the engines (at least in the long term - hopefully your crusade against Demand Media, et al. will help with that), and those are the same type of creators who won't permit a system like this to automatically change their content based on algorithmic evaluation.

There are cases I could see where something like this would be pretty awesome, though - e.g. a 404 detector that automatically 301s pages it sees earning real links back to the page it thinks was the most likely intended target.
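A rough sketch of that idea (hypothetical URLs; a real system would also weight by the links each dead URL has actually earned): match each 404ing path to its closest live page and suggest a 301.

```python
# Sketch of the "404 detector" idea: map dead URLs that have earned links to
# the most similar live URL and suggest a 301. Hypothetical paths only.
from difflib import get_close_matches

live_urls = [
    "/blog/link-building-guide",
    "/blog/keyword-research-basics",
    "/tools/rank-tracker",
]

dead_urls_with_links = [
    "/blog/link-building-guid",   # typo'd inbound link
    "/tools/rank-traker/",
]

for dead in dead_urls_with_links:
    match = get_close_matches(dead.rstrip("/"), live_urls, n=1, cutoff=0.6)
    if match:
        print(f"301 {dead} -> {match[0]}")
    else:
        print(f"no confident target for {dead}; leave as 404")
```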

On your blog recently there was a big fuss after you changed your domain authority modeling scores. Were you surprised by that backlash? What caused such a drastic change to your scores?

We were surprised only until we realized that somehow, our internal testing missed some pretty obvious boneheaded scores.

Basically, we calculate DA and PA using machine learning models. When those models find better "correlated" results, we put them in the system and build new scores. Unfortunately, in the late August release, the models had much better average correlation but some really terrifically bad outliers (lots of junky single-page keyword-match domains got DAs of 100 for example).

We just rolled out updated scores (far ahead of our expected schedule - we thought it would take weeks), and they look much better. We're always open to feedback, though!
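The general shape of that approach, sketched with scikit-learn (the features and rankings are invented; SEOmoz's actual models and feature set aren't public in this detail): fit a model to link metrics, then check how well its scores correlate with observed rankings.

```python
# Rough sketch of training a ranking-correlation model on link metrics.
# Features and rankings are invented; this is not SEOmoz's actual DA/PA model.
import numpy as np
from scipy.stats import spearmanr
from sklearn.ensemble import GradientBoostingRegressor

# Each row: [linking root domains, total links, mozRank-like score] (made up)
X = np.array([[1200, 54000, 6.1],
              [300, 9000, 4.8],
              [45, 800, 3.2],
              [5000, 210000, 7.4],
              [10, 150, 2.1]])
# Observed rank positions for a benchmark query set (made up; 1 = best)
y = np.array([2, 3, 4, 1, 5])

model = GradientBoostingRegressor(random_state=0).fit(X, y)
predicted = model.predict(X)

# Average correlation between model scores and observed rankings; note that a
# good average can still hide terrible individual outliers, as described above.
print(spearmanr(predicted, y).correlation)
```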

When I got into SEO (and for the first couple years) it seemed like you could analyze a person's top backlinks and then literally just go out and duplicate most of them fairly easily. Since then people have become more aware of SEO, Google has cracked down on paid links, etc. etc. etc. Based on that, a lot of my approach to SEO has moved away from analysis and more toward just trying to do creative marketing & hope some % of it sticks. Do you view data as being a bit of a sacred cow, or more of just a rough starting point to build from? How has your perception as to the value of data & approach to SEO changed over time?

I think your approach is almost exactly the same as mine. The data about links, on-page, social stats, topic models, etc. is great for the analysis process, but it's much harder to simply say "OK, I'll just do what they did and then get one more link," than it was when we started out.

That analysis and ongoing metrics tracking is still super-valuable, IMO, because it helps define the distance between you and the leaders and gives critical insight into making the right strategic/tactical decisions. It's also great to determine whether you're making progress or not. But, yes, I'd agree that it's nowhere near as cut-and-dried as it once was.

The frustrating part for us at SEOmoz is we feel like we're only now producing/providing enough data to be good at these. I wish that 6-7 years ago, we'd been able to do it (of course, it would have cost a lot more back then, and the market probably wasn't mature enough to support our current business model).

How much time do you suggest people should spend analyzing data vs implementing strategies? What are some of the biggest & easiest wins often found in the data?

I think that's actually the big win with the web app (or with competitive software products like Raven, Conductor, Brightedge, etc). You can spend a lot less time on the collection/analysis of data and a lot more on taking the problems/opportunities identified and doing the real work of solving those issues.

Big wins in our new web app for me have been ID'ing pages through the weekly crawl that need obvious fixing (404s and 500s are included, like Google Webmaster Tools, but so are 20+ other data points they don't show, like 302s, incorrect rel canonicals, etc.).
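A minimal sketch of that kind of crawl check (hypothetical URLs; a real crawler handles robots.txt, rate limiting, and many more issue types): flag error status codes, 302 hops, and canonical tags pointing somewhere other than the crawled URL.

```python
# Tiny crawl-diagnostics sketch: flag 4xx/5xx responses, 302 redirects, and
# canonical tags that point elsewhere. Hypothetical URLs; simplified regex.
import re
import requests

PAGES = ["https://example.com/", "https://example.com/old-page"]

CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']', re.I)

for url in PAGES:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code >= 400:
        print(f"{url}: broken ({resp.status_code})")
    elif resp.status_code == 302:
        print(f"{url}: temporary redirect to {resp.headers.get('Location')}")
    elif resp.status_code == 200:
        match = CANONICAL_RE.search(resp.text)
        if match and match.group(1).rstrip("/") != url.rstrip("/"):
            print(f"{url}: canonical points elsewhere -> {match.group(1)}")
```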

Blekko has got a lot of good press by sharing their ranking models & link data. Their biggest downside so far in their beta is the limited size of their index, which is perhaps due to a cost-benefit analysis, and they will expand their index size before they publicly launch. In some areas of the web Google crawls & indexes more than I would expect, while not going too deeply into others. Do you try to track Google's crawls in any way? How do you manage your crawl to try to get the deep stuff Google has while not getting the deep stuff that Google doesn't have?

Yeah - we definitely map our crawls against Google, Bing and Majestic on a semi-regular basis. I can give you a general sense of how we see ourselves performing against these:

  • Google - the freshest and most "complete" (without including much spam/junk) of the indices. A given Linkscape index is likely around 40-60% of the Google index in a similar timeframe, but we tend to do pretty well on coverage of domains and well-linked-to pages, though worse on deep crawling in big sites.
  • Bing - they've got a large index like Google, but we actually seem to beat them in freshness for many of the less popular corners of the web (though they're still much faster about catching popular news/blogs/etc from trusted sources since they update multiple times daily vs. our once-per-month updates).
  • Majestic - dramatically larger in number of URLs than Google, Bing or Linkscape, but not as good as any of those about freshness or canonicalization (we'll often see hundreds of URLs in the index that are essentially the same page with weird URL parameters). We like a lot of their features and certainly their size is enviable, but we're probably not going to move to a model of continuous additions rather than set updates (unless we get a lot more bandwidth/processing power at dramatically lower rates).


The problem with maintaining old URLs became clearer when we analyzed decay on the WWW

In terms of reaching the deep corners of the web, we've generally found that limiting spam and "thin" content is the big problem at those ends of the spectrum. Just as email traffic is estimated to be 90%+ spam, it's quite possible that the web, if every page were truly crawled and included, would have similar proportions. Our big steps to help this are using metrics like mozTrust, mozRank and some of our PA/DA work to help guide the crawl. As we scale up index size (probably December/January of this year), that will likely become a bigger challenge.

---

Thanks Rand. You can read his latest thoughts on the SEOmoz blog and follow him on Twitter at @randfish.

Universal Truth Of Selling On The Web: Easy & Simple Wins

The following is a guest post by Jim Kukral.

Google knows this. Now you do as well. Easy always wins. Take a moment and picture your website or your blog or your product or service in your head right now. Now, think of Google's. Which one is easier? No, you're not a search engine - you're probably a small business owner with a variety of products or services, an entrepreneur with a business idea, or a blogger. But the comparison remains because, regardless of what it is you do, easy will always win.

So keep thinking about your Web business. Is what you're selling easy to buy? By that I mean: when somebody comes to buy from you, or to simply get information from you like a phone number or to download a white paper... is it easy to do? Or are you making it too hard?

Picture Google.com again in your head. It's pretty darn easy, no? There's a logo and a big input box underneath it. You put in what you're looking to find, hit search and boom, you find it. Easy. Google understands that customers use them for one reason, to have a problem solved, and therefore, that's what they deliver, without all the frills that other search portals like AOL or Yahoo! try to offer.

Your opportunity right now is to figure out the main one or two reasons people visit your website, because despite what you might think, your customers probably have only those one or two things on their mind when they visit you.

If you visit the home page of Orbitz.com, you're probably there to do one of a few things only. Book a flight, find a car, or make a hotel reservation. Possibly all three at once. But honestly, that's pretty much it, right? I would bet that 99% of their traffic is trying to do one of those things. The same goes for you and your website, blog, membership site or anything you produce online.

What exactly are your customers looking for? You need to find out and find out right now! Check your analytics (I recommend Google Analytics, it's free! www.Google.com/Analytics) to find out things like the most viewed pages of your website, as well as the most exited pages. You may find out that 90% of your visitors are focusing on the free white paper download page and ignoring the other pages you thought were important. That's great news! Now, you at least know what your customers want. And now you can make it easier for them to get it. You may also find out that a large percentage of your visitors always leave your website on one specific page, giving you the insight that perhaps they aren't finding what they're looking for, getting frustrated, and surfing away. That's bad.

So what should you do with that knowledge to make things easier for your visitor, and better for your business? If you're getting a lot of traffic to your free white paper download, go ahead and take that download information and make it stand out on your home page. If done right, you'll make it as easy as possible for your visitors to get what they were looking for, and you’ll see even more downloads, and happier visitors because you didn’t make them work so hard.

Now, you may also find out that the page you really wanted your visitors to see is not being viewed enough. This could be the specials page on your e-commerce site, or the packages page on your consulting site, or maybe your customer support contact information page. Whatever it may be, once you know what it is, that page obviously needs to be viewed more, and while you can't force it down your visitors' digital throats, you can redesign your page so that it limits the other choices that can distract your visitor.

Make it easy and simple, then win!

For over 15 years, Jim Kukral has helped small businesses and large companies like Fedex, Sherwin Williams, Ernst & Young and Progressive Auto Insurance understand how to find success on the Web. Jim is the author of the book, "Attention! This Book Will Make You Money", as well as a professional speaker, blogger and Web business consultant. Find out more by visiting www.JimKukral.com. You can also follow Jim on Twitter @JimKukral.

Labor Day = Yeah

When you think of labor day what comes to mind? For me it is these 2 thoughts:

  • lower earnings because few people are online today
  • since almost nobody is online, any hours worked today are me getting ahead of the market ;)

Working hard & working long hours can almost be a disease...the web makes it easy to be addicted.

But for every person who is putting in hard work trying to help people there is another person selling image.

The big issue with the image game is the risks. As the lies pile up they corner people into a bad situation, to where they can (and do) lose everything.

If I had to take a single point of reference to help a stranger judge the difference between a hack and someone who wants to honestly help people, I would say it is this: do they encourage you to take on debt?

  • If they do then there is a good chance they are the type of person who will go out of their way to screw you.
  • If they do not then they are likely not a maximizer type (because if they were then they would be encouraging you to go into debt to sell you more stuff).

It is not that all debt is evil (when I got started online I was naive enough to start on a credit card), but life and markets are unpredictable. If I hadn't been smart enough to get a job to cover my 6 or so months of education before going full time online, who knows where I would be now. What seems like a short term gain can lead to long-term failure. We are human, and so we are flawed. When you have debt/leverage you have no spare parts. So if something goes wrong you are done. Nassim Taleb spoke about the importance of savings and diversity of revenues as keys to survival, while noting that the very structure of our public markets encourages risk + leverage (options encourage short term performance & volatility rather than sustained growth, and you hope the guy on the next watch is stuck holding the Madoff Ponzi bag).

The falls of past empires have typically been preceded by rapid inflation in food costs. Our food supply, like most other aspects of modern day life, has been so extended as to be poisonous. Fish soaked in chemicals literally change sex back and forth, and shrimp in the ocean (with traces of Prozac) swim toward the light - where they get eaten.

It's not about fixing the conversation. It's about filling in the blanks. If people are prone to click on something, that is exactly what they will get, even if it is not something they want.

We misinform kids about sex in a way that can screw up the rest of their lives. Data is collected against people's will so that they may be stalked and harassed. If you once thought you were fat, long after becoming anorexic there will still be ads reminding you how fat you are, following you around the web.

When bits of culture die, the life lessons wrapped in them fade as well. Sure there may be HTML codes for emotions, but (beyond ad targeting) it is hard to reduce people to numbers.

Is the push toward homogenization to increase yield and chasing the lowest common denominator making people happier or more miserable?

I realize that reading the above can quickly make me sound like some ultra left-wing hippie, but the point of this post is not a political one ... rather one on the basic rule of law.

We justify (or downplay) harming ourselves, our environments, and the environments of other animals so we can have more and better. But to do this we often take on debt and leverage and put ourselves in precarious situations. Worse yet, we often have *others* decide to take on leverage for us, without our desire or permission.

Why is it that the government is giving Google tax credits to build more low income housing while the Federal Reserve is sitting on over $1 trillion in bad mortgage paper? How can the government want to make housing cheaper / more affordable while simultaneously propping up (and thus ensuring overvaluation of) virtually the whole of the market? How can the government taking both sides of the same bet lead to anything but waste, fraud & abuse?

If you believe in efficient market theory then banking should represent a small portion of the profit pool (since banks are all dealing in the same commodity of cash). And yet the banking class keeps representing a growing portion of the profits, while the bad sides of their trades (the losses) are passed on to tax payers.

I don't mind someone else levering up with risk so long as they have to pay the consequences of their failures. But capitalism without failure is like religion without sin.

These banks threatened tanks in the street if they didn't get their bailouts.

They went so far as to say even auditing the Federal Reserve would threaten the financial system. Sorry, um, but that is exactly what the banking class did. If they are not punished for committing crimes then the lawlessness will only grow more extreme, as it has.

When the bubble popped some of these scammers, charlatans, shysters, swindlers, and tricksters claimed that "nobody saw it coming," but in fact as things started to go wrong these folks leaned into it and made it worse.

Rather than having CDOs go unsold they engaged in self-dealing & kept mixing the bad chunks in, sorta like making new sausage out of old sausage. They knew what they were doing. They intended to commit fraud:

"On paper, the risky stuff was gone, held by new independent CDOs. In reality, however, the banks were buying their own otherwise unsellable assets."
...
"One rival investment banker says Merrill treated CDO managers the way Henry Ford treated his Model T customers: You can have any color you want, as long as it's black."

It's Labor Day. The criminal bankers who ripped you off in the past, who are currently ripping you off with more crimes, and who will rip your children off are stealing your labor. And since neither political party cares to stop it, it's up to you how much you want to give...there is no end to how much they would love to take. Time for me to take a break. ;)

Are You Thinking Like Google?

No, not like that, but in the good way! :D

The following is a guest post by Jim Kukral highlighting one of the most fundamental tips to succeeding online.

Have you ever really taken a step back from all the technical SEO stuff and thought about why Google wins? The real reasons why they have mass-market share and why they continue to dominate? It's time you should, because once you understand how to start thinking like Google, you can finally begin to go beyond just ranking better and learn how to be a master Internet marketer, so you can get more sales, leads and publicity.

After all, once you've been found, you now have to convert. Otherwise, it's a waste of time.

So why does Google win? Because Google is the world's biggest, and best, problem solver. The truth is that there are only two reasons why we all go online, using Google or not. Those two reasons are:

1. To have a problem solved
2. To be entertained

That's it. Everything, and I mean everything you do online falls under one of those categories. For example, let's say you're planning on cooking your wife her favorite chicken marsala dish for your anniversary. You go online and do a search for "chicken marsala recipes". Boom, you now have recipes, and videos, and images and cookbooks and all kinds of information to help you solve your problem.

As another example, let's say you wanted to relax after work and watch your favorite musician play some of your favorite songs. You go to YouTube and do a search for "Rolling Stones Videos" and boom, you're now watching video content that entertains you.

YouTube, which is owned by Google, is already the number two search engine on the Internet (behind Google of course). That means that today billions of people are actively searching the Internet for video content. That also means that, because of the public's fast-growing, massive hunger for content in video form, regular people and businesses alike are now able to profit from the creation of that video content.

The truth is, Google (and your business) has to solve problems for their (your) customers, the Internet searcher. If they (you) can't do that, they (you) lose customers. It's that black and white.

So I'll ask you again. Are you thinking like Google? Have you sat down and figured out what your target audience's biggest problems are? If you haven't done that you need to do it now. Anticipate what they need. Figure out their pain and then create products/services that take that pain away.

Just like Google.

For over 15 years, Jim Kukral has helped small businesses and large companies like Fedex, Sherwin Williams, Ernst & Young and Progressive Auto Insurance understand how to find success on the Web. Jim is the author of the book, "Attention! This Book Will Make You Money", as well as a professional speaker, blogger and Web business consultant. Find out more by visiting www.JimKukral.com. You can also follow Jim on Twitter @JimKukral.

How To Write Good

Yes, deliberate mistake :)

It grates when people write poorly, huh. When writers write well, the words almost become invisible. The focus shifts away from technical details, and onto the message.

Is there an easy way to write better blog posts? E-mails? Web copy?
Let's take a look at three guidelines for web writing.

1. If You Can Say It, You Can Write It

The Dilbert Mission Statement Generator - sadly now offline - comes up with convoluted gems like this:

"Our challenge is to assertively network economically sound methods of empowerment so that we may continually negotiate performance based infrastructures"

Satire, one would hope.

However, the US Air Force uses the following mission statement:

"The mission of the United States Air Force is to deliver sovereign options for the defense of the United States of America and its global interests - to fly and fight in Air, Space, and Cyberspace"

"Deliver sovereign options"?

Who talks like this? Well, apart from the US military.

Nobody.

Good web writing is the same as good spoken language. Use short sentences, short words, simple structures and a natural, predictable flow of ideas. Avoid waffle, hyperbole and words that hide meaning. Whenever you finish a piece of writing, read it aloud. Cut or rephrase phrases that sound clunky, because they'll read clunky, too.

Your writing will sound warm and human.

The human voice is especially important online. Communicating at a distance, particularly two-way communication, is relatively new to humans. To help people connect with one another more easily, it pays to write in a warm, conversational style that mimics personal conversation when conducted in close, physical proximity.

When you think about how you would say something, especially to a specific person, you choose words, expressions and structures based on that personal context. Try to imagine that person in front of you as you write.

This approach works well for all applications - from formal legal sites, to personal sites.

2. Planning

Planning what you're going to say helps you to complete any writing task more quickly and easily.

  • 1. Identify and list your goals. What is the message? What is the desired action you want your reader to take? What is the key thought you want your reader to take away?

    For example, a goal list might look like this:

    *inform people the last project went well, even though there were problems
    *highlight the good aspects about the project
    *highlight the problems
    *present ideas on how these problems can be overcome in the next project
    *get everyone revved up and excited about the next project

  • 2. Think about the audience. Who is your audience? What do you know about the person or group?
  • 3. Determine the right tone and format based on your answers to 1 & 2.
  • 4. Write quickly. Don't edit, even if your writing is a mess. Separate out your writing and editing functions.
  • 5. Draw a solid conclusion. Calls to action work well.
  • 6. Read aloud what you've written. Cut, fix and tighten. Writing comes alive in the rewrite.

Solid blog posts sound spontaneous, but they're not. They're often structured, worked and reworked.

3. Hyperbole Doesn't Work On The Web

Hyperbole means extreme exaggeration, e.g. "All the perfumes of Arabia could not sweeten this little hand". Web readers tend to gloss over the flowery and the convoluted.

On the web, people scan, so the shape of your writing - how it appears on the page - can be just as important as what you say. So think about the shape and form of your writing. Can you use bullets, headings and images to break up large blocks of text? Sometimes, the best thing to do is not write at all. Can an image convey your message? If so, use it.

Also consider context. When visitors arrive on a page, a page deep within your site, do they know what your site is about from glancing at that one page? If not, consider using chunks of content to provide context. These chunks of information can be repeated on every page of your site, and should be self explanatory. Think directory entry. Your repeat visitors will become blind to it, but your first time readers will appreciate it.

We could go on all day about web writing. However, we'd like to hear your tips. How do you approach writing on your site? Do you plan? Do you wing it? What style of writing gets the best results?

Selling SEO Services: A Consultative Approach

Does the thought of selling fill you with dread?

If you see yourself as a technologist, or marketer, then selling may not come easy to you. But we all need to sell something, even if it is just our opinion! If you're a consultant of any description, it comes with the territory.

So it pays to know a few techniques. Luckily, sales isn't something you have to be born to do - it does not require supernatural charm, charisma, a hide as thick as an elephant's, or superhuman drive.

Selling can be like a doctor's consultation.

A Visit To The Doctor

When you go to the doctor, do you expect the doctor to just guess what is wrong with you?

A doctor's consultation involves the doctor asking you a series of questions. This questioning helps determine what the problem is, and how it can best be solved. At the end of the process, the feeling is probably one of relief and assurance, i.e. that the doctor has your best interests at heart, and will cure what ails you.

It's the same in business.

Any client you encounter has a problem. Like a specialist doctor, it is your job to ask a series of questions to help nail down the problem and find a solution. The very act of questioning - known as consultative selling - helps build trust and rapport with the client in the same way you may experience with a doctor. This works especially well in the field of consulting, which is based on information sharing.

The emphasis is on the client's needs, as opposed to getting a signature on the dotted line. You first establish a client's needs, then you provide a solution, if you have one. You're building a relationship, based on trust, by asking a series of questions.

Not so hard, really.

The Mechanics Of Consultative Selling

Ok, so how do you do it?

First, you need to understand the buyer's buying process. You then match your selling process to their buying process.

All buyers go through a specific process. For example, if a company needs internet marketing services, do they go to their established provider - possibly the web design company who built their site - or do they go direct to the SEO market? Do they attend conferences? If so, which ones? Hint: they may not be SEO conferences. Do they ask other business people in their business network? Do they go with a known brand?

It's pretty simple to determine the buying process if the buyer comes straight to your website, fills out the contact form, and requests a call-back. But life often doesn't work that way.

A prospective client may ask their web design company. Their web design company may not have had a clue, had you not been in to see them a week earlier. You asked the web design people a few questions about whether they had an SEO capability in house, found out they didn't, and found out they had a lot of clients who quite possibly needed SEO. You proposed a joint deal whereby they would refer their clients to you for a 10% commission.

Try to find out how your prospective clients buy SEO services, and position yourself accordingly. Think business associations and clubs, their existing providers in related areas, and the other companies they have an association with.

You need to get yourself positioned correctly in their buying process.

If you've managed to get in front of them, you then need to think about the questions you are going to ask. You should be asking about their business, where they see it going, what problems they are having, their place in the market, and their competitors. Business owners typically like doing this, and will welcome your interest, so long as you're seen as a "doctor", i.e. someone they trust to help. You'll also need to make a presentation, which, depending on the context, need not be formal. It could consist of showing them case studies of how you've helped solve this problem before. Let's face it, most SEO/SEM problems and solutions are going to look pretty much the same.

It's all about trust relationships. It's a fact of life that people buy more readily from people they trust.

But how do you know if you can trust your prospective buyer?

Screening Buyers

Consultative selling is also a great way to screen out tire kickers. A person who is just pumping you for information will reveal very little about themselves. The conversation will be one sided.

If they are genuinely interested in your service, they are more likely to answer questions. They do have to trust you first in order to do this, so try to think like a doctor if you encounter resistance, i.e. "I want to help you get more traffic, but I can't devise an appropriate solution unless I know more about your business".

Be prepared to walk if they don't volunteer the information you need. Even if you did land the job, you may end up providing a substandard solution to their problem, which will likely end in tears. Better to find clients who you can work with, rather than against.

Another method of screening is to pre-close the sale. When you are gathering needs, ask whether, if you can solve their problems to their complete satisfaction as a result of this discussion, they will buy your services.

This will sound to them like a fairly safe bet, i.e. you have to propose something that solves their problem. However, it also creates an implied obligation on their part to buy. There is no risk on your side: either you can solve the problem, in which case you'll likely get the business, or you can't, in which case you'll walk anyway.

If they are hesitant, it is either an opportunity to walk, and thus stop wasting your time, or an opportunity to find out something more about their buying process.

In short, when thinking about sales:

  • You are not a salesperson. You are a "doctor"
  • Focus on the needs of the client, not landing the job. Sales hucksters typically focus on the close too soon, which can destroy trust
  • It's ok to walk away. You won't be able to help some clients
  • Insist that the client engage in conversation. A client who asks you questions, and volunteers little information, might be pumping you for information

These consultative sales techniques are covered in various sales theory books. Check out "Consultative Selling" by Mack Hanan, "The Sticking Point Solution" by Jay Abraham, and "Stop Telling, Start Selling: How to Use Customer-Focused Dialogue to Close Sales" by Linda Richardson.

How Many Companies Has Google Bought?

One of the best ways to track Google's strategies is through visualizing & analyzing their acquisitions, which is what the following image helps you do. Click on it for the full enlarged version :)


via Scores

Alexa Site Audit Review

Alexa Logo

Alexa, a free and well-known website information tool, recently released a paid service.

For $199 per site Alexa will audit your site (up to 10,000 pages) and return a variety of different on-page reports relating to your SEO efforts.

It has a few off-page data points but it focuses mostly on your on-page optimization.

Alexa Site Audit Review Homepage

You can access Alexa's Site Audit Report here:

http://www.alexa.com/siteaudit

Report Sections

Alexa's Site Audit Report breaks the information down into 6 different sections (some of which have additional sub-sections as well):

  • Overview
  • Crawl Coverage
  • Reputation
  • Page Optimization
  • Keywords
  • Stats

The sections break down as follows:

Site Audit sections and subsections

So we ran Seobook.com through the tool to test it out :)

Generally these reports take about a day or two; ours had some type of processing error, so it took about a week.

Overview

The first section shows the number of pages crawled, followed by 3 "critical" aspects of the site (Crawl Coverage, Reputation, and Page Optimization). All three have their own report sections as well. Looks like we got an 88. Excuse me, but shouldn't that be a B+? :)

So it looks like we did just fine on Crawl Coverage and Reputation, but have some work to do with Page Optimization.

Alexa Site Audit Overview

The next section on the overview page is 5 recommendations on how to improve your site, with links to the relevant report sections as well. At the bottom you can scroll to the next page or use the side navigation. We'll investigate these report sections individually, but I think the overview page is helpful for getting a high-level view of what's going on with the site.

Alexa Site Audit Overview

Crawl Coverage

This measures the "crawl-ability" of the site: internal links, your robots.txt file, and any redirects or server errors.

Reachability

The Reachability report shows you a breakdown of which HTML pages were easy to reach versus which ones were not so easy to reach. Essentially, for our site, the breakdown is:

  • Easy to find - 4 or fewer links a crawler must follow to get to a page
  • Hard to find - more than 4 links a crawler must follow to get to a page

The calculation is based on the following method Alexa uses to determine the optimal path length for your site:

Our calculation of the optimal path length is based on the total number of pages on your site and a consideration of the number of clicks required to reach each page. Because optimally available sites tend to have a fan-out factor of at least ten unique links per page, our calculation is based on that model. When your site falls short of that minimum fan-out factor, crawlers will be less likely to index all of the pages on your site.

Alexa Site Audit Reachability Report

A neat feature in this report is the ability to download your URLs, plus the number of links the crawler had to follow to find each page, in CSV format.

Alexa Site Audit Reachability Report Download Links

This is a useful feature for mid-to-large sites. You can get a decent handle on internal linking issues that may be affecting how relevant a search engine considers a particular page to be. This report can also spot weaknesses in your site's linking architecture from a usability standpoint.
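
If you want to sanity-check click depth yourself, the underlying idea is simple to sketch. Below is a minimal, hypothetical Python example: it assumes you already have a map of which pages link to which (the `site` dict here is made up) and does a breadth-first walk from the homepage, using the same four-click threshold as the easy/hard split above.

    from collections import deque

    def click_depths(links, start="/"):
        # Breadth-first search over an internal-link graph.
        # `links` maps each page to the pages it links to; the result maps
        # each reachable page to the minimum number of clicks from `start`.
        depths = {start: 0}
        queue = deque([start])
        while queue:
            page = queue.popleft()
            for target in links.get(page, []):
                if target not in depths:
                    depths[target] = depths[page] + 1
                    queue.append(target)
        return depths

    # Made-up site structure, purely for illustration.
    site = {
        "/": ["/blog", "/tools"],
        "/blog": ["/blog/post-1"],
        "/blog/post-1": ["/blog/archive"],
        "/blog/archive": ["/blog/old-post"],
        "/tools": [],
    }

    for page, depth in sorted(click_depths(site).items(), key=lambda x: x[1]):
        bucket = "easy" if depth <= 4 else "hard"
        print(page, depth, bucket)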

On-Site Links

While external links from unique domains are typically a stronger component of ranking a site, it is important to have a strong internal linking plan as well. Internal links are important in a few ways:

  • They are the only links where you can 100% control the anchor text (other than links from your own sites, of course, or sites owned by your friends)
  • They can help you flow link equity to pages on your site that need an extra bit of juice to rank
  • Users will appreciate a logical, clear internal navigation structure and you can use internal linking to get them to where you want them to go

Alexa will show you your top linked to (from internal links) pages:

Onsite Links Alexa Site Audit

You can also click the link to the right to expand and see the top ten pages that link to that page:

Expanded Onsite Links Report

So if you are having problems trying to rank some sub-pages for core or long-tail keywords, you can check the internal link counts (and see the top 10 linking pages) to see if something is amiss with the internal linking structure for a particular page.
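
You can get a rough version of the same count from your own crawl data. A small sketch, with made-up page names, that tallies how many distinct pages link to each URL:

    from collections import Counter

    # `links` maps each crawled page to the internal pages it links to (made-up data).
    links = {
        "/": ["/blog", "/tools", "/blog"],
        "/blog": ["/", "/tools"],
        "/tools": ["/"],
    }

    counts = Counter()
    for source, targets in links.items():
        for target in set(targets):   # count each linking page only once
            counts[target] += 1

    for page, n in counts.most_common():
        print(page, "linked from", n, "internal page(s)")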

Robots.txt

Here you'll see if you've restricted access to these search engine crawlers:

  • ia_archiver (Alexa)
  • googlebot (Google)
  • teoma (Ask)
  • msnbot (Bing)
  • slurp (Yahoo)
  • baiduspider (Baidu)

Site Audit Robots.Txt

If you block out registration areas or other areas that are normally restricted, the report will say that you are not blocking major crawlers, but will show you the URLs you are blocking in that part of the report.

There is nothing groundbreaking about robots.txt checks, but it's another part of a site you should look at when doing an SEO review, so it is a helpful piece of information.
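
You can run the same check yourself with Python's standard library. A minimal sketch, assuming your robots.txt lives at the usual location (the domain here is a placeholder):

    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser("http://www.example.com/robots.txt")
    parser.read()

    # The same crawlers the Alexa report checks.
    for agent in ["ia_archiver", "googlebot", "teoma", "msnbot", "slurp", "baiduspider"]:
        allowed = parser.can_fetch(agent, "http://www.example.com/")
        print(agent, "allowed" if allowed else "blocked")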

Redirects

We all know what happens when redirects go bad on a mid-large sized site :)

Redirects Gone Bad

This report will show you what percentage of your crawled pages are being redirected to other pages with temporary redirects.

The thing with temporary redirects like 302s is that, unlike 301s, they do not pass any link juice, so you should pay attention to this part of the report and check whether any key pages are being redirected improperly.

Redirect Report Alexa Site Audit
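
If you want to spot-check this outside the report, a quick way is to request each URL without following redirects and look at the status code. A rough sketch using the requests library (the URLs are placeholders):

    import requests

    urls = ["http://www.example.com/old-page", "http://www.example.com/promo"]

    for url in urls:
        r = requests.head(url, allow_redirects=False, timeout=10)
        location = r.headers.get("Location", "")
        if r.status_code == 301:
            print(url, "-> 301 (permanent) ->", location)
        elif r.status_code in (302, 303, 307):
            print(url, "->", r.status_code, "(temporary - worth reviewing) ->", location)
        else:
            print(url, "returned", r.status_code)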

Server Errors

This section of the report will show you any pages which have server errors.

Alexa Site Audit Server Errors

Making sure your server is handling errors correctly (such as a 404) is certainly worthy of your attention.

Reputation

This module covers one thing: external links from authoritative sites, and where your site ranks alongside "similar sites" with respect to the number of sites linking to each.

Links from Top Sites

The analysis is based on the aforementioned formula:

Alexa Reputation

Then you are shown a chart that plots your site and related sites (according to Alexa) against the total links pointing at each site, placing the sites in a specific percentile based on links and Alexa Rank.

Since Alexa's user base is heavily biased towards webmaster-type sites, these Alexa Ranks are probably higher than they should be, but it's all relative since every site is being judged on the same measure.

Alexa Site Audit Link Chart

The Related Sites area is located below the chart:

Related Sites Link Module Alexa Audit

Followed by the Top Ranked sites linking to your site:

Alexa Site Audit Top Ranked Sites

I do not find this incredibly useful as a standalone measure of reputation. As mentioned, Alexa Rank can be off and I'd rather know where competing sites (and my site or sites) are ranking in terms of co-occurring keywords, unique domains linking, strength of the overall link profile, and so on as a measure of true relevance.

It is, however, another data point you can use in conjunction with other tools and methods to get a broader idea of how your site and related sites compare.

Page Optimization

Checking the on-page aspects of a mid-large sized site can be pretty time consuming. Our Website Health Check Tool covers some of the major components (like duplicate/missing title tags, duplicate/missing meta descriptions, canonical issues, error handling responses, and multiple index page issues) but this module does some other things too.

Link Text

The Link Text report shows a breakdown of your internal anchor text:

Link Text Report Alexa

Click the pages link to expand and see the top pages using that anchor text to link to a page (it shows the page the text is on as well as the page it links to):

Link Expansion Site Audit Report

The report is based on the pages Alexa crawled, so if you have a very large site or lots of blog posts you might find it lacking a bit in breadth of coverage of your internal anchor text counts.
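
For a larger site you can build the same kind of tally yourself from a fuller crawl. A rough sketch with requests and BeautifulSoup (the domain and page list are placeholders):

    from collections import Counter
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    def internal_anchor_text(pages, domain="www.example.com"):
        # Tally the anchor text of internal links across a list of crawled pages.
        counts = Counter()
        for page in pages:
            soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
            for a in soup.find_all("a", href=True):
                href = urljoin(page, a["href"])
                if urlparse(href).netloc == domain:
                    text = a.get_text(strip=True)
                    if text:
                        counts[text] += 1
        return counts

    for text, n in internal_anchor_text(["http://www.example.com/"]).most_common(20):
        print(n, text)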

Broken Links

This report checks broken links (internal and external) and groups them by page, expandable in the same way as the other reports:

Alexa Broken Links Report

Xenu is more comprehensive as a standalone tool for this kind of report (and for some of their other link reports as well).

Duplicate Content

The Duplicate Content report groups all the pages that have the same content together and gives you some recommendations on things you can do to help with duplicate content like:

  • Working with robots.txt
  • How to use canonical tags
  • Using HTTP headers to thwart duplicate content issues

Alexa Duplicate Content Overview

Here is how they group items together:

Alexa Duplicate Content Grouped Links

Anything that can give you some decent insight into potential duplicate content issues (especially if you use a CMS) is a useful tool.
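
One crude way to find these groups yourself, if you want to cross-check the report, is to hash the visible text of each page and bucket URLs by hash. A minimal sketch (the URLs are placeholders, and exact-match hashing only catches identical copies, not near-duplicates):

    import hashlib
    from collections import defaultdict

    import requests
    from bs4 import BeautifulSoup

    def duplicate_groups(urls):
        # Group URLs whose visible text is identical after whitespace normalisation.
        groups = defaultdict(list)
        for url in urls:
            soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
            text = " ".join(soup.get_text().split()).lower()
            groups[hashlib.md5(text.encode("utf-8")).hexdigest()].append(url)
        return [g for g in groups.values() if len(g) > 1]

    for group in duplicate_groups(["http://www.example.com/a", "http://www.example.com/a?print=1"]):
        print("duplicates:", group)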

Duplicate Meta Descriptions

No duplicate meta descriptions here!

Alexa Site Audit Duplicate Meta Descriptions

Fairly self-explanatory: while a meta description isn't incredibly powerful as a standalone ranking factor, it does pay to make sure you have unique ones for your pages, as every little bit helps!

Duplicate Title Tags

You'll want to make sure you are using your title tags properly and not attacking the same keyword or keywords in multiple title tags on separate pages. Much like the other reports here, Alexa will group the duplicates together:

Alexa Site Audit Duplicate Title Tags
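
The same grouping idea works for title tags (and meta descriptions) if you want to verify this against your own crawl. A small sketch with placeholder URLs:

    from collections import defaultdict

    import requests
    from bs4 import BeautifulSoup

    def duplicate_titles(urls):
        # Map each title tag to the list of URLs that use it.
        titles = defaultdict(list)
        for url in urls:
            soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
            title = soup.title.get_text(strip=True) if soup.title else "(missing)"
            titles[title].append(url)
        return {t: pages for t, pages in titles.items() if len(pages) > 1}

    for title, pages in duplicate_titles(["http://www.example.com/a", "http://www.example.com/b"]).items():
        print(title, "->", pages)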

Low Word Count

Having a good amount of text on a page is a good way to work in your core keywords and to help rank for longer-tail keywords (which tend to drive lots of traffic to most sites). This report flags pages which, looking at the stats, have fewer than 150 words or so on the page:

Alexa Site Audit Low Word Count

There's no real magic bullet for the number of words you "should" have on a page. You want the right balance of text, images, and overall presentation to make your site:

  • Linkable
  • Textually relevant for your core and related keywords
  • Readable for humans
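
If you want to run the same kind of thin-content check yourself, counting the words in each page's visible text gets you most of the way there. A rough sketch, using the roughly-150-word cut-off the report appears to apply (the URLs are placeholders):

    import requests
    from bs4 import BeautifulSoup

    def thin_pages(urls, minimum_words=150):
        # Flag pages whose visible text falls below a word-count threshold.
        flagged = []
        for url in urls:
            soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
            words = len(soup.get_text().split())
            if words < minimum_words:
                flagged.append((url, words))
        return flagged

    for url, words in thin_pages(["http://www.example.com/thin-page"]):
        print(url, "has only", words, "words")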

Image Descriptions

Continuing on with the "every little bit helps" mantra, you can see pages that have images with missing ALT attributes:

Alexa Site Audit ALT Attribute Overview

Alexa groups the images by page, so just click the link to the right to expand the list:

Alexa Site Audit ALT Attribute Groupings

Like meta descriptions, this is not a hugely important item as a standalone ranking factor, but it helps a bit, particularly with image search.
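
This is another check that is easy to replicate across your own crawl. A minimal sketch (the URL is a placeholder):

    import requests
    from bs4 import BeautifulSoup

    def images_missing_alt(url):
        # List the images on a page that have no ALT attribute (or an empty one).
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        return [img.get("src") for img in soup.find_all("img")
                if not img.get("alt", "").strip()]

    print(images_missing_alt("http://www.example.com/"))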

Session IDs

This report will show you any issues your site is having due to the use of session IDs.

Alexa Site Audit Session ID

If you have issues with session IDs and/or other URL parameters, you should look at using canonical tags or Google's parameter handling (mostly to increase the efficiency of Googlebot's crawl of your site, as Google will typically skip crawling pages based on your parameter list).
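
If session IDs are leaking into your URLs, it also helps to decide on a canonical form and strip the offending parameters consistently. A small illustration (the parameter names below are common examples, not an exhaustive list):

    from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

    SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

    def canonicalize(url):
        # Strip session-style parameters so duplicate URLs collapse to one canonical form.
        parts = urlparse(url)
        query = [(k, v) for k, v in parse_qsl(parts.query)
                 if k.lower() not in SESSION_PARAMS]
        return urlunparse(parts._replace(query=urlencode(query)))

    print(canonicalize("http://www.example.com/page?sessionid=abc123&color=red"))
    # -> http://www.example.com/page?color=red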

Heading Recommendations

Usually I cringe when I see automated SEO solutions. The headings section contains "recommended" headings for your pages. You can download the entire list in CSV format:

Automated Headings Alexa

The second one listed, "interface seo", is on a page which talks about Google adding breadcrumbs to the search results. I do not think that is a good heading tag for this blog post. I suspect most of the automated tags are going to be average to less than average.

Keywords

Alexa's Keyword module offers recommended keywords to pursue, as well as on-site recommendations, in the following sub-categories:

  • Search Engine Marketing (keywords)
  • Link Recommendations (on-site link recommendations)

Search Engine Marketing

Based on your site's content Alexa offers up some keyword recommendations:

Alexa Site Audit Keyword Recommendations

The metrics are defined as:

  • Query - the proposed keyword
  • Opportunity - (scales up to 1.0) based on expected search traffic to your site from keywords which have a low CPC. A higher value typically means higher query popularity and a lower QCI. Essentially, the higher the number, the better the relationship between search volume, low CPC, and low ad competition (see the toy sketch after this list).
  • Query Popularity - (scales up to 100) based on the frequency of searches for that keyword
  • QCI - (scales up to 100) based on how many ads are showing across major search engines for the keyword
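
To make those definitions concrete, here is a toy sketch of how popularity, CPC, and ad competition could be combined into a 0-1 score. This is purely illustrative and is not Alexa's actual formula:

    def opportunity(query_popularity, avg_cpc, ad_competition):
        # Toy illustration only, not Alexa's formula: reward queries that are
        # searched often, have cheap clicks, and face little ad competition.
        popularity = query_popularity / 100.0        # 0-100 scale
        cheapness = 1.0 / (1.0 + avg_cpc)            # lower CPC -> closer to 1
        openness = 1.0 - ad_competition / 100.0      # fewer ads -> closer to 1
        return round(popularity * cheapness * openness, 2)

    print(opportunity(query_popularity=60, avg_cpc=0.25, ad_competition=20))  # 0.38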

For me, it's another keyword source. The custom metrics are ok to look at but what disappoints me about this report is that they do not align the keywords to relevant pages. It would be nice to see "XYZ keywords might be good plays for page ABC based on ABC's content".

Link Recommendations

This is kind of an interesting report. You've got 3 sets of data here. The first is the "source pages": a listing of pages that, according to Alexa's crawl, appear to be important to search engines and are easy for crawlers to reach:

Alexa Site Audit Link Recommendations

These are the pages Alexa feels you should link from. The next 2 data sets are in the same table: the "target pages" and keywords:

Alexa Site Audit Link Recommendations Target

Some of the pages are similar but the attempt is to match up pages and predict the anchor text that should be used from the source page to the target page. It's a good idea but there's a bit of page overlap which detracts from the overall usefulness of the report IMO.

Stats

The Stats section offers 3 different reports:

  • Report Stats - an overview of crawled pages
  • Crawler Errors - errors Alexa encountered in crawling your site
  • Unique Hosts Crawled - number of unique hosts (your domain and internal/external domains and sub-domains) Alexa encountered in crawling your site

Report Stats

An overview of crawl statistics:

Alexa Site Audit Report Stats

Crawler Errors

This is where Alexa would show what errors, if any, it encountered when crawling the site:

Alexa Site Audit Crawl Errors

Unique Hosts Crawled

A report showing which sites you are linking to (as well as your own domain and subdomains):

Alexa Site Audit Unique Hosts

Is it Worth $199?

Some of the report functionality is handled by other tools available to you, free in some cases. Xenu does a lot of what Alexa's link modules do, and if you are a member here, the Website Health Check Tool covers some of the on-page items as well.

I would also like to see more export functionality, especially with white-label reporting in mind. The crawling features are kind of interesting, and the price point is fairly affordable as a one-time fee.

The Alexa Site Audit Report does offer some benefit IMO, and the price point isn't overly cost-prohibitive, but I wasn't really wowed by the report. If you are ok with spending $199 to get a broad overview of things, then I think it's an ok investment. For larger sites, sometimes finding (and fixing) only 1 or 2 major issues can be worth thousands in additional traffic.

It left me wanting a bit more though, so I might prefer to spend that $199 on links, since most of the tool's functionality is available to me without paying the fee. Further, the new SEOmoz app also covers a lot of these features & is available at a monthly $99 price-point, while allowing you to run reports on up to 5 sites at a time. The other big thing that would improve the value of the Alexa application would be allowing you to run a before-and-after report as part of the package. That way in-house SEOs can not only show their boss what was wrong, but can also use that same 3rd party tool as verification that it has been fixed.

Your Favorite Eric Schmidt Quotes?

Do you want Google to tell you what you should be doing? Mr. Schmidt thinks so:

"More and more searches are done on your behalf without you needing to type. I actually think most people don't want Google to answer their questions," he elaborates. "They want Google to tell them what they should be doing next. ... serendipity—can be calculated now. We can actually produce it electronically."

Of course, the problem with algorithms is that they rely on prior experience to guide you. They won't tell you to do something unique & original that can change the world; rather, they will lead you down a well-worn path.

What are some of the blandest and most well-worn paths in the world? Established brands:

The internet is fast becoming a "cesspool" where false information thrives, Google CEO Eric Schmidt said yesterday. Speaking with an audience of magazine executives visiting the Google campus here as part of their annual industry conference, he said their brands were increasingly important signals that content can be trusted.

"Brands are the solution, not the problem," Mr. Schmidt said. "Brands are how you sort out the cesspool."

"Brand affinity is clearly hard wired," he said. "It is so fundamental to human existence that it's not going away. It must have a genetic component."

If Google is so smart then why the lazy reliance on brand? Why not show me something unique & original & world-changing?

Does brand affinity actually have a hard-wired genetic component? Or is it that computers are stupid & brands have many obvious signals associated with them, one of which is typically a large ad budget? And why has Google's leading search engineer complained about the problem of "brand recognition" recently?

While Google is collecting your data and selling it off to marketers, they have also thought of other ways to monetize that data and deliver serendipity:

"One day we had a conversation where we figured we could just try and predict the stock market..." Eric Schmidt continues, "and then we decided it was illegal. So we stopped doing that."

Any guess how that product might have added value to the world? On down days (or days when you search for "debt help") would Google deliver more negatively biased ads & play off fears more, while selling more euphoric ads on up days? Might that serendipity put you on the wrong side of almost every trade you make? After all, that is how the big names in that space make money - telling you to take the losing side of a trade with bogus "research."

Eric Schmidt asks who you would rather give access to this data:

“All this information that you have about us: where does it go? Who has access to that?” (Google servers and Google employees, under careful rules, Schmidt said.) “Does that scare everyone in this room?” The questioner asked, to applause. “Would you prefer someone else?” Schmidt shot back – to laughter and even greater applause. “Is there a government that you would prefer to be in charge of this?”

That exchange helped John Gruber give Eric Schmidt the label Creep Executive Officer, while asking: "Maybe the question isn’t who should hold this information, but rather should anyone hold this information."

But Google has a moral conscience. They think quality score (AKA bid rigging) is illegal, except for when they are the ones doing it!

"I think judgement matters. If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place," - Eric Schmidt

Which is why the blog of a certain mistress disappeared from the web. And, of course, since this post is on a blog, it doesn't matter:

If you're ever confused as to the value of newspaper editors, look at the blog world. That's all you need to see. - Eric Schmidt

Here is the thing I don't get about Google's rhetorical position on serendipity & moral authority: if they are to be trusted to recommend what you do, then why do they recommend illegal activities like pirating copyrighted works via warez, keygens, cracks & torrents?

Serendipity ho!

SeoMoz's WebApp Reviewed

SeoMoz recently released a beta product called, simply, "WebApp".

Pages