Rich Skrenta has been at the core of search longer than I have been in the SEO market. He is famous for launching sites like DMOZ and Topix. His most recent project is a search engine called blekko, and I recently had a chance to chat with him about blekko, the web, and marketing.
Blekko just launched publicly today. Be sure to check out their search engine, all their SEO features, and the Blekko toolbar.
Most startups fail. And yet you have multiple successes under your belt and are going at it again. If you could boil success down to a few points, what really separates what you have done from the statistics?
Paul Graham said that the most important thing for a startup is to not die each day. If you can keep existing, that's survival for a company. Generally I like to keep costs low and hire carefully. Also, the first idea doesn't always work. We had to pivot Topix several times to find the right model. For blekko, we just want to make a site that a segment of people will find useful. If we can do that we'll be happy.
It seems openness is a great marketing angle to use online. Why do you feel that it is so under-utilized by most companies?
It feels counter-intuitive to take all of your company's IP and secrets and just put them out there. Little companies also tend to be insecure and want to appear larger and more successful than they are. They want to put on a big-company face to the world, but being honest and transparent about who they are and letting the public see "behind the curtain" can often win people over better than a facade of success.
From my perspective, it seems your approach to marketing is heavily reliant on organic, viral & word of mouth strategies. What is broken with the old model of marketing? Is its death happening slower or quicker than you expect?
The internet and social media have made word-of-mouth stronger and stronger, and in many ways they eclipse traditional marketing channels now. This started with blogging and has accelerated with Twitter and Facebook. Everybody is media now. You used to fly around and do a two-week media tour to launch a product. The aperture to get in the trade press was small; there was a handful of reporters you had to go pitch. Now there are thousands of people who have an audience for every trade niche, so it's easier to get the word out about something new. But it has to be genuinely interesting, or your message won't get picked up.
A lot of people who are good at programming make ugly designs. Likewise many people are either programmers or marketers. What formal training or experiences have you had that have allowed an engineer to become such a sophisticated marketer? What strengths do you have that allow you to bridge the disciplines so well?
We joke that we have always made ugly web sites. Fortunately I was able to hire a good designer for blekko and he's been doing a great job taking our early ugly versions and making them a lot more attractive and workable.
I read a lot of stuff about marketing and positioning that we're trying to apply at blekko. I'm a big fan of Trout & Ries. I loved Kathy Sierra's stuff when she was writing. There is also some fantastic material in Kellogg on Branding. We also worked with some great positioning consultants who tested various ideas on focus groups to see what would resonate best with users as a message. Every product has a bunch of features, but you want to find the one to talk about that's going to stick in people's heads the best.
I noticed you baked many social elements into your marketing strategy (friend us on Facebook, follow us on Twitter) as well as baking many social elements into your product (personal slashtags, allowing people to share their slashtags, etc.). There is some talk on the web of apps or social stuff replacing search as the center of the web, however from a marketing perspective I see much higher traffic value in search traffic. Do you think that one day social and apps will largely replace global search? Or do you feel it will generally continue to play a secondary role to search?
Social media can drive tons of attention, awareness and traffic. But the search box is the best way to navigate to stuff you want. Now what will drive those results - if I type in "pizza", what should I get? The answer can be very different depending on whether the results are coming from the web, Yelp, or Facebook. So I guess my answer is that I still see search being the core way to navigate, but I think what gets searched is going to get a lot more structured and move away from simple keyword matches against unstructured web pages.
A good number of the social sites are doing redirects for security purposes & to some degree are cannibalizing the link graph. Do you feel that links from the social graph represent an important signal, or that most of that signal still gets represented well on the remaining link graph?
There is very definitely signal in social graph links - potentially more than in the web graph. In 2000, a hyperlink was a social vote. Most links were created by humans and represented an editorial vote. That's no longer true - the web today is inundated with bulk-generated links. To the extent that humans can be separated from bots, there's more true signal in social graphs. The challenge is to get enough coverage to rank everything you need to rank. Delicious had great search results for the corpus of links they knew about, but it wasn't nearly big enough to be comprehensive. Facebook and Twitter are certainly a lot bigger; it will be interesting to see if they start to apply their data to ranking and recommending material from outside of their own sites.
When Google was young, Sergey Brin stated at an SEO conference that there was no such thing as spam, only bad relevancy algorithms. When I saw some of your talks announcing Blekko, you mentioned that you never want to see eHow in your personal search results. Do you feel that spam is largely a matter of personal opinion? If you had to draw a line in the sand between spam & not spam, how would you characterize the differences?
Search must serve an editorial function. You can call this editorial position "relevancy", but that's hiding behind the algorithm. Of course someone wrote the algorithm, and tinkered with it to make some sites come up and others not.
The web has grown 100-fold since 2000. There is most definitely spam out there. Let's take a clear-cut example, like pharma links being injected via exploits into unpatched WordPress blogs. Then there is gray-area stuff, like eHow.com. Some people like eHow. Some don't. That's why we let users develop their own /spam filters.
Eric Schmidt mentioned that sharing their ranking variables would be disclosing trade secrets that could harm Google. Yet you guys are sharing your web graph publicly. Are you worried about doing this impacting your relevancy in a negative way? Or do you feel the additional usage caused by that level of awareness will give you more inputs into your search relevancy algorithms?
When I first moved to Silicon Valley I worked in computer security. In security there's an idea that "security through obscurity" isn't very good. What this means is that if you have some new encryption algorithm, but don't let anyone see the details of how it works, it probably is full of holes. The only way to get a strong encryption algorithm is to publish all of the details about how it works and have public review. Once the researchers can't punch any more holes in your algorithm, only then is it good enough to trust.
We see search the same way. If this magic 200-variable equation is so sensitive that if it leaked out the results would be completely overrun with spam, well then the algorithm doesn't actually sound that strong to me. We'd rather work towards a place where there can be public review of the mechanisms driving ranking, and where many eyes can make the spam problem shallow.
Certainly the big search engines have hundreds of human raters that help identify spam and train their algorithms. These contractors are the knowledge workers behind the scenes. As a little startup, we asked ourselves how we could get many more people helping us to make our results better, and also be a lot more open about the process. We previously ran a big crowdsourced search site, the Open Directory, with 80,000 editors classifying URLs. What if we could get 80,000 people to help us curate search verticals, identify spam, and train classifiers? That would be cool.
You had a blog post comparing pornographers to SEOs. Do you feel the SEO game is mostly adversarial? Or do you feel that paying attention to the SEO industry is a great way to quickly improve the quality of a search product? Or both? :)
I think my comparison noted that pornographers have often been early adopters of new technology. :-)
There is aggressive SEO, and then there is what I call appropriate discoverability. Aggressive SEO can go over the line - if someone hacks your server to add links, that's borderline criminal activity. But if you have great content and it's not showing up, that's a shame. After we sold Topix to the newspapers, we spent some time evangelizing SEO within their organizations. Think of all of the movie reviews and restaurant reviews the US newspaper sites collectively have. Wonderfully written material by well-paid professional journalists. But you don't see their content anywhere for a restaurant or movie search. That's a shame.
Recently Ask sorta rebranded away from search & towards more of a Q&A format, and Yahoo! bowed out of search through a Bing partnership. Are the cost scales that drive such changes just a legitimate piece of the business model, or were those organizations highly inefficient? How were you able to bring a competitive product to market for so much less?
I was a fan of Ask's Teoma technology, and what Jim Lanzone had been doing with the site. And Yahoo was delivering very high quality results, and had interesting initiatives like the BOSS APIs and SearchMonkey. This was all great stuff. I'm disappointed that they lost heart. Running a big company that has been around for a long time is not an easy job.
From an SEO perspective I think that Google tends to have a large index, but crawling so deeply likely allows a lot of junk into their index. Bing seems to be a bit more selective with their crawling strategy. How would you compare Blekko against the other major search engines in terms of depth? Do you feel that relevancy boosts offered through vertical search (via your Slashtags) allows you guys to provide a similar or better experience without needing as large of an index?
Our crawler tends to go into highly ranked sites more deeply than poorly ranked sites. We have a 3 billion page crawl, and so we need to choose the best content to include. This starts at crawl time - should we crawl this url or that url? There are a whole set of heuristics which drive what crawl budget an individual site gets.
The web keeps getting deeper and deeper - the challenge is how to return the good stuff and not sink. This is why we believe human curation needs to be brought back to search. Only by curating the best content in every vertical can the most relevant results be returned.
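The idea of giving highly ranked sites a deeper crawl allowance can be sketched in a few lines. This is purely illustrative - the function names, rank scale, and cap are assumptions, not blekko's actual heuristics:

```python
# Hypothetical sketch of rank-weighted crawl budgeting: better-ranked
# sites earn a larger slice of a fixed crawl budget, with a cap so no
# single site dominates. The formula is invented for illustration.

def crawl_budget(site_rank, total_budget=1000, max_share=0.05):
    """Pages allowed for one site; site_rank is a 0..1 quality score."""
    share = min(site_rank * max_share, max_share)
    return int(total_budget * share)

def plan_crawl(sites, total_budget=1000):
    """Given {site: rank_score}, return {site: pages_to_crawl}."""
    return {site: crawl_budget(rank, total_budget)
            for site, rank in sites.items()}

plan = plan_crawl({"example.org": 0.9, "linkfarm.biz": 0.1})
# example.org gets a far deeper crawl than the poorly ranked site
```

A real crawler would fold in many more signals (freshness, duplication, server politeness), but the shape is the same: budget allocation happens at crawl time, before ranking ever sees a page.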
Amongst SEOs the issue of "brand" as a relevancy signal has been a topic of heated debates. How important do you feel brand is as a signal of relevancy & authority?
One of the things we look at is how natural the pattern of mentions of a site looks. Real brands tend to have a natural pattern of mentions on the web.
You had a blog post a few years back titled "PageRank wrecked the web." How do you feel about paid links? What editorial actions do you guys take when you find paid links?
If links have an economic value, they're going to be bought and sold. It's that simple. What happens in our ranker is that we classify different sources of signals, and then let the machine learning figure out what the signal is telling us. Is this a good source of anchortext? Or maybe a certain class of links even has a negative contribution to rank, if what the links are telling us doesn't correlate with the direction we want the ranker to be going.
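The "let the machine learning figure out what the signal is telling us" idea can be sketched with a toy model. This is not blekko's ranker - the feature classes, data, and tiny logistic learner below are assumptions for illustration - but it shows how a class of links can end up with a negative learned weight:

```python
import math

# Toy sketch: each link-source class is a feature, and a minimal
# logistic model learns a weight per class from labeled pages.
# A class that anti-correlates with good pages gets a negative
# weight, i.e. those links hurt rank rather than help it.

CLASSES = ["editorial", "paid_looking"]  # hypothetical link classes

def predict(weights, feats):
    z = sum(w * f for w, f in zip(weights, feats))
    return 1 / (1 + math.exp(-z))

def train(data, epochs=1000, lr=0.1):
    """data: list of (feature_vector, label), label 1 = good page."""
    w = [0.0] * len(CLASSES)
    for _ in range(epochs):
        for feats, label in data:
            err = label - predict(w, feats)
            w = [wi + lr * err * fi for wi, fi in zip(w, feats)]
    return w

# Invented corpus: good pages carry editorial links,
# spammy pages carry paid-looking links.
data = [([3, 0], 1), ([2, 1], 1), ([0, 4], 0), ([1, 3], 0)]
weights = train(data)
# weights[0] comes out positive, weights[1] negative
```

Production rankers use far richer models, but the mechanism is the same: classify the link source first, then let training data decide whether that class contributes positively, not at all, or negatively.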
How hard is it to detect paid links? What has been the most challenging part of launching a world class search engine?
The whole thing has been hard. Search has so many sub-components, and even things that sound trivial like DNS turn into big projects when you need to scale them up to billions of web pages.
---
Thanks Rich! Be sure to check out blekko. You can follow them on Twitter & read Rich's musings on the web, search, and marketing at Skrentablog.