Fantomaster Interviewed!
Ralph Tegtmeier (aka fantomaster) has been known for many years as having one of the most insightful minds and original voices in the search game. Years ago I wanted to interview him, and only recently did we get to do that.
What did you do before you got into search?
In contrast to the maverick background and achievements my old friend Mike Grehan revealed in his recent interview with SEOBook, my life before search was positively boring. I was born in Egypt and grew up in the Middle East and Asia, where my father served terms in the German diplomatic service. Later, I took my master's degree in Comparative Literature, English Literature and Portuguese philology at Bonn University in Germany. Even before that, I had founded and run an occult bookshop there together with two fellow students, and afterwards I went into freelance translation and writing.
As a translator, I hooked up with IT almost as soon as it became available, though I did study the subject in some depth before I finally purchased my first PC, a Victor Sirius 286 hybrid that was both IBM and Sirius compatible.
Came the Internet in the fall of 1994, came the "taxBomber" - that was my nom de guerre at the time as an online marketer in the offshore finance, alternate citizenship and privacy protection field.
Before the Web proper was made accessible to all, I'd been on CompuServe and tested the waters there in terms of online marketing, but there were some pretty severe limits to that so it didn't really scale that well. The WWW really changed all that.
As you may expect, in the mid-to-late 90s, optimizing a web site for the search engines was a lot more simplistic than today: keyword stuffing, multiple title tags, invisible text on the page - all these techniques worked like a song.
In 1998, I teamed up with my old school buddy Dirk Brockhausen, who by that time held a doctorate in physics and was a certified SAP consultant, working for companies such as IBM and others.
How did you end up in the search field?
My first online business caught on immediately. Competition wasn't too fierce though definitely existent. One day, I stumbled across a report on how to game the search engines - quite probably the first of its kind. I purchased it, implemented a lot of the techniques outlined, and bang! - rankings improved even more! There was a lot of deadweight tied to that approach at the time, e.g. signing up for FFA sites which would bring me a ton of spam mails, etc., quite a nuisance, really. So it became essential to separate the wheat from the chaff.
Around the same time I hit upon the late Corey Rudl's stuff, which was an eye opener in terms of efficient marketing, especially the American kind. I got a lot of impulses from that and am still profiting from the impact.
When Dirk and I decided to set up shop, it was a given that we would develop software; the only question was: what kind of application? So we researched the market at some length, caught onto SEO, tested our stuff thoroughly and finally went public with it.
You built the #1 brand in the cloaking space. What were some of the key steps to doing that?
We conducted about a year and a half's intense research, experimenting with all kinds of SEO in a variety of niches. Cloaking beat them all stone cold, so that's what we went for in the end.
It was quite obvious from the start that efficient, reliable cloaking requires an equally efficient and reliable database of verified search engine spiders to work from, so that's what we focused on first: the fantomas spiderSpy(TM) service, which to this day boasts the world's most comprehensive list of verified search engine spiders. We've been building this list since 1999 and it's generally considered to be best of breed - and these aren't my words, mind you, but what our customers say about it.
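By way of illustration only (my own sketch, not the fantomas spiderSpy(TM) implementation): the usual way to confirm that a visitor claiming to be, say, Googlebot really is one is a reverse DNS lookup on the requesting IP followed by a confirming forward lookup - a check Google itself documents. A minimal Python sketch, with the host suffixes as an assumption you'd keep current:

```python
import socket

# Host suffixes Google publishes for its crawlers (assumed current; verify
# against Google's own documentation before relying on them).
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def is_verified_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the hostname suffix, then forward-resolve
    that hostname and confirm it maps back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)               # reverse DNS
        if not hostname.endswith(GOOGLE_SUFFIXES):
            return False
        _, _, forward_ips = socket.gethostbyname_ex(hostname)   # forward DNS
        return ip in forward_ips
    except (socket.herror, socket.gaierror):
        return False

# Example: only treat a request as spider traffic once its source IP verifies.
print(is_verified_googlebot("66.249.66.1"))
```

Presumably, a verified-spider database like the one described above amounts to running checks of that kind at scale and caching the results, since doing two DNS lookups live on every request would be far too slow.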
As for cloaking proper, at first it was single page cloaking only, giving you mixed sites with both cloaked and non-cloaked pages. Later, as the major search engines began to adopt a more adversarial stance, we developed the fantomas shadowMaker(TM) which generates entire stand-alone cloaked sites, what we tagged Shadow Domains(TM) - a term Google initially stole from us in the first versions of their Webmaster guidelines. (They dropped it again later.)
Much of this was due to our being fed up with having to build SDs manually for our SEO clients, so we decided to automate the process. And so, the fantomas shadowMaker(TM) was born. We're currently working on a new version that will include a ton of additional powerful features to reflect the ever changing search environment.
Is cloaking today as relevant as it was 5 years ago? Do web 2.0 sites and other easy link sources & hosts still make it quite profitable? How has cloaking changed over the years?
Like all things search, cloaking has changed over the course of the years. Initially, it was sufficient to simply cloak single pages on your site, giving you a mix of cloaked and open pages. Then, it was more about avoiding risks to your money sites plus enhanced scalability by deploying self-contained, independent cloaked sites - those Shadow Domains(TM) I mentioned - effectively restricting your cloaking efforts to these SDs, which could be discarded and easily replaced by fresh ones should they be caught out by the search engines.
Today, cloaking has evolved to both include and target RSS feeds, promoting them via the aggregators and feed directories, for example. Our forthcoming new version of the shadowMaker will also include new functionality enhancing page structure variance, inclusion of graphics, CSS, etc. to make the SDs appear even more organic to the spiders. Finally, it will also offer a vastly improved text generation module as well.
Of course, up until now cloaking has generally only addressed on-site factors, optimizing webpages for the search engine spiders. What it doesn't do per se is attend to off-site stuff such as link building. So once you've started to roll out your SDs, you'll still have to throw a decent amount of good links at them to make their rankings sticky. However, this isn't a change in technology so much as in SEO strategy: once links became all-important, you had to add link building to your arsenal of SEO techniques just like everyone else.
Is it still relevant, i.e. effective? Most certainly - provided you know what you're doing and run a tight ship strategy-wise. Essentially, this is nothing new: it simply comes with changing search engine algos, new platforms (such as blogs or social bookmarking sites) and so on.
Another, entirely new cloaking technology is still in an experimental stage. It's what we've tagged "Mosaic Cloaking". Here, only specific parts of an otherwise "normal" web page are cloaked for spider fodder, displaying different content to human visitors. This will effectively lead us back, at least in part, to the mixed sites of yore, featuring both cloaked and non-cloaked content on the same domain. Once we have sufficient empirical data on hand to make this technology viable for general deployment, we hope to integrate it into our software, of course.
As for Web 2.0 sites, we're mainly leveraging them for both link building and traffic generation. It's actually quite easy to promote cloaked sites or pages via the social networking platforms these days because people have become so well accustomed to being redirected when browsing the Web that it doesn't tend to raise any eyebrows anymore.
Some well funded web 2.0 sites do things like list "relevant keywords" and "keywords sending traffic to this page"... what is the difference between cloaking and such an automated approach to keyword rich content generation? Why is one considered bad with the other being considered fine?
Well, cloaking or IP delivery in the technical sense is, of course, about displaying different content to search engine spiders than to human visitors. What these Web 2.0 sites are actually doing is going for the old, worn-out keyword stuffing technique, not cloaking proper. (Well, not as a rule, anyway.)
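In purely schematic terms (a generic sketch of the technique just described, not anyone's actual product), IP delivery boils down to a branch on whether the requester is a verified crawler:

```python
# Schematic cloaking / IP delivery: hand verified search engine spiders one
# version of a page and everyone else another. The IP set and page bodies
# below are hypothetical placeholders.
VERIFIED_SPIDER_IPS = {"66.249.66.1", "157.55.39.1"}

def select_content(client_ip: str, spider_page: str, human_page: str) -> str:
    """Return the spider-targeted page for known crawler IPs, the regular page otherwise."""
    return spider_page if client_ip in VERIFIED_SPIDER_IPS else human_page

html = select_content(
    "66.249.66.1",
    spider_page="<html>keyword-optimized copy served to crawlers</html>",
    human_page="<html>regular page shown to human visitors</html>",
)
```

Keyword stuffing, by contrast, shows the very same over-optimized page to everybody - which is exactly why it is not cloaking.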
It's actually quite funny to see well-trafficked sites like that adopt an amateurish level of purported search engine optimization which we, as professional SEOs, have long dismissed as no longer effective enough. There are many plausible explanations for this, though in the main it's probably all about fundamental cluelessness. But because these sites are getting tons of traffic from sources other than organic search, and in view of the fact that the search engines are concerned about losing large chunks of their traffic and search market share (think Facebook and Twitter for two prime examples), they seem to be giving them unabashed preferential treatment which no ordinary mom-and-pop web site can ever hope to be blessed with.
To the uninformed, this may actually seem to endorse such dated SEO techniques, though this is an entirely false conclusion. Because it's actually not the keyword and link stuffing at all that helps these sites achieve such high rankings, PageRank etc. - rather, it's all those other factors your run-of-the-mill site cannot easily emulate.
On the client front, we're experiencing a lot more openness towards "black hat" SEO such as cloaking etc. than e.g. 3-4 years ago. Generally, people aren't as impressed or as easily conned by the search engines' (especially Google's) FUD tactics regarding anything they don't like. Sure, they're worried about possibly losing their sites in the search engine indices, but the number of people who'll simply swallow everything Google feeds them by way of their peculiar gospel of what a "good boy or girl" should do or refrain from in terms of SEO is positively on the decrease.
As Google pushed nofollow and became more liberal with the "black hat" label it seems there is less discussion about black hat vs white hat. Do you agree with that? And if so, why has that conversation died down?
I think it's because people are getting more pragmatic about things. Maybe it's that the novelty of doing business on the Web has worn off, maybe it's the vast variety of divergent opinions and schools of thought in SEO and the unprecedented exposure the importance of organic search engine optimization is enjoying in the media.
Whatever it may actually be, I agree that the debate has become de-emotionalized, less religious even. When we started off with formal SEO services back in the late nineties, the debate was all about "ethical" versus "unethical" SEO. Lots of gut level reactions then to what was, after all, merely a technological, not a theological or moral issue. Add to that the increasingly competitive environment people have to cope with on the Internet and it all figures rather nicely. You might arguably say that Web commerce as a whole has matured, as, of course, has the SEO industry proper.
These days, when you speak with clients they won't flinch one bit if you ask them whether they want to opt for a "white hat" or a "black hat" approach. Rather, they'll inquire about efficacy, the relative risks and so on. So it's a pretty much unexcited, hands-on discussion which is a very good thing.
Matt Cutts often tries to equate search engine manipulators with criminals. And yet the same search results will sell exposure to virtually anyone willing to pay for it. From a linguistic and framing standpoint, what gives Google such dominance over the SEO conversation?
I've recently dubbed Matt Cutts Google's "FUD Czar" for this very reason, not that I expect it will stop him from pursuing that course in future. Next thing we may find him equating black hat SEOs with kiddie porn peddlers, Colombian drug cartels and white slavery racketeers...
I find this a fairly worrying though certainly not an unexpected development. It's an established scare tactic we've seen deployed time and again in human history: lump your detractors together with anywhich foes everyone is concerned about to make all that muck rub off. It's how witch hunts and, in the political field, totalitarian propaganda, especially the fascist kind, have always been conducted.
I know I may get quite a bit of flak for this, but the way I view things Google as a corporation has subscribed to an essentially totalitarian mindset. It's quite clear for anyone to see: in their public statements, in the way they tend to react to criticism, and of course, even more importantly, in the vast array of technologies and data conduits they're rolling out to dominate all the time.
This being the Information Age, information is equated with power - this is a pervasive meme that's dominated Western culture for centuries if not millennia. And this is precisely what Google is trying to monopolize - alas, quite successfully.
But not to worry, I won't set out on a rant with a long-winded academic analysis of Google's crypto-fascist ideology and praxis here. Suffice it to say that I've studied these matters in some depth for more than 40 years now. This isn't about some whacko conspiracy theory, it's about cold, hard-nosed and sober analysis and evaluation of verifiable facts. But let's let it rest there for the time being.
Many ad networks promote fraud because they promote whatever generates the most money (and additional profit margins are often created through fraud). Why is it that the media generally talks about SEO as though it is a shady, black-hat industry, while pay per click ads are rarely given coverage for promoting things like cookie pushing, adultery, reverse billing fraud, etc.?
For one, advertising is the media's mainstay, their commercial backbone. So we can't reasonably expect them to bite the hand that feeds them and hope to survive the exercise. Essentially, this makes them utterly blind on that score by default. At the very least, they're not given to be unduly reflective about these things.
Second, SEO is still very much a "black art" in the sense that about 99% of all media workers don't know the first thing about it anyway. Let's face it: while the basic concepts of SEO are fairly straightforward and easy to explain, actually running successful SEO campaigns is quite another ballgame. Also, what with time and attention spans mutating into ever more expensive and rare commodities, most media workers simply won't (and quite possibly cannot, even if they wanted to) bother reading your own excellent SEO book or Mike Grehan's outlines - they're too long, too technical and effectively too specialized for your average media hack to invest time in and dig into.
Third, while there is certainly an entirely real SEO industry out there now, it's still very much a fledgling operation. Yes, every man and his dog in upper management may know about the importance of SEO for their Web marketing efforts - but which SEO are we actually talking about? Ten experts, eleven opinions, right? To the outsider, it's confusing, it's mysterious, it's dark, and yes: more often than not all this discomfort translates into viewing SEO as being "shady", like it or not.
Fourth, most SEO agencies I know about are actually focused on PPC management. They may offer organic search optimization alright, but overall PPC is a pretty easy sell whereas organic SEO generally isn't. PPC is easy to understand, it's fast, and it's still complex enough to require expert assistance if you don't want to sink your advertising budget into ineffective campaigns at a breathtaking pace.
All this makes people feel a lot more comfortable with PPC than with organic SEO, I guess.
But what I actually find a lot more worrisome is that click fraud as a media topic seems to have been pushed snugly to the back burner for years. Unfortunately, this applies to the SEO industry as a whole as well: they don't seem to be too keen on discussing this issue which, in my view at least, is actually doing their clients a great disservice...
Google has a video game patent to exploit video game players based on their mental weaknesses (like a need for security, gambling addictions, or making rush decisions). You had a great post on Sphinn mentioning the hazards of trusting data mining companies too much and the concept of systemic mechanisms of "reality production". Whenever I mention that sort of stuff people assume I am a cynic and look at me like I am crazy. How can you spread the message about such topics without being seen as crazy?
Well, who says we aren't? (Laughs) But seriously: if you define "craziness" as implying a generally unacceptable divergence from the ruling norms and prevailing views of mainstream society, I'd actually wonder whether I wasn't making some terrible mistake if people DIDN'T think I was crazy when airing such views. Plus, the original "cynics" in Ancient Greece were the "dog philosophers" - which is what the term actually implies: an eminently contrarian crowd in bitter opposition to the fattened, smug establishment of conventional philosophy. So in a way it's really a badge of honor, don't you think?
It's about the violation of comfort levels, I suppose. People are having a very hard time coping with the pace at which current technology is changing the world, both emotionally and intellectually. If all you're worried about is somehow making ends meet, feeding your family, coughing up money for your mortgage, for medical care and paying for your kids' schooling, you'll tend to reduce your outlook to a tunnel vision. It's called "focus", I know, but more often than not it's a type of mental self-amputation resulting in narrow-mindedness, simplistic views of the world and, what's worse, a general refusal to deal with anything unfamiliar if it threatens to shake that less than stable edifice you may mistake for a life.
Once you start putting matters into a larger perspective, they tend to confuse people even more. This, in turn, evokes emotional, gut-level reactions - quite irrational, true, but very easy to explain, too: "So what's Google gotta do with fascism now - is that all you can think of, weirdo?"
Actually, this is nothing new at all. Personally, I and many members of my generation experienced a lot of this in the sixties, when more or less all members of the political and economic establishment felt threatened by the hippie movement, the anti-Vietnam War protests and a general criticism of capitalist and corporate values. Different contentions, to be sure, but the same mechanisms at work nevertheless.
In a Twitter post you made you mentioned something about the web becoming more narcissistic. What is driving that? How can it be prevented on an individual and group level?
To address your second question first, I don't think it can be "prevented" in any pro-active way unless you want to pull the plug on it all e.g. by canning the platforms allowing for it - hardly a realistic scenario, I would think. I'm fairly certain that it will abate to some extent once people's attention starts shifting to other matters, rather than playing voyeurs to some narcissistic exhibitionists. As it stands, it seems to reflect what's been going on in terms of TV show entertainment for many years now: people exhibiting all kinds of entirely personal quirks and traits, with tons of viewers obviously enjoying it, too.
So what's actually driving it? In a nutshell: atomization. With large families and tightly knit rural communities losing ground in favor of "individualism" and an ever more disrupted social fabric, overall societal stability can only be achieved by marginalizing the individual, feeding it (and dumbing it down) with lots of vicarious pleasures in lieu of actual participation in political, economic and societal power - call it the ideology of consumerism, if you will. It's one price we're paying for our physical mobility and mental flexibility: the waning influence of the individual, i.e. the very same atomization I've mentioned.
What the Web does offer us is a slew of possibilities to at least create some noise and garner a bit of attention - without more immediate social controls being in place to set us stringent limits like we would have experienced in meatspace. Further, anonymization helps bypass even those controls that have actually been implemented: if your forum moderator chucks your account for whatever reason, it's dead easy to sign up under a different identity to continue creating a stink if that's what you're up to.
Don't get me wrong: I'm not bashing the Web in any way - it offers everyone an incredible amount of wonderful possibilities we've never seen before at such a scale. Think of all the options you have in terms of gathering information on anywhich topic, or of mustering support for a cause you feel strongly about, to name but two examples.
But there's an obvious downside to it as well: as everything is essentially accessible to everyone, you're bound to hit upon lots of people you may find obnoxious or boring or outré - certainly more than you did at college or in the rural community where you grew up in pre-Web times.
Why is it that Google thinks highly of public relations (even if founded on lies) but thinks poorly of most other bulk link building strategies?
Well, as Bob Massa never tires of pointing out, a search engine's primary objective is NOT to "deliver relevance" as so many people are fond of fooling themselves and others, it's to make a profit, period. Verbatim: "A search engine's primary purpose is NOT to deliver relevancy. A search engine's primary purpose is to deliver revenue. That is not the same thing."
While many SEOs still seem to find it hard to come to terms with that, it's pretty obvious that the folks over at Google were pretty slow to learn that lesson themselves. Oh, they certainly did so in the end, and with a vengeance, too. But along with this came all the other trimmings that will make or unmake just about any commercial enterprise, including an ingrained preference for compensating low pay with lots of feel-good high talk for the suckers. See Michael Arrington's summary "Why Google Employees Quit" for some pretty telling insights.
Of course, hypocrisy plays a major role in this field as well: just like "spam" is always what the other guy is doing, not you yourself, "public relations" is always ok for Google if it helps you ramp up your company to potential client status. At the end of the day you'll have to conduct a lot of public relations to be able to afford some serious AdWords advertising - simple as that. So it makes no sense killing the cows you actually want to milk further down the road.
By contrast, however, undetected paid links will negatively impact Google's fundamental business platform because they can't really deal with them effectively, being as link-biased as they are (or used to be) - so they're bound to be slated as a big no-no from their point of view.
None of this is illogical in any way - but of course that doesn't mean that we as SEOs have got to like or condone it. I know for sure that I don't...
In many ways (nofollow, nepotism, publishers requiring payment for links) the "organic" link has died a slow and painful death. Do you see Google and other search engines moving away from linking as a core component in their relevancy algorithms?
Personally, I tend to view Google's ongoing campaign of stressing the "evils" of undisclosed paid linking as a sign of utter desperation. Yahoo! and MSN/Live as well as Ask, while still relying heavily on links themselves, aren't half as outspoken or, more precisely, as hysterical about it.
I am also on record umpteen times as having pointed out that PageRank and, in fact, all ranking technologies unduly biased towards inlinks suffer from a fundamental fallacy. Because links may be lots of different things to many people, but they're definitely not simple "votes" in the sense of unequivocal acceptance, recommendation or endorsement, i.e. quality. At the very least, that constitutes only a tiny fraction of their overall functionality.
To reiterate, PageRank in its original form was nothing but an overblown and hyped citation index, directly derived from academia's predilections: in the past 40 years or so it's become a very popular metric to grade scholars by the number of citations they can ramp up, very much in line with their overall "publish or perish" career criteria. Allow me to point out, however, that this is essentially a culture thing: on the whole, European academics, to cite a contrarian example, have always stayed aloof from this mindset. Plus, competition is just as fierce and cutthroat in their world as it is in the "outside world" of regular commerce. I'm not sure there's actually a lot of "citation buying" going on in the academic universe, but frankly I wouldn't be too surprised if there were.
Be that as it may, a citation index makes even less sense in a commercial environment than it may possibly do in academia. Why should you want to link to your competitors? Why should they link to you? And if I happen to link to some article of yours I happen to be in violent disagreement with, trying to refute it in all bitterness and ridiculing you in the same stride - does that link constitute a "vote" even in terms of "relevancy"? Or a "quality" indicator? That's like arguing that Jewish activist sites rightly pointing out anti-Semitic or racist pages they are in disagreement with are actually endorsing them. So what if thousands of Jewish pages link out to the same revisionist neo-fascist site until it starts ranking above them all? That's plain ridiculous.
I mean, is any old "reference" a "vote" or even an indicator of "relevancy"? Sure, pointing to your sources to underpin your arguments will lend them (and you) more credibility, just like in academe. But make no mistake: such questions aren't as clear cut and easy to answer as one may wish to think - after all, philosophers have been wrestling with such issues for centuries for a slew of good reasons.
So if linking as a signal of relevancy is flawed at the very best, what alternatives do the search engines actually have? And in a more direct response to your question proper: I am seeing a lot of experimentation being conducted these days, ranging from behavioral metrics to personalization of search. SERP hand jobs seem to be hitting it big now, too, certainly as far as extremely competitive niches are concerned, think PPC in the "black hat" sense of "pills, porn and casino" sites.
While it may still be premature to term this the "return of on-page factors" as a critical ranking element, we're actually seeing a lot of this happening again, albeit in a very pussyfooted manner.
As more people compete for attention online do you see that increasing or decreasing the quality of the web as a whole?
That's a bit like asking whether the glass is half full or half empty, I'd say. The Web is expanding, that's a fact, of course. Obviously, this applies to what you or I may consider the "bad" as much as it does to what we deem to be "good", whether it's sources of information or common behavioral traits.
In many ways it's like a commotion in the rural marketplace: the more people join the fray, the louder it tends to get - and the more aggressive you'll have to be when competing for attention.
But if you shun the crowd to retire to your private club and meet with your peers, things tend to get a lot more quiet and comfy again. This is actually happening on quite a large scale these days: there are lots of "closed shop" forums and communities online that will strictly vet their members to keep out the riffraff.
Google's CEO recently stated that "brands are how you sort out the cesspool" and that humans were hardwired for brands. Did it surprise you when he said that?
Frankly, I hope I'll never live to see the day when the likes of Eric Schmidt actually manage to surprise me. I mean, what to make of a man who is on record for blithely stating that World War I was caused by a "lack of understanding" between nations - something he claims Google will actually help prevent? Sure, this may be the Reader's Digest naive version of how WW I came about, but it certainly doesn't reflect reality in any meaningful let alone accurate or verifiable way. What it does reveal, of course, is a picayune, self-serving and utterly petit-bourgeois mindset. (And no, I won't dig into the question of where the 20th century fascists used to recruit the lion's share of their followers...)
Ok, so he's obviously no qualified historian - but is he an anthropologist, then, making even more asinine claims like this one? "Hardwired" according to Mr Schmidt the neurologist, eh? And what, pray, makes the Web a "cesspool", anyway?
No, I'm not surprised at all: brands are what Schmidt and his chums are comfortable with, what they flatter themselves to understand well. Well, perhaps they actually do, but really, my only reasonable comment on this one is: "garbage in, garbage out"...
Search penalties are well known to be two tier depending on things like "brand." How does one know how far to push while staying within their desired risk/reward ratio?
For all the ballyhoo ramped up around "scientific SEO" (and, for that matter, "scientific marketing" - of which SEO is arguably but a minor subset), it's always been about trial and error and - and this is really important! - educated guesswork. Because the cards have always been stacked from day 1: the search engines won't allow us to study and review their ranking algorithms (which, from their perspective, is perfectly understandable, of course). Also, they can exploit vast amounts of usage data no single SEO company can ever hope to match even remotely - and thus they're always leaving us with the short end of the stick. Which, in statistical terms, means that we as SEOs can never hope to get the full picture anyway.
But even if it's a David vs. Goliath kind of scenario, the search engines' major weakness is their requirement to turn a buck. This makes them just as vulnerable to advertiser pressure tactics as most classic deadwood newspapers are and, in fact, always were.
When all is said and done, you cannot ever really know for sure how much is too much of anything: every niche is different and there's no such thing as a golden key to them all. So it's a question of learning, usually the hard way, of trying out different things, both old and new, of testing, testing, testing.
On the upside, if you're not concerned with branding so much, you can easily skew that risk/reward ratio in your favor by essentially cloning your sites (yes, modify them a bit so they're not all-out dupes) and running various SEO strategies for them. That way, you'll probably get more exposure while minimizing your risks. Should one or several of your sites underperform or even get penalized, you'll still have others that should perform well enough. So it's really about scaling done properly.
The reliance on brand and domain authority has lowered result diversity on many fronts. Will the fear of spam cause Google to keep clamping down on diversity, or will mom and pop shops still have a chance online 5 to 10 years from now?
This will probably depend on how the search market will evolve in general. If people should get fed up with getting served more and more brands they've known about anyway, this approach may lead to a dramatic loss of market share. If so, Google's only choice will be to push back brands in favor of lesser sites and more diversity again.
Nor is this entirely unrealistic: brands are one thing, but consumer experience with these brands' products is quite another. Personally, if I want to know more about some product being offered online, I'll inquire on Twitter where I'll typically get a ton of useful responses in a jiffy - no way Google or any other major search engine can match this presently. And I'm certainly not alone: I know lots of people who are doing exactly the same now.
Then, when I'm finally ready to buy, I don't need Google to compare offers and prices, either. Once I've bookmarked my favorite comparison sites, I can merrily fulfill my consumer duties without hitting any major search engine at all in the process.
What I'm not sure about is whether people will actually go to the lengths of explicitly demanding other, better search results from Google etc. It seems more likely that they'll simply vote with their mice and go elsewhere - that's a lot easier and faster to do than having to deal with a sluggish, unresponsive behemoth of a corporation.
Generally speaking, I'm afraid I don't see mom-and-pop shops gaining any leeway within the foreseeable future as there's nothing to indicate currently that they actually will. But then, 5 to 10 years is a time span I'd be loath to predict for anyway: too many unknown variables at work here. Two to three years seems a more tangible time frame, and I doubt we'll see any major improvement of small web sites' clout and standing within that span.
Is search an already won natural monopoly? If not, what do you see hurting Google from a competitive standpoint?
For all its indisputable explosion and evolution in the past 15 years or so, search is still in a very primitive, almost primeval stage in my view. Think "Deep Web", the surface of which has hardly been scratched as yet - and yes, think "relevancy", too: we're still very much experiencing the Stone Age of search currently. By inference, search is bound to undergo some very fundamental changes pretty soon, and so will searchers' requirements and expectations.
The way many Web 2.0 sites are starting to impact search as we knew it is a good case in point. I've mentioned my own Twitter usage by way of some anecdotal evidence. Sure, Twitter may still turn out to be a mere ephemeral fad in the end, the way MySpace hasn't managed to live up to its original overblown promise. There are many people predicting just that, and who knows - maybe they're right.
But no matter who evolves to become the biggest boys on the block in the end - and it seems very likely that there'll be several of them - this is where current crawler-based all-purpose search is certainly beginning to hurt. If eyeballs are really everything, I for my part wouldn't want to bet the farm on Google maintaining its current monopoly of the search space for very much longer. And I don't see Google being all smug and ignorant about it, either: it's one of the reasons why they're expanding into so many different fields ranging from mobile communication technology to trans-Pacific data cables, book digitization and online document storage, to mention but a few.
For all we know, we may well witness the return of the vertical fairly soon. This would actually dovetail nicely with the prevailing trend towards ever more granular specialization and specificity. Highly specialized information archives focused on specific fields of expertise and an equally selective user demographic - be they directories or portals or crowdsourced networks or databases - may well be the one big thing to watch out for.
What have you been up to lately? Do you have any new products or services launching soon?
While we're best known for our cloaking applications, our activities are actually a lot more varied than that. For example, our 100% "white hat" "10 Links A Day" link building service over at http://10LinksADay.net/ is another major focus of ours.
Beyond that, we're very busy developing proprietary technology in the field of automated content creation: targeted towards clients' specific requirements in terms of topicality, keywords and links in a scalable manner, this is what I'm most involved in myself currently. Moreover, the content we're creating is all 100% readable and entirely unique stuff of an unprecedented quality, if I say so myself.
Having presented this to our 10 Links A Day clients as a special, subscribers only offer up until now, we'll soon roll it out as a stand alone service named "Customized Content Creation" (CCC).
___________________________________________
Thanks a bunch Ralph! To read his latest thoughts on search, check out his blog at http://fantomaster.com/fantomNews/
Comments
Ralph,
I really love your work. I believe your products are some of the most powerful weapons in the SEO world, and most self-employed internet marketers are scared to use them due to FUD.
P.S. There is nothing more powerful than auto tweet creation (link to the Google search on your tweets about the blog post that talks about "submitting sites to ask.com" from July 2007). The way you manage to diversify the text in the tweet is brilliant.
Brilliant work. Look forward to future products,
Mert
Excellent interview. Mr Schmidt the neurologist, hehehee... Waiting for "Kumo", the new search engine by Microsoft, but I don't have high expectations for that, especially since it is powered by Microsoft. But, anyway, a competitor for Google should be a good thing.
I share Ralph's thoughts on Google in that they're still very much relying on links after all these years despite the FUD they come out with. Matt Cutts just repeats the "great content gives you links" mantra over and over. Well, I'd rather be more proactive about link building while still writing quality content.
Also agree that links aren't always a vote. You can link to a site you despise via a lengthy criticism of this site in a forum, yet your link (according to Google) is a vote FOR this site.
Also, another problem with Google's heavy reliance on links: really natural links tend to have meaningless anchor text ("click here", "this" etc). People are linking for the visitor, not the search engine. SEO'd links on the other hand are written with the algorithm in mind. So naturally won links are often less valuable than the "proactively" gained links.
Interesting interview. I must admit I had no idea Fantomaster/you was/were from Germany (like myself). That very fact and the interview I just read makes me want to ask a question:
Whenever I did keyword/market research for a website I was trying to build, I was hoping I could leverage my German/French language skills in order to build a site in a market with less competition (the German/French market).
However, most of the time the English SERPs seemed a lot more interesting... more often than not simply because our German SERPs were cluttered with "authoritative" sites... or brands... or whatever you want to call it - it looks ugly and uninviting to an SEO ;-).
I've been wondering for quite some time whether that had to do with our algorithms for Google Germany/France being somewhat different or if it simply came down to less mom and pop shops doing real SEO/link building to get to the top. My guess would be that there's a difference in the algorithm, because otherwise I'd expect to still see more sites rank "naturally" (I might be wrong of course).
Any idea, if algorithms of different tlds feature brands more strongly - maybe because Google doesn't have too many (none?) people that can review the German/French SERPs and do handjobs? Maybe our German/French SERPs are what google.com will look like if the brand things continues?
Ralph is so obviously a man of incredible intellectual fortitude but for all his mental prowess and complete in-your-face upfrontedness, what I have always respected most about him is his clarity of vision and his ability to express that vision in a way that makes it difficult to argue with.
I admit that my cyber career, (as opposed to a REAL career), would be greatly diminished were it not for the fantopinions of this man! I am grateful for such a clear and honest beacon of enlightened vision amidst a dark void of original thought and corporate self-serving gooblespeak.
Plus I like the way he tries so hard to "fit into" the mainstream. >for those who don't know -- that was drowning in sarcasm<
BTW Ralph, thanks a lot for the heads up on the blog translations. Pulling those down cost me almost 3 hits a day! ;-)
massa
rel="nofollow" doesn't automatically come to mind, but it's also recommended in Google's Search Engine Optimization Starter Guide, Version 1.1 (p.19) :P
"Another use of nofollow is when you're writing content and wish to reference a website, but don't want to pass your reputation on to it. [...] This would be a good time to use
nofollow."
Thanks for this excellent, mind-opening interview.
This is one of the articles I will come back and read again from time to time, especially for its non-technical content!
I really enjoyed reading this. Any time I hear someone speaking their (SEO) mind without falling in line with what google is constantly pushing... well I appreciate it. Fantomaster, I just went from follower to fan.
~ Jim
~ @seo_web_design
Ralph is the man.
And I didn't really connect with him until Twitter came along...he's got some great updates!
Thanks for the kudos, guys, much appreciated!
@Patrick: While I don't really follow the German search space too extensively, being focused on the English language markets, from what I can actually discern I'd agree with you that it's primarily an algo thing. This makes a lot of sense even from the scientific angle: overall, computational linguistics, especially Natural Language Processing (NLP) is very much concentrated in the English speaking world (mainly: academe). And while there's plenty of NLP tech around now that's entirely language agnostic, dealing with different languages as a whole poses tons of additional problems of its own.
One prime example being automated translation. Guess it's only the ultra savvy (ahem) minds of Mr Schmidt and his ilk who would seriously believe that accurate and usable machine translation is a realistic perspective for the foreseeable future...
So to cut a long story short: it's simply a whole lot more complex to adapt anywhich English-biased ranking algorithms to non-English content - and "more complex" will typically translate to "more expensive".
I guess that's why many German, Dutch and Scandinavian SEOs I happen to know personally will concede (albeit merely in private for all the obvious reasons) that doing SEO for German or Dutch or Swedish etc. is very much like shooting fish in a barrel if you know what you're doing. There seems to be an unofficial consensus that what we're seeing in these search spaces is essentially suffering from (or enjoying, depending on which side of the fence you're invested in) a 2-3 years' development gap as compared to English language search.
We cannot, however, entirely rule out that second factor you've mentioned as well: generally, German mom-and-pop shops seem to be even more underdeveloped/unsavvy re SEO than their average US counterparts. So this might certainly help skew results as well.
@massa: So chagrined about those 3 missing hits a day, Bob! Though they were probably bots anyway, lol. Maybe some decent, properly translated content will do the trick in the long run. I mean, it's not as if there were an overabundance of quality SEO content in the German Web currently - though I'll gladly agree that the same may be argued for English as well, heh.
Thanks for the reply fantomaster... interesting (in particular) that you seem to believe that machine translation (that actually works) won't happen in the foreseeable future. I used to want to go the translation (and computational linguistics) route when I was trying to find a major in college, so I dug into it a bit and read stuff about computational linguists sort of having given up on that notion (of it happening in the foreseeable future).
However, in the field of SEO (I remember a comment on this blog, actually, where someone gave the reason that computational power will sort of explode at some time in the near future... or something along those lines) I've been hearing opinions different from that... I guess one only needs to look at Babel Fish hehe (but personally I don't understand this topic well enough to be able to tell when or if it'll ever work... automatic translation).
@Patrick: While no one who isn't a true psychic (if there is such a thing...) can accurately predict the future, historically all attempts at automatic translation have been utter failures, at least if you actually demanded a modicum of consistent quality. (And even if you downgrade your expectations to a mere "halfway intelligible" it won't work that well in just about all areas with very few exceptions, e.g. some highly technical chemistry papers.)
Back in the 50s and 60s, the buzz was all about "MT" (Machine Translation), driven by overblown expectations tied to Chomsky's Generative Grammar, Transformational-Generative Grammar etc.
Gradually, it petered out to a mere MAT ("Machine AIDED Translation") which, of course, is an entirely different animal. As it stands, there's little to indicate that we'll see such a breakthrough as you're referring to anytime soon.
I may be wrong, of course, and very probably highly biased, having worked as a freelance translator for 15+ years. But I wouldn't know of one single major linguist or Natural Language Processing (NLP) expert who'd currently disagree.
Listening to the hype and drivel spun by techies and marketing people like Mr Schmidt, you could be excused for believing otherwise, but I'd bet the farm that you'd be entirely wrong in doing so.
Thanks fantomaster, I didn't really buy into it (though I was a bit confused, I must admit), because what I had read before - coming from computational linguists (I was considering majoring in it in college) - didn't sound overly optimistic; they had basically given up or, well, lowered their expectations, etc... which I guess speaks volumes, as they would seem to be biased towards believing in it (as it's their main career).
Hi everyone, and thank you Ralph for your insight and words.
About local search algorithm compared to the English market.
I have personally not seen any difference in my local Google, that is google.se, compared to google.com.
Maybe there is a difference, but my guess is that there actually isn't, and if there is, it is very minor.
Also, the authorities that are taking the first 10 spots are usually only authorities for the main keyword that they have targeted. They may rank for other long tail keywords, but it's usually not very difficult to rank over them with an SEO-correct article that is targeting the long tail.
Best regards
Amin
@Amin: Checking traffic logs, I get to see quite a bit of local search results across just about any Google entity. Sometimes SERPs will vary wildly, sometimes not at all. Haven't dug into it at greater length to determine any consistent patterns.
As for surpassing authority sites on the long tail: yes, that's what we're seeing most of the time, too. In a way, pimping authority sites in the SERPs for the short tail, as Google are doing so frequently, actually adds to that impression of utter desperation I mentioned in the interview - almost as if they weren't "sure of themselves" (or however you want to define algorithmic uncertainty, heh), gladly allowing other sites to cover the more granular stuff.
Which, if true, is a very good thing for sites specifically (and preferably: intelligently) targeting the long tail.
That's weird - my response that I wrote yesterday hasn't displayed...?
Fantomaster, the short-winded version of it is that I think this will go down as one of the best interviews of the year by far - a truly awesome, honest and insightful interview.
Thanks for sharing your thoughts with us.
Ben
Not sure what happened to that comment Ben. Usually they post instantly. Maybe the database did something weird, there was a cookies issue, or we accidentally deleted it. :(
I delete overt comment spam from people like buy viagra online, but have never intentionally axed one of your comments.
Not to worry - these things happen. Regardless, it's still set to be the interview of the year!
:)
Great interview, thanks Aaron and especially Ralph.
The preferential treatment in Google is one of the things that makes me despise the company more each day. Take This Query for example: a premium AdSense publisher using simple 90's user-agent cloaking, who also removed the cache link with a noarchive.
Nothing but a premium ad farm with a snippet of useless text - switch your user agent to Googlebot and, dang, there's the 6-page article that contains the text I searched for.
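For anyone who wants to reproduce that kind of check, here is a generic sketch (my own illustration, not tied to the specific site in question; the URL is a placeholder). Bear in mind that spoofing the user agent only reveals this crude user-agent variety of cloaking - IP-based setups won't show up this way:

```python
import urllib.request

def fetch(url: str, user_agent: str) -> str:
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

url = "http://example.com/some-article"  # placeholder, substitute the page in question

as_browser = fetch(url, "Mozilla/5.0")  # generic browser-style user agent
as_googlebot = fetch(
    url, "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

# Wildly different lengths or bodies suggest user-agent based cloaking.
print(len(as_browser), len(as_googlebot))
print("identical" if as_browser == as_googlebot else "content differs by user agent")
```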
Google "has" to be going out of their way to allow stuff, because if anyone else done it the regular Googlebot would ding the site in a heartbeat and your Adsense account would be history.
I know this was reported to Google over 6 months ago, and someone posted it on Matt Cutts blog (he deleted it).
There's thousands of other examples, and seeing such blatant bias every day gives me absolutely no motivation to follow Google's recommended practices.
Keep up the great work Ralph.