Google Gearing Up for Relevancy Changes

Over the past year or two there have been many changes, with Google pushing vertical integration, but outside of localization and verticalization, core relevancy algorithms (especially in terms of spam fighting) haven't changed much recently. There have been a few tricky bits, but considering how much more powerful Google has grown, their approach to core search hasn't been as adversarial as it was a few years back (outside of pushing more self-promotion).

There has been some speculation as to why Google has toned down their manual intervention, including:

  • anti-trust concerns as Google steps up vertically driven self-promotion (and an endless well of funding for anyone with complaints, courtesy Microsoft)
  • a desire to create more automated solutions as the web scales up
  • spending significant resources fighting site hacking (the "bigger fish to fry" theory)

Matt Cutts recently published a post on the official Google blog which highlighted that #3 was indeed a big issue:

As we’ve increased both our size and freshness in recent months, we’ve naturally indexed a lot of good content and some spam as well. To respond to that challenge, we recently launched a redesigned document-level classifier that makes it harder for spammy on-page content to rank highly. The new classifier is better at detecting spam on individual web pages, e.g., repeated spammy words—the sort of phrases you tend to see in junky, automated, self-promoting blog comments. We’ve also radically improved our ability to detect hacked sites, which were a major source of spam in 2010. And we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content.

It sounds like Google was mainly focused on fighting hacked sites and auto-generated & copied content. And now that hacked *GOVERNMENT* websites are available for purchase for a few hundred dollars (and perhaps millions in personal risk when a government comes after you), Google's push toward fighting site hacking looks like a smart move! Further, there is a wide array of start-ups built around leveraging the "domain authority" bias in Google's algorithm, which means that looking more at page-by-page metrics was a needed strategy to evolve relevancy. Page-by-page metrics will also allow Google to filter out the cruddy parts of good sites without killing off the whole site.

As Google has tackled many of the hardcore auto-generated spam issues, it can ramp up its focus on more vanilla spam. Due to a rash of complaints (typically from web publishers & SEO folks), content mills are now a front-and-center issue:

As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content. In 2010, we launched two major algorithmic changes focused on low-quality sites. Nonetheless, we hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect. The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception.

Demand Media (DMD) is set to go public next week, and Richard Rosenblatt has a long history of timing market tops (see iMall or MySpace).

But what sort of sites are the content mills that Google is going to ramp up action on?

The tricky part with vanilla spam is its subjective nature. End users (particularly those who are not web publishers & online advertisers) might not complain much about sites like eHow because they are aesthetically pleasing & well formatted for easy consumption. The content might be at a low level, but maybe Google is willing to let a few of the bigger players slide. And there is a lot of poorly formatted expert content which end users would rate below eHow, simply because it is not formatted for online consumption.

If you recall the Mayday update, Richard Rosenblatt said it increased their web traffic. And Google's October 22nd algorithm change last year saw many smaller websites careen into oblivion, only to re-appear on November 9th. That update did not particularly harm sites like eHow.

However, in a Hacker News thread about Matt's recent blog post he did state that they have taken action against Mahalo: "Google has taken action on Mahalo before and has removed plenty of pages from Mahalo that violated our guidelines in the past. Just because we tend not to discuss specific companies doesn't mean that we've given them any sort of free pass."

My guess is that sites that took a swan dive in the October 23rd timeframe might expect to fall off the cliff once more. Where subjective search relevancy gets hard is that issues rise and fall like ocean waves crashing ashore. Issues that get fixed eventually create opportunities for other problems to fester. And after an issue has been fixed long enough, it becomes a non-issue, to the point of being a promoted best practice, at least for a while.

Anyone who sees opportunity as permanently disappearing from search is looking at the glass as half-empty, rather than seeing opportunities that died being reborn again and again.

That said, I view Matt's blog post as a bit of a warning shot. What types of sites do you think he is coming after? What types of sites do you see benefiting from such changes? Discuss. :)

How To Measure Bias In Google's Results

Here's an interesting study, conducted by Benjamin Edelman and Benjamin Lockwood, from the Harvard Business School. The study measures how much search engines, Google in particular, favor their own web services.

We find that each search engine favors its own services in that each search engine links to its own services more often than other search engines do so. But some search engines promote their own services significantly more than others.... we find that Google's algorithmic search results link to Google's own services more than three times as often as other search engines link to Google's services. For selected keywords, biased results advance search engines' interests at users' expense

People have debated this topic for a while: some say the search engines can do what they like, while others feel the search engines must be held to account.

However, the study brings up an important point. If Google claims to have algorithmic, "objective" search results, then it follows that Google should not favor its own properties unless those properties achieve a top ranking on their own merit.

Google can't have it both ways.

The problem, of course, is that Google could tweak the algorithm to favor whatever qualities its own properties display e.g. the PageRank of Google's own pages could be calculated - in truly cryptic and oblique fashion - as being of higher "worth". After all, there's no such thing as "objective" when it comes to editorial, which is the function of a search algorithm. There are merely points along a continuum of subjectivity.

But where it gets interesting is the study goes one step further. It tries to figure out what the user wanted when she searched. Did the user want to find a Google service at #1? And if not, then isn't Google doing the user a dis-service by placing a Google property at #1?

In principle, a search engine might feature its own services because its users prefer these links. For example, if users of Google's search service tend to click on algorithmic links to other Google services, whereas users of Yahoo search tend to click algorithmic links to Yahoo services, then each search engine's optimization systems might come to favor their respective affiliated services. We call this the "user preference" hypothesis, as distinguished from the "bias" theory set out above

They tested this theory using click-through data. Regardless of the search keyword, users almost always favor the #1 result - 72% of the time. So what if the user clicks further down, indicating that the first result is less relevant?

Gmail, the first result, receives 29% of users' clicks, while Yahoo mail, the second result, receives 54%. Across the keywords we looked at, the top-most result usually receives 5.5 times as many clicks as the second result, yet here Gmail obtains only 53% as many as Yahoo. Nor was "email" the only such term where we found Google favoring its own service; other terms, such as "mail", exhibit a similar inversion for individual days in our data set, though "email" is the only term for which the difference is large and stable across the entire period

There is a huge incentive for search engines, which are increasingly crossing the line into publishing territory, to skew the results towards their own properties. The traffic is valuable and, what is more, can be channeled away from competitors.

As Aaron pointed out a few months ago, if Google choose to enter a new vertical, such as travel or local, then you'd better watch out if you compete in those verticals. Regardless of how relevant you are to the search term, it's below-the-fold you'll likely be going.

So, yes, it may be Google's search engine, but they can't make claims about focusing on the user above all else, otherwise they'd return results the user wants, as opposed to possibly directing the user to Google properties due to other considerations. How can they claim "Democracy works", if they don't favour whatever site the link graph "votes" most relevant? And doesn't this come down slightly on the wrong side of "evil"?

So, What To Do?

If you feel Google can position their own sites where they like, then nothing.

Personally, I think any company can do what they like, until they reach a point where they become so influential, they can use their sheer size to reduce competition and choice. If we believe that free markets require healthy competition in order to thrive, then we should be wary of any entity that can reduce competition using anti-competitive behavior.

I'm not saying that is what Google is doing, but watch this space. Some European agencies are investigating allegations of anti-trust violations.

The Commission will investigate whether Google has abused a dominant market position in online search by allegedly lowering the ranking of unpaid search results of competing services which are specialised in providing users with specific online content such as price comparisons (so-called vertical search services) and by according preferential placement to the results of its own vertical search services in order to shut out competing services

The fact Marissa Mayer said this:

[When] we roll[ed] out Google Finance, we did put the Google link first. It seems only fair right, we do all the work for the search page and all these other things, so we do put it first... That has actually been our policy, since then, because of Finance. So for Google Maps again, it’s the first link

....makes matters......interesting ;)

Secondly, if you're big enough, you could make a point of taking Google on. Check out Trip Advisor's take on Google Places displaying Trip Advisor's data in repackaged form, which could cause Google users to stay on Google and not go to the Trip Advisor site:

Google is no longer able to stream in reviews from TripAdvisor to Places pages after the user review giant blocked it. TripAdvisor confirmed the move today in an email, stating that while it continues to evaluate recent changes to Google Places, it believes the user does not benefit with the "experience of selecting the right hotel". "As a result, we have currently limited TripAdvisor content available on those pages," an official says

But Google isn't really going to care much about you if you don't have some major clout.

Thirdly, stay out of any vertical Google is likely to want to own. It is likely that Google will be going after the big verticals, because a big company needs to score big on projects. Long tail stuff isn't going to make any difference to their bank balance, except in aggregate, so there will be millions of verticals in which you'll never face a direct threat.

This is also a timely reminder to build up your non-search traffic in case Google, or any other search engine, decides to change the game significantly in their favor. Encourage users to bookmark, develop your social media brand, build mailing lists, put some valuable content behind log-in/pay walls, and build membership sites. Relying on Google has always been a risky strategy, so diversify your traffic sources where you can in 2011.

Google Approaches Its Breaking Point

Google's Take On SEOs

Google likes to make SEOs look like fools. Some are, but some are simply privy to less information, or, in some cases, thrown under the bus by a new wave of editorial policy in the gray area. Inconsistent enforcement is a major issue, but even beyond that, the truth is most businesses have a range of revenue streams, from pure as can be to entirely parasitic.

Google is Sleazier Than Microsoft

Recently we saw Matt Cutts play investigative reporter & literally create a story about how dirty some of Bing's affiliates are. Here is the litmus test though: when Microsoft became aware of it they immediately canned the shady distribution partner. Meanwhile, Google still funds toolbars that put AdSense ads in them AND to this day Google still funds google.bearshare.com, which *is* driven by the same kinds of home page changes that Matt found distasteful.

In Manufacturing Consent, Noam Chomsky highlights that we should judge actions based on an equality of principles & that we are responsible primarily for our own actions. Yet Google complains about Microsoft. It took Microsoft less than a day to clean up their act, while Google still hasn't fixed issues that were highlighted publicly 6 years ago!

Many Subjective Warnings

Not only is Google trying to police their competitors, but recently they have offered warnings on all sorts of subjective issues, like...

Individually, each of those issues can be debated.

In our new site design our navigation is aggressively repetitive in some areas. The reason we did that was some people complained about not being able to effectively get around the site. To help make the navigation more intuitive and consistent we use drop downs and in some cases have 3 or 4 or even 5 links to the same location. Is that optimal from a search perspective? Probably not. But then again, search engines don't convert into paying customers. They are simply a conduit...a means to an end. When an engineer views a site they might not view it through the same lens as a customer would.

What is an unnatural link profile? Does it depend on who is building the links? We know that at an SEO conference when some of IAC's cross linking was highlighted Matt Cutts stated "those don't count" but didn't qualify it any further. Likewise when it was highlighted how Mahalo was link farming we were told that they deserved the benefit of the doubt. Since then the link farms have grown and mutated. I won't link at ask.inc.com/when-should-i-hire-a-company-for-lead-generation, but if I was told that the following is "natural" and "good to go" then I would have no problems building a few million links a week. Then again, I bet it would be "unnatural" if I did the same thing.

The part about treating Googlebot different from users is a bit perplexing. As technology has evolved this area has become quite blurry/murky.

  • Sometimes when clicking into big media sites that are 'first click free' I get kicked right to a registration page. In the past some iTunes pages would rank & force you into the iTunes software (though that may have recently changed).
  • Google ranks certain Youtube content in international markets, even where said content is unavailable.
  • Scroll cloaking has been around forever.
  • Tools like Google Website Optimizer can be used to alter user experience significantly.
  • There is an SEO start up which pushes search visitors to sites like CNN to a heavily ad wrapped & paginated version of the same content.
  • I accidentally screwed up using a rel=canonical on a page (cut the source code from a dynamic page and pasted it as the basis for a similar static page & forgot to remove the rel=canonical tag). Eventually I figured out what was wrong & fixed it, but both the correct and incorrect pages ranked for weeks at #1 and #2. And isn't the whole point of the rel=canonical tag to give the search engines a different type of header than an end user (telling the search engine that the content is elsewhere while telling the user nothing of the sort)?
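The rel=canonical mishap above is easy to repeat whenever you build a static page from a dynamic page's source. As a minimal sketch (the file contents and URLs here are hypothetical, and the regex assumes the common `rel` before `href` attribute order), a quick scan of your HTML for canonical tags pointing at a different URL can catch the mistake before the engines do:

```python
# Minimal sketch: flag pages whose rel=canonical tag points somewhere
# other than the page's own URL -- the copy/paste mistake described above.
import re

def find_canonical(html):
    """Return the href of the first rel=canonical link tag, or None."""
    match = re.search(
        r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    return match.group(1) if match else None

def check_page(html, expected_url):
    """Report a page whose canonical tag disagrees with its own URL."""
    canonical = find_canonical(html)
    if canonical and canonical != expected_url:
        return "stray canonical: %s -> %s" % (expected_url, canonical)
    return None
```

Run something like this over every static file you paste together from templates; a real site would want a proper HTML parser, but even this crude check would have caught the weeks-long #1/#2 duplicate described above.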

But which of those is a problem? None of them? All of them? Does it depend on who you ask? Does it depend on perceived intent? And that is the tricky part because the same impact can be had with many different tools like 301 redirects, meta refreshes, javascript redirects, and rel=canonical. Should they penalize the technique, the intent, or the result? How do they imply intent?
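To make the "same impact, many tools" point concrete, here is a minimal sketch of three of those mechanisms side by side (the URLs are hypothetical placeholders):

```python
# Sketch: three mechanisms that can move a visitor (or a crawler) from
# one URL to another. Same end result for the user; search engines treat
# each quite differently, which is why inferring intent from technique
# alone is so hard.

def http_301(new_url):
    """Server-side permanent redirect: status line plus Location header."""
    return ("301 Moved Permanently", [("Location", new_url)])

def meta_refresh(new_url, delay=0):
    """Client-side redirect embedded in the page's HTML head."""
    return '<meta http-equiv="refresh" content="%d; url=%s">' % (delay, new_url)

def js_redirect(new_url):
    """Client-side redirect executed by the browser's JavaScript engine
    (invisible to a crawler that does not run scripts)."""
    return '<script>window.location.replace("%s");</script>' % new_url
```

A 301 is visible to every crawler, a meta refresh is visible to anything that parses HTML, and a JavaScript redirect only fires for clients that execute scripts, so identical user experiences can look wildly different to an engine trying to judge intent.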

Relevancy vs Market Manipulation

Spam is in the eye of the beholder, as is relevancy.

The thing is, Google is in a position to imply intent as they see fit. They are in a position to tilt the playing table as they see fit. They claim to be open and sometimes they are fighting the good fight, but businesses have a range of revenue streams from pure as can be to entirely parasitic.

The leaked internal Google documents about copyright and ads on trademarks certainly highlight that Google has no problem with a foot in each pond.

Syndication has long been a part of the media landscape, where portals chose what bits to mix in from where. But how much is fine & what should be done with duplicates? When does something go from 'legitimate syndication' to 'overt spam'? We see official Google blog posts which claim that AdSense ads are not allowed on unoriginal content, while Google has multiple large partners that wrap Google's search results in the AdSense feed and then serve it back to Google. Site categories which were described as 'shoot on sight' become viable enterprises when a person puts a web 2.0 design, venture capital & some public relations into the same basic business model. If Google is going to put out some 'thou shalt not' styled commandments under the label of 'fact vs fiction' they should have consistent enforcement of obvious issues that have been brought up publicly numerous times, including on the very post highlighting the policy. But we know they won't! They only care about dirty business practices if they are not getting a taste of the revenue stream (as shown by their BearShare partnership while policing Bing affiliates).

Based on that sort of activity, when Google announces a preference while promoting "openness" it is easy to see it as a step backward, hypocritical, or even as a farce. Embrace, extend, extinguish.

As Google pushes more to advertise itself on its ad network it is no surprise that they were eventually forced to disclose payout percentages. But it took a lawsuit to do it.

After purchasing Youtube Google rolled out their universal search & was fine with aggressively promoting Youtube over other video services. Only recent government reviews have pushed Google to give 3rd party services a fair shake, but the network effects and lead are likely already too great to overcome.

Due to past search bias, Google might get blocked from completing the ITA deal. The good news for publishers going forward is that, due to increasing regulatory heat, Google will only go after a small number of verticals where the payouts are huge. The regulatory blowback will be too great for them to try to be all things to all people.

When Google's head spam fighter is doing public relations AND the Washington Post covers his lobbying you know Google is nearing a breaking point.

Delays, Oh No

We were hoping to launch today, but we still do not have all the bugs worked out for all our modules/plugins to make them compatible with Drupal 7. Further, our programmer mentioned that some of the Drupal 7 documentation is missing, which makes the above task even harder. He is making great progress with the upgrade, but between the design coming in a bit late + me getting sick for a long while + all the integration issues we are going through, we are estimating our re-launch date to be either January 31 or February 1st.

I realize that is about 2 weeks away, but I would rather be conservative on the estimate rather than promise it will be 3 days and keep moving the goal post over and over again every few days. If we can launch sooner we will, but barring our internet connection dying permanently or yet another major illness, we should be launched by the 1st of February at the very latest.

Sorry for the delays, but on a positive note, this also gives us more time to make more custom graphics for our training area and do more updates within the training section. It also allows me to blog somewhat regularly over the next week, before sorta disappearing publicly to work on the membership area of the site when re-launched.

SEO Traffic is the CLEANEST and MOST VALUABLE Traffic Online

Microsoft Revenue Per Click Equals Google's

Microsoft adCenter has recently increased their revenue per click to match Google, in spite of having a small chunk of the search market share (maybe 25% between Bing and Yahoo! Search to Google's ~ 75%).

All we hear about Google's love for the scientific method, the superiority of their relevancy algorithms, them creating the best thing for advertisers, etc. has pretty much been reduced to fluff.

Google is much more aggressive at forcing searchers down a set path, has a broader ad feature set, has more (~ 3x) search marketshare, and yet Microsoft is able to compete. And they have done this against a competitor which keeps making incremental changes to build additional yield.

The Arbitrage Game

How was Microsoft able to increase their yield so much? If you go back 5 years, Yahoo! powered a greater share of search traffic then than Microsoft does now, and their ads powered both MSN and Yahoo! Search. How did Microsoft catch up with Google when Yahoo! failed to compete?

One word: arbitrage.

I have long railed on Yahoo! for screwing advertisers with fraudulent traffic sources. Arbitragers destroyed the value of Yahoo! clicks & it wasn't until 2010 that Yahoo! allowed you to opt out of the fraudulent traffic.

Even today, Yahoo! still arbitrages search traffic through their home page's trending now section.

Notice the word highlighted in yellow. In most cases Yahoo! will spike one or two commercially oriented keywords into their trending box. Having ranked for numerous such keywords, I can tell you that they can drive thousands of search clicks...which can be an expensive shot of traffic if you are paying $5 a click for them. The 'high blood pressure' clicks might be a dollar or two, but I have seen some expensive finance keywords in there as well.

A Look Under the Hood of Smaller Search Engines

The thing to understand about paid sources of traffic is that as soon as you add the element of payment there becomes a set of mixed incentives along the value chain.

I won't tell you which search engine it was, other than to say it was a publicly traded one, but about 4 years back a second tier search engine sent me a spreadsheet of [keywords * their bid prices] and wanted me to "generate traffic" for them.

1). Direct Partnership - Pull our ads to display on the site(s) for high paying keyword terms. The traffic must be unique and convert well for the advertiser (search engine traffic is the best). We can display ads in a variety of format and target the top terms on our network. Makes for a good compliment to other revenue streams.

2). Aggressive Referral Partnerships - I will compensate you and/or any other contacts in the black hat SEO realm up to 10% of all revenues generated by referred partnerships. (There are some SEO guys out there doing 1K+ per day in revenue - 10% = 100.00 additional per day for the life of those accounts). I am definitely willing to compensate nicely for referral of these contacts for Direct Partnership deals.

That second-tier setup is of course why so many affiliate blogs recommend signing up for every affiliate network in the world. But the big issue with Yahoo! was that (in spite of being a major leading search engine) they were still operating like the 3rd tier folks, with certain publishers being able to access high payouts and CPC stats. Some of the folks running the Overture feed were whoring it out to others & one well known webmaster even has the word "clickbot" in his nickname. Yahoo! made it hard for advertisers to opt out, and that is what killed their click value.

What coincided with Microsoft's increasing revenues per click? They reined in the arbitrage folks.

"Although the Yahoo-Bing integration has been ongoing for several months, during which time we were able to adapt well to the volatile environment, in mid-December we began to experience average revenue per click decreases and the strategies we customarily deploy for responding to such decreases were not as effective," said Geoffrey Rotstein, CEO. "As a result, we are maintaining substantially lower traffic levels until we have better insight into the factors contributing to this issue. The Company is currently working diligently with the teams at Yahoo! in an effort to implement any necessary adjustments to this new marketplace."

"We have always been able to adapt quickly and positively to changes in the industry as a result of our intense focus on data and analytics. We intend to apply this same discipline to respond to these issues, as we continue to receive information from Yahoo! that will assist us to adapt our system for the new advertising marketplace," added Ted Hastings, President. "We intend to make whatever changes are required within our Company to ensure a fast and sustainable response to this new market".

SEO = Still Amazing

But the purpose of this post is to point back to the value of SEO clicks. Advertisers spend over $30 billion a year buying ads from search engines & the organic search results still get the bulk of the clicks. Of course search engines are pushing to eat the organic results as well, but for anyone who has a strong organic traffic stream it is easy to under-appreciate the value until you realize how scarce and expensive pure & clean search traffic is.

Make hay while the sun shines! :)

Google Product Search Ecommerce Play

In a "oh what is the brown stuff oozing from my pants" moment for some e-commerce site owners, Google has quietly entered the space of pulling in manufacturer data directly into Google product search:

To make these pages even better, we plan on working with suppliers and manufacturers to get product data straight from the source.

We are starting this effort through a business partnership with Edgenet, a provider of product data management solutions. Manufacturers and suppliers can work directly with Edgenet's Ezeedata service to submit high-quality product data and images to Google. For more information, you can visit their website, at www.edgenet.com .

Example here.

In the past Google has also beta tested sneaking paid inclusion into their product search, plus they have already started hard-coding their ebook results into the organic search results in the US (without disclosure). At some point you can count on this huge block of product information Google is pulling in appearing directly in the organic search results, pushing many ecommerce organic listings below the fold. Boutiques.com was just the start of a trend.

What makes this trend scarier is that everyone is doing it: Google, Yahoo!, Bing, Ask, etc.

The mental model I have come to view search through is this: if a search engine can cut you out of the supply chain while having similar quality then they consider you to be at best irrelevant and at worst a spammer. Alternatively, the more your offering looks like a search engine, the more likely it is to be viewed as spam.

The big issue with this is network effects. Outside of brand corrosion & legal issues, there is basically no limit to how far search engines can push. Sure, the above focus is on ecommerce, but don't forget that Google is buying Metaweb + ITA Software. And they have the ability to create vertical databases on the fly. If you want their search traffic you have to opt into being scraped and disintermediated, like so:

You can differentiate by having product information. But Google scrapes it. You can differentiate through consumer & editorial reviews. But Google scrapes it. You can differentiate by brand, but Google sells branded keywords to competitors. No matter what you do, Google competes against you. You can opt out of being scraped, but then you get no search traffic (& the ecosystem is set up to pay someone else to scrape your content + wrap it in ads).

If you are a big player (like TripAdvisor) you can tell Google to get stuffed & re-negotiate more favorable terms. Smaller players don't have that luxury. Without that leverage, Google doesn't feel they have a spot on the commercial web.

These sorts of trends make the concepts of branding and positioning more important. If Google (and similar companies) aim to consolidate down markets into fewer players then it makes sense to be a #1 in a smaller niche market than a #5 in a bigger one.

Link Exchange Request Emails

A lot of folks have been hammering away at sending out automated link exchange emails for Wordpress driven sites.

The hallmarks of many such efforts:

  • URL with something cheesy like "partners" or "friends" or "roundtable"
  • automated emails without a name that mention a search engine ranking and (falsely) apologize for being sent multiple times
  • auto-generated content that is overly boastful & looks like it comes from one of those internet marketing review sites that has fake comment bots which say *everything* is the best thing since sliced bread / a genius in motion / a deity of your choice
  • Thumbshot previews
  • a bit of technical trickery

Nice bit of false empathy there. ;)

The technical trickery mentioned above is that if you visit the link they put in the email the linking post will appear *all over* the site that is "linking" to you. But if you open up a new browser from a different IP address and try to visit the parent category page before visiting the individual post page you will see that the post is only visible to a person who knows exactly where it is. So the people are not only mass automated email spammers, but they lie at hello as well (by deceiving folks into thinking there is an on-the-level exchange of some sort, while screwing them over with a page that is invisible to everyone but them).
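One way to catch this trick without juggling IP addresses is to check whether the "linking" post is actually reachable from its own category or archive page. As a minimal sketch (the URLs are hypothetical, and a real check would fetch the archive page fresh with urllib from a clean session), a post that only exists for whoever holds the direct link will show up as an orphan:

```python
# Minimal sketch: given the HTML of a category/archive page and the URL
# of the post that supposedly links to you, check whether that post is
# actually linked from the archive. The hidden "link exchange" posts
# described above won't appear anywhere except at their direct URL.
import re

def linked_urls(html):
    """Extract all href values from anchor tags (crude, regex-based)."""
    return set(re.findall(r'<a[^>]*href=["\']([^"\']+)["\']', html,
                          re.IGNORECASE))

def post_is_orphaned(category_html, post_url):
    """True if the post is not linked anywhere on its own category page."""
    return post_url not in linked_urls(category_html)
```

If the post doesn't show up on the archive page that supposedly contains it, the "exchange" is invisible to everyone but you, and the email can go straight to the trash.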

The stuff is so out of hand that even new age doomer movies about 2012 are using it & are sending the emails to sites about SEO, offering sources of 'enlightenment.' :D

Clearly they are enlightened. ;)

Some tips & strategies:

  • The easiest way around such issues is to delete unsolicited commercial messages, especially if they are not personalized. But if you want to give someone the benefit of the doubt, the best way to do so is to check the source code of the page inside Google's cache. If the page isn't cached by Google then Google probably doesn't care much about it. (Yes, there are exceptions to that, but people who send unsolicited emails probably do not deserve too much benefit of the doubt.)
  • If you are out sending emails asking for links then it goes without saying that you don't want to look like the above folks (though I have received *far* worse emails from some SEO companies & PR folks). Automated tools can be dangerous things when in the hands of tools!

And We're Back!

We got our members' area fully paused on the 25th of December & on the 26th I got probably the worst flu of my life, losing 15 pounds in 3 days. As a bonus, I got a respiratory tract infection that still has me coughing 2 weeks later!

I am starting to feel a bit over the hump (and like I could be normal within a couple days), but I recently let a number of folks down because I am so used to working 16 hours a day that it's hard for me to keep up (even with SEO Book paused) when working only 8 or so hours a day. Worse yet, one day I slept over 21 hours! If I haven't emailed you it's likely not because I was trying to ignore you, but rather because I am still about a hundred emails in the hole from the period of getting sick.

That said, we are starting to make some progress on the site. We upgraded Drupal over the last couple days (from 5 to 6, but still need to upgrade from 6 to 7). I was also testing the new HTML site design & we have a version of it live here. (One page down, and only a few thousand to go. hehehe.)

Our old site design was a *major* upgrade from the hand rolled ugliness I made way back when. The big logo + strong colors really made it stand out & made the site look and feel more memorable. But after we created a membership site, built more tools, created the online training area, started offering more videos, built the community forums, and created a monthly newsletter it sorta seemed like we had outgrown the design.

The thing I dislike about the old site design is that (to me) it looks sorta like a blog that kept on bolting on more pieces. Largely that was so because the site developed quite incrementally over the years. We never really started with a master plan, but just kept building more stuff we liked and bolted it on. Over time it added up & got a bit unwieldy & the current design doesn't really hint at the breadth or depth of the offering. Whereas the new design feels more like a complete thought that better expresses what the site offers.

In terms of the infrastructural upgrades, we are not where we need to be, but we are finally making progress, and are trying to catch up quickly. If I owe you an email expect one before the weekend is out! (Unless I feel worse after another nap here soon).

Happy new year everyone, and more blog posts coming in the days to come.

Is Social Rank Dying Already?

There has been a lot of talk in the SEO community about social rank.

And some talk that it might die soon.

What Is "Social Rank"?

As far as SEO is concerned, social rank is the idea that Google, and other search engines, use social networking indicators in their ranking algorithms. If you get mentioned and linked to often from social media profiles, this helps your site rank in the search engines.

Check out Danny Sullivan's Q&A session on this topic with Google and Bing representatives.

Do you calculate whether a link should carry more weight depending on the person who tweets it?

Yes we do use this as a signal, especially in the “Top links” section [of Google Realtime Search]. Author authority is independent of PageRank, but it is currently only used in limited situations in ordinary web search

Google intimates it's tied in with PageRank, which Danny also discusses.

To some degree, “humans” on the web have pages that already represent their authority.

For example, my Twitter page has a Google PageRank score of 7 out of 10, which is an above average degree of authority in Google’s link counting world. Things I link to from that page — via my tweets — potentially get more credit than things someone whose Twitter page has a lower PageRank score. (NOTE: PageRank scores for Twitter pages are much different if you’re logged in and may show higher scores. This seems to be a result of the new Twitter interface that has been introduced. I’ll be checking with Google and Twitter more about this, but I’d trust the “logged out” scores more).

Google is a vote counting engine, so it isn't surprising they count votes from social network sites. It should also come as no surprise Google uses Twitter to help determine interest in news events, as the Twitter platform lends itself to news. This will then flow through into their news ranking. There are also the indirect benefits i.e. the attention generates articles and commentary, which then link back to your site.

All links are valuable, because attention - human, spider, or both - travels along them. Google will always be interested in who is paying the most attention to what. If people are using social networks to do that, then that is where Google needs to be.
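The "vote counting" idea can be made concrete with a toy version of PageRank-style power iteration over a tiny, made-up link graph (the page names here are hypothetical, and this is an illustration of the general principle, not Google's actual algorithm):

```python
# Toy illustration of link "vote counting": repeated power-iteration steps
# over a tiny hypothetical link graph. Each link passes along a share of
# the linking page's score, so well-linked-to pages accumulate authority.
links = {
    "blog":    ["news"],
    "twitter": ["blog", "news"],
    "news":    ["blog"],
}
damping = 0.85
rank = {page: 1.0 / len(links) for page in links}

for _ in range(50):
    new_rank = {page: (1 - damping) / len(links) for page in links}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share  # each link is a vote for its target
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Note that "twitter" ends up with the lowest score here because nothing links to it; whether its outbound votes count, and how much, is exactly the question the Q&A above addresses.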

Of course, like search, social media is open to abuse.

How To Do Blackhat Social Rank

Black or grey, here are a few of the more aggressive tactics in use:

  • Fake Profiles - auto-generate an entire network of friends
  • Duplicate/Fake Content - plenty of auto-gen tools around that will make posts and requests on your behalf
  • Pay Important People To Tweet Your Link As Editorial - or put your link on their profile page
  • Buy Social Media Accounts

You might have spotted a few more.

The social services will, of course, combat any threat they deem detrimental to their business. Just like in search, the game will be never-ending, as the blackhats find holes in the system, and the engineers plug them. And just like times past in SEO, the ethical debate rears its head.

Is it morally "right" or "wrong" to use technique X, Y and Z?

All a bit silly, really. People will use a technique regardless of other people's ethical dilemmas, so long as it works. It's up to the social networks, and Google, to stop what they might consider abusive practices from working, or paying off.

And they will, although they've probably got their work cut out for them. It's one thing to look at a page about, say, fitness and determine the links running along the bottom for "ring tones", "bad credit loans" and "viagra" are likely dodgy, but another thing to look at profile activity and determine whether there is a human behind it.

Social media is evolving quickly, and it will take time to patch issues, both technically and culturally. So I'm sure the blackhats will be having fun for some time yet.

Personalized Social Recommendations

Google sometimes may list results from your "social circle" at the bottom of the organic search results. The good thing about these results is that most of the recommendations are fairly transparent & benign.

Bing is displaying Facebook "like" data & Blekko is pushing harder at integrating Facebook likes into their algorithm as well.

A "like" might have multiple meanings depending on who is doing it. Do the votes for this page "like" Google, PPCBlog, PPCBlog's explanation of Google, search in general, algorithms, SEO, infographics, technology, marketing, or ...?

In search there is a concept of stop-words, which are words that are not counted much because they are so common they don't really tell you much about a piece of content. Some keywords (say mesothelioma) have a higher discrimination value than others (say the). A "like" doesn't have a great discrimination value, largely because you don't know why someone liked something. The nuanced subtleties are lost without context. Something might be liked because it is clever, in-depth, correct, humorous, offensive, and incorrect - all at the same time! It all comes down to interpretation & perspective.
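The discrimination-value idea above is commonly captured with inverse document frequency (IDF): rare terms that appear in few documents score high, ubiquitous terms score near zero. A minimal sketch, over a tiny corpus made up for illustration:

```python
# Sketch of "discrimination value" via inverse document frequency (IDF):
# a rare term like "mesothelioma" scores high, a stop-word like "the"
# scores near zero. The corpus here is made up for illustration.
import math

docs = [
    "the cat sat on the mat",
    "the dog chased the cat",
    "mesothelioma is a rare cancer",
    "the patient asked about mesothelioma treatment",
]

def idf(term: str) -> float:
    """log(N / document frequency); 0.0 for terms that never appear."""
    containing = sum(1 for d in docs if term in d.split())
    return math.log(len(docs) / containing) if containing else 0.0

print(f"idf('the')          = {idf('the'):.2f}")
print(f"idf('mesothelioma') = {idf('mesothelioma'):.2f}")
```

A "like" behaves like a stop-word in this framing: it is everywhere, attached to everything, and so carries little information on its own.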

Some people will offer tips on "scaling your social footprint" and such, but the trade off is that on networks where relationships are reciprocal (like on Facebook) you can't add a friend without having that friend added to your account. Brands, on the other hand, can offer an endless array of discounts and promotions. If a search engine puts too much weight on likes then companies will simply run giveaways, contests, and pricing specials to collect votes.

"Likes" are so low effort they will be easily manipulated, even amongst real account holders. Over time these votes will be every bit as polluted as the link graph (or maybe moreso) because there are so many ways to influence people individually (click the below like button for $2 off your order, etc). Such offers might fall outside of the terms of service of some networks, but it is worth noting that when Google was promoting their reviews service they violated their own TOS.

In addition to likes being easy to manipulate, some flavors of social are heavily spammed because many people use the tools simply for reciprocal promotion. I likely have over 1,000 friends on Facebook & yet I have no idea who 90%+ of the people are. Am I recommending the stuff that some of those people recommend? An algorithm that assumes I am is likely leading people astray. And you might be friends with someone while knowing that their business life is quite shady when compared against their personal life (or the other way around). Are you endorsing everything a person does?

Further, anyone can invest in creating one piece of great content that scores tons of "likes" while operating in an exploitative manner elsewhere (and/or later). It is just like the wave of bulk unsolicited emails I get promoting 'non-profit' directories which one month later require 3 or 4 page scrolls to get past all the lead generation forms, while still claiming to be non-profit. :D

And social networks decay over time:

  • Friendster lost out due to bad management, and MySpace the same.
  • GeoCities closed last year. Delicious has had an upswing in spam, and Yahoo! has it scheduled for sunset soon.
  • And even outside of those sorts of broad platform shifts, people change over time. Years ago I might have recommended working with someone like Patrick Gavin or Andy Hagans, but I wouldn't dare do so today. Likewise a particular tip or product might be exceptionally profitable for a period of time & then eventually decay into a near-certain money loser. Opportunities do not last forever. Marketers must change with the markets. Other products might have undesirable side effects that only later come to the surface. Add in media based on more precise measurements & pageview chasing, and the conflicts between recommendations and media coverage will scare some folks into not participating. Associating recommendations with individuals will cause blowback as some of the seeds turn sour & people blame the person who recommended the person/product/service that screwed them over. The link graph allows those with undesirable reputations to slowly fade into obscurity, whereas old likes remain in place & can cause a social conflict years down the road.

Using Social Media For SEO Purposes

A link is a marker of attention.

Google will always want to count markers of attention. Blackhat trickery aside, in order to make social media work for you, and create side effects in terms of ranking, you should both build a presence in social media and craft messages that are likely to be spread by social media.

It's much like PR. Public Relations, as opposed to PageRank.

Start by defining your audience. Who do you know that talks to that audience? Try to get to know as many people as possible in your audience, especially the movers and shakers who already talk with them.

Get movers and shakers to spread your message. That may involve payment of some kind. Reciprocation, favor, cash, drugs, booze, hookers. Whatever works.

Joke.

Or - and this is probably the most effective path - craft a message so interesting, they'll find it hard not to spread.

Think about how you spin your message. Think in terms of benefit. How will the audience benefit from knowing this information? What is in it for them? What are they curious about? Feed their curiosity. Sometimes, it's not the message, but the way it is stated.

Plan ahead. Can you spin your message around a public event, like a holiday? Or a current event? Or a popular personality?

Get out and meet people face-to-face. People are much more likely to be receptive to your ideas if they really do know you.

But there is a danger in overthinking this stuff. A few well-placed links to a site can still get you top ten in Google, even if you have no social media presence at all. Social media is just another string to the bow.

Oh, The Opportunity

I'm doing keyword research.

The opportunities I see before me still amaze me.

Keyword lists, showing the frequency of searches, are market research nirvana. It's like being a god, delving deep into the minds of mortals.

And most people outside SEO. Still. Don't. Get. It.

Ever explained keywords to people, and received blank looks in return?

We can trawl through a keyword research tool and list thousands of niche business opportunities. Demand is on display. It is being broadcast to us.

Once we discover demand, we measure the competition, quantify the opportunity, build a site, and dive into the demand streams that have existed long before the internet was invented.

Demand, meet supply.

Just look at all this demand:

  • "japanese translation" 450,000 Monthly searches
  • "hospital jobs" 823,000 Monthly searches
  • "forklift certification" 27,100 Monthly searches
  • "address labels" 301,000 Monthly searches
  • "digital signage" 201,000 Monthly searches
  • "student credit cards" 135,000 Monthly searches
  • "coin collecting" 60,500 Monthly searches

And as we know, that's just ONE keyword per niche. The real gems can be found deeper in the long tail of associations, permutations, and similarities.

The search channel still amazes me.

It's so powerful, and so under-rated.

Mad Men

Have you seen Mad Men?

If you haven't, it's a great show about an advertising agency, set in the 1960's. The ad executives were the rockstars of the time, paid well to know what was on the minds of consumers.

What would they have made of a keyword research tool, I wonder?

Or our digital zeitgeist?

And unlike fifty years ago, there are fewer barriers to entry to many traditional markets. In the past, in order to compete nationwide, or internationally, a huge multinational machine of people and capital was needed. Now, with a credit card, we just tap into a vast network in an instant.

Fifty years ago, publishing a book was difficult and expensive. A large publishing company could get shelf space at a major retail outlet, but you couldn't. Probably still can't. You needed to print many copies, a risk and cost out of reach of most people. A publisher could reach out to reviewers, and work the publicity machines.

Now we can compete.

We can get far more reach, in much less time, for a fraction of the cost.

So many niches, so little time.

So, what are you going to do today?

Keyword Research Resources
