Currently the theme shows the old SEO Book logo in it (as logo.gif in the theme's files). You can easily change that out with a custom logo from the likes of 99designs, CrowdSPRING, or Logo Design Works.
A couple notes of caution with that:
The current logo is 720 pixels wide by 154 pixels tall. If you change the height of the logo then you would want to adjust the height of the space above the top navigation to match. Currently the header div has a height of 173px, so it is set to logo height + 19 pixels (see the CSS sketch after these notes).
If you order a logo you may want to color match it to the existing site design colors. For your convenience, there is a color swatch to the right & you can grab HTML colors using an extension like ColorZilla. The HTML color code for the green is roughly #9bdc1d and the blue is roughly #5bacd8 (though both have a bit of gradient to them).
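On the first note, the height change is a one-line CSS edit. Here is a minimal sketch, assuming the theme's stylesheet names the header div "header" (check the theme's files for the actual selector):

```css
/* Header height = logo height + 19px of space above the top navigation.
   The stock logo is 154px tall, so the header is 173px. */
#header {
    height: 173px;
}

/* Example: a 120px tall replacement logo would need
   height: 139px;  (120 + 19) */
```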
Editing the Site's Colors
Given the reliance on white in the design, it is fairly easy to change the design's colors simply by changing the color of a few images. You can replace the green and blue with a wide variety of colors and still have it look good. I believe we ran red and gray on PPC Blog for a while and it looked pretty good. This tool works well for making gradient images. Then you can use something like SnagIt to size the images similar to the old design's images. Of course Photoshop experts should have no problems editing the colors either. ;)
Editing the Site's Width
The white content area on a white page background makes it easy to change the theme's width in the CSS if you are reasonably knowledgeable about CSS. The divs are easy to understand: the container div wraps around the content area, each post div sits within the content div & the sidebar is named sidebar (see the sketch below). :)
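As a rough sketch of what those rules might look like (the widths here are placeholders rather than the theme's actual values, so pull the real numbers from the theme's stylesheet before editing):

```css
#container {          /* wraps around the whole content area */
    width: 960px;
    margin: 0 auto;   /* keeps the layout centered */
}
#content {            /* each post div lives inside this */
    width: 630px;
    float: left;
}
#sidebar {            /* the sidebar column */
    width: 300px;
    float: right;
}
/* To widen the theme, increase #container and then grow
   #content and/or #sidebar so the columns still fit inside it. */
```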
General Disclaimers & Whatnot
First and foremost, since the theme is free it does not come with any sort of support. If you have doubts or concerns with using it then we suggest testing it out on a secondary site & customizing it as needed before putting it on your primary website.
There are a wide variety of other themes & Wordpress plugins that offer more granular SEO control. When using a theme like this one on our sites we would typically use an SEO title tag plugin and a related posts plugin to help with SEO. If we were aiming for a fairly flat site structure then we would show excerpts on archive pages and use a posts-per-page plugin to put something like 100 posts on each category page. But there are many other themes and plugins that do those sorts of things.
The template has a credit link in it. I would prefer you leave that there so others can find out how to get the theme, but if you do need to remove it all I ask is that you instead link to a charity you believe in & donate whatever you can to that charity. :)
Why Did We Switch Site Designs Here?
The above design was live on our site for nearly 5 years. And I would have kept rolling with it if our site didn't become so complex. One of the leading complaints about our old site was how navigation was inconsistent in different parts of the site.
The site started off as a blog which happened to sell an ebook, but over time it grew to have dozens of tools, 100+ training modules, thousands of blog posts, etc. Given all the various user roles and login permissions it was important for us to tighten up our navigation and make it more consistent (with the use of sitewide drop downs and such). I plan on using our old design on a few of our other websites that are less complex and more bloggy. And I hope you like it too! :)
A few months back I was running Advanced Web Ranking and noticed that Google and Bing were really starting to come in line on some keywords.
Major Differences
Of course there are still differences between Bing and Google.
Google has far more usage data built up over the years & a huge market share advantage over Bing in literally every global market. Microsoft's poor branding in search meant they had roughly 0 leverage in the marketplace until they launched the Bing brand. That longer experience in search is likely what gives Google the confidence to have a much deeper crawl.
That head start also means that Google has been working on understanding word meanings and adjusting their vocabulary far longer, which also gives them the confidence to be able to use word relationships more aggressively (when Bing came to market part of their ad campaign was built on teasing Google for this). The last big difference from an interface perspective would be that Google forces searchers down certain paths with their Google Instant search suggestions.
Who Copied Who?
But the similarities between the search engines are far greater than their differences.
At the core of Google's search relevancy algorithm is PageRank and link analysis. Bing places a lot of weight on those as well.
Google also factors in the domain name into their relevancy algorithms. So does Bing.
Google has long had universal search & Bing copied it.
Google has tried to innovate by localizing search results. Bing localizes results as well.
Bing moved the right rail ads closer to the organic search results. Google copied them.
Bing put a fourth ad above the organic search results. Google began listing vertical CPA ad units for mortgages and credit cards above the organic search results - a fourth ad unit.
Bing has a homepage background image. Google copied them by allowing you to upload a personalized homepage background image.
Bing offers left rail navigation to filter the search results. Google copied them by offering the same.
Bing innovated in travel search. Google is trying to buy the underlying data provider ITA Software.
Bing included Freebase content in their search results. Google bought Metaweb, which owns Freebase.
Bing offered infinite scroll and a unique image search experience that highlights the images. Google copied it.
Oh, The Outrage
Off the start Bing was playing catch up, but almost everything they have tried that truly differentiated their experience ended up copied by Google. Recently Google conducted a black PR campaign to smear Bing for using usage data across multiple search engines to improve their relevancy. The money quote would be:
Those results from Google are then more likely to show up on Bing. Put another way, some Bing results increasingly look like an incomplete, stale version of Google results—a cheap imitation.
Perhaps the reason Google finds this so annoying is that it allows Microsoft to refine their "crawl" & relevancy process on tail keywords, which are the hardest ones to get right (because as engines get deeper into search they have fewer signals to work with and a lot more spam). It allows Microsoft to conduct tests which compare their own internal algorithms against Google's top listings on the fly & learn from them. It takes away some of Google's economies of scale advantages.
Is Google Eating Its Own Home Cooking (And Throwing UP?)
Here is what I don't get about Google's complaints though. Google had no problem borrowing a half-dozen innovations from Bing. But this is how Google describes Bing's "nefarious" activities:
“It’s cheating to me because we work incredibly hard and have done so for years but they just get there based on our hard work,” said Singhal. “I don’t know how else to call it but plain and simple cheating. Another analogy is that it’s like running a marathon and carrying someone else on your back, who jumps off just before the finish line.”
When a content site compiles reviews, creates editorial features to highlight the best reviews (and best reviewers), and works to create algorithms to filter out junk and spam then Google is fine with Google eating all that work for free. Google then jumps off their backs just before the finish line and throws the repurposed reviews in front of Google searchers.
But if Bing looks at the data generated by searchers who are performing the searches on Google and uses it as 1 of 1,000 different relevancy signals then Google is outraged.
Clickstream Data & You
This public blanket admission of Microsoft using clickstream data for relevancy purposes is helpful. But outside of the PR smear campaign from Google there wasn't much new to learn here, as this has been a bit of an open secret amongst those in the know in the search space for well over a year now.
But the idea of using existing traffic stream data as a signal increases the value of having a strong diversified traffic flow which leverages:
Recently we tested adding ads to one of our websites that had a fairly uninspired design. Even though the ads make the site feel a bit less credible, the new design fit so much better than the old one that the site now gets 26% more pageviews per visit. Anytime you can put something on your website which increases monetization & sends visitors away, yet still get more user engagement, you are making a positive change!
If You Can't Beat Em, Filter
I was being a bit of a joker when I created this, but the point remains: as larger search engines force feed junk (content mills and vertical search results) down end users' throats, one of the best ways for upstart search engines to compete is to filter that stuff out. Both DuckDuckGo and Blekko have done just that.
The Future of (In)Organic Content Farming
Demand Media is currently worth $1.74 billion, but it remains to be seen what happens to the efficacy of the content farm business model if & when Google makes its promised changes. And given that Yahoo! is Bing's biggest source of search distribution, it would be hard for Microsoft to crack down too hard without potentially harming that relationship (since Yahoo! owns Associated Content), but Google & Microsoft are the only games in town for search ads. The DOJ already blocked a Yahoo!-Google deal. Trying to win marketshare from Google, Microsoft is burning over $2 billion a year.
Search can be used as a wedge in a variety of ways. Most are perhaps poorly understood by the media and market regulators.
Woot! Check Out Our Bundling Discounts
When Google Checkout rolled out, it was free. Not only was it free, but it came with a badge that appeared near AdWords ads to make them stand out. That boosted ad clickthrough rates, which fed into ad quality score & acted as a discount for advertisers who used Google Checkout. If you did not use Google's bundled services you were stuck paying above market rates to compete with those who accepted Google's bundling discounts.
Companies spend billions of Dollars every year building their trademarked brands. But if they don't pay Google for existing brand equity then Google sells access to that stream of branded traffic to competitors, even though internal Google studies have shown it causes confusion in the marketplace.
The Right to Copy
Copyright protects the value of content. To increase the cost of maintaining that value, DoubleClick and AdSense fund a lot of copy and paste publishing, even of the automated variety. Sure you can hide your content behind a paywall, but if Google is paying people to steal it and wrap it in ads how do you have legal recourse if those people live in a country which doesn't respect copyright?
You can see how LOOSE Google's AdSense standards are when it comes to things like copyright and trademarks by searching for something like "bulk PageRank checker" and seeing how many sites that violate Google's TOS in multiple ways are built on cybersquatted domain names containing the word "PageRank". There are also sites dedicated to turning Youtube videos into MP3s which are monetized via AdSense.
Universal Youtube Search
Google bought Youtube and then swiftly rolled out universal search, which dramatically increased the exposure of Youtube. Only recent heat & regulatory review has caused Google to add more prominent links to competing services, nearly a half-decade later.
Philosophically Google believes in (and delivers regular sermons about) an open web where companies should compete on the merits of their products. And yet when Google enters a new vertical they *require* you to let them use your content against you. If you want to opt out of competing against yourself Google says that is fine, but the only way they will allow you to opt out is to block them from indexing your content & kill your search traffic.
“Google has also advised that if we want to stop content from appearing on Google Places we would have to reduce/stop Google’s ability to scan the TripAdvisor site,” said Kaufer. “Needless to say, this would have a significant impact on TripAdvisor’s ranking on natural search queries through Google and, as such, we are not blocking Google from scanning our site.”
From a public relations standpoint & a legal perspective I don't think it is a good idea for Google to deliver all-or-nothing ultimatums. Ultimately that could cause people in positions of power to view their acts as a collection which have to be justified on the whole, rather than on an individual basis.
Lucky for publishers, technology does allow them to skirt Google's suggestions. If I ran an industry-leading review site and wanted to opt out of Google's all-or-nothing scrape job scam, my approach would be to selectively post certain types of content. Some of it would be behind a registration wall, some of it would be publicly accessible in iframes, and maybe just a sliver of it is fully accessible to Google. That way Google indexes your site (and you still rank for the core industry keywords), but they can't scrape the parts you don't want them to. Of course that means losing out on some longtail search traffic (as the hidden content is invisible to search engines), but it is better than the alternatives of killing all search traffic or giving away the farm.
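Here is a bare-bones sketch of the iframe piece of that strategy. The paths are made up for illustration; the idea is simply that the host page stays indexable while the review body is served from a URL that robots.txt blocks:

```html
<!-- This page stays crawlable & can rank for the core keywords. -->
<h1>Acme Widget Reviews</h1>
<p>An editorial summary the engines are welcome to index.</p>

<!-- The review body is pulled from a path blocked in robots.txt
     (e.g. "Disallow: /private/"), so engines index the page
     without being able to scrape the reviews themselves. -->
<iframe src="/private/reviews/acme-widget.html"
        width="600" height="800" title="User reviews"></iframe>
```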
There are pushes to minimize the need for passwords, but after the Gawker leak fiasco who wants to have a common shared single point of failure for passwords? Sure managing passwords sucks. But friction is a tool that helps cleanse demand & make it more pure. It is why paid communities have a higher signal to noise ratio than free for all sites. Any barriers will annoy people, but those same barriers will also prevent some people from wasting your time. If they are not willing to jump through any hoops they were never going to pull out the credit card.
We have some exciting news to share about eHow.com. Beginning in February 2011, Facebook Login will be the exclusive means for login to the site. You’ll be able to use your new or existing Facebook username and password to connect with the eHow community. We’ll also be removing eHow member profiles to help you streamline friend lists and eliminate the work of managing multiple online accounts. Additionally, we’ll be closing forums on the site. We want to hear from you directly, so moving forward, we encourage you to communicate us through the “Contact Us” section of eHow.com.
We’re excited to introduce these updates! Get started and click on the Facebook Connect button in the upper right corner of the home page to login. We want to keep in touch, so also remember to Fan Us.
My guess is they might be trying to diversify their traffic stream away from search & gain broader general awareness to further legitimize their site. But the big risk to them is that Facebook is an ad network, so now competing sites will be able to market to their base of freelance employees. What's worse is that there was a rumor that Facebook might launch a content mill strategy of its own. There are plenty of ways for that third party login to backfire.
My belief is that you shouldn't force logins until you have something to offer, but that when you do you should manage the relationship directly. Does that mean you have to reply to every message? No. But it does mean that if there are ways to enhance value through how you interact with your established relationships, you are not stuck under the TOS of a 3rd party website which may compete against you at some point. Sure that means some upgrades will be painful, but it means you get to choose when you do upgrades rather than letting someone else choose when your website breaks.
I view third party comment systems the same way. If the person providing the service changes business model it does not mean you are stuck paying whatever rate they want or starting over. This is one of the big advantages of owning your own domain name and using open source content management systems. You don't have to worry about a Ning pivot or a Geocities shut down. Sure this approach means you have to deal with security, but then leaving that sort of stuff to Facebook might not be great anyhow.
Have you ever noticed that a lot of blogs want to be seen as being the same as the media? And media companies are responding by hiring bloggers. But why is emulating the media so exciting? After all, the same media is so big, bloated & redundant that it is buried in debt. How is it possible that a humor blog network built on open source software would ever need to raise $30 million?
The problem is that it is hard to stay different and operate at scale. You eventually become what you claimed to hate. If you are good at public relations you can claim to be different to build exposure, but ultimately once a site becomes large there is no incentive for creating signal. Rather the game becomes generating as much exposure as you can to sell to brand advertisers. Content can be dressed up to stand out, but at the core it is basically the same.
Here are 2 posts from TechCrunch about Yahoo!. Both published on the same day. Both saying the same thing. There isn't much more difference between them than what a good Markov generator could produce.
The million channel world brings addressability. There is no mass any more. You can't reach everyone. Mad Men is a hit and yet it has only been seen by 2% of the people in the USA.
The mcw bring silos, angry tribes and insularity. Fox News makes a fortune by pitting people against one another. Talkingpointsmemo is custom tailored for people who are sure that the other side is wrong. You can spend your entire day consuming media and never encounter a thought you don't agree with, don't like or don't want to see.
The polarization from such media & the blow-by-blow content style leads people to worry about inconsequential crap like their political ideology, where they can write based on a checklist. It makes them notice trivial differences while remaining blind to important things, like the systemic fraud that is supported and encouraged by both leading political parties. Arguing inconsequential details becomes increasingly addictive because the blame has to be sent to "them" rather than where it squarely belongs.
Media as a Conduit For Scams & Misinformation
Do you find it perplexing that the same media (which claims to be legitimate) has no problems running ads for total scams? Isn't it bizarre that the same media that claims to protect citizens from the evils of the marketplace tries to blend the ads for such scams in with their navigation to sell their readers down the river? Is this what you would expect of Newsweek?
Is that anything to aspire to?
Geordie joked about how annoying he finds the gallery sections on media sites, with videos and pictures that seem like they are fresh off the Jerry Springer show. "WATCH: Teen beats ferret to death and eats it!" In the short run online advertising can grow quickly by tricking people, but the end result is distrust & people become less receptive.
The problem is lack of sufficiently broad exposure to the facts here in the US. We don’t have a fourth estate, a national media in the role of providing checks and balances to government and business excesses. Instead we have media that sells product. In the late 1990s it sold tech stocks, in the early 2000s the Iraq War, from 2002 until 2007 it sold houses, and in the future it will sell whatever measures are a “necessary” price for social stability, national security, or whatever phrases are used, because things are going to get dicey once this 40 year old Rube Goldberg monetary and trade contraption comes apart when it’s hit with a Peak Cheap Oil sledgehammer in the middle of the Jon Stewart show. I mean, how healthy is the American fourth estate when all of the serious journalism here is done by comedians?
Isn't it weird that the mark of a successful blog is that it starts to look and feel and act like bloated media organizations? Is social media any better? Or is social media mostly a bunch of lemmings following each other off the side of the mountain?
Just because there is lots of information doesn't make all of it valuable. In fact, some of it has negative value. Who are the people who login to Facebook so they can vote on Facebook about how Facebook is a waste of time & they don't use it?
The narrative of "change" gives new companies a niche or angle to get press coverage from. People ask if it is the next Google, or more! The service then goes under-monetized for a couple years to feed growth and scale. The whole time the site is not monetized, stories are seeded in the media about how media format x is powerful for brands, even if the lead examples are nonsense.
How is it possible that we are told that data has value but privacy allegedly does not? Most such stores of data are built through an invasion of privacy.
Privacy has value. What happens when your account gets hacked & you start recommending some uncomfortable stuff? What happens when a stalker catches you on the way home based on one of your messages? How many such experiences will be viewed as a series of isolated events before people figure it out? Once these ads lose their novelty will there still be real businesses behind them? Or is narcissistic advertising the wave of the future even as people realize they are being spied on?
These companies blend their ad units into editorial so effectively that most people can't tell them apart. If that sounds familiar, that is because it is. The key to making it work is perceived relevancy. That is easy to do when you have a large ad auction and users type their intent into a search box, but is much harder to do when people are browsing pictures of cats.
Anyone who thinks that social is a clean search signal is forgetting that people vote most for stuff that is humorous & easy to share. And people share things they saw others share because they felt they had to. The echo chamber effect doesn't encourage critical thought. It is mostly a bunch of +1.
The following video is sad & funny. It has been viewed widely, but it does nothing to fix a broken education system.
And there are entire categories that will never be featured honestly on social media. Sure the idea of turning Kinect into a virtual sex video game will get lots of exposure, but is anyone ever going to honestly Tweet about their favorite solutions to their genital warts problem? Is there enough context to matter? Worse yet, all these networks are turning their relevancy signals into ad units, so if a search engine was to count them heavily all the search engine would be doing is subsidizing the third party ad network. And the scammers who are pushing reverse billing fraud products on the news sites will do as much damage as they can get away with on the social sites.
If there's a broad call at the company to integrate social networking features, Singhal hasn't quite heard it. He seems skeptical about whether social data can make search results significantly more relevant. If he's searching for a new kind of dishwasher, he argues, his friend's recommendations are interesting, but the cumulative opinion of experts manifested in search results is much more valuable. He notes that Google already integrates content from Twitter and says social networking data is easily manipulated. Can social context make search more relevant? "Maybe, maybe not. Social is just one signal. It's a tiny signal," he says.
If Google can't find much signal there then good luck to the folks trying to use Tweets to trade the (increasingly corrupted) stock market!
Why People Like Social Media
I think social media gives people the illusion of success through proximity. Thus people are impressed to rub shoulders with successful people, even if they are douchebags and liars to boot. There is the unsaid message that “you too can be a billionaire” that a lot of entrepreneurs and start up folks want to believe in that drives the growth of Quora. But the reality is that celebrities are whoring out their status for a quick payday, even if the advertiser value in such relationships is marginal.
Someone wants to eat my dog. Other than breathing, writing English(ish), and having a Twitter account, I probably do not have anything in common with that person. And yet there is no tool to sort that out.
That is the big problem with most social media tools: the monolithic nature.
I am not sure where I read this quote, but I think it went something like this: "we are most similar where we are most vulgar and most unique in the ways we are sophisticated." That is precisely why a lot of the broader networks will repeatedly fail in efforts to build strong niches outside their core. It is why there is so much value in being a fast follower.
Messaging and imagery allow piss poor product to be branded as food. The abuse of language is so thorough that even the words "shared planet" need a TM next to them.
The inflation and bubbles in the developing world are not yet destabilizing because the dollar is weak and the hot money supports their currency values. Historically, inflation becomes a crisis in the developing world when the dollar turns around and appreciates. However, it is possible for inflation to create a crisis without a currency crisis. It erodes the purchasing power of the people at the bottom. Social unrest can lead to political crisis.
The currency pegs mean that most of the inflationary pressure you're creating doesn't hit your nation; it's exported to others. That's exactly how you like it, because you can claim "inflation expectations are well-anchored." Perhaps they are in your nation, but in other places they're extremely unanchored and are not only expectations, they're realized facts as the basic cost of life spirals up out of control.
This, in turn, provokes food riots in these less-well-endowed nations that you managed to dupe into participating in your outrageous scheme. After all, there's only one thing worse than a hungry man. That's a man who used to be well-fed and now he's both hungry and ****ed, along with being unemployed.
When his belly growls loudly enough, he riots. And so do his similarly-situated neighbors.
Rather, such mobs are caused by the media failing to do its job of enforcing a sense of outrage over the injustices inflicted upon societies the world over by banking criminals. If there were any sense of justice the large banks that caused this mess would have been bankrupted. But instead we base economic strategy on the theoretical economy rather than its impact on the real world.
People are just an externality for bankers to exploit.
One of my favorite approaches to save time online is to use multiple web browsers for different purposes. It lets you combine speed + reliability with quick access to tons of valuable tools & data.
Firefox
I set up Firefox fully loaded with bookmarks and extensions (all our free & premium ones, User Agent Switcher, Web Developer, Greasemonkey, Roboform, ColorZilla), but realize that as a result it will often be a bit slower & crash more frequently. That is ok because I don't use it as my primary web browser, but as my primary SEO research browser with all our SEO tools installed. Extensions like Web Developer and Greasemonkey make it an obvious choice for a fully loaded research browser.
Google Chrome
I run Google Chrome bare to the bone, with 0 extensions installed. One time I tried to install Roboform on it, but that slowed it down as well, so I got rid of it and keep the browser bare. The benefit of a minimalistic browser is that it is quite stable & fast. In this way I can open up 20 tabs from our forums at any given time without worrying about a crash. What is better is how good Chrome is at letting you restore tabs if things do crash. Chrome is my forums + email browser & my general purpose browser for anything I don't have to login to access & a few of the sites I am typically logged into (like this 1).
And while Firefox is my normal research & testing browser, Chrome also has a nice feature where you can highlight & right click to inspect an element. It tells you exactly which CSS file the property is in, and you can double click on it and adjust the size/color/etc. within Chrome to see what changes.
Internet Explorer
I also run IE9. Its purpose is to give me a clean & pure localized view of search. It is set up to delete all cookies when it closes. I use it in conjunction with a VPN to compare how search results look in various parts of the world. It is another type of research browser, but it is not always-on the way Firefox and Chrome are. Such a browser can also be handy for putting your computer in London for exclusive BBC content, or getting around other such geographic content-access limitations. I also have Roboform enabled in IE to let me log into client accounts easily when I want to keep those separate from my personal accounts.
Opera
I also have Opera installed & I use it for testing user-permission based issues. Some pages on our site operate in a way that is far more sophisticated than they might look at a glance. Some pages may look different based on whether you are not registered, logged in with a basic account, logged in with a premium account, or logged in as an administrator. When testing & tweaking that sort of stuff I can end up with 4 different browsers open. Over time, after we get everything up and running, I hope to improve further on this front, as we haven't done as much of the conditional permissions-based changes as I would like. But, first things first, we need to get re-launched soon. ;)
And the final reason to have most modern browsers installed is to check out how your site looks in all of them. I would NEVER describe myself as a website designer, but I am foolish enough to hack away at the CSS & HTML. Sometimes it works. Usually it doesn't. :)
Safari
Having all the browsers available (well, all of them except Safari) makes it easy to see if something works or not. That said, tools like Adobe BrowserLab and Browsershots are a nice complement to this approach. And I have Safari on my laptop, so if the design looks good elsewhere then it is typically good to go in Safari, so I check it last. If you use Safari as your primary browser, LastPass is good.
In the past I have highlighted how hype-driven hard launches often lead to hard landings. But what is even more challenging than a launch is a relaunch. Some relaunches are just flicking a switch, done mostly as a marketing gimmick. But those that involve real changes are brutal, largely because you have already built up expectations and have to manage them even while everything is changing and many things are not in your clear control. The more polished you become, the worse you look when things go awry. :D
An Error of Confidence
In our member forums, while using vBulletin 3, I became confident enough with upgrades that I did them myself without a programmer standing by. Then I did the vBulletin 4 upgrade and it broke the templates & forced us to create a new design. vBulletin 4 has all sorts of bizarre variables in it and a lot of members were at first put off by the new design that vBulletin forced me into. There was almost a visceral emotional reaction amongst some members, because we hate change that is forced upon us, especially if it feels arbitrary!
Based on that experience, I decided that when we upgraded Drupal and installed new member management software it made sense to pause user accounts in case anything went wrong. Lots has gone wrong with the update, so that turned out to be a good decision. Though at the same time it means I am spending well into 5 figures a month on upgrades and such while the site is producing no revenues.
I figured the no revenues part would encourage us to be as fast as possible, but Drupal 7 was a far more difficult change than vBulletin 4 was.
Guaranteed Broke
Whenever you do major upgrades, things will break. And it is virtually impossible to catch it all in advance. There are issues that happen with drop downs on certain browsers only when they are running certain operating systems with certain sized monitors, and all sorts of other technical fun stuff that doesn't appear until thousands of eyes have seen your website.
When you are new and obscure feedback comes in small bits and you keep getting incrementally better. But when everything changes you get hundreds of emails a day and it is nearly impossible to respond to them.
Did You Run Your Site Through a Geocities Generator?
We are trying our best to rush to fix stuff & get up to speed, but some issues that are fine on the day of an upgrade can appear crazy on day #2 due to how things interact. If we had our members' area accessible now, how would we justify & explain end users seeing something like this...where, bizarrely, our designs merged:
Weird bugs like that can be difficult to troubleshoot, especially when they are intermittent. We have to fix those huge issues before we can even consider launching (and we mostly already have). But then there are other things that break in other ways that need fixing too.
A Laundry List of Issues
Post comment permalinks that add 30,000 pages of duplicate content to your site. (mostly fixed)
Updates that wipe out the ability to reply to a threaded comment on blog posts. (still need to fix)
The default sign-up page is ugly & the pretty version is not posting to the default. (still need to fix)
Users who want our autoresponder are still not getting it, as it needs to be tested again before it sends any emails. (still need to fix)
Integrating on-site social proof of value & activity like recent comments and member information. (still need to fix)
Redirect issues for certain login types. (mostly fixed)
Enable multiple product tiers & levels. (still need to fix)
Cookie issues caused by old cookies set before the CMS upgraded to the new system. (fixed for those who cleared cookies already, not for those who haven't)
Password reset emails don't send new passwords, but a one-time login link.
But some of those login links might be so long that they wrap and are broken by certain email clients.
Do you build a custom hack to try to fix that directly? or
Do you wait until you install your membership permissions management software and run everything through that? or
Do you convert your email module to send HTML emails? HTML emails then require a lot more testing because they might get stripped by some email clients. (Or, perhaps the email goes through, but the unsubscribe link is broken, which immediately causes a douchebag freetard to open up a support ticket with "lawsuit pending" as the title.)
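For what it is worth, the HTML email route would look something like this (the URL below is a made-up stand-in for the real one-time login link):

```html
<!-- In an HTML email the long token hides in the href attribute,
     so the client's line wrapping can't split the URL the way it
     can in a plain-text message. -->
<a href="http://www.example.com/user/reset/1234/1296000000/AbC...">
  Click here to log in & set a new password
</a>
```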
That is only a partial list of items...there are literally about 100 more! And, as you can see from that last passwords issue, some corrections lead to additional issues. It is sorta like running up the side of a mountain carrying weights. :)
The challenging part of being a marketer, an SEO, and the guy who interacts with the customers is you deeply know how some things are flawed & that forces you to try to fix them as fast as possible. You can't just ignore the canonicalization issue that would be missed by most webmasters as you know the pain that leads to. :D
Brutal Pain
Even if you are pretty quick at fixing things, some things will still blow for a bit. Complex systems are complex.
Not only will freetards complain, but you will get other forms of legitimate friction simply by virtue of being. Lots of eyes are on your errors. Once you have a well known website there is a lifeflow that goes through it 24 hours a day - whether you are there or not. And if any of the common interactive paths are broken you will hear about it again and again and again. And again. :)
And yet, while you are trying to decide the best way to keep making things better you get emails that are condescendingly friendly. ;)
You guys have some very useful tools on this site and provide very useful seo information. Yet your site's user flow is surprisingly confounding and awkward. You guys strike me as practical internet marketers and I can't help but wonder why, if you were to upgrade anything on your site, you wouldn't have addressed your awkward user flow as opposed to spending time and money on some hipster faux web 2.0 window dressing. I know I'm not a paying customer...but I've always used this site for the keyword search tool and it has helped me drive traffic and increase eCPMs on my sites.
...
My guess is the type of people who use your site are not impressed by silly, day-glow, pastel makeovers and are more interested in useful seo data and information.
Nice. So they use us, make money from our work while paying us nothing, and yet they need to sling insults towards us while we publicly state that we are doing upgrades. Way to be a winner! If only everyone in the world was like them this site would disappear.
And they are completely wrong in suggesting that aesthetic doesn't matter. You can't quantify the losses without testing a different approach, but companies do not sink billions of Dollars into testing CPG packaging just for the fun of it. At a minimum a better looking site will increase trust. That leads to all sorts of other things like:
better perceived quality
lower perceived risk
higher conversion rates
being able to charge higher rates
higher visitor value
more media exposure
In many industries the winner is not whoever is well known within the industry, but rather whoever is safe and easy for outsiders to reference. Design is important for the same reason that domain names are. Either can yield an instant sense of credibility when done well & either can quickly take it away when done poorly.
And people who are new to an industry become the experts of tomorrow, so if they trust you more off the start then you build a self-reinforcing marketing channel. Whereas if you are not trusted you have to convince people to switch away from defaults after they already made their choice. And that is hard to do if they already passed you over once & your website is ugly.
And there is also the blunt straight talk feedback: "Your Products are bullshit."
I actually prefer the latter to the former because they don't insult your intellect by wrapping the insults in a passive-aggressive flowery packaging. (OK so I said a nice thing about him, so now I can REALLY insult him!!!)
Expecting Friction
One of the online issues that I think is rarely talked about is the issue of user friction. Media plays up the benefits of success but rarely highlights the cost of it. A popular game developer launched a hugely successful game at 99 cents & was devastated by his success:
I’m angry at a small percentage of customers who actively work towards harming its success. I’m angry at the customers who send me nasty emails or reviews, threatening me with ‘telling Apple to remove it’ or rating it 1 star with a ’should be cheaper than free’ remark because after paying the ridiculously exorbitant 99c, they found it didn’t live up to expectations. The absolute worst is users who condescendingly ‘try to help’ by outlining every little thing they think is wrong with it.
...
The anger, the sense of entitlement, and the overriding theme that I owe them something for daring to take up any of their time is sickening.
...
I can see now why many companies provide rubbish support, and have a ‘give us your money then piss off’ attitude. They have no doubt learned the hard way how soul destroying taking pride in your products can be.
That is a big part of the reason I abandoned the ebook business model. I felt that if I kept that model much longer I was going to have to sacrifice the quality of the customer interaction & be more like the companies I grew to hate. Rather than living that way we moved up-market and got a higher quality of customer. Another benefit of our current (or soon to be restored) model is that when people ask questions in a closed garden social setting almost nobody is comfortable acting like a troll. People generally won't write the stuff in a social setting that they would write in an email, especially if they are not fully anonymous and they know doing so will make them look like a jerk.
While we still have tons of things to fix, the first things we fixed were related to duplicate content (simply to avoid the pain) and some issues associated with the registration path. The things people are going to complain about most are generally the ones you need to fix first, because that ends up saving you time in the long run.
But if you price too cheaply (but not at free) then it is hard to ignore any of the feedback, even when it is ugly. This is why you are better off having higher prices & only converting a small portion of your audience. The folks from MagneticCat left a good comment on the above blog post:
$0.99 is an unsustainable price point. Because, if you sell 1 million games, you make $700,000 BEFORE taxes. A nice amount of money, but you also get 1 million customers – the amount of people living in a huge city – that could potentially have some problems with your game. Maybe because their iPhone’s accelerometer is broken, or because their headphone jack is not working anymore, or because there is an actual bug in your game.
We are only at about a half-million registered users & it is hard (20+ hour work days) to keep up when anything breaks. I can't imagine what it would be like to have a million PAYING customers. I think I would be sitting in the fetal position somewhere. ;)
That said, I am excited to get our site re-launched again and miss the daily water cooler nature of our forums. And based on the emails I am getting every day, so do many of our customers. Sorry for the delay guys!
Status Update
We have Drupal 7 installed on both parts of the site. We have 3 days of bug fixing left and testing our membership software (which will also take a couple days). We may try to do some of it concurrently & test our membership software Sunday or Monday & hope to have a recurring test & a cancellation test done by Tuesday evening for a soft launch to past subscribers. If that goes well then we would hope to do a full launch before the end of next week.
I have never been a huge fan of correlation analysis. The reason is that how things behave in aggregate may not have anything to do with how they would behave in your market, for your keywords, on your website.
Harmful High Quality Links?
A fairly new website ranked amazingly quickly on Google.com for a highly competitive keyword. It wasn't on the first page, but it ranked about #20 for a keyword that is probably one of the 100 most profitable keywords online (presuming you could get to a #1 ranking above a billion Dollar corporation).

The site did a promotion that was particularly well received by bloggers and a few bigger websites in the UK press, and at first rankings improved everywhere. Then one day while looking at its rankings using rank checker I saw the site had simply fallen off the map. It was nowhere. I then jumped into web analytics and saw search traffic was up. What happened was Google took the site as being from the UK, so its rankings went to page 1 in the UK while the site disappeared from the global results.

In aggregate we know that more links are better & links from highly trusted domains are always worth getting. And yet in the above situation the site was set back by great links. Of course we can set the geographic market inside Google Webmaster Tools to the United States, but how long will it take Google to respond? How many other local signals will need to be fixed to pull the site out of the UK?
Over time those links will be a net positive for the site, but it still needs to develop more US signals. And beyond those sort of weird things (like links actually hurting your site) the algorithms can look for other signals to push into geotargeting. Things like Twitter mentions, where things are searched for, how language is used on your website, and perhaps even your site's audience composition may influence localization. What is worse about some of these other signals is that they may mirror media coverage. If you get coverage in The Guardian a lot of people from the UK will see it, and so you might get a lot of Tweets mentioning your website that are from the UK as well. In such a way, many of the signals can be self-reinforcing even when incorrect.
Measuring The Wrong Thing
Another area where correlation analysis falls short is when one page ranks based on the criteria earned by another. Such signal bleeding means that if you are looking at things in aggregate you are often analyzing data which is irrelevant.
Sampling Bias
Correlation analysis also has an issue of sampling bias. People tend to stick with defaults until they learn enough to change. Unfortunately most CMS tools are set up in sub-optimal ways. If you look at the top ranked results some of the sub-optimal set ups will be over-represented in the "what works" category simply because most websites are somewhat broken. The web is a fuzz test.
Of course the opposite of the above is also true: some of the best strategies remain hidden in plain sight simply due to sheer numbers of people doing x poorly.
Analyzing Data Pairs Rather Than Individual Signals
Another way signals have blurred is in how Google uses page titles in the search results. The displayed title generally used to be just the page title, but more recently they started mixing in:
using an on-page heading rather than the page title (when they feel the on-page heading is more relevant)
adding link anchor text into the title (in some cases)
adding the homepage's title at the end of sub-page titles (when sub-page titles are short)
As Google adds more signals & changes how they account for signals, it makes analyzing what they are doing much harder. You not only need to understand how the signals are used, but how they interact in pairs or groups. When Google uses the H1 heading on a page to display in the search results are they still putting a lot of weight on the page title? Does the weighting on the H1 change depending on whether Google is displaying it or not?
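To make the pairing concrete, consider a hypothetical page where the two diverge:

```html
<!-- Hypothetical page: the <title> and the on-page heading say
     different things. For some queries Google may display the H1
     ("Bulk PageRank Checker") as the result title instead of the
     page title, which muddies any analysis that only looks at
     <title> tags in isolation. -->
<head>
  <title>Free Tools | ExampleSEOSite.com</title>
</head>
<body>
  <h1>Bulk PageRank Checker</h1>
</body>
```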
Along the same lines, any given snapshot of search is nowhere near as interesting as understanding historical trends and big shifts. If you are one of the first people to notice something there is far more profit potential than being late to the party. Every easily discernible signal Google creates eventually gets priced close to (or sometimes above) true market value. Whereas if you are one of the first people to highlight a change you will often be called ignorant for doing so. :D
Consensus is the opposite of opportunity.
When you do correlation analysis you are finding out when the market has conformed to what Google trusts & desires. Exact match domains were not well ranked across a wide array of keywords until after Google started putting more weight on them & people realized it. But if there is significant weight on them today & their prices are sky high, then knowing they carry some weight might not represent real profit potential in your market. It might even be a distraction or a dead end. Imagine being the person who bets (literally) a million Dollars that Google will place weight on poker.org, only to find out that Google changes their algorithmic approach & weighting, or makes a special exception just for your site (as they can & have done). That day would require some tequila.
As a marketing approach becomes more mainstream, not only do the costs rise, but so does the risk of change. As people complain about domain names (or any other signal or technique) it makes Google more likely to act to curb the trend and/or lower its weighting & value. To see an extreme version of this, consider that the past year has seen lots of complaints about content farms. A beautiful quote:
Searching Google is now like asking a question in a crowded flea market of hungry, desperate, sleazy salesmen who all claim to have the answer to every question you ask.
Over the past year or 2 there have been lots of changes with Google pushing vertical integration, but outside of localization and verticalization, core relevancy algorithms (especially in terms of spam fighting) haven't changed much recently. There have been a few tricky bits, but when you consider how much more powerful Google has grown, their approach to core search hasn't been as adversarial as it was a few years back (outside of pushing more self promotion).
There has been some speculation as to why Google has toned down their manual intervention, including:
anti-trust concerns as Google steps up vertically driven self-promotion (and an endless well of funding for anyone with complaints, courtesy of Microsoft)
a desire to create more automated solutions as the web scales up
spending significant resources fighting site hacking (the "bigger fish to fry" theory)
As we’ve increased both our size and freshness in recent months, we’ve naturally indexed a lot of good content and some spam as well. To respond to that challenge, we recently launched a redesigned document-level classifier that makes it harder for spammy on-page content to rank highly. The new classifier is better at detecting spam on individual web pages, e.g., repeated spammy words—the sort of phrases you tend to see in junky, automated, self-promoting blog comments. We’ve also radically improved our ability to detect hacked sites, which were a major source of spam in 2010. And we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content.
It sounds like Google was mainly focused on fighting hacked sites and auto-generated & copied content. And now that hacked *GOVERNMENT* websites are available for purchase for a few hundred Dollars (and perhaps millions in personal risk when a government comes after you), Google's push toward fighting site hacking looks like a smart move! Further, there are a wide array of start ups built around leveraging the "domain authority" bias in Google's algorithm, which certainly means that looking more at page-by-page metrics was a needed strategy to evolve relevancy. Page-by-page metrics will also allow Google to filter out the cruddy parts of good sites without killing off the whole site.
As Google has tackled many of the hardcore auto-generated spam issues, it allows them to ramp up their focus on more vanilla spam. Due to a rash of complaints (typically from web publishers & SEO folks), content mills are now a front and center issue:
As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content. In 2010, we launched two major algorithmic changes focused on low-quality sites. Nonetheless, we hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect. The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception.
But what sort of sites are the content mills that Google is going to ramp up action on?
The tricky part with vanilla spam is the subjective nature of it. End users (particularly those who are not web publishers & online advertisers) might not complain much about sites like eHow because they are aesthetically pleasing & well formatted for easy consumption. The content might be at a low level, but maybe Google is willing to let a few of the bigger players slide. And there is a lot of poorly formatted expert content which end users would view worse than eHow, simply because it is not formatted for online consumption.
However, in a Hacker News thread about Matt's recent blog post he did state that they have taken action against Mahalo: "Google has taken action on Mahalo before and has removed plenty of pages from Mahalo that violated our guidelines in the past. Just because we tend not to discuss specific companies doesn't mean that we've given them any sort of free pass."
My guess is that sites that took a swan dive in the October 23rd timeframe might expect to fall off the cliff once more. Where subjective search relevancy gets hard is that issues rise and fall like ocean waves crashing ashore. Issues that get fixed eventually create opportunities for other problems to fester. And after an issue has been fixed long enough it becomes a non-issue, to the point of being a promoted best practice, at least for a while.
Anyone who sees opportunity as permanently disappearing from search is looking at a half-empty glass, rather than a half-full one in which opportunities that died are reborn again and again.
That said, I view Matt's blog post as a bit of a warning shot. What types of sites do you think he is coming after? What types of sites do you see benefiting from such changes? Discuss. :)
Google likes to make SEOs look like fools. Some are, but some are simply privy to less information. Or, in some cases, thrown under the bus by a new wave of editorial policy in the gray area. Inconsistent enforcement is a major issue, but even if you go beyond that, the truth is most businesses have a range of revenue streams, from as pure as can be to entirely parasitic.
In Manufacturing Consent Noam Chomsky highlights that we should judge actions based on an equality of principles & that we are responsible primarily for our own actions. Yet Google complains about Microsoft. It took Microsoft less than a day to clean up their act, while Google still hasn't fixed issues that were highlighted publicly 6 years ago!
Many Subjective Warnings
Not only is Google trying to police their competitors, but recently they have offered warnings on all sorts of subjective issues, like...
an out of context tweet on cloaking: "Google will look more at cloaking in Q1 2011. Not just page content matters; avoid different headers/redirects to Googlebot instead of users."
Individually, each of those issues can be debated.
In our new site design our navigation is aggressively repetitive in some areas. The reason we did that was some people complained about not being able to effectively get around the site. To help make the navigation more intuitive and consistent we use drop downs and in some cases have 3 or 4 or even 5 links to the same location. Is that optimal from a search perspective? Probably not. But then again, search engines don't convert into paying customers. They are simply a conduit...a means to an end. When an engineer views a site they might not view it through the same lens as a customer would.
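To illustrate the trade-off, here is a simplified fragment of what that repetition looks like in practice (the markup and URLs are invented for this example):

```html
<!-- Sitewide drop down: one path to the keyword tool... -->
<ul id="main-nav">
  <li><a href="/tools/">Tools</a>
    <ul>
      <li><a href="/tools/keyword-tool/">Keyword Tool</a></li>
    </ul>
  </li>
</ul>

<!-- ...and the same destination repeated in a sidebar block, so a
     lost visitor always has a nearby route, even though a search
     engine sees several links pointing at one URL. -->
<div id="sidebar">
  <a href="/tools/keyword-tool/">Keyword Tool</a>
</div>
```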
What is an unnatural link profile? Does it depend on who is building the links? We know that at an SEO conference when some of IAC's cross linking was highlighted, Matt Cutts stated "those don't count" but didn't qualify it any further. Likewise, when it was highlighted how Mahalo was link farming, we were told they deserved the benefit of the doubt. Since then the link farms have grown and mutated. I won't link to ask.inc.com/when-should-i-hire-a-company-for-lead-generation, but if I was told that sort of thing is "natural" and "good to go" then I would have no problem building a few million links a week. Then again, I bet it would be "unnatural" if I did the same thing.
The part about treating Googlebot different from users is a bit perplexing. As technology has evolved this area has become quite blurry/murky.
Sometimes when clicking into big media sites that are 'first click free' I get kicked right to a registration page. In the past some iTunes pages would rank & force you into the iTunes software (though that may have recently changed).
Tools like Google Website Optimizer can be used to alter user experience significantly.
There is an SEO start up which pushes search visitors on sites like CNN to a heavily ad-wrapped & paginated version of the same content.
I accidentally screwed up using a rel=canonical tag on a page (I cut the source code from a dynamic page, pasted it as the basis for a similar static page & forgot to remove the rel=canonical tag). Eventually I figured out what was wrong & fixed it, but both the correct and incorrect pages ranked for weeks at #1 and #2. And isn't the whole point of the rel=canonical tag to give the search engines a different type of header than an end user sees (telling the search engine the content is elsewhere while telling the user nothing of the sort)?
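For reference, the tag in question is a single line in the page's head, which is exactly what makes it so easy to carry over by accident when copying source code (the URL here is a placeholder):

```html
<!-- Tells engines "the canonical copy of this content lives at the
     URL below" while a human visitor sees nothing different. Left
     in a pasted template, it quietly points the new static page at
     the old dynamic one. -->
<link rel="canonical" href="http://www.example.com/dynamic-page/" />
```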
But which of those is a problem? None of them? All of them? Does it depend on who you ask? Does it depend on perceived intent? And that is the tricky part because the same impact can be had with many different tools like 301 redirects, meta refreshes, javascript redirects, and rel=canonical. Should they penalize the technique, the intent, or the result? How do they imply intent?
The thing is, Google is in a position to imply intent as they see fit. They are in a position to tilt the playing table as they see fit. They claim to be open and sometimes they are fighting the good fight, but businesses have a range of revenue streams from pure as can be to entirely parasitic.
The leaked internal Google documents about copyright and ads on trademarks certainly highlight that Google has no problem with a foot in each pond.
Syndication has long been a part of the media landscape, where portals chose what bits to mix in from where. But how much is fine & what should be done with duplicates? When does something go from 'legitimate syndication' to 'overt spam'? We see official Google blog posts which claim that AdSense ads are not allowed on unoriginal content, while Google has multiple large partners that wrap Google's search results in the AdSense feed and then serve it back to Google. Site categories which were described as 'shoot on sight' become viable enterprises when a person puts a web 2.0 design, venture capital & some public relations into the same basic business model.

If Google is going to put out some 'thou shalt not' styled commandments under the label of 'fact vs fiction' they should have consistent enforcement of obvious issues that have been brought up publicly numerous times, including on the very post highlighting the policy. But we know they won't! They only care about dirty business practices if they are not getting a taste of the revenue stream (as shown by their BearShare partnership while policing Bing affiliates).
After purchasing Youtube Google rolled out their universal search & was fine with aggressively promoting Youtube over other video services. Only recent government reviews have pushed Google to give 3rd party services a fair shake, but the network effects and lead are likely already too great to overcome.
Due to past search bias, Google might get blocked from completing the ITA deal. The good news for publishers going forward is that, due to increasing regulatory heat, Google will only go after a small number of verticals where the payouts are huge. The regulatory blowback will be too great for them to try to be all things to all people.