The Google example demonstrates that no one can be good at everything, even if they do hire a lot of smart people and have billions in the bank. People, like companies, are good at doing some things, and are okay, or poor, at everything else. Not because they can’t do other things, but because they don't have the time, inclination or disposition.
This truth has led to specialization. Individuals specialize. Companies specialize. Countries specialize. In economics, this is the principle of comparative advantage and it is the basis of trade. We do the things we’re good at, and buy in the things we’re not so good at, or don’t want to do.
We Can’t Be Good At Everything
Like Google, we can’t be good at everything.
Many small businesses make the mistake of trying to do everything, mainly due to lack of resources. However, the opportunity cost of trying to do everything can mean they end up being not very good at doing any one thing. This approach can make them uncompetitive.
Just because we can do something doesn’t mean we should.
Like many web professionals, I can do some coding. Some web design. A bit of this. A bit of that. But I know there are people who are way better at those things than I am, so I let them do it. If I focus on what I’m good at, then I can make money doing that, and buy in the skills I need.
There are many benefits to specialization. For starters, it’s much easier to build a reputation or brand. Who is the go-to guy for Search Engine News? Many people would answer Danny Sullivan. Who is the go-to guy for search patents? That would be Bill.
Another advantage is that specialization leads to optimized, more efficient processes, and therefore lower costs. The specialist can optimize and improve their approach to a niche activity in a way a generalist seldom can, because their focus means they are more likely to see the details.
Most of us occupy very crowded marketplaces, which makes it difficult to stand out as a generalist. Brands, and reputations, can get confused and diluted if businesses spread themselves over multiple service areas. Virgin gets away with it - mainly because of the brand that is Richard Branson - but they are an exception, not the rule.
Not that this article is about the merits of specialization vs being a generalist. More a case of optimizing a business to focus on those areas that are most lucrative, and literally weeding out everything else.
Weeding Out Clients
A lot of companies, like a lot of people, live paycheck to paycheck. They don’t want to turn down any business, because the more clients, the better, right? The more opportunities, the better?
Not all clients are equal. Not all opportunities are a good fit. A client who costs a lot to service, who doesn’t pay their bills on time, who makes life difficult for you is probably not a client worth having. Sure, they might help keep us going to the next paycheck, but this is not an optimal way to run a sustainable business long term. Such clients present an opportunity cost, i.e. we could be working with better clients, making better money, and honing our service around mutual benefit.
For this reason, many companies make a habit of firing clients, or never take them on in the first place.
For example, last year, I received a letter from my accountant. She advised me the firm was reviewing its business and letting a lot of clients go, although they were still happy to work with me, and asked that I have a chat with them if I had any concerns.
I did have a chat with them, mainly to confirm my suspicions.
They were deliberately getting rid of 50% of their clients. They had figured out who their top clients were, i.e. the clients who took them the least time to service because their books were in order, and they eliminated the rest, i.e. those clients whose books were a mess and who were generally a pain to deal with. They downsized their business, reduced overhead, and now tell me they are making more money than they previously were due to their optimized cost structure. They also appear to be playing a lot more golf!
They optimized their business, became more profitable, and have a lot more time because they made a point of figuring out the core of their business, and saying “no” to everything else.
Say No
Saying no can be very powerful. Prospective clients seem to respect this more, not less. There is something very appealing about a service that is exclusive and beyond reach. It signals a level of confidence that can be attractive.
Exclusive positioning is not just done for the sake of it. It’s a way to filter clients in order to find a good fit, which is especially important for small companies, as they have less resources available to carry bad risks. If we can figure out a client need that we know we can service well (specialization), with sufficient margins for us to be enthusiastic, and the client gets the value they were looking for, then everyone wins.
Let’s say running a PPC bid management service earns an internet marketing company the most money with the least effort. Let’s say they also do web design, but this is a lot more work (read: higher cost to service), and the margins are lower.
Would this company be better off saying “no” to new web design business? Quite possibly. They could dedicate more time to PPC, their PPC processes would get more refined through increased specialization, and they would likely be better placed to compete in the PPC space as their brand and attention becomes more focused. They could let go of the web designer, thus reducing overhead.
Granted, there are many factors to consider, but the question is this: are some areas of your business being serviced only because you can? Or does it make sense to focus on the areas where there is best fit? i.e. better margins, lower costs, most productive relationships - even if that means letting some clients, and even some staff, go?
You might be working for a boss who is an idiot. You know he makes stupid decisions. When he's off making yet another stupid decision, it's you left doing all the work. As for job security - that's a joke these days. He can fire you on a whim.
So why not cut out the weak link? Why not go into business for yourself?
The reality, of course, is that starting a business is easier said than done. You'll likely work longer hours, for less money, and there are no guarantees. While your friends are looking forward to the weekend, you might not see a weekend for a while. Most small businesses fail in their first five years, taking dreams and savings along with them.
Everything has a downside.
However, many businesses not only survive, they prosper. They make their founders wealthy. Even if they don't make fortunes, they can provide lifestyle benefits that are near impossible to achieve with a regular job. There's a lot to be said for being the master of your own destiny.
To achieve that, it's best to start with some good advice.
Makin’ Mistakes
I've been running my own small business for a decade now. Whilst it's been rewarding, and I achieved the goals I set for myself, there have also been a fair few missed opportunities and inevitable wrong turns. I jumped in blind, and like many in the search marketing industry, pretty much made it up as I went along.
Not that there's anything wrong with that.
But I wish I had understood a few fundamental truths first. I wish someone had imparted some profound wisdom, and I wish I had been smart enough to listen. Come to think of it - they did, and I wasn't.
Such is life.
I’m in the process of setting new goals for the next few years. I'm restructuring. So I decided to reflect on the past, examine the good and the bad, and try to do more of the former, and less of the latter.
One of the problems I identified was that I was spreading myself way too thin over many projects. I have a *lot* of sites. I have domains I’d even forgotten I owned. I have domain names I keep renewing, vowing to do something with one day, yet never getting around to it.
In short, I was growing an awful lot of small pumpkins.
Getting The Fundamentals Right
I’ve decided to ditch almost all of what I have been doing in the past, and focus on a very narrow range of activities, one of which is working with Aaron on SEOBook.
One book I really wish I'd read when I was starting out - had it been available, which it wasn’t - is called "The Pumpkin Plan: A Simple Strategy To Grow A Remarkable Business". I'd like to share the central theme of the book with you, because I think it's a great lesson if you're thinking of starting a business, or, like me, optimizing an existing one.
It’s the lesson I wish I’d understood when I started. I certainly hope it’s of help to someone else :)
If You Want To Prosper, Learn To Grow Pumpkins
There are geek farmers who obsess about growing huge pumpkins. They are the hackers of the vegetable world. In order to grow a huge pumpkin - weighing half a ton or more - you can’t just throw seeds on the ground. You can’t grow a whole lot of pumpkins and hope one of them turns out to be huge.
You’ve got to follow a process.
And here it is:
Step One: Plant promising seeds
Step Two: Water, water, water
Step Three: As they grow, routinely remove all of the diseased or damaged pumpkins
Step Four: Weed like a mad dog. Not a single green leaf or root permitted if it isn’t a pumpkin plant
Step Five: When they grow larger, identify the stronger, faster-growing pumpkins. Then, remove all the less promising pumpkins. Repeat until you have one pumpkin on each vine.
Step Six: Focus all of your attention on the big pumpkin. Nurture it around the clock like a baby and guard it like you would your first Mustang convertible
Step Seven: Watch it grow. In the last days of the season this will happen so fast you can actually see it happen
What’s this got to do with business? It’s a process for growing not just pumpkins, but businesses. Let's apply it:
Step One: Identify and leverage your biggest natural strengths
Step Two: Sell, Sell, Sell
Step Three: As your business grows, fire all your small time, rotten clients
Step Four: Never, ever let distractions - often labelled as new opportunities - take hold. Weed them out fast.
Step Five: Identify your top clients and remove the rest of the less promising clients
Step Six: Focus all your attention on your top clients. Nurture and protect them. Find out what they want more than anything and if it's in alignment with what you do best, give it to them. Then, replicate the same service or product for as many of the same types of top client as possible
Step Seven: Watch your company grow to a giant size
In essence, it’s about focusing on those things you do best. It’s about focusing on your very best customers, and ditching the rest. It’s about creating your own niche by identifying and solving the problems that no one else does.
None of this is new, of course. There are plenty of business advice books that say similar things. However, this is one of those great little stories I wish I had internalized earlier. Rather than grow a lot of small pumpkins, focus on growing those that matter.
Given recent changes at Google, I dare say a lot of SEOs - particularly those who run their own small sites - may be rethinking their approach. Unfortunately, the small guy is being squeezed and the rewards, like in most endeavours, are increasingly flowing to large operations. Search conferences, which used to be the domain of the lone-wolf affiliate guy and mom and pop businesses, are now jammed full of corporates and their staff. The entire landscape is shifting. New approaches are required, not just in terms of tactics, but in the underlying fundamentals.
It would be interesting to hear your lessons in business. What are the things you know now that you wished someone had told you when you started? Please share them in the comments.
Google launched a disavow links tool. Webmasters who want to tell Google which links they don’t want counted can now do so by uploading a list of links in Google Webmaster Tools.
If you haven’t received an “unnatural link” alert from Google, you don’t really need to use this tool. And even if you have received notification, Google are quick to point out that you may wish to pursue other avenues, such as approaching the site owner, first.
Webmasters have met with mixed success following this approach, of course. It's difficult to imagine many webmasters going to that trouble and expense when they can now upload a txt file to Google.
Careful, Now
The disavow tool is a loaded gun.
If you get the format wrong by mistake, you may end up taking out valuable links for long periods of time. Google advise that if this happens, you can still get your links back, but not immediately.
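For reference, the file Google expects is a plain text file with one entry per line: a full URL to disavow a single link, or a line beginning with "domain:" to disavow every link from a domain. Lines beginning with "#" are treated as comments. A hypothetical example (the domains here are made up for illustration):

```
# Contacted the owner of spamdirectory.example on 10/1/2012
# to ask for link removal - no response, disavowing whole domain
domain:spamdirectory.example

# One specific scraper page, rest of the site is fine
http://blog.example.net/autogen/page-copy.html
```

Note how unforgiving the format is: a stray "domain:" line aimed at the wrong site would disavow every link from that domain, which is exactly the kind of mistake that can take out valuable links for long periods.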
Could the use of the tool be seen as an admission of guilt? Matt gives examples of "bad" webmaster behavior, which comes across a bit like “webmasters confessing their sins!”. Is this the equivalent of putting up your hand and saying “yep, I bought links that even I think are dodgy!”? May as well paint a target on your back.
Some webmasters have been victims of negative SEO. Some webmasters have had scrapers and autogen sites that steal their content, and then link back. There are legitimate reasons to disavow links. Hopefully, Google makes an effort to make such a distinction.
Not only is it difficult working out the links that may be a problem, it can be difficult getting a view of the entire link graph. There are various third party tools, including Google’s own Webmaster Central, but they aren’t exhaustive.
Matt mentioned that the link notification emails will provide examples of problem links; however, this list won't be exhaustive. He also mentioned that you should pay attention to the more recent links, presumably because if you haven't received notification up until now, then older links weren't the problem. The issue with that assumption is that links that were once good can over time become bad:
That donation where you helped a good cause & were later mortified that "online casino" and "discount cheap viagra" followed your course for purely altruistic reasons.
That clever comment on a well-linked PR7 page that is looking to cure erectile dysfunction 20 different ways in the comments.
Links from sources that were considered fine years ago & were later repositioned as spam (article banks anyone?)
Links from sites that were fine, but a number of other webmasters disavowed, turning a site that originally passed the sniff test into one that earns a second review revealing a sour stench.
This could all get rather painful if webmasters start taking out links they perceive to be a problem, but aren’t. I imagine a few feet will get blasted off in the process.
Webmasters Asked, Google Gaveth
Webmasters have been demanding such a tool since the un-natural notifications started appearing. There is no question that removing established links can be as hard, if not harder, than getting the links in the first place. Generally speaking, the cheaper the link was to get, the higher the cost of removal (relative to the original purchase price). If you are renting text link ads for $50 a month you can get them removed simply by not paying. But if you did a bulk submission to 5,000 high PR SEO friendly directories...best of luck with that!
It is time consuming. Firstly, there’s the overhead in working out which links to remove, as Google doesn’t specify them. Once a webmaster has made a list of the links she thinks might be a problem, she then needs to go through the tedious task of contacting each site and requesting that a link be taken down.
Even with the best will in the world, this is an overhead for the linking site, too. A legitimate site may wish to verify the identity of the person requesting the delink, as the delink request could come from a malicious competitor. Once identity has been established, the site owner must go to the trouble of making the change on their site.
This is not a big deal if a site owner only receives one request, but what if they receive multiple requests per day? It may not be unreasonable for a site owner to charge for the time taken to make the change, as such a change incurs a time cost. If the webmaster who has incurred a penalty has to remove many links, from multiple sites, then such costs could quickly mount. Taken to the (il)logical extremes, this link removal stuff is a big business. Not only are there a number of link removal services on the market, but one of our members was actually sued for linking to a site (when the person who was suing them had paid to place the link!)
It’s hard to imagine this data not finding its way to the manual reviewers. If there are multiple instances of webmasters reporting paid links from a certain site, then Google have more than enough justification to take it out. This would be a cunning way around the “how do we know if a link is paid?” problem.
Webmasters will likely incorporate bad link checking into their daily activities. Monitoring inbound links wasn’t something you had to do in the past, as links were good, and those that weren’t, didn’t matter, as they didn’t affect ranking anyway. Now, webmasters may feel compelled to avoid an unnatural links warning by meticulously monitoring their inbound links and reporting anything that looks odd. Google haven’t been clear on whether they would take such action as a result - Matt suggests they just reclassify the link & see it as a strong suggestion to treat it like the link has a nofollow attribute - but no doubt there will be clarification as the tool beds in. Google has long used a tiered index structure & enough complaints might lower the tier of a page or site, cause its ability to pass trust to be blocked, or cause the site to be directly penalized.
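One way this monitoring might look in practice is a simple diff of link exports over time: compare this week's inbound-link list against last week's snapshot and review whatever is new. A minimal sketch, assuming you can export your links as plain text, one URL per line (the URLs and function names below are hypothetical):

```python
# Minimal sketch: flag newly appearing inbound links by diffing two
# link exports (e.g. from Webmaster Tools or a third-party link tool).

def load_links(text):
    """Parse a link export (one URL per line) into a set of normalized URLs."""
    return {line.strip().lower() for line in text.splitlines() if line.strip()}

def new_links(previous, current):
    """Return links present in the current export but not the previous snapshot."""
    return sorted(current - previous)

# Example with inline data; in practice, read the text from exported files.
last_week = load_links(
    "http://example.com/page\n"
    "http://blog.example.org/post"
)
this_week = load_links(
    "http://example.com/page\n"
    "http://blog.example.org/post\n"
    "http://spammy-directory.example.net/links"  # a new, suspicious link
)

for url in new_links(last_week, this_week):
    print(url)  # review these manually for anything that looks odd
```

The point isn't the code itself, it's the workflow: the burden of noticing bad links, which used to be nobody's job, now falls on the webmaster.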
This is also a way of reaffirming “the law”, as Google sees it. In many instances, it is no fault of the webmaster that rogue sites link up, yet the webmaster will feel compelled to jump through Google’s hoops. Google sets the rules of the game. If you want to play, then you play by their rules, and recognize their authority. Matt Cutts suggested:
we recommend that you contact the sites that link to you and try to get links taken off the public web first. You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business.
Left unsaid in the above is most people don't have access to aggregate link data while they surf the web, most modern systems of justice are based on the presumption of innocence rather than guilt, and most rational people don't presume that a site that is linked to is somehow shady simply for being linked to.
If the KKK links to Matt's blog tomorrow that doesn't imply anything about Matt. And when Google gets featured in an InfoWars article it doesn't mean that Google desires that link or coverage. Many sketchy sites link to Adobe (for their flash player) or sites like Disney & Google for people who are not old enough to view them or such. Those links do not indicate anything negative about the sites being linked into. However, as stated above, search is Google's monopoly to do with as they please.
On the positive side, if Google really do want sites to conform to certain patterns, and will reward them for doing so by letting them out of jail, then this is yet another way to clean up the SERPs. They get the webmaster on side and that webmaster doing link classification work for them for free.
Who, Not What
For a decade search was driven largely by meritocracy. What you did was far more important than who you were. It was much less corrupt than the physical world. But as Google chases brand ad dollars, that view of the search landscape is no longer relevant.
Large companies can likely safely ignore much of the fear-first approach to search regulation. And when things blow up they can cast off blame on a rogue anonymous contractor of sorts, whereas smaller webmasters walk on eggshells.
When the government wanted to regulate copyright issues Google claimed it would be too expensive and kill innovation at small start ups. Google then drafted their own copyright policy from which they themselves are exempt. And now small businesses not only need to bear that cost but also need to police their link profiles, even as competitors can use Fiverr, ScrapeBox, splog link networks & various other sources to drip a constant stream of low cost sludge in their direction.
Now more than ever, status is important.
Gotchas
No doubt you’ve thought of a few. A couple thoughts - not that we advocate them, but realize they will happen:
Intentionally build spam links to yourself & then disavow them (in order to make your profile look larger than it is & to ensure that competitor who follows everything you do - but lacks access to your disavow data - walks into a penalty).
Find sites that link to competitors and leave loads of comments for the competitor on them, hoping that the competitor blocks the domain as a whole.
Find sites that link to competitors & buy links from them into a variety of other websites & then disavow from multiple accounts.
Get a competitor some link warnings & watch them push to get some of their own clean "unauthorized" links removed.
The webmaster who parts on poor terms burning the bridge behind them, or leaving a backdoor so that they may do so at anytime.
If a malicious webmaster wanted to get a target site in the bad books, they could post obvious comment spam - pointing at their site, and other sites. If this activity doesn’t result in an unnatural linking notification, then all good. It’s a test of how Google values that domain. If it does result in an unnatural link notification, the webmaster could then disavow links from that site. Other webmasters will likely do the same. Result: the target site may get taken out.
To avoid this sort of hit, pay close attention to your comment moderation.
Please add your own to the comments! :) Gotchas, that is, not rogue links.
There is a good thread on Webmaster World entitled “Next Generation SEO”. Many webmasters hit hard by the uncertainty created by Panda and Penguin are wondering what approaches might work best in the future.
Here at SEOBook, we’ve long advocated the approach of grounding SEO within a solid marketing-based strategy, so we post frequently on this theme. But this is not to say the technical side - algorithm gaming - no longer works, because it most certainly does.
Let’s take a look at both approaches, and how they can be fused.
Rationale Of SEO
The rationale of search engine optimization is simple. If we can work out what factors the search engine algorithms favor, we can do more of it. Our reward will be a high ranking, meaning more people can find us, which results in more traffic.
In theory, the incentive of the search engine and that of the SEO should be aligned. The SEO works out exactly what the search engine wants, and hands it to them on a plate.
The problem is reality.
Search engines, in reality, aren’t nearly as clever as those running them sometimes make out. They are susceptible to gaming. Sergey Brin, a few years back, even went so far as to suggest “there is no spam, only bad algorithms”. The existence of a webspam team appears to indicate the algorithms still need plenty of work.
There are other problems in terms of incentives. The search engines make their money not by the relevance of the search results, but by the relevance of the advertising. Unless competition is finely balanced between search engines - which it hasn’t been for many years now - the search results need only be “good enough”. Does a search engine really want the searcher finding the most relevant content in the main search results? The business case would demand something a little more subtle, which is not good news for webmasters who rely on gaming the algorithms.
Secondly, a search engine would prefer the money being spent on SEO was going into PPC. If it is considerably cheaper to game the algorithm than run PPC, then few people would bother with PPC. In this respect, SEOs are a competitive threat to the search engines' advertising business, yet the incentive for the webmaster is to minimize marketing spend. The search engines, therefore, have an incentive to drive up the cost of SEO.
Gaming The Algorithm
Given search engine algorithms aren’t yet as clever as those who run them would like, the search engines engage in a mixture of algorithm adjustments and FUD in order to counter the effectiveness of search engine optimization. Lately, they have been ramping up activity in both areas, which suggests to me that they see “enthusiastic” search engine optimization as a significant problem.
SEO is a significant problem because it works.
The minute something starts to work a little too well, and the knowledge of how to do it becomes a little too widespread, the search engines have typically jumped in to reset the game. This tends to affect the “low-hanging fruit techniques” i.e. anything that is cheap to do, widely known, and therefore widely practiced.
Take link building. It’s a well-known fact that links will result in higher rankings. Google recently jumped into this area, with a mix of adjustments and FUD, to scare off those who buy links. We have the now ridiculous scenario whereby webmasters are paying to have links removed. I can imagine the Google “web quality” team in fits of laughter. Meanwhile, a cottage industry has sprung up in response, i.e. people demanding money to delink.
Many of the links webmasters are removing aren't causing the problem.
Link buying still works. Webmasters just need to stay away from the most blatant “low hanging fruit techniques”, such as identical anchor text and obvious paid link networks. At SEOBook, we’ve made side-by-side comparisons of sites that were harmed by their links, and those that weren’t, yet there are often only minor differences in their link graphs. Paid links aren't the problem. Failing to mix it up is the problem.
Search engine optimization is also about content, and the evidence doesn’t suggest that having a lot of content is a negative. It depends where content sits. If the domain has a strong link structure, then the content can be… less than stellar. If your site lacks a stellar link graph, then it becomes more important to have engaging content, i.e. content that produces fewer immediate click-backs.
Moving Forward
The problem with algorithmic-centric approaches is that when the algorithm shifts, so too does the approach. This is fine for people who like chasing the algorithms. It works less well for those who don’t have a lot of time and money to spend doing so, or have less scope to radically change approach.
“Next generation SEO” is essentially “marketing”, whilst keeping the search engines firmly in mind. It's what good SEO has always been about, it's just that some in the industry have lost focus and become obsessed with traffic for its own sake.
It’s about building links for traffic, juice and awareness. It’s about establishing networks, particularly in the social space, as we have to go where the people hang out. It means paying attention to the rise of mobile usage. It means paying attention to verticals, such as Amazon. There is evidence to suggest people are moving away from search engines, or not using them quite as much.
In short, SEO will become more like a pay-per-click approach, without paying per click.
Mix And Match
This is not to say you need to do any of the above. You can still do well in SEO by publishing pages and pointing links at them. However, there are many aspects that can be optimized. If nothing else, optimizing a web business can be more fun, and more prosperous, than chasing Google’s near constant algorithm updates in pursuit of raw traffic.
SEO will be around for many years yet. As long as search engines are a conduit for traffic, there will be people doing SEO. It’s fair to say the barrier to entry has been raised, so it’s now more difficult and therefore costly to undertake SEO. This flushes out some competitors, however it is rarely those with holistic marketing strategies that get flushed.
We optimize to align a site with search engine algorithms in order to gain higher rankings, which, in turn, leads to visitor traffic. Other forms of optimization occur after the visitor has landed on our pages.
One such optimization is called persuasion optimization.
After going to the effort of getting a visitor to land on our pages, the last thing we want them to do is to click back. We want them to read and act upon our messages.
There Are Many Ways To Persuade
Robert Cialdini, a Professor of Psychology at Arizona State University, identified six categories into which persuasion techniques commonly fall: reciprocity, consistency, social proof, liking, authority and scarcity.
We can use these techniques to optimize both our content and site design so that visitors are more likely to stay on our pages, and more likely to convert to desired action.
Whilst these techniques could be seen as being manipulative, it depends how they’re used. If used in good faith, they’re a natural part of the ritual involved in selling people on our point of view. On the other hand, being aware of these techniques makes them easy to spot if used against us!
1. Reciprocity
Reciprocity is when we give something to someone, and they return the favour. The act of reciprocity is so ingrained in our culture, it can occur whether the person asked for the favour or not, and whether the people involved previously knew each other, or not.
Reciprocity creates an obligation.
Examine your offer to see if you can give something of real value away. For example, some information product vendors give away large chunks of the product, or long trials. People may reciprocate by paying for the remaining sections or full product. They may have been less likely to do so if the vendor gave less value up front. Think about what you can do for your audience, rather than the other way around.
Another way of thinking about reciprocity is to provide a concession. If you concede something small, but do so early, the other party may feel obligated to concede something greater later on.
For example, ask for something significant. When this is turned down, ask for something moderate - the moderate request being what you wanted all along. The second request is more likely to be accepted as it appears you’ve already made a concession, so the other party feels obligated to do likewise.
2. Commitment and Consistency
People like people to be consistent.
People who lack consistency can be seen as untrustworthy or disorganized. If we’re consistent, it reduces complexity, because other people don’t have to re-evaluate us each time they need to make a decision about us. They merely need to remain consistent with their previous evaluation of us, and their previous decision. If we start acting differently, it forces a re-evaluation.
The same goes for websites.
Look for areas of your website where the messages may conflict. This could be as obvious as a mistake in the copy about the offer, or as subtle as a change in tone of voice. Each page should flow from one to the next in a consistent manner, using consistent tone and design, and the message should not contradict, or wander off on unexpected tangents.
There are exceptions of course. If you’re trying to shock people, or draw attention to something out of the ordinary, then playing against consistency can work. However, consistency would have to have been established first, before it’s possible to successfully play against it.
3. Social Proof
Does your site show evidence that other people find it valuable? Examples may include testimonials, reviews, and associations.
Social proof helps establish trust quickly by leveraging existing trust relationships. If someone trusts those associations you cite - say, “As seen in the New York Times” - or is merely inclined to trust the crowd over their own judgement, then the path of least resistance is to trust you, too.
Establishing trust quickly is critical online, because it’s easy for the user to click back, so social proof can be a very powerful technique.
It’s also easy to get wrong, as it can look contrived. People are most likely to be persuaded by social proof if the person or entity providing the proof is a known authority. Who does your audience already know and trust? Some sites make the mistake of using testimonials from non-entities.
4. Liking
People like to do business with those they like.
Attractive people, rightly or wrongly, can be persuasive, as others tend to assign them positive traits. At a base level, the use of physically attractive male and female models is a staple of the advertising industry. At a higher level, people respond to people who are like them. The attraction is based on similarity.
In terms of web marketing, your level of “likeability” will very much depend on the audience. A site selling fashion is likely to be aspirational, i.e. less like the actual audience, but exhibiting attractive traits to which the audience aspires. A site selling technical solutions will likely focus on familiarity and affinity. A site about weighty subjects will likely convey intellectualism. It's all a way of mirroring the audience, either literally who they are or how they perceive themselves to be, in order to be liked.
Another aspect of liking is association. Look at ways you can associate yourself with entities or people your visitors already like. Common tactics include aligning your site with a charity, celebrity or industry event.
5. Authority
People often respond to authority figures.
“Correct conduct” is a response to authority figures. For example, the “white hat/black hat” positioning in SEO is defined by an authority figure, in this case, the search engine and their representatives.
Authority on websites can be conveyed using symbols, qualifications and associations. However, these days, people tend to be more cynical of authority than in times past. They will likely question authority by wanting to see evidence of claims made, and try to establish whether the person telling them the information is trustworthy.
Does your site offer evidence and proof of your claims?
6. Scarcity
We tend to undervalue what is plentiful, and overvalue what is scarce.
An overt use of this tactic is to create artificial scarcity, particularly in the frauduct world. For example, I’m sure you’ve seen aggressive marketers claiming there are only so many places/products left, in an attempt to make you perceive scarcity, so you’re more inclined to act impulsively.
Cialdini notes:
“According to psychological reactance theory, people respond to the loss of freedom by wanting to have it more. This includes the freedom to have certain goods and services. As a motivator, psychological reactance is present throughout the great majority of a person's life span. However, it is especially evident at a pair of ages: "the terrible twos" and the teenage years. Both of these periods are characterized by an emerging sense of individuality, which brings to prominence such issues as control, individual rights, and freedoms. People at these ages are especially sensitive to restrictions”.
People are most attracted to scarce items when they are newly scarce, i.e. they haven’t always been scarce, and when other people are competing for the same resources. In terms of a website, these two concepts could be combined: time is both running out, and demand has been overwhelming. This is also a form of social proof, of course.
There are elements of manipulation and story-telling in marketing, and no doubt you can see these concepts in some of the worst examples of web marketing. But they also exist in some of the best. And no doubt we all use some of these techniques, possibly unknowingly, in our everyday lives.
These ideas can be very powerful when combined on a website. Try evaluating your competitors against each of the six categories. Have they used them well? Overused them? Then audit your own site, experiment and track changes.
A little effort spent on persuasion can go a long way to maximizing the value of the traffic you have already won.
In light of that, we thought it would be a good time to get out in front of the tired "Death of SEO" meme that is sure to appear once again in the coming weeks. ;)
The font size is somewhat small in the image below, but if you click through to the archived page you can see it in its full glorious size.
Want to syndicate this infographic? Embed code is here. We also created a PDF version.
Ranking well for our chosen keywords involves putting in a lot of effort up front, with no guarantee of ranking, or reward.
Even if we do attain rankings, and even if we do get rewarded, there is no guarantee this situation will last. And this state of flux, for many SEOs, is only likely to get worse, as Google advises that updates will be “jarring and jolting for a while”.
Even more reason to make every visitor count.
If we can extract higher value from each visitor, by converting them from visitor to customers, and from short term customers to long term customers, then our businesses are less vulnerable to Google’s whims. We don’t need to be as focused on acquiring new visitors.
There is great value to be had in optimizing the entire marketing chain.
Hunting For Customers Vs Keeping Customers
It comes down to cost.
According to a Harvard study a few years back, it can cost five times as much to acquire a new customer as it does to keep a current customer happy. Of course, your mileage may vary; whether it really costs five times as much, or three, or seven, depends on your cost structure.
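To see what that ratio implies, here’s an illustrative calculation. The $50 retention cost and $10,000 budget are made-up figures for the example, not data from the study:

```python
# Illustrative only: the oft-cited "five times" acquisition/retention
# ratio applied to a hypothetical marketing budget.

RETENTION_COST = 50.0         # assumed cost to keep one existing customer happy
ACQUISITION_MULTIPLIER = 5.0  # the ratio cited above; yours may differ

acquisition_cost = RETENTION_COST * ACQUISITION_MULTIPLIER  # 250.0 per new customer

budget = 10_000.0
customers_if_acquiring = budget / acquisition_cost  # 40 new customers
customers_if_retaining = budget / RETENTION_COST    # 200 retained customers
```

Under these assumed numbers, the same budget retains five times as many customers as it acquires, which is why rising acquisition costs push attention toward retention.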
However, this concept is an important one for search marketers, as it’s reasonable to assume that the cost of acquiring customers, via keyword targeting, is rising as Google makes the marketing process of keyword targeting more expensive than it has been in the past. This trend is set to continue.
If the cost of customer acquisition is rising, it can make sense to look at optimizing the offer, the conversion rates and optimizing the value of existing customers.
Underlying Fundamentals
If you have something a lot of people desperately need, and there isn’t much competition, it typically doesn’t cost much to land those customers. They come to you. If you have something genuinely scarce, or even artificially scarce, people will line up.
The problem is that most businesses don’t enjoy such demand. They must compete with other businesses offering similar products and services. So, if there is a scarcity issue, it’s a scarcity of customers, not service and product providers.
However, by focusing on a specific niche, businesses can eliminate a lot of competition, and thereby reduce the marketing cost. For example, a furniture manufacturer could conceivably make furniture for a wide variety of customers, from commercial offices, to industry, to the home.
But if they narrowed their focus to, say, private jet fit-outs, they eliminate a lot of their competition. They’d also have to determine if that niche is lucrative, of course, but as you can see, it’s a way of eliminating a lot of competition simply by adding focus and specialization.
By specializing, they are more likely to enjoy higher quality leads - i.e. leads that may result in a sale - than if they targeted broadly, as it is difficult to be all things to all people. The cost of marketing to a broad target market can be higher, as can the level of competition in the search results pages, and the quality of leads can be lower.
Conversion Optimization
Once we’re focused on our niche, and we’ve got targeted visitors coming in, how can we ensure fewer visitors are wasted?
Those who do a lot of PPC will be familiar with conversion optimization, and we’ll dive deep into this fascinating area over the coming weeks, but it’s a good concept for those new to SEO, and internet marketing in general, to keep at front of mind.
You’ve gone to a lot of trouble to get people to your site, so make sure they don’t click back once they arrive!
Here’s a great case study by a company called Conversion Rate Experts. It outlines how to structure pages to improve conversion rates. Whilst the findings are the result of testing and adaptation, and are specific to each business, there are a few key lessons here:
Length of the page. In this case, a long page improved conversion rates by 30%. Of course, it’s not simply a numbers game; rather, the longer page allowed more room to address objections and answer visitor questions.
As Conversion Rate Experts point out:
“The media would have us believe that people no longer have any capacity to concentrate. In reality, you cannot have a page that’s too long—only one that’s too boring. In the case of Crazy Egg’s home page, visitors wanted their many questions answered and that’s what we delivered. (If you’d like more people to scroll down your long pages, see the guide we wrote on the topic.)”
It’s best to experiment to see what works best in your own situation but, generally speaking, it pays to offer the visitor as much timely information as possible, as opposed to short copy, if the motivation is analytical and need-oriented. Short copy can work better if the customer is impulsive.
As we see in the Crazy Egg case study, by anticipating and addressing specific objections, and moving the customer closer to the point of sale, the webpage is doing the job of the salesperson. This is an area where SEO and PPC, linked with conversion rate optimization, can add a ton of value.
The second interesting point was they optimized the long-term value of the customer to the company by making a time-sensitive offer.
The one-time offer test illustrates another important principle of conversion optimization: Don’t let the fear of a short-term loss stand in the way of a long-term gain
The offer they made turned a short-term customer into a long-term customer. If we have a lot of long term customers on our books, it can take some of the pressure off the need to constantly acquire new customers.
Optimize Everything
We engage in SEO because there are many similar sites.
The benefit of SEO is we can occupy premium real estate. If we appear high on the search result pages, we are more likely than our competitors to command the customer's attention. But we stand to gain a lot more stability if we are not wholly reliant on occupying the top spots, and therefore less vulnerable to Google’s whims.
The following is a guest column written by Rory Joyce from CoverHound.
Last week Google Advisor made its long-awaited debut in the car insurance vertical -- in the UK. Given Google’s 2011 acquisition of BeatThatQuote.com, a UK comparison site, for 37.7 million pounds ($61.5 million US), it comes as little surprise that the company chose to enter the UK ahead of other markets. While some might suspect Google’s foray into the UK market is merely a trial balloon, and that an entrance into the US market is inevitable, I certainly wouldn’t hold my breath.
Here are three reasons Google will not be offering an insurance comparison product anytime soon in the US market:
1) High Opportunity Cost
Finance and insurance is the number one revenue-generating advertising vertical for Google, totaling $4 billion in 2011. While some of that $4 billion is made up of products like health insurance, life insurance and credit cards, the largest segment within the vertical is undoubtedly car insurance. The top 3 advertisers in the vertical as a whole are US carriers -- State Farm, Progressive and Geico -- spending a combined sum of $110 million in 2011.
The keyword landscape for the car insurance vertical is relatively dense. A vast majority of searches occur across 10-20 generic terms (i.e. “car insurance,” “auto insurance,” “cheap auto insurance,” “auto insurance quotes,” etc). This is an important point because it helps explain the relatively high market CPC of car insurance keywords versus other verticals. All of the major advertisers are in the auction for a large majority of searches, resulting in higher prices. The top spot for head term searches can reach CPCs well over $40. The overall average revenue/click for Google is probably somewhere around $30. Having run similar experiments with carrier click listing ads using SEM traffic, I can confidently assume that the click velocity (clicks per clicker) is around 1.5. So the average revenue per searcher who clicks is probably somewhere around $45 for Google.
Now, let’s speculate on Google’s potential revenues from advertisers in a comparison environment. Carriers’ marketing allowable is approximately $250 per new policy. When structuring pay-for-performance pricing deep in the funnel (or on a sold-policy basis), carriers are unlikely to stray from those fundamentals. In a fluid marketplace higher in the funnel (i.e. Adwords PPC), they very often are managing to a marginal cost per policy that far exceeds even $500 (see $40 CPCs). While it may seem like irrational behavior, there are two reasons they are able to get away with this:
a) They are managing to an overall average cost per policy, meaning all direct response marketing channels benefit from “free,” or unattributable sales. With mega-brands like Geico, this can be a huge factor.
b) There are pressures to meet sales goals at all costs. Google presents the highest intent of any marketing channel available to insurance marketers. If marketers need to move the needle in a hurry, this is where they spend.
Regardless of how Google actually structures the pricing, the conversion point will be much more efficient for the consumer since they will be armed with rates and thus there will be less conversion velocity for Google. The net-net here is a much more efficient marketplace, and one where Google can expect average revenue to be about $250 per sold policy.
How does this match up against the $45 unit revenue they would significantly cannibalize? The most optimized and competitive carriers can convert as high as 10% of clicks into sales. Since Google would be presenting multiple policies we can expect that in a fully optimized state, they may see 50% higher conversion and thus 15% of clicks into sales. Here is a summary of the math:
With the Advisor product, in an optimized state, Google will make about $37.50 ($250 x .15) per clicker. Each cannibalized lead will thus cost Google $7.50 of unit revenue ($45 - $37.50). Given the dearth of compelling comparison options in insurance (that can afford AdWords), consumers would definitely be intrigued and so one can assume the penetration/cannibalization would be significant.
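The arithmetic above can be sketched quickly. Every figure here is the article’s estimate, not measured data:

```python
# Back-of-envelope unit economics for Google's two models, using the
# article's estimated figures.

ADWORDS_REVENUE_PER_CLICK = 30.0  # estimated average revenue per click
CLICK_VELOCITY = 1.5              # estimated clicks per clicking searcher

REVENUE_PER_POLICY = 250.0        # carriers' marketing allowable per sold policy
ADVISOR_CONVERSION = 0.15         # assumed clicks-to-policy rate for Advisor

# Revenue per searcher who clicks, under each model.
adwords_per_clicker = ADWORDS_REVENUE_PER_CLICK * CLICK_VELOCITY   # 45.0
advisor_per_clicker = REVENUE_PER_POLICY * ADVISOR_CONVERSION      # 37.5

# Each cannibalized clicker costs Google the difference.
cannibalization_cost = adwords_per_clicker - advisor_per_clicker   # 7.5
```

Even at an optimistic 15% clicks-to-policy rate, the Advisor model earns $7.50 less per clicker than plain AdWords, which is the opportunity cost the article describes.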
Of course there are other impacts to consider: How would this affect competition and average revenue for non-cannibalized clicks? Will responders to Advisor be incremental and therefore have zero opportunity cost?
2) Advisor Has Poor Traction in Other Verticals
Over the past couple of years, Google has rolled out its Advisor product in several verticals including: personal banking, mortgage, and flight search.
I personally don’t have a good grasp of the mortgage vertical, so I had a chat with a high-ranking executive at a leading mortgage site, an active AdWords advertiser. In talking to him it became clear that there were actually quite a few similarities between mortgage and insurance as they relate to Google, including:
Both industries are highly regulated in the US, at the state level.
Both verticals are competitive and lucrative. CPCs in mortgage can exceed $40.
Like insurance, Google tested Advisor in the UK market first.
Hoping he could serve as my crystal ball for insurance, I asked, “So why did Advisor for Mortgage fail?” His response was, “The chief issue was that the opportunity cost was unsustainably high. Google needed to be as or more efficient than direct marketers who had been doing this for years. They underestimated this learning curve and ultimately couldn’t sustain the lost revenue as a result of click cannibalization.”
Google better be sure it has a good understanding of the US insurance market before entering, or else history will repeat itself, which brings me to my next point...
3) They Don’t Yet Have Expertise
Let’s quickly review some key differences between the UK and US insurance markets:
Approximately 80% of car insurance is purchased through comparison sites in the UK vs under 5% in the US.
There is one very business-friendly pricing regulatory body in the UK versus state-level, sometimes aggressive, regulation in the US.
The UK is an efficient market for consumers; the US is not. This means margins are tighter for UK advertisers, as evidenced by the fact that CPCs in the UK are about a third of what they are in the US.
As you can see, these markets are completely different animals. Despite the seemingly low barriers for entry in the UK, Google still felt compelled to acquire BeatThatQuote to better understand the market. Yet, it still took them a year and a half post acquisition before they launched Advisor.
I spoke with an executive at a top-tier UK insurance comparison site earlier this week about Google’s entry. He mentioned that Google wanted to acquire a UK entity primarily for its general knowledge of the market, technology, and infrastructure (API integrations). He said, “Given [Google’s] objectives, it didn’t make sense for them to acquire a top tier site (i.e. gocompare, comparethemarket, moneysupermarket, confused) so they acquired BeatThatQuote, which was unknown to most consumers but had the infrastructure in place for Google to test the market effectively.”
It’s very unlikely BeatThatQuote will be of much use for the US market. Google will need to build its product from the ground up. Beyond accruing the knowledge of a very complex, and nuanced market, they will need to acquire or build out the infrastructure. In the US there are no public rate APIs for insurance carriers; very few insurance comparison sites actually publish instant, accurate, real-time rates. Google will need to understand and navigate its way to the rates (though it’s not impossible). It will take some time to get carriers comfortable and then of course build out the technology. Insurance carriers, like most financial service companies, can be painfully slow.
Conclusion
I do believe Google will do something with insurance at some point in the US. Of the various challenges the company currently faces, I believe the high opportunity cost is the toughest to overcome. However, the market will shift. As true insurance comparison options continue to mature, consumers will be searching exclusively for comparison sites (see travel), and carriers will no longer be able to effectively compete at the scale they are now -- driving down the market for CPCs and thus lowering the opportunity cost.
However, this opportunity cost is much lower for other search engines, where average car insurance CPCs are lower. If I were Microsoft or Yahoo, I would be seriously considering using my valuable real estate to promote something worthwhile in insurance. There is currently a big void for consumers as it relates to shopping for insurance. A rival search engine could differentiate itself from Google overnight in one of the biggest verticals. This may be one of their best opportunities to regain some market share.
A pessimist sees the difficulty in every opportunity; an optimist sees the opportunity in every difficulty.
If you’re one of the many webmasters feeling hammered by Penguin and Panda, you’d be forgiven for thinking the best opportunities in search are long gone. Perhaps it’s all gotten just a bit too complicated.
In reality, internet marketing is still just a toddler taking baby steps. Opportunity abounds. It’s a case of looking for it, finding it and capitalizing on it.
So how?
In this post, we’ll look at the nature of opportunities, and how to spot them.
People Always Have Problems
If you look at these stories, you’ll notice a common thread.
Honest Tea solved a problem. People wanted bottled organic tea. VMail solved a problem. Small businesses needed a cheap VoIP solution. Ben and Jerry solved a problem. There was no ice cream shop in their town.
Opportunity is the chance to solve a problem. To fill a need. If no one else is solving that problem, it is a difficult one for the customer, and the customer is desperate to solve it, then the marketing opportunity is all the stronger.
Are there any fewer needs than there were last year? Than before the financial crash? There are just as many needs, as there are just as many people, and as life grows ever more complex, their needs become greater. Their needs may change. They might want fewer luxury products and better deals on everyday products. Next year, it might be the other way around.
Where There Are Problems, There Are Marketing Opportunities
We find marketing opportunities - good ones - when there is a high probability we’ll satisfy a market need, and do so profitably.
Is it about finding a keyword with high search volumes?
Is it about doing what the successful people are doing?
Is it about doing what everyone else is doing, but just being better at SEO than they are?
Possibly, but this type of thinking is more to do with tactics than opportunity. A marketing opportunity is better evaluated from a higher level. Take the 5,000 ft view.
Ask:
Can I supply something in short supply?
Can I improve on an existing product or service in a way that is considerably superior?
Can I supply a genuinely new product or service?
For example, there is a market opportunity for search engine news. Is it a good opportunity for a new entrant? Probably not, as this market is saturated by established players. The audience already have their needs met.
However, there might be better opportunities for news on, say, 3d printers, or some other emerging technology where there is a need, but it isn’t well served. Of course, these opportunities can narrow over time as more and more people see the opportunity, and move into the space.
Perhaps the most lucrative opportunities score highly in each area. They are in short supply, they are relatively new, and you can improve on something people already do.
A good example would be the iPhone. When it came out, there was only one iPhone, it was a new(ish) idea for the target market, and it integrated functions people already performed, but did so in a superior way. It’s little wonder Apple could charge such high margins, and it took competitors a long time to catch up. Anyone releasing a smartphone today would have to improve on those areas - price point is probably the most obvious opportunity - in order to compete.
How To Nail Down The Opportunities
There are various methods marketers use to test their assumptions. Let's look at three.
One method is the problem detection method. Try asking people if they have any problems with their existing service. For example, a prospective SEO customer might say “I’d like to spend more on SEO, but I don’t know if my spend will be worthwhile”. The opportunity is to figure out a way to show it will be worthwhile if the customer spends more, which might involve offering a money-back guarantee, or a pay on performance arrangement, or some other way to improve that problem the customer has with existing services.
Another way is the ideal method. Ask the customer what would be their ideal product or service. Often, customers will describe fanciful things, but listening to their whims can help you think of products and services you may not have thought of yourself, as it’s easy to fall into the trap of thinking within the constraints of your industry. At one time, a customer may have wanted no time delay between ordering a meal and receiving the meal. This probably sounded like an impossible ideal to a restaurateur at the turn of the century, used to waiters and a kitchen staff, but an opportunity to Mr McDonald, who thought more in terms of an industrial process. And Ray Kroc scaled it from there.
Another method is the consumption chain method. You track the consumption of the product or service from start to finish, and see if there are any steps in the chain that can be improved upon. Questions to ask are how people become aware of the product or service, how customers make their purchase decisions, whether they need to consult someone else to make the purchase decision, how they get the product or service, where they store it, how often they use it, and so on. At each step in the chain, there is a chance to make a change, to optimize, and to make better.
We’ve only touched on ways to seize opportunities. How have you discovered opportunities in the past? What is your process for spotting new opportunities? Please add them to the comments!
Since Ayima launched in 2007, we've been crawling the web and building our own independent backlink data. Starting off with just a few servers running in our Director of Technology's bedroom cupboard, we now have over 130 high-spec servers hosted across 2 in-house server rooms and 1 datacenter, using a similar storage platform as Yahoo's former index.
Crawling the entire web still isn't easy (or cheap) though, which is why very few data providers exist even today. Each provider makes compromises (even Google does in some ways), in order to keep their data as accurate and useful as possible for their users. The compromises differ between providers though, some go for sheer index size whilst others aim for freshness and accuracy. Which is best for you?
This article explores the differences between SEOmoz's Mozscape, MajesticSEO's Fresh Index, Ahrefs' link data and our own humble index. This analysis has been attempted before at Stone Temple and SEOGadget, but our Tech Team has used Ayima's crawling technology to validate the data even further.
First of all, we need a website to analyze - something that we can't accidentally "out". Search Engine Land is the first that came to mind, as it's very unlikely to have many spam links or much paid link activity.
So let's start off with the easy bit - who has the biggest result set for SEL?
The chart above shows MajesticSEO as the clear winner, followed by a very respectable result for Ahrefs. Does size matter though? Certainly not at this stage, as we only really care about links which actually exist. The SEOGadget post tried to clean the results using a basic desktop crawler, to see which results returned a "200" (OK) HTTP Status Code. Here's what we get back after checking for live linking pages:
Ouch! So MajesticSEO's "Fresh" index has the distinct smell of decay, whilst Mozscape and Ayima V2 show the freshest data (by percentage). Ahrefs has a sizeable decay like MajesticSEO, but still shows the most links overall in terms of live linking pages. Now the problem with stopping at this level, is that it's much more likely that a link disappears from a page, than the page itself disappearing. Think about short-term event sponsors, 404 pages that return a 200, blog posts falling off the homepage, spam comments being moderated etc. So our "Tenacious Tim" got his crawler out, to check which links actually exist on the live pages:
Less decay this time, but at least we're now dealing with accurate data. We can also see that Ayima V2 has a live link accuracy of 82.37%, Mozscape comes in at 79.61%, Ahrefs at 72.88% and MajesticSEO is just 53.73% accurate. From Ayima's post-crawl analysis, our techies concluded that MajesticSEO's crawler was counting URLs (references) and not actual HTML links in a page. So simply mentioning http://www.example.com/ somewhere on a web page was counting as an actual link. Their results also included URL references in JavaScript files, which won't offer any SEO value. That doesn't mean that MajesticSEO is completely useless though; I'd personally use it more for "mention" detection outside of the social sphere. You can then find potential link targets who mention you somewhere, but do not properly link to your site.
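The accuracy figures above are simply verified live links divided by links the index reported. A small sketch, using hypothetical counts since the raw provider numbers aren't published here:

```python
# Live link accuracy: what share of an index's reported links still
# exist as real HTML links on live pages.

def live_link_accuracy(reported: int, verified_live: int) -> float:
    """Percentage of reported links verified as live, rounded to 2 dp."""
    return round(100.0 * verified_live / reported, 2)

# Hypothetical provider counts, for illustration only.
providers = {
    "Index A": (1_000_000, 820_000),
    "Index B": (2_500_000, 1_350_000),
}

for name, (reported, live) in providers.items():
    print(name, live_link_accuracy(reported, live))
```

A crawler that counts bare URL mentions (or URLs in JavaScript files) inflates `reported` without adding real links, which is exactly how an index's accuracy percentage ends up low.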
Ahrefs wins the live links contest, finding 84,496 more live links than MajesticSEO and 513,733 more live links than SEOmoz's Mozscape! I still wouldn't use Ahrefs for comparing competitors or estimating the link authority needed to compete in a sector though. Not all links are created equal, with Ahrefs showing both the rank-improving links and the crappy spam. I would definitely use Ahrefs as my main data source for "Link Cleanup" tasks, giving me a good balance of accuracy and crawl depth. Mozscape and Ayima V2 filter out the bad pages and unnecessarily deep sites by design, in order to improve their data accuracy and showing the links that count. But when you need to know where the bad PageRank zero/null links are, Ahrefs wins the game.
So we've covered the best data for "mentions", the best data for "link cleanup", now how about the best for competitor comparison and market analysis? The chart below shows an even more granular filter, removing dead links, filtering by unique Class C IP blocks and removing anything below a PageRank 1. By using Google's PageRank data, we can filter the links from pages that hold no value or that have been penalized in the past. Whilst some link data providers do offer their own alternative to PageRank scores (most likely based on the original Google patent), these cannot tell whether Google has hit a site for selling links or for other naughty tactics.
Whilst Ahrefs and MajesticSEO hit the top spots, the amount of processing power needed to clean their data to the point of being useful makes them untenable for most people. I would therefore personally only use Ayima V2 or Mozscape for comparing websites and analyzing market potential. Ayima V2 isn't available to the public quite yet, so let's give this win to Mozscape.
So, in summary:
Ahrefs - Use for link cleanup
MajesticSEO - Use for mentions monitoring
Mozscape - Use for accurate competitor/market analysis
Juicy Data Giveaway
One of the best parts of having your own index, is being able to create cool custom reports. For example, here's how the big SEO websites compare against each other:
"Index Rank" is a ranking based on who has the most value-passing Unique Class C IP links across our entire index. The league table is quite similar to HitWise's list of the top traffic websites, but we're looking at the top link authorities.
Want to do something cool with the data? Here's an Excel spreadsheet with the Top 10,000 websites in our index, sorted by authority: Top 10,000 Authority Websites.
Rob Kerry is the co-founder of Ayima, a global SEO Consultancy started in 2007 by the former in-house team of an online gaming company. Ayima now employs over 100 people on 3 continents and Rob has recently founded the new Ayima Labs division as Director of R&D.