Have you been selling a product or service for some time, but think you might need to do something new to keep up with the market? Offer something fresh?
One of the problems with making significant changes to your products or services is that it tends to carry a high level of risk. There is a risk you could alienate your existing prospects. And there is always risk in starting over with something new, as the untried and untested is more likely to fail.
But what if you could change your product or service without really changing it at all? Here are a few ideas on how to make changes by changing the pitch, without going to the effort, or taking the risk, of making fundamental changes.
Positioning
One of the great things about direct marketing, of which search marketing is a part, is that we're not likely to be starting with products and services that have had awareness and associations built up over many years - like Coca-Cola, for example. We get to modify the position, if we so choose.
Position, in marketing, means perception - perception in the minds of prospective customers. We can appeal to perceptions, or shape our product to fit perceptions, depending on what our prospects want.
For example, we could take the same car and market it to two different groups using positioning. To one group, we emphasize safety features above all else. To another group, we emphasize performance. The product doesn’t change, but the positioning does, and thus appeals to different groups of buyers. In reality, a car manufacturer probably wouldn’t do this, at least not in the same market, as it could send confusing messages.
However, on the web, we can often chop and change products, and target different groups, and one doesn’t necessarily need to overlap another.
Vertical Positioning
A vertical is a group of similar businesses and customers that engage in trade based on specific, specialized needs. They may be a subset of a larger market. For example, PPC is a vertical within internet marketing, itself a subset of general marketing.
In terms of positioning within vertical markets, imagine you're a software developer in the search marketing space. If you were talking to a group of manufacturers, and wanted to describe what you do in a way that is understandable to that audience, you might talk in broad terms about marketing.
If you were talking to a group of marketers, you might talk more specifically about search marketing. If you were talking to a room full of search marketers, you might talk more specifically again about the PPC optimization software you're working on. If you were talking to a room of PPC optimization software developers... and so on.
They are all part of the same market - and they might all need what you have - but each audience exists in a different vertical, so you change the message to suit. Changing vertical positioning means targeting a different vertical within the same market. An example might be a landlord who rents out a house to a single tenant, then changes to renting it out to students on a room-by-room basis with "shared facilities". She's still in the accommodation market, and the product is the same, but it is pitched to a different niche.
Can you identify different verticals in your market to which your product or service might also appeal? Can you configure your product, without making fundamental changes, so that it appeals to the needs of a different niche within your market?
Positioning In Time
Positioning in time, sometimes described as horizontal positioning in direct marketing circles, refers to the point in time when a person buys something: the message is positioned to appeal to different buyers depending on where they are in the buy-cycle.
For example, if someone is genuinely new to your product, and doesn’t even know they want it, then you could pitch your advertising based on the benefits your product provides. If I wanted to sell, say, a revolutionary new power cell, I wouldn’t talk about specifications to someone unfamiliar with the product, I’d talk about the fact that it replaces the need to be on an electricity grid, so the buyer doesn’t need to pay line charges. I’d emphasize benefits.
If someone is already aware of these new power cells, and knows all the benefits, I would likely emphasize other aspects, such as features and price, more than benefits, as the buyer already understands them.
This type of positioning will be familiar to people who do a lot of PPC. The link text, message and landing page changes to accommodate buyers at different stages in the sales cycle. The product doesn’t change, but the message does.
Isolate
Another way to reposition a product or service is to use an isolation technique. Take a single aspect of the product and make it a major part of the offer. For example, TIME magazine sells subscriptions to a magazine, but their advertising often focuses on the “free” gifts that accompany a subscription. This technique is often used when the main product itself is well known to the audience, and there’s not much new that can be said about it.
Many software companies who formerly sold their software now give it away as part of a freeware model, but sell support and maintenance services around it. They isolate an aspect that was always there - service - and push the actual product into the background. This tends to happen when the product becomes a commodity and there are few ways to differentiate it without making significant changes.
Combine
Think about bundling products or services together to appeal to a different vertical.
For example, there might be a small market for individual electronic components, but a large market for a “phone tapping device”. Something Woz and Steve Jobs built a company on.
Music distribution companies, like Spotify, take individual tunes, bundle them together as a huge library, and sell subscriptions to it, as opposed to selling on a song-by-song basis, as iTunes does.
Individual garden plants and potting accessories might not be very interesting, but bundled together as a “kitchen greenhouse” they might appeal to an audience of foodies who don’t necessarily see themselves as gardeners.
Ford said “give the customer any color they want, so long as it is black”. This strategy worked for a while, because people just wanted a car. However, the market changed when GM decided they would offer a range of cars to suit different “purposes, purses and personalities”.
Between 1920 and 1923, Ford’s market share plummeted from 55 to 12 percent.
These days, auto manufacturers segment the market, rather than treat it as one homogeneous mass. There are cars for the rich, cars for the less well off, cars built for speed, and cars built for shopping.
Manufacturers do this because few can cater to very large markets where the consumer has infinite choice. To be all things to all people is impossible, but to be the best for a smaller, well-defined group of people is a viable business strategy. It costs less to target, and therefore carries less risk of failure. Search marketing is all about targeting, so let's look at various ways to think about targeting in terms of the underlying marketing theory, which might give you a few ideas on how to refine and optimize your approach.
While there are many ways to break down a market, here are three main concepts.
Segments
Any market can be broken down into segments. A segment is simply a group of people. We can group people by various means; the most common forms of segmentation include:
Benefit segmentation: a group of people who seek similar benefits. For example, people who want bright white teeth would seek a toothpaste that includes whitener. People who are more concerned with tooth decay may choose a toothpaste that promises healthy teeth.
Demographic Segmentation: a group of people who share a similar age, gender, income, occupation, education, religion, race and nationality. For example, retired people may be more interested in investment services than a student would, as retired people are more likely to have capital to invest.
Occasion Segmentation: a group of people who buy things at a particular time. Valentine's Day is one of the most popular days for restaurant bookings. People may buy orange juice when they think about breakfast, but not necessarily at dinner. The reverse is true for wine.
Usage Segmentation: a group of people who buy certain volumes, or at specific frequencies. For example, a group of people might dine out regularly, vs those who only do so occasionally. The message to each group would be different.
Lifestyle segmentation: a group of people who may share the same hobbies, or live a certain way. For example, a group of people who collect art, or a group of people who are socialites.
The aim is to find a well-defined market opportunity that is still large enough to be financially viable. If one segment is not big enough, a business may combine segments - say, young people (demographic) who want whiter teeth (benefit). The marketing for this combined segment would be different - and significantly more focused - than the more general "those who want whiter teeth" (benefit) segment alone.
How does this apply to search and internet marketing in general?
It’s all about knowing your customer. “Knowing the customer” is an easy thing to say, and something of a cliche, but these marketing concepts can help provide us with a structured framework within which to test our assumptions.
Perhaps that landing page I've been working on isn't really working out. Could it be because I haven't segmented enough? Have I gone too broad in my appeal? Am I talking the language of benefits when I should really be focusing on usage factors? What happens if I combine "demographics" with "occasion"?
Niches
Niches are similar to segments, but even more tightly defined based on unique needs. For example, “search engine marketing education” is a niche that doesn’t really fit usefully within segments such as demographics, lifestyle or occasion.
The advantage of niche targeting is that you may have few competitors and you may be able to charge high margins, as there is a consumer need, but very few people offer what you do. The downside is that the niche could weaken, move, or disappear. To mitigate this risk, businesses will often target a number of niches - the equivalent of running multiple web sites - reasoning that if one niche moves or disappears, then the other niches will take up the slack.
Search marketing has opened up many niches that didn’t previously exist due to improved marketing efficiency. It doesn’t cost much to talk to people anywhere in the world. Previously, niches that required a global audience in order to be viable were prohibitive due to the cost of reaching people spread over such a wide geographic area.
To function well in a niche, smaller companies typically need to be highly customer focused and service oriented as small niche businesses typically can’t drive price down by ramping volume.
Cells
Cells are micro-opportunities. This type of marketing is often overlooked, but will become a lot more commonplace on the web due to the easy access to data.
For example, if you collect data about your customers buying habits, you might be able to identify patterns within that data that create further marketing opportunities.
If you discover that twenty people bought both an iPhone and a PC, then they may be in the market for software products that make it easy for the two devices to talk to each other. Instead of targeting the broader iPhone-purchaser market, you might tailor the message specifically for the iPhone-plus-PC people, reasoning that they may be having trouble getting the two devices to perform certain functions, and would welcome a simple solution.
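To make that concrete, here is a minimal sketch of this kind of co-purchase mining. The order data, product names and output format are all hypothetical; a real implementation would read from your order database.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical order data: (customer_id, product) pairs.
orders = [
    ("c1", "iphone"), ("c1", "pc"),
    ("c2", "iphone"),
    ("c3", "iphone"), ("c3", "pc"),
    ("c4", "pc"), ("c4", "printer"),
]

# Group each customer's purchases into a basket.
baskets = defaultdict(set)
for customer, product in orders:
    baskets[customer].add(product)

# Count how often each pair of products appears in the same basket.
pair_counts = defaultdict(int)
for products in baskets.values():
    for pair in combinations(sorted(products), 2):
        pair_counts[pair] += 1

# Pairs bought together often enough may indicate a marketing "cell".
for pair, count in sorted(pair_counts.items(), key=lambda kv: -kv[1]):
    print(pair, count)
```

The same pattern scales up: any pair (or triple) of products that co-occurs unusually often is a candidate cell worth a tailored message.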
For those selling search marketing to customers, especially those customers new to the concept of search marketing, it’s often useful to pitch search marketing services in terms the customer already understands.
A lot of search marketing theory and practice is borrowed and adapted from direct marketing. Direct marketing concepts have been around since the 60s, and may be more readily understood by some customers than the arcane terminology sometimes associated with SEO/SEM.
Here are some ideas on how to link search marketing and direct marketing concepts.
1. Targeting & Segmentation
A central theme of direct marketing is targeting.
On broadcast television, advertisers show one advertisement to many people, and hope it will be relevant to a small fraction of that audience. Most television advertising messages are wasted on people who aren't interested in them. It's a scattergun, largely untargeted approach.
Search marketing, a form of direct marketing, is targeted. Search marketers target their audience based on the specific keywords the audience use.
Search marketing is concerned with the most likely prospects - a small fraction of the total audience. Further, if we analyse the visitor behavior of people using specific keyword terms post-click, we can find out who are the hottest prospects amongst that narrowly defined group.
The widely accepted 20-80 rule says that 20% of your customers create 80% of your business. An example might be "luxury vacations France", as opposed to "vacations France". If we have higher margins on luxury travel, then segmenting to focus on the frequent luxury-travel buyer - as opposed to the less frequent economy buyer, whom we might still sell to, but at lower margins - might be more in line with business objectives. Defining, and refining, keyword terms can help us segment the target market.
2. Focus
Once you get a search visitor to your site, what happens next?
They start reading. Such a specific audience requires focused, detailed information, and a *lot* of it, or they will click back.
It is a mistake to pitch to an "average" audience at this point i.e. to lose focus. If we’ve done our job correctly, and segmented our visitors using specific keyword terms, we already know they are interested in what we offer.
To use our travel example above, the visitor who typed in "luxury vacations in France" wants to hear all about luxury vacations in France. They are unlikely to want a pitch about how wonderful France is as a country, as the keyword term suggests they've already made up their mind about the destination. A simplistic, generalized message selling French tourism is therefore less likely to work.
Genuine buyers - who will spend thousands on such vacations - will want a lot of detail about luxury travel in France, as this is unlikely to be a trivial purchase they make often. That generally means offering long, detailed articles, not short ones. It means many options, not few. It means focusing on luxury travel, and not general travel.
Simple, but many marketers get this wrong. They go for the click, but don’t focus enough on the level of detail required by hot prospects i.e. someone most likely to buy.
3. Engagement
One advantage of the web is that we can spend a lot of time getting a message across once a hot prospect has landed on a site. This is not the case on radio. Radio placements only have seconds to get the message across. Likewise, television slots are commonly measured in 15 and 30 second blocks.
On the web, we can engage a visitor for long periods of time. The message becomes as long as the customer is prepared to hear it.
4. Personalized
The keyword tells you a lot about visitor intent. "Luxury travel France" is a highly targeted term that suggests a lot about the visitor, i.e. their level of spend and their tastes. If we build keyword lists and themes associated with this term, we can personalize the sales message using various landing pages that speak specifically to the needs of the visitor. Examples might include "Five Star Hotels", "Luxury Car Hire", "Best Restaurants In Paris", and so on. Each time they click a link, they reveal a bit more about themselves, and we can personalize the message further. Personalized marketing works well because the message is something the prospect is willing to hear. It's specifically about them.
We can personalize the journey through the site, configuring customized pathways so we can market one-to-one. We see this at work on Amazon.com. Amazon notes your search and order history and prompts you with suggestions based on that history. One-to-many marketing approaches, as used in newspapers, on radio and on television typically aren’t focused and lack personalization. They may work well for products with broad appeal, but work less well for defined niches.
5. Active Response
We’re not just interested in views, impressions, or reach. We want the visitor to actively respond. We want them to take a desired, measurable action. This may involve filling out a form, using a coupon, giving us an email address, and/or making a purchase.
Active response helps make search marketing spends directly accountable and measurable.
6. Accountable
People either visit via a search term, or they don’t.
Whilst there can be some advantage in brand awareness - i.e. a PPC ad that appears high on the page but is only clicked a fraction of the time - the real value is in the click-thru. This is, of course, measurable, as the activity shows up in the site statistics and can be traced back to the originating search engine.
Compare this with radio, television or print. It’s difficult to know where the customer came from, as their interaction may be difficult to link back to the advertising campaign.
Search marketing is also immediately measurable.
7. Testable
Some keyword terms work, some do not. Some keyword terms only work when combined with landing page X, but not landing page Y. By “work” we tend to mean “achieves a measurable business outcome”.
Different combinations can be tried and compared against one another. Keywords can be tested using PPC. Once we've determined the most effective keywords in terms of achieving measurable business outcomes, we can flow these through to our SEO campaign. We can do the reverse, too: use terms that work in our SEO campaigns to underpin our PPC campaigns.
This process is measurable, repeatable and ongoing. Language has near-infinite variety. There are many different ways to describe things, and landing pages can be configured and written in near-infinite ways, too. We track using software tools to help determine patterns of behaviour, then keep feeding the findings back into our strategy in order to refine and optimize. We broaden keyword research in order to capture the significant percentage of search phrases that are unique.
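As a minimal illustration of that testing loop, the sketch below ranks keyword and landing page combinations by conversion rate. The figures, keywords and page names are invented; real numbers would come from your analytics package, and you'd want enough click volume for the differences to be statistically meaningful.

```python
# Hypothetical tracking data: clicks and conversions per
# (keyword, landing page) combination.
results = {
    ("luxury vacations france", "landing-a"): {"clicks": 400, "conversions": 24},
    ("luxury vacations france", "landing-b"): {"clicks": 380, "conversions": 9},
    ("vacations france", "landing-a"): {"clicks": 1200, "conversions": 18},
}

# Rank combinations by conversion rate; the winners feed back into
# both the PPC and SEO campaigns.
for combo, stats in sorted(results.items(),
                           key=lambda kv: kv[1]["conversions"] / kv[1]["clicks"],
                           reverse=True):
    rate = stats["conversions"] / stats["clicks"]
    print(f"{combo}: {rate:.1%} conversion rate")
```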
Local SEO has the undeserved reputation of being "easy" and "not a lot of work". Local keywords might be less competitive than broader keywords, but there is still a fair amount of work that goes into getting a campaign off and running properly.
On the whole, keywords targeted in local SEO campaigns are less competitive than their broader counterparts but there are also mitigating factors to consider when determining the overall difficulty of the campaign.
Consideration also needs to be given to how the following factors will affect the overall difficulty of producing a successful campaign:
relationship between keyword volume, difficulty, and conversion ROI for the client on both Google and Bing
prevalence of Google/Bing local inserts (need to factor in the wonkiness of these ranking algorithms as well)
appropriateness and value of setting up and running social media profiles for the client
link difficulty (depending on the client's niche)
availability of other online traffic generation options (buying exposure on other sites where the target market is)
the client's desire to engage in pre-campaign PPC to more accurately determine search volume for a more accurate setting of expectations
client's budget
your margins
Some of what I mentioned above doesn't really fall into the "difficulty" of ranking for keywords, but ranking is only a piece of the overall puzzle. You should have an idea of how difficult the entire process will be, because it's about more than just rankings at this stage (and has been for a while).
Building the Campaign Framework
There are a number of pre-campaign, during-campaign, and post-campaign tools you can use for these kinds of campaigns, but you don't have to go nuts. Local search volumes can be small enough to make extrapolation without PPC or historical analytics data fruitless with respect to actionable data.
When you begin to lay out your campaign process, you could follow a broader roadmap and adjust as necessary. For example, your specifics will change if you are working with an existing site rather than a new one (a new site has no historical data, no initial on-site reviews to do, etc).
While links are still and will continue to be uber-important for the foreseeable future, it is wise to consider the rise of site engagement, social signals, and online PR. This is why when we talk about "local SEO" we talk about things like strategic ad buys, social media plays, and PPC for research purposes.
Local SEO can be a lead-in to an entire marketing campaign, as we discussed here, so we'll leave ongoing PPC, email marketing, offline ad integration, etc for those kinds of discussions. Just know that once you get your foot in the door, the door can open pretty wide. The more you can do, and the better you do it, the better your retention rate will be.
Keyword Research
The biggest thing to do is set expectations. If you come running in with unqualified keyword volume reports, you are really starting from a level of distrust, even if the client doesn't know it yet. If the client isn't interested in some initial PPC, then it's in your best interest to clue them in on the potential inaccuracy of various keyword tools.
For an existing site you can pull keyword search data from whatever analytics package the client has as well as from both sets of webmaster tools (Bing and Google). You can cross reference that with current rankings to see where you might be able to score some quick wins.
For a new site, set up accounts on Google's Webmaster Tools as well as Bing's. These will come in handy down the road for more keyword data, link data, and site health reviews.
We've talked about local SEO keyword research via PPC before. On top of that, or in lieu of it if the client isn't interested, you can get some local keywords and a bunch of broader ones from keyword tools. One such tool combines search terms with local modifiers in a given radius of the area you select.
If you find local volume lacking, I suggest the following steps (a rough sketch of the expansion follows the list):
Start with the targeted town's (or towns') name and/or zip code(s) as modifiers
Move up to a bigger nearby town or county if needed
If volume is still sparse, move up to state level keyword modifiers
Couple those bits of research with what the non-locally modified results show to see if you can find overlapping relationships between core keywords (medical insurance versus health insurance, or car insurance versus auto insurance, etc)
Move into Google Trends and Insights to further qualify the broader keywords by region and state
If no clear winner emerges, err on the side of where the broader volume is
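Here's a minimal sketch of that tiered expansion. The core terms and modifiers are hypothetical placeholders; in practice you'd run each tier's candidates through your keyword tool and only escalate to the next tier when volume stays sparse.

```python
# Core keywords for a hypothetical local client.
core_terms = ["auto insurance", "car insurance"]

# Modifier tiers, from most local to least local.
modifier_tiers = [
    ["providence", "02903"],      # town name(s) and zip code(s)
    ["cranston", "kent county"],  # bigger nearby town or county
    ["rhode island", "ri"],       # state-level modifiers
]

for tier in modifier_tiers:
    candidates = [f"{modifier} {term}"
                  for term in core_terms
                  for modifier in tier]
    print(candidates)
    # Check these candidates against your keyword tool's volume data;
    # only move on to the next tier if volume is still sparse.
```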
Site Architecture and Content
Quite a few local sites are going to be brochure-style sites. Site structure can vary quite a bit depending on the size and scope of the site. Since most local sites focus on a particular product or service (rather than being Amazon.com), it is wise to keep the following in mind:
stay far, far away from duplicate and NEAR-duplicate content (if the client is an insurance agent, don't have similar pages like acmeinsurance.com/car-insurance, /auto-insurance, /vehicle-insurance)
also, avoid using the town/city names as the only modifiers where no difference exists between services or products (acmeinsurance.com/town1-auto-insurance, /town2-auto-insurance, /town3-auto-insurance)
get the client involved in the content writing; they generally have lots of marketing or product material that you can pull from and give to a writer for topical ideas and industry jargon
consider hiring from a well-respected job board like problogger.net for specific content needs (finance, home/garden, food, etc)
don't overdo internal linking with keyword-rich anchors, especially in navigation (try to keep it broad from a keyword standpoint... Car Insurance vs Providence Car Insurance, as an example)
use tools like Screaming Frog and Xenu to assess overall on-page health, structural integrity, and internal linking stats
write your page titles and meta descriptions with click-throughs in mind, mixing in broad and local keyword variations to help describe the site rather than simply to keyword stuff (see the example after this list)
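For illustration, a title and meta description for a hypothetical insurance client might look like this - the business name, location and wording are all invented:

```html
<title>Auto Insurance in Providence, RI | Acme Insurance</title>
<meta name="description" content="Compare auto insurance quotes from an
independent local Providence agency. Free quotes, no obligation.">
```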
Tools like Google's Page Speed and YSlow can provide you with detailed analysis of potential site-loading issues prior to launch. I have found that printing these out before/after is a good way to show the client, who is typically a novice, some of the stuff that is going on behind the scenes. Clients like before-and-afters (when the after is more favorable than the before, of course).
Tracking
Tracking is key, naturally, so you'll need to pick an analytics package. There are some decent Google Analytics alternatives if you aren't interested in dealing with the borg. That said, you can choose from some fairly full-featured packages:
Google Analytics (free, except that you are giving them all sorts of data :D )
Clicky
Mint
Woopra
Piwik
For ease of use and feature set, I tend to go with Clicky, Mint or Google Analytics. I haven't spent much time with Woopra, and I find Piwik not as intuitive or user-friendly as the other three I mentioned (which matters even more when the client wants or needs access).
Speaking of tracking, you should consider getting familiar with a cheap virtual phone number vendor (I would recommend phone.com), as well as Google's URL builder for tracking potential adverts and media buys down the road (and offline adverts, if you end up servicing that aspect of the client's marketing campaign). If you use Google Analytics, another cool tool is Raven's Google Analytics configuration tool.
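Google's URL builder simply appends campaign parameters to a landing page URL so that visits show up under the right campaign in Google Analytics. A tagged URL for a hypothetical local newspaper ad buy might look like this (domain and parameter values invented):

```
http://www.acmeinsurance.com/auto-insurance?utm_source=localpaper&utm_medium=banner&utm_campaign=spring-quotes
```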
I generally recommend staying away from tracking numbers, because they can screw up your Google Places rankings and trust. When I do use them, I typically just make them images on whatever page they are listed on, and I never use them for IYP citations (listings on sites like Yelp, Yellowpages, Merchant Circle, etc).
Planning Out Link Building
For local sites, you'll want to attack link building on two fronts: external links and citations.
Before you get into any of the link planning, you should get the client set up in KnowEm. KnowEm will help get the client on all the relevant social networks, and goes a long way toward establishing the base for the client to control its branded searches and branded SERPs.
You can choose from a variety of packages, from basic registration to complete profile setup (bios, pictures, descriptions, etc). Once these profiles are built, you can begin building links to them (and link to them from the client's site) to further the client's domination of their own branded SERPs.
For citations, I would recommend using Whitespark (we reviewed it here). Whitespark really is an essential tool for building citations, tracking citations, and doing competitive citation research. Speaking of citations, each year David Mihm releases the Local Search Ranking Factors; I would highly recommend saving each year's version and referring back to it when designing your citation building plan(s).
As for traditional link building, it's fairly similar to non-local link building with respect to the broader overview of link outreach but can be niched down to focus on locality for both link equity and qualified traffic.
Some of the things you can do at the beginning of the link planning process would be:
make a list of the vendors you use, find out if they have a site and would be willing to link to you
local papers tend to have really favorable online advertising rates; exposure runs a close second to links, and part of how I like to approach the link building process is to be everywhere (online) locally. Play hardball for a bit on the rates and you'll be surprised at the relatively cheap local exposure you can buy
set up google alerts for your client's brand and for local topics relevant to their product/service
talk to other local businesses about co-promotions on your site, their site, and your social networks (if available)
if you offer coupons and discounts to certain groups or demographics, get those posted on local sites as well; many local sites do not have sophisticated ad serving technology so you often get a nice, relevant, clean link back to your client's site
in addition to competitive link research you can pull the backlinks of local chambers of commerce and local travel/tourist sites to find potential link opportunities
run a broken link checker on local resource sites, specifically ones that deal with local events, news, tourism and see if there are link opportunities for your client
Infographic ideas for local clients, depending on the niche, can be found fairly easily and can bring in lots and lots of local links and exposure. Every state and many towns/cities have Wikipedia pages which link out to demographic statistics. There is a trove of data available and if you can be creative with the data + your client's niche there are lots of opportunities for you.
For instance, in Rhode Island insurance rates are typically higher than in neighboring states (Massachusetts, New Hampshire, Vermont, etc). The reasons generally include exposure to coastal regions, proximity of towns to the city center, accident history, and so on. You could easily make a decent infographic about this, and local news and resource sites would probably be willing to gobble it up. If you were able to interview insurance company spokespeople, you could find yourself with some pretty good exposure and some pretty solid links.
Expectations and Budget
The reality is that if you do not properly set expectations (maybe think about showing the client how the sausage is made pre-campaign), and you take whatever budget comes your way, you will not be able to provide quality service for very long, the campaign will not succeed, and you may do irreparable harm to your brand in your local market.
If you have other results and testimonials to fall back on, as well as a solid plan mapped out (that can be explained to the client), then you've held up your end of the bargain with respect to providing a fair proposal for your time and effort. Sometimes the initial planning is the most time-intensive part of a local campaign.
Plan it out correctly from the beginning and you should be able to produce the results required to keep the client and build up your brand in your local market.
In the past, we've talked a lot about Google's brand bias, but no matter how a brand is defined in technical terms, the reality is that Google cannot leave popular brand sites out of the search results.
If a person searches for, say, AVIS, and doesn't see AVIS in the top spot, then as far as the searcher is concerned, Google is broken. If a person searches for various car rental terms and does not see AVIS somewhere, it's also likely they'll think Google is broken.
There was a time on the web when relevant information was hard to come by. Not so now. Now we have too much information. We don't even know how big the internet is - the guesses run to two trillion pages, and counting.
To be found amid all that noise, you need to stand out. One way to do that is by developing a clear brand identity.
What Is A Brand?
A logo? A set of graphics? A catchy name?
Not really.
Plenty of companies have logos, graphics and a catchy name but they do not have strong brand identities. A brand is largely about how other people define you. They define you based on the experience they have when engaging with you.
For example, take Apple. How would you define their brand? The logo? The shops? The fonts they use in their advertising?
These aspects are not Apple's brand. Apple's brand is the way Apple's customers feel about Apple. It's a feeling tied up with concepts such as fashion, design, innovation and quality - and unique to Apple.
This feeling creates a clear identity in the mind of the customer.
Having a clear identity makes you memorable. People will remember your site name. People will search for your site name. And when enough people do that, there is little chance Google will ever drop you below #1 for brand searches. If you get it right, Google will even rank you for relevant related keywords you aren't targeting.
Because Google would look broken if it didn't feature you.
Tooting our own horn here, but if you typed "seo book" into Google and didn't see this site, you'd think Google was broken. There are plenty of books on SEO, but only one "seo book" that owns a clear brand identity in this space. And SEO Book gets plenty of traffic from other search-related terms that it does not target, because Google associates the site so strongly with the "SEO education" niche. The people who search on SEO queries click on this site, and once they arrive, they don't click back too often.
Own Your Space
Any company, no matter how small, can develop a unique brand and build its own brand-related search stream, and associated searches, over time.
If you run a small company, do you occupy clear space? By clear space, think focused, unique selling proposition. What is the thing you offer that others do not? If other people offer what you do, then what is the thing you do better? How do people describe you? Can they reduce it to an elevator pitch? Is what you offer focused, or confused?
It’s about more than providing something a bit unique. In a cluttered environment, like the web, it's about creating something genuinely different. Probably radically different, given the high level of noise in the search results.
Once you have your differentiation down you can then advertise it, which creates further brand awareness: "High dwell campaigns are three times more efficient at stimulating branded search."
This makes for a more defensible search marketing strategy, because it's difficult for generic competition to emulate you once you've carved out a clear identity. It's not about offering more features, or a lower price. Those things are details. It's about crafting a unique identity that others will know you by. Focus on the parts of your business that really make the money, and consider orienting your entire identity around that one aspect.
The problem with not having a clear identity and point of difference, when it comes to SEO, is that it is a constant battle to maintain position. Google can easily flush all the me-too sites that chase generic keywords, and Google's users aren't going to complain. The sites with unique identities don't have to spend nearly as much time, energy and money maintaining rank.
But hang on, doesn’t this go against everything SEO is about?
There’s nothing wrong with chasing generic terms. It’s a completely valid strategy. However, if we’re in it for the long haul, we should also make an effort to develop a clear, differentiated brand. It means we can own our space in the search results, no matter how Google changes in future.
Look at Trip Advisor. Google may be gunning for the travel space with their own content acquisitions, but they're going to look deficient if they don't show Trip Advisor for just about any travel review query, whether Trip Advisor is targeting it or not, because Trip Advisor is synonymous with travel reviews. By not featuring Trip Advisor, Google would merely encourage more people to bypass Google and search Trip Advisor directly.
That's a powerful place to be.
Not everyone can dominate the travel space like Trip Advisor, of course. But it is worth noting that Trip Advisor started small, and the principle is the same no matter what the niche. It's about becoming the most memorable site in your niche. If it's pogo sticks for one-legged dogs, then be the go-to site for pogo sticks for one-legged dogs. Eventually, word gets around, and such a site becomes synonymous with pogo sticks for one-legged dogs, and associated terms, whether it optimizes for related terms or not.
Google will associate keywords with this site in order to deliver a relevant result, and if this site owns the "pogo sticks for one-legged dogs" niche, then its SEO workload is greatly reduced.
Are They Talking About You?
Your brand should be something people will talk about. Where are all the links coming from these days? Social networks. Google pays attention to social signals - tweets, Facebook, Google+ and other social links - because that is the way many links occur. They are markers of attention, and Google will always look for markers of attention.
And as their audiences click through to you, Google gets valuable signals about your relevance to entire groups of people. You can be sure Google is grouping these people by interest - creating demographic profiles - and if your site interests a certain group, then this will flow through into searches made by these groups. Google can also tie many of these users back to their identities by using persistent cookies & Google+.
That's the way it's going. SEO, and wider marketing and brand strategy, will all meld together.
Would you hand over your credit card details to me, right now, just because I asked? Well, you might - if you knew what a trustworthy guy I am :)
But I know that’s nowhere good enough. I know I would need to earn it.
"Do I trust you?" is a question your web visitors are probably asking themselves right now.
Do they trust your title tag description and snippet enough to click the link? If they click the link and land on your site, do they trust you enough to stay? Do they trust you enough to click deeper into your site? Do they trust you enough to hand over their e-mail address, or their credit card details?
If your site ranks well, but visitors don't trust it, is your search marketing campaign broken? A trust relationship can be broken in less than a few seconds online, and blowing that opportunity earns nothing but a click-back.
Webmasters need to establish trust quickly in order to get people to take the next step.
I'm Trustworthy!
All webmasters look to establish trust. We intuitively know that in order to convince someone else of something, they first need to trust us. No webmaster would want to project an air of untrustworthiness, although some mistakenly do, by overlooking a few simple steps.
If you have an existing site, or you are planning a new one, consider undertaking an audit of trust factors.
Your visitors will want to know....
1. Who Am I Dealing With?
This is especially important online, because there is little context to our interaction.
If we walk into a doctor's office, we may see medical equipment, nurses, and qualifications on the wall. That provides us with sufficient context to establish a level of trust that this person is likely a qualified doctor who knows what she is doing. The environment gives us a pretty good idea of who we are dealing with.
Not so online.
A web site, especially a website that is previously unknown to us, provides little in the way of context. Anonymity can lend an air of the mysterious, but it doesn't do much to help establish trust.
Let visitors know who they are dealing with. This doesn’t necessarily involve telling them your life story, or showing them a photo of you and the kids, although that can work well for personalized forms of marketing.
If you don’t already have them, consider adding staff photos and position details, detailed company description and history, and ensure address and contact details are prominently displayed. If you have a physical location, show it. Provide a map. The tech-savvy will likely want to see you on social networks, such as Facebook and Twitter, too.
This point is obvious, I know. Most webmasters do it. Still, audit your site to see if it provides visitors with a clear idea of who they are dealing with.
2. Overcome Fear
There is fear in new engagements.
Not spine-tingling fear. Just a low-level fear of the new. Your visitors may fear your site is wasting their time. They may fear they might be ripped off. They may fear you won't deliver on the promise made explicit in your title tag and heading.
Look at ways to counter fear of the new.
The familiar carries less fear than the unfamiliar. Obviously, if you've already established a reputation, then you will already be familiar to your visitors, and therefore likely appear more trustworthy than someone a visitor doesn't know.
But search marketing is often focused on attracting the new visitor, so an established reputation may not be something we can rely on. This is why it can be a good idea to leverage reputation from elsewhere.
For example, a commonly used tactic is the "As Seen In..." reference used by sites such as ForSaleByOwner.com. As well as providing credibility by association, it is a means of leveraging the trust in brands with which the visitor is already familiar.
3. Will This Work For Me?
There’s more to relevance than matching a keyword. How will your solution work for your visitor? How do you know what is really relevant to them when all you have for a clue is a keyword phrase?
This is easier said than done. You need to get inside their head. You need to know their questions and objections and be able to answer them. If people feel you care about solving their problems, they are more likely to trust you.
But how?
Listening. Being clear about what problem you're going to solve. Is it a real problem, or an imagined one? Look for opportunities to have your visitors define their problem in their own terms, using surveys, visitor tracking, and market research (i.e. the language they use in forums, on blogs, Facebook, Twitter, et al).
The best sites seem to know exactly what you are thinking. They reflect you, back at you.
This can be underlined with your copy. Use “you” as opposed to “I”. Look how many times “you” has been used in this copy. This is a very effective selling technique, because your visitors really don’t care about you, they care about them.
4. Deliver
We've dealt with superficial areas, most of which can just as easily be abused by the manipulative and deceitful as used properly by the honest and trustworthy.
Trust is also a process. People will judge you by your actions.
Tell them what you'll do.
Do it.
Tell them you've done it.
Does your website demonstrate this? One good way to show process is by using a case study. You outline the problem. Show how you planned to solve it. Then you show that you solved it. For extra points, show how happy people were with the outcome.
Offer free trials, where possible. Offer free downloads. Look for tangible ways to prove you do what you say you do.
Once a visitor has engaged your services, ensure your process is transparent, communicated and you do what you said you’d do.
5. What Will Everyone Else Think?
This is related to fear.
Will I be ridiculed for choosing your service? Made to feel stupid? There was a saying in the IT industry that "no one got fired for buying IBM". It wasn't that IBM was necessarily a better provider; it was that many people used them, so there was perceived strength in the numbers of the tried and true.
People tend to go where other people are. Can you provide similar social validation?
Customer references are a great way to provide social validation. If the customers are from companies with which your visitor is already familiar, all the better. Faces, too. Lots of happy faces provide social validation.
Also include the number of people who use, or have used, your service.
In summary:
Have you let people know who they are dealing with?
Have you made reference to the familiar?
Have you addressed their needs in their own terms?
Have you included references and case studies?
Have you done what you said you'd do?
PS: Thanks to Seth Godin for the inspiration; it's his post I'm riffing off :)
On Feb. 25, 2011, Google released Panda to wreak havoc on the web. While it may have been designed to take out content farms, it also took out scores of quality e-commerce sites. What do content farms and e-commerce sites have in common? Lots of pages. Many with zero or very few links. And on e-commerce sites with hundreds or thousands of products, the product pages may have a low quantity of content, making them appear as duplicate, low quality, or shallow to the Panda, thus a target for massive devaluation.
My e-commerce site was hit by Panda, causing a 60% drop in traffic overnight. But I was able to escape after many months of testing content and design changes. In this post, I'll explain how we beat the Panda, and what you can do to get your site out if you've been hit.
The key to freeing your e-commerce site from Panda lies at the bottom of a post Google provided as guidance to Pandalized sites:
One other specific piece of guidance we've offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.
Panda doesn't like what it thinks are "low quality" pages, and that includes "shallow pages". Many larger e-commerce sites, and likely all of those that were hit by Panda, have a high number of product pages with either duplicate bits of descriptions or short descriptions, leading to the shallow pages label. In order to escape from the Panda devaluation, you'll need to do something about that. Here are a few possible solutions:
Adding Content To Product Pages
If your site has a relatively small number of products, or if each product is unique enough to support entirely different descriptions and information, you may be able to thicken up the pages with unique, useful information. Product reviews can also serve the same purpose, but if your site is already hit by Panda you may not have the customers to leave enough reviews to make a difference. Additionally, some product types are such that customers are unlikely to leave reviews.
If you can add unique and useful information to each of your product pages, you should do so both to satisfy the Panda and your customers. It's a win-win.
Using Variations To Decrease Product Pages
Some e-commerce sites have large numbers of products with slight variations. For example, if you're selling t-shirts you may have one design in 5 different sizes and 10 different colors. If you've got 20 designs, you've got 1,000 unique products. However, it would be impossible to write 1,000 unique descriptions. At best, you'll be able to write one for each design, or a total of 20. If your e-commerce site is set up so that each of the product variations has a single page, Panda isn't going to like that. You've either got near 1,000 pages that look like duplicates, or you've got near 1,000 pages that look VERY shallow.
Many shopping carts allow for products to have variations, such that in the above situation you can have 20 product pages where a user can select size and color variations for each design. Switching to such a structure will probably cause the Panda to leave you alone and make shopping easier for your customers.
Removing Poor Performing Products
If your products aren't sufficiently unique to add substantial content to each one, and they also don't lend themselves to consolidation through selectable variations, you might consider deleting any that haven't sold well historically. Panda doesn't like too many pages. So if you've got pages that have never produced income, it's time to remove them from your site.
Getting Rid of All Product Pages
This is a bold step, but the one we were forced to take in order to recover. A great many of our products are very similar - variations of each other. But due to the limitations of our shopping cart, combined with shipping issues (each variation had different shipping costs that couldn't be programmed into the variations), it was the only viable choice we were left with.
In this option, you redesign your site so that products displayed on category pages are no longer clickable, removing links to all product pages. The information that was displayed on product pages gets moved to your category pages. Not only does this eliminate your product pages, which make up the vast majority of your site, but it also adds content to your category pages. Rather than having an "add to cart" or "buy now" button on the product page, it's integrated into the category page right next to the product.
Making this move reduced our page count by nearly 90%. Our category pages became thicker, and we no longer had any shallow pages. A side benefit of this method is that customers have to make fewer clicks to purchase a product. And if your customers tend to purchase multiple products with each order, they avoid having to go from category page to product page, back to the category page, and into another product page. They can simply purchase a number of products with single clicks.
Noindexing Product Pages
If you do get rid of all links to your product pages but your cart is still generating them, you'll want to add a "noindex, follow" robots meta tag to each of them. This can also be a solution for e-commerce sites where all traffic enters on category-level pages rather than product pages. If you know your customers are searching for phrases that you target on your category pages, and not specifically searching for the products you sell, you can simply noindex all of your product pages with no loss in traffic.
If all of your products are in a specific folder, I'd recommend also disallowing that folder for Googlebot in your robots.txt file, and filing a removal request in Google Webmaster Tools, to make sure the pages are taken out of the index.
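Concretely, the meta tag goes in the head of each product page, and the robots.txt rule below assumes (hypothetically) that all products live under a /products/ folder - adjust the path to match your cart:

```html
<!-- In the <head> of each product page to be kept out of the index -->
<meta name="robots" content="noindex, follow">
```

```
# robots.txt - assuming product pages live under /products/
User-agent: Googlebot
Disallow: /products/
```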
Other Considerations: Pagination & Search Results Pages
In addition to issues with individual product pages, your e-commerce site may have duplicate content issues, or a very large number of similar pages in the index, due to your on-site search and sorting features. Googlebot will fill in your search form and index your search results pages, potentially leading to thousands of similar pages in the index. Make sure your search results pages have a "noindex, follow" robots meta tag or a rel="canonical" link tag to take care of this. Similarly, if your product pages have a variety of sorting options (price, best selling, etc.), you should make sure the rel="canonical" tag points to the default page as the canonical version. Otherwise, each product page may exist in Google's index in each variation.
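For example, a sorted variant of a category page might carry a canonical tag like this (URL hypothetical), telling Google that the default page is the version to index:

```html
<!-- On /category/widgets?sort=price and other sorted variants -->
<link rel="canonical" href="http://www.example.com/category/widgets/">
```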
Maxmoritz, a long time member of our SEO Community, has been working in SEO full time since 2005. He runs a variety of sites, including Hungry Piranha, where he blogs regularly.
There are many ways to organize pages on a site. Unfortunately, some common techniques of organizing information can also harm your SEO strategy.
Sites organized by a hierarchy determined without reference to SEO might not be ideal, because the site architecture is unlikely to emphasize links to the information a searcher finds most relevant. An example would be burying high-value keyword pages deep within a site's structure, as opposed to near the top, simply because those pages don't fit easily within a "home", "about us", "contact" hierarchy.
In this article, we’ll look at ways to align your site architecture with search visitor demand.
Start By Building A Lexicon
Optimal site architecture for SEO is architecture based around the language visitors use. Begin with keyword research.
Before running a keyword mining tool, make a list of the top ten competitor sites that are currently ranking well in your niche and evaluate them in terms of language. What phrases are common? What questions are posed? What answers are given, and how are the answers phrased? What phrases/topics are given the most weighting? What phrases/topics are given the least weighting?
You’ll start to notice patterns, but for more detailed analysis, dump the phrases and concepts into a spreadsheet, which will help you determine frequency.
Once you’ve discovered key concepts, phrases and themes, run them through a keyword research tool to find synonyms and the related concepts your competitors may have missed.
One useful, free tool that can group keyword concepts is the Google Adwords Editor. Use the grouper function - described in "How To Organize Keywords" - with the "generate common terms" option to automatically create keyword groupings.
Look at your own site logs for past search activity. Trawl through related news sites, Facebook groups, industry publications and forums. Build up a lexicon of phrases that your target visitors use.
Then use visitor language as the basis of your site hierarchy.
Site Structure Based On Visitor Language
Group the main concepts and keywords into thematic units.
For example, a site about fruit might be broken down into key thematic units such as “apple”, “pear”, “orange”, “banana” and so on.
Link each thematic unit down to sub-themes, i.e. for "oranges", the next level could include links to pages such as "health benefits of oranges", "recipes using oranges", etc, depending on the specific terms you're targeting. In this way, you integrate keyword terms with your site architecture.
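Mapped to URLs, the fruit example might look something like this (paths purely illustrative):

```
example.com/
  apples/
  pears/
  oranges/
    health-benefits-of-oranges/
    recipes-using-oranges/
  bananas/
```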
Consider, for example, an insurance site. The product listing by category navigation down the left-hand side is likely based on keywords. If we click on, say, the "Medical Liability Insurance" link, we see a group of keyword-loaded navigation links that relate specifically to that category.
Evidence Based Navigation
A site might be about "cape cod real estate". If I run this term through a keyword research tool, in this case Google's keyword tool, a few conceptual patterns present themselves: people search mainly by either geographic location (Edgartown, Provincetown, Chatham, etc) or accommodation type (rentals, commercial, waterfront, etc).
Makes sense, of course.
But notice what isn’t there?
For one thing, real estate searches by price. Yet, some real estate sites give away valuable navigation linkage to a price-based navigation hierarchy.
This is not to say a search function ordered by house value isn't important, but ordering site information by house value isn't necessarily a good basis for SEO-friendly site architecture. This functionality could be integrated into a search tool instead.
A good idea, in terms of aligning site architecture with SEO imperatives, would be to organise such a site by geographic location and/or accommodation type, as this matches the interests of search visitors. The site is made more relevant to search visitors than would otherwise be the case.
Integrate Site Navigation Everywhere
Site navigation typically involves concepts such as “home”, “about”, “contact”, “products” i.e. a few high-level tabs or buttons that separate information by function.
There's nothing wrong with this approach, but for SEO purposes the navigation concept can be significantly widened by playing to the web's core strengths. Tim Berners-Lee placed links at the heart of the web as the means to navigate from one related document to another, and links are still the web's most common navigational tool.
“Navigational” links should appear throughout your copy. If people are reading your copy, and the topic is not quite what they want, they will either click back, or - if you’ve been paying close attention to previous visitor behaviour - will click on a link within your copy to another area of your site.
The body text on every page of your site is an opportunity to integrate specific, keyword-loaded navigation. As a bonus, this may encourage higher levels of click-through, as opposed to click-back, pass link juice to sub-pages, and ensure no page on your site is orphaned.
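For instance, in-copy navigation on the fruit site from earlier might look like this (markup and URLs illustrative):

```html
<p>Oranges are a rich source of vitamin C - just one of the many
<a href="/oranges/health-benefits-of-oranges/">health benefits of oranges</a>.
They also feature in hundreds of
<a href="/oranges/recipes-using-oranges/">recipes using oranges</a>.</p>
```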
Using Site Architecture To Defeat Panda & Penguin
These two animals have a world of connotations, many of them unpleasant.
Panda was an update partly focused on user experience. Google is likely using interaction metrics, and if Google isn't seeing what they deem to be positive visitor interaction, then your pages, or your site, will likely take a hit.
What metrics are Google likely to be looking at? Bounce backs, for one. This is why relevance is critical. The more you know about your customers, and the more relevant link options you can give them to click deeper into your site, rather than click-back to the search results, the more likely you are to avoid being Panda-ized.
If you’ve got pages in your hierarchy that users don’t consider to be particularly relevant, either beef them up or remove them.
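One way to find those pages is to flag high-bounce URLs in an analytics export. A rough sketch, assuming a CSV with url, sessions and bounces columns - the 70 percent threshold is an arbitrary starting point, not a Google number:

    import csv

    # Assumed input: pages.csv with url, sessions, bounces columns.
    with open("pages.csv", newline="") as f:
        for row in csv.DictReader(f):
            sessions = int(row["sessions"])
            if sessions == 0:
                continue
            rate = int(row["bounces"]) / sessions
            if rate > 0.7:   # arbitrary review threshold
                print(f"Review {row['url']}: bounce rate {rate:.0%}")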
Penguin was largely driven by anchor text. If too many links pointing to one page use similar anchor text, Penguin is likely to cause you grief. This can happen even if you’re mixing up keywords i.e. “cape cod houses”, “cape cod real estate”, “cape cod accommodation”. That level of keyword diversity may have been acceptable in the past, but it isn’t now.
Make anchor text specific, and point links at specific, unique pages. Get rid of duplicate, or near-duplicate, pages. Each page should be unique, not just in terms of keywords used, but in terms of concept.
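A simple audit is to measure anchor text concentration across the links pointing at a page. The link list below is a stand-in - pull yours from a backlink export. A heavily skewed distribution is the pattern Penguin targets:

    from collections import Counter

    # Stand-in data: anchor texts of inbound links to one page.
    anchors = [
        "cape cod real estate", "cape cod real estate", "cape cod houses",
        "cape cod real estate", "Smith Realty", "click here",
    ]

    counts = Counter(anchors)
    total = len(anchors)
    for text, n in counts.most_common():
        print(f"{text!r}: {n / total:.0%}")   # e.g. 'cape cod real estate': 50%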
In a post-Panda/Penguin world, webmasters must have a razor-sharp focus on what information searchers find most relevant. Being close, but not quite what the visitor wanted, is an invitation for Google to sink you.
Build relevance into your information architecture.
My good friend Bill at SEOByTheSea has unearthed a Google patent that will likely raise some eyebrows, while confirming the suspicions of others.
The patent is called Ranking Documents. When webmasters alter a page, or links to a page, the system may not respond immediately to those changes. Rather, it may change rankings in unexpected ways. From the patent:
A system determines a first rank associated with a document and determines a second rank associated with the document, where the second rank is different from the first rank. The system also changes, during a transition period that occurs during a transition from the first rank to the second rank, a transition rank associated with the document based on a rank transition function that varies the transition rank over time without any change in ranking factors associated with the document.
Further:
During the transition from the old rank to the target rank, the transition rank might cause:
a time-based delay response,
a negative response,
a random response, and/or
an unexpected response.
So, Google may shift the rankings of your site, in what appears to be a random manner, before Google settles on a target rank.
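To illustrate, here’s a toy simulation of how such a rank transition function might behave. The delay, duration and jitter values are pure assumptions - the patent publishes no numbers:

    import random

    # Toy model of the patent's rank transition function: hold the old
    # rank for a while (time-based delay), then drift toward the target
    # rank with random noise along the way. All parameters are invented.
    def transition_rank(old_rank, target_rank, day, delay=14, duration=30, jitter=3):
        if day < delay:
            return old_rank                          # delayed response
        progress = min(1.0, (day - delay) / duration)
        base = old_rank + (target_rank - old_rank) * progress
        noise = random.randint(-jitter, jitter)      # random/unexpected response
        return max(1, round(base + noise))

    for day in range(0, 70, 10):
        print(day, transition_rank(old_rank=40, target_rank=8, day=day))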
Let's say that you're building links to a site, and the site moves up in the rankings. You would assume the link building has had a positive effect. Not necessarily, if this system is active, as your site may have already been flagged.
Google then toys with you for a while before sending your site plummeting to the target rank. This makes it harder to determine cause and effect.
Just because a patent exists doesn't mean Google is using it, of course. This may just be another weapon in the war-of-FUD, but it sounds plausible, and it’s something to keep in mind, especially if you're seeing this type of movement.
The Search Engine As Black Box
In ancient times (the 1990s), SEO thrived because search engines were stupid black boxes. If you added some keywords here, and a few links there, the black box would respond in a somewhat predictable, prescribed fashion. Your rankings would rise if you guessed what the black box liked to “see”, and you plummeted if you did too much of what the black box liked to see!
Ah, the good old days.
These days, the black box isn’t quite so stupid. It’s certainly a lot more cryptic. What hasn’t changed, however, is the battle line drawn between webmasters and search engines as they compete for search visitor attention.
If there are any webmasters still under the illusion that Google is the SEO’s friend, they must belong to a very small club, indeed. Google used to maintain a - somewhat unconvincing - line that if you just followed their ambiguous guidelines (read: behaved yourself), then they would reward you. It was you and Google on the good side, and the evil spammers on the other.
Of late, Google appear to have gotten bored of maintaining any pretense, and the battle lines have been informally redrawn. If you’re a webmaster doing anything at all that might be considered an effort to improve rank, then you're a "spammer". Google would no doubt argue this has always been the case, even if you had to read between the lines to grasp it. And they’d be right.
Unconvinced?
Look at the language in the patent:
The systems and methods may also observe spammers’ reactions to rank changes caused by the rank transition function to identify documents that are actively being manipulated. This assists in the identification of rank-modifying spammers.
“Manipulated”? “Rank modifying spammers”? So, a spammer is someone who attempts to modify their rank?
I’ve yet to meet a webmaster who didn’t wish to modify their rank.
Google As A Competitor
Google’s business model relies on people clicking ads. In their initial IPO filing, Google identified rank manipulation as a business risk:
We are susceptible to index spammers who could harm the integrity of our web search results. There is an ongoing and increasing effort by “index spammers” to develop ways to manipulate our web search results.
It’s a business risk partly because the result sets need to be relevant for people to keep returning to Google. The largely unspoken point is that Google wants webmasters to pay to run advertising, not get it for “free”, or hand their search advertising budget to an SEO shop.
Why would Google make life easy for competitors?
The counter-argument has been that webmasters provide free content, which the search engines need in order to attract visitors in the first place. However, now that relevant content is plentiful, that argument has weakened. Essentially, if you don’t want to be in Google, then block Google. They won’t lose any sleep over it.
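Mechanically, blocking Google is trivial - two lines in a standard robots.txt at the site root do it:

    User-agent: Googlebot
    Disallow: /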
What has happened, however, is that the incentive to produce quality content with search engines in mind has been significantly reduced. If content can be scraped, ripped off, demoted, and merely used as a means to distract the search engine user enough to maybe click a few search engine ads, then where is the money going to come from to produce quality content? Google may be able to find relevant content, but “relevant” (on-topic) and “quality” (worth consuming) are seldom the same thing.
One content model that works in such an environment is content that is cheap to produce. Cheap content can be quality content but, like most things in life, quality tends to come with a higher price tag. Another model that works is loss-leader content, but then the really good stuff is still hidden from view, and it’s hard to do this well unless you’ve established considerable credibility - which is also expensive to do.
This is the same argument newspaper publishers have been making. The advertising doesn’t pay enough to cover the cost of production and make a profit, so naturally the winner in this game cuts production costs until the numbers do add up. What tends to be sacrificed in the process is quality.
NSFW Corp, a new startup by ex-TechCrunch writer and Guardian columnist Paul Carr, has taken the next step. They have put everything behind a paywall. There is no free content. No loss-leaders. All you see is a login screen.
Is this the future for web publishing? If so, the most valuable content will not be in Google. And if more and more valuable content lies beyond Google's reach, then will fewer people bother going to Google in the first place?
Consider how Google describes its own process for evaluating changes to its ranking algorithm: “Here’s how it works. Our engineers come up with some insight or technique and implement a change to the search ranking algorithm. They hope this will improve search results, but at this point it’s just a hypothesis. So how do we know if it’s a good change? First we have a panel of real users spread around the world try out the change, comparing it side by side against our unchanged algorithm. This is a blind test — they don’t know which is which. They rate the results, and from that we get a rough sense of whether the change is better than the original. If it isn’t, we go back to the drawing board. But if it looks good, we might next take it into our usability lab — a physical room where we can invite people in to try it out in person and give us more detailed feedback. Or we might run it live for a small percentage of actual Google users, and see whether the change is improving things for them. If all those experiments have positive results, we eventually roll out the change for everyone.”
Customer focus is, of course, admirable, but you’ve got to wonder about a metric that doesn’t involve the needs of publishers. If publishing on the web is not financially worthwhile then, over time, the SERPs will surely degrade in quality as a whole, and users will likely go elsewhere.
There is evidence this is already happening. Brett at Webmasterworld pointed out that there is a growing trend amongst consumers to skip Google altogether and head straight for Amazon and other sites. Amazon queries are up 73 percent in the last year.
There may well be a lot of very clever people at Google, but they do not appear to be clever enough to come up with a model that encourages webmasters to compete with each other in terms of information quality.
If Google doesn’t want the highest-quality information increasingly locked up behind paywalls, then it needs to think of a way to nurture and incentivise the production of quality content, not just relevant content. Tell publishers exactly what content Google wants to see rank well, and tell them how to achieve it. Leave enough money on the table for publishers - i.e. less competition from ads - so that everyone can win.
I’m not going to hold my breath for this publisher nirvana, however. I suspect Google's current model just needs content to be "good enough."