Content Publishing, Controlling Costs, Scaling Profits & Link Bait: Being Small & Competing With Big Fish

Much as I recently described SEO marketing as a layered process, I tend to view building a long-term profitable website as a layered process too. My fundamentals revolve around marketing and SEO, of course, but as Google's algorithms become more authority based in nature it is worth taking a look at ways to control content costs while still coming up with ideas that help build up domain authority scores.

In any publishing medium, especially one which encourages crap content and one where people grade your work in many ways, it usually takes a while to gain enough brand / authority / trust / popularity to be profitable. Alternatively, you need to create something unique or citation worthy, or you need content of various quality levels, to be profitable from the start. On the web your content is graded by people in the following ways:

  • being worthy of attention (based on overall and subject related credibility and authority)

  • being worthy of a subscription (generally being worth visiting again, or adding your feed)
  • being worthy of a recommendation (via link, instant message, or email)

Then search engines look at whatever of that information they can interpret to find signal amongst the noise and try to rank pages based on query relevance, naturalness and uniqueness of text, semantic structure, user acceptance, age, naturalness of growth and citation, and outbound linkage.

Because there are so many forms / techniques / types of low quality information, and search is such a profitable targeted advertising vehicle, some search engines (for example, Google) have been placing significant weight on domain related authority to rank pages for specific queries. Put another way, if I published the same article on CNN.com and MySmallWebsite.com the CNN article will win out time and time again.

Small webmasters can still compete, but they must overcome the disadvantage of being small and having limited mindshare, trust, and (typically) financial capital. There are many fundamental things you can do to help overcome those disadvantages. Once you do, you can work your way into a self reinforcing market position which allows you to profit greatly from the work necessary to gain it.

Brand & Niche Selection
If a niche is saturated it may be worth picking another niche or a subtopic within that niche. One of my biggest failures when I first got on the web was that I wanted to learn everything about search. The problems with that are: Danny Sullivan is an amazing person, and with all the hard work he has done it would be ridiculous to try to do something similar starting a decade later; and that niche is way too broad for anyone to cover without a team helping them. So I decided to try SEO, and that worked well. SEO is now so saturated that it would be much harder for me to gain traction today than when I started a few years ago.

Site Design
A professional site design will pay for itself many times over, largely because a professional site design makes average to slightly above average content quality linkworthy.

If all you have is a few hundred dollars, spend it on a logo design, and then use a minimalist site design which is color matched to the logo.

Format Your Content so it is Easy to Share & Easy to Change
Using a blog platform such as WordPress as your content management system makes it easier for other bloggers to identify with you as being similar to them. Using a database driven content management system also makes it easy to change your site design quickly.

It is best if it is easy to change your site design quickly (to test different ad placements, etc.), and to be able to grow out any section that you found interesting and / or highly profitable.

Participate in Communities
Participate in online (and, if possible, offline) communities discussing your topic to help create friendships, learn what people care about in your field, and help people if you can. If a community site is authoritative in nature, many people will come across the information you publish on it.

You can start building your brand before you even have your site by selecting a username that you later relate to your brand name.

Buy a Bit of Trust
A listing in DMOZ, the Yahoo! Directory, or Business.com might seem expensive at first, but if you get a few trusted links it helps show the search engines that you are serious about business (and it shows them what community you belong to and that a human editor has reviewed your site).

Information Accuracy
Rather than aiming for bland objectivity, it is easy to be remarkable by being more biased and more personable, especially on your feature articles.

Content Quality
Mix your content quality. Try to create a few promotional pieces of content and market the hell out of those, but also leverage that authority to help improve the exposure of other things on your site. Even if your promotional pieces lose money, if you create enough other (hopefully cheap and easy to produce) content riding on the authority of your great content it can work to lower the cost of your higher quality content. Your cheap content can be textually unique but may not need to be conceptually unique. Your promotional content should hopefully be conceptually unique.

If you already have significant mindshare and a highly profitable business model you may want to post higher quality stuff most of the time. Posting lower quality information in bulk is more about giving your site a back-fill that amortizes the cost of the higher quality content.

Post about things you care about and are passionate about, but if you are still trying to build up authority, rather than publishing based exclusively on information quality and passion, occasionally publish based on how well you think your ideas may spread.

If you can find a way to make consumers want to help generate your content then to them your content will be of high quality. If they create it they may also want to help market it.

Content Costs
I am using money as a proxy for the value put into each piece. If you are low on cash you can make up for it by putting in time and effort. Rather than writing hundreds of $20 to $50 articles, write hundreds of cheap ones, then spend a few hundred to a few thousand dollars per good idea. Make sure your best ideas are well executed, bake social elements into their structure, make them look exceptionally legitimate and useful, and make them easy to share.

To control costs, rather than hiring many employees full time it might make sense to try hiring freelance writers off Craigslist and other related sites. Consumer generated content is also a way to maximize the return on expensive content.

Content Targeting
When you are trying to write content, use keyword research tools and search forums and other locations for common questions. If you are making up your bulk content, control costs, but still make the content unique and legitimate looking enough to pass an editorial hand check and not undermine your brand value if you are trying to build a real brand.

If there are not any authoritative and relevantly targeted pages ranking for the related search query, and you can make a quality page on the topic which would be easy to cite, then try to roll it into a high quality (and perhaps high cost) piece of linkbait which helps lift the authority of the rest of the site. Make sure you target linkbait at specific people who are easy linkers or exceptionally authoritative linkers.

Don't be afraid to send personal targeted emails if you are launching something of high quality. I wrote something amazing and got about 300 links. I wrote something of slightly lower quality and got about 3,000 links. What was the difference? About a half dozen polite emails to friends seeded the second idea and helped spread the second idea much further than the first.

Another good way to make your site more linkable is by regularly linking out to the people who you would want to link at your site.

Social Marketing
If your content is of decent quality submit it to social sites like Del.icio.us, Digg, Netscape, and others. On your linkbait content page place links that make it easy for others to vote for your content, and ask a few friends to vote. Exposure on these social sites will put your content in front of bleeding edge link rich webmasters.

If your linkbait page looks useful, well structured, and worth looking at later, many people will bookmark it, especially if it is long enough to look comprehensive and be placed in the "this is too much, so I will check it out later" category.

Also do not forget to spend the 10 minutes it takes to make a topical Squidoo lens.

Use a Proper Title
If you are writing a quick piece of lower quality content then try to be somewhat literal and descriptive in how you title that page. Conversely, if you are writing to try to spread an idea you may want to sacrifice the title a bit if a more controversial title will make the idea spread much further.

Titles matter a lot if you want to create a controversy. Look at magazine covers, meme trackers, meta news sites, and social bookmarking sites for examples of good titles.

Controversy = discussion = links = money.

Information Timeliness
There are so many meta news sites, social bookmarking tools, and meme trackers that in most competitive markets it is going to be a waste of time to try to be the first person with every story unless you have access to insider information OR are able to create your own controversy which allows you to become the story.

You can use these fast acting sources to your advantage too, more on that later though.

Additionally you can create meta posts that strongly agree and strongly disagree with certain AUTHORITATIVE opinions on a topic. If you link at popular narcissistic webmasters eventually some of them will cite you back.

Information Sources
Use and abuse vertical search, social bookmarking, and meta news sites as the resources that they are. Even if they are 50% spam that still leaves a lot of good resources to make your articles look well researched with minimal effort.

The more creative you are with how you search the more stuff you will be able to dig up. For example, search for multiple related phrases, competing URLs, and track early votes for really interesting stories and see what other related stories those people voted for.

Don't forget traditional published books, books in the public domain, government content & research, DVDs, and documentaries. I saw a magazine at an airport for $10 which gave me thousands of dollars worth of content ideas. The magazine's publishing format and the inaccessibility of its information mean that people who create a slightly dumbed down version of similar information get to make good money building on hard work and research purchased for only $10.

Generally the less accessible a piece of information is the easier it is to sound remarkable by citing it or stating something similar on the web :)

Picture Page
Include photos from sites like iStockphoto or Flickr in your higher quality articles to make them look more legitimate.

What has Worked so Far?
Check your traffic logs to see what articles are the most popular. Install SEO for Firefox and do a site search, or use Yahoo! Site Explorer to look at which of your articles have the most backlinks. Also search for your domain on Del.icio.us to see which of your pages got the most bookmarks.
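The traffic log check above can be scripted if you prefer raw access logs to analytics packages. This is a minimal sketch, assuming a common-format log; the sample lines, log format, and regex are illustrative assumptions, so adjust the pattern to match your own server's output.

```python
import re
from collections import Counter

# Naive matcher for common/combined log format request lines.
LOG_LINE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def top_pages(log_lines, n=10):
    """Return the n most-requested paths that got a 200 response."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "200":
            counts[m.group("path")] += 1
    return counts.most_common(n)

# Hypothetical sample lines standing in for a real access log file.
sample = [
    '1.2.3.4 - - [10/Oct/2006] "GET /seo-tips/ HTTP/1.1" 200 512',
    '1.2.3.4 - - [10/Oct/2006] "GET /seo-tips/ HTTP/1.1" 200 512',
    '5.6.7.8 - - [10/Oct/2006] "GET /about/ HTTP/1.1" 200 128',
    '5.6.7.8 - - [10/Oct/2006] "GET /missing/ HTTP/1.1" 404 0',
]
print(top_pages(sample, n=2))  # → [('/seo-tips/', 2), ('/about/', 1)]
```

In practice you would pass `open("/var/log/access.log")` instead of the sample list, then compare the top pages against the backlink and bookmark data to see which articles are worth growing out.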

Syndicate
Writing high quality articles that are published on other sites is a good way to improve your credibility and link authority and to drive targeted traffic. If your site is new, your content on older trusted sites may be more likely to rank, helping your brand and ideas gain further exposure.

Monetizing

  • If your site is brand new, go light on the ads until you have some authority. Heavy ads too early = no authority. No authority = no income.

  • When links can be profitable, link for conversions from within your content.
  • If you plan on selling ads directly it may make sense to put up a fake ad or two in the sponsors section. After one competitor has bought an ad many companies will feel they need to buy in because the competition already did.
  • Set AdSense as a default monetization model if you do not want to deal with ad sales (or need to get a pricing baseline).
  • Blend the ad colors with your content and place those ads in your content area to make them look like part of the content.
  • Consider factors affecting ad clickthrough rate, and check AdSense ad targeting before you write about a topic.
  • Look at the ads after you write a page. Make sure your page talks about the topics the ads are targeted to, or, if it makes sense, create other more targeted pages that address the contents of those ads.
  • Check ad clickthrough rate on a per page level and see what keywords are driving your ad clicks. Set up ad channels to test the earnings of different formats or sections of your site. Grow your site based on where the income is coming from. The deeper you want to dive into a topic the more targeted the traffic will be, but remember to keep building your site authority if you are going to build a big site.

The Importance of Viral
From Andrew Goodman:

Some well-funded companies with strong business development plans are able to negotiate means of driving underpriced traffic to a site, while selling listings at a higher price (this is why all the kerfuffle about "click arbitrage" seems to be overblown: many businesses have grown through "click arbitrage" and continue to be built around it).

In the past, quite a few companies were built up quickly simply through the grace of free mass organic Google referrals. As spaces get cluttered and large media companies spend in multiple channels in order to indirectly maintain their organic lead, this gets harder to achieve for a startup unless something goes a bit viral.

Thus, if many of the current successes launched today with their current model, they would not be citation worthy enough to earn their current market position.

What Did I Miss?
Any other tips we should add?

Bulk & Automation in SEO

Many people are still stuck in the bulk and automation line of thinking with link building. Largely because service providers are lazy, and largely because highly automated business models are easier to extract value from when people are not thinking through what they are buying. In much the same way many SEOs will still sell you search engine submission services, for many years many SEOs will sell you bulk or automated link building programs. People new to the market may be too cheap or too lazy to learn SEO, are heavily pitched bogus offers, and read a bunch of outdated information that reinforces old ideas which no longer have any value.

Why does the outdated information get so much exposure? Because search (especially Google) is biased toward older sites. And, with search replacing directories and link lists as the primary means of web navigation, people do not link the same ways that they used to. Google not only wants to automate selling paid links on your site, but is even trying to automate recommending related links, thus requiring publishers to do more to earn an editorial link.

But if something is automated, aggressively marketed, widely used, and effective at manipulating the results, you have to think the search engines would quickly aim to stop it. If those same techniques are generally associated with other low quality sites, is that a good network to actively place your site in? Odds are that search engines would be extra aggressive at deweighting a technique if it was effective and generally associated with junk content.

Sketchy SEO techniques have a limited shelf life, and not long after you hear them mentioned people start patching up the holes, so those technically savvy enough to find new algorithmic holes are typically going to keep quiet about them and extract as much value as they can before the holes are closed.

Matt Cutts hinted that having mostly low quality links may prevent your site from getting crawled deeply, and, more recently, he also mentioned that if a site added pages too quickly it may get flagged:

It looks like the primary issue with the Windows Live Writer blog was the large-scale migration from spaces.msn.com to spaces.live.com about a month ago. We saw so many urls suddenly showing up on spaces.live.com that it triggered a flag in our system which requires more trust in individual urls in order for them to rank (this is despite the crawl guys trying to increase our hostload thresholds and taking similar measures to make the migration go smoothly for Spaces). We cleared that flag, and things look much better now.

Admittedly, if you participate in some markets (like consumer finance or insurance) many automated junk content sites will place you in their network by scraping your site and linking to you, but if you can get a few quality links you can easily beat out people playing the bulk numbers game.

If you want your SEO to be effective long term, it is best to avoid easy and automated techniques in favor of layered or complicated techniques that are going to be hard for most competitors to replicate.

What is an Authority?

If you thought a link would drive targeted traffic, lead to additional readers, and perhaps lead to other editorial citations, would that be an easy way to define an authority link? Before the Google Florida Update I was ranking at ~ #6 for search engine marketing even though I really didn't know much about the topic. Within 9 months of being on the web, and on a few hundred dollars of ad spend, I had a ranking that I soooo did not deserve (thank you, primitive search technology!!!). After the Florida Update my then low grade link spam was rendered ineffective.

When the Florida update happened I read everything I could about it and started testing pages to see why I thought they dropped. Danny Sullivan mentioned my article in one of his articles. Before he mentioned me, I basically got told two words when asking for links: fuck and off. After Danny mentioned me, many of the sites which brushed me away were more willing to link to my sites.

After I noticed a few citations driving good traffic, I figured that while I had a bit of credibility, a few minutes of mindshare, and the decent traffic that goes with them, it was the ripe time to hunt down more well trusted authoritative links. And it was exceptionally easy. Some people who would generally have rejected me in the past said things like "Oh, you are that Aaron. We will get your link up today."

Within a few months my other site was again ranking for search engine marketing. Through that one article and the authoritative links it made possible, I gained the authority I lacked. (Note: About a year later that site's ranking dropped a good bit, likely due to a lack of anchor text diversity and me spending most of my time promoting this site as a stronger brand.)

The amount of information is growing much faster than the available attention to consume it. There is far greater opportunity during periods of instability, as long as you allow yourself the flexibility to be able to read the market and are willing to be wrong with what you say. And if you find some type of success it is important to build off that success while the iron is still hot.

The weak point of the top reporters is not laziness, but vanity. You don't pitch stories to them. You have to approach them as if you were a specimen under their all-seeing microscope, and make it seem as if the story you want them to run is something they thought of themselves.

Our greatest PR coup was a two-part one. We estimated, based on some fairly informal math, that there were about 5000 stores on the Web. We got one paper to print this number, which seemed neutral enough. But once this "fact" was out there in print, we could quote it to other publications, and claim that with 1000 users we had 20% of the online store market.

Stats are just arbitrary numbers used to make news sound well researched and legitimate.

An Interesting Narrative

Long ago I mentioned that even if I didn't usually agree with Michael Martinez, I thought he was exceptionally citation worthy. He recently offered up a blog post explaining his view of the recent evolution of SEO and link building in an article called Who Does Google Trust Now? I don't agree with many of the statements in the article, like:

Neither age of site nor age of links pointing to the site should really matter to how much a site can be trusted.

I believe that age matters. At least to some degree. I have an old domain which had no relevant anchor text (other than internal navigation) and few (perhaps no) quality links that ranks quite well in Google for competitive commercial queries.

In spite of not agreeing with some of the article (and thinking portions of it might be a bit self-aggrandizing), I think Michael did a great job with the end of the article. The last few paragraphs rocked.

Simply getting links from free directories, article submission sites, reciprocal links, and other popular link sources will probably gradually extend the length of time new sites require to earn trust if for no other reason that they will only very slowly naturally attract links from trusted neighborhoods.

Exactly. When you focus on as much as you can get for free without building any value (and valuing your time at nothing) it takes an awful lot of pricing your time at free to catch up with established sites. Perhaps more time than you have left in your life. Why race toward the bottom?

The real question comes down to this: if I am correct, or close to correct, in my analysis, how long will it take for spammers and SEOs to develop methodologies that effectively poison the "good" (trusted) neighborhoods and force Google to develop some filtration methodology?

SEO and search constantly co-evolve. What Google trusts now is only temporary, and some SEOs have been building AHEAD of Google's shifts. And the search results show what is going on, so they will continue to be forced to change their algorithms. Nothing new there.

But poison is a harsh word. I don't think we should fault people for gaming the system. Google creates the game...we are just pawns that must move with the ebb and flow. As SEO gets harder those who know how to do it will get paid more. And one can argue that by manipulating search results we are helping keep the search engines sharp, forcing them to improve their algorithms.

Google is more profitable and has a larger effect on the web than you or I. People do not link as naturally as they once did (worrying about what the all powerful Google may think), Google is training some people how to link in ways that benefit Google, and Google has some people so brainwashed that they consider anyone disagreeing with Google's for-profit agenda to be a spammer poisoning the web. Whose actions are poisoning the web?

Johnon.com - Great Blog

I am sure many of the readers here already read Johnon.com, but if you do not yet, check out John Andrews's blog on Competitive Webmastering. A couple of my favorite recent posts on his blog:

Go Words - he describes a bit about LSI type technology, while also beating up an old and outdated definition from one of my other sites :) Also check out his comments on that page for his explanation of how having too much keyword proximity throughout your copy could flag the page to be filtered as an irrelevant attempt at manipulation. I have been trying to hammer away at Google with a page ranking for a wide basket of keywords, and have been having a bit of fun with this...am ranking in the top 5 for 11 of 12 target phrases thus far, but only 12 for 12 is acceptable. :)

Nobody wants to be a Tool - John talks about his love for toolbars :)

Proof Google Loves EDU & GOV Sites

I often see many .edu and .gov sites in Google's SERPs and think their representation is disproportionate. And there is a business case for Google doing that too.

Google's Matt Cutts has argued that .edu and .gov links do not carry any more weight other than their raw PageRank scores being higher, but if they trust those resources enough to display them disproportionately more in the search results, then wouldn't they also be likely to trust how those resources voted for other pages more as well? I have a PageRank 7 site that doesn't rank anywhere near as well as you would expect given its PageRank. I also have a couple PageRank 5 sites that rank for a ton of searches and are getting thousands of visits a day. One of them has less than 30 pages too. What do the PageRank 5 sites have that the PageRank 7 site lacks? Tons of .edu and librarian type links.

Let's imagine that my experiences as a searcher and as a search marketer are totally biased, irrelevant, and too small a sample to be accurate. Here is what we know Google does for certain with PageRank and links:

  • shows outdated and rarely updated PageRank scores

  • only shows a sample of backlink data
  • scrubs out many of the most authoritative backlinks to a site when showing you a small sample of the backlink data
  • does not let you use multiple advanced operators in your search if one of the advanced operators is the link function (link:site.com)

So just about everything they show you about PageRank or links is an obfuscated half truth. Why would we expect their words to be any more factually correct than these algorithmic half truths they share?

Here is what else we know Google does...

Google has a Librarian's newsletter to help teach librarians how search works and how to trust good resources (ie: who they should be linking at). Help improve our relevancy by linking at quality sites. Those were the first two issues of the newsletter, and perhaps its main goal?

WebmasterWorld recently posted a thread announcing that Google is offering SEO training to federal government employees.

Imagine Google training one section of the web about how SEO works while not providing the same training to other webmasters. That effect alone will add a bias toward .GOV sites, and it goes to show the bias Google has toward governmental websites (whether or not they admit it exists).

When researching with a friend last night I came across a .GOV link scheme that made my jaw drop twice. Once in envy when I saw how effective and viral it was, and again when I realized how easily I could duplicate the marketing method. But I better rush off quickly with that one while the opportunity is still there, before Google teaches them how to link!

As much as the governmental training is about making governmental content accessible, it is also likely about making government agencies more aware of SEO, such that it is harder for people like me to bilk high quality .GOV links.

Rotating Page Titles for Anchor Text Variation

I have a site with some content in the consumer finance vertical. The domain is quite authoritative in nature, and based mostly on internal authority (plus 4 decent external links), a page on the site started ranking for a "nice" query. Based on that ranking, in the first 3 months the page picked up hundreds of scraper backlinks, which I believe caused the page to get filtered out of Google for having too unnatural and too well aligned a link profile (ie: looking blatantly focused and manipulative in nature).

About a week ago I changed the page title to something different. The page quickly started ranking again in Google, and now any automated spam links it picks up will have different anchor text.

Most people probably do not have to worry too much about the effects of scraper sites if they are building legitimate content that will get many legitimate inbound links, but if you are writing vanilla content that is extracting profit from a well established domain it may be worth considering a page title change if you believe the scrapers may have whacked your page.
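The rotation itself can be automated instead of hand-edited. Here is a hypothetical sketch that picks a title from a small list of hand-written variants on a schedule, so scrapers that copy the current title produce different anchor text depending on when they scraped; the variant strings and the 30-day period are assumptions, not a rule.

```python
import time

# Hand-written variants for one page; write these yourself so each
# reads naturally (these example strings are purely illustrative).
TITLE_VARIANTS = [
    "Consumer Finance Guide",
    "Guide to Personal Finance Basics",
    "Personal Finance & Credit Tips",
]

def current_title(variants, now=None, period_days=30):
    """Pick a variant deterministically based on the current time period,
    so the title stays stable within a period but rotates between them."""
    now = time.time() if now is None else now
    period = int(now // (period_days * 86400))
    return variants[period % len(variants)]
```

In a template you would then emit `<title>` + `current_title(TITLE_VARIANTS)` + `</title>` rather than a hardcoded string. Deterministic rotation is deliberate here: a random pick per request would show search engines a different title on every crawl, while a period-based pick only changes it occasionally.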

Are Meta Description Tags Important? Ask Werty...

Most of my friends use a meta description tag on their home page.

Not Werty though... When you go to Amsterdam with a friend, and some have called you a garden variety fag, it doesn't help your case to have a snippet like this.

From Rock En Seine

Friends don't let friends do crazy things like go without using a meta description on their home page.

Hard Answers Are Easy Links

If it is hard to find the answer to a question then

  • it is probably easy to be one of the best answers

  • those who stumble across your answer will appreciate your effort, relevancy, and knowledge

Matt Cutts recently posted some SEO tips on his site revolving around the theory that

In general, any time you look for an answer or some information and can’t find it, that should strike you as an opportunity.

You hear people express this in different ways. Some will say to start from market edges, while others will say to try to get user generated media, etc...but generally the way to make something useful is to be passionate and knowledgeable about a topic and then try to create something that you would want to frequently use or reference.

And while it may seem like it sucks to have to put the extra effort in to find the correct answers to certain questions, that is the exact reason that it has so much value.

There are tons of offline opportunities that will migrate online. How high is the quality of the average book compared to the average web page? And yet, because most publishers do not market most books very hard, most books sell fewer than 1,000 copies. Much book content will eventually be found online, but for now many people make a great living by reading various print books, condensing them down into something more palatable, and publishing it as an ebook.

The web is about making knowledge accessible and selling it as credible. The more accessible your information is the easier it is to be referenced and thus perceived as credible.

Experience presents great offline to online opportunity as well. When I went to Search Engine Strategies I got an extra bonus 8 hour layover each way. One of them was because in San Diego I was required to exit a terminal, find out what terminal I was supposed to go to, find out how to get to that terminal, wait on a terminal bus, wait while the terminal bus driver allowed people to overload the bus, reorganize everyone else's luggage to make it fit well enough for the bus driver to drive, ride the length of the airport, try to get tickets for the next flight from multiple people because the machine would not work and a couple of the workers did not feel like helping me, go through security, find my gate, and get on my plane in an hour.

Now that airport runaround probably sounds a bit absurd, but the people who sold me my ticket most likely KNEW that I was going to have to leave the airport and re-enter, with a high probability of missing my flight. And if they didn't know that, then the online airline booking company which decided to find ways to aggregate and make such information readily available would have a huge advantage over competitors who did not make that type of experience related aggregated information easily accessible.

Many markets are full of people chasing money, but if you can capture experiences or are willing to share what you learn you have a distinct business advantage over people who are looking at revenue ahead of quality.

Duplication as a Form of Waste

When you do a Threadwatch site search in Google most of the pages are filtered out due to having duplicate meta description tags.

If you have complete duplication of any element (page title, meta keywords, meta description) across your site then it is at best a wasted opportunity, but it may also hurt your ability to get your site indexed or ranked well in some search engines. Also, if you have the exact same information in the page title, meta description, and meta keywords areas, then that onpage duplication across elements through the "eyes" of a search engine at best makes you look like an ignorant webmaster, but it might also be a sign of low information quality or spamming. If you have a huge dynamic site and are forced to choose between duplication across major elements on a page, duplication from page to page, or just yanking an element (like the meta description or meta keywords tag), then you are usually better off yanking the element until you can find a formula that allows you to dynamically generate somewhat unique page level information.

I think sending duplicate information is in many ways far worse than showing nothing at all, and Matt Cutts recently said as much in a TW comment. I will yank the meta descriptions from Threadwatch pretty soon.

Many content management systems (like MovableType - which this blog uses) make the onpage header and page title the exact same as one another. In an ideal world you could have the option to make them different to help mix up your on page optimization (by allowing you to focus the page on a broader set of keywords) and your anchor text (as people often link at things using the official name as the link).

If you have a small hand crafted website then it is probably worth taking the time to try to make your content as unique as possible from page to page and element to element within those pages. Any time you have the chance to show that your content is hand crafted and unique that is a valuable opportunity, especially as the volume of search spam increases and spamming techniques evolve.
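Checking a site for the complete duplication described above is easy to automate. A minimal sketch that flags pages sharing an identical title or meta description; the sample `pages` dict and the naive regexes are assumptions (on a real site you would fetch your own pages and use a proper HTML parser rather than regexes).

```python
import re
from collections import defaultdict

# Deliberately naive patterns that assume simple, well-formed markup.
TITLE_RE = re.compile(r"<title>(.*?)</title>", re.I | re.S)
DESC_RE = re.compile(r'<meta\s+name="description"\s+content="(.*?)"', re.I | re.S)

def find_duplicates(pages, pattern):
    """Map each repeated element value to the list of URLs sharing it."""
    seen = defaultdict(list)
    for url, html in pages.items():
        m = pattern.search(html)
        if m:
            seen[m.group(1).strip()].append(url)
    return {value: urls for value, urls in seen.items() if len(urls) > 1}

# Hypothetical stand-in for HTML fetched from your own site.
pages = {
    "/a": '<title>Widgets</title><meta name="description" content="Buy widgets">',
    "/b": '<title>Widgets</title><meta name="description" content="Widget specs">',
    "/c": '<title>Gadgets</title><meta name="description" content="Buy widgets">',
}
print(find_duplicates(pages, TITLE_RE))  # → {'Widgets': ['/a', '/b']}
print(find_duplicates(pages, DESC_RE))   # → {'Buy widgets': ['/a', '/c']}
```

Any value that maps to more than one URL is either a wasted opportunity or a candidate for yanking, per the reasoning above.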