Hidden Content Costs & Understanding Your Profit Potential Per Page
Ever since Google got more selective about what it will index, the model for profitable SEO has shifted from throwing up pages and hoping some of them are profitable to putting more strategy into what you are willing to publish.
The Supplemental Index Hates Parasitic SEO:
Each site will only get so many pages indexed given a certain amount of link authority. Each of those pages will rank based on the domain's authority and the authority of the individual page, but every page needs a minimum authority score to get indexed and stay out of the supplemental results. This is how Google is trying to fight off parasitic SEO.
Given that many people are leveraging trusted domains, if you have one it makes sense to leverage it thoughtfully. CNN will rank for a lot of queries, but it does not make sense for Google to return nothing but CNN. It is good for the health of Google's search results to have some variety. This is why smaller sites can still compete with the bigger ones: Google needs the smaller sites to add variety and to have leverage over the larger sites, to keep the larger sites honest if they get too aggressive in leveraging their authority or have holes that others are exploiting.
Extending a Profitable Website:
If you have a 100 page niche website you may be able to expand it to 500 pages without seeing much of a drop in revenue on those first 100 pages, but eventually you will hit a point where the cost of additional content (in the link authority it pulls from other pages on your site) nearly matches the revenue potential of the new pages. And at some point you may see revenues drop as you add more pages, especially if you are not doing good keyword research, have bad information architecture, create pages that compete with other pages on your site, are not actively participating in your market (gaining links and mindshare), or are expanding from a higher margin keyword set into a lower margin one.
The solution is to build editorial linkage data and stop adding pages unless they have net positive profit potential.
What are the costs of content?
- the time and money that went into creating it
- link equity (and the potential to be indexed) that the page takes from other pages
- the mindshare and effort that could have been used doing something potentially more productive
- the time it takes to maintain the content
- bad or off topic content, or anything that causes people to unsubscribe, hurts conversion rates, or lowers your perceived value, is a cost
How can a Page Create Profit?
- anything that leads people toward telling others about you (links or other word of mouth marketing) is a form of profit
- anything that makes more people pay attention to you or boosts the credibility of your site is a form of profit
- anything that thickens your margins, increases conversion rates, or increases lifetime value of a customer creates profit
- anything that reduces the number of bad customers you have to deal with is a form of profit
Mixing Up Quality for Profit Potential:
I am still a firm believer in creating content at various quality and cost levels, using the authoritative content to get the lower quality content indexed, and using the lower quality content's earnings to finance the higher quality ideas. But rather than thinking of each page as another chance to profit, it helps to weigh the risks and rewards when mapping out a site and its structure.
Increasing Profit:
Rather than covering many fields broadly, consider going deeper into the most profitable areas by:
- creating more pages in the expensive niches
- making articles about the most profitable topics semantically correct, with lots of variation and rich unique content
- heavily representing the most valuable content in your navigational scheme and internal link structure
- creating self reinforcing authority pages in the most profitable verticals
- asking visitors to add content to the most valuable sections, or to give you feedback on what content ideas they would like to see covered in those sections
- if your site has more authority than you know what to do with, consider adding a user generated content area to your site
Take Out the Trash:
If Google is only indexing a portion of your site, make it easy for them to index your most important content. If you have an under-performing section on your site, consider:
- de-weighting its integration in the site's navigational scheme and link structure
- placing more internal and external link weight on the higher performing sections
- if it does not have much profit potential and nobody is linking to it, you may want to temporarily block Googlebot from indexing that section using robots.txt (a quick sketch follows this list), or remove the weak content until you have more link authority and/or a better way to monetize it
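For example, if the weak section lives in its own directory you can keep Googlebot out of it with a couple of lines in robots.txt. The /coupons/ path here is only a placeholder for whatever your under-performing section happens to be:

    User-agent: Googlebot
    Disallow: /coupons/

Keep in mind this only stops Googlebot from crawling that directory; add a User-agent: * block if you want the same treatment from the other engines, and remove the rule once the section has enough link authority to pull its weight.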
Comments
Some solid advice. I am constantly amazed at how often people forget about proper site planning.
Anyway, the bottom of the post has some huge font; I wonder whether that's intentional. I figure it would have a similar but less scary effect if the font were at least 18-20px.
What about sites that have forums? The main site might have lots of great content but the forum is full of content that is user generated and not necessarily as good.
If it is important to get all your main content indexed, put the forums on a subdomain to clearly divide them from your other content. Also prevent forum profiles and other pure noise pages from being indexed.
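One way to keep those profile pages out of the index (just a sketch; the exact template file depends on your forum software) is to drop a robots meta tag into the profile template, so the pages can still be crawled but do not get indexed:

    <!-- in the forum's member profile template -->
    <meta name="robots" content="noindex,follow">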
That is a good point - ensuring the user profiles don't get indexed. How do you suppose I do that? Would that be a case where I use the nofollow attribute on links that lead to user profiles?
I do use sitemaps to highlight the content I want indexed, but that doesn't mean the automated bots don't go ahead and follow links within my site.
Another problem I think I'm having is that my sitemap (http://www.madtownlounge.com/sitemap.asp) used to list all the venues and band profiles on my site. Well, I now have 8,000 venue profiles, but my sitemap is my most often accessed page! I don't want that because it's a very long list and users typically leave (65% exit rate). What I do want, however, is to get the actual venue profiles indexed - that was why I had set up my sitemap like that in the first place - to lead the bots to the venue profiles. I suppose I can again rely on my Google Sitemap(s) to tell the bots what to index - which I already do actually.
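The entries I submit through Google Sitemaps look roughly like this (the venue URL pattern here is just illustrative), pointing the bots straight at the profile pages instead of leaning on my HTML sitemap page:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.madtownlounge.com/venue.asp?id=123</loc>
        <changefreq>monthly</changefreq>
      </url>
      <!-- ...one url entry per venue profile... -->
    </urlset>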
Thanks Aaron, good advice about Taking Out the Trash.
Allen McGuire, as Aaron said you can simply block Googlebot from indexing your trash sections using robots.txt.
SkGold,
I do use robots.txt, but I can't do that in this case. In my case, I have user and music profiles all considered 'profiles' in general, with a bit flag being the only difference between the two. I need to explicitly tell Google which profile links to follow and index when crawling my site, and which ones to ignore. From what I can tell, this would be best done by using 'nofollow' on internal user profile links.
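Something like this is what I have in mind for the internal links that point at user profiles (the URL here is just an example of the pattern), possibly paired with a robots noindex meta tag on the user profile template itself:

    <a href="/profile.asp?id=4321&amp;type=user" rel="nofollow">view user profile</a>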
Staying on point, however, what I'm looking to do is cut down on the indexing of trash and really focus on what I need to get indexed.
Aaron
Thanks a lot, really good info here.
How about writing another post which expands on one of the concepts -
"How to create self reinforcing authority pages in your most profitable verticals"
Regards
Alex
Interesting post. Thanks for shedding more light on the idea that site structure does play a part in how successful your site will be. As Yuri pointed out above, site planning and structure are an important part of SEO.
I agree with you on the idea of mixing quality levels and content, but I hadn't been thinking of it in the same terms you describe here so you've given me more to think about. Always a good thing in my book.
I'll also give a vote for Alex's suggestion for a post expanding on the idea of self reinforcing authority pages.