Search Engines Giving You the Tools to Kill Yourself
Many publishers hide additional information sections that visitors can choose to view if they show interest in the topic. For example, each of Think Progress's navigational sections is expandable, and some publishers use "more information" links or other informational cues to make additional page content visible. These can be used deceptively, but if you have a strong brand and are trying to use them with the end user in mind, I doubt search engines will think the intent is bad.
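A minimal sketch of such an expandable section, assuming a hypothetical "More information" link that toggles a hidden block with a line of JavaScript (the id and copy are placeholders):

    <a href="#" onclick="document.getElementById('more-info').style.display = 'block'; return false;">More information</a>
    <div id="more-info" style="display: none;">
      Additional detail that is only shown when the reader asks for it.
    </div>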
AdSense Section Targeting:
As search has taken a larger and larger piece of the web, search engines have given us ways to mark up our pages to suit their needs. AdSense section targeting made it easier for Google to target content ads to your site. That sounds like a good idea, but they also offer tags that provide publishers no value.
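For reference, section targeting works by wrapping parts of the page in special HTML comments that tell the AdSense crawler which content to emphasize or ignore when matching ads; a rough sketch of that markup (the paragraph text is just a placeholder):

    <!-- google_ad_section_start -->
    <p>The article content you want the ad crawler to weight when choosing ads.</p>
    <!-- google_ad_section_end -->

    <!-- google_ad_section_start(weight=ignore) -->
    <p>Boilerplate or off-topic text you would rather the ad crawler ignore.</p>
    <!-- google_ad_section_end -->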
Google's NoFollow:
Nofollow was originally recommended as a way to stop blog comment spam, but it has morphed into a tag that Matt Cutts wants you to use on any paid or unnatural link. What makes a link unnatural? In one form or another almost everything is paid for, whether by giving away value, exchanging currency, or through nepotism.
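The attribute itself is just an extra rel value on an anchor tag; a paid or otherwise "untrusted" link marked up per that recommendation would look roughly like this (the URL and anchor text are placeholders):

    <a href="http://www.example.com/" rel="nofollow">sponsor's anchor text</a>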
Do You Trust Yourself?
If a page has many nofollow tags on it, isn't that another way of saying that the publisher does not trust their own content? If a publisher says that they don't trust their own content or their own advertisers, then why would search engines (or savvy webmasters) want to trust them?
The Machine is Broken:
Bob Massa recently highlighted how absurd the current use of the nofollow attribute is:
Mr. Cutts, speaking on behalf of Google presumably, made the comment, "if you want to buy links just for traffic, totally fine just don’t do it so they affect search engines".
This concept is completely flawed. This self-serving philosophy is also at the very core of the problem. When the machine attempts to modify the behavior of people to satisfy its own ends, the machine is broken. What people do should not be seen as affecting the search engine. What people do should be the very reason for the engine to exist in the first place. If the search engine is being affected by the actions of people, is any logical person going to honestly assume that it is the people that are broken? That is exactly what is happening here.
Yahoo!'s Robots-Nocontent Attribute:
Search engines have gotten better at identifying duplicate content. Some search engines may strip obvious boilerplate navigational elements from pages. Some may place pages with too much duplicate content in supplemental results. Some may put sites with too much duplicate content in a reduced crawling status.
Despite all of these ways to fight off content duplication, Yahoo! now offers a robots-nocontent tag as well. One of the first people to comment on the news was Google's Matt Cutts, who said:
Danny, can you ask how Yahoo intends to treat links in the "robots-nocontent" section?
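As Yahoo! described it, the attribute is applied as a class value on an existing element to mark a block their crawler should not treat as part of the page's content; a rough sketch (the disclaimer text is a placeholder):

    <div class="robots-nocontent">
      Repeated legal disclaimer or other boilerplate you do not want Yahoo! to index
      or use when building the page abstract.
    </div>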
Don't Use the Robots-Nocontent Attribute:
It might be easy to add class="robots-nocontent" to some of your divs, but should you? I think it has little value. Sure you could use it in a sneaky way, as suggested by Jay Westerdal, but the problems with that are:
- it looks sneaky
- you are removing content from your pages (and will thus rank for fewer phrases)
- there are easier and more effective ways of changing the meaning of a page without looking so sneaky...like just rewriting an article, adding a spammy comment that looks like it came from a third party, or adding a few additional words here or there.
Yahoo! is the top network of sites on the web. Internally they have publishing teams and an SEO team. If their search engineers can't figure out how to use their own internal traffic stats and other relevancy measurements to refine their duplicate detection algorithms, they deserve to bleed market share until they no longer have relevancy in the marketplace.
How to Change the Focus of a Page Without Using Robots-Nocontent:
If you want to change the focus of your pages, here are some of the best ways to do it (a brief markup sketch follows this list):
- Ensure your page title and meta description are unique. Do not place the same words at the start of every page title on all the pages of a new website.
- Make your h1 headings and subheadings target a slightly different word set than your page title.
- If your page is thin on content, add more relevant unique content to the page. The solution to not getting killed by duplicate content filters is adding more unique content, not stripping out obviously required duplication (such as navigation and advertisements) that search engines should be able to figure out on their own.
- If your site has comments or consumer feedback you can post or encourage feedback that targets other keywords. Comments offer free text. A 500 word page with an additional 1,000 words in the comment section may rank for 2 or 3 times as many search queries. Don't throw away the free content.
- For those who are really aggressive and have crusty links that will never be removed, consider placing your commercial messages on one of your highly trusted high ranking pages. People buy and sell websites, who is to say that the contents of a URL can't change?
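As a brief sketch of the first few points, here is a hypothetical product page with a unique title and meta description, and an h1 that targets a slightly different word set than the title (all names and copy are made up):

    <head>
      <title>Blue Widget Reviews &amp; Buyer's Guide | ExampleStore</title>
      <meta name="description" content="Independent reviews of the top blue widgets, with pricing, durability notes, and buying tips.">
    </head>
    <body>
      <h1>Which Blue Widget Is Right for You?</h1>
      ...
    </body>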
Comments
I actually think having the option of using robots-nocontent is positive.
On my own site I have used JavaScript to hide e-mail forms and the like from bad and good bots. Yet there are other occasions, such as the use of Microformats [http://microformats.org/], where a page may contain functional HTML which is not suitable for search engine indexing and retrieval.
If nocontent were an industry standard, there wouldn't be a need to add extra JavaScript to keep such code out of search engines. (That Microformats are still controversial is another issue for another day).
My full take on the issue:
http://www.antezeta.com/robots-nocontent.html
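A minimal sketch of the JavaScript approach the commenter describes, writing an email contact link into the page client-side so that crawlers (which do not execute scripts) never see the address in the raw HTML; the address and element id are placeholders:

    <span id="contact"></span>
    <script type="text/javascript">
      // Assemble the address in script so it never appears in the served HTML.
      var user = 'info';
      var domain = 'example.com';
      document.getElementById('contact').innerHTML =
        '<a href="mailto:' + user + '@' + domain + '">' + user + '@' + domain + '</a>';
    </script>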
Great to see an SEO also *not* supporting this attribute.
... and I always love to post a bit of the developer's view on this blog :-)
Summary: It is not the business of a search engine to start telling us how to markup our own websites and content.
Follow Google's example and leave web standards and accessibility up to us... just concentrate on finding the sites more efficiently!
"Summary: It is not the business of a search engine to start telling us how to markup our own websites and content."
That's a weak argument considering no one objects to the NOODP META tag, the META keywords tag, or the META description tag.
@halfdeck - sorry I should probably expand on that one...
I wrote an article in the past about how I thought it was Google's responsibility (seeing they have so much influence) to push accessibility in the industry.
I then talked with many Google engineers and came out seeing their side. It's not their business to meddle in accessibility and best-practice web development affairs. It's actually the industry, and each individual business, that should be pushing that... Google should just concentrate on what they do best: indexing the internet.
So... my reference is to search engines dictating how we *markup* our websites. What you are referring to is meta-data, which I believe is actually very specifically about the information / searchability of the document.
This information *does* help the search engines do their job.
I noticed at the bottom of the seomoz/blog there are now 116 nofollow links. As a navigation tool I find them useless, as they give no indication as to where you might end up, and the bots are told they are untrusted links!?
I have no clue what a site might achieve with this, but surely this is a complete distortion of its usage?
Dave
Aaron, I think you're too close to SEO (a bit far from publishing). I might want to show syndicated content to my visitors, but not have it dilute my content or alter my context. I might want to strategically deploy the nocontent label within my IA process, and then the class works ok. If the SEs want to provide additional granularity for those who want to make use of it, why is it "bad"?
Your focus suggestions above are SEO-style. Add a spammy fake comment to your post? C'mon. Let's see you sell that idea to a platinum brand client. And "add more content" is not always helpful, although it is obvious.
As for gaming the system, of course that's always possible. But the SEs can ignore as easily as follow the attributes... and compare or contrast. Nothing new here.
Hi John
I have actually had platinum brand clients recommend far shadier things to me.
If you don't want the syndicated content to be spidered by search engines, you could use JavaScript (a quick sketch follows this reply).
If it is possible to game the nocontent system then people will do it, and the search engines will have to index the pages as if nocontent didn't exist anyway.
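A minimal sketch of that approach, assuming the syndicated article is available as an HTML fragment at a hypothetical URL; because crawlers that do not execute JavaScript never run the script, the content reaches readers but stays out of the indexed copy of the page:

    <div id="syndicated"></div>
    <script type="text/javascript">
      // Fetch the syndicated fragment after the page loads and inject it,
      // so visitors see it while the crawled HTML does not contain it.
      var xhr = new XMLHttpRequest();
      xhr.open('GET', '/partials/syndicated-article.html', true);
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
          document.getElementById('syndicated').innerHTML = xhr.responseText;
        }
      };
      xhr.send();
    </script>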
I love reading Aaron's articles... always very informative and always the first among the SEO blogs. I check the site every day to see if there is new info... keep up the good work Aaron... you rule... :P
Hi,
I used to read Aaron's articles; they have excellent information and are very helpful to me.
Keep sending more information.
There is another Indian local search engine (ilaka.in), where we can search for Hyderabad, India hospitals, movie theaters, schools, banks, shopping malls, etc., and get addresses with route maps.
I hope it is very helpful to people in India.
Cheers
Vijay
Hi every one,
I'm new to the web marketing world. And I'm French...
I wonder if you can explain to me the drawbacks that can occur when using section targeting?
I'm currently in charge of a WordPress website and I used the Google Ad Wrap plugin to target the section where the right content is, in order to make the AdSense conversion rate higher. Does that make sense? Do you think it can be effective? What are the drawbacks?
Thank you.