Google Using Search Engine Scrapers to Improve Search Engine Relevancy
If something ranks and it shouldn't, why not come up with a natural and easy way to demote it? What if Google could come up with a way to allow scrapers to actually improve the quality of the search results? I think they can, and here is how. Non-authoritative content tends to get very few natural links. Scraper sites republish search results and typically link to each listed page using its title as the anchor text. This means that if a page ranks well for competitive queries where bots scrape the search results, it will pick up many links with the exact same anchor text. Real resources that rank well will also tend to pick up some number of self-reinforcing unique links with different, mixed anchor text.
If the page ranks for the query because it is closely aligned with a keyword phrase that appears in the page title and the internal link structure, and is heavily represented on the page itself, then each new scraper link could push the page closer and closer to the threshold of looking spammy, especially if it is not picking up any natural linkage.
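If Google were doing something like this, one could imagine it scoring how concentrated a page's anchor text is. Here is a minimal sketch of that idea, assuming made-up backlink profiles and a made-up threshold; nothing here is a known part of Google's algorithm.

```python
from collections import Counter

def anchor_concentration(anchors):
    """Share of backlinks whose anchor text matches the single most
    common anchor. Scraper-driven profiles cluster near 1.0 because
    scrapers reuse the page title verbatim; naturally linked pages
    show a wider mix."""
    counts = Counter(a.strip().lower() for a in anchors)
    top_count = counts.most_common(1)[0][1]
    return top_count / len(anchors)

# Hypothetical backlink profiles for a page titled "Blue Widget Reviews".
scraped_page = ["blue widget reviews"] * 47 + ["widgets", "great post"]
natural_page = ["blue widget reviews", "this review", "Bob's widget site",
                "widgets", "a useful comparison", "click here"]

SPAM_THRESHOLD = 0.8  # made-up cutoff, purely for illustration

for name, anchors in [("scraped_page", scraped_page),
                      ("natural_page", natural_page)]:
    score = anchor_concentration(anchors)
    verdict = "looks scraper-driven" if score > SPAM_THRESHOLD else "looks natural"
    print(f"{name}: concentration={score:.2f} -> {verdict}")
```

Under this toy scoring, the scraped page gets flagged because nearly all of its anchors are identical, while the naturally linked page passes, which matches the distinction drawn above.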
How to Protect Yourself:
- If you tend to get featured on many scraper sites, make sure you occasionally change the page titles on your most important and highest-earning pages.
- Write naturally, for humans, and not exclusively for search bots. If you are creating backfill content that leverages a domain's authority score, try to write articles the way a newspaper would. If you are not sure what that means, look at some newspapers. Rather than paying people to write articles optimized around a topic, hire writers who do not know much about SEO, and ask them not to reuse the same templates for page titles, meta descriptions, and page headings.
- Use variation in your headings, page titles, and meta description tags.
- Filters are applied at different levels depending on domain authority and page-level PageRank scores. Gaining more domain authority should help your site bypass some filters, but it may also cause your site to be looked at with more scrutiny by other types of filters.
- Make elements of your site modular so you can react quickly to changes. For example, many of my sites use server side includes for the navigation, which allows me to make the navigation more or less aggressive depending on the current search algorithms (a sketch of the idea follows this list). Get away with what you can, and if they clamp down on a technique, ease off it.
- Get some editorial deep links with mixed anchor text to your most profitable or most important interior pages, especially if they rank well and do not get many natural editorial votes on their own.
- Participate actively in your community. If the topical language changes without you, it is hard to stay relevant. Having some input into how the market is changing helps you keep mindshare and helps ensure your topical language matches the market as it shifts.
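To make the modular-navigation point concrete: with server side includes, every page references one shared navigation file, so editing that single file changes the navigation sitewide. Below is a small Python sketch that emulates what Apache's mod_include does with the SSI directive at serve time; the file paths and nav markup are hypothetical.

```python
import re
from pathlib import Path

# Pages carry a standard SSI directive; normally Apache's mod_include
# expands it at serve time, so editing one nav file changes every page.
INCLUDE_RE = re.compile(r'<!--#include virtual="(?P<path>[^"]+)" -->')

def render(page_html: str, doc_root: Path) -> str:
    """Replace each include directive with the referenced file's contents."""
    def expand(match: re.Match) -> str:
        return (doc_root / match.group("path").lstrip("/")).read_text()
    return INCLUDE_RE.sub(expand, page_html)

# Hypothetical layout: make the navigation more or less aggressive by
# editing this one shared file.
doc_root = Path("site")
(doc_root / "includes").mkdir(parents=True, exist_ok=True)
(doc_root / "includes" / "nav.html").write_text(
    '<ul><li><a href="/">Home</a></li><li><a href="/widgets/">Widgets</a></li></ul>')

page = '<body><!--#include virtual="/includes/nav.html" --><p>Content here.</p></body>'
print(render(page, doc_root))
```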
Comments
This is a great look into another possible filter. I need to add this to my Google filter list.
Thanks Aaron!
While this would probably work, wouldn't it then be easy to exploit? I mean, how tough would it be to scrape your competitor's site a ton of times?
Also, why does Google need to introduce another measure that webmasters need to protect themselves from? Why should I need to change the titles of my pages? If it's not broken, don't fix it. Although with the way Google's been headed lately, it wouldn't surprise me to see them break another thing that I end up having to fix.
The goal is to make it hard to have reliable profit streams without creating real value. It forces you either to keep learning how to exploit holes in the system while protecting yourself from those who do, or to keep creating and sharing value, which increases the value of using Google.
They are thinking / hoping most people will opt for the latter.
If SEO is too complex to explain and too expensive to implement in a reliable, meaningful, predictable, and measurable way, Google would be happy.
I would imagine the latest algorithm tweaks to counter Googlebombing fit in with the nature of this post, particularly with regard to mixing up your anchor text.
I am thinking that the Googlebombing algorithm (as well as what Aaron has described) can and will hit relatively unpopular websites that have only a little content published on other sites, all of it with the same anchor text pointing back at them.
Aaron, does this mean we can't use scrapers for easy links? And if we do, is that black hat SEO?
You can use scrapers for easy links, but Google doesn't have to count those links, and if your ratio of bad links to good links is too high it might hurt your ability to rank, as noted in the above post.