Cats and Mice: The Shifting Sea of Search Results
Google can never show the most relevant results for everything. Whatever algorithmic loopholes they close, they inadvertently open others. And anything they trust gets abused by marketers.
- Search engines trust page titles and meta descriptions. Marketers stuff them full of keywords, so search engines move toward trusting on-page content. Marketers respond with hidden text and similar techniques.
- Search engines trust links. SEOs buy and sell them and build link farms, so search engines only let some sites vote, have some sites pass negative votes, and make certain votes count more than others.
- Search engines place weight on anchor text. SEOs abuse it, so search engines create filters for too much similar anchor text, then offset those filters by placing more trust on domain names that exactly match the search query.
- Search engines place weight on exact-match domain names, and domainers start developing nearly 100% automated websites.
- Too many new sites are spammy, so search engines place more weight on older sites. SEOs buy old sites and add content to them.
- Search engines place more weight on global link authority. Spammers find cross-site scripting exploits on .edu domains, and media sites start posting lead-generation advertisement forms on their sites.
- Bloggers are too easy to get links from, and comment links are easy to spam. Search engines introduce nofollow to stop comments from passing PageRank. Then Matt Cutts pushes nofollow as a way to get webmasters to use it on advertisements.
- Too many people are creating automated sites, with affiliates in particular churning them out in bulk. Search engines employ human reviewers, get better at duplicate content detection, and require a minimum link authority at the page level to keep deep pages indexed.
- Social news sites provide a sea of easy link opportunities and low-quality information, and too many people are doing linkbait. Perhaps Google will eventually only count so many citations within a given amount of time.
When your site's rankings change, it may not just be because of changes you made or changes in your field; it may also be because Google is trying to balance
- old sites vs new sites
- old links vs new links
- big sites vs small sites
- fresh content vs well linked pages
- exact match vs rough phrase match
- etc
Comments
SEOBOOK has helped me a great deal. However, trying to optimize in a sea of fakes and cons is destroying the net. People don't search deeply because they find irrelevant content. They think they have entered the wrong query or that the information does not exist, when all they find is spam and link bait.
The SEs MUST find a way to separate genuine content from garbage to restore the information seeker's faith.
I think the very reasons you cite are why the best SEOs tell clients not to focus on rankings, but on traffic patterns and conversion/monetization of that traffic.
Sure, ranking can help. But rankings are so fickle, shifting even hour to hour based on so many factors, that they're the wrong goal to chase.
"I'm on ur wirless mous...
"surfin' the netz."
I don't think too many people are link baiting. Most people have never heard of SEO, and those who have think it's due to either meta tags, traffic, or link exchange. That's not going to change for years.
Google certainly has its hands full. The funny thing is they are in a constant battle with SEO guys who understand how search engines and the Internet work.
Meanwhile, the person who just got an Internet connection a few months ago is not aware of any of this battle and is click-happy.
Maybe launching a non-technical campaign to educate users on what good content is would help their efforts in ways they never imagined.
As a newbie to this SEO business, that post definitely clarified a few things for me - so thank you Aaron (this site has been VERY helpful in my effort to learn more about SEO).
It just seems to me that Google and SEO/SEM-ers are in a never-ending battle of one-upmanship. Google will come up with new guidelines for rankings, and SEO-ers will come up with techniques to use those guidelines to their advantage.
I just don't see anything wrong with that - as long as the website being ranked is useful to someone out there. I know everybody hates spam, but the internet is pretty much founded on anyone being able to get whatever they have to/want to say out there.
I think Kyle made a good point about the fact that maybe it's not as important for Google to continuously tighten their rules to get more relevance in searches - but rather they should be educating users on how to tell good content from bad and how to actually USE Google search. I may be wrong, but I'm pretty sure the majority of users out there do not know just how specific you can get with a Google search - I know I'm STILL learning the ins and outs.
On another note, I don't think many sites out there are consciously aware of "link baiting" unless they have an SEO working with them - I think people just realize that now, as always, if you wanna get noticed, you gotta bring something to the table that will wow people. I hope Google doesn't begin to penalize rankings for this.
Every time Google tells marketers to do anything, they flock to it. Isn't everyone a blogger and linkbaiter now?
:)
All I could think at first was, "Aw, what a cute kitten." :)
It really doesn't matter how search engines rank websites. There will always be those who choose to exploit the process. The important thing is to distinguish practices that are bound to be patched from practices that have lasting value. In the end, providing visible, worthwhile content and a great user experience will invariably produce good results, no matter how the algos might change.