Google works so well because their approach is scalable, but they are not averse to paying people to review content quality; they love human computation, as their Google Image Labeler game shows. What if Google came up with ways to determine which users were real and trustworthy, and gave those users an incentive to edit the search results for Google? And what if Google could give a similar incentive to advertisers and legitimate publishers?
What if just by reading this you are helping Google trust this site more?
Attention Data:
Google is already the market leader in tracking attention data on the active portions of the web. What if Google integrated attention data into their algorithm and, to offset it, lowered the number of links they count in any time period, or the weight they put on them? What would that do to the value of link baiting? How can Google move away from links?
WebmasterWorld and Threadwatch both recently ran great posts about a Google patent application covering the removal of documents based on the actions of trusted users.
Google's Own Web Graph:
Google is setting up an alternate, proprietary web graph outside of linkage data. Sure, any single point of attention data may be gamed, but the signals become far more reliable when you triangulate them. And if a few data points fall outside the expected ranges for the associated site profile, Google can pay to have the data reviewed. Based on that review they can demote spam or further refine the relevancy algorithms.
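A rough sketch of how that triangulation might flag a site for human review: compare each attention signal against a peer group of similar sites and surface the ones that deviate sharply. The signal names, data, and z-score threshold here are all invented for illustration; nothing is known about Google's actual implementation.

```python
from statistics import mean, stdev

def flag_outliers(site_signals, peer_profiles, z_threshold=3.0):
    """Return the signals where a site deviates sharply from its peer group."""
    flagged = []
    for signal, value in site_signals.items():
        peer_values = [p[signal] for p in peer_profiles if signal in p]
        if len(peer_values) < 2:
            continue  # not enough peer data to judge this signal
        mu, sigma = mean(peer_values), stdev(peer_values)
        if sigma and abs(value - mu) / sigma > z_threshold:
            flagged.append(signal)
    return flagged

# A site with plausible bookmark and click counts but wildly inflated
# toolbar visits gets queued for review instead of being auto-demoted.
site = {"bookmarks": 120, "search_clicks": 900, "toolbar_visits": 50000}
peers = [
    {"bookmarks": 100, "search_clicks": 800, "toolbar_visits": 1000},
    {"bookmarks": 140, "search_clicks": 950, "toolbar_visits": 1200},
    {"bookmarks": 90,  "search_clicks": 700, "toolbar_visits": 900},
]
print(flag_outliers(site, peers))  # → ['toolbar_visits']
```

The point of the review queue is exactly that a single gamed signal stands out against the others, while a legitimate site moves all of its signals together.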
A Complete Feedback Cycle:
Google is the perfect shopping mall. Google...
- verifies the legitimacy of user accounts by looking at their history, comparing them to other users, and challenging them with captchas.
- tracks click-through rates and user activity, associated with user accounts and IP addresses.
- hosts a bunch of web content, which is syndicated on many channels, and can further be used to understand the topical interests of the account holders and reach of publishers.
- asks for searcher feedback on advertisements.
- allows people to note URLs using Google Notebook.
- tracks feedback after conversions.
- puts themselves in the middle of transactions with Google Checkout.
Why wouldn't they be open to using those and other forms of feedback to help shape relevancy?
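One way to picture the feedback signals above shaping relevancy is a simple weighted blend into a per-site quality score. Everything here is hypothetical: the signal names, the weights, and the normalization are invented for illustration, and a real system would learn such weights from data rather than hand-set them.

```python
# Invented weights over normalized (0..1) feedback signals.
WEIGHTS = {
    "ad_feedback": 0.25,             # searcher ratings of ads shown on the site
    "notebook_saves": 0.20,          # how often users save the site's URLs
    "conversion_feedback": 0.30,     # post-conversion satisfaction
    "advertiser_blacklists": -0.25,  # advertisers excluding the site (penalty)
}

def quality_score(signals):
    """Weighted sum of whatever feedback signals we have for a site."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

good_site = {"ad_feedback": 0.9, "notebook_saves": 0.8,
             "conversion_feedback": 0.85, "advertiser_blacklists": 0.05}
thin_site = {"ad_feedback": 0.2, "notebook_saves": 0.05,
             "conversion_feedback": 0.1, "advertiser_blacklists": 0.7}

print(round(quality_score(good_site), 4))  # → 0.6275
print(round(quality_score(thin_site), 4))  # → -0.085
```

The interesting design property is that the penalty signal (advertiser blacklists) pulls against the positive ones, so a site has to fake several independent audiences at once to score well.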
Opening up AdWords to display content partner URLs is probably a good example of Google adding an advertiser feature that also improves organic relevancy scores. If advertisers think a site is garbage and Google knows that most of that site's traffic comes only from search engines, it would be pretty easy to demote that site. AdSense publishers have also created blacklists.
Popularity vs Personal Relevance:
Google can triangulate all of these data points to see beyond how much hype an idea creates. They can understand user satisfaction and brand loyalty, which are far more important than short-term hype.
If 100 searchers with somewhat similar profiles to mine are loyal to brands x, y, and z then I am going to see them more often, even if those sites are not well integrated into the web as a whole.
Digital Identities:
Bill Slawski recently posted about Google's Agent Rank patent, and there is a push to create a distributed login system called OpenID. Google may not need a single login to track everyone; all they need is a large, representative sample of legitimate web users to form a pretty good idea of what is important.
As Bob Massa said, search engines follow people.
Personalization is not going to be what makes SEO harder. It is going to be linguistic profiling, attention profiling, community interaction, and layered user feedback that make it harder to promote garbage and easier to promote quality sites. I still see spammy link building working today, but I don't see it staying that way two years out.