If Links Didn't Matter...
David Berkowitz recently wrote an article asking what would happen if links lost their value. Over the past year real editorial links have only increased in value, as Google has been more aggressively requiring some minimum PageRank threshold to even index a page.
Many types of links have lost value as Google has gotten better at filtering link quality, but will editorial links ever lose theirs? To answer that you have to realize that links have value because they are typically a proxy for trust based on social relationships or human judgement.
But links are openly gamed today and there are an increasing number of affordable marketing techniques that allow virtually any site to garner hundreds or thousands of quality links.
One day Google might come up with better ways to determine what to trust, but if they do, it is going to be based on who humans trust more, and who amongst those trusted sources does the best job of providing editorial value and noise filtering on their site. And this internal site filtering will become even more important as many hub sites leverage their brand and allow communities to contribute content to their sites.
There is one part of David's article that I think is off though, and that is the part on keyword density:
Keyword density, the imperfect science of including just enough of the most important keywords on any given page without spamming the search engines, becomes more important than ever.
I don't think keyword density will be the answer to anything. I think a more appropriate phrase might be linguistic and attention based profiling.
Attention Profiling:
If links (and link acquisition rate) are a sign of quality, then likely so are RSS subscribers and RSS readers, as well as brand related search queries, custom search engine entries, instant message mentions, email mentions, and repeat visitors. Those are a few examples of attention based profiling.
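To make the idea a bit more concrete, here is a minimal sketch of how a few of those signals might be rolled into a single attention score. The signal names, the weights, and the simple weighted-sum approach are all illustrative assumptions on my part, not anything the engines have described:

```python
# Hypothetical attention-profiling sketch: combine several attention
# signals into one score. All signal names and weights are illustrative
# assumptions; a real engine would use far richer models.

ATTENTION_WEIGHTS = {
    "rss_subscribers": 0.25,   # feed subscriptions
    "brand_searches": 0.30,    # queries containing the brand or site name
    "repeat_visitors": 0.25,   # visitors who keep coming back
    "mentions": 0.20,          # email/IM/custom search engine mentions
}

def attention_score(signals: dict) -> float:
    """Weighted sum of normalized attention signals (each 0..1)."""
    return sum(ATTENTION_WEIGHTS[name] * signals.get(name, 0.0)
               for name in ATTENTION_WEIGHTS)

# Example: a site with strong brand search volume but few RSS readers.
print(attention_score({
    "rss_subscribers": 0.1,
    "brand_searches": 0.8,
    "repeat_visitors": 0.6,
    "mentions": 0.4,
}))
```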
Linguistic Profiling:
If you are the person that people are talking about then you are also going to help shape your topic's language. You may make up many of the new words used in your industry and your name may even be a core keyword in your industry.
You are not going to match your industry's language better than the competition by obsessing over keyword density. The way you beat them is to earn more market attention and work your business and name into the industry language.
Comments
Some of the deductions seem too far-fetched, especially the bit about 'social media'. While I agree that people will lose interest in 'optimizing' (stuffing?) anchor text for a link and mostly generic terms will be used, 'optimizing' a link for traffic will always continue - unabated. I remember, way back in 1993 when I first started browsing the net, there was no Google, no Yahoo for me. Yet I found what I looked for because links were there! (And I clicked on better-placed highlighted links more often, btw.)
I don't think links will ever be completely discounted by the search engines. The only way they could do that is if they created an algorithm that was capable of reading pages and accurately judging their quality, relevance and usefulness. That kind of technology is still far off on the horizon.
Linguistic profiling is a very interesting concept and a logical extension, but I must admit that I hadn't really thought about it before.
Especially in the case of the big G, I think digitizing tons of authoritative books will give them an incredible ruler to judge sites by... if you have a source of credible writing on a subject, what better to judge future writing against? To some degree, anyway.
Of course writing online is different than traditional print, and that may be exactly where this idea comes into play. It's more than just saying that "this" site, which is a trusted authority, is talking about some topic and using some keywords, therefore "that" site must be okay... weighing all the other metrics of authority, of course.
But now you are adding in an entirely different metric... do they use the same language, similar style, and discourse, all without tripping duplicate content filters?
After all, it is classic group theory... those within a group are more likely to walk and talk and interact in similar manners. And the web is nothing more than networked, grouped interactions, so why not apply traditional group theory, linguistic analysis, and discourse analysis alongside all of the other metrics? Web anthropology... I have to imagine it will be, if it isn't already, a major area of college study by 2010.
I love the idea of attention profiling, at least in terms of RSS readers. Email and IM mentions might raise privacy issues, but it would be really valuable if they could get around them.
I think this comes down to the limitations of purely algorithmic search, and the power of mixing algorithmic search with social user-to-user and user-to-website interaction.
If the engines can, as you suggest, start looking at HOW we work in our everyday lives (Google Desktop anyone?), seeing WHO we interact with, WHERE online we go, WHY we go there, WHEN we go there, and WHAT value we get from visiting those sites (information, community, inspiration), then we'll start seeing truly un-gamable search engines.
At that point, it ceases to be about having the right keyword density or having great links; it's about having the right visitors. Which comes down to having the right, relevant links from the right places, and the right content, coupled with correct advertising and marketing to get the right people to your site.
A lot of variables to manage. And as I've been talking to other people about this, they seem to be agreeing with what I've been saying. And asking the next logical question:
What do I do to be ready for search 2.1?
The answer, of course, is simple. It's the same advice we've always given people: write great content, on theme.
Write the right content, and market well, and people will come, and then keep coming back.
To Pete: This might also be why Google released such a powerful free tool as Google Analytics.
Imagine all that data at your/Google's fingertips... It must really produce invaluable information on cross-site user behavior.
How does direct access figure into this? As a blogger, half of my visitors come directly to the site, often as word of mouth from another reader.
Can that be worked to increase authority/relevance in any way?
Personally I think that editorial links will be devalued over time. Aaron is spot on about gaming the market. I think that keyword density is important with today's algo, but that time will soon pass. It's really just a matter of time before Google starts to group keywords into specific sections. I do not know how many people have noticed, but Google has started to add about 10 additional keywords at the bottom of a search phrase for people to click on. They have to be doing it for a reason, and my guess is to track the click-throughs and see which grouping receives the most CTR so they can start to confirm things, such as "keyword1 = keyword10 and keyword4 more than the rest, so we should include those in the serps for keyword1." I have started to see more and more people saying that serps are beginning to show ranked pages that do not even have the keyword on the page. Some have even said there is no backlink data that shows that anchor text either. Of course I am beginning to ramble on here :P
From a user perspective, however, a link is only as valuable as the content to which it links. If an SEO is able to create keyword-specific copy that is very useful and readable, solid editorial linking, I believe, will still carry its weight.
SEO writing 101: Start with good, user-friendly content, and everything else follows.
I don't think Google could ever change the effectiveness of that staple.
The number of RSS subscribers can be faked, IMO.
The rest is plausible, but not tangible enough to actually envision a move away from links-based rankings.
However, I think it's an excellent question and one that should be asked more and more.
Sure the number of RSS subscribers can be faked, but will the people faking that also be faking post reads on Google Reader? And natural related read profiles? And Google account history (including search and Gmail)?
I'm curious if/when the algorithms will be enhanced to start learning from user activity rather than just reporting on it. An example I can give: you have an online shopper who submits a credit card. It gets denied: stolen card. My code blocks that IP address for five minutes (can't block forever since they get recycled), the email address, and any other information I can tie to that transaction. If another transaction comes in within the next X minutes with any of those same values, I then block any new associated information submitted (e.g. same email, different credit card, different IP, etc. - all now blocked also). I continue to drill this person/process/server into the ground, making it next to impossible for them to game my system. Essentially any new data they submit to my system only helps me block more bad data - they will eventually give up, and the data locks will be released over time.
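Something like the sketch below captures the idea; the field names, the time windows, and the in-memory block list are just illustrative assumptions, not my production code:

```python
# Sketch of the "drill them into the ground" blocking idea: any value tied
# to a bad transaction is blocked for a while, and any new transaction that
# reuses a blocked value gets all of *its* values blocked too. Field names
# and time windows are illustrative assumptions.

import time

BLOCK_SECONDS = {"ip": 300, "email": 3600, "card_hash": 3600}
blocklist: dict[tuple[str, str], float] = {}   # (field, value) -> expiry time

def _block(txn: dict) -> None:
    """Block every field/value pair in this transaction for its TTL."""
    now = time.time()
    for field, value in txn.items():
        blocklist[(field, value)] = now + BLOCK_SECONDS.get(field, 300)

def record_fraud(txn: dict) -> None:
    """Call when a transaction is denied, e.g. a stolen card."""
    _block(txn)

def is_blocked(txn: dict) -> bool:
    """True if any field of this transaction is currently blocked.
    If so, block all of its other values as well (guilt by association)."""
    now = time.time()
    hit = any(blocklist.get((f, v), 0) > now for f, v in txn.items())
    if hit:
        _block(txn)   # new data they submit only helps us block more
    return hit

# Usage: a denied card blocks its IP and email; a retry with a new card and
# new IP but the same email is blocked, and the new values get blocked too.
record_fraud({"ip": "203.0.113.7", "email": "a@example.com", "card_hash": "abc"})
print(is_blocked({"ip": "198.51.100.2", "email": "a@example.com", "card_hash": "def"}))  # True
```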
The problem I have is that the white hats are continually abandoning solid ideas just to escape the black hats - those abusing the system. Instead of continually being reactive in nature, when will we be able to build on solid foundations rather than continually picking up camp and starting over?
.02
"... there are an increasing number of affordable marketing techniques that allow virtually any site to garner hundreds or thousands of quality links...."
Any suggested non-black-hat resources?