Noam Chomsky's Manufacturing Consent

Some people call Noam Chomsky a conspiracy theorist, but I tend to think that is just a label used to discourage institutional analysis, which is exactly the point Chomsky makes in Manufacturing Consent, a film offering an institutional analysis of mainstream media bias. He also wrote a book by the same name that I still need to read.
Some of the underlying ideas that Noam frequently conveys in many of his interviews are:

  • Creativity is a fundamental need for humans.

  • The military (among other purposes) is in many ways an extension of technological institutions.
  • Authority should be challenged as to its necessity. If it does not prove useful it should be discarded as a source of power. Self-regulating positive and negative market forces will keep most aspects of a market range-bound and organized. This line of thinking comes up many times in A Thousand Years of Nonlinear History.
  • Power sources which are largely funded by a small group of people will be biased toward promoting the interests of that small group of people. Self preservation is a key goal of any institution.

He then goes into a bit of information about thought control:

  • In a totalitarian government you do not need much public support to do whatever you want. In societies with more freedom you must set up a framework for controlling thought which makes it easier to control the people.

  • "Democracy requires free access to ideas, information, and opinion." When you hear politicians pushing laws to regulate the web to save the children make no mistake that first and foremost they are pushing to create a fragmented, filtered, and imperfect information source and network which keeps power in the hands of those who are already powerful.
  • Controlling people requires "necessary illusions and emotionally potent oversimplifications." This is part of the reason there is a left and a right side to a story. Create these arbitrary pigeonholes for how people should think, attach people's identities to them, and hopefully they will not think beyond the categorization that already speaks and thinks for them.

Media shapes public opinion via:

  • selection of topics

  • distribution of concerns
  • emphasis
  • framing of issues
  • filtering of information
  • bounding of debate within certain limits

He then talks about the concentration of power and bias of interest toward businesses associated with some publishing formats:

  • Most large distribution news publishing formats are owned by a small group of elites who are tied to other large business interests.

  • The AP and a couple of other traditional news sources have an oligopoly over the mainstream news market. Some newspapers, like the New York Times, distribute a brief of the contents of their next day's paper to other newspapers to help set the daily agenda.
  • Many (perhaps most) newspapers consist more of ad space than news, and thus in many ways the advertiser is more of a customer than the reader. While the web and search give individuals more opportunity (you would never be reading anything I write without them), search engines struggle with balancing this same issue, and are favoring old media by doing things like trusting certain sources to seed vertical search and overemphasizing core domain authority in their algorithms. Google has also recently started paying large traditional content providers, including News Corp., MTV, and the Associated Press. They also purchased a portion of Time Warner's AOL. The WSJ recently published an article highlighting that Google believes content partnerships are a key to long-term growth.

Some types of information are created or promoted because they teach people not to think, not to question authority, or to rally behind a common pointless cause.

  • Sports and many other forms of news and entertainment are useful for driving the masses away from issues of importance to their lives and for building "irrational attitudes of submission to authority."

Some publishing formats (like 30-minute television shows) work great because they segment audiences and require answers to fit in a 20-second window.

  • Distribution via channels segmented by concision requires you to convey thoughts quickly.

  • In limited time slots, it is hard to break new ground or get beyond conventional thought patterns previously formed by others. If you say things outside of the normal realm of thought you do not have enough time to state your reasoning behind your words, and thus can be misquoted or taken out of context and made to look like an idiot.
  • If you say something outside of the norm, like "education is a system of imposed ignorance" then you have no time to explain what that means, and end up sounding like you heavily bought into education. ;)

Noam Chomsky then went through a startling example of clear and overwhelming media bias.

In 1975 Indonesia invaded East Timor. The story got a bit of press because the business community was interested in what it meant for the Portuguese empire. As the killing reached genocide levels in 1978, US mainstream media coverage of the story dropped to zero. The US provided Indonesia with most of the arms used in the mass human rights violations and mass murders.

A declassified memorandum of a July 1975 conversation between President Gerald Ford and then-Indonesian President Suharto demonstrates clearly the extent of US support: Ford asks Suharto bluntly, "How big a Navy do you have and how big a Navy do you need?"

Around the same time the US heavily bombed Cambodia. The civilian deaths were not given firm numbers in the media until the Khmer Rouge gained power, at which point the US mainstream media started throwing out words like genocide and numbers like 2 million dead within a couple of weeks.

People manipulate systems (as an SEO that is sorta what I do). In much the same way that most people are kept in the dark about SEO or public relations the same is true for just about any type of publishing or marketing business model. But, as Upton Sinclair would say:

It is difficult to get a man to understand something when his salary depends upon his not understanding it.

I like learning about power, authority, and publishing business models because

  • if you know the flaws in other business models it is easy to build business models that eat at those flaws or revolve around markets they would never want to be in

  • on the web everything is so scalable that if you have a really great idea it can go far, especially as people learn to trust software programs and other consumers to help them make decisions which once relied on friends or traditional intermediaries.

The Need for a Credible Guide

A friend of mine is creating a how-to website. I recently wanted to purchase an item related to his field, so the first thing I did was go to his site to look for information about the product I wanted to buy. His article on the topic was quite thin, and perhaps even looked like something I might have written, which is bad since I know nothing about the topic.

If someone is buying a cheap accessory then you might not have to sell much, sell hard, or go into much depth to convert them. If someone is buying something

  • that they know little about

  • is expensive (in terms of opportunity cost - time, money, other factors, etc.)
  • is hard to return

then you might have to provide more information to be able to build up enough trust to sell to them.

When creating an affiliate database-driven site it is easy to give thousands of items the exact same weight, but if you can instead answer one or a few questions far better than anyone else does, it is much easier to create a long-term stable income stream. Plus, if most competing sites consist primarily of thin compacted data and your sales information and product guides are link-worthy, that provides a huge marketing benefit.

Also consider that as search engines dip further into vertical search and more thin compacted-data sites are created, one needs to provide better or more unique and compelling information to be citation-worthy.

How Many Link Opportunities Come From 1 Story?

The AOL data release story has been circulating for a while now, and in spite of tech bloggers being quick with stories, people are still getting a bunch of links from it. Here are some of the link opportunities around the idea:

  • data was released

  • privacy concerns
  • data was removed, here is a mirror
  • free online keyword tools
  • creepy searchers and creepy searches
  • the searcher covered in the NYT
  • CTR by position
  • most likely? other statistical analysis

If you see an idea that is quickly spreading on the meme trackers there may be many additional link opportunities if you can give the story a unique spin or see other related ideas that people may think are important.

Clickthrough Rate by Search Result Ranking Position

AOL has a bias toward consumer (i.e., non-B2B) type queries, and they may have a higher percentage of brand-related searches; both factors place a bit more emphasis on the top search result than you would see from general searchers on other search engines. Still, some people have dug through the 20 million search queries AOL gave away and come up with some stats.

SEO Blackhat dug through the numbers and has a free tool for estimating clicks based on search volume estimates and rank position.
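Tools like that boil down to multiplying projected search volume by an assumed clickthrough rate for each ranking position. Here is a minimal sketch of the idea in Python; the CTR table is illustrative placeholder data (not the actual AOL-derived figures), and `estimate_clicks` is a hypothetical helper, not the tool's real code.

```python
# Illustrative CTR-by-position table. These are placeholder values,
# not the real percentages derived from the AOL data.
CTR_BY_POSITION = {
    1: 0.40, 2: 0.12, 3: 0.08, 4: 0.06, 5: 0.05,
    6: 0.04, 7: 0.03, 8: 0.03, 9: 0.03, 10: 0.03,
}

def estimate_clicks(monthly_searches, position):
    """Estimate monthly clicks for a keyword at a given ranking position.

    Positions outside the table (below the first page) are assumed to
    receive effectively zero clicks.
    """
    return int(monthly_searches * CTR_BY_POSITION.get(position, 0.0))

print(estimate_clicks(10000, 1))  # 4000 with these assumed rates
print(estimate_clicks(10000, 5))  # 500 with these assumed rates
```

The interesting part of the real analysis is the CTR table itself; once you have it, the estimator is a one-line multiplication.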

Updated Firefox SEO Extension

Yahoo! recently announced they are moving some of their link queries over to Site Explorer. The problem with that is that now there is no way to get .edu and .gov backlink data from Yahoo!

I had my programmer update SEO for Firefox to pull linkage data from MSN Search. In addition, he added some of the features that are in SEOpen and SearchStatus, such that you can highlight nofollows on a page and right click on a page and pull in some of the relevant link and other SEO related information.

After Yahoo! (hopefully) restores the ability to sort linkage data by TLD we will re-enable Yahoo! as a data source. There might be a few bugs in the newest version of SEO for Firefox as well...like if you query MSN Search automatically too quickly they may end up blocking your IP address.

Google's Depreciation of Anchor Text

I only attended a couple of panels at SES, but Greg Boser was on one of them, and he always has a way of saying things clearly. He mentioned in the past that a divide-and-conquer technique was a great way for small sites to compete with larger rivals. He then went on to say that, with Google's current reliance on site age and link-related authority, it may no longer make sense to use a divide-and-conquer method to rank well in Google. If you look through Google's search results for competitive insurance-related phrases, they are typically dominated by old sites, government and education sites, news sites, and/or sites which cover all 50 states. In the past it might have made sense to make sites for each of the most important states, but with the current Google, one site with an authority rank of 8 is probably worth far more than a half dozen sites in the same vertical that only have an authority rank of 6 (there is no AuthorityRank meter...just assume it is some arbitrary value based on age and link equity).

Another thing which Greg mentioned in his speech was that it seems Google is really moving away from trusting anchor text as much as they used to. I recently bought an old domain from a friend that was just wasting away. It was old and had a few average links from related websites, but had no relevant anchor text for the terms I wanted to rank for.

I changed the internal link structure to focus the home page on a moderately competitive term. Just doing that ranked it in the top 20 for that term. I then got it a couple low-to-average-quality links with the plural version of that anchor text and got it ranked in the top 10 for both versions. In the past that site might have required either higher quality links or many more descriptive link anchors to rank.

In the past (say a year or two ago) I was way more focused on getting specific anchor text from external sources, and probably went a bit far with it. Now it seems all you need is a few relevant, decent-quality descriptive links, and your site will rank so long as it has a bit of age and a few legitimate links.

Google Vertical Search Cannibalizing Google's Organic SERPs

I searched to see if the movie An Inconvenient Truth was playing in a local theater. Google not only showed the Movie OneBox result and offered a movie search feature, but also ranked the Google Video trailer in the search results and cached the movies result page. Loren recently posted an in-depth article showing how much Google is doing to add interactivity to and exposure for Google Video.

As Google adds features and consumer generated media to Google hosted vertical content pages many review sites and thin sites in high margin verticals will lose a good portion of their value, link equity, and traffic. A big thing that places Google ahead of most review sites is that they will not only collect and structure their own feedback, but their knowledge of language and the web graph makes it easy to access some of the best review information on other sites.

In a couple clicks I can go from reading feedback on Google to reading aggregated feedback snippets from other sites to reading some of the other best reviews on the web. For example, it takes little effort to see the official site, the contempt some sectors show the film, a more objective review, and a speech which inspired the creation of the film.

Indexed Page Quality Ratio

Large websites tend to have many useless pages associated with them. They may be caused by any of the following:

  • poorly structured or poorly formatted user generated content
  • content duplication due to content management issues
  • canonical related issues
  • dangling nodes which act as PageRank sinks
  • navigational pages which are heavily duplicated and soak up link authority and do not provide a clean site structure
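The dangling-node bullet above is worth unpacking: in the original PageRank formulation, a page with no outbound links leaks rank out of the system unless its share is redistributed. Below is a minimal power-iteration sketch of that fix; the link graph is hypothetical and this simplified model obviously ignores the many other signals a real engine uses.

```python
# Simplified PageRank power iteration illustrating the dangling-node fix:
# rank held by pages with no outlinks is spread evenly across all pages
# instead of leaking out of the system. Graph and damping are hypothetical.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Total rank currently sitting on dangling nodes (no outlinks).
        dangling = sum(rank[p] for p in pages if not links[p])
        new_rank = {}
        for p in pages:
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / n + damping * (incoming + dangling / n)
        rank = new_rank
    return rank

graph = {
    "home": ["about", "orphan.html"],
    "about": ["home"],
    "orphan.html": [],  # dangling node: links in, nothing out
}
ranks = pagerank(graph)
print(round(sum(ranks.values()), 3))  # total rank is conserved at ~1.0
```

Without the redistribution step, each iteration would drain the rank that flows into `orphan.html`, which is the "PageRank sink" behavior the bullet refers to.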

I have recently had a couple of SEOs show me various navigational techniques which created thousands upon thousands of somewhat similar mid-level navigational pages.

Some pages make sense to be indexed and provide a great user experience if searchers land on them. Others provide a poor user experience.

Search engines do not like indexing search results from other engines, so if your navigational scheme has an element which acts like an internal search engine, you probably do not want all those search pages getting indexed if they are heavy duplicates of one another.
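One rough way to check whether such pages are heavy duplicates of one another is word-shingle overlap (Jaccard similarity), a simplified version of the near-duplicate detection techniques search engines are believed to use. A small sketch with made-up page text:

```python
def shingles(text, k=3):
    """Set of k-word shingles from a page's visible text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity between two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical internal-search pages that differ only in sort order,
# versus an unrelated page.
page1 = "cheap red widgets sorted by price low to high"
page2 = "cheap red widgets sorted by price high to low"
page3 = "our company history and mission statement"

print(round(similarity(page1, page2), 2))  # 0.4 - substantial overlap
print(similarity(page1, page3))            # 0.0 - no overlap
```

Pages whose pairwise similarity sits near 1.0 are good candidates for noindexing or consolidation; where you draw the threshold is a judgment call.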

I was talking to Stuntdubl the other day, and he stated that one of the main things he likes to look at for a general indication of a site's health is the ratio of quality pages indexed to total pages indexed from the site.

If lots of your indexed pages are heavily duplicated and/or of low value that may cause search engines to crawl or index your site less deeply and not index all your individual product level pages.

Bite (Byte?) Sized Content

Recently Google allowed you to link to an exact minute and second of video. They also give each page of a book its own URL.

More Fun With AOL Keywords

A few people have created cool free web-based tools which allow you to search through the 20 million keywords AOL recently shared with the marketing community. http://www.aolsearchdatabase.com/ - allows you to sort data by:

  • User ID
  • Keywords
  • Date of search
  • URL

http://www.askthebrain.com/aol/

  • allows you to sort data by TLD
  • allows you to sort data by keywords leading to a specific domain
  • by default the tool also displays the top few thousand URLs, the number of referrals to a URL, shows the top keywords leading to a URL, shows the keyword diversity ratio

http://www.dontdelete.com/ - search by keyword, keyword stem, or part of a keyword to find related keywords. For example, if you searched for dati it would return all keywords that had dating in them.
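That kind of stem or fragment search is essentially a substring filter over the keyword list. A minimal sketch with made-up sample data (`matching_keywords` is a hypothetical helper, not code from any of these tools):

```python
def matching_keywords(keywords, fragment):
    """Return all keywords containing the given fragment, case-insensitively."""
    fragment = fragment.lower()
    return [kw for kw in keywords if fragment in kw.lower()]

# Made-up sample of AOL-style query log keywords.
sample = ["online dating", "dating advice", "data entry jobs", "used cars"]
print(matching_keywords(sample, "dati"))  # ['online dating', 'dating advice']
```

At 20 million rows you would want the data in a database with an index rather than a linear scan, but the matching logic is the same.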

Anyone think I should add a link to any of these tools on my keyword research tool?
