Internal Article Anchors From Search Engines

I recently searched for [Tippecanoe County Shrine Club] and Google ranked a huge Wikipedia page first. When will search engines start directing searchers to portions of a page instead of just to a page? How will that change affiliate, contextual, and web merchant business models?

Dreamy Google Sitemaps & a Page Strength Tool

Matt Cutts is looking for feedback on improving Google Sitemaps.

I'm expecting some creative answers here. I'll phrase it more generally: Forget XML files or even what Sitemaps looks like currently. What info would you want as a webmaster?

If you could design your dream webmaster or site owner console on Google, what would it look like?

Rand announced the launch of his page strength tool, which aims to be more accurate than Google's PageRank. The one downside to the tool is that there is a delay in most of the data sources (for example, I think my SEO for Firefox page is ranking at #10 in Google for SEO right now, but the page strength tool shows it as a 3.5), but it is probably quite a bit more accurate than PageRank alone is.

It is also interesting that on one front Google is asking for ways to share as much data as they can with you, while on other fronts they make external tools and ideas necessary and valuable because they are unwilling to share data they once shared. Thus markets which were once fairly open are getting more and more abstract. It happened with PageRank and SEO, and now it is happening with AdWords too.

Over / Under the Radar Link Buys

PageRank 9 links cheap!!!! Or maybe not ;)

If a market inefficiency is so great that people focus specifically on that inefficiency then the inefficiency is going to dry up pretty quickly. Either the undervalued commodity is going to have its supply quickly exhausted, or the market maker which lends the value to the commodity will remove the value. Within any topic or vertical there are ideas, and sites focused on those ideas, which will have high authority but limited income opportunity. Conversely, the sites focused on maximizing revenue generation typically are nowhere near as authoritative. So they either have to create secondary sites, launch viral marketing campaigns, or hunt for authority where they can buy it at an affordable price.

For example, there are lots of sports equipment and sports collectible sites online which have limited authority. There are, however, authoritative sites about each and every sport. It looks like this site, from 2000, with about 20 .edu links, DMOZ listings, and Yahoo! Directory listings, allows you to sponsor pages for a year for $5 each.
I probably would not sponsor a few pages on that site. I would be more inclined to spend a few grand to buy exclusive sitewide sponsorship rights.

Not all of the sites are going to suggest a price for a reference on their sites (and in fact most webmasters are quite unaware of the value of their content and their link authority). You may have to hunt around to find those kinds of sites. But if you think of sites that

  • would have high authority;
  • would not be noticed by most of your competitors; and
  • would have almost no income

those will be the sites that will give you great long-term link value. Jim Boykin is great at finding those types of sites.

If you think that getting a link off the site which creates the standards that run the WWW is sneaky, or that nobody will find it, then you are probably wasting your money. Getting a bunch of links from smaller and more related sites is a better investment, especially in the long term. The big pages tend to get spotted quickly, fill up with spammy links quickly, and get handled either algorithmically or manually. I learned that in the past when I made a Mozdev donation.

Some people have assumed that I am a huge spammer because I donated to the W3C, but I have donated to many projects where I didn't donate just for a link, and am not ashamed to admit that I supported the WWW.

I was (and still am) a big fan of donating for links, but have generally gotten much lazier on that front because recently it has been far cheaper to create interesting content or tools to build up the authority of this site. It has enough exposure that if my ideas are well implemented they are going to spread.

I however do sometimes make spammy pages or buy spammy links. Some are just for joking around, playing, or testing the market. Others are dual purpose or passive lead generation streams (for instance, on this page I am not selling anything to do with eBay; I just wanted to test the authority of my other blog, and a number of people who find that page end up connecting it to this site and buying my book). I don't actively solicit most of my spammy links (like the ones on the splogs about wall clocks), but what does it really matter if you have a few spammy links if you also have tons of legitimate ones? If getting a few low quality links gets people to talk about you, does it also increase your exposure and help build good free secondary links? Sometimes, methinks ;)

Who is the moral authority to determine the relevancy of a link or a search result? Are their guidelines anything deeper than self promotion? And why does their opinion matter? So long as whatever you do is enjoyable and profitable, and you weigh the risk to reward ratios, I don't think much else matters.

Search Engine Cloaking FAQs: an Interview With Dan Kramer, Creator of Kloakit

I recently asked Dan Kramer of KloakIt if I could interview him about some common cloaking questions I get asked, and he said sure.

How does cloaking work?

It is easiest to explain if you first understand exactly what cloaking is. Web page cloaking is the act of showing different content to different visitors based on some criterion, such as whether they are a search engine spider, or whether they are located in a particular country.

A cloaking program/script will look at a number of available pieces of information to determine the identity of a visitor: the IP address, the User-Agent string of the browser, the referring URL, all of which are contained in the HTTP headers of the request for the web page. The script will make a decision based on this information and serve the appropriate content to the visitor.

For SEO purposes, cloaking is done to serve optimized versions of web pages to search engine spiders and hide that optimized version from human visitors.
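Dan's description above, examine the request details and branch on them, can be sketched in a few lines. This is a minimal illustration of the general idea, not any particular product's logic; the spider IPs and User-Agent substrings below are hypothetical placeholders:

```python
# Minimal sketch of a cloaking decision. SPIDER_IPS and SPIDER_AGENTS
# are tiny hypothetical stand-ins; real cloaking packages maintain
# large, constantly updated lists.

SPIDER_IPS = {"66.249.66.1", "72.30.0.1"}   # hypothetical spider IPs
SPIDER_AGENTS = ("Googlebot", "Slurp")       # substrings of known bot UAs

def choose_content(ip, user_agent):
    """Return 'optimized' for suspected spiders, 'landing' for humans."""
    if ip in SPIDER_IPS:
        return "optimized"   # IP delivery: the most reliable signal
    if any(bot in user_agent for bot in SPIDER_AGENTS):
        return "optimized"   # User-Agent check: easy to spoof (see below)
    return "landing"

print(choose_content("66.249.66.1", "Mozilla/5.0"))  # optimized
print(choose_content("1.2.3.4", "Mozilla/5.0"))      # landing
```

In a real deployment this function would sit in the page-serving script and select which HTML to emit, rather than returning a label.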

What are the risks associated with cloaking? What types of sites should consider cloaking?

Many search engines discourage the practice of cloaking. They threaten to penalize or ban those caught using cloaking techniques, so it is wise to plan a cloaking campaign carefully. I tell webmasters that if they are going to cloak, they should set up separate domains from their primary website and host the cloaked pages on those domains. That way, if their cloaked pages are penalized or banned, it will not affect their primary website.

The types of sites that successfully cloak fall into a couple of categories. First, you have those who are targeting a broad range of "long tail" keywords, typically affiliate marketers and so on. They can use various cloaking software packages to easily create thousands of optimized pages which can rank well. Here, quantity is the key.

Next, you have those with websites that are difficult for search engines to index. Some people with Flash-based websites want to present search engine spiders with text versions of their sites that can be indexed, while still delivering the Flash version to human visitors to the same URL.

What is the difference between IP delivery and cloaking?

IP delivery is a type of cloaking. I mentioned above that there are several criteria by which a cloaking script judges the identity of a visitor. One of the most important is the IP address of the visitor.

Every computer on the internet is identified by its IP address. Lists are kept of the IP addresses of the various search engine spiders. When a cloaking script has a visitor, it looks at their IP address and compares it against its list of search engine spider IP addresses. If a match is found, it delivers up the optimized version of the web page. If no match is found, it delivers up the "landing page", which is meant for human eyes. Because the IP address is used to make the decision, it's called "IP delivery".

IP delivery is considered the best method of cloaking because of the difficulty involved in faking an IP address. There are other methods of cloaking, such as by User-Agent, which are not as secure. With User-Agent cloaking, the User-Agent string in the HTTP headers is compared against a list of search engine spider User-Agents. An example of a search engine spider User-Agent is
"Googlebot/2.1 (+http://www.googlebot.com/bot.html)".

The problem with User-Agent cloaking is that it is very easy to fake a User-Agent, so your competitor could easily decloak one of your pages by "spoofing" the User-Agent of his browser to make it match that of a search engine spider.
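To show how trivial the spoofing Dan mentions is, here is a short sketch using Python's standard library: any HTTP client lets you set the User-Agent header to a spider's string before fetching a page (the URL is just a placeholder):

```python
# Spoofing a User-Agent takes one line: set the header to a spider's
# string. A UA-only cloaker would serve this request the optimized page.
import urllib.request

req = urllib.request.Request(
    "http://example.com/",  # placeholder URL
    headers={"User-Agent": "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"},
)
# urllib stores header names capitalized; urlopen(req) would send this
# request with the faked identity.
print(req.get_header("User-agent"))
```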

How hard is it to keep up with new IP addresses? Where can people look to find new IP addresses?

It's a chore the average webmaster probably wouldn't relish. There are always new IP addresses to add (the best cloaking software will do this automatically), and it is a never-ending task. First, you have to set up a network of bot-traps that notify you whenever a search engine spider visits one of your web pages. You can have a CGI script that does this for you, and possibly check the IP address against already known search engine spiders. Then, you can take the list of suspected spiders generated that way and do some manual checks to make sure the IP addresses are actually registered to search engine companies. Also, you have to keep an eye out for new search engines... you would not believe how many new startup search engines there are every month.
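The manual check Dan describes, making sure a suspected spider IP is actually registered to a search engine company, is commonly done with a reverse DNS lookup followed by a forward confirmation. Below is a rough sketch of that logic; the lookup functions are injected so the flow can be shown without live DNS (in practice you would wrap socket.gethostbyaddr and socket.gethostbyname), and the domain suffixes and IPs are illustrative:

```python
SPIDER_DOMAINS = (".googlebot.com", ".crawl.yahoo.net")  # illustrative suffixes

def verify_spider(ip, reverse_lookup, forward_lookup):
    """True only if ip reverse-resolves to a known spider domain AND
    that hostname forward-resolves back to the same ip."""
    host = reverse_lookup(ip)
    if not host or not host.endswith(SPIDER_DOMAINS):
        return False
    return forward_lookup(host) == ip  # forward-confirm to defeat fake rDNS

# Demo with fake lookup tables standing in for real DNS calls:
fake_rev = {"66.249.66.1": "crawl-66-249-66-1.googlebot.com"}.get
fake_fwd = {"crawl-66-249-66-1.googlebot.com": "66.249.66.1"}.get
print(verify_spider("66.249.66.1", fake_rev, fake_fwd))  # True
print(verify_spider("1.2.3.4", fake_rev, fake_fwd))      # False
```

The forward confirmation matters because anyone who controls the reverse DNS for their own IP range can make it claim to be googlebot.com; they cannot make Google's forward DNS point back at their IP.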

Instead of doing it all yourself, you can get IP addresses from some resources that can be found on the web. I manage a free public list of search engine spider IP addresses. There
are also some commercial resources available (no affiliation with me). In addition to those lists, you can find breaking info at the Search Engine Spider Identification Forum at WebmasterWorld.

Is cloaking ethical? Or, as it relates to SEO, is ethics typically a self-serving word?

Some would say that cloaking is completely ethical, others disagree. Personally, my opinion is that if you own your website, you have the right to put whatever you like on it, as long as it is legal. You have the right to choose which content you display to any visitor. Cloaking for SEO purposes is done to increase the relevancy of search engine queries... who wants visitors that aren't interested in your site?

On the other hand, as you point out, the ethics of some SEOs are self serving. I do not approve of those who "page-jack" by stealing others' content and cloaking it. Also, if you are trying to get rankings for one topic, and sending people to a completely unrelated web page, that is wrong in my book. Don't send kids looking for Disney characters to your porn site.

I have seen many garbage subdomains owning top 10 rankings for 10s to 100s of thousands of phrases in Google recently. Do you think this will last very long?

No, I don't. I believe this is due to an easily exploitable hole in Google's algorithm that really isn't related to cloaking, although I think some of these guys are using cloaking techniques as a traffic management tool. Google is already cleaning up a lot of those SERPs and will soon have it under control. The subdomain loophole will be closed soon.

How long does it usually take each of the engines to detect a site that is cloaking?

That's a question that isn't easily answered. The best answer is "it depends". I've had sites that have never been detected and are still going strong after five or six years. Others are banned after a few weeks. I think you will be banned quickly if you have a competitor who believes you might be cloaking and submits a spam report. Also, if you are creating a massive number of cloaked pages in a short period of time, I think this is a flag for search engines to investigate. Same goes for incoming links... try to get them in a "natural" looking progression.

What are the best ways to get a cloaked site deeply indexed quickly?

My first tip would be to have the pages located on a domain that is already indexed -- the older the better. Second, make sure the internal linking structure is adequate to the task of spidering all of the pages. Third, make sure incoming links from outside the domain link to both the index (home) cloaked page and to other "deep" cloaked pages.

As algorithms move more toward links and then perhaps more toward the social elements of the web do you see any social techniques replacing the effect of cloaking?

Cloaking is all about "on-page" optimizing. As links become more important to cracking the algorithms, the on-page factors decline in importance. The "new web" is focused on the social aspects of the web, with people critiquing others' content, linking out, posting their comments, blogging, etc. The social web is all about links, and as links become more of a factor in rankings, the social aspects of the web become more important.

However, while what people say about your website will always be important, what your website actually says (the text indexed from your site) cannot be ignored. The on-page factors in rankings will never go away. I cannot envision "social techniques" (I guess we are talking about spamming Slashdot or Digg?) replacing on-page optimization, but it makes a hell of a supplement... the truly sophisticated spammer will make use of all the tools in his toolbox.

How does cloaking relate to poker? And can you cheat at online poker, or are you just head and shoulders above the rest of the SEO field?

Well, poker is a game of deception. As a pioneer in the cloaking field, I suppose I have picked up a knack for the art of lying through my teeth. In the first SEO Poker Tournament, everybody kept folding to my bluffs. While it is quite tempting to run poker bots and cheat, I find there is no need with my excellent poker skills. Having said all that, I quietly await the next tournament, where I'm sure I'll be soundly thrashed in the first few minutes ;)

How long do you think it will be before search engines can tell the difference between real page content and garbled markov chain driven content? Do you think it will be computationally worthwhile for them to look at that? Or can they leverage link authority and usage data to negate needing to look directly at readability as a datapoint?

I think they can tell now, if they want to devote the resources to it.

However, this type of processing is time/CPU intensive and I'm not sure they want to do it on a massive scale. I'm not going to blueprint the techniques they should use to pick which pages to analyze, but they will have to make some choices. Using link data to weed out pages they don't need to analyze would be nice, but in this age of rampant link selling, link authority may not be as reliable an indicator as they would like. Usage data may not be effective because in order to get it, the page has to be indexed so they can track the clicks, defeating the purpose of spam elimination. Their best bet would be to look at creation patterns... look to see which domains are creating content and gaining links at an unreasonable rate.
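For readers unfamiliar with the term, "markov chain driven content" is text generated by chaining words together based on what followed them in some source corpus: each step is locally plausible, but the output has no long-range coherence, which is exactly the signal a detector would look for. A toy sketch (my illustration, not anything from the interview):

```python
# Toy word-level Markov chain generator, the kind of thing behind
# "garbled" machine-generated pages.
import random

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    chain = {}
    for a, b in zip(words, words[1:]):
        chain.setdefault(a, []).append(b)
    return chain

def generate(chain, start, length=12, seed=0):
    """Walk the chain from `start`, picking a random successor each step."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < length:
        followers = chain.get(out[-1])
        if not followers:
            break  # dead end: the last word never appeared mid-corpus
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "search engines index pages and search engines rank pages by links"
print(generate(build_chain(corpus), "search"))
```

With a large enough corpus the output passes a casual glance while saying nothing, which is why detection likely needs either expensive language analysis or, as Dan suggests, pattern analysis of how fast a domain is creating such pages.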

What is the most amount of money you have ever made from ranking for a misspelled word? And if you are bolder than I am, what word did you spell wrong so profitably?

I made a lot of money from ranking for the word "incorparating". This was waaay back in the day. I probably made (gross) in the high five figures a year for several years from that word. Unfortunately, either people became better spellers or search engines got smarter, because the traffic began declining for the word about four or five years ago.

If I wanted to start cloaking where is the best place to go, and what all should I know before I start? Can you offer SEO Book readers a coupon to get them started with KloakIt?

KloakIt is a great cloaking program for both beginners and advanced users, because it is easy to get running and extremely flexible and powerful. There is a forum for cloakers there where you can go for information and tips. I am also the moderator of the Cloaking Forum over at WebmasterWorld, and I welcome questions and comments there.

SEO Book readers can get a $15.00 discount on a single domain license of KloakIt by entering the coupon code "seobook" into the form on the KloakIt download page. I offer a satisfaction guarantee, and, should you decide to upgrade your license to an unlimited domains license, you can get credit for your original purchase towards the upgrade fee.

----

Please note that I am not being paid an affiliate commission for KloakIt downloads, and I have not deeply dug in to try out the software yet. I just get lots of cloaking questions and wanted to interview an expert on the topic, and since Dan is a cool guy I asked him.

Thanks for the interview, Dan. If you have any other questions for Dan, ask them below and I will see if he would be willing to answer them.

Why Linguistics is Important

As a marketer, in most cases you cannot shape public opinion or create a profitable economy of scale unless you understand how words are used in a manipulative manner to shape opinion and create profit for external antimarket institutions (like Google).

Looking through economic history and the history of linguistics enables you to realize opportunities when others are not being honest or consistent in their policies, and it helps you form an argument which enables you to sound logical and reasonable while reframing the debate at an appropriate time. (For example, let's look at Google's nofollow policies.)

If you like to read I highly recommend A Thousand Years of Nonlinear History. Thus far it is the most important book I have ever read, and is worth far more than my SEO Book, even though it will cost you less than $20. It is not for everyone, but if you are able to understand abstract patterns I doubt you will ever find another book that is more important or convincing at shaping your worldview into a more impartial or profitable one.

Maximizers vs Optimizers & the Hollow Middle

I get asked to review a wide array of sites, with the owners asking "what is wrong" and "why isn't this working".

Many times I think the underlying problem is something I call cart-before-the-horse syndrome. While you can view many data points in the competitive landscape when you view a site, what you see now is not the way it has always been. Many of the most authoritative sites were created without any commercial intent; the site owner later fell into a business model, and as they saw profit they started to maximize their profit potential.

If you start off with a lead generation form as your website and are unwilling to give anything away until people give you money or an email address then you should be looking more toward the pay per click market than at organic SEO.

There is nothing wrong with maximizing your potential profit, but if you create a site geared around converting 10% of the site visitors into paying customers right from the start you are probably going to limit your ability to gain any serious link authority and serious distribution, unless your conversion rate and profits are so great that you can convince affiliates to push your product.

If you can afford heavy PPC spending by automating your sales process and maximizing your ROI that is fine, but if you want free traffic there are hidden costs to maximizing right out of the gate. It is like buying a 99 cent burger. Sure the upfront cost is next to nothing (and it seems like you are getting more for less), but as competing sites build traffic while you stagnate those invisible costs start to reveal themselves. You have to consider what search engines want and what your site visitors want. Try to create something that covers those wants and then roll commerce into it.

Seth Godin frequently stresses that getting people to PAY attention is a cost, and even if they give you no money PAYING attention is still a cost. Once you earn that it is worth a lot of money, because it takes a long time to build trust. And trust is fragile. If I hadn't built up a lot of friendships and trust over the last couple years there is no way the SEO for Firefox launch would have gone so well. The new links and new readers that tool brought in are probably worth far more than the tool cost to build, but it may not have spread so well (and it may not have covered its cost) if I had not worked so hard to build up my authority.

Alexa traffic stats for Seo Book.

Hitting the traffic jackpot once does not make one a marketing expert, but in spite of being on the delicious popular list and Digg homepage yesterday this site only doubled its typical traffic. A friend of mine says that it is a marathon and not a sprint, and that is the way you have to look at getting traffic, especially if you have a new site.

Back to the new sites I get asked to review. What do they need to spread messages or compete in the SERPs?

  • Set reasonable goals. Do not expect to rank for mortgage or search in one month if you have a $0 marketing budget and a site that is so bland or conversion oriented that it would never merit a single legitimate organic citation.

  • Pick a path and run with it. Be a maximizer or an optimizer, but know your path and run with it. If you are stuck in the middle you will probably do worse than a person who is working hard at either of the edges. After you are well established on either front and are beyond self-sustaining, then you have money to invest and room to play and test, but you need to have a clear message from the start. You don't want your site to one day say you believe in taking the hard and steady and slow and... way to the top, and then have visitors come to your site the next day to see a picture of a check for $50,000 that you allegedly made while you were on vacation last week.
  • Come up with a clear, unique branding angle that makes you stand out. Make sure it is obvious what you want people to do on your site, and make sure it is obvious what message you want them to spread away from your site. When in doubt, it is better to be niche and unique than broad and not unique.
  • Do not choose cheapest as your branding angle unless you are a masochist.
  • Create a clean site design which reinforces your brand image. For example, if your brand is supposed to be fun and hip POO BROWN is a bad color. If your service is supposed to convey a sense of trust to businesses or people seeking health advice go lean on red and orange. I typically favor clean over going too far with a design. If you can find a good priced logo designer and spend a day learning a bit of CSS you can create a reasonably decent looking site for around $100.
  • If you are unsure of what you want to do participate in topical communities to learn about the market and what the market wants. If all of your marketing is done on your site and it is not backed up by friendships away from your site it is going to be hard to convert potential prospects if they dig further into the SERPs and can't find anything about you other than a few cheesy syndicated articles and free directory listings. The web is cool, but also make sure you find your way to relevant off the web (ie: real world) events. That is where you really solidify your friendships and get to know the people you really should know.
  • If you have down time make sure you keep learning. You should be able to learn quicker than the market leaders because you know less, are more hungry, and have less busywork filling your day if you are seriously focused on success and are new to a market. Read and experiment widely. Especially if you aim to be a consultant, review what you consume (it helps build relationships, and most personal brands are not too deeply developed, so it also provides a cheap and easy relevant traffic source). Don't wait around for a golden day for things just to fall in place. Don't be afraid to be wrong. I have had people take the time to email me and tell me what a piece of shit I was for having incorrect information on my site, only to later have them buy my product, put huge ads for it on their site, and recommend it on various community sites.
  • If you want to rank for competitive terms you have to give to get. Look to create ways to make people want to revisit your site many times and/or link to your site for having a definitive topical resource. When you create a (hopefully) definitive article it may go nowhere, but if you do half a dozen of them well, eventually one of them will take off. You are over-investing hoping that eventually one of the investments will pay big dividends. When you have a great idea make sure you tell a few friends to see if they would be willing to help you market it.

Containers, Aggregators, and Editors

I recently got asked to write a couple articles for various websites and publications. I said no problem, but then I kept putting them off. I just handed in one today and have not gotten feedback yet, but I am uncertain how well it went. Yesterday I handed in one and the editor was less than impressed. Then it sorta dawned on me that I am a bit spastic and random in nature, and without using those words that is sorta how my article was described (in a nicer way though, of course).

Some people do well with containers and other people driving them, but I have been so (searching for a word here...maybe undomesticated) that it is quite hard to fill in the box or create something that is exactly how someone else wants it. I got so focused on random abstract thoughts that I am only really good at doing something if it is something I really want to do when I want to do it.

I have a PowerPoint presentation and speech to put together and am hoping I do well with it. The biggest benefit to it over the articles is that the request for it came after I put together something similar in nature but in another format.

So I guess my (semi?)relevant marketing thoughts on this post are:

  • I think the closer you are to your audience the easier it is to be successful (at least for me).

  • The more passion and interest you have in a topic the easier it is to be successful.
  • It is definitely worth focusing on what you are good at, but it is also a good thing to occasionally try different containers or formats. I suck at many containers and do well with others. Respect the container, or throw the container away and try something new ;)
  • For most people, so long as it is legible, the publishing format likely has little to do with your personal credibility level. Everyone is different and probably has their own best way to express themselves. I don't think mine is 1,000 word articles... at least not at this point!

What have been your most successful publishing formats? Do you think the structure of the web will drastically change media consumption habits?

The Idiocy of Nofollow Abuse & Link Hoarding

Recently it was noted that Business.com started using nofollow on many of their outbound links. If you don't trust the content of a site then why link to it at all? To list it on your own site and then put nofollow on it is to say that you don't trust your own content. Which is especially stupid. And perhaps the quickest way to become irrelevant, if you are an editorial listing company.

It turns out they were likely using nofollow on the free listings to some of the higher quality sites, which in turn means that the links without nofollow are pointing at sites that are on average of lower quality than the sites they added nofollows to.

If I was a CRM company I would think that on average a link from a page that links to Salesforce.com is worth more than a page that does not. If I was a software company I would think that on average a link from a page that links to Microsoft.com is worth more than a page that does not.

I think it muddies their credibility. A lot. Think of the quality of their site from a search engineer's perspective:

Oh, the only links they left live were the low quality ones. Outbound link authority nuked. Next.

A site which uses nofollow on most of their quality outbound links also reduces their outbound good link to bad link ratio. Even if search engines still counted Business.com links I think the loss of quality outbound links hurts their authority far more than whatever gain someone gets from having a link on a page with fewer links on it.

SEO for Firefox

SEO for Firefox.

Probably the coolest Firefox SEO extension ever created. :)

It now works on international versions of Google and Yahoo!, and it pulls a ton of marketing data right into the search results to help you analyze how competitive a marketplace is. Learn more about SEO for Firefox.

Manuel De Landa's A Thousand Years of Nonlinear History - Cool Book

These are my opinions and ideas gelled with notes from the contents of A Thousand Years of Nonlinear History by Manuel De Landa. Keep in mind I may have misinterpreted some of his points and interjected my bias into them. If something looks like a rational, well thought out point I am probably syndicating Manuel De Landa's point. If something looks like a point of anger being expressed, that is probably me adding one of my related views to further explain why and how I believe that portion of the book relates to my life or the world as I have experienced it. I believe the underlying points of the book are

  • progress is a misguided notion
  • reality and consciousness (and everything around us) are just states of energy and biomass hardened by history
  • our own history biases how we evaluate history
  • many things are not linear even if we have traditionally been led to think of them in that way

Where applicable the book may also appear biased toward heterogeneous over homogeneous systems, due largely to the great blinding support of homogeneous systems in the business community (homogenized systems are easier to extract profit from due to the lowered costs of mass production). This quote really states how he looks at the true cost of homogenization and discipline:

As with all disciplinary institutions, a true accounting must include those forces that increase (in economic terms of utility) and those that decrease (in political terms of obedience).

I think one of the biggest things I got out of the book was a fresh reminder, from a different perspective, that some of the scummiest aspects of capitalism are not intentional, but are just uncared-for side effects of other business processes.

I was talking to a reporter about some tech companies recently when I stated that I thought that certain companies would do this or that and why. He asked if I got that from talking to those companies. I said no, that my knowledge was from thinking about economic theory and business theory stuff.

Many of those ideas came from reading this book.

Almost any book you read is going to be in some ways biased, but I would say this book was biased toward reality without so much propaganda or hidden agenda (ie: I think this book was written out of passion and interest more than to mislead me into trying to buy into something). This is really the most mind opening book I have ever read. As a marketer participating in a somewhat new network it is amazingly fascinating reading about how economics, biology, and linguistics have evolved over the last 1,000 years.

Below is a point by point Aaron Notes type review of the various sections of the book. I initially took the notes for myself, but thought it would be worth posting them anyhow.

From the introduction

  • reality and consciousness are just a state of energy
  • evolution or other 'improvement' to the state of living does not mean things are inherently better...just that they are different.
  • when a new state of being comes into existence it can co-exist with prior states...the new state does not necessarily have to supersede the old state.
  • when we look back at history we are biased by the path it has taken and the narrative current society tells us about the past. to understand social dynamics you have to try to build things up from the bottom as well as break them down, instead of just relying on breaking things down.

part 1

  • the creation of agriculture allowed an abundance of non-human energy to be synthesized and stored for consumption, and led to the creation of many cities. fishing or other energy sources could have also led to the creation of cities.
  • trade winds were another important force of energy that were easy to capitalize on due largely to the inefficiency offered by limited competition and market separation
  • fossil fuels led to the next major growth (again because they made it easy to store and synthesize non-human energy)
  • cities act as parasites that suck off the surrounding area
  • currency was first created as a political means to collect excise taxes, but eventually enabled commerce with less friction
  • large parts of the reason why Western culture advanced quicker than Eastern culture were competition and the lack of a large, inefficient, homogenized religious and political bureaucracy
  • The role of the individual in society is typically overrated and oversimplified when analysis is isolated down to the individual level.

  • Adam Smith's invisible hand theory is a bit too idealistic for real world application. The market friction which it ignores is largely what drives many business models and large socioeconomic changes.
  • Virtually any manufactured product is created in cascading sets of quality, with people further from the source valuing things of lower quality and emulating the original to raise the quality of their products and local skill level.
  • the establishment of reliable credit sources allowed powerful organizations, cities, and nations to sap the resources of surrounding areas
  • Many of the most profitable merchants gained freedom of motion, allowing their businesses to capitalize on whatever is high profit at a given time. ie: how Google does not do much in the physical world, but plays a large role in controlling human interaction with information and commerce. As new keywords develop they are quickly able to monetize them. As old markets die off due to political, cultural, economic, or other forces they are not tied to them.
  • as companies grow in international power they create forces that attack government norms.
  • the lack of centralization within Europe caused increased investment in arms races that required societies to be more efficient and innovative to survive (when compared with the inward focused monopolistic stronghold on power in Eastern cultures)
  • most markets are range bound rather than actually reaching a single state of equality or equilibrium.

part 2

  • social class stratification via genetics and other aspects is a natural part of life; however, it does not need to occur as aggressively as it does.

  • largely, social stratification is driven by people who set up moral, ethical, religious, or legal guidelines for others to follow (which is a large part of the reason why I <3 civil disobedience, as it undermines the abuse of such power).
  • if some of those systems lost power many social and economic markets would remain self organized by other positive and negative feedback loops.
  • many people prefer to view things through a hierarchical lens because it is much harder to understand and visualize the world through thinking of effects of positive and negative feedback and reciprocal causality. Even at a young age we are generally taught to develop our thinking patterns in terms of concrete causes and effects.

part 3

  • the military required interchangeable parts, and the US military bred a system which provided quality assurance over the railroads. after the government created systems to make railroads a functional business it handed over the reins of profit to private enterprise
  • import-substitution dynamics and crafting of individual items gave way to automation and homogenization, such that interchangeable parts were cheap and easy to make.
  • many small businesses of similar trade exist near one another as being near one another improves their social circumstances, market mind share, and creates an environment where ideas can more easily flourish
  • most innovation comes from the smallest companies and individuals, as they are less confined by their business models.
  • after smaller businesses prove the profit of a business model, larger businesses based on economies of scale either replicate and automate those business models or consume those companies
  • companies buy other companies to control them via internal rules instead of buying their obedience and productivity
  • with shareholders existing external to corporations there is a bias in management not to just make the business as efficient as possible, but to make pieces of it complex enough to not be comprehensible to outsiders, such that they justify executives continuing to receive (and increasing) their compensation level for running the company.
  • electrical energy made it easier to miniaturize machines, and thus increased productivity by making automation easier, quicker, cheaper, and more decentralized.

part 4

  • every non-plant is a parasite

  • heterogeneous systems are more resilient than homogenous systems
  • humans make many pieces of the food chain more homogenous
  • genetic diversity is required to evolve new species

  • the genetic code within one animal type is quite homogenous
  • most human gene variations are superficial in nature

  • immigration is probably the single largest factor which causes a mixing of human genes
  • the dense population of cities made it easy for disease to spread
  • disease (local or imported) was a heavy factor in successful or unsuccessful colonization
  • richer individuals tend to plan for fewer children since they perceive a higher cost of living
  • whenever population declined (typically due to poor crop yield or disease) animals took back land

part 5

  • changes in the genetic code of one species changes the genetic makeup of other species (this is especially true in predator-prey relationships).
  • the definition of optimal is relative (strengthening any part of a system may make some other parts of it weaker)
  • extreme energy flows can shift equilibriums
  • social Darwinism is quite bogus, as it fosters racism and ethnic cleansing. earlier this month an SEO I know who describes himself as a Jew explained to me how he viewed all Muslims as terrorists and that he did not think ethnic cleansing was a bad thing. He objected to giving me his address when I offered to send him a 'Hitler was right' t-shirt.
  • genetic change is glacial compared to the speed of cultural change
  • while different cultures and linguistic backgrounds have a varying number of color labels the order of accumulation tends to be well aligned (typically starting with black, white, primary colors in certain orders). In Basic Color Terms: Their Universality and Evolution Brent Berlin and Paul Kay stated there are genetic constraints on perception guiding accumulation of cultural replicators.
  • while it is much harder to detect than the other way around, cultural materials may influence the accumulation of genes. an example of this might be how some people are lactose intolerant.
  • cultural policies can eventually become institutional, which can have both good and bad effects. an example of a good effect would be the curbing of incest. a few bad examples would include medication replacing nutrition and land erosion due to poor cultural farming policies.

part 6

  • when times are good humans outgrow their own good

  • the new world (the Americas, Australia, etc.) created supply zones and gave a place to put the excess growth of humans
  • many old world plants and animals spread to the frontiers of the new world ahead of civilization
  • military and trade ports host many people, animals, plants and goods in a wide array of states which are conducive to spreading disease.
  • because medical facilities in these locations saw people in a wide array of states it was important to make a clear distinction between that which is illegal and the concept of evil
  • the use of observation and binary systems improved medical care. he mentions how Michel Foucault stated they "treat lepers as plague victims"
  • discipline and homogenization are required to create economies of scale
  • when applied to the food supply (typically by big business) it comes in the form of gene control
  • some corporations create seeds that die if not used that year, and also introduce other genetic defects which require the use of excessive fertilizer or other (often monopoly controlled) inputs
  • this genetic control can be described as "etching entry points for antimarkets into the crops' very genes"
  • the genetic makeup of many seeds is protected as trade secrets
  • short term homogenization may increase quality, but in many cases, given enough time, natural selection will perform better than over-homogenized artificial selection. the genetic diversity of the food supply is a hidden store of wealth in some poor countries.
  • homogenized systems are more susceptible to epidemics
  • the genetic control applied to plants has also been applied to animals and some states went so far as trying to apply them to people.
  • eugenics is the belief that by studying heredity and deploying selective breeding techniques you can improve the human race. alternate eugenics definitions here
  • While US immigration laws did not explicitly mention eugenics, some portions of the US believed that Northern Europeans provided the highest quality genetic stock.
  • starting in Indiana in 1907, over 20 US states sterilized thousands of people for things like being absent-minded.
  • Those who still believe in ethnic cleansing after Hitler probably do not deserve to be alive and should cleanse themselves from the populace.
  • There are also non-traditional ways to control human reproductive cycles.
  • Some wars intentionally underequipped types of soldiers to allow them to be cleansed from the gene pool.
  • To this day military recruiters prey on the young, the poor, and those of below average intelligence. While that may sound ultra biased, my thesis for that statement is based on my own experiences. I grew up in a poor town and joined the military when I was 17. While I am quite economically successful I have not yet decided where I want to move, and still live in a mobile home (I moved into it with a friend a few years ago to cut my living costs back when I was just learning about the web and only making a couple hundred dollars a month). Earlier this month yet another military recruiter knocked on my door. While being of about average intelligence, I literally scored off the charts on most of the military tests I took when I was 17 (even the nuclear power test), which should be a strong indication that the tests were scaled toward people of below average intelligence.
  • Early obstetricians and gynecologists screwed up much worse than midwives by making it easier to spread disease and also by excessively using forceps at birth.
  • Private enterprise also took other choices from mothers by sneaking in birthing formula while the mother did not know it was being given to the baby.
  • To this day tryptophan is common in birthing formulas, but is illegal to buy as a supplement. around 15 million Americans were using tryptophan, but it became illegal roughly around the same time that Prozac was launched as a wonder drug of the future. Few people questioned how shady that was.
  • large public outrage is often required to get special interests to yield authority. It was 1892 before Hamburg improved sanitation of its water supply. They only did so after a cholera epidemic hit.
  • biotechnology allows us to fight microbes more efficiently by doing things like gene-splicing and gene-gluing enzymes from one creature to inject that information into other creatures. this creates the ability to produce large quantities of affordable microbe fighting cells.
  • while biotechnology makes it easier to fight micro parasites it makes it easier for macro parasites to be injected into a large portion of the food chain and form monopolies
  • In some instances totally useless and potentially cancer causing chemicals are created to help increase yield. In many cases consumers are not even informed of which food products are contaminated with the garbage.

    Do you really want to trust the people who manufacture Agent Orange when they talk about the effects of chemicals they inject into the food supply? rBGH, which is illegal in many countries, is injected into many US cows to produce more milk. Fox News fired multiple reporters for wanting to air a report about how shady rBGH is. The Meatrix also provides a clip on rBGH.

  • efficiency of extraction and processing (including homogenized size and shape as well as predictable homogenized maturation dates) now are more important criteria than biomass value in many crops. The nutritional value of a crop is largely ignored in favor of the other "more important" (read as more profitable) genetic traits. Improving some of those other genetic traits also comes at the direct cost of lowered nutritional value.
  • when you buy food from outlets that sell on low price you are voting against genetic diversity in the food supply, and against the nutritional value of the food your children and their grandchildren will eat.
  • As nutrition is removed from the food supply, drug companies hook people on garbage prescriptions that treat symptoms of an unbalanced lifestyle with poor nutritional input. Of course it will not be the fault of drug companies when things go astray. In reference to some of these drugs, some FDA members have gone as far as to claim:

    they don't feel that _______ is addictive because it doesn't carry the behavior of a person that is dependent on a drug. A person that will go out and steal to obtain their drug of choice or cause harm to another

  • At the same time children are medicated with these pills that (IMHO) wrongfully replace or cover up natural human emotions. Some of these things are blatantly overprescribed by doctors who learned from textbooks and journals sponsored or funded by self-interested drug companies.

    “Journals have devolved into information laundering operations for the pharmaceutical industry”, wrote Richard Horton, editor of the Lancet, in March 2004. In the same year, Marcia Angell, former editor of the New England Journal of Medicine, lambasted the industry for becoming “primarily a marketing machine” and co-opting “every institution that might stand in its way”.

  • There are reports of things like

    Jeff Weise, the 16 year old who killed seven and then himself this week at his high school, had been taking ________.

  • The two blanks above refer to two different drugs, but both are in the same drug family. It is the same family as the drugs associated with a kid in the Columbine shooting.
  • That drug family was born with the original drug being announced as a wonder drug of the future.
  • Around the time of the release of that drug family, a natural supplement that about 15 million people were taking, which worked on the same neurotransmitter, was banned from the United States.

    On March 22, 1990, the FDA banned the public sale of dietary L-Tryptophan completely. This ban continues today. On March 26, 1990, Newsweek featured a lead article praising the virtues of the anti-depressant drug Prozac. Its multi-color cover displayed a floating, gigantic green and white capsule of Prozac with the caption: “Prozac: A Breakthrough Drug for Depression.”



    The fact that the FDA ban of L-Tryptophan and the Newsweek Prozac cover story occurred within four days of each other went unnoticed by both the media and the public. Yet, to those who understand the effective properties of L-Tryptophan and Prozac, the concurrence seems “unbelievably coincidental.” The link here is the brain neurotransmitter serotonin — a biochemical nerve signal conductor. The action of Prozac and L-Tryptophan are both involved with serotonin, but in totally different ways.

  • L-Tryptophan, the allegedly harmful supplement, is still added to baby formula in the United States to this very day. To quote the federal government:

    "At the present time, an import alert remains in force which limits the importation of L-tryptophan into the United States, except if it is intended for an exempted use. FDA has provided for the use of manufactured L-tryptophan for special dietary purposes. Manufactured L-tryptophan is a lawful and essential component of foods, such as infant formulas, enteral products and approved parenteral drug products..."

  • Instead of exercising, dieting properly, and/or taking natural supplements now hundreds of thousands of people are hooked on (ie: recurring subscription based expense) addictive drugs that in some cases ruin their social relationships and have widely been reported to have HORRIFIC withdrawal related side effects.
  • A doctor even offered my unemployed brother a free trial of one of these drugs even though he had no way to afford buying more.
  • systems highly focused on maximal yield efficiency often require external inputs. that reliance on external sources makes it easier for monopolies to corrupt or influence the chain for short term profits.
  • While mentioning the DuPont and Monsanto corporations De Landa stated "rather than transferring pest-resistant genes into new crop plants, these corporations are permanently fixing dependence on chemicals into the crops' genetic base."

Part 7

Before reading this book my only exposure to the concept of linguistics was from reading George Lakoff's rather introductory level Don't Think of an Elephant, so this next section might be a bit hosed.

  • dialects exist in a continuum of overlapping forms
  • linguistic patterns develop based on geographic and socioeconomic conditions
  • communication isolation leads to new languages
  • while people in different regions may speak different dialects, many may not be self-aware of the differences in dialect
  • the further one moves from established prestige and power the more likely they are to find new emerging dialects
  • new dialects are standardized at seats of economic and political power to make it easier to govern or extract profit
  • the influence and standardization from the seats of power spread to the surrounding regions
  • Gottlob Frege's philosophy (as explained by De Landa) "The connection between a given name and its referent in the real world is effected through a mental entity (or psychological state) that we call 'the meaning' of the word."
  • Saul Kripke and Hilary Putnam stress linguistic inheritance by placing more emphasis on the historical and social aspects of language over the "inside the head" concept. Based on this theory "only certain experts can confirm the accuracy of usage."
  • one's ability to define a term is directly related to their ability to manipulate the items or systems being referenced (or to that of the audience they are introducing the term to)
  • language related to survival is less likely to change than less common language
  • informal social networks act as enforcement mechanisms. dense networks are exceptionally self-reinforcing and quite stable in nature (and can thus withstand great pressures from societal norms from larger social networks)
  • middle class dialects change far quicker than local dialects or elite dialects (since the middle class is much more transitory than either of the edges)
  • the upper class can leverage their authority to influence governmental, religious, or other bodies with large reach to push their lens and linguistic frame of reference through to ambitious members of the middle class who soak up this information hoping to increase their own status
  • language or words do not mean anything until a group of people use them to communicate. the ability to introduce words (or word meanings) to a community and have them stick is proportional to your prestige and your number of contacts within the community
  • synthetic languages have inflections, which are used to show things like verb tense
  • analytic languages express grammatical functions through word ordering (subject-verb-object)
  • the trade of objects and experiences with nearby cultures influences linguistic patterns in the local language
  • pidgins occur "wherever contact between alien cultures has been institutionalized" like slave trading ports. pidgins simplify the linguistic norms of their source language.
  • a creole is born out of recomplication of pidgins into a more complex language
  • language usually goes from conqueror to conquered
  • words usually travel from more complex languages to less complex ones
  • J.L. Austin's speech acts "Involve a conventional procedure that has a certain conventional effect, and the procedure itself must be executed correctly, completely, and by the correct persons under the right circumstances."
  • attempts at defining formal languages have generally failed since most people have many influences that are far more influential on their lives than a formal linguistic rule set.
  • the printing press helped harden languages.
  • The Protestant Reformation helped boost local languages and undermine Latin's role in religion and education and thus power
  • "The usefulness of a given set of sounds is guaranteed by the more or less systematic contrast that they have with one another."
  • all languages are in a state of constant change. not only with the addition of new words, but also large variants in word meaning and/or structural purpose.
  • even within a single core language most people speak multiple different dialects, with the dialect depending on their audience and speaking circumstance (ie: professional, technical, family, informal, formal)
  • cities contain both large impersonal environments and hyper focused subcultures with private lives that cause them to be the source for a wide diversity of language.

part 8

  • Noam Chomsky believes the diagram for the structure of language is an abstract robot
    • language consists of a dictionary and a set of rules
    • we can automatically check if a sentence makes sense
      • generative rules = universal across languages
      • transformational rules = not universal, language specific rules
  • Deleuze and Guattari
    • Chomsky not abstract enough
    • there is no universal language. there is always some overlap
    • need to look at history of social interaction and language to understand linguistic development
  • George Zipf
    • believed in "combinational constraints"
    • by looking at word co-occurrence patterns you could predict what other words might appear
  • Zellig Harris
    • introduced "transformation" into linguistics
    • linguistic constraints come from "the gradual standardization (or conventionalization) of customary usage."
    • 3 main constraints guiding language
      • "likelihood constraints" - statistically modeled probability of co-occurance
        • "selection" - the set of the most common words grouped with a word. Words are defined by the words they commonly occur with.
      • operator-argument constraints
        • works on word classes (not individual words)
        • inclusion of a certain class of word demands that other word types occur
      • reduction
        • exceptionally common word pairs may morph into a single word, being reduced without losing meaning
  • Mary Douglas
    • also considers social elements of language in her model
    • "collective assemblage" - "intensity with which individuals are attached to a group"
    • breaks connection down into group and grid, indicating who we interact with and how
    • can create 4 quadrants using group and grid. many social forces drive people to one of the edges
    • "The fourth corner, the fully regulated individuals unaffiliated to any group, is plentifully inhabited in any complex society, but not necessarily by people who have chosen to be there. The groups [bureaucracies or sects] expel and downgrade dissenters; the competition of individualists...pushes those who are weak into the more regulated areas where their options are restricted and they end up doing what they are told."
    • the biggest limitation of her model is that it only works from within a social group or organization
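
The co-occurrence idea in the Zipf and Harris notes above (predicting which words are likely to appear near a given word) can be sketched as a simple bigram counter. This is my own minimal illustration of the concept; the corpus and function names are invented for the example and are not from the book:

```python
from collections import Counter, defaultdict

def cooccurrence_model(corpus):
    """Count how often each word is immediately followed by another word.
    Returns {word: Counter({next_word: count})}."""
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for left, right in zip(words, words[1:]):
            model[left][right] += 1
    return model

def likely_followers(model, word, n=3):
    """Harris-style 'selection': the n words most likely to follow `word`."""
    return [w for w, _ in model[word].most_common(n)]

# A toy corpus; a real model would of course need far more text.
corpus = [
    "the market sets the price",
    "the market rewards efficiency",
    "the market sets incentives",
]
model = cooccurrence_model(corpus)
print(likely_followers(model, "market"))  # 'sets' occurs twice, 'rewards' once
```

Even this toy version shows the statistical flavor of the idea: "market" is partly defined by the set of words that tend to occur with it.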

part 9

  • in the 18th century language was subject to strong unification and uniformation forces
  • unification - driven largely by the formation of nation states and disciplinary institutions
  • uniformation - due largely to testing, training, and observing people to create an obedient populace
  • linguistic unity is necessary for tapping patriotism to drive men toward war or peace
  • large energy flows in and around capitals and major cities made it easy for their local standards to spread
  • dictionaries and grammar guides solidify language. Dr Johnson's dictionary was viewed as so important to English that in the 1880s a bill was thrown out of Parliament because it used a word that was not in his dictionary
  • the increasing speed of global communications makes linguistic isolation harder
  • "The very idea of massified advertising meant that large-circulation newspapers were not in the business of selling information to people, but rather of selling the attention of their readers to commercial interests."
  • Following many other industries mass media quickly became largely driven by antimarket forces.
    • Examples of antimarket behavior:
      • "The formation of a cartel by six New York papers, which resulted in the formation of the Associated Press in the 1860s."
      • Reuters, AP, UPI, and French AFP still exert significant control over the global markets, operating as oligopolies
  • "Rather than aiming for objectivity [newspapers] aimed for widely acceptable neutrality."
  • "Although news agencies are not engaged in a conspiracy to promote 'capitalistic values' around the world, they do have a strong homogenizing effect arising from the routinization and standardization of point of view (with the concomitant distorting simplification) and, ultimately, from the very form of the flow, that is, a flow emanating from very few places to a large number of subscribers."
  • To appear authentic, newspapers widely distribute linguistically incorrect language (especially when quoting people).
  • Linguistic differences of the lower classes were seen as the mark of barbarians, until those linguistically incorrect people were cherished as conscripts needed to fight in WWI
  • At the end of WWI French was seen as the most prestigious language in the world. By the end of WWII it was displaced by English.
  • As a contrast to traditional news organization the web is more of a community based many to many framework. The web allows communities separated by great distances to come together to discuss a topic.
  • "Computer meshworks have created a bridge to a stable state of social life which existed before massification and continues to coexist alongside it." The Cluetrain Manifesto is a great book to read about that reformation of communities and marketplaces brought about by the web. They have 95 thesis statements. My favorite is Hyperlinks subvert hierarchy.
  • The destiny of the web is still of course largely undecided. While it may allow communities to form easily it is also leveraged by many communities with misguided belief sets.

part 10

  • "The flows of lava, biomass, genes, memes, norms, money (and other 'stuff') are the source of just about every stable structure that we cherish and value (or, on the contrary, that oppresses or enslaves us)."
  • The Earth, institutional norms, social structures, and language can be viewed as bodies without organs that exist at various levels of stratification driven by the intensities of their catalysts and energy flows.
  • Different histories with different stratification levels and rates of change are constantly co-occurring.
  • Our history, language, and science have generally been viewed through an arbitrarily linear lens. "Western societies transformed the objective world (or some areas of it) into the type of structure that would 'correspond' to their theories, so that the latter became, in a sense, self-fulfilling prophecies."
  • While there has been significant homogenization over the last 300 to 400 years, artificially becoming more heterogeneous does not guarantee a better state of humanity, and blindly pushing toward heterogeneous structures is not a good idea, since likely "the most destratified element in a mix effects the most rigid restratification" later on.
  • Rather than pushing hard for change without being certain of its effects we should "call for a more experimental attitude toward reality and for an increased awareness of the potential for self-organization inherent in even the humblest forms of matter-energy."

How does this relate to SEO, the web, and marketing?

The current fight over net neutrality is a fight for the belief in heterogeneous systems over monopolized homogenous systems. As noted in this book, antimarket institutions do not always add as much value as they extract from their market position. Network operators (and pocket-padded politicians) assume they know what is best for everyone, but if you listen to Ted Stevens speak you will realize just how misguided many of them are.

Ad agencies like Saatchi are trying to push brands to go after owning mind share for individual words, which could become self-reinforcing if they did it early enough or well enough. Companies sue search engines because they don't rank where they feel they should.

Who controls language? Will search engines and authoritative websites act as our new dictionaries and encyclopedias that harden language?

The web and search engines provide new social dynamics in coding language.

In some ways search engines make the set norms more self-reinforcing (via ease of access to the current status, search personalization reinforcing our current world view, reinforcement of the citation data that led to the development of the current status, and running search business models that are more profitable if they limit their trust of new definitions and new statuses).

In other ways they make the set norms less self-reinforcing (especially where language is loosely defined and/or a market has limited depth): because they make many opinions accessible and place many results near one another on equal footing, it becomes easier for people of traditionally limited authority and reach to help redefine the meaning of a word.

Some vertical engines also put your or my words alongside or above the New York Times in importance. This changes the media bias from being a rather homogenized bias controlled by similar large business structures to a more diverse set of biases.

The web and search engines not only provide new social dynamics for coding language, but also for coding our status. Currently, if you have a lot of trust in a popular topic, search engines allow you to leverage that trust across other topics.

Not only do our status levels rise and fall with the importance of the language we help define, but search engines also look at social bonds and social interactions that were likely hard to measure for past authority systems.

Search engines are also trying to understand a universal personal identity. For example, our search history may influence the results other searchers see, and the level of trust placed on that search history data may depend on our ties to the search systems (financial ties from being a large ad buyer or a large ad seller, financial ties from being a large source of content or being in need of search referrals, and the length of time and volume of activity - search, publishing, or advertising).

In addition to controlling how we find content search engines also control the payout levels from the largest distributed ad networks, which in part determine:

  • What content is profitable. (Ad sales automation makes content production more efficient for smaller groups at the expense of many larger groups. Many web-based business models revolve around amateurs working for free for other companies.)
  • How we structure content to be maximally profitable (in terms of money or in terms of reach and influence).
  • What topics people will focus on. (Smaller niches are now more profitable.)
  • The type of bias people may be interested in writing content from.
  • How evergreen versus fresh content is.
  • How topics will merge together or drift apart. (Hyperlinks create connections where there never were links before, and many established trusted businesses will quickly consume new markets by recognizing those connections and drifting toward those markets at a faster rate.)
  • How people will format content. (For example, if I had gone with a traditional book publisher I would have been lucky to make 10% of what I make selling my ebook; this factor also controls how much information people typically put on a page and how frequently they publish.)

Then there is the question of whether the mass amateurization of content will change the way citizens perceive content versus advertising, and whether people will become more isolated or more involved in building a world that is more like the one they want to see.
