Google as Affiliate, Affiliate Network, Ad Network, & Ad Agency

Google recently expanded their ad offering by inserting AdSense ads on maps, putting AdSense image ads & banners on image search results, opening up AdSense for Games, and monetizing YouTube with affiliate ads for Amazon.com and Apple iTunes.

The NYT article on AdSense for Games (linked above) promises a couple more new ad units in the coming weeks, and highlights Google's new ad strategy:

For the text and graphic ads (but not video) Google will also look at the context of the game and the page it is on for clues that might indicate whether some of the ads targeted by keyword are appropriate.

Mr. Oestlien indicated one small feature of Google’s program that may represent a significant change in the company’s approach: It is starting to broker deals between game publishers and advertisers to have their products integrated into the actual play of the games. For example, a dog food company could have its latest kibble built into Pet Society, a game on Facebook that now has Google ads.

On the high end for brand advertisers Google is becoming something that looks, smells, walks, and talks like an agency. Take a look at this ad unit.

And on the lead and retail front, Google is looking to become the web's largest affiliate. Everyone in search marketing (and online media) needs to take a strong look at the merchant beta test Google conducted.

How long until Google goes after other online ad markets that are worth hundreds of millions or billions each? More and more Google searches may end up clicking through to a Google property or a Google navigational aid. If Google can get enough merchants to buy in, any (or all) of these could become affiliate links. If the data can be structured Google can take their tax.


AdWords effectively killed the longtail by recycling brand ads on longtail search queries. Look for that consolidation to continue. If the SERPs hold custom ad units by Google, are your lead value and brand big enough to be able to pay for the leads? If not, how can you deepen your experience to create a citation-worthy service that goes deeper than Google is willing to go?

Update: As John Andrews highlighted, Google aggressively cashes in on branding, so if you own a brand you owe it to them to be liberal with their guidelines.

Social Interaction & Advertising Are The Modern Day Search Engine Submission & Link Building

Years ago (well before I was an SEO, or knew what SEO was) search engine submission was a huge phrase. Only recently has search engine marketing replaced search engine submission in popularity.

Search engine submission was a big part of the optimization game when search relevancy algorithms were heavily reliant on meta tags and on-page content. As search got polluted with on-page spam you needed to do more than submit to compete for coveted, valuable phrases; you had to build signals of trust from other sites. Link building was a requirement.

Many of the links that you could easily "build" have effectively disappeared from the web, through the use of nofollow and Google editing the PageRank of many (perhaps most) web directories. Recently Google removed their recommendations for directory submission and link building when these 2 points disappeared from their guidelines:

  • Have other relevant sites link to yours.
  • Submit your site to relevant directories such as the Open Directory Project and Yahoo!, as well as to other industry-specific expert sites.

Might their reliance on directories be waning?

Absolutely.

Each additional link created and each additional web page published make Google smarter.

The web is a social network and search engines follow people. Once you think of the web from that perspective you have a HUGE advantage over competitors who are "building" one link at a time.

Google wants those who are well connected (and those who can afford to advertise) to succeed. Thus the evolution of SEO looks like...

  • search engine submission
  • on page optimization
  • link "building"
  • advertising, branding, viral marketing, public relations, & social interaction

Getting the basics right (keyword research, site structure, on-page optimization) helps make everything else you do more effective. But each day that passes you need a bit more (economic and/or social) capital to compete. What social interactions are built into your site? Why should bloggers write about your business?

Copyright in Reverse? How Will THAT Change Marketing?

Interesting story about the success of Guitar Hero, and the music included in Guitar Hero:

The use of a sound recording in a video game is not subject to any sort of statutory royalty – the game maker must receive a license negotiated with the copyright holder of the recording – usually the record company. In previous editions of the game, Guitar Hero has paid for music rights. However, now that the game has proved its value in promoting the sale of music, the head of Activision, the company that owns the game, has suggested in a Wall Street Journal interview that it should be the record companies that are paying him to include the music in the game – and no doubt many artists would gladly do so for the promotional value they realize from the game.

A while ago I mentioned something along the lines of "the information you sell today, you might be willing to pay people to consume in a couple years." In some markets there is a lot of strategic value in asymmetrical information distribution, but...

  • the decline of copyright
  • the ease of local substitution
  • the near infinite level of competition online

make keeping secrets hard, and make selling information a difficult practice unless you...

  • keep adding to what you are selling
  • are aggressive at public relations and marketing and/or become synonymous with an information format or type of transaction
  • can build enough exposure to flip the business model around (like Activision's CEO is planning on doing)
  • interact with customers and personalize the experience
  • dig deeper than competing services and invest into infrastructure to provide a barrier to entry

The network quickly changes itself as imitators copy market leaders and effective marketing strategies, turning what was once a smart technique into dirty spam...constantly burning out and reinventing the field of marketing.

If you have a competitive advantage and/or find that something is particularly effective it is probably in your best interest to either burn it out quickly before competitors can also use it, or share the information publicly and become synonymous with the technique.

If a market seems saturated then pick a different niche or write from the opposite perspective that most of the talking heads write from. The best time to buy a great domain, set up a new site, and build competitive search advantages is before the competition is aware of the value of a market or marketing technique.

Firefox Rank Checker Extension Now With Pretty Graphs

A member of the SEO Book community wanted to add graphs to the Rank Checker extension. Please give Site Rank Reporter a try, and leave feedback below. I have alerted him to this thread and he is anxious for your feedback.

A couple tips...

  • you must save the Rank Checker data to CSV before importing it to the Site Rank Reporter tool.
  • you have to have at least a few days worth of data to see the benefits of the charts.

How Does Matt Cutts Get Ready for Work? (Picture Reveals All)

A few months back I bought a drawing of Matt Cutts and forgot about it.

How to Update Firefox Extensions (and/or Uninstall & Re-install Them)

Since we have a number of popular Firefox extensions, I frequently get asked how to update them. Rather than writing 3 emails a week I figured it was quicker to jot down a quick blog post. To update or uninstall an extension you first have to click into the add-ons panel.

When you get inside the extensions area (by following the path highlighted above) you will see an Add-ons window with a Find Updates button at the bottom of it. That is an easy way to update many extensions at once.

The other way to update or uninstall is to scroll to an extension and click on it.

  • If you left click, Disable and Uninstall buttons will appear.
  • If you right click on an extension you will see a menu pop up with the option to Uninstall the extension. This menu also gives an option for you to Find Update.

Any time you update or uninstall an extension you have to restart Firefox for the change to take effect. If you uninstall an extension that you later want back, go to the source where you downloaded it from to reinstall it. Instructions for installing an extension are well laid out on the SEO for Firefox page.

Should Google Recommend Downloading Illegal Copyright Works via Torrents? What About Cracks, Serials, Keygens, etc.?

I was just finishing up our guide to how to optimize for search suggestion, and noticed something worth discussing.

I am not sure if safe harbor covers companies that index content, cache/host content, and suggest searches for downloading pirated works...but if it does, I think the law needs to be changed. It seems Google could have thought about the torrent-related keyword suggestions before launching search suggest as a default.

Part of the reason why I had to change my business model was the need for a more interactive higher value service, but another big part of it was also that I saw this sort of activity coming. It is too hard to create valuable information and sell it in a digital format unless it is broken up into pieces, is time sensitive, and/or has interactive elements added to it.

If you think Google respects copyright you are wrong. All content wants to be free, and, preferably hosted by Google, wrapped in AdSense.

The Google Search Advertising Cartel

Whenever I read a story about Google losing its competitive edge or spreading itself too thin, I think that the author just does not get the network effects baked into web distribution when a company is the leader in search and advertising, or how solidly Google competes where it allegedly failed.

Sideline projects, like their book scanning project, turn into a treasure for librarians and researchers who guide others to trust Google. Syndicated products and services like their book API nearly create themselves as an off-shoot of creating indexable searchable content.

They monetize search much more efficiently than the competition. And that is only going to increase as time passes, especially since their leading competitor would rather outsource to Google than fix their monetization problems. Google can take any related market it touches and buy marketshare or introduce a new product to push free and openness. Everything should be open, except Google itself.

To sum up Google's lasting competitive advantage (including brand, marketshare, price control, distribution, undermining copyright, strategic partnerships, etc.) I turn to telecom lobbyist Scott Cleland's Googleopoly:

Google arguably enjoys more multi-dimensional dominating efficiencies and network effects than any company ever - obviously greater than Standard Oil, IBM, AT&T, or Microsoft were ever able to achieve in their day.
....
The five main anti-competitive strategies in Google's predatory playbook to foreclose competition are

  1. Cartelize most search competitors into financially-dependent 'partnerships;'
  2. Pay website traffic leaders predatory supra competitive fees to lock up traffic share;
  3. Buy/co-opt any potential first-mover product/service that could obsolete category's boundaries;
  4. Commoditize search complements to neutralize potential competition; and
  5. Leverage information asymmetry to create entry barriers for competitive platforms.

If you have a spare hour to read, you may want to check out Mr. Cleland's Googleopoly 2 [PDF]. I don't agree with everything in it, but it sums up Google's competitive advantages and business strategies nicely. Anyone can learn a lot about marketing just by watching and analyzing what Google does.

Search Engine Optimization - Evolution or Extinction?

The following is a guest blog post by Jeremy L. Knauff from Wildfire Marketing Group, highlighting many of the recent changes to the field of SEO.

Marketing is constantly evolving and no form of marketing has evolved more over the last ten years than search engine optimization. That fact isn’t going to change anytime soon. In fact, the entire search engine optimization industry is headed for a major paradigm shift over the next twelve months. Like many of the major algorithm updates in the past, some people will be prepared while some will sit teary-eyed amongst their devastation wondering what happened and scrambling to pick up the pieces. Unlike the major algorithm updates of the past, you won’t be able to simply fix the flaws in your search engine optimization and jump back to the top of the SERPs.

Why is this change going to be so different? In the past, the search engines have incrementally updated certain aspects of their algorithms to improve the quality of their SERPs, for example, eliminating the positive effect of Meta tag keyword stuffing which was being abused by spammers. Anyone who has been in the SEO industry for more than a few years probably remembers the chaos and panic when the major search engines stopped ranking websites based on this approach. This time around though, we’re looking at something much more significant than simply updating an algorithm to favor particular factors or discount others. We are looking at not only a completely new way for search engines to assign value to web pages, but more importantly, a new way for search engines to function.

Local search

A number one ranking for a particular keyword phrase was once the end-all, be-all goal but now many searches are regionalized to show the most relevant web pages that are located in the area that you are searching from. While this will probably reduce your traffic, the traffic that you now receive will be more targeted in many cases. Additionally, it gives smaller websites a more equal chance to compete.

Google suggest

This August, Google Suggest was moved from Google Labs to the homepage, offering real-time suggestions based on the letters you’ve typed into the search box so far. This can be an incredibly helpful feature for users. At the same time, it can be potentially devastating to websites that rely on long-tail traffic because once a user sees a keyword phrase that seems like at least a mediocre choice they will usually click on it rather than continuing to type a more specific keyword phrase.
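The basic mechanic behind a suggest-style dropdown can be sketched in a few lines. This is only an illustration of the idea (prefix matching against logged queries ranked by popularity), not Google's actual implementation; the query counts are made up.

```python
def suggest(prefix, query_counts, limit=10):
    """Return the most popular logged queries starting with `prefix`."""
    matches = [(q, n) for q, n in query_counts.items()
               if q.startswith(prefix.lower())]
    # Most popular queries first, mirroring a suggest dropdown's ordering.
    matches.sort(key=lambda pair: pair[1], reverse=True)
    return [q for q, _ in matches[:limit]]

# Hypothetical query log: query -> how often it was searched.
query_log = {
    "seo book": 900,
    "seo tools": 1500,
    "seo software": 400,
    "search engine optimization": 2000,
}

print(suggest("seo", query_log))  # ['seo tools', 'seo book', 'seo software']
```

This is exactly why long-tail sites suffer: the moment a popular head query appears in the list, many users stop typing.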

Devaluation of paid links

Google’s recent attempt to eliminate paid links has scared a lot of people on both sides of the link buying equation into implementing the “nofollow” tag. In the midst of this hypocritical nonsense, Google has also been taking great measures to devalue links based on quantifiable criteria, such as the “C” class of the originating IP, similarities in anchor text and/or surrounding text, location of the link on the page and the authority of the domain the link is from, to name a few. Regardless of the effectiveness of any search engine's ability to evaluate and subsequently devalue paid links, the fear of getting caught and possibly penalized is more than enough to deter a lot of people from buying or selling links.

Visitor usage data

Again, Google is leading the charge on this one. Between their analytics, toolbar and web browser, they are collecting an enormous amount of data on visitor usage. When a visitor arrives at a website, Google knows how long they stayed there, how many pages they accessed, which links they followed and much more. With this data, a search engine can determine the quality of a website, which is beginning to carry more weight in regards to ranking than some of the more easily manipulated factors such as keyword density or inbound links. This puts the focus on content quality instead of content quantity and over time will begin to knock many of the “me too” websites further down the SERPs, or out of the picture altogether. The websites that will prosper will be those that produce relevant, original content that their visitors find useful.

TrustRank

Simply pointing a vast number of links with a particular keyword phrase in the anchor text to a website was once a quick and easy way to assure top ranking. The effectiveness of this approach is diminishing and will continue in that direction as a result of TrustRank. In a nutshell, a particular set of websites are chosen (by Google) based on their editorial quality and prominence on the Internet. Then Google analyzes the outbound links from these sites, the outbound links from the sites linked to by these sites, and so on down the chain. The sites that are further up the chain carry more trust and those further down the chain, less trust. Links from sites with more TrustRank, those further up the chain, have a greater impact on ranking than links from websites further down the chain. On one hand, this makes it difficult for new websites to improve their position in the SERPs compared to established websites; on the other hand, it helps to eliminate many of the redundant websites out there that are just repeating what everyone else is saying.

Google Chrome

Utilizing a combination of visitor usage data and a not-so-gentle nudge in Google’s direction, Google Chrome is set to change the way search engines gather data and present it to users. For example, when a user begins typing in the address bar of the browser, they are presented with a dropdown list of suggestions consisting of the #1 result in Google’s SERPs, related search terms and other pages they’ve recently visited. This gives a serious advantage to the websites that hold top ranking in Google and, at the same time, gives a serious advantage to Google by giving their Internet real estate even more exposure than ever before.
So the question remains, is search engine optimization facing evolution or extinction? Certainly not extinction, not by a long shot, but in a short period of time it is going to be drastically different than it is today. The focus will soon be on producing a valuable and enjoyable user experience rather than just achieving top ranking, which is what it should have been all along.

New Backlink Analysis Strategy: Compete.com Referral Analytics

Compete.com quietly launched a referral analytics product as part of their advanced package ($499/month). Even as a free user you can see the top 3 results for any site, which can be used to see how reliant a site is on search. Why is % of search traffic an important statistic?

  • If search traffic (as a % of total traffic) is low (relative to other competing sites) then it could indicate that there are organic optimization opportunities that are currently being missed and/or that site has a large organic traffic stream that can be marketed to in order to help it improve any search related weakness.
  • If search traffic (as a % of total traffic) is high (relative to other competing sites) then it could indicate that the site is near its full search potential, that the site is not very engaging, and/or that it does not have many loyal users.
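The metric behind both bullets is simple to compute once you have referral counts. A minimal sketch, using made-up visit numbers and an assumed list of search engine referrers:

```python
def search_traffic_share(referrals, search_engines=("google", "yahoo", "msn")):
    """referrals: {source: visits}. Returns search traffic as a % of total."""
    total = sum(referrals.values())
    search = sum(v for s, v in referrals.items() if s in search_engines)
    return 100.0 * search / total if total else 0.0

# Hypothetical monthly referral data for a site.
example = {"google": 300, "yahoo": 50, "direct": 500, "blogs": 150}
share = search_traffic_share(example)
print(round(share, 1))  # 35.0 -> search engines drive about a third of visits
```

A site at 35% search is far less exposed to a single Google penalty than one at 90%, which is the stability point made below.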

Here are search stats for SEO Book. Note that Google controls a minority of the traffic to this site, which means they have limited direct influence on the revenue of this site. Some sites are closer to 90% Google, which makes it easy for Google to effectively remove them from the web!

This sort of data is important for considering the viability of a business model, the stability of a site, and what multiple a site should sell for. It can also be used when considering the CPM of an ad unit - search traffic is much more targeted and goal oriented than a person browsing a forum is.

Until everyone and their dog started looking at PageRank (and how to manipulate it) it was a rather sound way of finding the most valuable backlinks. But with the pollution of endless bought links, nepotistic links, and PageRank only being updated quarterly it is tough to glean much market data from only looking at PageRank. Tools like SEO for Firefox (especially when used on a Yahoo! backlink search) allow you to gather more data about the quality of link sources. But they all try to measure proxies for value rather than how people actually surf the web.

Microsoft's BrowseRank research proposes using browsing data to supplement PageRank in determining relevancy. In Internet Explorer 8 (currently in beta) a person's browsing details are sent to Microsoft by default. With ~ 80% of the browser market, Microsoft does not need to use a random walk for the core of their relevancy algorithm - they know what people are actually doing, and can use usage data as a big part of their relevancy algorithms.
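To make the contrast with PageRank concrete, here is a loose toy sketch of the browsing-data idea. The actual BrowseRank paper models browsing as a continuous-time Markov process; this simplified version (entirely my own illustration, with invented session data) just scores pages by how much total attention real users give them.

```python
from collections import defaultdict

def browse_scores(sessions):
    """sessions: lists of (page, seconds_on_page) tuples from browsing logs.
    Returns each page's share of total observed attention."""
    time_on = defaultdict(float)
    for session in sessions:
        for page, seconds in session:
            time_on[page] += seconds
    total = sum(time_on.values())
    return {page: t / total for page, t in time_on.items()}

# Hypothetical logged sessions: users linger on good pages, bounce off spam.
sessions = [
    [("news.example", 120), ("video.example", 600)],
    [("news.example", 30), ("spam.example", 2)],
]
scores = browse_scores(sessions)
assert scores["video.example"] > scores["spam.example"]  # dwell time matters
```

The point of the contrast: a link-based random walk can be gamed with link building, while dwell-time signals require actual visitors to stay on the page.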

Using a tool like Compete.com Referral Analytics makes it far easier to poach top affiliates, discover the best ad buying locations, and replicate a competitor's best backlinks. Be forewarned that the tool only works at the domain level, so it is much better at analyzing Yahoo.com than shopping.yahoo.com.

Along with referral analytics Compete offers destination analytics, which let you know what websites people visit AFTER visiting a particular site...which should help you glean information about how sites are monetizing, what offers are working well, what sites are well referenced by another site, and what sites people go to if they can't get what they want on the current site.

At $500 a month, this tool is probably only going to be used by those who are already fairly successful rather than as an entry level tool.
