Social Interaction & Advertising Are The Modern Day Search Engine Submission & Link Building

Years ago (well before I was an SEO, or knew what SEO was) search engine submission was a huge phrase. Only recently has search engine marketing replaced search engine submission in popularity.

Search engine submission was a big part of the optimization game when search relevancy algorithms were heavily reliant on meta tags and on-page content. As search got polluted with on-page spam you needed to do more than submit to compete for coveted phrases; you had to build signals of trust from other sites. Link building was a requirement.

Many of the links that you could easily "build" have effectively disappeared from the web, through the use of nofollow and Google editing the PageRank of many (perhaps most) web directories. Recently Google removed their recommendations for directory submission and link building, when these 2 points disappeared from their guidelines:

  • Have other relevant sites link to yours.
  • Submit your site to relevant directories such as the Open Directory Project and Yahoo!, as well as to other industry-specific expert sites.

Might their reliance on directories be waning?

Absolutely.

Each additional link created and each additional web page published make Google smarter.

The web is a social network and search engines follow people. Once you think of the web from that perspective you have a HUGE advantage over competitors who are "building" one link at a time.

Google wants those who are well connected (and those who can afford to advertise) to succeed. Thus the evolution of SEO looks like...

  • search engine submission
  • on page optimization
  • link "building"
  • advertising, branding, viral marketing, public relations, & social interaction

Getting the basics right (keyword research, site structure, on page optimization) helps make everything else you do more effective. But with each day that passes you need a bit more (economic and/or social) capital to compete. What social interactions are built into your site? Why should bloggers write about your business?

Relationship Marketing Via Consumer Interaction

There was a time when people bought from those who knew them.

You went to the local butcher or baker, and he knew your name, and your kids' names. Personal interaction was a valuable sales and marketing tool.

We can apply this strategy to the web, too.

Interaction Marketing

Interaction marketing, as the name suggests, is about the marketing benefit that can be had from engaging with a visitor in a more personalized way.

It works well in an environment of anonymous, me-too sameness, because people still crave uniqueness and personal attention. This strategy isn't limited to commerce, either. It applies to all kinds of sites, including blogs.

What Are The Benefits Of Encouraging Interaction?

Encouraging interaction can result in more repeat visits, more sales, more loyalty, and more attention. In many cases, it's quite a simple thing to do, and the pay-offs can be enormous.

A well-known example is: "Do you want fries with that?". McDonald's upsell is an example of interaction marketing. They're asking the right question at just the right time, and they're personalizing the service. And it works, to the tune of billions in extra revenue per year.

The Blog's Squeaky Wheel

One of the problems with blogs, this one included, is that the audience isn't just one audience. There are many audiences.

Some people are experienced SEOs and have been reading here for a long time. Others might have only just learned what the phrase SEO means. Most people are spread across the continuum.

How do you deliver an experience that works well for everyone?

Distinguish Between New And Returning Visitors

Seth Godin advocates distinguishing between new and returning visitors to your site, and targeting them with slightly different messages.

For example, new commenters could be delivered to a special welcome page, informing them about various important areas of your site. You wouldn't necessarily want to do this for long-time users of the site, because it would slow them down, and might be seen as condescending rather than helpful.


One opportunity that's underused is the idea of using cookies to treat returning visitors differently than newbies. It's more work at first, but it can offer two experiences to two different sorts of people.

Here is a WordPress plugin that will do just that: Wordpress Commentor PlugIn.

By default, new visitors to your blog will see a small box above each post containing the words "If you're new here, you may want to subscribe to my RSS feed. Thanks for visiting!" After 5 visits the message disappears. You can customize this message, its lifespan, and its location.
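Under the hood, this sort of plugin is just a visit counter stored in a cookie. Here's a rough, framework-free sketch of the idea in Python — the cookie name and the 5-visit threshold mirror the plugin's defaults, but the details are my own assumptions, not the plugin's actual code:

```python
from http.cookies import SimpleCookie

WELCOME = ("If you're new here, you may want to subscribe to my RSS feed. "
           "Thanks for visiting!")
MAX_VISITS = 5  # the message disappears after this many visits

def greet(cookie_header):
    """Return (message_or_None, Set-Cookie value) for one request."""
    cookie = SimpleCookie(cookie_header or "")
    visits = int(cookie["visits"].value) if "visits" in cookie else 0
    visits += 1
    message = WELCOME if visits <= MAX_VISITS else None
    return message, "visits=%d; Max-Age=31536000; Path=/" % visits
```

On a first visit, `greet(None)` returns the welcome message and sets `visits=1`; once the counter passes 5, the message is suppressed for that reader.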

You could also try this one: Comment Redirect PlugIn

Another way to achieve the same thing is to send an email to new commenters upon registration, outlining the top posts and welcoming them.

You're customizing the experience only slightly, but the payoffs in terms of relationship building could be considerable. Users are more likely to perceive the interaction as helpful and personalized.

Ask For A Link In The Order Confirmation E-mail

That is certainly one of those "why-didn't-I-think-of-that" moments.

You could ask customers, or new sign ups, to link to you. Your customers are prime candidates to approach for links, because they are already familiar with you, presumably like you, and the relationship has already been established.

Amazon-Style Feedback Reminder

Amazon, and their partners, ask for a review a few days after you buy something.

Not only is this a great way to get feedback, customers may also provide you with content. Make it easy for them to do so.

Selective Advertising

Advertising can annoy visitors, and compromise your brand. You can give people added value by removing advertising for those who join up.

Similarly, you could leave advertising off new content. Create a different template that includes ads for your archived content. By doing so, you can monetize most of your content without annoying your regular readers.

Further Reading

Copyright in Reverse? How Will THAT Change Marketing?

Interesting story about the success of Guitar Hero, and the music included in Guitar Hero

The use of a sound recording in a video game is not subject to any sort of statutory royalty – the game maker must receive a license negotiated with the copyright holder of the recording – usually the record company. In previous editions of the game, Guitar Hero has paid for music rights. However, now that the game has proved its value in promoting the sale of music, the head of Activision, the company that owns the game, has suggested in a Wall Street Journal interview that it should be the record companies that are paying him to include the music in the game – and no doubt many artists would gladly do so for the promotional value they realize from the game.

A while ago I mentioned something along the lines of "the information you sell today, you might be willing to pay people to consume in a couple years." In some markets there is a lot of strategic value in asymmetrical information distribution, but...

  • the decline of copyright
  • the ease of local substitution
  • the near infinite level of competition online

make keeping secrets hard, and make selling information a difficult practice unless you...

  • keep adding to what you are selling
  • are aggressive at public relations and marketing and/or become synonymous with an information format or type of transaction
  • can build enough exposure to flip the business model around (like Activision's CEO is planning on doing)
  • interact with customers and personalize the experience
  • dig deeper than competing services and invest into infrastructure to provide a barrier to entry

The network quickly changes itself as imitators copy market leaders and effective marketing strategies, turning what was once a smart technique into dirty spam...constantly burning out and reinventing the field of marketing.

If you have a competitive advantage and/or find that something is particularly effective it is probably in your best interest to either burn it out quickly before competitors can also use it, or share the information publicly and become synonymous with the technique.

If a market seems saturated then pick a different niche or write from the opposite perspective that most of the talking heads write from. The best time to buy a great domain, set up a new site, and build competitive search advantages is before the competition is aware of the value of a market or marketing technique.

Is Buying Links Stupid?

This old chestnut.

There is a post over at Search Engine Land by Danny Sullivan entitled "Conversation With An Idiot Link Broker". To cut a long story short, some guy tries to broker a link deal with Danny, seemingly not knowing who Danny is, and Danny plays him along. Danny reports him to the Google spam team.

For the sake of furthering discussion, I'll play devil's advocate :)

Regardless of anyone's views on link buying, it is wrong to mislead people. Danny clearly felt this guy was being misleading, and gave him a number of chances to clarify his position. But is buying and selling links really as "risky" a behavior as is being made out?

It might be considered a risky behavior if you spend a lot of time obsessing about Google, as SEOs tend to do. However, links are the glue that binds the web. Link buying and selling started long before Google existed. It will always happen.

It's called advertising.

But it would be disingenuous not to see what Danny is really talking about here. He's talking about buying links for the sole purpose of gaining link juice. I can understand why Google takes a dim view of this practice. Paid links compromise Google's business model.

Fair enough. If I worked for Google, I'd take the same stance.

For Danny Sullivan, given the level of exposure of his site in the search world, the risks presented by link trading would be significant. Regardless of Danny's personal opinion on such practices, such a deal would clearly be a non-starter. The link seller is a fool for, above all else, failing to identify his customer.

However, for most sites, the reality is that the risk of link buying and selling is probably negligible.

Google taking out the occasional site amidst a storm of publicity doesn't mean much when there are tens of thousands of sites that clearly do not receive the exact same treatment. If one site in two got hammered, it would be a different story, but it is likely the figures run into one site in thousands. It then becomes a matter of weighing one's chances of being detected and punished by Google against the potential rewards on offer.

For example, there are credible, Fortune 500 companies engaged in buying and selling links. The risk of big names being taken out for any longer than a day or two is near zero. If you run the sort of big name site searchers expect to see in the results, Google probably aren't going to leave you out on a technicality. This would compromise their business model, because Google must deliver relevant results.

Is it up to the link seller to outline all the potential risks involved? Apart from the comical farce of a link seller failing to identify Danny Sullivan, how big a moral crime has the guy really committed? Do Google outline all the risks associated with using their products and services? Or is Danny cunningly implying that Google's algorithm cannot determine which links are paid, and in fact relies on people filing reports? ;)

A moral tone runs through such discussions, and I'm not sure it is entirely consistent.

Google are a business and their pronouncements must be considered in this context. They will act in their own interest, and those interests may or may not align with your own. Are we at risk of ceding the assumption of moral superiority to Google when they may not deserve it? Google, like you, are trying to earn a crust, and any organization may not be entirely transparent and morally consistent in all they do. Who do you call out, and who gets a free pass?

Google certainly holds the power, and if being in the SERPs matters a lot to you, then you should stay within Google's guidelines. It's also fair to say that, these days, even this approach offers no guarantees.

Tread wisely :)


How To Choose Domain Names For SEO

Domaining.

It has been a hot topic for a while now, yet many domainers aren't overly active in the SEO space. Yet.

Domaining is when you register a domain, or buy a domain on the secondary market, with the intention of deriving traffic, and turning that traffic into revenue. Traffic comes from type-in traffic, i.e. people type a keyword into the address bar and add .com on the end. Domains can be valuable internet real estate because, unlike a search engine, there is no middleman between you and the visitor. A lucrative pursuit, if you choose the right names.

Let's take a look at how domaining strategy can be applied to SEO.

Background

Aaron has a great interview with Frank Schilling. Frank is one of the biggest domainers on the planet, and an articulate advocate of this strategy.

Add this lot to your feed reader:

http://www.sevenmile.com/
http://rickschwartz.typepad.com/
http://www.whizzbangsblog.com/
http://www.domainnews.com/

If anyone has other suggestions for great domaining blogs, please add them to the comments.

How To Select A Domain Name

Google tends to give weight to keywords in the domain name. This increases the importance of selecting a good name.

When choosing a domain name for SEO purposes, there are three main factors to consider:

  • Brand
  • Rankability
  • Linkability

Brand

Should you use a hyphenated, multi-term domain like search-engine-marketing-services.com?

I'd avoid such names like the plague.

Why?

They have no branding value. They have limited SEO value. Even if you do manage to get such a domain into the top ten, you're probably going to need to sell on the first visit, as few people are going to remember it once they leave. It is too generic, and it lacks credibility.

In a crowded market, brand offers a point of distinction.

It is easier to build links to branded domain names. People take these names more seriously than keyword-keyword-keyword-keyword.com, which looks spammy and isn't fooling anyone. Would you link to such a name? Linking to it devalues your own content.

It can even be difficult to get such domain names linked to when you pay for the privilege! Directory editors often reject these names on sight, because such names are often associated with low-quality content. Imagine how many free links you might be losing by choosing such a name.

Is there a downside to using branded names?

Yes.

Unless you have a huge marketing budget, no one is going to search for perseefgxcbtrfy.com, which is a killer new brand I just made up ;)

Thankfully, there is a happy medium between brand and SEO strategy.

Rankability

SEOs realise the value of keywords. When naming your site, and deciding on a domain name, try combining the lessons of SEO, branding, and domaining.

Generic + term is a good approach to use. Take your chosen keyword, and simply add another word on the end: SeoBook, Travelocity, FlightsCity, CarHub, etc. These names have SEO value built into them, because people are forced to use your keywords in the link. Also, Google (currently) values a keyword within the domain name for ranking purposes. Finally, such a name retains an element of unique branding.

These types of domain names score high on the rank-ability and link-ability meter. They are generic enough to rank well for the keyword term, yet contain just enough branding difference to be memorable.

The SEO Advantage

There is another advantage for SEOs in the domain space.

Dot-coms can sell for 5-20 times as much as a .org or .net. Keyword + .com can sell for millions of dollars, depending on the domain name.

Expensive, huh.

But...

By registering or buying the cheaper .net or .org equivalent, building out the site, and ranking well for keyword + net, or + org, you increase the value of the domain name markedly. Sure, you're one step away from pure domaining and you still have Google to contend with, but you'll be head and shoulders above those who are undervaluing these names.

A lot of domainers aren't operating in this space.

Yet.

Other Tips And Ideas

Leave The Keyword Out Entirely

Use the related-search function on Google (~keyword) and see if any of the related keyword terms fit. This can be a good strategy to use if all the good generic keyword names are gone. It might get you close enough to the action, without the enormous price tag. Might be more memorable, too.

How To Test A Domain Name For Penalties Before Buying It

  • Verify the site is not blocking GoogleBot in their robots.txt file
  • Point a link at the domain from a trusted site and see if Google indexes it
  • Within a couple weeks (at most a month) Google should list the site when you search for it in Google using site:domainname.com
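The robots.txt check is easy to script. Here's a minimal sketch using Python's standard library — in practice you'd fetch the live robots.txt from the domain you're vetting and pass its text in:

```python
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt, url):
    """Would Googlebot be allowed to fetch `url`, given this robots.txt body?"""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)
```

A domain whose robots.txt reads `User-agent: Googlebot` / `Disallow: /` is blocking GoogleBot entirely, and no amount of trusted links will get it indexed.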


Align Your SEO Strategy With Site Structure

I'd like to take a look at an area often overlooked in SEO.

Site architecture.

Site architecture is important for SEO for three main reasons:

  • To focus on the most important keyword terms
  • To control the flow of link equity around the site
  • To ensure spiders can crawl the site

Simple, eh? Yet many webmasters get it wrong.

Let's take a look at how to do it properly.

Evaluate The Competition

Once you've decided on your message, and your plan, the next step is to lay out your site structure.

Start by evaluating your competition. Grab your list of keyword terms, and search for the most popular sites listed under those terms. Take a look at their navigation. What topic areas do they use for their main navigation scheme? Do they use secondary navigation? Are there similarities in topic areas across competitor sites?

Open a spreadsheet, and list their categories, and title tags, and look for keyword patterns. You'll soon see similarities. By evaluating the navigation used by your competition, you'll get a good feel for the tried-n-true "money" topics.

You can then run these sites through metrics sites like Compete.com.

Use the most common, heavily trafficked areas as your core navigation sections.

The Home Page Advantage

Those who know how PageRank functions can skip this section.

Your home page will almost certainly have the highest level of authority.

While there are a lot of debates about the merits of PageRank when it comes to ranking, it is fair to say that PageRank is a rough indicator of a page's level of authority. Pages with more authority are spidered more frequently and enjoy higher rankings than pages with lower authority. The home page is often the page with the most links pointing to it, so the home page typically has the highest level of authority. Authority passes from one page to the next.

For each link off a page, the authority level will be split.

For example - and I'm simplifying greatly for the purposes of illustration* - if you have a home page with ten units of link juice, two links to two sub-pages would see each sub-page receive 5 units of link juice. If each sub-page has two links, each sub-sub-page would receive 2.5 units of link juice, and so on.
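The naive even split above can be sketched in a few lines. Remember this is illustration only — the real PageRank calculation is iterative and includes a damping factor:

```python
def distribute(juice, tree, root="home"):
    """Evenly split `juice` down a link tree -- the naive model only.

    `tree` maps each page to the pages it links to."""
    result = {}

    def walk(page, amount):
        result[page] = result.get(page, 0) + amount
        children = tree.get(page, [])
        for child in children:
            walk(child, amount / len(children))

    walk(root, juice)
    return result

# Home page with ten units, two sub-pages, each linking to two more.
site = {"home": ["a", "b"], "a": ["a1", "a2"], "b": ["b1", "b2"]}
```

`distribute(10, site)` gives each sub-page 5 units and each sub-sub-page 2.5 — which is exactly why deep pages end up starved of link juice.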

The important point to understand is that the further your pages are away from the home page, generally the less link juice those pages will have, unless they are linked from external pages. This is why you need to think carefully about site structure.

For SEO purposes, try to keep your money areas close to the home page.

*Note: Those who know how PageRank functions will realise my explanation above is not technically correct. The way PageRank splits is more sophisticated than that given in my illustration. For those who want a more technical breakdown of the PageRank calculations, check out Phil's post at WebWorkshop.

How Deep Do I Go?

Keeping your site structure shallow is a good rule of thumb. So long as your main page is linked well, all your internal pages will have sufficient authority to be crawled regularly. You also achieve clarity and focus.

A shallow site structure is not just about facilitating crawling. After all, you could just create a Google Site Map and achieve the same goal. Site structure is also about selectively passing authority to your money pages, and not wasting it on pages less deserving. This is straightforward with a small site, but the problem gets more challenging as your site grows.

One way to manage scale is by grouping your keyword terms into primary and secondary navigation.

Main & Secondary Navigation

Main navigation is where you place your core topics i.e. the most common, highly trafficked topics you found when you performed your competitive analysis. Typically, people use tabs across the top, or a list down the left hand side of the screen. Main navigation appears on all other pages.

Secondary navigation consists of all other links, such as latest post, related articles, etc. Secondary navigation does not appear on every page, but is related to the core page upon which it appears.

One way to split navigation is to organize your core areas into the main navigation tabs across the top, and provide secondary navigation down the side.

For example, let's say your main navigation layout looked like this:

Each time I click a main navigation term, the secondary navigation down the left-hand side changes. The secondary navigation consists of keywords related to the core area.

For those of you who are members, Aaron has an in-depth video demonstration on Site Architecture And Internal Linking, as well as instruction on how to integrate and manage keywords.

Make Navigation Usable

Various studies indicate that humans are easily confused when presented with more than seven choices. Keep this in mind when creating your core navigation areas.

If you offer more than seven choices, find ways to break things down further. For example, by year, manufacturer, model, classification, etc.

You can also break these areas down with an "eye break" between each. Here's a good example of this technique on Chocolate.com:

Search spiders, on the other hand, aren't confused by multiple choices. Secondary navigation, which includes links within the body copy, provides plenty of opportunity to place keywords in links. Good for usability, too.

As your site grows, new content is linked to by secondary navigation. The key is to continually monitor what content produces the most money/visitor response. Elevate successful topics higher up your navigation tree, and relegate loss-making topics.

Use your analytics package to do this. In most packages, you can get breakdowns of the most popular, and least popular, pages. Organise this list by "most popular". Your most popular pages should be at the top of your navigation tree. You also need to consider your business objectives. Your money pages might not be the same pages as your most popular pages, so it's also a good idea to set up funnel tracking to ensure the pages you're elevating also align with your business goals.
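As a sketch, the elevation step can be as simple as re-sorting pages by a blended score. The traffic numbers and conversion weighting below are entirely made up for illustration:

```python
# Hypothetical analytics export: page -> (monthly visits, conversions).
analytics = {
    "/widgets":      (12000, 300),
    "/blue-widgets": (4000,  260),
    "/news":         (7000,  15),
    "/about":        (900,   2),
}

def navigation_order(pages, conversion_weight=10):
    """Order pages for the navigation tree by visits plus weighted
    conversions, so money pages aren't drowned out by merely popular ones."""
    def score(page):
        visits, conversions = pages[page]
        return visits + conversions * conversion_weight
    return sorted(pages, key=score, reverse=True)
```

Tuning `conversion_weight` is how you encode your business objectives: a high weight pushes converting pages up the tree even when raw traffic says otherwise.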

If a page is ranking well for a term, and that page is getting good results, you might want to consider adding a second page targeting the same term. Google may then group the pages together, effectively giving you listings #1 and #2.

Subject Theming

A variant on Main & Secondary Navigation is subject theming.

Theming is a controversial topic in SEO. The assumption is that the search engines will try to determine the general theme of your site, therefore you should keep all your pages based around a central theme.

The theory goes that you can find out which words Google places in the same "theme" by using the tilde (~) command in Google. For example, if you search on ~cars, you'll see "automobile", "auto", "bmw" and other related terms highlighted in the SERP results. You use these terms as headings for pages in your site.

However, many people feel that themes do not work, because search engines return individual pages, not sites. Therefore, it follows that the topic of other pages on the site aren't directly attributable to the ranking of an individual page.

Without getting into a debate about the existence or non-existence of theme evaluation in the algorithm, theming is a great way to conceptually organize your site and research keywords.

Establish a central theme, then create a list of sub-topics made up of related (~) terms. Make sub-topics of sub-topics. Eventually, your site resembles a pyramid structure. Each sub-topic is organized into a directory folder, which naturally "loads" keywords into URL strings, breadcrumb trails, etc. The entire site is made up of keywords related to the main theme.

Bruce Clay provides a good overview of Subject Theming.

Bleeding PageRank?

You might also wish to balance the number of outgoing links with the number of internal links. Some people are concerned about this aspect, i.e. so-called "bleeding PageRank". A page doesn't lose PageRank because you link out, but linking out does affect the amount of PageRank available to pass to other pages. This is also known as link equity.

It is good to be aware of this, but don't let it dictate your course of action too much. Remember, an outbound link is a potential advertisement for your site, in the form of referral data in someone else's logs. A good rule of thumb is to balance the number of internal links with the number of external links. Personally, I ignore this aspect of SEO site construction and instead focus on providing visitor value.

Link Equity & No Follow

Another way to control the link equity that flows around your site is to use the nofollow attribute. For example, check out the navigational links at the bottom of the page:

As these target pages aren't important in terms of ranking, you could nofollow these links to ensure your main links have more link equity to pass to other pages.

Re-Focus On The Most Important Content

This might sound like sacrilege, but it can often pay not to let search engines display all the pages in your site.

Let's say you have twenty pages, all titled "Acme". Links containing the keyword term "Acme" point to various pages. What does the algorithm do when faced with these pages? It doesn't display all of them for the keyword term "Acme". It chooses the one page it considers most worthy, and displays that.

Rather than leave it all to the algorithm, it often pays to pick the single most relevant page you want to rank, and 301 all the other similarly-themed pages to point to it. Here are some instructions on how to 301 pages.

By doing this, you focus link equity on the most important page, rather than splitting it across multiple pages.
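In practice you'd usually issue the 301s in your web server configuration, but the mechanics are easy to show as a tiny WSGI app. The page paths here are hypothetical:

```python
# Map similarly-themed pages onto the single page we want to rank.
# These paths are made up for illustration.
REDIRECTS = {
    "/acme-old": "/acme",
    "/acme-2005": "/acme",
}

def app(environ, start_response):
    """Minimal WSGI app that 301s consolidated pages to the canonical one."""
    path = environ.get("PATH_INFO", "/")
    if path in REDIRECTS:
        start_response("301 Moved Permanently",
                       [("Location", REDIRECTS[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<h1>Acme</h1>"]
```

The permanent (301) status code is the important part: it tells search engines to transfer the old page's link equity to the target, where a temporary 302 would not.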

Create Cross Referenced Navigational Structures

Aaron has a good tip regarding cross-referencing within the secondary page body text. I'll repeat it here for good measure:

This idea may sound a bit complex until you visualize it as a keyword chart with an x and y axis.

Imagine that a, b, c, ... z are all good keywords.
Imagine that 1, 2, 3, ... 10 are all good keywords.

If you have a page on each subject consider placing the navigation for a through z in the sidebar while using links and brief descriptions for 1 through 10 as the content of the page. If people search for d7, or b9, that cross-referencing page will be relevant for it, and if it is done well it does not look too spammy. Since these types of pages can spread link equity across so many pages of different categories, make sure they are linked to well, high up in the site's structure. These pages work especially well for categorized content cross-referenced by locations.
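Aaron's x/y grid amounts to generating one page per (x, y) pair — one axis drives the sidebar navigation, the other the links in the page body. A quick sketch, with toy keyword lists standing in for real terms:

```python
from itertools import product

letters = ["a", "b", "c"]   # one keyword axis (e.g. product categories)
numbers = ["1", "2", "3"]   # the other axis (e.g. locations)

def cross_reference_pages(xs, ys):
    """Build one page per (x, y) pair: the x axis drives the sidebar
    navigation, the y axis drives the links in the page body."""
    return {x + y: {"sidebar": list(xs), "body_links": list(ys)}
            for x, y in product(xs, ys)}
```

With 26 letters and 10 numbers you'd get 260 pages from two short keyword lists, each one relevant for a combined query like "d7".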


Firefox Rank Checker Extension Now With Pretty Graphs

A member of the SEO Book community wanted to add graphs to the Rank Checker extension. Please give Site Rank Reporter a try, and leave feedback below. I have alerted him to this thread and he is anxious for your feedback.

A couple tips...

  • you must save the Rank Checker data to a CSV file before importing it into the Site Rank Reporter tool.
  • you need at least a few days' worth of data to see the benefit of the charts.
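Once exported, the CSV is easy to reshape for charting. A sketch using only the standard library — note the column names (date, keyword, rank) are my assumption; check them against your actual Rank Checker export:

```python
import csv
import io

# Sample export -- column names are assumed, not taken from the real tool.
SAMPLE = """date,keyword,rank
2008-01-01,seo book,5
2008-01-08,seo book,3
2008-01-01,link building,12
2008-01-08,link building,9
"""

def rank_history(csv_text):
    """Group exported ranks by keyword, each series ordered by date."""
    history = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        history.setdefault(row["keyword"], []).append(
            (row["date"], int(row["rank"])))
    for series in history.values():
        series.sort()
    return history
```

Each per-keyword series is then ready to feed to whatever charting tool you prefer.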

Where Are You Placed On The Quality Curve?

Techcrunch is publishing a rumour that Yahoo might be looking to sell off Yahoo Answers.

"Yahoo Answers, which was launched in late 2005, is a staggeringly huge site. Recent Comscore stats say the service attracts nearly 150 million monthly visitors worldwide and generates 1.3 billion monthly page views. That's 67% unique visitor growth in the last year. Yahoo as a whole, though, has nearly 100 billion monthly page views, so it isn't a material percentage of total Yahoo traffic"

Nice traffic; however, Yahoo Answers is full of junk content. There are now numerous competitors in the Q&A space.

If you're the first mover, as Yahoo was, you can get away with low-quality content, but as competition increases, the quality must also increase in order to keep people hooked. Whilst hugely successful in terms of traffic numbers, Yahoo Answers must now respond to increasing competition. With rumours of a sale, it looks like Yahoo may instead be refocusing their efforts on their core business.

This is an example of the "curve to quality" pattern. First movers can get away with junk content for a while, but eventually competitors will up the quality and gain audience share as a result. This reinforces the need to adapt business models in light of competition, and the need to avoid commodity status.

We can see the same curve to quality pattern in the blog world.

Jakob Nielsen was advising a world leader in his field on what to do about his website. The guy wanted to know if he should start a blog.

Nielsen's answer was no, and here's why:

"Blog postings will always be commodity content: there's a limit to the value you can provide with a short comment on somebody else's work. Such postings are good for generating controversy and short-term traffic, and they're definitely easy to write. But they don't build sustainable value. Think of how disappointing it feels when you're searching for something and get directed to short postings in the middle of a debate that occurred years before, and is thus irrelevant."

Also check out the graph "variability of posting quality" in Nielsen's post.

I suspect Nielsen is on the right track. Blog traffic is reportedly at an all-time high, but blogs still account for only 0.73% of US traffic. Perhaps as the quality of the average blog increases, so too will the audience share.

Due to the pressure of competition, low quality content eventually becomes commodity.

Do you read me-too search blogs? Not many people do. Most people gravitate towards the blogs that offer the highest perceived level of quality, as opposed to those that repeat the same news found elsewhere. Me-too content is no longer an effective strategy in the blog world, or the newspaper world, as syndicated news services are finding out. There is simply too much competition.

There are other reasons why you might want to focus on quality as a strategy.

Google will always try to filter out low quality, commodity content in order to heighten user experience. Google approaches this problem in a number of ways.

In the remote quality rater document, Google lists a range of categories raters can attribute to web content. One category is "Not Relevant". This category applies to "news items that appear outdated" and "lower quality pages about the topic". Obviously, "lower quality" is a relative term and the comparison would be made between competing SERP results. Pages categorised as "Not Relevant" will receive lower SERP placement.

Also consider the notion of poison words. Poison words are words the search engines equate with content of low quality. If, just for example, forum content is found to frequently be of low quality, then it is reasonable to assume Google will look for markers that a site is a forum and mark this content down as a result. Markers might include a link back to a popular forum software script, for example.

This metric would not be taken in isolation as there are various other quality markers Google use. However, if the content is low quality and appears in a low quality format, you stand less chance of ranking for competitive queries.

The same might apply to commercial content, especially such content that appears in non-commercial query results.

Google's business model involves advertisers paying for clicks in the form of Adwords. The main SERPs are essentially a loss leader that facilitate people clicking on text advertisements. The main SERPs are the reason people use Google.

Such a business model would be supported by an algorithm that rewarded quality, informative content in the main SERPs. It could operate by downgrading any content deemed purely commercial, and this would involve looking for commercially-oriented poison words. Poison words in this context might include "Buy Now", "Business Address", and other variants unique to commercial content. This would "encourage" those with commercial messages to list with Adwords because they would have trouble appearing in the main SERPs. It is unlikely such an algorithm would apply to commercial queries, however.
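To make the idea concrete, a scoring pass over page copy might look something like the sketch below. The word list is purely hypothetical — nobody outside Google knows what (or whether) such markers are actually used:

```python
# Purely hypothetical marker list -- illustrative only.
POISON_WORDS = ["buy now", "business address", "free shipping"]

def commercial_score(text):
    """Crude commercial-marker density: hits per 100 words."""
    lowered = text.lower()
    hits = sum(lowered.count(marker) for marker in POISON_WORDS)
    words = max(len(text.split()), 1)
    return 100.0 * hits / words
```

A page scoring high on a metric like this might be treated as commercial content and kept out of non-commercial query results, in combination with other quality signals.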

Google filters in this way because there is much competition for keyword queries. Google looks to find the best answer. The answer of highest quality, both in terms of relevance and searcher satisfaction. As competition increases, the answers will get better, which is why you must aim to stay high on the quality curve.

How Does Matt Cutts Get Ready for Work? (Picture Reveals All)

A few months back I bought a drawing of Matt Cutts and forgot about it.

How to Update Firefox Extensions (and/or Uninstall & Re-install Them)

Since we have a number of popular Firefox extensions, I frequently get asked how to update Firefox extensions. Rather than writing 3 emails a week I figured it was quicker to jot down a quick blog post. To update or uninstall an extension you first have to click into the add-ons panel.

When you get inside the extensions area (by following the path highlighted above) you will see an Add-ons window with a Find Updates button at the bottom of it. That is an easy way to update many extensions at once.

The other way to update or uninstall is to scroll to an extension and click on it.

  • If you left click, Disable and Uninstall buttons will appear.
  • If you right click on an extension you will see a menu pop up with the option to Uninstall the extension. This menu also gives an option for you to Find Update.

Any time you update or uninstall, you have to restart Firefox for it to take effect. If you uninstall an extension that you want to reinstall later, go back to the source where you downloaded it. Instructions for installing an extension are well laid out on the SEO for Firefox page.
