Keywords I am Targeting


I get a lot of site review requests for sites that recently took a dive in Google where the pages generally follow the same format: every time the main keyword phrase appears on the page it appears in the exact same form, and it appears just about everywhere.

There are two major problems with that format:

  1. Over-optimization: if a page is obviously targeting a phrase then Google may not want to rank that page for that phrase. When people write naturally (i.e., for humans, not engines) there tends to be variation in the phrasing. Some content management systems will make parts of the page fairly repetitive, but mix things up wherever you can.

    I see some examples where Google is ranking a page focused on topic A for topic B just because topic B exists as a navigational element on the topic A page. The same site has a more relevant page about the search query, but the wrong page sometimes ends up ranking. While that is trashy relevancy from Google, it hints at where Google wants to head with their algorithms: they are trying to recognize natural writing and reward pages for not being too repetitious or overly focused. They still need to do some serious work on how they factor navigational elements into the relevancy algorithms, but when they do you can expect them to get even more aggressive about favoring natural writing over spammy optimized content.

  2. Wasted opportunity: assuming you took the time to create unique content for each page, it only takes an extra minute or two to mix things up and help the page rank for a much wider net of keywords.

If you can find a way to mix up your keyword phrases, like:

  • sometimes leave one of the words out

  • sometimes just use one or two of the words isolated from the others
  • use alternate versions of the words
  • switch up the word order
  • use modifiers and semantically related text
  • make the internal anchor text slightly different than what you focus the page content on
  • use variation in anchor text from external link sources, and focus it on slightly different words than you focused the internal linkage on

you will end up ranking for a lot more phrases and will rank more consistently and reliably in Google. Others will bitch about updates giving them a raw deal and Google grabbing power while you keep getting more and more traffic.
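
As a rough illustration (my own sketch, not part of the original advice), here is how a short script might generate that sort of phrase variation from a seed phrase; the seed words, synonyms, and modifiers are made up:

    from itertools import permutations

    # Hypothetical seed phrase, synonyms, and modifiers -- substitute your own keywords.
    seed = ["discount", "ski", "vacations"]
    synonyms = {"discount": ["cheap", "budget"], "vacations": ["trips", "packages"]}
    modifiers = ["best", "family", "last minute"]

    variations = set()

    # Switch up the word order.
    for order in permutations(seed):
        variations.add(" ".join(order))

    # Sometimes leave one of the words out.
    for i in range(len(seed)):
        variations.add(" ".join(seed[:i] + seed[i + 1:]))

    # Use alternate versions of individual words.
    for word, alts in synonyms.items():
        for alt in alts:
            variations.add(" ".join(alt if w == word else w for w in seed))

    # Add modifiers and semantically related text.
    for mod in modifiers:
        variations.add(mod + " " + " ".join(seed))

    for phrase in sorted(variations):
        print(phrase)

A list like that is only a starting point; the variations still need to be worked into copy that reads naturally.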

Search Relevancy Algorithms: Google vs Yahoo! vs MSN

What are the major algorithmic differences between the major search engines? I tried answering that question when I recently wrote an article comparing Google to Yahoo! Search to MSN Search.

Please let me know what you think of it.

Duplicate Duplicate Content

Bill Slawski and Todd Malicoat both recently published great posts about duplicate content detection and how to avoid producing duplicate content.

Todd also posted a link to this shingles PDF, which describes some of the ways to detect duplicate content.
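
For context, shingling (the technique that PDF covers) breaks a document into overlapping runs of words and measures how much of that set two documents share. A minimal sketch of the idea, my own illustration rather than anything taken from the paper:

    def shingles(text, w=4):
        # Break the text into overlapping runs ("shingles") of w words.
        words = text.lower().split()
        return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

    def resemblance(a, b, w=4):
        # Jaccard overlap of the two shingle sets; values near 1.0 suggest near-duplicates.
        sa, sb = shingles(a, w), shingles(b, w)
        if not sa or not sb:
            return 0.0
        return len(sa & sb) / len(sa | sb)

    print(resemblance("the quick brown fox jumps over the lazy dog",
                      "the quick brown fox leaps over the lazy dog"))

Production systems sample or hash the shingles rather than comparing every page pair directly, but the resemblance score is the core idea.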

The Google Crawling Sandbox

With Matt Cutts's recent post about the changing quality signals needed to get indexed in Google, and with sites that have an excess of low quality links getting crawled more shallowly (some of them not getting crawled at all), some people are comparing Google's tightening crawl standards to an early version of the way the Google Sandbox prevents new or untrusted sites from ranking. WebmasterWorld has a 7ish page thread about BigDaddy, where Graywolf said:

I'm personally of the opinion that we're starting to see the 'sandbox of crawling'

What is the Optimal Site Size?

Some people in the thread are asking about the optimal site size for crawling, or whether they should change their internal navigation to accommodate the new Google, but to some extent I think that misses the mark.

If you completely change your site structure away from being usable just to appease a Google that is in a state of flux, you are missing the real message they are trying to send. If you rely too heavily on Google then you might find they are in a constant state of being broken, at least from your perspective ;)

The site size should depend largely on:

  • how much unique content you can create around the topic

  • how well you can coax others into wanting to create unique topical content for you
  • how people shop
  • how people search for information
  • how much brand strength you have (a smaller site may make it easier to build a stronger niche specific brand, and in most cases less content of higher quality is far more remarkable than lots of junk content)

Many times it is better to have smaller sites so that you can focus the branding messages. When you look at some of the mega sites, like eBay, they are exceptionally weak on deep links, but they also have enough authority, mindshare, and quality link reputation that they are still represented well in Google.

Scaling Out Link Quality and Unique Content

Another big issue with crawl depth is not only link quality, but also how unique the content is on a per page level. I was recently asked about how much link popularity was needed to index a 100,000,000 page site with cross referenced locations and categories. My response was that I didn't think they could create that much content AND have it unique enough to keep it all indexed AND build enough linkage data to make Google want to index it all.

Sometimes less is more.

The same goes for links too. If you go too hard after acquiring links the sandbox is a very real phenomenon. If you get real editorial citations and / or go for fewer, higher quality links you will probably end up ranking quicker in Google.

While it may help to be selective about how many links you build (and what sources you are willing to get links from), there is also great value in being selective about who you are willing to link at, AND in linking out to many quality resources that would be hard to make look spammy. Rand recently posted:

From a trustworthy source - Googlebowling is totally possible, but you need to use patterns that would show that the site has "participated" in the program. What does that mean? Check who they link to - see if you can make the same spammy links point to those places and watch for link spam schemes that aren't in the business of pointing to people who don't pay them.

So if you make your site an island, or only partner with other sources that would be easy to take out, you limit your stability.

What Makes a Site More Stable?

The big sites that will have 100,000 pages stick in the SERPs are real brands and/or sites that offer added value features. Can individuals create sites at that scale that will still stick? I think they can, but there has to be a comment worthy element to them. They have to find a way to leverage and structure data, and / or they need an architecture for social participation and content generation.

The Net Cost & Value of Large Algorithmic Swings

Some people say that wild search algorithm swings are not a big deal since for every person losing someone else must gain, so the net effect does not drive people toward paid ads, but I do not buy that.

If your sites are thin spam sites and you have limited real costs the algorithmic swings might not be a big deal, but when businesses grow quickly or have their income sharply drop it affects their profitability, both as they scale up and scale down. You also have to factor in the cost of monitoring site rankings and link building.

At the very least, the ability to turn on or turn off traffic flows (or at least finely adjust them) makes PPC ads an appealing supplement to real businesses with real employees and real business costs. Dan Thies mentioned his liking of PPC ads largely for this reason when I interviewed him about a year ago.

As Google makes it harder to spam and catches spam quicker eventually the opportunity cost of spamming or running cheesy no value add thin sites will exceed the potential profit most people could attain.

Authority Systems Influence the Networks They Measure

Some people are looking to create systems that measure influence, arguing that as attention grows scarcer it will increase in value:

Attention is what people produce (as in "hand over the money" or "look at this ad") in exchange for information and experience. As Lanham writes in The Economics of Attention, the most successful artists and companies are the ones that grab attention and shape it, in other words, that exercise influence. With so much information, simply paying attention is the equivalent of consuming a meal or a tube of toothpaste.

Any system that measures influence also exerts an influence on the market it measures. A retail site in an under-marketed industry that randomly winds up on the Delicious popular list or Memeorandum one day will likely outdistance competitors that do not.

Search has a self reinforcing aspect to it as your links build up. A site with a strong history of top rankings gets links that other sites won't get. Each additional link is a revalidation of quality. The people at Google realize that they have a profound effect on how the web grows. Now that they have enough content to establish a baseline in most commercial markets they can be more selective with what they are willing to crawl and rank. And they are cleaning up the noise in their ad market as well.

The Infinite Web, Almost

Some people view the web as an infinite space, but as mentioned above, there is going to be a limit to how much attention and mindshare anything can have.

The Tragedy of the Commons is a must read for anyone who earns a living by spreading messages online, especially if you believe the web to be infinite. While storage and access are approaching free, eventually there is going to be a flood of traditional media content online. When it is easy to link at pages or chapters of books the level of quality needed to compete is going to drastically increase in many markets.

So you can do some of the things that Graywolf suggested to help make your site Google friendly in the short term, but the whole point of these sorts of changes at Google is to find and return legitimately useful content. The less your site needs to rely on Google the more Google will be willing to rely on your site.

If you just try to fit where Google is at today expect to get punched in the head at least once a year. If you create things that people are likely to cite or share you should be future friendly.

Building Trust in Ad Systems

Google's ad network is large enough that they can afford to kill off portions of their short term income to improve long term network viability. They still sell ads on garbage sites because some advertisers find value there (and others have small accounts or have not researched their spend). Andrew Goodman recently had a great post about how Google is filtering the profitability out of noisy, spammy AdWords ads to minimize the number of them appearing on Google. Andrew wrote:

Post pages that don't give adequate access to the crawler - or adequate keyword cues - and you risk facing the wrath of the quality scoring algorithm. It's less of a worry as much if you have an established account - it's new accounts that face the toughest tests with the predictive aspect of the algorithm, intended to weed out specific types of violators, experimenters, and ham-fisted copywriters.

In essence Google is going to require you to build trust and market data over time before you are trusted enough to gain anything near maximal ad distribution (even if you are willing to overpay for exposure).

Jumping from Paid Search to Organic Search

Some people believe that old sites only rank well because of the links they have acquired over time, but I think even just existing for a certain amount of time without being manually or algorithmically tripped up for some spam infraction allows search engines to place more trust in your site.

Plus requiring sites to be a bit older before they rank well imposes an additional expense and / or level of knowledge that many people lack.

You can bet that if they are taking a wait-and-see approach on paid ads they are also doing the same on organic search results.

A Narrative on Link Relevancy & Link Authority

Caveman recently delivered a terrific post on WMW describing the evolution of SEO, and what it means to be relevant to Google.

...One day, a new search engine named G came along, and decided that if [a man] referred to himself as a pacifist, and others pointed to him as he walked by, then G would rank him as a pacifist.

...It did not take long before the criminal figured out that if the people who pointed to him as he walked by called him pacifist while they pointed, rather than just calling him by his name, his rankings went up for the term "pacifist." So he wore a sign - "pacifist" - and people called him that as they pointed, and his rankings rose.

...After a time, the man realized that if he got all of those he knew to call him pacifist, his rankings would rise further still, and that is what happened.

...So he thought, why not get strangers to call him pacifist, and in return he would refer to them as they wished to be referenced, and all those in his newly expanded network could rank even better for their respective terms. And so it was.

...

...This worked for a while, but eventually, G began to suspect that the faux-pacifists were getting better and better at creating the illusion that they were true pacifists, by begging, borrowing and buying the necessary accolades. It even became known that some faux-pacifists were bribing true pacifists to say nice things about the faux pacifists, so that G would be fooled.

...So, G decided to take drastic measures. They became a registrar so that they could look at each man's historical records. They learned to keep track of what each man said about himself and when, and what others said about each man, and when. And G learned to not trust those who suddenly one day out of the blue proclaimed themselves as pacifists, though their records bore no hint of that previously.

I think this narrative does a terrific job of describing the differences between real and synthetically manufactured authority.

In many small industries there is not much of a topical community, so it may not take much to rank in them, but if there are other legitimate sites ranking for the queries you want to rank for you really have to build reasons why subject matter experts would want to reference you in a positive light.

I think pointing out the social aspect of many links also drives home the concept of a natural editorial citation, and the fact that many real links are driven from social relationships.

Jim Boykin also recently posted on the historical importance of backlinks, and hinted at how he has been cherry picking killer links.

Shopping and Brand Comparison Pages for SEO

If you do affiliate marketing it is probably best to have few distracting features on your end landing page for each individual product or offer. It is best to sell it as THE ONLY option.

But in some cases it may take a while to gain enough authority to rank for individual brands or products if the market is competitive. Many comparison shoppers include multiple brands or product names in their search queries. Creating comparison pages makes it easy to rank for comparison type queries. If markets are competitive with a few top players it is easy to draw in a ton of traffic by creating pages comparing the top few brands, especially if you use key shopping or comparison review words in the page content like:

  • category / product type

  • find
  • compare
  • send
  • buy
  • review
  • reviews
  • vs
  • versus
  • better
  • best
  • top
  • cheaper
  • cheapest
  • deals
  • coupons
  • fastest
  • speed
  • safest
  • power
  • feedback
  • side effects (for drugs, etc.)
  • free shipping

You can get a great list of relevant topic specific shopping / sorting words for your products, category or theme by going to comparison shopping sites like Shopping.com, Yahoo! Shopping or Froogle. You can also go to Amazon to read a ton of reviews because people will likely search for things in similar ways to how they write about them.
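
As a rough sketch (my own illustration, with made-up brand names and a trimmed modifier list), crossing the brands in a market against modifiers like the ones above quickly enumerates a set of comparison-style pages worth targeting:

    from itertools import combinations

    # Hypothetical brands and a trimmed modifier list -- substitute your own market.
    brands = ["BrandA", "BrandB", "BrandC"]
    modifiers = ["review", "compare", "best", "cheapest", "coupons", "free shipping"]

    targets = []

    # Brand vs brand comparison queries.
    for a, b in combinations(brands, 2):
        targets.append(a + " vs " + b)
        targets.append("compare " + a + " and " + b)

    # Single brand shopping / review queries.
    for brand in brands:
        for mod in modifiers:
            targets.append(brand + " " + mod)

    for query in targets:
        print(query)

Each output phrase is a candidate page or heading, not something to stuff onto a single page, which keeps this consistent with the earlier point about avoiding over-optimization.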

If your market is advanced or technical in nature and you can offer a significant amount of comparison data you may also want to create a PDF of the comparisons, as it is pretty easy to rank for most queries when you add PDF to the search query.

Covering smaller brands on some pages is a smart idea since they will be less competitive to rank for. Even if your review recommends other products once visitors get to the brand specific page, you can still find cheap and easy traffic by creating targeted pages for those brands. Plus if those brands do any marketing to try to increase their marketshare many people will use search engines to look for reviews. Just as the "+ PDF" searches are not too competitive, many of the "brand term + product name + review" searches are not too competitive (outside of the hosting industry anyway).

Quickly View the Depth and Quality of Competition in a Marketplace

I recently did a linkdomain:seobook.com search on Yahoo! and noticed the search results had "saved by X people" near many of the top listed backlinks.

Results that are bookmarked are typically sites that are frequently visited by real people. As the web becomes more read / write, the "saved by x people" counts on the top backlinks will be a great quick SEO tool for testing the depth of support for your site or a competing website. (Screenshot: Yahoo! linkdomain results with "tagged by" data in them.)

I consider this website to be somewhat legit, but the sites of mine I would classify as less than legit have no backlinks from pages that are bookmarked on other sites, which is another way of corroborating the value of Jim's philosophy on finding the best pages to get links from within a site.

If you just search for a site, I believe Yahoo! returns those results roughly in order of authority and shows which pages in those results were tagged.

(Screenshot: Yahoo! site search with "tagged by" data in the results.)

Although more information existing online raises the bar for what is required to be competitive / remarkable / linkworthy, I also believe that search engines sharing as much data as they do still makes marketing easy. Thanks to them for that.

The two big downsides to tagging are that it is easy to spam, and currently few people outside of exceptionally techie circles use it much.

When Monetizing Old Domains...

If you find one of your old domains that Google has taken a liking to, or buy an old site and build off the authority another person has developed, you have to take a deep look at how competitive the marketplace is and how interested you are in the topic before you decide what type of content you want to create. If you care less about the topic than the person you bought the site from, or if you abandoned the domain in the first place and are coming back to it only because it is ranking, the site is most likely headed toward a slow death. If you are aggressive at monetization and content creation you can squeeze out a large profit before the site dies.

If you already have adequate authority then content quality is not as much of a requirement as it is when you are trying to build up from scratch. It is hard to become a subject matter expert in a topic you are not interested in, so if you spend time trying to make great content instead of average content you could end up creating 5 articles a day instead of 15. So long as neither approach builds many natural unrequested links, it is probably better to churn out the 15 articles.

Occasionally it may be worth putting in the effort to create a seriously great article, but it is only worth the additional effort if you think doing so will yield needed quality links. If your site as a whole is rather authoritative most of your articles can be pretty crapulent without it hurting you.

When churning out articles do not forget to factor in the site's original authority and the effect of the domain name on CTR. For example, if the site is about Idaho mortgages it probably is not going to have enough authority to become a great California mortgages site or nationwide mortgage site, especially if the domain barely had enough juice to compete for Idaho or if the word Idaho is prominent in the URL.

Matt Cutts Announces Death of Cheesy Link Exchange Networks

I updated my SEO Book PDF today, noting among other things that Matt Cutts sorta hinted that an abundance of low quality links could cause a site's crawl priority to decrease.
Matt then went ahead and posted on his blog that some sites are completely removed from Google's index due to heavy reciprocal linkage.

The sites that fit 'no pages in Bigdaddy' criteria were sites where our algorithms had very low trust in the inlinks or the outlinks of that site. Examples that might cause that include excessive reciprocal links, linking to spammy neighborhoods on the web, or link buying/selling. The Bigdaddy update is independent of our supplemental results, so when Bigdaddy didn't select pages from a site, that would expose more supplemental results for a site.

I think that is probably the post that killed cheesy link exchange networks.

For those who recently said I was full of shit on my position on link exchange networks I am glad that Matt took the time to validate my position. :)

Can other people harm your site? Absolutely. The scalability of the web and the differences in living wages around the world created significant value in funneling around hollow PageRank to sell to naive webmasters who own sites that lack the qualities necessary to be citation worthy.

Knowing that having a certain percentage of shady links will kill your ability to rank in Google adds an additional opportunity cost to building shoddy links. Things that were once "cheap" or "free" suddenly became expensive, and quality votes gained a bunch more value in the process as well.

This announcement of Matt's in combination with the search results reflecting this activity might be the single biggest thing Google has done in a while to improve the quality of information production across the web.

Fake-it-till-you-make-it link trading to the top may still work well enough in Yahoo! and MSN, but it is not a viable Google solution. If Google could plug some of their other holes they would be much harder to manipulate than most webmasters would like.
