Who Do You Link to? New SEO Tool

Many people get stuck in the sandbox because they can't get high quality links.

Their usual solution is to keep churning out mediocre content, build more junk links, and maybe rent a few decent ones. Sure, aging can have an effect, but a large part of the ineffective SEO problem probably lies in the fundamental techniques being used.

Insanity: doing the same thing over and over again and expecting different results. - Albert Einstein

Many webmasters are stingy with their links, afraid to link out to other sites. Some people view sending traffic to other sites as losing visitors, but linking sorta works on a karma-like system.

If you don't link out to anybody and your content is not amazing then most of the best sites are not going to want to link to you. Why should they?

Certain sites are not going to want to link to your site no matter what, but you can still work your way into their community by linking at them. As you cite relevant and useful resources your site becomes more linkable, and more of the sites you want links from will send you some link love.

Linking out freely and regularly is one of the cheapest and fastest forms of marketing available. Many new webmasters drop the ball on the concept because they feel they need all the links to point their way.

In the spirit of linking, I am linking at Jim Boykin's new SEO tool, which tracks who you are linking at: Forward Links (beta). Nice touch on the beta name, Jim.

Currently the tool only shows the first 100 outbound links it comes across.
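If you are curious what a tool like that is doing under the hood, here is a minimal sketch of pulling a page's outbound links, capped at 100 to mirror the tool's current limit. The URL is just a placeholder, and this is my own illustration, not Jim's code.

```python
# Rough sketch of listing a page's outbound links -- an illustration of the
# idea only. The URL below is a placeholder.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)


page = "http://www.example.com/"  # placeholder URL
host = urlparse(page).netloc
html = urlopen(page).read().decode("utf-8", errors="ignore")

parser = LinkCollector()
parser.feed(html)

# Keep only links that point off-site, and show the first 100,
# the same cutoff the tool currently uses.
outbound = [urljoin(page, h) for h in parser.links
            if urlparse(urljoin(page, h)).netloc not in ("", host)]
for url in outbound[:100]:
    print(url)
```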

I think Jim might further explore the neighborhood concept next week, during his WMW speech. Google TouchGraph also does a good job of showing the neighborhood concept.

Are there any good sites I should be linking to which I have not yet linked to? If so feel free to mention them below.

PayPal Upgrades Payment Solutions

Google is always amazing at timing their news to overshadow competing services. Google's news of their new payment processing system had no specific source and absolutely no useful details, almost as if they just wanted to whip something up to overshadow another story.

In other, unrelated news:

A name synonymous with e-commerce made a move Friday to cement that status. Online payment service PayPal launched a new tool, which gives merchants the option of allowing consumers to complete credit card transactions on the merchants' own Web sites. The tool will preempt the need, and annoyance, of being redirected to PayPal's own site. The software allows Web sellers to run the checkout procedure as PayPal processes the deal in the background.

Website Payments Pro only costs $20 a month more than regular PayPal payments, and odds are that by ditching the extra screen one would make a few more sales. PayPal could still make a ton of improvements; a couple of nice additions would be:

  • allow merchants to run an affiliate program through PayPal instead of needing to install their own affiliate software or use a system like PayLoadz.

  • make it easy to download account history from a week or a year ago, and make the data as accessible as AdWords account data is

I am sure there are other ways to make it better as well, but I am sorta tired :)

How would you improve PayPal? They probably have at least a few months before Google launches their system. You can bet that Google's system will probably interface with their advertising and tracking software as well.

MovableType Blogs, TypePad Hosting, Comment Redirects & Google Ignoring Robots.txt Files

What is up with Google indexing all the TypePad comment redirects? The robots.txt file clearly says Google should not be indexing those.

Is ignoring the robots.txt file an accident, or a normal feature at Google?

I have a rather small blog, with about 1,000 posts on it. Google is showing 5,000 pages from my site in its index. Some of my normal pages are already not being cached because Google is indexing my site less aggressively due to seeing no unique content on the pages where THEY IGNORE THE ROBOTS.TXT PROTOCOL. Pretty evil shit, Google.

Now I need to figure out how to do some search engine friendly cloaking or somehow issue Googlebot 403 errors when it tries to spider those URLs. Way to suck, Googlebot.
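If you want to try the 403 route, here is a rough sketch of the idea as a tiny WSGI app. The /comment-redirect/ prefix is purely hypothetical; check your own logs to see which URLs Googlebot is actually hitting, and remember that serving bots different responses than users is exactly the kind of cloaking you should think twice about.

```python
# Sketch of returning 403s to Googlebot on blocked redirect URLs.
# The path prefix below is hypothetical; match whatever your install uses.
from wsgiref.simple_server import make_server

BLOCKED_PREFIX = "/comment-redirect/"  # hypothetical redirect path


def app(environ, start_response):
    path = environ.get("PATH_INFO", "")
    agent = environ.get("HTTP_USER_AGENT", "").lower()
    if path.startswith(BLOCKED_PREFIX) and "googlebot" in agent:
        start_response("403 Forbidden", [("Content-Type", "text/plain")])
        return [b"This URL is disallowed in robots.txt.\n"]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"normal page\n"]


if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```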

Perhaps this issue would have been noticed and addressed by a MovableType employee if they didn't have blind trust in search engines and think all SEOs are scum.

Many TypePad hosted sites & MovableType sites are being screwed / partially indexed due to this problem. MovableType owes it to their paid customers to ensure problems like these are not happening.

Legitimate Guerilla Marketing Forum Advertising Threads - Google Site Targeting and Forums for Viral Advertising Success

Another idea, extending the creative ways to use the new site targeted AdSense...

Go to a forum and participate for at least a few days to make it seem like you want to participate in the community. Make a few friends, and maybe ask them what they think of your new tool, product, idea, or offering.

When the pump is primed:

  • Have one of your new friends post about your new tool on the forums or community site.

  • Create an advertisement that does not look like an ad and appears to come from the forum site owner. With good tact you could almost make the ad look like an endorsement without offending the site owner.
  • Link that ad at the thread about your new product.
  • Collect feedback and participate in the thread with a few friends to guide that thread along to a happy ending.

Ads that do not look like ads...taking it one step further :)

Google Site Targeted AdSense - Killer Market Research Data

So some of my site targeted ads started running today.

Within the targeting there will be biases from the audience's personality and from how well people know you or the product you advertise. But right now, with site targeting not having a ton of competition, I can get a glimpse into how effective various AdSense formats and ad positions are, creating my own real-world-tested AdSense heat map.
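As a rough illustration of what that homemade heat map could look like, something like the following tallies clickthrough rate by ad format and position. The records and field names are invented; you would feed in whatever your own campaign reports export.

```python
# Sketch of building a crude format/position heat map from campaign stats.
# The sample records and field names are made up for illustration.
from collections import defaultdict

records = [
    {"format": "336x280", "position": "top",    "impressions": 4200, "clicks": 63},
    {"format": "336x280", "position": "bottom", "impressions": 4100, "clicks": 21},
    {"format": "728x90",  "position": "top",    "impressions": 3900, "clicks": 29},
]

totals = defaultdict(lambda: [0, 0])
for r in records:
    key = (r["format"], r["position"])
    totals[key][0] += r["impressions"]
    totals[key][1] += r["clicks"]

# Print formats/positions from best to worst clickthrough rate.
for (fmt, pos), (imps, clicks) in sorted(totals.items(),
                                         key=lambda kv: kv[1][1] / kv[1][0],
                                         reverse=True):
    print(f"{fmt:8} {pos:8} CTR {clicks / imps:.2%}")
```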

Although I should, I do not have many AdSense sites yet. I have been pouring most of my time into this one.

Some markets are absurdly expensive in search, but poor in content.

Mesothelioma is a term so expensive that people joke about it, yet when you look at content pricing, people end up making little off it. Why? Because so many people want those large ad dollars that there are a ton of junk mesothelioma sites, limited ad spend to go around, and a fear of click fraud on top of it.

If you create scraper sites then there is little sense spending time and money to test the markets. You can just put up a site and see how it works. If you are debating creating legitimate long term content sites, the new site targeted ads are an excellent way to see how well certain niches pay.

Simply join an affiliate program or two, run a few ads, and see how much they cost you per impression.
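The math is simple back-of-the-envelope stuff. With made-up numbers, if a test buy costs you $40 for 20,000 impressions and the affiliate offers behind the ads bring back $55, the niche is paying for its own content at roughly a $2 CPM:

```python
# Made-up numbers, just to show the back-of-envelope math.
cost = 40.00          # what the site targeted test ads cost
impressions = 20_000  # impressions bought
revenue = 55.00       # what the affiliate offers behind the ads earned

cpm = cost / impressions * 1000  # effective cost per 1,000 impressions
roi = (revenue - cost) / cost    # return on the ad spend

print(f"effective CPM: ${cpm:.2f}")  # -> $2.00
print(f"ROI: {roi:.0%}")             # -> 38%
```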

Google to Offer Payment Services

The largest online ad and information broker wants to broker transactions too. No real details exposed, but this (WSJ sub req) is not a real surprise:

Google Inc. this year plans to offer an electronic-payment service that could help the Internet-search company diversify its revenue and may heighten competition with eBay Inc.'s PayPal unit, according to people familiar with the matter.

It will be interesting to see what sort of walls Google builds as they expand into other verticals.

You can't dip your toe into certain markets without a strong desire to shut out competitors, but if the moves are too blatant people will shun Google. Their search algorithms and business practices will get called into question much more frequently now. A while ago, due to that humdinger of a too-much-similar-anchor-text filter Google rolled out, PayPal was not ranking for its own name. Now you will have people asking if things like that are an accident or a feature.

I can't see some marketers wanting to share transaction history details with their ad broker, but if the rollout is smooth and smart PayPal could be screwed. Interesting times indeed.

Notice how this news came out on a Friday evening after the market closed, so Google could generate spin and press all weekend long.

Search Engines Deweighting the Effects of Bad Links

So I got a call today from a person who wanted to automate a large link network system. They wanted to find a way we could work together, but I think bulk automated links are not the way forward for most websites.

It is fairly hard to automate a scalable solution that:

  • search algorithms won't detect and

  • search editors won't detect and
  • people would want to link at

Sure you do not need people to want to link at it for it to have some value, but if you can't create something that people would be willing to link at, it is going to be a constant battle trying to look authentic. There are other problems with the cheap and easy link approach:

  • More and more of the cheap links get bought up by sites trying to be hollow middlemen. After they are bought up, buying additional links gets logarithmically more expensive. You can't grow with the web, at least you can't if you value your time. Thus the value of links from some link networks would diminish logarithmically over time.

  • Search algorithms get smarter and devalue more and more of the cheap & easy links.
  • Other webmasters hunt out the cheap and easy links, lowering their value and making them easier for other webmasters to find.
  • People are going to start mass pirating large quantities of RSS content to create semi-authentic-looking keyword net sites and link farms, which will require algorithms to get much smarter at determining which links are legitimate.

A while ago I wrote that I thought having enough junk links could place a penalty on your site, but I am not sure how well that idea scales.

In some industries you can't help but get tons of scraper links just by creating a site. Instead of trying to assign much of a negative weighting, search engines may try to just discount the junk links as best they can.

Also, if people scrape the search results and link to all the top 20 listed sites it will not have much of an effect on the relevancy of those top ranked sites since they are all gaining the same links.

When you think about how the web scales, most every well ranked legit site in a competitive market will have many junk inbound links. Need proof? Look at the co-occurring links pointing at the top ranked sites for generic terms in your field.

The biggest way I think junk links will hurt sites is if they do not have enough legitimate links to offset the junk. That effect is especially easy to trigger if you have few links or have not been mixing your anchor text.

The key to futureproof link building is to get links that are hard to get from well trusted sites. Sure that sounds blatantly obvious, but sometimes it is cheaper to:

than it is to buy ads, and that is a huge idea most websites seem to be missing out on. Instead of spending so much time researching the latest spam techniques, many webmasters would be better off creating something people would want to link at.

Then again, some people who are uber proficient search spammers will always stay one step ahead of the algorithms. It is all about playing off of your strengths and finding what you love to do.

Google Hand Editing "Search Engine Optimization" for PPC?

A friend told me that recently he has seen huge changes in Google's search results for search engine optimization.

Overture, SEMPO, and Search Engine Watch are all in the top 10. Is Google relying more on related community / hub links, placing more value on word relations, or placing more value on human review?

My friend has stated that across the wide variety of sites he tracks, this is the only large change he has recently noticed. Anyone else see any shakeups recently?

Is Google trying to get people to think SEO is PPC? That would be evil.

Update: I did a bit of digging around. My other site does not have much in the way of anchor text for search engine optimization and the Google API shows it ranking at #73 for search engine optimization. Perhaps they are better understanding that search engine marketing and search engine optimization are related. Yet another indication of how important mixing anchor text is & will become.

Wow My Alexa Ranking is Great! Should I Trust It?

In the past I did not write in my ebook about some things I did not care much for. I never really mentioned Alexa because I did not view it as a big deal, but it hit me that I should state why I do not care much for Alexa stats.

Over the last few days my Alexa ranking has jumped from around 13,400 to around 6,800. Wow, I am great. Not really!

If you looked at my actual server logs you would see that traffic has been fairly constant over the past week, with only a small uptick of about 5 percent. Why the huge increase in Alexa rankings then? More new webmasters using the Alexa toolbar found my site. The three things that helped boost the number of new webmasters reading this site are:

  • mentioning the hidden links on FT - WebProNews linked to my site
  • my site ranking for Corey Rudl's death, and his friends recently putting out a newsletter saying he died
  • About.com WebSearch recently listing my blog as a top SEO blog

Many new webmasters get information from each of those channels. Just a few people from each browsing my site with the Alexa toolbar caused the ranking to jump that much, which is a huge change on a logarithmic scale for a site in the top 10,000.

Traffic has not changed much, sales are about the same, and if you looked just at Alexa things would look a bit brighter than they really are.

Here is what I recently wrote in my ebook:

Alexa is widely touted as a must use tool by many marketing gurus. The problems with Alexa are:

  • Alexa does not get much direct traffic and has a limited reach with its toolbar
  • a small change in site visitors can represent a huge change in Alexa rating
  • Alexa is biased toward webmaster traffic
  • many times new webmasters are only tracking themselves visiting their own site

Why do many marketing hucksters heavily promote Alexa? Usually one of the following reasons:

  • ignorance
  • if you install the Alexa toolbar and then watch your own Alexa rating quickly rise as you surf your own site, it is easy for me to tell you that you are learning quickly and seeing great results, and thus easy for me to sell my customers' results as some of the best on the market
  • if many people who visit my site about marketing install the Alexa toolbar, then my Alexa rating will rise exceptionally high
  • the marketers may associate their own rise in success with their increasing Alexa ranking although it happens to be more of a coincidence than a direct correlation

A lower Alexa number means a greater level of traffic, and traffic drops off logarithmically as the ranking number climbs. You can fake a good Alexa score using various techniques, but if it shows your ranking in the millions then your site likely has next to no traffic.

Alexa by itself does not mean that much; it simply provides a rough snapshot of what is going on. It is also hard to compare sites in different industries. For example, if I created a site about weight loss there would be many more people searching for it than for a site about knitting. Also, you shouldn't forget the webmaster bias the tool has, which means my site has a better Alexa rating than it should.

Google Site Targeted AdSense Ads

This page was linked to from various site targeted AdSense ads to explain a bit about how the technology works. There are a couple links at the end of this post which also point to a few ways to creatively use Ads by Goooooogle.

Easy to Set Up:
I just set up my first site targeted AdWords ad account. Setting up a campaign was fairly easy.

This post is my intro to the site targeted ads; if you are interested in my thoughts on them, click on and read with your bad self. hehehe

Overpriced Impressions:
While there is lots of active discussion on them, I tried to avoid CPM advertising on most of the major SEO forums because I know that it's not uncommon for me to generate 50 to 500 page views myself in a day when I am in the posting mood.

The $2 minimum CPM for forums is probably a bit rich for my business model, especially when I can participate in the threads and be seen as part of the activity instead of part of the ads. I might advertise on them soon, but I am not yet.

Business Model / Quality of Business / Why to be Social:
I spent a couple hours picking out sites to advertise on, but some people have far more profitable business models than I do. Shortly it will likely get to where site targeting is not a viable option for my current business model, but I might as well try it out while it's new.

A good link broker (hi Patrick) or an SEO firm can make far more money than my business model because my business model currently lacks recurring fees.

One major benefit my business model has over most others is that I spend most all day reading and playing on blogs & forums, and thus know many people who are hip and help market my stuff for me. Another benefit is that I have low living costs and limited infrastructure, so I can change quickly.

Keywords & Site Targeting?
Some people have recently told me that the site targeting also allows you to target keywords on those sites, but I did not see that feature. Likely it will eventually be added. Yet another reason why designing a site primarily about a niche is huge: it makes ad sales easy to target, automate, and buy.

When you pick your initial content sites to advertise on, it lets you add a number of keywords alongside the seed set of sites you entered, to help refine the concepts you are interested in and suggest other similar sites you may want to advertise on. Some of the suggestions were far off the mark, but a large portion of them were dead on.

Another useful feature would be allowing you to specify filepaths. Currently it looks as though they only allow site and subdomain targeting, which can make it hard to reach other parts of sites with huge forums.

Context Without Search:
In the past you could not buy contextual ads without also buying in on Google.com search ads as well. With the new CPM program you can buy text ads, graphic ads, or animations. When you place your bid it is a max bid. Google does a bunch of math to convert your CPM max bid to a CPC to compare it against the AdSense contextual ads for pricing purposes.
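I do not know Google's actual formula, but the conversion presumably boils down to arithmetic along these lines: a CPM bid spread over an expected clickthrough rate is worth so much per click, and that per-click number is what gets lined up against the CPC bidders. The numbers below are assumptions, not anything Google has published.

```python
# A guess at the sort of normalization involved -- not Google's actual
# formula, just the obvious arithmetic that makes CPM and CPC bids comparable.
cpm_bid = 2.00       # your max CPM bid, dollars per 1,000 impressions (assumed)
expected_ctr = 0.01  # assumed clickthrough rate for the placement

effective_cpc = cpm_bid / (1000 * expected_ctr)
print(f"${cpm_bid:.2f} CPM at {expected_ctr:.1%} CTR ~ ${effective_cpc:.2f} per click")
# -> $2.00 CPM at 1.0% CTR ~ $0.20 per click
```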

By picking what sites to advertise on, and shifting the ads from CPC to CPM, you lower your clickfraud risk profile.

A Low Noise Hello:
One of the best things about the new CPM program is that, for a low price, you can make sure certain site owners know you exist. With blogs you can sometimes do that with a comment or a trackback, but that is not possible with many sites. A site targeted ad might be a good way to say hello.

If you randomly start seeing a bunch of ads from my site on your site then I probably targeted your site.

Other Cool Things:

  • Whenever there is breaking news you can quickly add that site to your account to make sure you advertise where large active streams of new traffic are.

  • In the same way that Google makes irrelevant ads pay a premium for having a low clickthrough rate on Google, this program also uses community feedback (in this case peer pricing pressures) to help ensure the ads stay as relevant as possible.
  • This program helps quality publishers get more value out of their content while lowering the fraud risks associated with participating in AdSense.
  • Displays clicks and cost per conversion with each URL.
  • Allows you to bid different prices on each URL within a group.

The Down Sides:

  • Just like all Google ads, the system is a bit unpredictable. I put in a max CPM and ad spend amount, and odds are my real costs will be nothing like what I bid and I will get fewer impressions than the amount I bid for.

  • There were no suggested CPM bid prices, or expected costs listed, just estimated pages viewed in the past.
  • If large advertisers buy up the best ads by overpaying for the best content sites, the average advertiser, who may be locked out of many of those sites, might not have much left but the clickfraud and scraper sites to pick through.
  • When initially selecting a seed set of sites it helps suggest many others. It seems as though after you set up your ad group you can't get that feature back again without starting up a new ad group? But then again I am tired and maybe I missed something.
  • Sometimes I might want to advertise specifically because I want to reach a site owner, but Google considers clicking ads on your own site as clickfraud.
  • It is probably a bit easier to fake page impressions than ad clicks, and Google will quickly be dealing with another form of fraud.

Bonus creative site targeted AdSense ad ideas:
