Question: I have a client whose website frequently ranks at the top of the search results, then sharply drops. His Google rankings keep bouncing back and forth. Why do they fluctuate so much?
Answer: By using spammy links, leveraging the internal link structure of a high-authority site, or exploiting cross-site scripting, just about anyone can rank for a day, but it is harder to stay there day in and day out until you build massive domain authority.
Google and other major search engines have many filters, editors, algorithms, and barriers in place to prevent spamming or minimize the profitability of overt spam. I believe Google has moved away from banning sites and toward using filters more, because filters make it harder to know when, why, or where something went wrong. Was your host down? Did you screw up your robots.txt file? Or is that a penalty? The more Google can obfuscate its algorithms, the harder it is to do SEO, and the more people will opt into Google's webmaster tools authentication system.
Here are a few of the most common reasons pages stop ranking or get filtered out of Google for their target phrases (i.e. go from ranking in the top 5 results to many pages deep, or near the end of the results).
Too Much Similar Anchor Text
If a link profile is natural, many of the inbound links will use alternate phrases in the anchor text. If almost all of your anchor text is focused on your core phrase, that may preclude your site from ranking for that phrase. This actually hit SeoBook.com about 2 years ago. Mixing anchor text is important, especially for a new site in a competitive marketplace.
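As a rough illustration, you can spot-check how concentrated a backlink profile is with a short script. The anchor list and the 60% concentration threshold below are made-up assumptions for the sketch, not known Google cutoffs.

```python
from collections import Counter

def anchor_text_report(anchors):
    """Summarize how concentrated a link profile's anchor text is.

    `anchors` is a hypothetical list of anchor-text strings pulled
    from a backlink report. The 0.6 threshold is illustrative only.
    """
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    top_phrase, top_count = counts.most_common(1)[0]
    share = top_count / total
    return {
        "top_phrase": top_phrase,
        "share": round(share, 2),
        "looks_unnatural": share > 0.6,  # illustrative threshold, not a real cutoff
    }
```

If one phrase dominates the report, mixing in more varied anchor text is the obvious first step.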
Page Too Well Aligned with a Term
If your internal anchor text, inbound anchor text, page title, meta description, page headings, and page copy all target the same phrase too closely then the page might get filtered out of the search results.
This problem can occur from being too aggressive, or from scraper sites that keep linking to you over and over with your page title as the link anchor text.
You know you have tripped this filter when one of your former top-ranking pages no longer appears in the top few hundred results, while a less important and less relevant subpage outranks it (perhaps itself ranking somewhere beyond #100).
The easiest way to fix this problem is to change the page title to target an alternate version. If that does not work you may also want to change your internal anchor text and try to get a few more inbound links that are not keyword rich.
Scrape You Very Much
If you have a new site with few trusted links, a web proxy or scraper site may get credit for your content. The easiest ways around this are to ensure you have some absolute (not relative) links in your site's navigational structure, and to build some authoritative links to make it harder to knock your site down.
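A quick way to audit this is to scan your templates for relative hrefs, since a proxy that mirrors your HTML keeps relative links pointing at its own domain while absolute links keep pointing home. This sketch uses Python's standard html.parser; the sample markup is hypothetical.

```python
from html.parser import HTMLParser

class RelativeLinkFinder(HTMLParser):
    """Collect href values that are relative (no scheme or host)."""

    def __init__(self):
        super().__init__()
        self.relative = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            # Anything not starting with a scheme or "//" is relative
            # and will resolve against whatever domain serves the page.
            if name == "href" and value and not value.startswith(
                ("http://", "https://", "//")
            ):
                self.relative.append(value)

finder = RelativeLinkFinder()
finder.feed('<a href="/about">About</a> <a href="https://example.com/">Home</a>')
```

Any paths collected in `finder.relative` are candidates to convert to absolute URLs in your navigation.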
Site Too Aligned With a Term
Somewhat like the above filter, if it is obvious that your site is targeting a keyword, it may not rank well for that phrase or derivatives of it. For example, it is probably not a good idea to start every page title on your site with your core keyword. Google has a lot of patents in this phrase-based information retrieval area.
You know you have tripped this filter when you rank for alternate nearby phrases, but few or none of the pages on your site rank well for shorter search phrases containing your core keyword.
Too Many Reciprocal Links
My wife's main website only ranked for one five-word phrase until we dumped the reciprocal link directory her SEO provider put on her site. After removing that page her rankings quickly improved. It is not that reciprocal links are bad (some forms of reciprocation are a natural part of the web), but if an abnormally large percentage of your links are reciprocal, it is easy to get hit. You can also accidentally walk into this sort of penalty by launching an effective site rating / review awards program that gets a new site too many reciprocal links.
Getting Banned & Manual or Automated Penalties
Some sites that are penalized automatically or manually are not completely banned from the search engines, but are stuck ranking somewhere beyond the top 30 results for everything. Your site stands a good chance of having this penalty if you don't even rank for a unique string of text from your site wrapped in quotes.
Manual penalties and bans are not too common for quality sites, but they do happen. If you feel you are banned or penalized and your site is above board you can plead your case to Google inside their webmaster console.
Duplicate Content & Wasted PageRank
If you have an internal architectural problem and your PageRank is spread across thousands of pages of duplicate content then some of your good content will end up in Google's supplemental results, and won't rank for much. The remaining pages in Google's regular index may also rank worse because they will not have as much link equity as they once did.
You know you have messed this up if you keep track of your page count in Google and see it drastically balloon. This increased page count will also be accompanied by lower rankings, especially for long tail keywords that matched up with deep content.
Are You in Your Community?
Google may re-rank results based on local inter-connectivity. Large authority sites like Amazon.com and Wikipedia slip through based on their site's clout, but if you are a new and small player in a marketplace, it helps to get some on-topic / in-community links. Re-ranking is more likely to occur for shorter queries where there is a significant community around the topic. Longer queries likely place more weight on on-page content.
Operator Error
If you have a problem with your .htaccess file, or accidentally block a large portion of your site with robots.txt or robots noindex meta tags, your traffic will go down. If you change your site architecture, test to ensure your URL redirects and rewrites work properly. If you sign up for the Google Webmaster Central tools, they will display any crawl errors or 404 messages they come across.
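One cheap safety net is to sanity-check your robots.txt rules locally before deploying them, using Python's standard urllib.robotparser. The rules and paths below are made-up examples for the sketch.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: block the admin area and internal
# search pages, leave everything else crawlable.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch() tells you whether a given crawler may request a URL,
# so you can confirm money pages are still open before going live.
products_ok = parser.can_fetch("Googlebot", "/products/widget")
admin_blocked = not parser.can_fetch("Googlebot", "/admin/settings")
```

Running a handful of your most important URLs through a check like this catches the classic mistake of a too-broad `Disallow` line taking out half the site.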
Minor Changes in Ranking
Sites with few links may also see their rankings bounce around quite a bit, and their crawl depth limited, until they acquire some high trust links or Google can figure out other ways to determine if the site deserves to be trusted.
If your rankings fluctuate a bit but are always near the top you may just need a few more links with target anchor text, or a few more authoritative links. Algorithms change all the time, and your strongest competitors are actively building their brands every day, so your site either grows with the web or fades in relevancy.
Microsoft likes fresh links a lot. Google may also place weight on fresh links, but they also look at link quality and rate of growth. If your link growth is too spiky or beyond what is normal they may filter or ban your site, like they did to their AdSense blog 2 years ago.