Hardly an hour goes by without some poor, frustrated website owner posting a message in
Google's Webmaster Help Forum
asking why their website dropped in Google's rankings when it still ranks high
in the other search engines. Often it's a dramatic plea for help from a non-technical webmaster whose
small business website has been near the top of the results for its best keyword phrase for quite some
time when suddenly, without any changes to the site, it is nowhere to be found, and there's
no apparent reason why. So I've collected here the most common reasons a website's
search engine rankings fall, along with suggested methods of recovery.
Reasons Why Your Site Dropped in Google Ranking
There are many reasons why a website can drop in the rankings; the list below
covers the most common ones. Start there, but once you've taken care of any of the
ranking issues I describe, be sure to check Google's
Guidelines For Webmasters to be sure you're in compliance. Pay particular attention to the Quality
Guidelines, because that section contains the rules that will incur a penalty if you violate
them. The other search engines all have similar guidelines that you need to follow.
- Change in the Search Engine Ranking Algorithm: Search engines are constantly
changing the methods they use to rank websites in order to improve the general
quality of their search results and to weed out the sites that are resorting to
various tricks (i.e., so-called "black hat" optimization techniques) to
artificially boost their rankings. For the past few years, Google has been making significant
changes to their algorithm on a frequent basis. While Bing tends to make less frequent and usually
more subtle changes than Google, they don't stand still for long either. The point is that search engines
do change their methods and you can't rely on your rankings to remain unchanged forever. A drop in
ranking can happen gradually, or it can be sudden when their methods change in a way that particularly
affects your website. When your site's rankings fall and you believe that you're abiding by the rules, your
best recourse is to reexamine your site in light of the Guidelines and work to make your site better and stronger.
- Loss of PageRank/Link Popularity: One or more links that had been
providing a significant amount of PageRank to your site have been removed,
moved to a new unranked page, or the PageRank of the originating site has dropped (for
similar reasons). Websites with low to modest PageRank often obtain the bulk of their link
strength from a small number of strong links. So the loss of even one of them can have a
major impact on their rankings. In recent years, Google has begun to aggressively
discount or completely negate the PageRank value of links from low quality pages/websites,
and links they consider to be unnatural, such as text link ads, gratuitous press releases,
forum signature links, blog comments, and listings in low quality directories. The loss of
the value of such artificial links can cause a significant ranking drop, and excessive posting
of such links can lead to penalties. Artificial links are links that the webmaster generated
himself, as opposed to "editorial links," which are links that are posted by other
webmasters for the benefit of their users. See the Google Panda and
Penguin Updates information for details.
- Malware or Hacking: If Google detects malware on your pages, it will warn
users who click on your pages in the search results before sending them to your site.
This is not really a drop in ranking, but the effect is just as bad since very few
users will proceed past such a warning. See my article on
Removing Malware for help
on getting the Malware Warning removed and for removing the malware itself. Google has some good
advice for repairing a hacked site, too, and you can check your site's current status with Google's
Safe Browsing Diagnostic Tool.
- Penalty: The search engines are getting very aggressive about violations of
their guidelines and are quick to punish some transgressions. Some of the most common
causes of penalties include:
- Hidden Text: Hidden text is an old trick that can remain undetected by the
search engines for a while, but is always discovered at some point. Don't
hide text by making it the same color as the page background. Google sniffs that
out very easily these days, and it can earn you a significant penalty. You can
use CSS methods for drop-down menus or tabbed <div>s to keep content invisible until a user requests it (by
mouseovers or clicks). As long as there is a legitimate reason for using CSS in this
manner and there is a clear way for users to reveal the content, you won't be penalized. Just don't
try to stuff keywords onto a page in a way that only the search engines will ever see them. You'll
get caught eventually.
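To illustrate the legitimate pattern, here is a minimal sketch of a pure-CSS drop-down menu: the submenu is hidden by default and revealed when the user mouses over the menu item. The class names and URLs are placeholders, not from any particular framework.

```html
<style>
  .menu .submenu { display: none; }        /* hidden until requested */
  .menu:hover .submenu { display: block; } /* revealed on mouseover */
</style>
<nav>
  <div class="menu">Products
    <div class="submenu">
      <a href="/widgets.html">Widgets</a>
      <a href="/gadgets.html">Gadgets</a>
    </div>
  </div>
</nav>
```

The difference between this and a penalty-worthy trick is that the hidden content is genuine navigation, and any user can reveal it with an ordinary mouseover.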
- Artificial Links: Artificial links are any links that are not "editorial links" -
that is, links that are naturally embedded in the page content as a resource for users.
This includes paid links (text or images), forum signatures, blog comments, directory submissions and any other link that
you generate yourself. While all of the major search engines prohibit buying and selling links, Google has
been the most aggressive in penalizing link buyers and sellers. If you're selling plain text links or image
advertising that can pass PageRank, your site may have its PageRank score(s) reduced or it may have its ability to
pass PageRank removed. In either case, it can mean your website's internal pages will
no longer rank as well as they had before. If you have paid
links or advertising on your site, make sure that all of the links have the attribute
rel="nofollow" in the <a>nchor tag in the HTML code or that the
links point to a redirection script on your site that is blocked in your robots.txt file. If you're a link buyer,
the effects of Google's Penguin Algorithm can mean that you've wasted your
money on links that are no longer helping your rankings and may well harm them.
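To make those two fixes concrete, here is a sketch of a paid link handled both ways. The advertiser URL and the /goto.php script name are placeholders for illustration.

```html
<!-- Option 1: add rel="nofollow" to the anchor tag itself -->
<a href="https://advertiser.example.com/" rel="nofollow">Our Sponsor</a>

<!-- Option 2: route the link through a local redirect script -->
<a href="/goto.php?id=sponsor1">Our Sponsor</a>
```

If you use the redirect script, block it in your robots.txt file so the search engines never follow it:

```
User-agent: *
Disallow: /goto.php
```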
- Thin Content/Low Quality Pages: Google's Guidelines prohibit sites that exist mainly to serve
advertising. You can have AdSense ads or affiliate links if your site contains a high proportion of original
high-quality content. It's only websites that have a large amount of content copied from elsewhere, and websites
with little or no information-rich, original content for the user that will be penalized. E-commerce websites
should avoid copying product descriptions from the manufacturer or from other e-commerce sites
like Amazon, and instead create their own unique description for each product they offer for sale. In January of 2012,
Google announced that they would also reduce the rankings of pages where there is a large amount of advertising
"above the fold" that makes the user have to scroll down the page in order to see the
content. This includes AdSense ads. Overall, Google now reduces the rankings of websites with a large number of pages that have a high
ratio of advertising to content. It's all intended to improve the user's experience in the search
results. See my note on the Panda Update for more details.
- Multiple Domain Names: Having multiple domain names
pointing to the same content is a common mistake that new webmasters make, thinking
that it will lead to better rankings when just the opposite is true. While this practice
doesn't generally trigger an actual penalty, the search engines' methods for filtering out duplicate
content can damage your rankings. See "Duplicate Content" below. The solution is to
pick one domain name to use and install 301 redirects on all of the other domains pointing to the
primary domain you've chosen. See my article on Multiple
Domain Names and SEO to learn the best way to use domain names.
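If your site runs on Apache, a minimal .htaccess sketch of the fix might look like this, assuming mod_rewrite is enabled and www.example.com stands in for whichever domain you chose as primary:

```apache
RewriteEngine On
# Send any request that isn't for the primary hostname to the
# primary domain with a permanent (301) redirect
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```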
- Domain Farms: This is the practice of creating many websites and heavily cross-linking
them all to each other. It's not uncommon for people or businesses to own and operate multiple
websites, but if they are heavily linked to each other ("cross-linked") and don't provide
unique, high-quality content, Google may consider it to be a
"link scheme" you created solely to increase your rankings. If you own multiple domains,
it's alright to create a handful of links between them. You can also post individual cross-links within
the content of one of your sites pointing to another site you own when it's appropriate to the
content and helpful to users. But don't cross-link all of them on every page. All things in moderation!
- Linking to penalized, or so-called "bad neighborhood" sites: This is another
mistake that a novice might make without knowing it violates the search
engines' guidelines. Make sure that any website you link
to is one that you would be happy to have your users visit and that has a majority
of its pages included in Google's index. In general, a couple of links to a bad neighborhood won't
hurt you unless you also have other quality issues, but you should always be careful in
posting links on your site. If you need to link to a site you think might be of questionable quality, the
best practice is to add the rel="nofollow" attribute to the <a>nchor tag.
- Cinderella Story or "Honeymoon" Effect: If your site is less than 6 months old, you may have been
getting an artificial boost in your rankings from Google to help your site be found by users.
But that extra ranking power for new sites doesn't last forever. You'll be flying
high one day and nowhere to be seen the next. Again, this usually comes down to low
link popularity since it takes time and effort to build quality links to a
new website. A continuous link building program is your best insurance against
falling rankings. See my Building Links article for
some good advice.
- Excessive Link Exchanges: Yes, I know that I recommend link exchanges for new
websites and I also know that Google discourages the practice. But although their guidelines are
ambiguous, their actions are easier to interpret. Limit your link exchanges to just a handful of closely-related, good quality
websites. A dozen is all you need to get your site on the right track, not hundreds
or thousands. Don't do massive exchanges through automated programs that create link exchange
directories on your site. Don't make meaningless posts in online forums or comment in blogs just to get
links. Such artificial links (i.e., links you create yourself) don't improve your rankings and are likely to reduce
Google's level of trust for your site. Trust is a term you will run into more and more as you
study Search Engine Optimization. Both Google and Bing use that term in their suggestions for
webmasters. A high proportion of artificial links can really hurt your site's rankings now. See my note
on the Penguin Update for more details on bad links.
- Canonicalization Problems: My personal favorite because, without changing a
thing on your site, you can fall victim to this problem in Google. All it takes
is someone linking to your site with the wrong version of your URL and you can
start to have some problems. This is mostly a problem for newer websites that
haven't firmly established themselves in Google. It means that Google has indexed
pages from your site with more than one version of your URLs. It can be two versions
of your domain name (i.e., 'www.example.com' and simply 'example.com'), or it can
be the correct domain name, but using both HTTP and HTTPS. You can also have problems
if you set a rel="canonical" tag to the wrong URL. I can personally attest to
how easy it is to make that mistake. Fortunately, you can recover easily from this issue.
See my Website Canonicalization Repair article for details.
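For reference, the tag itself is a single line in the page's <head>, with example.com standing in for your real domain. Every variant of the page (HTTP and HTTPS, with and without 'www') should name the same canonical URL:

```html
<head>
  <!-- Declare the one preferred URL for this page -->
  <link rel="canonical" href="https://www.example.com/page.html">
</head>
```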
- Broken links: If one of your internal pages is critical to your website's
success - either for its ability to draw traffic by its high search engine ranking or
because it's a critical navigation page - making a typographical error in a link can
mean a major section of your site is suddenly disconnected from the rest of your site
and therefore vulnerable to being dropped by the search engines. Normally, a critical
page will have several links, but novices will sometimes fall victim to this error.
The cure is to make a habit of regularly using a link checker like the free utility
Xenu Link Sleuth,
or the W3C's Link Checker.
- Server Problems: If Google has difficulty accessing your site, whether it's slow
to respond or responds with an error code for a sustained period, it can lead to
problems. Search engines are very tolerant of short periods when a site may be unavailable
for maintenance or other issues, but if the problems persist over many days it can
impact your rankings. If you know your site will be down for maintenance, you should
set your server to respond with the HTTP response code 503, which informs the search engines that you're
aware of the situation, it's only temporary and they should try again later. Other server problems include
chains of 301 or 302 redirects that take too many steps. No request to your site should need
more than two redirection steps to reach the final page. My Server
Response Checker will show you how your server is responding.
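If you're on Apache, a rough .htaccess sketch of the maintenance setup might look like the following, assuming mod_rewrite and mod_headers are enabled and /maintenance.html is a placeholder for your own notice page:

```apache
ErrorDocument 503 /maintenance.html
# Suggest when crawlers should try again (value in seconds)
Header always set Retry-After "3600"
RewriteEngine On
# Answer everything except the notice page itself with a 503
RewriteCond %{REQUEST_URI} !=/maintenance.html
RewriteRule ^ - [R=503,L]
```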
- robots.txt or robots <meta> Tag Issues: Your robots.txt file or a robots
<meta> tag set to "noindex" or "nofollow" may be blocking the search engines from
crawling or indexing some or all of your pages. Review your robots.txt file to make sure it blocks only what you intend. The
"Fetch As Googlebot" tool in Google's Webmaster Tools console will let you test your robots.txt file to
make sure it only blocks the URLs that you want it to. The Webmaster Tools console also
shows a list of URLs Google found were blocked by your robots.txt file in the Crawl Errors report.
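For reference, these are the two controls to review; the /private/ path is a placeholder:

```html
<!-- In a page's <head>: keep this page out of the index and
     tell crawlers not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```

```
# robots.txt: the Disallow lines should cover only what you intend
User-agent: *
Disallow: /private/
```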
- Duplicate Content: You should do your best to prevent the same content from being
available through more than one URL on your website or anyone else's. When Google finds duplicate pages,
it tries to select the "best" or "canonical" version and devalues all of the
copies. But as a webmaster, you don't get to choose which copy they select as best. This problem crops
up quite often when people have blogs or shopping carts on their sites, or have separate sites for
different countries and/or languages. If your website uses a popular
blog, forum, or shopping cart script, be sure to look for the latest version of that software and any plug-ins
that might be available to help. Another source of duplicate content is other sites copying your
website. This is particularly annoying since you obviously had nothing to do with creating the
problem. It's a good idea to use services like CopyScape
to check for other sites copying your pages. They have both free and paid services available. If you
find someone copying your site, contact them and demand they remove it. If they don't comply, contact
their hosting service and tell them what happened. It can also be helpful to submit a
DMCA Removal Request to Google.
Google offers webmasters substantial help through their Webmaster
Tools. You can get a detailed analysis of your site's status in Google there. But for websites that
have incurred a manual penalty, there is a "Reconsideration Request" form that lets you
tell Google that you have repaired any violations you found and ask that any penalties be removed. While
Google's automatic systems will usually remove a penalty over time if a website has been brought into compliance, filing
a Reconsideration Request can speed the process along by several weeks or months. Note that the Reconsideration
Request will only help if you have had a manual penalty applied to your website, the kind of penalty given for
violating Google's Quality Guidelines, and only if you've fully resolved the problem that caused the penalty. Google considers
other issues that have the same ranking effect as a penalty to be a part of their algorithm and they rarely take any direct
action in those cases.
Google Panda Algorithm
On February 24, 2011 Google released a major update to its ranking methods, popularly
referred to as "Farmer" or "Panda" since the purpose was to target so-called "content farms"
which are large sites filled with low-quality content. The update also targets sites that have an excessive
ratio of advertising to content and sites that have a high level of duplicate or 'scraped' content. This is one of the
rare circumstances in which a Google ranking factor works on a site-wide level, as opposed to on an individual web page. If your site's rankings
recently dropped significantly, you should read the Search Engine Land article by former Googler Vanessa Fox,
Google Traffic Dropped With Farmer/Panda Update, which suggests ways to analyze why your rankings dropped and how
to restore them.
Generally, you should keep your pages rich with fresh, useful, original content. Websites that
syndicate, republish, or repackage content from other sites are likely to have their rankings drop significantly.
The same holds for sites with a large number of pages that consist mostly of advertising and other template content
with little useful content. If you have low-quality pages that simply can't be improved, but whose content is still
important to your users, you should either add a robots <meta> tag set to "noindex" to those pages, or block
them in your robots.txt file. A better solution is to merge that content into other relevant pages and delete the original page.
Since this is a sitewide ranking factor, and since Google only recalculates it periodically, you won't see
any improvement in your rankings for quite some time after you make changes to improve the quality of your site.
Google Penguin Algorithm
Google has also made a second major change in its algorithms. This change has been officially named "Penguin" and
you can get a sense of how it works in the Inside
Search article on Penguin by Google's "distinguished engineer" Matt Cutts. In that
article, he examines some egregious examples of webspam with overt and deliberate attempts to scam
the search engines. It's apparently not the infamous "over-optimization penalty" that he had discussed
at an industry conference earlier in 2012, but it might well be a step along that path. Issues
like unnatural keyword usage and irrelevant and artificial links were highlighted in the article, but it's not clear
yet just what Penguin specifically targets. But the message is clear. Google wants websites that are designed to
deliver quality content for users to rank best, and websites that try to beat their algorithms should not
be rewarded. Bing has also recently made it clear that they consider many similar aspects of page quality and
usability to be important to their ranking methods.
One aspect of Penguin that is well-established is its targeting of artificial links, such
as blog comment SPAM, article SPAM, paid links, forum signature links, and other webmaster-generated links. If you're
trying to diagnose a drop in your rankings, check the "Links To Your Site" tool in the "Search
Traffic" menu in the Google Webmster Tools console. One thing to look for is large numbers of links coming from single
domains. These are often so-called "scraper" sites that just copy content or automatically generate
content that includes links. And since they're low-quality sites, Google targets these links in its ranking systems.
Look for any links you created that might appear to be SPAM. Once you identify these potentially harmful links,
contact the webmasters of those sites and ask them to remove the links. If/When that fails, you can use Google's
Link Disavowal Tool to request that Google
ignore them. Removing such bad links has helped some websites to escape from Penguin issues. In the end, you want
the vast majority of your links to be natural (i.e., "editorial") links from high-quality websites.
October 20, 2014 Update Google has confirmed it has deployed an update to
Penguin, dubbed "3.0" by the SEO community. They have said it should help smaller websites perform a little better. It also updates
their link data, so if you've been cleaning up your backlinks over the past year, you may see some
improvements.
December 10, 2014 Update Google has announced that Penguin will now
be updating on a continuous basis, rather than having periodic (i.e., glacial) update releases. So there is
hope that websites that work on their backlink profile will see results much faster now.
May 21, 2015 Update Google tried to clarify the issue of Penguin
updates. Apparently, the internal data collection is continuous but it requires a "data push"
for that data to be incorporated into the rankings and such data pushes are only done periodically. Does
the term "geologic epoch" ring a bell?
This page was last updated on September 25, 2020