
Your Site's Traffic Has Plummeted Since Google's Panda Update. Now What?

Google’s latest algorithm change has impacted nearly 12% of queries. That means that a lot of sites, big and small, are looking at their web analytics reports this week and seeing devastating traffic graphs trending down and to the right. It can be difficult to think strategically and objectively in such a situation, and your first reactions may be panic and anger.

Understandable, but if you want to restore your site’s traffic, you need actionable steps for evaluation and change. Even if it is truly the case that you were unfairly hit as a result of collateral damage, you want to spend your limited time and resources on tactics that are likely to bring your traffic back. Read on for more tactical things you can do now, and methods you might want to avoid.

Ineffective Strategies

What is unlikely to bring back your traffic is an appeal to public opinion or the sympathetic hearts of Google’s search engineers. Google extensively tests their algorithm changes to improve search quality and they don’t roll out substantial changes like this unless their internal data shows a marked improvement. Google is a data-driven company, so anecdotal reports just don’t stand up against hard numbers.

In addition, imagine that you’re Google. Now imagine the size of the web. Consider that many site owners whose sites were negatively impacted feel it was a mistake. The point of using algorithms to determine rankings is to avoid both the impossible task of judging each site individually as well as the editorial judgment and censorship potential that would require. The foundation of Google’s organic search product is using large-scale quality signals to determine how to rank the web. Without that, the whole thing falls apart.

A recent Wired article implied that Google did in fact look into and fix individual cases, but Google clarified that this was not the case and emphasized that this change is entirely algorithmic and that their internal data has found the change to improve search quality overall. So while they plan to continue to refine things, they aren’t going to roll the change back.

That said, once you’ve gone through the evaluation I recommend in this article, if you can’t find anything on your site to change, Google has set up a thread on their webmaster discussion forum where you can provide details of your situation. Later in this article, I’ll provide advice on how best to post to that thread.

Ask “Why Me?”

You may take the plummeting rankings as a specific statement from Google on the quality of your site and be personally offended. Some businesses affected are taking the PR route to voice their painful outrage. Gloria Tsang, whose site healthcastle.com lost 40% of search traffic in the last week, issued a press release to say: “To be named a low-quality site is saddening. I want to know why Google named us low quality.”

That’s a sentiment that’s being echoed in press releases, blog posts, board rooms, and discussion forums everywhere. While understandable, your “why me?” question should instead be aimed at, “What is it about my site that caused it to be affected?” This is an algorithmic change that doesn’t target specific sites. Google engineers didn’t create a blacklist a la Blekko. Instead, the algorithms detect specific quality (and non-quality) signals. See later in this article for how to turn the “why me” question around.

Demand to Speak To Google

This is the initial reaction of a lot of site owners impacted. In the Google webmaster discussion thread opened for just this purpose, someone writes, “I would VERY much like to speak with someone at Google.”

See above re: size of the web. It’s just not practical. And as noted above, even if you did speak with someone in Google search quality and they agreed with you that your site is amazing and was only hit due to unfortunate collateral damage, there’s nothing they would be able to do for your site specifically. As they note in their webmaster discussion thread, “this is an algorithmic change [so] we are unable to make manual exceptions”. They can use your site to tweak their algorithms and that in fact may eventually restore your traffic, but they can’t put your site on a whitelist that instantly restores rankings and they are unlikely to roll back the new changes for the sake of your site. Their internal testing has found that overall this change improves search quality and they have to make decisions based on large-scale numbers of searcher satisfaction.

Tell Google How to Change Their Algorithms

Generating high quality search results is amazingly complex and the search engineers at Google have been working hard at it for years. A comment such as this one in the thread Google opened to discuss this issue doesn’t take into account all of the moving parts that search engineers keep track of:

My suggestion is this.  You should tweak the algo for any site that shows a secure shopping cart with add to order buttons you should not have the same content standard as pure content sites.  E commerce sites can’t have only original content if they sell products unless they are mfg the product they sell, direct distribution.  So you should make the algo work for ecommerce by not holding them to the same standard on their product pages, pages with an add to cart button.

Spending hours crafting recommendations for how Google should write its algorithms is time that you could instead spend improving your site.

What To Do Next And How To Gain Traffic Back

The first step is to pinpoint exactly which pages and queries have been affected, then narrow in on what the underlying problems are. The first part of this process may be time consuming, but it’s fairly easy in that it deals with hard numbers. The second part can be more difficult, as it requires subjective evaluation. But the data can help there, too.

Check Your Analytics Data To Ensure You’re Investigating the Right Problem

If your site suddenly lost all of its traffic on February 24th, then yes, it’s likely that this algorithm change is to blame. But remember that site traffic fluctuates for all kinds of reasons, so it’s worthwhile to check the cause before spending a lot of time looking for solutions.

  1. Check the date of decline. If it’s any date other than February 24th, it may be unrelated. Google tweaks its algorithms all the time though, and all are ultimately towards the same goal of providing the best search results possible, so the advice in this article may still apply. Don’t stop reading yet!
  2. Check your traffic sources. Is traffic down across all channels or just unpaid search from Google? If the traffic drop is not unique to Google organic search, something else may be going on. Is a section of your site returning errors? It could be that you don’t have a traffic drop from search at all and instead an advertising campaign has been halted or you’ve lost a link from a prominent high-traffic site that previously sent lots of visitors. If traffic is down across all search engines, your robots.txt file may have gotten misconfigured to block large sections of the site.
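If you suspect a robots.txt misconfiguration, Python’s standard library can quickly confirm whether key URLs are crawlable. This is a minimal sketch; the domain, paths, and the overly broad Disallow rule are hypothetical examples:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that accidentally blocks an entire section.
robots_txt = """
User-agent: *
Disallow: /reviews/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether important pages are reachable by a generic crawler.
for url in ["http://example.com/reviews/honda-civic",
            "http://example.com/dealers/seattle"]:
    allowed = parser.can_fetch("*", url)
    print(url, "-> crawlable" if allowed else "-> BLOCKED")
```

Running a check like this against a list of your top landing pages takes a few seconds and rules out (or confirms) the misconfiguration before you dig into ranking analysis.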

If you determine that your traffic drop is only from Google unpaid search, then this algorithm change (or one like it) may be to blame. Below is an example from Google webmaster tools where this clearly seems to be the case.

Google Traffic

Pinpoint The Query Categories and Pages In Decline

Take a look at your web analytics data and Google webmaster tools search queries data to see what queries are in decline.  (This may be easier if you export the data into Excel.) Is traffic down across the board? Is branded traffic down? Are only certain categories impacted?

Consider the hypothetical case of a site about cars. Has traffic remained consistent for queries related to car reviews but declined for queries related to finding local used car dealers? Google webmaster tools provides average position for each query so you can also look at rankings declines.

Below is data from Google webmaster tools showing impressions, clicks, and average position for categories of queries for my fake car site (note the data is fake as well and is intended simply as an example). Note that this data compares month over month changes using exported historical data, but if you don’t have historical data, you can do similar analysis with the change percentage information Google webmaster tools provides.

Traffic by Category

Traffic by Graph

In this fake example, you can see that branded traffic held steady, as did queries related to car reviews. However, traffic from accessories queries dropped by half and traffic from searches for car dealers has slid to almost nothing.

The reporting package we provide at Nine By Blue not only tracks this data historically, but also tracks queries by category and calculates high rates of change in impressions, clicks, and position by those categories, which makes parsing the information a lot easier than tracking by individual query.
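Computing month-over-month change by query category can be scripted once the data is exported. A minimal sketch, assuming hypothetical exported rows of category and click counts for each period:

```python
from collections import defaultdict

# Hypothetical exported rows: (category, clicks_previous_month, clicks_this_month)
rows = [
    ("branded",     1000, 990),
    ("car reviews", 800,  810),
    ("accessories", 600,  300),
    ("dealers",     500,  20),
]

totals = defaultdict(lambda: [0, 0])
for category, before, after in rows:
    totals[category][0] += before
    totals[category][1] += after

# Percentage change per category, sorted by steepest decline first.
changes = {c: round((after - before) / before * 100, 1)
           for c, (before, after) in totals.items()}
for category, pct in sorted(changes.items(), key=lambda kv: kv[1]):
    print(f"{category}: {pct:+.1f}%")
```

Sorting by the steepest decline surfaces the categories to investigate first, which is exactly the triage described above.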

Let’s look more closely at the car accessories queries. This data is sorted by largest drop in impressions, but you can also sort by largest drop in clicks or in ranking position. In this example, you can see that rankings dropped dramatically (100 spots) for some queries, but only a few positions in other cases. Isolating the cases where the site lost traffic because it now ranks lower on the first page of results (versus the more dramatic drops) can help you pinpoint which pages may be easiest to regain rankings for and which pages to examine for the more obvious signs of issues.

Google Data
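Separating the modest first-page slips from the dramatic drops is easy to script. A sketch with invented queries and positions:

```python
# Hypothetical (query, old_position, new_position) tuples from exported data.
ranking_changes = [
    ("car seat covers", 2, 4),
    ("steering wheel covers", 3, 9),
    ("floor mats", 5, 105),
    ("roof racks", 1, 88),
]

# Queries still on the first page (top 10) are likely the quickest wins;
# dramatic drops point at pages with more obvious quality issues.
quick_wins = [q for q, old, new in ranking_changes if new <= 10]
major_drops = [q for q, old, new in ranking_changes if new > 10]

print("Likely quick wins:", quick_wins)
print("Major drops to investigate:", major_drops)
```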

In the cases where the rankings dropped but the site is still on the first page of results, the number of impressions may not have declined, but number of clicks and click-through rate likely have. Below are queries with significant declines in click-through rate for the car dealer category. You can see that for these examples, impressions are actually up, but since the site is now ranking lower on the first page than before, the click-through rate and subsequently number of clicks to the site is down substantially.

CTR Decline
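The arithmetic behind click-through rate is simply clicks divided by impressions, so the decline described above is easy to quantify from exported data. A sketch with invented numbers for a single car-dealer query:

```python
def ctr(clicks, impressions):
    """Click-through rate as a percentage."""
    return clicks / impressions * 100

# Hypothetical before/after numbers: impressions are up,
# but a lower first-page position cut the click-through rate.
before = ctr(clicks=300, impressions=5000)   # ranked #2
after = ctr(clicks=90, impressions=6000)     # now ranked #7

print(f"CTR before: {before:.1f}%")   # 6.0%
print(f"CTR after:  {after:.1f}%")    # 1.5%
```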

Take a look at the pages that have lost the most traffic as well. One easy way to do this is in Google webmaster tools under Search Queries > Top Pages. If you change the start date in the date range field to February 25th, you can sort the pages by decline percentage. (You can sort by both decline in impressions and decline in clicks.) Again, exporting into Excel first will make it easier to filter by pages that began with high traffic. You can also try sorting by decline in average position, and you can cluster the URLs into path-level data if your site is organized such that that level of analysis makes sense.

Below is an example of page-level data from our fake car site:

Page Level Data

The FAQ section of the site looks like it may have been hit particularly hard. We can try clustering the data by path and, for instance, calculate average rankings loss per directory. Be careful with this type of calculation though. If one URL went from position 2 to position 500 and another went from position 6 to position 7 for instance, your average loss for the folder can look skewed, which is why it’s important to start with sorting page-level data.

Below, I’ve aggregated the data by directory and averaged the ranking decline. The FAQ section doesn’t look great, but the regional dealership directory potentially had the biggest rankings loss. (You can combine this with historical data and analytics data to prioritize by amount of traffic loss.)

Folder Data
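This directory-level aggregation, along with the outlier caution above, can be sketched as follows; the URLs and position losses are invented. Reporting the median alongside the mean shows when one extreme URL is skewing the average:

```python
from statistics import mean, median
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical (url, ranking_loss) pairs, where loss = new_pos - old_pos.
pages = [
    ("http://example.com/faq/oil-changes", 498),   # position 2 -> 500
    ("http://example.com/faq/tire-rotation", 1),   # position 6 -> 7
    ("http://example.com/faq/brakes", 3),
    ("http://example.com/dealers/seattle", 40),
    ("http://example.com/dealers/portland", 55),
]

by_dir = defaultdict(list)
for url, loss in pages:
    top_dir = urlparse(url).path.split("/")[1]
    by_dir[top_dir].append(loss)

# One outlier URL can badly skew the mean, so report the median too.
for directory, losses in by_dir.items():
    print(f"/{directory}/: mean loss {mean(losses):.1f}, "
          f"median loss {median(losses):.1f}")
```

Here the /faq/ mean looks catastrophic because of a single URL, while the median shows most of that directory barely moved, which is why sorting the page-level data first matters.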

To get accurate data, including how the site is ranking now for queries, filter the data to web-only, U.S. only and compare date ranges before and after the algorithm change, as shown below.

Filtering Data in Google Webmaster Tools

What you’re looking for is a sense of whether your entire site has been impacted or only particular topics or pages. Since this algorithm is site-agnostic and looks at specific quality signals on a page-level basis, you may think of this change as hitting your site, but in reality, it’s hit particular pages of your site.

You can see this even with sites that have publicly been very hard hit by this change. Take mahalo.com, for instance. Presumably, their traffic losses have been significant, as they’ve laid off 10% of their staff (even though they saw the writing on the wall early). But their site still ranks well for some queries such as [how to become a travel agent vermont].


By isolating what pages are still doing well, you can learn a lot about how to apply components of those pages to parts of the site that were affected. If the ranking pages don’t seem to have unique attributes, you may want to prioritize these pages for improvement so they don’t get hit by the next iteration of Google’s changes.

You can further use this data to prioritize investment in improving pages and determine which sections of the site need to be completely revamped vs. slightly improved.

Data In Action

You can do several things with this data, including:

  • Determine which sections of the site are unscathed.
    • Review these pages to determine if they need any adjustments to protect them from additional algorithm changes. For instance, Mahalo should take a look at pages such as the one shown above about becoming a travel agent in Vermont to determine if it includes elements that could cause it to be a casualty in the next set of algorithm changes.
    • Otherwise, you can spend your resources on the pages that are having trouble.
  • Determine which sections of the site are heavily hit.
    • Compare the qualities of these pages to the ones that are unaffected to see if you can determine patterns about what caused the declines (as described more below).
    • Evaluate if some sections require too substantial an investment to improve and consider removing them.
  • Determine which sections of the site suffered minor losses as those may be easiest to improve so you can start gaining traffic back.

Take healthcastle.com, for instance, who sent out that press release. I’m currently in Australia, where the algorithm changes have not yet hit, so what I see in rankings on google.com is likely very similar to how things looked in the United States before the shake up. Ranking position isn’t the best metric to use for lots of reasons, but in this case, I don’t have access to traffic data, so it will have to do.

Here in Australia, the site ranks #1 for [acid reflux diet].

Healthcastle.com AU

I will assume for purposes of this example that the site used to rank #1 for that query in the U.S. as well. The same page that ranked #1 now ranks #6 in the U.S.

healthcastle.com US

While that’s surely causing a traffic loss, it’s likely still bringing in some traffic. That it dropped to #6 rather than off the first page entirely indicates the page can likely be saved with a little work.

On the other hand, take a look at the query [bill paying system]. Here in pre-algorithm change Australia, an Associated Content page ranks #5 (it may have ranked even more highly in the U.S. before the ranking shift as one of the pages that ranks above it here is a .com.au).

Bill Pay

In the U.S. that same page now ranks #17. Let’s take a closer look at that page.

Associated Content

The advice?

  1. Create a bill payment chart. This step has amazingly insightful tips such as:
    • In the first column, write “Bills”.
    • Notice that the “bills” column is wider than the rest, since you’ll be writing the names of your bills in rows beneath the “Bills” heading.
    • Make lines going across so there are empty boxes underneath your month headings.
  2. Pick a Place for Your Bill Payment Chart and Your Bills. These tips include:
    • I strongly suggest putting them somewhere where you’ll have easy access and visibility so you don’t forget to pay your bills.
  3. Choose a Day of the Week to Pay Your Bills.
  4. Once a Week Pay Your Bills and Check Them Off Your List.

The author concludes that she’s never paid a bill late since “creating” this “bill paying system”. I can hardly believe she’s giving away such wisdom for free.

Look For Patterns

When I asked Google for their advice for site owners, they told me:

Google has always said that the best practices for sites are to work on original content, original research, authoritative information, or other compelling ways of adding value for users. A site that users love or mention to their friends is the sort of site that Google typically wants to return in our search results. A good litmus test is to ask “What unique or compelling value does my site offer that other sites don’t?”

They had earlier told Search Engine Land:

Sites that believe they have been adversely impacted by the change should be sure to extensively evaluate their site quality. In particular, it’s important to note that low quality pages on one part of a site can impact the overall ranking of that site.

This may sound like a lot of non-information, but we can actually learn a lot from it.  When the news of this change was first announced, someone emailed to ask what changes I plan to make in the recommendations I give people about how best to optimize for search. I said that my advice is exactly the same as before. Don’t optimize for algorithms; optimize for what search engines are trying to accomplish with their algorithms and ensure that your pages are the most valuable results for searchers.

In a recent Wired interview, Matt Cutts and Amit Singhal provided even more detail. They noted that with the launch of the Caffeine index, they could crawl and store more of the web.

The problem had shifted from random gibberish, which the spam team had nicely taken care of, into somewhat more like written prose. But the content was shallow. It was like, “What’s the bare minimum that I can do that’s not spam?”

They don’t claim to have solved the problem of distinguishing high quality content from poor, but they have put together concrete signals that give them clues they can use.

That’s a very, very hard problem that we haven’t solved, and it’s an ongoing evolution how to solve that problem. We wanted to keep it strictly scientific, so we used our standard evaluation system that we’ve developed, where we basically sent out documents to outside testers. Then we asked the raters questions like: “Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?” … . “Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?” [Our] job is to find a plane which says that most things on this side of the plane are red, and most of the things on that side of the plane are the opposite of red.

Creating a Tactical Plan From Google’s Advice

If we break down what Google’s saying, it’s that ways to avoid being negatively impacted by this (and other) changes are to have a site with:

  • Original content and original research (not aggregated or syndicated from other sources)
  • Authoritative information (deep and useful content, not simply words about a topic; content that answers people’s questions and that they find credible)
  • Compelling added value (if the content isn’t unique, does the page add significant value over the original source?)
  • Significant user engagement, including links and social sharing
  • Valuable content across the entire site

Look at the pages of your site that are still doing well and the pages that now rank highly for the queries you no longer rank for. Are there patterns that match the bullets above? Certainly it will be the case that for some queries you may find that pages that you don’t feel provide value are ranking, but you should find some common elements in pages that are doing well.

It also may be useful to take a look at the sites that have experienced the largest drops. What elements do they have in common? Are there any similarities with your site?

It may be tempting to focus on sites that you consider low quality that are still ranking well. However, Google has been very clear that this algorithm change is a work in progress, so those sites may not continue to rank well for very long.

Original Content

Do your pages consist of original content or primarily of content aggregated or syndicated from other sources? For my fake car site, my dealership pages consist primarily of business description information that’s replicated across a number of local directory sites.

In the case of healthcastle.com, the same content exists across multiple pages of the site.

Healthcastle Duplicate Content

Note that in their case, this content also exists across other sites. According to their press release (and their “open letter to Google”), this is because other sites have stolen it and they are the original source. Google recently launched an algorithm update intended to address this issue, so any scraper sites shouldn’t be outranking the original (and in the cases I checked for this site, they weren’t), but it’s certainly the case that Google’s latest changes didn’t completely solve this problem. If scraper sites outranking yours is the primary issue you find, it may be worthwhile to post to the Google webmaster discussion forum as described later in this article.

If you syndicate your content or publish content that authors also publish on their own sites, look at requiring use of the rel=canonical attribute.
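If syndication partners are willing to add markup, the canonical reference is a single link element in the head of the syndicated copy pointing back at your original page (the URL here is only a placeholder):

```html
<!-- On the syndicated copy, inside <head>: -->
<link rel="canonical" href="http://example.com/original-article" />
```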

Authoritative Content

It’s not enough for content to be original. It also should provide substantial value. Do your pages provide the very best information for the topic? Do pages contain more ads than content? The healthcastle.com page is very ad-heavy and while I can’t say definitively that is contributing to its rankings decline, there’s significantly more advertising than authoritative, valuable content above the fold. Note that I’m not at all saying that sites shouldn’t have advertising or that pages with advertising on them will rank below sites without them. I’m simply saying that if a page consists primarily of advertising rather than content, there’s very little on the page that fits the authoritative and compelling value criteria that Google describes.

Acid Reflux Diet Page

As for the content on the page, it’s not bad. But it hasn’t been updated since 2006, so the information could be out of date. Does any newer research exist about this topic? The article contains two myths and six recommendations so I’m not sure if it’s as comprehensive as it could be.

How does this compare to the pages that are now outranking it? The first result is an about.com page, which has itself been criticized for potentially being a content farm.


It also has its share of ads, but more content is available above the fold. It was last updated at the end of 2010, so potentially contains newer medical data. It contains a much more complete list of foods, along with links to all kinds of resources to related information. It’s arguably a better result for searchers.

For queries your site has lost rankings for, look at the pages that now rank well. Even better, have someone not involved with the site take a look. How do the search results look? How does your site compare to the pages that rank more highly? Are they offering more value to searchers?

ezinearticles.com, one of the sites hit the hardest, plans to require authors to write a minimum of 400 words rather than the current 250. But note that length alone isn’t a good measure of value. Google is trying to measure which page across the entire web provides the very best information about the subject matter queried.

Compelling Added Value

It’s certainly possible to rank well with aggregated data, but the page should provide substantial additional value beyond the original source. For instance, take a look at the query [best cell phone rate plan].

Rate Plans

The number one result both before and after the change is a page from myrateplan.com. This page doesn’t have an overwhelming amount of content and the large portions of the site rely heavily on aggregated content from cell phone carriers. However, the wizard featured on that page is highly valuable as it sorts through the details of every rate plan and provides useful comparison charts and recommendations.

Significant User Engagement

One way to gauge potential value is to look at the number of links to the individual pages. If people are linking to a page, they are likely finding it valuable.

Google webmaster tools provides reports on the pages of your site with the most links (to get a true picture, export the data and filter out multiple links from a single site). To get real insight, combine that data with information from web analytics about which links bring the most traffic to the site. Pages that aren’t getting traffic from external links might need a closer look for quality.
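Combining the two exports can be as simple as a dictionary join. A sketch with hypothetical pages, link counts, and referral visits:

```python
# Hypothetical exports: external link counts (from webmaster tools)
# and visits referred by those links (from web analytics).
external_links = {"/acid-reflux-diet": 224, "/bill-pay-tips": 75, "/about": 3}
referral_visits = {"/acid-reflux-diet": 1200, "/bill-pay-tips": 0, "/about": 15}

# Pages with many links but no referral traffic deserve a quality review:
# those links may not be editorial.
suspect = [page for page, links in external_links.items()
           if links > 50 and referral_visits.get(page, 0) == 0]

print("Pages with many links but no referral traffic:", suspect)
```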

According to Yahoo Site Explorer, that healthcastle.com page has 224 external links (although once you consolidate multiple links from each site, that number is greatly reduced). The best way to judge which of these indicate that users are finding the site valuable is to filter the list to include only editorial links from “real” sites. And one of the best ways to generate that list is using data on whether or not those links bring traffic.

Site Explorer Links

But since we don’t have that information, we can take some guesses. The link from suite101.com? Likely not being valued all that much by Google considering that Google’s Matt Cutts’ response to the fact that they had lost 94% of their search traffic was “Oh, yes. Suite 101, I’ve known about it for years. I feel pretty confident about the algorithm on Suite 101.”

How about this link: http://awccanadianpharmacy.com/blog/page/2/?


(The random Viagra links in the middle of each blog post are a nice touch.)

And why are sites about single cup coffee brewers linking to a page about acid reflux? If your external link profile looks like this, it’s time to dig deeper. Where did these links come from and why aren’t more of the incoming links from reputable sites?

Web-wide Engagement

How about engagement? If the site allows comments, is your audience actively discussing the topics on your site? Are they sharing the content on social media sites?

If the pages have few links and little engagement and the quality of the content is high, you may need to raise awareness that the content exists through traditional marketing channels (such as press releases, guest blog posts, etc.) and social media channels (for instance, seeking out discussion forums where your topic area is being discussed and answering questions).

Visitor Behavior on Site

Take a look at your web analytics. Is there a high bounce rate from search? If so, searchers may not have a good first impression of your site and may be going back to the search results to click on another listing. Even if the site has the most useful and compelling content in the world, if that’s not obvious to the searcher due to overwhelming ads, bad design, or other factors, they won’t stick around.

How’s your click through rate from search results for queries in your topic area? Look at the data provided by Google webmaster tools before February 24th for queries your site ranked highly for. How’s the click through rate? If searchers weren’t clicking your listing, investigate why that might be and how to improve the title and meta description on the pages to better compel clicks.

How many pages do typical visitors view? Does the navigation make it easy to see related content and to understand the structure of the site and what it has to offer?

Does the site get many return visitors? If the site isn’t engaging and delighting your audience, what’s missing and what can you do to change that?

Low Value Pages

When looking at the data for your site as described earlier in this article, are there certain pages or sections of the site that have been impacted significantly more than others? (The earlier example that showed a page that had an average ranking decline of 600 positions is a good candidate.) Look at those pages to evaluate if quick changes can improve them.

If that seems unlikely, consider removing those pages or blocking them with robots.txt until you’re able to devote resources to improving their quality. Blocking with robots.txt is a good option if the pages get traffic and provide revenue from other acquisition channels; removing the pages is a good option if they hurt the overall credibility of the site for visitors.
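A temporary block of a low-value section is just a couple of lines in the site’s robots.txt (the directory name here is only an example):

```
User-agent: *
Disallow: /faq/
```

Remember to remove the rule once the section has been improved, or those pages will stay out of the index.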

Some pages, such as the Associated Content bill pay system article mentioned above, may be better scrapped entirely.

Make a Prioritized Plan

At this point, you should have compiled enough data to make strategic decisions about how to proceed. You know:

  • What pages are still driving substantial traffic and should be protected.
  • What pages can be improved fairly easily.
  • What pages would take considerable resources to improve and may be hurting the overall perception of the site.

You also likely have a good sense of what issues may be causing the rankings declines. Do the pages need more original content? More valuable content? Do you need to raise more awareness of the content in order to increase engagement and links?

Start with the pages or sections of the site that have had only slight drops and for which you can identify improvements. If you see consistent patterns, you might consider how you could change the overall processes for creating content, site design, and engaging with visitors.

  • Do the pages contain more ads than content?
  • Does the site provide exactly what your visitors want?
  • Can you think of creative ways to use the data you have differently (such as in the form of wizards or visualizations)?
  • Is the first impression of the site engaging and credible?

Your initial reaction after this review may be that, even after identifying what content has suffered the biggest drops, you don’t see obvious quality issues. If that’s the case, try to get feedback from an objective third party. Have them answer the same questions Google set up, such as:

  • Would you be comfortable giving this site your credit card?
  • Would you be comfortable giving medicine prescribed by this site to your kids?
  • Do you consider this site to be authoritative?
  • Would it be okay if this was in a magazine?
  • Does this site have excessive ads?

It could be that this is an opportunity to learn ways to provide value to your audience and subsequently not only increase search traffic, but return visitors, conversions, and loyalty as well.

File A Reconsideration Request?

In many of the discussion threads, site owners have wondered if they should submit a reconsideration request after making changes to the site. Since this Google change is an algorithmic search quality change, and not a penalty applied to specific sites for webmaster guidelines violations, a reconsideration request generally would not apply. However, if, during the evaluation of your site, you find webmaster guidelines violations and clean them up, then a reconsideration request would be in order.

Reaching Out To Google

What if you do all of this analysis and you can’t identify obvious quality issues with your content? What if scraper sites now outrank you with content they stole from your site? As noted earlier, Google has started a thread in their webmaster discussion forum so you can provide them with more details.

Before you embark on this route, it’s important to keep in mind that even if Google agrees that your site was collateral damage and shouldn’t have been swept up in this change, you may not see things fixed quickly. Google has been very clear that this was an algorithmic change that overall improved search quality. Because it didn’t target specific sites and is based solely on signals across the web, Google can’t manually restore a site’s rankings. And they aren’t going to roll back the change. Instead, as they continue to refine things, they will use the examples from this thread.

As they posted:

Note that as this is an algorithmic change we are unable to make manual exceptions, but in cases of high quality content we can pass the examples along to the engineers who will look at them as they work on future iterations and improvements to the algorithm.

Some have taken this to mean that this is a special case and they realize their algorithm change wasn’t as great as they originally thought. That’s unlikely to be the case. Google tweaks its algorithms hundreds of times a year and with every change, they continue to refine based on searcher behavior and feedback. In fact, this type of feedback is one of the primary reasons I created the webmaster discussion forum and webmaster trends analyst position when I worked at Google. Search engineers can use this information to look more closely at specific examples and make improvements.

Crafting Your Post

If you truly believe your site was caught up in collateral damage, you are likely understandably upset. But the most useful kind of post is one that’s reasonable and objective. Provide specific details, such as queries and URLs that have lost ranking. If your content is being scraped, point out examples. Mention that you’ve gone through the type of analysis and evaluation described in this article and logically describe the ways your pages provide value for the example queries over the pages that now rank.

Avoid using charged words and being defensive. Don’t tell Google how to change their algorithms; just provide facts. For instance, the first post in the thread includes comments such as “this move violates the democratic principles of the Internet.” That’s a subjective statement that doesn’t provide Google with any useful information.

Don’t just ask Google to evaluate or reconsider your site or just say that your site is high quality. Provide specific examples. For instance, this post gives Google very little to work with:

“We’ve noticed numerous instances where our top (1st-2nd) SE rankings have dropped several places to several pages.  This is a huge and devastating hit to the well-being of our website.  Please review…”

And this post, while understandable, isn’t the intent of the thread:

“Please look at my site and tell me what is wrong with it.”

This post, however, provides actionable, detailed information:

For example, we wrote a review for the latest episode of the CBS show “The Good Wife”.

We are ranking for this review on page 3 for the keyword:

“THE GOOD WIFE “Great Firewall” Review”

But there is a site that scraped our feed that is ranking for the same keyword on page 1.

It’s not useful to mention how much you spend on AdWords or that your site was featured by AdSense. Google’s unpaid search results are not influenced in any way by a site’s relationship with AdWords or AdSense.

When To Expect Changes

If you identify areas of your site to improve, you should see changes in search results not long after you make those improvements. As Google recrawls and re-analyzes your site, the new signals associated with your pages based on your improvements will be used for future ranking.

If you feel your site was collateral damage and Google agrees, it’s difficult to say when you might see changes. Google has said they’ll be making changes in “the coming weeks,” but in reality search engineers are always tweaking the algorithms, and this will be a long-term iteration. Google is likely to look over the feedback to pinpoint large-scale patterns, then tackle those first.

As you improve your site, make sure you are also maximizing other acquisition channels and engendering loyalty from your existing customer base. Give your customers reasons to come back to your site again and again. Reach out to your potential audiences where they are: other sites, social media, offline. A comprehensive strategy that doesn’t rely solely on search will help get your business through the peaks and valleys inherent in the changing nature of the web.

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.
