Just when life is chugging along at an easy pace, you hit a brick wall. This was the case about two weeks ago when one of my sites went from number one in the search rankings to all but invisible overnight. Needless to say, I was not happy. I was, instead, extremely confused.
Google’s been cranking out notable changes to its engine for weeks now. As far as I could tell, my site was not in violation of any new policies or algorithm changes, so I was puzzled.
Why would a small site that had ranked extremely well for 10 years suddenly up and disappear? I checked every keyphrase I normally ranked for… no Copywriting Course site. I even searched for the actual URL of the home page… nothing.
Then I discovered something. Pages that had been deleted long ago were popping up on the search engine results page (SERP). Weird!
I’ve seen changes come and go. In years past, some huge Google updates have sent my sites into a temporary whirlwind. But this was different. There didn’t seem to be an obvious reason. That’s when I decided to enlist the help of my bud and SEO expert Jill Whalen of the Boston search engine optimization firm HighRankings.com.
The Things Nobody Tells You
After running a site: command in Google, Jill discovered a lot of duplicate content associated with the site. Huh? There’s no duplicate content on that site. Or so I thought.
Yet there it was in black and white. (Or blue and white, as the case may be.)
When I took a look at the actual file names of the URLs, it dawned on me what was going on.
A week or two earlier, I’d hired someone to move this site from one hosting company to another. Before she did, I wanted to go through the files and clean out some old junk I no longer wanted to keep. When I was done, I told her to go ahead and move the updated list of files.
The things nobody tells you. What I didn’t realize was that moving the entire list of files would make all of them live, including all the test index pages I’d saved over the years. Suddenly, the web was being flooded with about a dozen ever-so-slightly different versions of my index page. Not a good thing!
Of course, once they all got indexed, Googlebot choked. And, just as Google has outlined time and again, one page was allowed through and all the others got clipped by the duplicate content filter. They may still have been indexed, but they were filtered out of the rankings. In this case, the only thing that eventually ranked was a YouTube video on one of the test pages. Internal pages from the site showed up in the SERPs, but not the most current index page.
Leaping Into Action
As soon as I discovered what had happened, I deleted the test pages from the server. Then I created a robots.txt file to keep a few other pages from being crawled and indexed.
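In case you’ve never seen one, robots.txt is just a plain text file that sits in the root folder of your site and tells crawlers which pages to stay away from. Mine boiled down to a few lines like these (the file names here are placeholders, not my actual pages):

    User-agent: *
    Disallow: /old-test-page.html
    Disallow: /old-offer-page.html

The User-agent: * line means the rules apply to every crawler, and each Disallow line names a path you don’t want crawled.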
About a week later… poof! The original index page was back in the rankings at number one for my primary keyphrase and in the top seven for other keyphrases. Phew!
Just goes to show that, even if Google has recently bombarded us with changes, ranking shifts may simply be caused by human error.
Want more great website and SEO copywriting articles and tips? Subscribe to Karon’s Marketing Word Copywriting Newsletter. In addition, you’ll get discounts, product announcements and 2 of Karon’s popular SEO copywriting ebooks.
I am happy you got your rankings back. I got a headache just reading the post. But at least now if I ever have such a problem, I will know who to “call” about how to investigate and get it fixed.
Technology is so much fun!
Thanks Gretchen. Nice to have the mystery solved!
Great post Karon – nice to know even the experts have challenging days 🙂 Also, great that you figured that out and shared quickly.
As I was looking at this, as always, I wondered how I’m impacted. Where should I start looking for all those dupes? And with what tool? Any suggestions?
dp
Hi David! You can use the same site: command I mentioned in the article. Just enter site: followed by whatever URL you want to check. It will return a list of the pages from that site that have already been indexed in Google. Then you can see how many (if any) duplicates you have. Mine were easy to spot because they all had similar URLs… index2.html, index-video.html, index-offer.html, etc.
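For example, typed straight into Google’s search box (swap in your own domain; example.com is just a placeholder):

    site:example.com

You can also tack a keyword onto the end, like site:example.com index, to narrow the list down when you’re hunting for stray copies of a particular page.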
I had a client a few years back who had something similar happen when they switched over to a new hosting company, but for a very different reason. To keep the new design from being indexed while it was being tested, they put a block in their robots.txt file. Unfortunately, they forgot to change it when the site went live, so for a month the search engines were effectively blocked. Once they changed their robots.txt, everything was fine. It cost them a lot of business, though, over a simple little error.
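For what it’s worth, that kind of site-wide block is usually just two lines in robots.txt:

    User-agent: *
    Disallow: /

Once the Disallow: / is changed to an empty Disallow: (or the file is removed), the crawlers can get back in.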
As I was reading this, I was reminded that the hosting company you are using can actually be a factor in search engine rankings. If your web site is down a lot due to hosting issues, it could get dumped by the search engines. I realize that’s not the case in this situation, but thought I’d mention it.
I was hit during the “Panda” algorithm change. I had originally used HTML pages for my blog, then switched to WordPress and transferred my old HTML tips over. But I didn’t delete the HTML pages because they were still getting traffic. Bad! I had to put 301 redirects on 400 pages (luckily I had a VA to help me). What a project! But within a few weeks, my rankings for my two main keyword phrases were back up to #1 and #3, respectively. Whew!
Yep! It can happen, Ellen. Glad you got it straightened out.
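For anyone reading along who hasn’t set up a 301 before: Ellen didn’t say what kind of server she’s on, but on a typical Apache host the redirects go in an .htaccess file and look something like this (the paths and domain are made-up examples, not her actual pages):

    Redirect 301 /tips/old-tip.html http://www.example.com/old-tip/
    Redirect 301 /tips/another-tip.html http://www.example.com/another-tip/

Each line tells browsers and search engines that the old HTML page has moved permanently to its new WordPress address, so visitors and rankings follow it instead of hitting a dead page.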