Over the Christmas period of 2015, a client came to us about one of their many sites getting hit with a penalty. We hadn’t been doing any work on this site prior to this – it had been managed by their inhouse team, and had suffered as a result of some very questionable link building tactics (and if I’m saying they’re questionable, you know they’re downright questionable) and rather poor site management, with duplicate content and many other OnPage issues throughout.
We managed to get the penalty removed and grow their traffic to more than it ever was previously.
The penalty this site got was a “Partial Match Unnatural Links Penalty” – These kinds of penalties don’t penalise the entire site, but instead penalise one or a number of pages on the site.
This means that the direct cause of the penalty is the backlink profile of a specific page, or a number of pages. We also wanted to work on the OnPage issues whilst removing the penalty, so that once it was removed, we’d see a significant jump either back to the previous rankings or beyond.
- 6 Year Old Domain (US Based Company, .com)
- $3,500 For Penalty Removal
- One Time $8,000 OnPage / Content Cost (No Link Building Campaign)
- Travel / Tourism Niche – Avg. Sale of $700
Removing The Penalty
The links that had gotten this site in trouble in the first place were a mixture of poorly put together web directories, spammed article directories and random pages from all over the place – random wiki submissions, comment links, the works.
We had 3 steps we needed to follow to get it removed.
Manual Link Removal
Google appreciates the effort put in to remove links that have been placed, and when submitting the reconsideration request it’s a good idea to showcase links you’ve actively removed – it shows that real effort was put in.
We had our link building assistant go through the exported link data from WMT, Ahrefs and Majestic, manually working through a 17,000-backlink profile (after removing all duplicates) and creating a new document with all of the “bad links”. There were only around 2,300 good backlinks, so that gives you an insight into how bad things got.
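The merge-and-dedupe step can be sketched in a few lines. This is a hypothetical illustration, not our actual tooling – the file paths and the “URL” column name are assumptions about what the CSV exports from each tool might look like:

```python
# Hypothetical sketch: merge backlink URL exports from several tools
# (e.g. WMT, Ahrefs, Majestic) into one de-duplicated list.
import csv


def load_urls(path):
    """Read the URL column from a CSV export (assumes a 'URL' header)."""
    with open(path, newline="", encoding="utf-8") as f:
        return [row["URL"].strip() for row in csv.DictReader(f)]


def merge_unique(*url_lists):
    """Merge several URL lists into one, dropping duplicates, keeping first-seen order."""
    seen, merged = set(), []
    for urls in url_lists:
        for url in urls:
            if url not in seen:
                seen.add(url)
                merged.append(url)
    return merged
```

From there, each URL still has to be judged “good” or “bad” by hand – that part can’t be scripted.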
We then ran a tool we use to grab the contact information/page of every domain. The 15,800 backlinks were from 780 referring domains, but due to the spammy nature of these domains, only around 400 or so had contact information available. Our link building assistant created a new email address under the client’s domain and sent emails to the 400 or so domains we had found.
Only around 45 of the 400 domains we sent emails to actually removed the links. This really doesn’t matter all that much though, because we still had evidence to show in our reconsideration request that we had managed to remove a number of links.
So we had a list of 15,800 spammy backlinks from 780 referring domains.
We needed to whittle this list down, so that we only had links that were actually live on the site, so I:
- Removed the 45 domains (and all associated links) from the Excel doc – Down to 735 domains / 15,100 links.
- Ran a live link check on the remaining 15,100 (or so) links, and removed any that were no longer there – Down to 615 domains / 11,000 (or so) links.
- I ran a final uptime monitoring test on the 615 domains, removing any that were down for longer than the 24h test period – Down to 612 domains / 11,000 links.
After the live link checker ran and found that over 4,000 of the pages had no live links, I took those 4,000 pages and ran them through 2 premium indexing services so that Google would re-crawl them and update its index to see the links were no longer live, removing the need to include them in the disavow file.
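A live-link check like the one above can be sketched as follows. This is a minimal illustration under one assumption: a backlink counts as “live” if the linking page still contains an `href` pointing at the client’s domain (function names and the user-agent string are mine, not from any specific tool):

```python
# Minimal live-link check sketch. Assumption: a backlink is "live" if the
# linking page's HTML still contains an href pointing at the target domain.
import re
from urllib.error import URLError
from urllib.request import Request, urlopen


def contains_link(html, target_domain):
    """True if any href attribute in the HTML points at target_domain."""
    hrefs = re.findall(r'href=["\']([^"\']+)["\']', html)
    return any(target_domain in href for href in hrefs)


def link_is_live(page_url, target_domain, timeout=10):
    """Fetch the linking page and check whether the backlink is still there."""
    try:
        req = Request(page_url, headers={"User-Agent": "Mozilla/5.0"})
        html = urlopen(req, timeout=timeout).read().decode("utf-8", "ignore")
    except (URLError, ValueError, OSError):
        return False  # unreachable page -> treat the link as no longer live
    return contains_link(html, target_domain)
```

In practice you’d run this over the full URL list with some rate limiting, and re-run the unreachable ones later (hence the separate 24h uptime test).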
I kept a separate spreadsheet at each stage of working on the links.
Once I had these spreadsheets, I got our inhouse link building assistant (Dan S.) to work on the disavow file itself:
- For the links we couldn’t find contact info for, Dan inserted the domain and added a “We couldn’t find the contact info for this site” comment prior to the domain.
- For the sites we contacted that didn’t get back to us or didn’t delete the links, we added the comment “We contacted this company via email on 2 occasions, but they did not get back to us or delete the links”.
- The 3 domains that weren’t currently up (found through the 24h downtime test) Dan put into the file with the comment “These domains aren’t currently up, but may be in future”.
Here’s what an example disavow file looks like:
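As a sketch (the domains here are placeholders, using Google’s `#` comment lines and `domain:` entry syntax):

```text
# We couldn't find the contact info for this site
domain:spammy-directory.example

# We contacted this company via email on 2 occasions, but they did not
# get back to us or delete the links
domain:article-spam.example

# These domains aren't currently up, but may be in future
domain:offline-site.example
```

Disavowing at the `domain:` level catches every link from that domain, which is usually safer with spammy sites than listing individual URLs.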
We then uploaded and submitted the disavow file.
Now that we had over 40 links deleted and an up-to-date disavow file submitted, I had Dan write up a reconsideration request explaining the process we took, showcasing all 45 of the links we had deleted, and including samples of the emails we had sent to the ones that did not respond.
We submitted the request on December the 15th, and on December the 27th/28th had the request accepted and the penalty removed – Normally penalties get responses quicker, though we expected some delays over the holiday periods.
Our initial impression of the OnPage, from a quick review, was one of utter horror. I decided it’d be best to do a full site audit as soon as possible, so we could get a true idea of the extent of work required.
Once we did the audit, I had my developer extract the entire site and set it up on a local server, so that we could make our OnPage changes offsite and then upload the changed site once we had finished. We did, however, build and upload the sitemaps only after the changes were live, as you can get a lot of bugs building sitemaps for large sites away from the live domain.
I had Matthew (our OnPage specialist) do a complete audit of the site, which he spent around 12 hours total creating.
We noticed immediately that the site had:
- Poor Meta Data (Auto-Generated from the CMS)
- Duplicate Content Problems
- Not Much Internal Linking
- Poor Sitemaps (Literally just covering the main pages and category pages, no product/blog posts etc)
Matthew used a plugin in the CMS they were using to allow us to update the meta data, as previously every page’s meta data was automatically generated from the CMS.
He systematically went through every page on the site (using the crawl from Meta Forensics) and updated the meta title and description with customised versions, rather than the previously auto-generated (with no keyword usage) titles; most pages had no meta descriptions at all.
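For illustration, an updated page head might look something like this – the copy and keywords here are invented, since the client’s actual pages aren’t shown:

```html
<!-- Hypothetical example: hand-written meta data replacing the
     CMS-generated defaults. Title and description copy are invented. -->
<head>
  <title>Luxury Caribbean Holidays | Example Travel Co</title>
  <meta name="description"
        content="Tailor-made luxury Caribbean holidays with custom quotes.
                 Browse our latest special offers or request a quote today.">
</head>
```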
Duplicate Content Fixes
Due to the way the CMS was displaying options on the category pages, it was causing a ton of duplicate content – so we added canonical tags to each of the pages spun off from the main category page, pointing them back to that category page.
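A canonical tag like this, in the head of each spun-off page, tells Google which version to index (the URL is a placeholder):

```html
<!-- Sketch: a filtered/paginated category URL declaring the main
     category page as its canonical version. URL is a placeholder. -->
<link rel="canonical" href="https://example.com/tours/caribbean/">
```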
The weird, out-of-date commenting system the CMS had put on the blog posts also created an additional page for every comment a user made on the posts. We had the developer hotfix this and update the commenting system to a newer version, which stopped the issue and removed about 40 duplicate post pages.
Due to the site being re-designed 4 times in 6 years and changing CMS twice in that timeframe, it had a ton of broken links, both onsite and from external links pointing to pages that no longer existed.
I then had Matthew export the 404s from Webmaster Tools and run a Screaming Frog scan to check for any 404s onsite. He found all the 404s, then found the relevant pages he could 301 them to.
He then formatted the 404s/301s into a spreadsheet, which resulted in over 100 404s. He sent the spreadsheet to the developer who updated all the 404s to 301 to the relevant pages.
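Assuming an Apache setup (the actual stack isn’t stated in the write-up), each row of that spreadsheet maps onto a one-line redirect rule, for example:

```apache
# Hypothetical .htaccess fragment: old 404ing paths 301'd to the
# closest relevant live pages. All paths here are placeholders.
Redirect 301 /old-tour-page /tours/caribbean/
Redirect 301 /2013/summer-offers /special-offers/
```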
The site had over 50 blog posts from the past 6 years, so we had our writer manually go through every post in the blog, fixing any grammatical/punctuation errors, adding (where relevant) additional content and schema markup/structured data (such as a table of contents), and improving the overall OnPage of every post on the blog.
We had him make notes on the updates he was doing, and this added 78 internal links and over 3,900 words to the site.
The previous sitemap literally only covered about 20 pages, on a site that had well over 200 unique pages.
We split the sitemaps into:
- Main Pages (Home, Blog, About etc)
- Blog Posts & Blog Categories
- Special Offers (Due to this site being in the holiday niche, it’s custom quotes rather than fixed-price products, though it does have special offers with fixed pricing that change frequently)
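A split like this is typically tied together with a sitemap index that references the individual files. The file names below are placeholders, not the client’s actual sitemaps:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch of a sitemap index referencing the three split sitemaps.
     File names are placeholders. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemap-main.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-blog.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-offers.xml</loc></sitemap>
</sitemapindex>
```

Splitting this way also makes it easier to spot indexing problems in Webmaster Tools, as the indexed-page counts are reported per sitemap.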
Re-Submission / Indexing
After we put the site live, we used Webmaster Tools to fetch & submit the homepage (and all links), deleted the previous sitemaps and submitted the new ones, and fetched/submitted every blog category page on the site (and all links). As we used a lot of internal linking in the content update, this tends to help the crawl rate jump quite a lot.
I also submitted the homepage, sitemaps and category pages through 2 premium indexing services to up the crawl rate further, and submitted a lot of the good quality links they had through indexing services as well.
This led to Google crawling and updating the site, and showing improvements to the traffic in just 5 days.
Once the penalty removal request was accepted, we started to see traffic slowly coming back to how it was before. The real results kicked in once we uploaded the updated site with all of the OnPage tweaks, we saw a massive leap over the course of around a week.
After the penalty was removed on December the 28th, we saw a 100% increase in organic traffic, going from around 50 – 60 visits/day to an average of 110 visits/day.
We started uploading the OnPage changes the week commencing January the 18th, on Friday the 22nd we saw a MASSIVE leap in organic traffic that has stayed at a steady rate, increasing by over 250% to an average of 351 visits/day.
Due to the fairly specialised type of website this is, and the decent link profile that had occurred naturally (rather than the horrible one the inhouse team had tried to build), we didn’t believe it was necessary to run a link building campaign on this website, and its new positions mean it’s dominating its niche/area. As a direct result of this impressive campaign, however, the company that owns this site signed over 3 other websites full time to us in the Summer of 2016.
If you’d like to see similar results, then why not get in contact with my agency today.