Tools for Automating Tasks for Google Reconsideration Requests and Profile Cleanup
Tools can eliminate a large portion of the sites from the link review list for your Google Reconsideration Request. First you want to identify all the deindexed domains/pages, link networks, and spammy sites that the tools will uncover. In my experience they are accurate enough to warrant using them to reduce the number of manual reviews you do, which is by far the most time-consuming part of the job! Some tools are used only to collect links for review.
Using only the Google Webmaster Tools list of links is likely not a good idea, so use as many sources as you can find and afford. One trick I learned is to give each tool's list a different font color, so when you merge the lists in Excel it is clear where the duplicates came from. That helps identify sites found by numerous tools, which again speeds up the review process. You can also find links whose characteristics mean they need no review, will need special attention, or can be removed from the list immediately; for example, nofollowed and 302ed links pass no juice so… carry on!
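In place of the font-color trick, the same merge can be scripted: tag each URL with every tool that reported it, so multi-tool duplicates stand out. A minimal sketch; the tool names and sample URLs below are illustrative only, and real exports would be loaded from each tool's CSV.

```python
# Sketch: merge link lists from several tools, tagging each URL with the
# tools that found it. URLs reported by several tools deserve closer review.
from collections import defaultdict

def merge_link_lists(exports):
    """exports: {tool_name: [urls]} -> {url: set of tools that found it}"""
    sources = defaultdict(set)
    for tool, urls in exports.items():
        for url in urls:
            sources[url.strip().lower()].add(tool)  # normalize before deduping
    return dict(sources)

# Illustrative data; in practice, read each tool's CSV export instead.
merged = merge_link_lists({
    "majestic": ["http://example.com/a", "http://spammy.example/x"],
    "ahrefs":   ["http://example.com/a"],
    "gwt":      ["http://spammy.example/x", "http://example.org/b"],
})
for url, tools in sorted(merged.items()):
    print(url, len(tools), ",".join(sorted(tools)))
```

Sorting the merged list by tool count puts the most widely reported links at the top of the review queue.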
Gathering Your Link and Scraped Content Data
Gathering your link data is where you should start! We have used all listed below:
- Majestic SEO
- Google Webmaster Tools
- Bing Webmaster Tools
- Link Detox
- Link Prospector
- Client list
- Raven Tools
Tools for Gathering Link Data
Searchmetrics: We can get a list of backlinks from it, and the analysis tools are great for getting a comprehensive overview of a site's link profile. In particular, geolocation by country and other link metrics are useful for cutting review time. Using the competitor analysis, it is easy to get a quick snapshot of a site's link profile, which again may lower the time spent reviewing the site. This is more useful when you are doing the actual removal reviews.
Majestic: The Majestic toolset provides many useful metrics to quickly check aspects of the linking domains and anchor diversity, among others. It also provides a quality indicator, but… I'm not a fan of those metrics from Majestic.
ahrefs: Another great tool that will find links others missed. It also has some very useful link info, such as whether the link is nofollowed, whether it is a form or image link, the anchor text, and more. Its data format is our preferred one, as it has most of what we need to send the message to the webmaster, or to skip the review if the link is nofollowed.
Webmaster Tools: Google and Bing Webmaster Tools both provide lists of links and a disavow tool. I do not know much about the Bing tools; for the most part I don't see Bing sending a lot of traffic, so I'm more worried about Google. Google's link tools are the ones most often mentioned for this; however, IMO they are not very reliable, since the tool only supplies a sampling. That said, I always start a forensic audit in GWT looking for clues like "intermediate links," which show sites Google knows are redirecting, or cases where A links to B and both links sit on a third-party site (often a directory). This is very important for unnatural linking, because it indicates that the footprint of a link network may have been discovered.
LinkDetox: I was very skeptical at first about the ability of an automated tool to detect what takes me hours to do manually. So I ran the tool for a site I was working on, and it returned no false positives and uncovered many sites that other tools and Google Webmaster Tools did not find. I was a little underwhelmed that it charges for anything of value; I don't want to register for the other tools when this is all I wanted or am the least bit curious about, so charge me for one-time usage. As for the actual disavow file, it was poorly formatted, had no delimiters, and required a manual edit to get it into any kind of usable form; the comments, or lack thereof, are quite simply a joke. Does it do a good job of identifying bad links? Yes! But I think the data formats are poor. I'd rather pay more and get usable data without subscribing to yet another link analysis tool!
Link Prospector: I use this a little for collection, but its real value is in detecting deindexed sites and scraped content. I can't begin to say enough about this tool. I have, for the most part, been replaced by Link Prospector; all I really still use is my knowledge of Google operators. Link Prospector gets the results and puts them in a nice spreadsheet I can use any way I choose.
I am a webmaster who has for the most part forsaken tools, because I just don't make decisions based primarily on data. I want to see and experience the SERP, then cut and paste the results into Excel. You can use this tool to test sites before moving them along the process to being reviewed. Do I really need to review a deindexed site or page, or scraped content? No, you are pretty safe in assuming they are low-quality pages Google has removed for some reason.
Client List: Mainly looking for a list of paid links, or any others the client recorded.
Raven Tools: Provides a list of links, but IMO I did not find it overly useful or adding anything another service didn't already do.
Tools for Pre-Processing the Review List
These tools are an important part of the process because you can use them to add sites/pages to the removal list without actually reviewing them! "How can you do that? Won't you make some mistakes?" No! These sites are flagged by Google for you: a page or site that is deindexed is likely not the quality you want, and it is definitely not doing anything positive for your site, since it is not even in the search index. Scraped content is the same deal!
Check the Page for Your Link!
The first thing I did when I started this procedure was think about the different reasons that could result in no review being required. The list included sites that have already removed the link, whether by an error or without being asked (don't record it as good or bad… just don't review it). All of these can be removed from the list! Once you have established there is still a link that may need to be removed, it goes into the master list. This is the only check that requires no further processing.
I did not actually think of this until I started reviewing sites and realized some had the links removed, some did not resolve (shuttered?), and other tests can be done by simply requesting the file, checking the response code, and checking for your link. If any of these fail… nothing for you to do here!
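Those three checks can be scripted with just the standard library. A minimal sketch, assuming a simple substring match is enough to detect your link in the HTML; the user-agent string and timeout are arbitrary choices.

```python
# Sketch: pre-screen a page before manual review. If the page no longer
# resolves, returns a bad status code, or no longer contains a link to
# your domain, it can be dropped from the review list.
import urllib.request
import urllib.error

def needs_review(page_url, my_domain, timeout=10):
    """Return True only if the page loads OK and still mentions my_domain."""
    try:
        req = urllib.request.Request(page_url, headers={"User-Agent": "link-audit"})
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            if resp.status != 200:
                return False          # bad response code: skip review
            html = resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, OSError):
        return False                  # site doesn't resolve: skip review
    return my_domain in html          # link gone: skip review
```

Anything that comes back `False` is crossed off the list before a human ever looks at it.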
Link Detox: EEEEEEEEK My Spam Network has Been Detected!
LinkDetox is very good at finding networks and deindexed sites. In the case of link networks, you should begin shuttering the network. Note I did not say redirecting! IMO, only pages that rank should be redirected. We have seen cases where redirecting may carry some bad mojo, so be careful what you redirect to the money site; it may be toxic! The safe bet: if the page doesn't rank, it has little link equity anyway. Where it does rank, it is better to move the content to the main site with a 301 from the original. This is a clear indication to Google that you are taking the process seriously.
Many sites have a small blog network, or a network of resource and reference sites supporting the main site. These were attacked pretty savagely, with many getting unnatural link messages. If you were using these techniques and are doing a reconsideration request, IMO this is where you start the cleanup… in your own backyard! This is also one of the offences that has been on Google's radar pretty much since day one. If you have a link network and Google knows about it, the reconsideration will never go through to remove the penalty until you deal with the network. So deal with it!
Since these networks are often a big part of the reason the site is not ranking, dealing with them will show up in the rankings almost immediately. If nothing happens, you can be almost sure the sites were not contributing to the rankings; I'm pretty sure it won't be because Google didn't find the network. It is likely that all the equity was drained off by Google simply ignoring all links on the site. The sites returned in this report are set for the contact/final phase of the review and removal process. If no contact info is found, add them to the disavow list, because if you can't contact them you must disavow the link.
Finding Deindexed Sites in Your Backlink Profile
The easiest way to find deindexed pages and sites is to do a site: search on Google. But doing them one at a time and updating your spreadsheet is tiresome and very time consuming. Enter Link Prospector! For this, you add a column to your spreadsheet and concatenate the site: syntax and the domain in a separate column. Then you open a custom report and paste that column from the spreadsheet into the custom report. I strongly suggest setting the number of results to more than 1 but not more than 3; more than that and you are wasting bandwidth and tokens, and if you set it to 1 you get only results from Google! Run the report! There is one other test I do when the URL is not the root domain: I test for the root domain as well.
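The concatenation the spreadsheet column performs can be sketched in a few lines, including the root-domain fallback for non-root URLs. The output list is what gets pasted into the custom report; the sample domains are placeholders.

```python
# Sketch: build site: queries for a deindexation check. For each URL that
# is not a root domain, also emit a query for the root domain itself.
from urllib.parse import urlparse

def site_queries(urls):
    queries = []
    for raw in urls:
        # urlparse needs "//" to recognize a bare host as the netloc
        parsed = urlparse(raw if "//" in raw else "//" + raw)
        target = (parsed.netloc + parsed.path).rstrip("/")
        queries.append("site:" + target)           # exact page or subfolder
        if parsed.path not in ("", "/"):
            queries.append("site:" + parsed.netloc)  # root-domain check too
    return queries

print("\n".join(site_queries([
    "http://spammy.example/articles/page.html",
    "linkfarm.example",
])))
```

Zero results for both queries is a strong hint the site has been deindexed and can skip manual review.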
Finding Scraped Content
I use Link Prospector for this as well. The technique is to use a sentence from the last paragraph; I'd avoid the last sentence, and be sure to quote it! Drop it into a custom report and, voilà, when it is through processing you have a list of all the sites scraping your content. I am not positive, but it seemed that in cases where the scraper's PR is higher than your domain's, you are likely being outranked. DMCA that sucker! You can also add these to your list of removals for final processing. You still have to review the sites, but loading the offenders into a spreadsheet and batching the queries saves a ton of time and energy.
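Picking the query sentence can be automated too. A small sketch following the advice above: take a sentence from the page's last paragraph, skip the final sentence when possible, and wrap it in quotes for an exact-match search. The paragraph/sentence splitting rules here are simplifying assumptions (blank-line paragraphs, period/!/? sentence ends).

```python
# Sketch: choose a quoted exact-match query for a scraped-content check.
import re

def scrape_query(page_text):
    paragraphs = [p for p in page_text.split("\n\n") if p.strip()]
    last_para = paragraphs[-1]
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", last_para) if s.strip()]
    # avoid the very last sentence when there is more than one
    pick = sentences[-2] if len(sentences) > 1 else sentences[0]
    return '"' + pick + '"'
```

The returned string goes straight into the custom report, one query per page you want checked.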
Final Processing to Get Your Final List of Reviews for Link Removal
Once the tools identify "problematic" or non-reviewable links, I can remove them from the review list by looking for contact info on the website. Non-reviewable links are links that do not need review because Google has done the work for you (by deindexing the site or page) or another tool has identified a network or scraper. Once I have determined there is no contact info in the form of an email address and no contact form/page, the link can be added to the disavow list as an unwanted link whose owner you are unable to contact! These should be removed from the review-for-removal list and added to the disavow file as unreachable contacts. You could also do a whois lookup, but automation there is not easily found, and often these domains are not public; the records are locked down.
Once your link and content profiles are cleaned up, you can do monthly (or less frequent) monitoring to keep them healthy, which makes you wealthy and a wise SEO!
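Assembling the disavow file from the unreachable-contact list is mechanical. A minimal sketch of the format Google accepts: one entry per line, `domain:` prefix to disavow a whole domain, and `#` comment lines; the example inputs are placeholders.

```python
# Sketch: emit a disavow file from the unreachable-contact list.
def build_disavow(domains, urls=()):
    """domains: whole domains to disavow; urls: individual pages."""
    lines = ["# No contact info found; removal request could not be sent"]
    lines += ["domain:" + d for d in sorted(set(domains))]
    lines += sorted(set(urls))
    return "\n".join(lines) + "\n"

print(build_disavow(
    domains=["spammy.example", "linkfarm.example"],
    urls=["http://other.example/paid-link-page.html"],
))
```

Save the output as a plain-text file and upload it through the disavow tool in Google Webmaster Tools.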