This article covers:
Removing obsolete content: Learn how to remove pages indexed by Google and when to do it
Google pays more attention to webmasters every day and, at the same time, works more efficiently. To that end it created Search Console and gave us control over many things we could not manage before. One of them is URL removal from its search engine, a very useful tool that gives us more control over our blog.
In SEO, control matters, and this tool undoubtedly makes the job easier. Learn how to remove obsolete content from Google.
When do you need to remove indexed URLs?
For many reasons, you may find yourself needing to de-index URLs from Google. Some of them are:
- Web page change
- Attack on our server
- Pages created erroneously
Website change
You have decided to change your website, migrating from WordPress to Drupal, from Joomla! to WordPress, etc., and the change is so big that you will not keep the old URLs, nor will the site have the same sections and posts. You will end up with dozens of URLs pointing to obsolete content.
The first step in removing obsolete content from Google is to redirect as much content as possible to URLs that serve the same purpose. Where that is not possible, you will have to remove the URLs of the sections that did not work out, but only as a last resort.
By this point Google will already have indexed all the old content, so we will have to remove the obsolete content from Google manually to save time.
Attack on our server
You have suffered an attack on your server and now have hundreds of malicious URLs indexed, which Google detects as low-quality content, hurting your on-page SEO.
It is not a common case, but at some point in your life on the web you will probably run into one and need to remove the URLs the attack created. Keep calm: the damage is already done, all that remains is to fix it and hope that, thanks to Penguin, the wait to get back to normal is short.
Pages created erroneously
You created a page with a wrong URL and, of course, noticed too late: it is already indexed. Don't worry, we will use Webmaster Tools to fix this mess in under a minute.
And for next time, think about the URL name beforehand, since the wrong URL may already have been shared on social networks and will start returning errors when accessed from there. As an extra step, we will use .htaccess to 301-redirect the old URL to the new one, so everything is 100% solved.
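As a minimal sketch of that .htaccess redirect, assuming the mistyped URL was /old-post and the corrected one is /new-post (both placeholder paths, not from the original article):

```apache
# .htaccess at the site root, with Apache's mod_alias enabled.
# 301 = permanent redirect: browsers and Google follow the old URL
# to the new one, and link equity is passed along.
# Replace /old-post and /new-post with your real paths.
Redirect 301 /old-post /new-post
```

A 301 (permanent) rather than a 302 (temporary) redirect is the right choice here, since the old URL will never come back.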
Deleting pages indexed by Google: is it possible?
We can remove obsolete content from Google in four steps:
- First, sign in to Google Webmaster Tools with your account, the same one linked to your website or blog, to reach that site's control panel.
- In the left-hand menu, go to Google Index and then click on URL Removal. If you are already signed in to Webmaster Tools, you can get there faster through this link: https://www.google.com/webmasters/tools/url-removal
- Once inside the tool, click on Temporarily hide and enter the URL as a relative path, not an absolute one. If you enter it as absolute, the removal request will be wrong and the URL will never be removed.
- Choose the correct option:
- Temporarily hide and remove from cache: with this option the page will not appear in Google results for the next 90 days. If you do not delete the page or block it on your website, it will be indexed again after that period.
- Remove directory: the effect is the same as removing a single page, but it applies to all content in that directory. Delete the directory from your website or block it with the robots.txt file, or the URLs may reappear after 90 days.
- Remove from cache only: Google removes each requested page from the search engine's cache, but not from the search results.
How long does it take Google to remove a URL?
Generally speaking, it takes between 3 and 24 hours for a URL to disappear from the search results, and then you can finally breathe easy, knowing those URLs will no longer be found and those pages will no longer be visited.
To avoid future indexing, we advise you to modify the robots.txt file so that unwanted content is not indexed again. To block access for search engine spiders, you must add a Disallow line for the affected paths.
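As a sketch, assuming the content you want to block lives under a hypothetical /old-section/ directory (a placeholder, not from the original article), the robots.txt would look like this:

```
# robots.txt, placed at the root of the site.
# "User-agent: *" applies the rule to all crawlers.
# Replace /old-section/ with the directory you want to block.
User-agent: *
Disallow: /old-section/
```

Note that robots.txt only stops crawling; combine it with the URL Removal tool described above so already-indexed pages are dropped from the results as well.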
You no longer have an excuse for keeping unwanted pages indexed in Google.