At some point you may need to remove a URL from Google so that it no longer appears in the search results. Perhaps it is content on your website that you do not want showing up in Google, or a page created by malware.
Other pages that are often removed from Google's index are test pages, empty pages, and pages whose content is obsolete, will not be updated, and is not worth keeping in Google's results.
In any case, if you need a page to stop appearing in the search results, here we explain how to remove URLs from your website that you no longer want indexed in Google.
Use the robots meta tag with noindex
The first option for removing an obsolete URL from Google is to open the source code of the page you want to remove and add a robots meta tag with the value noindex.
That is, once you are in the page's source code, look for the <head> tag. That is the part of the code where you should include the following line:
<meta name="robots" content="noindex">
Semantically, this tells search robots not to index the page when they reach it, which removes the URL from Google.
Before adding the tag, check that it does not already exist. The meta element may already be present on the page, but with the value index; in that case, simply replace index with noindex.
Remember that this change affects not only Google's robot but also those of other search engines such as Bing or Yahoo.
That said, bear in mind that the process is not immediate if the URL already appears in Google. If the page is newly created, adding noindex means the search robots will never index it in the first place.
Also, noindex is not strictly an order given to Google through the site's source code but rather a hint: when Google reaches the page, it may or may not follow it.
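Since it is easy to overlook an existing robots meta tag, one quick way to check a page is to parse its HTML. Here is a minimal sketch using only Python's standard library (the sample HTML is made up for illustration):

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Detects a <meta name="robots"> tag whose content includes noindex."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            name = (a.get("name") or "").lower()
            content = (a.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

# Hypothetical page source; in practice you would fetch the real HTML.
page = '<html><head><meta name="robots" content="noindex"></head></html>'
checker = NoindexChecker()
checker.feed(page)
print(checker.noindex)  # True
```

If the checker reports False but a robots meta tag with index is present, replacing that value with noindex is all that is needed.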
Remove URLs with Google Search Console
If your website is registered in Google Search Console, you can also remove a URL from there quite easily. Just go to the Optimization option in the left-hand menu and select Remove URLs.
Once in that section, select the option to create a new removal request, type in the URL you want to remove, and click Continue. Then you only have to wait for Google to remove the page.
This action affects both the HTTP and HTTPS versions, regardless of whether they begin with www. It is important to know this so that you do not send a request for each version of the URL: entering mipagina.com is enough.
You can check the status of the removal process on the Google Search Console page itself, so you know whether any further action is needed.
Keep in mind that with this option the removal is very likely only temporary, not permanent, which is why you should check on it from time to time.
The reason is that the removal only lasts about 90 days; after that period, if there are links pointing to the URL, it can be picked up again and reappear in Google's results.
For this reason, although people often choose the Google Search Console option out of fear of touching the code, the robots meta tag method is usually more effective, since it is the robot itself that reads the instruction not to index that URL.
In fact, the recommended approach is to use Google Search Console in emergency situations, for example when you have published a page with personal and confidential data by mistake. If you are not in a hurry for the URL to disappear, one of the other methods is more effective.
Remove a URL with robots.txt
The robots.txt file tells search robots which information they may request from a website and which they may not. In this way, it manages the access of the bots that crawl the site.
Through this file, using the Disallow directive, we can tell Google and the other search engines which URLs we do not want them to crawl.
To do this, open the robots.txt file and add Disallow: / followed by the path of the URL you want to remove, using a separate line for each URL until you have listed them all. What this actually does is forbid Google and the other robots from accessing those pages, so they will not be able to crawl them. For this to work, it is essential that the URL also carries the noindex meta tag and has already been deindexed: if the URL is still indexed and we block it in robots.txt, Google will not be able to access it and it will remain indexed.
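As an illustration, a robots.txt blocking two pages (the paths here are made up) would look like this:

```
User-agent: *
Disallow: /old-page/
Disallow: /test-page.html
```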
When you finish, save the file and upload it again to the root directory of the website. The next time Google visits, it will know which URLs it must ignore. With this method, Google is estimated to take 24 to 48 hours to remove the URL.
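Before uploading the file, you can verify that the rules behave as intended. A small sketch using Python's standard-library robots.txt parser (the domain and paths are hypothetical):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# parse() accepts the file's lines directly, so no network access is needed.
rp.parse([
    "User-agent: *",
    "Disallow: /old-page/",
])

# The blocked path may no longer be fetched; other pages are unaffected.
print(rp.can_fetch("*", "https://example.com/old-page/"))  # False
print(rp.can_fetch("*", "https://example.com/kept-page"))  # True
```

A check like this helps catch an overly broad Disallow rule that would block pages you still want crawled.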
How long does Google take to remove a URL?
In general, Google is estimated to take between 3 and 24 hours to remove a URL from its search results. However, as we saw above, this is actually a temporary removal lasting 90 days.
Once those 90 days have passed, the URL can reappear in the search results if there are links pointing to it. For that reason, the best way to achieve permanent removal is to set up redirects, so that no external link points to the URL we want to erase.
Remove URLs from Google in bulk
You may find yourself wanting to remove several URLs at once. If there are too many of them, either of the processes described above can become a real nuisance, but there is an easier way to do it.
The solution is to install a Google Chrome extension called Google Webmaster Tools Bulk URL Removal, available on GitHub.
Once it is unpacked and installed, a button will appear in Google Search Console giving us the option to load a file with all the URLs we want to remove.
This file must contain one URL per line and have the .txt extension. The easiest way is to create it in a plain-text editor such as Notepad, which saves with that extension by default.
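For example, a removal file (with made-up URLs) would simply look like this:

```
https://www.example.com/old-page-1
https://www.example.com/old-page-2
https://www.example.com/old-page-3
```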
When the file has been loaded, Google Search Console will automatically start sending removal requests, one for each URL in the file. We just have to leave the browser open until it finishes.
Once you have removed the unwanted URLs from your website, you should set up some redirects so that everything on the site remains well structured. Although there are several options, here we propose two:
- A 301 redirect, if a new URL exists with content similar to the one that was removed
- A 410 (Gone) response, if no URL takes the place of the removed one
In both cases, the search engine will understand that the URL no longer exists: in the first case it will index the new URL, and in the second it will forget the old one.
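As a sketch of how the two cases could be configured, assuming an Apache server with mod_alias enabled and hypothetical paths:

```apache
# 301: the removed page has a replacement with similar content
Redirect permanent /old-page https://www.example.com/new-page

# 410: the page is gone for good and has no replacement
Redirect gone /deleted-page
```

On other servers (nginx, IIS, etc.) the directives differ, but the same two status codes apply.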
That said, you will need a little patience for these changes to take effect, since you have to wait for Google's search robot to crawl your website again and detect them.
It is highly advisable to run scheduled checks on the state of your site's URLs, since errors such as broken links, content that is not ranking, or pages with no purpose directly affect SEO.
Several online tools can detect problems with a website's URLs, among them Google Search Console itself. If something is wrong, it should be fixed as quickly as possible to avoid penalties.
In addition, a website with URLs that do not work or serve no purpose harms the user experience: not only will we lose visitors who never return, but their short time on the site will also drag down its ranking.
For this reason, it is advisable not only to check the state of our project's pages, but also to know how to react and correct problems in time so that the website is not affected.
In summary, the actions to take are: add the robots noindex meta tag to the site's source code, set up the appropriate redirects for each removed URL, and send the URL removal request through Google Search Console.
With these simple steps, Google will stop indexing content we do not want; in addition, we will keep our web project clean and prevent this kind of error from affecting its ranking in the search results.