Post by account_disabled on Mar 6, 2024 22:01:28 GMT -8
The valid pages with warnings are usually pages that would not, on their own, have indexing problems serious enough to warrant exclusion, but that still show one or more behaviors Google considers anomalous. Among these reports, for example, you can find pages that are correctly indexed, and therefore visible in search results, yet blocked by robots.txt. This situation occurs when a page first goes online normally and is only later blocked via robots.txt. In these cases it is worth remembering that robots.txt does not exclude a resource from the index: it only tells the spiders of one or more search engines not to crawl it.
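As a minimal sketch of how you might spot this anomaly yourself, the snippet below checks whether a given URL is disallowed for Googlebot using Python's standard robots.txt parser; the domain and page URL are placeholders, not from any real project:

```python
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"           # placeholder domain
PAGE = SITE + "/blocked-but-indexed.html"  # hypothetical page URL

rp = RobotFileParser(SITE + "/robots.txt")
rp.read()  # downloads and parses the live robots.txt

if not rp.can_fetch("Googlebot", PAGE):
    # Crawling is blocked, but the URL can still sit in the index:
    # robots.txt stops fetching, not indexing.
    print(f"{PAGE} is disallowed for Googlebot")
else:
    print(f"{PAGE} is crawlable")
```

Run against the URLs in the "valid with warnings" report, this tells you which ones are caught in exactly the blocked-but-indexed situation described above.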
These are two very different things, and they are often confused. So if you want to exclude one or more pages from the index, you must first add a robots meta tag set to noindex, and only once deindexing has actually taken place can you block the resources with robots.txt. Of course, if a page serves no purpose I prefer to remove it and (if necessary) redirect it, but to each their own.
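To make that order of operations concrete, here is a minimal sketch that verifies a page actually serves noindex (via meta tag or the X-Robots-Tag HTTP header) before you disallow it in robots.txt; the URL is hypothetical, and the meta-tag regex assumes the common name-before-content attribute order:

```python
import re
import urllib.request

PAGE = "https://www.example.com/page-to-deindex.html"  # hypothetical URL

req = urllib.request.Request(PAGE, headers={"User-Agent": "audit-script"})
with urllib.request.urlopen(req) as resp:
    # noindex can also be served as an HTTP header, not only as a meta tag
    header_noindex = "noindex" in (resp.headers.get("X-Robots-Tag") or "").lower()
    html = resp.read().decode("utf-8", errors="replace")

# Assumes attribute order name="robots" ... content="...noindex..."
meta_noindex = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
    html, re.IGNORECASE) is not None

if header_noindex or meta_noindex:
    print("noindex is served: once the page drops out of the index,")
    print("it is safe to disallow it in robots.txt")
else:
    print("no noindex found: blocking crawl now would freeze the page in the index")
```

The point of the check is the trap mentioned above: if you block crawling first, the spider can never re-fetch the page to see the noindex, so the page stays in the index indefinitely.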
Valid Pages

A big SEO mistake is assuming that valid pages are fine no matter what. Many colleagues also neglect to study the valid pages in Index Coverage precisely because they are "valid", but is it really appropriate for all of them to be visible on Google? Index Coverage splits valid pages into two classes: those submitted via the sitemap, and those that are indexed but were not submitted. That first distinction alone should set alarm bells ringing.

From my point of view, everything we want to rank should be listed in the sitemap, while what we do NOT want to rank should not only be excluded from the sitemaps, it should not even be reachable from the website, at least not through explicit href paths in the source. The "valid" pages must be studied carefully, because they can tell us whether we compiled the sitemaps appropriately or instead let them list pages with no relevance for ranking. Conversely, by studying the valid pages that were not submitted via sitemap, we may find paths leading to pages that are central to the business model but that, for one reason or another, never made it into the sitemap; a quick way to surface both cases is sketched below. Of course, if you manage a 600-page website it is hard to run into problems of this nature, but on a site with 6 million pages the point is anything but trivial.
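As a rough illustration of that audit, this sketch diffs the URLs submitted in a sitemap against a list of indexed URLs; the file names and the one-URL-per-line export format are assumptions for illustration, not anything prescribed by Search Console:

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Hypothetical local copy of the sitemap
tree = ET.parse("sitemap.xml")
submitted = {loc.text.strip() for loc in tree.findall(".//sm:loc", NS)}

# Hypothetical one-URL-per-line export of indexed pages
with open("indexed_urls.txt") as f:
    indexed = {line.strip() for line in f if line.strip()}

# Valid pages Google holds that we never submitted: do they deserve
# to rank at all, and how did Googlebot reach them?
print("Indexed but not submitted:", sorted(indexed - submitted))

# Pages we submitted that do not appear among the indexed ones
print("Submitted but not indexed:", sorted(submitted - indexed))
```

The first set is where orphaned or accidentally exposed pages show up; the second is where business-critical pages missing from the sitemap (or failing to index) become visible. On a 6-million-page site, working from sets like these is far more practical than eyeballing the reports.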