
Excluded pages in Google Search Console are not a problem anymore | Problem solved

Get your articles indexed using Google Search Console

By ghaderm · Published 10 months ago · 5 min read

Coverage problems in Google Search Console, or the problem of excluded pages, come up a lot for anyone who owns a blog on Blogger. Everyone who has run into this issue wants to know its real cause and, more importantly, the best way to solve it. That is what you will find in this article: the reason behind the problem and the best solution to it.

When you open Google Search Console and go to the Coverage section, you will see four categories: pages with errors, pages that are valid but have warnings, valid pages, and excluded pages.

First, the most important thing to check is the error section. If the number there is zero, you have no serious problems to worry about; the remaining categories are minor matters.

Second, check the number of valid pages. If it equals the number of posts you have published on your blog, you have no problem. But if the number of articles you have published is greater than the number of valid pages, something is wrong.

As for the valid pages with warnings, click on them to see the details. For example, you may see "Indexed, though blocked by robots.txt." This means these pages have already been indexed by the search engine, even though the robots.txt file you set up blocks them from being crawled.

When you open the details to see which pages have this warning, you will find the static pages and the section pages. The static pages are the ones you created yourself, such as "About Us", "Contact Us", and "Privacy Policy". The section pages are the label pages you use to group posts, for example collecting all posts on a specific niche into one group.
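To make the difference concrete, on a standard Blogger blog the two kinds of pages follow different URL patterns (yourblog.blogspot.com and the label "Recipes" are placeholders here, not values from Search Console):

    https://yourblog.blogspot.com/p/about-us.html          (a static page)
    https://yourblog.blogspot.com/search/label/Recipes     (a section/label page)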

These two types of pages are ones you should block from indexing so that they don't appear in search engines, because they add little value and can weaken your blog's indexing.

After that, compare the number of pages that Search Console lists in this category with the number of static pages and section pages you have. If the numbers match, you have no problem.

Finally, we come to the fourth category, the excluded pages. Here you may find a huge number, even greater than the number of posts you have published. How does this happen?

The great thing about this tool is that with one click you can see the details of anything you want. Click on this category and you get the full picture. For example, you may find the exclusion type "Alternate page with proper canonical tag" on some pages. When you click on this type, you will see more details and the links to your pages and articles, and at the end of each link there is a ?m=1 or ?m=0 parameter. This means there is another page that is very similar to yours, or that there is a main page on the same topic and your page is treated as its alternate.

At the top of the page, there is an option called "More details".

When you click on it, Google gives you more details and tells you what to do: your page is a duplicate of another page that Google has marked as the canonical one, so no action is required on your part.
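To see why Google treats some of these pages as duplicates, note that Blogger's mobile copy of a post (the URL ending in ?m=1) carries a canonical link in its head pointing back to the main post URL, so the mobile copy is listed as the alternate. A minimal sketch, with the blog address and post path as placeholders:

    <!-- Inside the <head> of the ?m=1 (mobile) copy of a post -->
    <link rel="canonical" href="https://yourblog.blogspot.com/2024/01/my-post.html" />

Because the canonical page is already indexed, the alternate copy is simply excluded and needs no fix.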

The second type is pages blocked by robots.txt. These are pages you blocked yourself, so they don't cause any problem for you.

There are many other exclusion types, but they pose no danger to your blog, and all you have to do is ignore them. As long as the number of valid pages equals the number of your posts, you have no problem.

The most important step you must take is to set the best options in robots.txt. To do this, go to your blog's control panel, click on the search settings, and there you will find the section for the custom robots.txt file.

Here you block every page you don't want indexed, such as any post you had published before but then deleted. You also go to the static pages, copy each link, type "Disallow:" and put the link after it. You add Disallow: /search for the section pages, and at the end of the file you add the sitemap line with a link to your blog's sitemap. Then you save.
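As a reference, a custom robots.txt along these lines might look like the sketch below; yourblog.blogspot.com, the static page path, and the deleted post path are placeholders you would replace with your own links:

    User-agent: *
    # Block Blogger's internal search and label (section) pages
    Disallow: /search
    # Block a static page you don't want indexed (placeholder path)
    Disallow: /p/contact-us.html
    # Block a post you deleted (placeholder path)
    Disallow: /2023/05/old-deleted-post.html
    # Everything else stays crawlable
    Allow: /

    Sitemap: https://yourblog.blogspot.com/sitemap.xml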

After this step, copy your robots.txt file and go to the robots.txt testing option to check that it is valid. You submit the link, and the updated file is sent to Google.
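If you want to confirm what crawlers actually see after saving, you can also fetch the live file yourself (again, yourblog.blogspot.com is a placeholder):

    curl https://yourblog.blogspot.com/robots.txt

The output should match the custom file you saved in the settings.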

After you have done all of this, go to the excluded page you want to get indexed, click on "URL inspection", then click on "Request indexing" and wait for the indexing to be done. If you are lucky, your page will be indexed within hours.

In my case, the search engine crawled my page within a day and it was indexed the next day.

Conclusion:

The problem of excluded pages is faced by many people who own a new website and want their content indexed, so I have shared with you the causes of this problem and its solutions.

