If you do not know what a robots.txt file is, or how to add a custom robots.txt file to Blogger, read this article completely.
This is very important for the SEO of your blog: used incorrectly, robots.txt can get your blog removed from search engines entirely. Follow this guide to learn the benefits and proper use of the robots.txt file.
The robots.txt file directly affects the indexing and crawl rate of your blog. Every major search engine has its own robots (bots) that crawl your blog and the links in it and index them in the search engine. That is why whatever you update on the blog also shows up in search results, and it is robots.txt that controls what gets crawled on your blog.
What is Robots.txt
Robots.txt is a text file with some simple directives that tell web crawlers which pages of your blog to crawl and which to skip.
This file is saved on the server of a website or blog so that search engine bots can be managed. It is mainly used to prevent unimportant web pages (demo pages, archives, labels, etc.) from being crawled.
In simple language, the robots.txt file gives you the authority to decide which pages of your blog appear in search engines and which stay hidden.
Robots.txt File to Blogger
Note: Search engines scan the robots.txt file before crawling any web page.
By default, Blogger's robots.txt file looks like the screenshot below. To see your own blog's robots file, add a slash (/) after your blog's homepage URL, type robots.txt, and press Enter.
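A typical default Blogger robots.txt looks something like this (the blog address here is a placeholder; yours will show your own domain):

```text
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```

The first block applies to Google's AdSense crawler (Mediapartners-Google) and disallows nothing, so ads can still be matched to your content.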
Now let us see what this robots.txt code means.
The User-agent: * line means all search engine bots can crawl our blog. The asterisk (*) matches every robot (Googlebot, Yahoo's crawler, and so on).
Disallow: /search
This means that the search engine will not crawl any URL whose path begins with /search right after your domain name.
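For example, with a hypothetical blog address, URLs like these fall under the /search rule and are skipped by crawlers:

```text
https://example.blogspot.com/search?q=seo          (search-result pages)
https://example.blogspot.com/search/label/SEO      (label pages)
```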
Links like this are the label (menu) pages of our blog. In the same way, none of the labels on our blog will be indexed.
The Allow: / line lets crawlers index your blog's homepage.
The Sitemap line gives search engines information about all the existing and new posts on our blog. That is, whenever a web crawler scans our robots.txt file, the sitemap points it to all our published posts.
How to Add a Custom Robots.txt File to Blogger
To add a custom robots.txt file to Blogger, first log in to Blogger and open the Dashboard.
- Go to Settings and click on the Search Preferences option.
- On this page you will find a Crawlers and indexing section with a Custom robots.txt [Disabled] option. To enable it, click Edit and tick Yes.
- After this, copy the code given below, paste it into the Custom robots.txt box, and click Save Changes.
Copy Robots.txt code
Disallow: /search
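The Disallow line above is only one rule from the file. A complete custom robots.txt for Blogger, assuming the standard default layout, looks like this (swap in your own blog's address):

```text
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.com/sitemap.xml
```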
Note: Replace http://example.com/sitemap.xml in the Sitemap line with your own blog's sitemap URL.
I have explained the entire process of adding a custom robots.txt file to Blogger here; you can add one to your own blog by following it.
If you want to hide a post or page this way, make sure you have proper knowledge of robots.txt first, because misusing it can completely damage the SEO of your blog. Do not tamper with these settings without understanding them.
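Before saving your changes, you can check which URLs your rules actually block. Here is a small sketch using Python's standard urllib.robotparser; the blog address and the inline rules are placeholders for your own:

```python
from urllib.robotparser import RobotFileParser

# Placeholder rules matching the custom robots.txt discussed above;
# supplied inline here instead of being fetched from a live blog.
rules = """
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Label and search-result pages fall under /search, so they are blocked...
print(parser.can_fetch("*", "https://example.blogspot.com/search/label/SEO"))  # -> False
# ...while ordinary posts and the homepage remain crawlable.
print(parser.can_fetch("*", "https://example.blogspot.com/2020/01/my-post.html"))  # -> True
```

This lets you confirm that a rule blocks only what you intend before it goes live on your blog.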
- Image Optimization in SEO
- Propellerads Review
- Best Affiliate Marketing Programs
- How to increase Blog Loading Speed
- How to Fast Index Post in Google
If you have any question related to custom robots.txt, you can ask in the comments, and for more blogging tips like this you can like our Facebook page.