How to Add Custom Robots.txt file in Blogger
Hey everyone! If you don’t know what a robots.txt file is, or how to add one to your Blogger website, this post will give you the complete picture.
The robots.txt file is very important for the SEO of your website. It helps your pages rank faster in search engines, because Google’s bots read the rules in the robots.txt file, and those rules guide how the bots crawl the pages on your site.
In this post, we’ll discuss the benefits of a robots.txt file and how to add one to your Blogger website.
What is Robots.txt?
Robots.txt is a plain-text file consisting of simple rules that tell web crawlers (bots) which pages of your site you want indexed in search engines and which you don’t.
The robots.txt file applies directly to your blog posts: through it, you can tell crawlers which posts you want indexed in the search engine. Generally, every search engine has its own bots; Google and Yahoo, for example, each have their own crawlers, which crawl your web pages and index them.
Bots follow all the links on a website, crawl the pages, and index them in search engines. With robots.txt rules you can manage what gets indexed by giving instructions to the crawlers.
The robots.txt file also matters because when you update an article, the bots re-crawl your pages and push the updated version into search engine results.
With the help of robots.txt, you can keep the important pages you want indexed and exclude the unimportant ones, like demo pages, archives, labels, etc., from indexing.
How the default robots.txt file looks
Generally, the robots.txt file of a default Blogger website looks like this:
# Blogger Sitemap generated on 2019.12.31
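Beyond a generated comment line like the one above, a default Blogger robots.txt typically contains a few simple rules. A minimal sketch of what you can expect to see (www.example.com is a placeholder for your own domain):

```text
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

The Mediapartners-Google entry is Google’s AdSense crawler; the lines under User-agent: * are the ones we break down below.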
You can check your website’s robots.txt details by entering https://www.example.com/robots.txt in the address bar.
Note: Enter your website name instead of example.com.
Let’s understand what the Disallow and Allow rules in a robots.txt file stand for.
1. User-agent: *
The User-agent line says which crawlers the rules below it apply to.
The asterisk (*) means all bots (Google, Yahoo, etc.) follow these rules when crawling our website.
2. Disallow: /search
Disallow: /search means that if there is any /search path after your website name, the crawler will not index that URL in the search engine.
Example: with https://www.example.com/search/blogging, the crawler will not index your blogging category.
3. Allow: /
This rule gives bots access to crawl and index the rest of your website’s data, i.e., everything not excluded by a Disallow rule.
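Taken together, these three rules can be tested locally with Python’s built-in urllib.robotparser module. This is a sketch, assuming the exact rules discussed above; the post URL and the www.example.com domain are placeholders:

```python
from urllib import robotparser

# The same rules discussed above: all bots ("*") may crawl the site,
# except anything under /search.
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# A normal post URL is allowed for any crawler...
print(parser.can_fetch("*", "https://www.example.com/2020/01/my-post.html"))  # True

# ...but search/category pages under /search are blocked.
print(parser.can_fetch("*", "https://www.example.com/search/blogging"))  # False
```

This is handy for double-checking that a custom robots.txt does what you intend before pasting it into Blogger.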
4. Sitemap
With the help of a sitemap, the crawler can scan all the existing and new posts on our website. A sitemap helps crawlers learn the total number of published posts, categories, labels, etc.
You can check your website’s sitemap by entering one of these URLs in the address bar:
A Sitemap for blogger website: https://example.com/atom.xml?redirect=false&start-index=1&max-results=500
A Sitemap for WordPress website: https://example.com/sitemap.xml
Note: The sitemap will only work if you’ve created one for your website.
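A sitemap is just an XML file listing your post URLs, so crawlers (or you) can read it programmatically. A minimal sketch using Python’s standard library, with a tiny sample sitemap in the standard sitemaps.org format (the URLs are made-up placeholders, not fetched from a real site):

```python
import xml.etree.ElementTree as ET

# A tiny sample sitemap in the sitemaps.org format (placeholder data).
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/hello-world</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

# The sitemaps.org namespace must be given explicitly when searching.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

print(len(urls))  # 2
print(urls[0])    # https://example.com/hello-world
```

This is essentially what a crawler does: it pulls the sitemap and walks the list of URLs it finds there.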
How to create a sitemap and robots.txt in Blogger
1. To create a sitemap for your Blogger website, visit an online sitemap generator.
2. Enter your website address, then click Generate Sitemap.
3. Copy the generated code.
4. Go to the Blogger dashboard > Settings > Custom robots.txt > click Yes > paste the code.
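As a sketch, the code you paste into that box usually looks like the following, combining the rules explained above with the Blogger sitemap URL mentioned earlier (replace www.example.com with your own domain):

```text
User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.example.com/atom.xml?redirect=false&start-index=1&max-results=500
```

Save the settings, then open https://www.example.com/robots.txt to confirm the new rules are live.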
In this post, we’ve discussed the robots.txt file and why it is important for our website. If you found the information helpful, please like the post and share it with your friends.
Sharing is Caring 🙂 🙂
More interesting posts
- Best Google AMP blogger template.
- What is Alexa Rank and how to increase Alexa Rank?
- How to set the custom domain to blogger?