
How To Add SEO Friendly Custom Robots.txt File in Blogger

The robots.txt file is read by almost all major search engines to decide how to crawl, categorize, and archive websites and blogs. If you're a newbie to blogging and don't have much technical knowledge, don't worry. In this article, I'll show you how to add an SEO friendly custom robots.txt file to your Blogger blog.

In Blogger, you can easily customize your robots.txt file according to your own needs. Many professional bloggers know about the robots.txt file, but many new and young bloggers don't. So first of all, you should know what a robots.txt file is.

What Is a Robots.txt File?

The robots.txt file implements what is also called the Robots Exclusion Protocol (REP). It is a plain text file created by webmasters to instruct search engine robots which pages of a website to crawl and index for the search results. It is part of a group of web standards used to regulate web robot behavior and indexing. The robots.txt file is saved in the root of the web server where your blog or website is hosted. In this file you can easily restrict any page of your site from web crawler robots so that it is not indexed in the search results.

NOTE – Keep in mind that search engine robots and crawlers scan your robots.txt file first, and only then crawl your pages and show them in the SERPs (search engine result pages).

Where Is the Robots.txt File Located?

The robots.txt file is placed in the root directory of your blog or website. Search engine spiders and crawlers read this file first, then crawl and index your pages according to its rules. Don't worry if you don't have a robots.txt file in your root directory; in that case, web crawlers will simply crawl and index each and every page of your website or blog.

You can easily access your robots.txt file by visiting a URL that looks like this:

https://www.techora.net/robots.txt

Simply replace www.techora.net with your own domain and open the URL in a web browser; you'll see your file's rules and commands.

What Does the Robots.txt File Look Like?

If your blog is hosted on Blogger, the default robots.txt file generally looks like the one given below:

User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.techora.net/feeds/posts/default?orderby=UPDATED

This is the default file used in Blogger, but you can easily change it, as I'll show you later in this article.

Check Out the Cheat Sheet for the Robots.txt File

Block all web crawlers from all your web content

User-agent: *

Disallow: /
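
Allow all web crawlers full access (this entry is my own addition for completeness; the empty Disallow is a standard robots.txt pattern)

User-agent: *

Disallow: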

Block a specific web crawler from a specific folder

User-agent: Googlebot

Disallow: /no-google/

Block a specific web crawler from a specific web page

User-agent: Googlebot

Disallow: /no-google/blocked-page.html
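
As a further illustration, Googlebot-Image is Google's image crawler; a rule keeping it out of an /images/ folder would look like the sketch below (the folder name is just an example).

User-agent: Googlebot-Image

Disallow: /images/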

Prevent Specific Posts / Pages from Being Indexed by Crawlers

Disallow Specific Post

Disallow: /yyyy/mm/your-post-url.html

In this rule, yyyy is the year shown in your blog post URL, mm is the month of that post, and then comes your post's URL slug.

Disallow Specific Page

Disallow: /p/your-page-url.html
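
For example, with a hypothetical post published in May 2018 and a hypothetical static page named contact, the two rules would look like this (both URLs are made up for illustration):

Disallow: /2018/05/my-sample-post.html

Disallow: /p/contact.html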

Declaring Your Website Sitemap

User-agent: *

Disallow:

Sitemap: https://www.techora.net/sitemap.xml

If you want to get the full benefit of this file, always add the robots.txt file to your root directory, so that its URL looks like this:

https://www.techora.net/robots.txt

Explanation of the Robots.txt File:

Generally, this file is divided into different sections, so I want to explain each of them before you add the file to your Blogger blog.

#1 – User-agent: Mediapartners-Google

User-agent lines are the backbone of a robots file: each one names the robot the following rules apply to. By default, the first entry is ( User-agent: Mediapartners-Google ), which addresses the Google AdSense crawler and helps it serve better, more relevant ads on your blog. If you don't run AdSense ads on your blog, you can still leave it as it is.

#2 – User-agent: *

If you have any programming knowledge, you already know the meaning of the ” * “ asterisk symbol. If you don't, here it is: the asterisk is a wildcard that applies the rules to all search engine robots, spiders, and crawlers that visit your site.

#3 – Allow: /

In this rule, the Allow keyword specifies the “to do” things for your website. It simply means that crawlers may crawl your homepage.

#4 – Disallow: /search

In this rule, the keyword Disallow specifies the “not to do” things for your blog: here you are telling search engine robots not to crawl the search pages of your website. That means if a page looks like:

http://yourdomain.com/search/label/mywork

then crawlers will not crawl or index that page of your website.
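
To illustrate with hypothetical URLs, here is what that rule blocks and what remains crawlable (lines starting with # are robots.txt comments):

# Blocked by "Disallow: /search":
#   http://yourdomain.com/search?q=seo
#   http://yourdomain.com/search/label/mywork
# Still crawlable under "Allow: /":
#   http://yourdomain.com/2018/05/some-post.html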

#5 – Sitemap

The sitemap is very important: it tells search engines about new activity on your blog. You should always create your website's sitemap and submit it to all the search engines, especially Google, Bing, Yahoo, Yandex, etc. You can also add your blog's sitemap to your robots.txt file, which helps crawlers find your content and can support better rankings in the SERPs. The sitemap lists all your posts and pages, so search engine robots can easily crawl and index every page and pick up new posts as soon as you publish them on your blog or website.

By default, the Blogger feed used as a sitemap lists only the 25 most recent posts of your blog, but if you want to increase that number, use this kind of sitemap line instead:

Sitemap: http://yoursite.net/atom.xml?redirect=false&start-index=1&max-results=500

This tells search engine crawlers to fetch up to 500 posts / pages of your blog. If you have more than 500 posts, don't worry; use the pair of sitemap lines below.

Sitemap: http://yoursite.net/atom.xml?redirect=false&start-index=1&max-results=500

Sitemap: http://yoursite.net/atom.xml?redirect=false&start-index=501&max-results=500
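
And if your blog ever crosses 1,000 posts, the same pattern simply continues with a third line; a sketch using the same yoursite.net placeholder:

Sitemap: http://yoursite.net/atom.xml?redirect=false&start-index=1001&max-results=500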

How To Add SEO Friendly Custom Robots.txt File in Blogger

Step 1 – First of all, go to www.blogger.com.


Step 2 – Sign in to your blogger dashboard and select your required blog.

Step 3 – Go to the “Settings” tab in the left sidebar, and then open “Search Preferences”.
Step 4 – Under Search Preferences, you'll see options that are disabled by default in Blogger; simply click the “Edit” link next to the custom robots.txt option.
Step 5 – After that, click the “YES” button and a blank box opens below. Simply paste your SEO friendly custom robots.txt file into this box and click the “Save Changes” button.
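For reference, here is a minimal sketch of an SEO friendly custom robots.txt you could paste, assuming your blog lives at www.techora.net (replace the domain with your own; the sitemap line is the 500-post variant discussed above):

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.techora.net/atom.xml?redirect=false&start-index=1&max-results=500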
Now your work is done. Simply open your blog's custom robots.txt file and check whether it works perfectly.

Final Words!!!

So this was the complete and comprehensive guide to the robots.txt file in Blogger, showing you how to add a custom robots file to your blog. I have tried my best to describe everything as simply as I can, so everyone can easily understand how to set this up. If you don't understand a rule in the robots file, don't include it in your blog, because a wrong rule can harm your SERP rankings. Thanks for reading this whole tutorial; if you have any question in your mind, comment below and I'll respond as soon as possible. If you liked it, please support us by sharing this article with your friends on your social profiles.

 
Happy Blogging!!!