15 Aug 2017

How to Add a Robots.txt File in Blogger

What is Robots.txt?

For every Blogger site you create, a robots.txt file is auto-generated by Blogger. The purpose of the robots.txt file is to tell the crawlers sent by search engines like Google, Yahoo, and Bing about your blog's structure, so that your blog's posts can be crawled and indexed in search engines.


Where is Robots.txt located?

Step 1. To find the robots.txt file, go to the Blogger.com dashboard ==> Settings ==> Search preferences ==> Crawlers and indexing ==> Custom robots.txt ==> click Edit.




Step 2.

The code is shown below:

User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.abcbblogger.com/sitemap.xml

User-agent: Mediapartners-Google - This rule is for Google AdSense's crawler. If you want your site crawled for AdSense ads, this is the most important part of the file. If you don't use AdSense, you can remove both of these lines.

User-agent - specifies which robots and crawlers the rules that follow apply to.


Disallow: /search - pages under /search, such as label and search-result pages like http://www.rockprogrammer.com/search/label/Asp%20.Net, will never be crawled or indexed.


Allow: / - by default, the home page and all other pages are allowed to be crawled.


Sitemap - points crawlers to your sitemap file. The robots.txt file itself is a straightforward text file placed in your site's root directory.
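To see how crawlers interpret these rules, here is a minimal Python sketch using the standard library's urllib.robotparser. The post URL is a made-up example; the robots.txt content and domain are the ones from this post:

```python
from urllib.robotparser import RobotFileParser

# The custom robots.txt shown in this post.
rules = """
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.abcbblogger.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A normal post URL is allowed for all crawlers.
print(parser.can_fetch("*", "http://www.abcbblogger.com/2017/08/my-post.html"))  # True

# Label/search pages are blocked for general crawlers...
print(parser.can_fetch("*", "http://www.abcbblogger.com/search/label/Asp%20.Net"))  # False

# ...but the AdSense crawler matches its own User-agent block, which allows everything.
print(parser.can_fetch("Mediapartners-Google", "http://www.abcbblogger.com/search/label/Asp%20.Net"))  # True
```

Note that an empty Disallow: line means "disallow nothing", which is why the Mediapartners-Google crawler may fetch every page.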

Adding a custom robots.txt to Blogger.

Step 1. Sign in to your Blogger account.

Step 2. Go to Settings ==> Search preferences ==> Crawlers and indexing ==> Custom robots.txt ==> click Edit.

Step 3. Click the Edit button, then select Yes.

Step 4. Paste your code:

User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.abcbblogger.com/sitemap.xml

Step 5. Click the Save button.
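After saving, Blogger serves the file at the root of your blog (yourblog.com/robots.txt). Here is a small Python sketch to fetch it and confirm the custom rules are live; the domain is the example address from this post, so replace it with your own blog's URL:

```python
from urllib.request import urlopen


def has_custom_rules(robots_text: str) -> bool:
    """Return True if the text contains the directives added in this post."""
    lines = [line.strip() for line in robots_text.splitlines()]
    return "Disallow: /search" in lines and any(
        line.startswith("Sitemap:") for line in lines
    )


if __name__ == "__main__":
    # Example domain from this post; use your own blog's address here.
    with urlopen("http://www.abcbblogger.com/robots.txt") as resp:
        text = resp.read().decode("utf-8")
    print(has_custom_rules(text))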
