How To Add Custom Robots.txt File In Blogger
So, are you ready to take your Blogger/Blogspot blog to the next level? Do you want to add a custom robots.txt file to your Blogger blog? You are at the right place, as today I will be discussing how to add a custom robots.txt file to a Blogger blog.
Must Read :- Tips to Improve SEO Rankings
What Is a Custom Robots.txt File ?
A custom robots.txt file is a small piece of code, really a set of instructions, that lets search engine crawlers know what to crawl (that is, index) from your blog and what to leave out. When a search engine visits your blog, it reads the robots.txt file first and only then moves on to the other areas of the blog. In short, the robots.txt file tells search engine crawlers what to index and what not to index. It works like a traffic warden: it can allow or stop search engine crawlers from indexing certain areas of your blog.
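To get a feel for that flow, here is a minimal sketch, assuming Python's standard urllib.robotparser; the blogspot address and post path are placeholders you would swap for your own blog.

from urllib.robotparser import RobotFileParser

# Step 1: the crawler fetches and parses robots.txt before anything else.
# The blog address below is a placeholder, not a real blog.
rp = RobotFileParser("http://yourblogurl.blogspot.com/robots.txt")
rp.read()

# Step 2: it asks permission before indexing each URL it discovers.
# Both example URLs here are made up for illustration.
print(rp.can_fetch("*", "http://yourblogurl.blogspot.com/2024/01/my-post.html"))
print(rp.can_fetch("*", "http://yourblogurl.blogspot.com/search/label/SEO"))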
How To Add Custom Robots.txt In Blogger/Blogspot :-
Adding a custom robots.txt file in Blogger is very easy. All you need to do is open Settings >> Search preferences >> Custom robots.txt >> Enable >> Paste the code >> Save changes
Here is the code that you will paste:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://yourblogurl.blogspot.com/feeds/posts/default?orderby=UPDATED
Note :- Make sure you replace the URL in the last line, where it says Sitemap:, with your own blog's URL.
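Optional :- once you have saved the settings, you can confirm the file is live by opening yourblogurl.blogspot.com/robots.txt in your browser, or with a quick Python sketch like this one (the address is again a placeholder for your own blog).

from urllib.request import urlopen

# Fetch the robots.txt that Blogger now serves for your blog and print it.
# Replace the placeholder address with your own blog's URL first.
with urlopen("http://yourblogurl.blogspot.com/robots.txt") as resp:
    print(resp.read().decode("utf-8"))  # should show the rules you pasted above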
Explaining The Code :-
User-agent: *
For those who are not programmers, this line names which crawlers (spiders) the rules below apply to; the asterisk (*) means they apply to every search engine crawler.
Disallow: /search
Disallow, as the name suggests, stops search engine crawlers from crawling certain areas of your blog. Here /search covers the search and label pages of your blog that you don't want to get indexed.
Allow: /
Allow, as the name suggests, lets search engines crawl certain areas of our blog; here / covers the rest of the blog, including our blog posts, and the Sitemap line at the end points crawlers to the feed that lists those posts.
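If you want to see these rules in action without touching your live blog, here is a small sketch that feeds the same rules into Python's standard robots.txt parser; the post path is just a made-up example.

from urllib.robotparser import RobotFileParser

# The same rules we pasted into Blogger, parsed offline.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Search/label pages are blocked, ordinary posts are allowed.
print(rp.can_fetch("*", "/search/label/SEO"))           # False
print(rp.can_fetch("*", "/2024/01/example-post.html"))  # True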
Any questions? Please comment below.