What is a robots.txt file and how to create one

What is robots.txt and how to create it for a self-made website or Blogger blog for proper indexing


If you have a blog or website, you may have noticed that information you never intended to make public sometimes ends up visible on the internet anyway, or that even your best content takes many days to get indexed. If you want to know the reason behind these things, read this article carefully until the end.

 

Do you know what a robot is? A robot is a machine that follows our commands. The robots.txt file works the same way: whatever directives we write in this file are read directly by search engine crawlers. As you know, a search engine's job is to show your posts in response to search queries.

 

What is a robots.txt file


As the name suggests, .txt is the extension of a plain-text file, one that can only contain text. robots.txt is such a file, in which we write a set of directives. These directives tell search engines which parts of the site may be shown in search results and which may not.

 

For example, if you do not want a particular page of your blog to appear in search results, or you do not want search engines to index your blog's category or tag pages, you can state all of this inside the robots.txt file. Whenever a search engine such as Google, Yahoo, or Bing starts crawling your blog, it first reads the robots.txt file, which tells it which parts of the blog to index and which to leave alone. This makes your website more SEO friendly, and with that the search engine can crawl your website efficiently.

Robots.txt files can be very beneficial for you if:

  • You want search engines to ignore duplicate pages on your website
  • You do not want your internal search results pages to be indexed
  • You want to keep search engines away from certain specific pages
  • You do not want some of your files, such as certain images or PDFs, to be indexed
  • You want to tell search engines where your sitemap is located

How to create robots.txt file

If you have not yet created a robots.txt file for your website or blog, you should do so soon, because it will be very useful for you in the future. Follow the instructions below to create one:

Well, there are many websites that will generate a robots.txt file for your blog, but I would say you do not need to go anywhere else. Below I am giving you the robots.txt code recommended by Blogger; if you wish, you can verify it on the Blogger help forum. Click here if you want to see it.

Basic syntax of the robots.txt file

In robots.txt we use a few directives that we really need to know about.

User-agent: names the robot (crawler) that the rules in the group apply to (e.g. “Googlebot”).

Disallow: blocks bots from the pages or paths you do not want them to access. (Write the file or directory path after Disallow.)

Noindex: was meant to tell a search engine not to index certain pages. Note, however, that noindex in robots.txt was never an official standard, and Google stopped honoring it in September 2019; use a noindex meta tag or HTTP header instead.

Each User-agent/Disallow group should be separated from the next by a blank line, but note that there should be no blank lines within a group (i.e. no gap between the User-agent line and the group's last Disallow line).

Hash symbol (#): used for comments inside a robots.txt file; everything after the first # on a line is ignored. Comments may occupy a whole line or the end of a line.

Directories and filenames are case-sensitive: “private”, “Private”, and “PRIVATE” are all different to search engines. Let's understand this with the help of the example below.

The robot “Googlebot” has no Disallow rule here, so it is free to crawl anywhere.

The entire site is blocked for “msnbot”.

All robots other than Googlebot are not permitted to visit the /tmp/ directory or any files whose names start with /logs (as explained below through comments), e.g. /tmp/tmp.htm, /logs, or /logs.php.

User-agent: Googlebot

Disallow:

User-agent: msnbot

Disallow: /

# Block all robots from tmp and logs directories

User-agent: *

Disallow: /tmp/

Disallow: /logs # for directories and files called logs
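The behavior of these rules can be checked locally with Python's standard-library robots.txt parser, urllib.robotparser (example.com below is just a placeholder domain):

```python
# Sanity-check the example rules above with Python's built-in
# robots.txt parser from the standard library.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Disallow:

User-agent: msnbot
Disallow: /

# Block all robots from tmp and logs directories
User-agent: *
Disallow: /tmp/
Disallow: /logs
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot's Disallow line is empty, so it may crawl anything.
print(rp.can_fetch("Googlebot", "https://example.com/any-page.html"))  # True

# msnbot is blocked from the entire site.
print(rp.can_fetch("msnbot", "https://example.com/any-page.html"))     # False

# Every other robot falls under the * group.
print(rp.can_fetch("SomeBot", "https://example.com/tmp/file.html"))    # False
print(rp.can_fetch("SomeBot", "https://example.com/logs.php"))         # False
print(rp.can_fetch("SomeBot", "https://example.com/blog/post.html"))   # True
```

This is exactly how a well-behaved crawler decides whether a URL is off-limits before fetching it.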

How to Add Robots.txt File in WordPress

  1. If you use WordPress, you have probably installed Yoast SEO. Go to Yoast SEO, click on Tools, then click on File editor.


 

  2. Now copy the code below, paste it into the robots.txt file, and click Save changes to robots.txt.

 

User-agent: *

Disallow: /wp-admin/

Disallow: /cgi-bin/

Disallow: /comments/feed/

Disallow: /trackback/

Disallow: /xmlrpc.php

Allow: /wp-admin/admin-ajax.php

 

User-agent: Mediapartners-Google

Allow: /

 

User-agent: Googlebot-Image

Allow: /wp-content/uploads/

 

User-agent: Adsbot-Google

Allow: /

 

User-agent: Googlebot-Mobile

Allow: /

 

Sitemap: http://wikieedia.com/sitemap.xml

Note: replace the sitemap URL above with your own sitemap URL before pasting the code into your robots.txt file.
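As a quick sketch (again with example.com as a placeholder domain), the catch-all * group of the WordPress rules above can be checked with Python's urllib.robotparser. One caveat: Python's parser applies rules in file order (first match wins), so it reports admin-ajax.php as blocked by the earlier /wp-admin/ Disallow, whereas Google itself uses longest-match and would honor the Allow line.

```python
# Check the catch-all (*) group of the WordPress robots.txt above.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /cgi-bin/
Disallow: /comments/feed/
Disallow: /trackback/
Disallow: /xmlrpc.php
Allow: /wp-admin/admin-ajax.php
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Admin pages and xmlrpc.php are blocked for a generic crawler...
print(rp.can_fetch("MyCrawler", "https://example.com/wp-admin/options.php"))  # False
print(rp.can_fetch("MyCrawler", "https://example.com/xmlrpc.php"))            # False

# ...while ordinary posts remain crawlable.
print(rp.can_fetch("MyCrawler", "https://example.com/my-first-post/"))        # True
```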

——————————

If you use a hosted platform such as Blogger (Blogspot), WordPress.com, or another free tool to publish articles and you don't know how to add or edit the robots.txt file in your Blogger account, then this separate article will help you understand how to create and edit the robots.txt file and why it is so important for any website.

Click here to read the full article about how to add a robots.txt file in Blogger

——————————

 

 

What if we did not use the robots.txt file?

If we do not use a robots.txt file, there are no restrictions on search engines: they can crawl wherever they want and index everything they find on your website. For many websites this is fine, but as a matter of good practice we should use a robots.txt file, because it tells search engines which pages to index and saves their crawlers from repeatedly visiting pages they do not need.
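This default can be seen with urllib.robotparser as well: an empty (or missing) robots.txt leaves every URL crawlable (example.com is a placeholder):

```python
# With no rules at all, everything is allowed by default.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([])  # an empty robots.txt: no restrictions

print(rp.can_fetch("AnyBot", "https://example.com/private/page.html"))  # True
print(rp.can_fetch("AnyBot", "https://example.com/tmp/anything.html"))  # True
```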

 

Advantages of using Robots.txt

There are many uses of robots.txt, but here is some very important information that everyone should be aware of.

 

  • Using robots.txt, your sensitive information can be kept out of search results. (Note, though, that robots.txt is not a security mechanism: blocked URLs remain accessible to anyone who requests them directly.)
  • With the help of robots.txt, “canonicalization” problems, i.e. multiple “canonical” URLs for the same content, can be avoided. This problem is also known as the “duplicate content” problem.
  • It also helps Google's bots index your pages efficiently.

 

I sincerely hope I have given you complete information about robots.txt, and that you now understand it well. I request all readers to share this information with your neighbors, relatives, and friends, so that everyone around us becomes aware of it and benefits from it. I need your support so that I can bring you even more new information.

 

I always try to help my readers in every way. If you have any kind of doubt, you can ask me in the comments and I will definitely try to resolve it. Please also tell us what you think of this article on robots.txt, so that we get a chance to learn from your thoughts and improve.
