Today my supervisor told me to look up some information on the robots.txt file. This is a plain text file that tells visiting robots (web crawlers or spiders) which parts of a website they are and are not allowed to visit. The file must be placed in the root directory of the website, so it is served at the top level of the domain (e.g. http://example.com/robots.txt).

In this file you enter directives such as the following:

User-agent: *    — this means the rules that follow apply to any robot that tries to crawl the site

Disallow: /    — this specifies a URL path that robots are not allowed to visit (a single slash, as here, blocks the entire website)
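As a quick sketch of how these two directives behave together, Python's standard-library `urllib.robotparser` can parse such a file; the domain and path below are just placeholders for illustration:

```python
from urllib.robotparser import RobotFileParser

# The two rules described above: apply to every robot, block the whole site.
rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# With "Disallow: /" in effect, any URL on the site is off-limits to any crawler.
print(parser.can_fetch("*", "http://example.com/some-page.html"))  # False
```

Note that robots.txt is purely advisory: well-behaved crawlers check it before fetching pages, but nothing technically prevents a robot from ignoring it.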
