An Opinion about the Robots.txt File
October 29, 2009. Posted by kishosingh in seo.
Tags: code, directory, robots, search, search engine, search engines, site, url, web pages
The subject of robots.txt is not new, but it recently came to light again when Matt Cutts discussed it on his blog. A robots.txt file is used to restrict which of a site's pages search engine robots are allowed to crawl.
Use of the Robots.txt File:
A robots.txt file disallows search engines from crawling the pages it restricts. This raises some questions: why should we restrict certain pages from search engine crawlers, and is it actually helpful to do so?
How to Use the Robots.txt File:
The most basic thing to understand about robots.txt is how to use it, and many people still don't. It works on both dynamic and static sites, and it must always be placed in the site's root directory.
A robots.txt file is a plain text file containing directives that block pages from being indexed. You do not need to list the pages that are allowed; write only the disallowed pages in the text file.
Importance of the Robots.txt File:
Robots.txt is the best option for controlling access to your own site's pages. Pages that are not useful to search engines can be restricted with a robots.txt file.
Suppose you have a dynamic site selling a product line of 200 items. You display 20 products per page, so you create 10 pages to show them all. You write a title for the first page, but the same title is reused on the other 9 pages. This creates a problem for search engines when ranking your pages: the URLs are different, but the titles on the other 9 pages are identical. You can use robots.txt to disallow those 9 duplicate pages.
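As a sketch of how this might look, assuming the paginated product pages use hypothetical URLs like /products/page2.html through /products/page10.html, the robots.txt entries could be:

```
User-agent: *
Disallow: /products/page2.html
Disallow: /products/page3.html
Disallow: /products/page4.html
Disallow: /products/page5.html
Disallow: /products/page6.html
Disallow: /products/page7.html
Disallow: /products/page8.html
Disallow: /products/page9.html
Disallow: /products/page10.html
```

The first page, with the unique title, is left crawlable; the paths above are placeholders and would be replaced with your site's actual pagination URLs.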
If your site appends session IDs to URLs during login, you can restrict those session-ID URLs with robots.txt as well.
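For example, assuming the session ID appears as a query parameter named sessionid (a placeholder name), a wildcard rule could block those URLs. Note that wildcard matching is an extension honored by major crawlers such as Googlebot, not part of the original robots.txt standard:

```
User-agent: *
Disallow: /*?sessionid=
```

This tells supporting crawlers to skip any URL containing that query parameter.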
How to Make a Robots.txt File:
Open Notepad and write directives like these:
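The original post does not show the sample, but a minimal robots.txt typically looks like the following (the directory names here are placeholders for whatever you want to block):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
```

Save the file as robots.txt and upload it to the root directory of your site, so it is reachable at yoursite.com/robots.txt.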
You can add many more directives to disallow other pages on your site. After creating the file, you can validate your robots.txt with a validation tool.