
An Opinion about the Robots.txt File October 29, 2009

Posted by kishosingh in seo.

The topic of robots.txt is not new, but it recently came to light again when Matt Cutts discussed it on his blog. The robots.txt file is what we use to restrict search engine robots from crawling certain pages of our site.

Use of the Robots.txt File:

The robots.txt file keeps search engines from crawling the pages you have restricted. That raises a few questions: why should we stop search engine crawlers from crawling certain pages, and is restricting pages actually helpful?

How to Use the Robots.txt File:

The most basic thing about the robots.txt file is how to use it, and many people still don't know. It is used on both dynamic and static sites, and it always goes in the root directory.

The robots.txt file is a plain text file containing directives that block pages from being indexed. You do not need to mention the pages you want to allow; write only the disallowed pages into the text file.

Importance of the Robots.txt File:

Robots.txt is the best way to keep control over your own site's pages. Pages that are of no use to search engines can be restricted with the robots.txt file.

For example:

Suppose you have a dynamic site selling a product line with 200 items on offer. You keep 20 products on a page, so you make 10 pages to show the offers and products. You write a title for the first page, but the same title is carried by the other nine pages as well. Now search engines have a problem ranking your pages: the URLs are different, but the other nine pages all have the same title. You can use robots.txt to disallow those other nine pages.
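
A minimal sketch of such a file, assuming the duplicate listing pages live at hypothetical URLs like /products/page-2.html up to /products/page-10.html:

User-agent: *
Disallow: /products/page-2.html
Disallow: /products/page-3.html
Disallow: /products/page-4.html

and so on, with one Disallow line for each remaining page up to page-10.html.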

Another example:

If your site creates session IDs when visitors log in, you can restrict those session-ID URLs with the robots.txt file as well.
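
As a rough sketch, assuming the session ID shows up as a hypothetical query parameter called sessionid, a wildcard rule like the following blocks those URLs (Googlebot understands the * wildcard; support in other crawlers varies):

User-agent: *
Disallow: /*?sessionid=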

How to Make a Robots.txt File:

Open Notepad and write your directives:

Example of a robots.txt file:
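
A minimal sketch with placeholder paths (replace /admin/ and /print/ with the directories or pages you actually want to block):

User-agent: *
Disallow: /admin/
Disallow: /print/

Here User-agent: * means the rules apply to all crawlers, and each Disallow line names one path that should not be crawled.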

You can write as many Disallow rules as you need for your site's pages. You can also validate your robots.txt after making it.


Develop a Site from an SEO Point of View March 11, 2009

Posted by kishosingh in seo.

I have already covered the design aspects of a site in my previous post. There is another aspect, called development. Here are some topics for developing your site from an SEO point of view to get better rankings in the major search engines.

On the development side, you have to check for junk code that has no value on the web page. Junk code makes a web page larger and is not useful from a search engine's point of view; it mostly creeps in from the designer's or developer's end. SEOs should check for such unnecessary code from time to time. Web pages can also carry unnecessary spaces, and those can be harmful from a ranking point of view. So during development you need to check very carefully for junk code and unnecessary spaces and remove them.
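
A small sketch of what this cleanup looks like, with made-up markup: presentational tags repeated on every element can usually be replaced by a single class plus one rule in an external stylesheet.

Before:
<p><font face="Arial" size="2"><span style="color: #333;">Welcome to our store.</span></font></p>
<p><font face="Arial" size="2"><span style="color: #333;">We ship worldwide.</span></font></p>

After:
<p class="body-text">Welcome to our store.</p>
<p class="body-text">We ship worldwide.</p>

with the shared styling moved into the stylesheet, e.g. .body-text { font-family: Arial; color: #333; }.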

Another thing is URL structure. During web page development you will create web folders and URLs. Take care that a URL never includes _ (underscores) or -- (double hyphens). From both the search engines' and the users' point of view, you should separate words in a URL with a single hyphen (-).
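
For example (example.com and the paths are placeholders):

Avoid:  http://www.example.com/seo_tips  or  http://www.example.com/seo--tips
Prefer: http://www.example.com/seo-tips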

If your site provides a logged-in option, then you need to stop crawl bots from picking up session IDs. You can include a robots.txt file so that bots crawl your site without the session-ID URLs.
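
A sketch of such a rule, assuming the session ID appears as a hypothetical query parameter named sid (Googlebot understands this wildcard pattern; support in other crawlers varies):

User-agent: *
Disallow: /*?sid=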

You always need to check your site in different browsers and check its load time. Load time should not be excessive; if the load time is high, your site's bounce rate will be higher.

Tracking code should be installed on every static page of your web site. This is the analytics code that tracks your site's behaviour: it will let you know how to improve your site, what sources your visitors are coming from, and so on.
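
As one rough sketch (not the exact snippet your provider will give you), the classic Google Analytics ga.js code was pasted just before the closing </body> tag, with UA-XXXXXXX-X standing in for your account ID:

<script type="text/javascript" src="http://www.google-analytics.com/ga.js"></script>
<script type="text/javascript">
  var pageTracker = _gat._getTracker("UA-XXXXXXX-X"); // create a tracker for this account (placeholder ID)
  pageTracker._trackPageview();                       // record the page view
</script>

Whatever analytics package you use, prefer its own copy-and-paste snippet over a hand-written one.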

Many people add a meta refresh tag to their web pages. It should not be added: a meta refresh keeps the original content from being shown when a visitor requests the page, because it immediately sends the visitor elsewhere.
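
For reference, the tag in question looks like this (the URL is a placeholder); if a redirect is really needed, a server-side 301 redirect is the usual search-engine-friendly alternative:

<meta http-equiv="refresh" content="0; url=http://www.example.com/new-page.html">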