
An Opinion about the Robots.txt File October 29, 2009

Posted by kishosingh in seo.

The topic of the robots.txt file is not new, but it recently came to light again when Matt Cutts discussed it on his blog. The robots.txt file is known as a way to restrict search engine robots from crawling certain pages of our site.

Use of the Robots.txt File:

The robots.txt file disallows search engines from crawling the pages that are restricted in it. Now, there are some questions: why should we restrict search engine crawlers from crawling certain pages, and is it helpful to restrict pages at all?

How to Use the Robots.txt File:

The most basic thing about the robots.txt file is how to use it; many people still don't know its uses. It is used on both dynamic and static sites, and it is always placed in the root directory.

The robots.txt file is a plain text file that contains directives for disallowing indexing. You need not mention the pages you allow; you should write only the disallowed pages in the text file.
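
As a small sketch of the idea (the page name below is only a placeholder), a robots.txt sitting at the site root can be as short as this:

  User-agent: *
  Disallow: /private-page.html

The User-agent line says which crawlers the rule applies to (* means all of them), and each Disallow line gives a path that should not be crawled.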

Importance of the Robots.txt File:

Robots.txt is the best option for keeping control over your own site's pages. Pages that are not necessary for search engines can be restricted with the robots.txt file.

For example:

Suppose you have a dynamic site. You are selling a product that comes with 200 other products. You keep 20 products on a page, so you make 10 pages to show the offers and products. You write a title for the first page, but the same title carries over to the other nine pages as well. Now there is a problem for search engines in ranking your pages: the search engine sees different URLs but the same title on the other nine pages. You can use robots.txt to disallow those other nine pages.
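
For instance, assuming the paginated listings live at URLs like /products?page=2, /products?page=3 and so on (the path and parameter name here are only assumptions about such a site), a prefix rule like this keeps the extra pages out of the crawl:

  User-agent: *
  Disallow: /products?page=

Because robots.txt rules match by prefix, this blocks every URL that begins with /products?page= while the first page, /products, stays crawlable.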

Another example:

If your site adds session IDs to URLs during login, you can restrict those session-ID URLs with the robots.txt file as well.
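
A hedged sketch, assuming the session ID appears as a URL parameter named sessionid (the parameter name is only a placeholder) and that the crawler supports the * wildcard, as the major search engines do:

  User-agent: *
  Disallow: /*?sessionid=

This tells crawlers to skip any URL whose query string starts with sessionid=, so the session-ID copies of your pages are not crawled as duplicates.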

How to Make a Robots.txt File:

Open Notepad and write directives like these:

Example of a robots.txt file:
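
As a sketch (the directory and file names below are only placeholders), such a file might look like this:

  User-agent: *
  Disallow: /cgi-bin/
  Disallow: /admin/
  Disallow: /duplicate-page.html

Save it as robots.txt and upload it to the root directory so that it is reachable at yoursite.com/robots.txt.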

You can write many more directives to disallow your site's pages. You can also validate your robots.txt after making it.

Some Basic Aspects of Search Engine Optimization August 30, 2009

Posted by kishosingh in seo.

We have already discussed various topics of Search Engine Optimization, but there are still some more topics remaining. The biggest factors in SEO are on-page optimization and off-page optimization, and we have already discussed those chapters. I need to explain some more basic aspects of SEO that are ignored by many SEOs.

I want to deal with server-side factors and some general topics of SEO. Among the server-side factors, the main topics are 301 redirects and header status codes.

Many times we face a problem with page redirection, which is also known as 301 URL redirection. There are big questions on this topic. Should a page be redirected or removed? What is the purpose of redirection? Will redirection work?

Yes, all of these are fair questions. From my point of view, removal of a URL and redirection of a URL both have their own importance. We should choose to remove those URLs that have no visitors and no importance from a search engine's point of view. On the other hand, a page that has high PR or high traffic should be redirected. Redirection is not the last thing, either; webmaster submission is also needed. The redirections should be submitted in Google Webmaster Tools to make the Google crawler aware of them.
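
As one common way to set up the redirect itself (a sketch that assumes an Apache server with .htaccess enabled; the domain and page names are only placeholders), a single line in the .htaccess file is enough:

  Redirect 301 /old-page.html http://www.example.com/new-page.html

Any visitor or crawler that requests /old-page.html is then answered with a 301 (moved permanently) status and sent to the new URL.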

Another thing is the server status code. You should monitor the status code your pages return. What is it? Is it 200 or a 404 error? In either situation, you have to know it and work on it.
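
One simple way to see the status code (the URL below is only a placeholder) is to request just the headers with curl:

  curl -I http://www.example.com/some-page.html

The first line of the response shows the code, for example HTTP/1.1 200 OK or HTTP/1.1 404 Not Found.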

Many times we also face large page sizes, which make pages very slow to load. So the page size should ideally not exceed 110 KB.
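
To check how large a page actually is (again with a placeholder URL), curl can print the number of bytes it downloaded:

  curl -s -o /dev/null -w "%{size_download}\n" http://www.example.com/some-page.html

A result well above 110,000 bytes or so suggests the page is over the limit mentioned above.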

There are many browsers, so your pages should render properly in each browser. Everything should appear appropriately in every browser.

Some other things to check are an appropriate style sheet, appropriate templates, metadata, standard navigation, HTML and XML sitemaps, and analytics code.
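
To illustrate just the XML sitemap item from that list (the URL is only a placeholder), a minimal sitemap following the sitemaps.org format looks like this:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/</loc>
    </url>
  </urlset>

Each page you want search engines to know about gets its own url/loc entry.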

So, all of these are basic aspects of SEO that should be handled properly by SEOs. They are helpful for crawling and ranking.