
How to Make Effective Meta Tag Descriptions? May 15, 2010

Posted by kishosingh in seo.

The title of a web page is not everything. It cannot convey the whole content of the page; it is only the name of the page, targeted at popular keywords. Keywords are the identifiers by which we find a page in search engines. The meta description is the introduction of the web page: it gives an overall synopsis of the page's products or content.

Meta descriptions appear in the head section of the web page, just like the meta title and meta keywords; the description is also an HTML tag. Meta descriptions are useful because they define the web page. For any search, search engines show the title, URL, and description: the title draws attention first, the description second, and the URL third. I have already discussed the importance of keywords in search engine optimization.
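In the page source, these tags sit together in the head. A minimal sketch (the content values here are made-up examples):

```html
<head>
  <title>SEO News India</title>
  <meta name="keywords" content="seo news, seo tips, india">
  <meta name="description" content="Get news about SEO in India. Search engine optimisation news and tips.">
</head>
```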

Description Tag:

I think most visitors come to a site after reading the description shown for their search term; the description tells them whether the page has exactly what they are looking for, and that description comes from our own pages. I have seen many mistakes with meta descriptions: people do not write them based on the information that is actually on the page, and they include things that are not there. For example, suppose we have an SEO news site. A description like "Get news about SEO in India. Search engine optimisation news and tips from this site." is fine. But is a description like "SEO, online marketing, Web 2.0 and link building services in India. News and reviews about SEO and SEO services from this site." right? I don't think so.

What is the mistake? Such descriptions include things that are not on the page, while other pages on the same site already cover online marketing and link building.

One more thing: descriptions should be descriptive and short. Repetition of keywords should be avoided; use synonyms instead.
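As a rough sanity check for these rules, here is a minimal sketch in Python (the 155-character cutoff and the repetition threshold are my own assumptions about sensible limits, not numbers from this post):

```python
import re
from collections import Counter

def check_description(description, limit=155):
    """Return a list of warnings for a meta description draft."""
    warnings = []
    if len(description) > limit:
        warnings.append("too long: %d chars (limit %d)" % (len(description), limit))
    # Count each word; flag words repeated 3+ times as likely keyword stuffing.
    words = re.findall(r"[a-z0-9]+", description.lower())
    for word, count in Counter(words).items():
        if count >= 3 and len(word) > 2:
            warnings.append("word '%s' repeated %d times; try a synonym" % (word, count))
    return warnings

good = "Get news about SEO in India. Search engine optimisation news and tips."
bad = "SEO news, SEO tips, SEO services, SEO reviews and more SEO from India."
print(check_description(good))  # []
print(check_description(bad))
```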


Reciprocal Links and How to Get Reciprocal or Two-Way Links April 7, 2010

Posted by kishosingh in Link Building.

Reciprocal links depend on a mutual understanding between webmasters. A reciprocal link is an agreement between two sites that are related to each other in theme and topic: the webmasters exchange links between their pages.

I have already written in my previous post about three-way links and how to get them. Here I will write about reciprocal links and how to get them, based on my experience of link exchange.

What Are Reciprocal Links?

Reciprocal links are two-way links. They are also known as "link exchange", "link partners", and "two-way links". Site A adds a link to site B, and site B adds a link to site A; this is called a reciprocal or two-way link.
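In HTML terms, the arrangement looks like this (the site names and URLs are made-up examples):

```html
<!-- On a page of site A (www.site-a.example) -->
<a href="http://www.site-b.example/">Site B's topic</a>

<!-- On a page of site B (www.site-b.example) -->
<a href="http://www.site-a.example/">Site A's topic</a>
```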

How to Get Reciprocal or Two-Way Links?

To get reciprocal links we normally follow the link exchange approach: we send a request to a webmaster asking them to add our link, and in return they get a link from our website.

Before sending a link exchange request, we gather candidate sites from search engines, directories, the partner pages of other websites, or other sites' reference lists. To extract this data quickly, we use a few search shortcuts.
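For example, Google queries like these can turn up likely link partners ("keyword" stands for your own topic; `intitle:` and `inurl:` are standard Google search operators):

```
keyword "link exchange"
keyword "add your site"
intitle:"links" keyword
inurl:links.html keyword
```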

Forums are another way of getting reciprocal links, after a discussion in the link sections of the forums.

You can use Google Talk, Yahoo, or MSN Messenger to talk with webmasters and exchange links by mutual understanding.

You can also find reciprocal link listings and fill out a form to request a link exchange.

What Should You Check For in Reciprocal Links?

If you are doing reciprocal link exchange, you should care about PageRank.

The site should not be dead or exist only for link exchange.

The site you are getting links from should not be penalized.

The site's content should not be illegal or scraped.

The site should be related to your theme.

The site's outgoing links should not point to a bad neighborhood.

The site should have a good number of indexed pages.

The site should have a good number of backlinks.

A link within content is beneficial for your site even if the page has no PageRank, as long as the content is related to your theme.

Links should be on static web pages.

Links should come from different IPs.

Don't exchange links in bulk, whether reciprocal or one-way.

How to Request Reciprocal Links?

You can prepare a link exchange template and send requests to webmasters.

You can also request link exchanges via forums or Google Talk, as I mentioned above.

Importance of Reciprocal Links:

Many people think reciprocal links have no importance because we give a link in return from our own website, so we gain nothing extra. But I think they do matter if you do the work smartly; they are still very useful for getting higher rankings in search engines. Google discounts reciprocal links that are not related to the theme.

Some Rules for Reciprocal or Two-Way Links:

If your site is dedicated to SEO, don't give any external reciprocal link with "SEO" as the anchor text.

If you get links from resources pages, always prefer pages with good PR.

Try to get links from content pages.

Don't give a link from a higher-PageRank page in return for one from a lower-PageRank page.

Always exchange links with theme-related sites.

Avoid reciprocal linking if the site has outgoing links to a bad neighborhood.

In a reciprocal exchange, the page carrying your link should not have more than 20 outgoing links.

Reciprocal links are not very effective for gaining PageRank, but they are effective for getting higher rankings in major search engines.

An Opinion about the Robots.txt File October 29, 2009

Posted by kishosingh in seo.

The robots.txt file is not a new topic, but it recently came to light again when Matt Cutts talked about it on his blog. The robots.txt file is used to restrict search engine robots from crawling certain pages of a site.

Use of the Robots.txt File:

The robots.txt file tells search engines not to crawl the pages that are restricted. Now there are some questions: why should we restrict search engine crawlers from certain pages, and is it helpful to do so?

How to Use the Robots.txt File:

The most basic thing about the robots.txt file is how to use it, and many people still don't know. It is used on both dynamic and static sites, and it is always placed in the root directory.

The robots.txt file is a plain text file containing rules that disallow crawling. You do not need to list the pages you allow; write only the disallowed pages in the text file.

Importance of the Robots.txt File:

Robots.txt is the best option for keeping control over your own site's pages. Pages that are not useful for search engines can be restricted with the robots.txt file.

For example:

Suppose you have a dynamic site. You are selling a product line that offers 200 items, and you keep 20 products per page, so you make 10 pages to show the offers and products. You write a title for the first page, but the same title is carried onto the other nine pages as well. Now search engines have a problem ranking your pages: the URLs are different, but the title is the same on the other nine pages. You can use robots.txt to disallow those other nine pages.

Another example:

If your site generates session IDs during login, you can restrict the session ID URLs too by using the robots.txt file.

How to Make a Robots.txt File:

Open Notepad, write your disallow rules, and save the file as robots.txt in the root directory of your site.
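A minimal sketch of such a file, covering both examples above (all paths are hypothetical; adapt them to your own URLs):

```
# Apply the rules below to all crawlers
User-agent: *
# Block the duplicate-title product pages (hypothetical paths)
Disallow: /products/page-2.html
Disallow: /products/page-3.html
# Block session-ID URLs (the * and ? wildcards are a Googlebot
# extension, not part of the original robots.txt standard)
Disallow: /*?sessionid=
```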

You can write many more rules to disallow your site's pages, and you can validate your robots.txt file after making it.

Develop a Site from an SEO Point of View March 11, 2009

Posted by kishosingh in seo.

I have already covered the design aspects of a site in my previous post. There is another aspect, called development. Here are some topics for developing your site from an SEO point of view to get better rankings in major search engines.

On the development side you have to check for junk code, which has no value on the web page. Junk code makes a page larger and is not useful from a search engine's point of view. Most junk code gets onto the page from the designer's or developer's end, so SEOs should check for unnecessary code from time to time. There can also be unnecessary whitespace on the pages, which can be harmful from a ranking point of view. So, during the development of a website, you need to check very carefully for junk code and unnecessary whitespace and remove them.

Another thing is URL structure. During development you create web folders and URLs. Take care that a URL never includes an underscore (_) or double hyphens (--); from the search engine's and the user's point of view, words in a URL should be separated by a single hyphen (-).
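For example (example.com is a placeholder domain):

```
Bad:  http://www.example.com/seo_news/link_building.html
Bad:  http://www.example.com/seo--news/link--building.html
Good: http://www.example.com/seo-news/link-building.html
```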

If your site has a login, you need to stop bots from crawling session ID URLs. You can use a robots.txt file so that bots crawl your site without the session IDs.
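A sketch of such a rule (assuming the session ID appears as a `sid` query parameter; the wildcard syntax is a Googlebot extension rather than part of the original robots.txt standard):

```
User-agent: *
Disallow: /*?sid=
```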

You always need to check your site in different browsers and check its load time. Load time should not be excessive; if it is high, your site's bounce rate will be higher.

Tracking code should be installed on every static page of your website. This analytics code tracks your site's behaviour; it will let you know how to improve your site, which sources visitors come from, and so on.
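The post doesn't name a specific tool. As one example, the Google Analytics asynchronous snippet of that era was pasted just before the closing </head> tag of every page (UA-XXXXX-X is Google's own placeholder for your account ID):

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);
  _gaq.push(['_trackPageview']);
  (function() {
    // Load ga.js asynchronously so it doesn't block page rendering
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```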

Many people add a meta refresh tag to the web page. It should not be added: a meta refresh prevents the original content from being shown when a visitor requests the page.
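For reference, this is the kind of tag the post warns against (the URL is a made-up example):

```html
<!-- Avoid: redirects the visitor after 0 seconds, hiding the original content -->
<meta http-equiv="refresh" content="0; url=http://www.example.com/other-page.html">
```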