
How to Make Effective Meta Tag Descriptions? May 15, 2010

Posted by kishosingh in seo.

The title of a web page is not everything. It cannot convey the whole content of the page; it is only the name of the page, targeted at popular keywords. Keywords identify a page and are what we use to find it in search engines. The meta description is the page's introduction: it gives an overall synopsis of the products or content on the page.

The meta description appears in the head section of the web page, just like the meta title and the meta keywords; it is also an HTML tag. Meta descriptions are useful because they define the page. For any search, search engines show the title, URL, and description: the title is the first attraction, the description the second, and the URL the third. We have already discussed the importance of keywords in search engine optimization.
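A minimal sketch of where the tag sits in the head section; the site name and wording here are invented for illustration:

    <head>
      <title>SEO News India</title>
      <meta name="description" content="Daily news and tips about search engine optimisation in India.">
    </head>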

The Description Tag:

I think most visitors arrive at a site after reading the description shown for their search term. The description tells them whether the page has what they are looking for, and it comes from our own pages. I have seen many mistakes with meta descriptions: people do not write them from the information actually on the page, and they include things that are not there. For example, suppose we have an SEO news site. A description like “Get news about SEO in India. Search engine optimisation news and tips from this web page” is right. But is a description like “SEO, Online Marketing, Web 2.0 and Link Building Services in India. News and reviews about SEO and SEO services from this web site” right? I don’t think it is.

What is the mistake? Stuffing other topics into the description, especially when the site already has separate pages for online marketing and link building.

Also, a description should be descriptive and short. Repetition of keywords should be avoided; use synonyms instead of repeating.

Importance of Keywords in Search Engine Optimization December 1, 2009

Posted by kishosingh in seo.

Recently, some news about the keywords meta tag came to light: Google announced that it does not consider the meta keywords tag for web ranking. Google published an article about it on the Webmaster Central blog, and I read a fabulous article about the meta keywords tag on Search Engine Land as well.

What is a Keyword?

For users: keywords are the search terms used to find appropriate results in a search engine.

For SEO: keywords are the topic of the web page, targeted so the page ranks higher in search engines.

Importance of Meta Keywords:

For users: Nothing. The user can’t see the tag on the web page.

For SEO: it used to be everything, but now it is nothing, because Google has made its position clear.
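For reference, the tag in question sits invisibly in the page head; a minimal sketch with made-up values:

    <meta name="keywords" content="seo, link building, web ranking">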

Importance of Keywords:

For users: a user always wants information related to whatever query he performs in a search engine.

For SEO: it is everything, but there is a proper way to use it. Yes, keywords are everything for an SEO, but how to use them on a web page is the biggest question for any SEO.

How to Use Keywords on a Web Page:

Keyword research is the biggest chapter in the SEO field, so an SEO should know which keywords are searched most and should recognize keyword trends. Who the customer is, what the product is, and which geography we are targeting are the other parts of keyword research.

Keywords are used in the title tag, so an SEO should understand keyword density in the title tag. The title is quite different from the keywords, so don’t confuse the two.

Keywords carry the most weight on the page itself: within the content and product copy, in alt attributes, in header tags, and so on. You have to know the proper keyword density within the content, and the right placement of keywords within the content is everything. The sketch below illustrates these placements.
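Everything in this sketch (page topic, file name, text) is invented for illustration; the point is that the same keyword appears in the title tag, the header tag, the content, and the alt attribute:

    <head>
      <title>Link Building Tips for New Websites</title>
    </head>
    <body>
      <h1>Link Building Tips</h1>
      <p>These link building tips help a new website earn its first backlinks.</p>
      <img src="backlink-growth.png" alt="chart of backlink growth">
    </body>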

Overall, the meta keywords tag counts for nothing in web page ranking, but keywords themselves remain a major ranking factor in any search engine.

An Opinion about the Robots.txt File October 29, 2009

Posted by kishosingh in seo.

The robots.txt file is not a new chapter, but it recently came to light again when Matt Cutts talked about it on his blog. The robots.txt file is used to restrict search engine robots from crawling certain pages of a site.

Use of the Robots.txt File:

The robots.txt file tells search engines not to crawl the pages it restricts. That raises some questions: why should we keep search engine crawlers away from some pages, and is restricting pages helpful?

How to Use the Robots.txt File:

The most basic thing about the robots.txt file is how to use it, and many people still do not know. It is used on both dynamic and static sites, and it always lives in the site’s root directory.

Robots.txt is a plain text file containing directives that block indexing. You do not need to mention the pages you allow; write only the disallowed pages in the text file.

Importance of the Robots.txt File:

Robots.txt is the best option for keeping control over your own site’s pages. Pages that are not meant for search engines can be restricted with the robots.txt file.

For example:

Suppose you have a dynamic site selling a product line with 200 items. You show 20 products per page, so the offers span 10 pages. You write a title for the first page, but the same title carries over to the other nine pages. Now the search engine has a problem ranking your pages: it sees different URLs but the same title on the other nine pages. You can use robots.txt to disallow those other nine pages (see the example file below).

Another example:

If your site generates session IDs during login, you can restrict those session URLs with robots.txt as well.
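As a sketch of the session-ID case: Google’s crawler understands wildcard patterns in robots.txt, so a rule like this (the parameter name is invented) blocks such URLs:

    User-agent: Googlebot
    Disallow: /*?sessionid=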

How to Make a Robots.txt File:

Open Notepad and write the directives as plain text.

Example of a robots.txt file:
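This is a minimal sketch, assuming we want to block the duplicate offer pages and the login area from the examples above (the paths are invented):

    User-agent: *
    Disallow: /offers/page-2.html
    Disallow: /offers/page-3.html
    Disallow: /login/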

You can write many more directives to disallow pages across your site. You should validate your robots.txt file after making it.

A Light on Sitemaps September 22, 2009

Posted by kishosingh in sitemap.

In the SEO field, the sitemap is a big chapter, and it is still much discussed and much misunderstood among SEOs. What is it? How does it work? How important is a sitemap in SEO? These are the basic questions about sitemaps.

First, I want to share my own thoughts about sitemaps. I think there are two types: the XML Sitemap and the ordinary sitemap. You will ask what the difference is, and that is the right question. A sitemap in general is a list of a site’s URLs and pages that tells search engines what to crawl: if a crawler cannot discover all the pages of a site on its own, the sitemap exposes all of them in a single file. The XML variety, written with a capital “S”, is the Sitemap that tells Google or any other crawler about pages it would not otherwise discover.
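A minimal Sitemap in the XML format discussed below (Sitemap Protocol 0.9); the domain and values are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2009-09-22</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>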

Importance of Sitemaps:

Sitemaps are helpful for dynamic sites. If your site is based on dynamic pages, a Sitemap helps ensure that search engine crawlers reach all of them.

If your site uses AJAX or images, a Sitemap lets crawlers discover pages they could not otherwise find.

If your site has a large archive, a Sitemap is necessary for search engine crawlers to discover all the pages.

Basic Rules for Sitemaps:

Search engines do not guarantee that they will crawl every page listed in a Sitemap. You should not put image URLs in a Sitemap, though you can add the URLs of the pages on which the images appear. Google adheres to Sitemap Protocol 0.9, so you should create your Sitemap using Sitemap Protocol 0.9.

URL Guidelines for Sitemaps:

A Sitemap always contains a list of URLs. A Sitemap cannot contain more than 50,000 URLs, and the file cannot be larger than 10 MB uncompressed. A Sitemap index file can in turn list up to 50,000 Sitemaps. Never include session IDs in URLs, keep the host form consistent (if your site is indexed without www, do not include www), and never use image URLs in Sitemaps.
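For the list-of-Sitemaps case, a Sitemap index file looks like this (the file names are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap1.xml</loc>
        <lastmod>2009-09-01</lastmod>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap2.xml</loc>
      </sitemap>
    </sitemapindex>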

You can create Sitemaps manually, with a Sitemap generator, or through a third party. Most blogs already have RSS and Atom feeds, which can serve as Sitemaps; Google is familiar with RSS 2.0 and Atom 1.0 feeds. Keep in mind that these feeds expose only recent post URLs, whereas a Sitemap can list every URL on your site.
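To see why a feed works as a small Sitemap, here is a minimal RSS 2.0 sketch (the blog name and URLs are invented); it is simply a list of recent post URLs:

    <?xml version="1.0"?>
    <rss version="2.0">
      <channel>
        <title>My SEO Blog</title>
        <link>http://www.example.com/</link>
        <description>Recent posts</description>
        <item>
          <title>A Light on Sitemaps</title>
          <link>http://www.example.com/a-light-on-sitemaps/</link>
        </item>
      </channel>
    </rss>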

Some Basic Aspects of Search Engine Optimization August 30, 2009

Posted by kishosingh in seo.

We have already discussed various topics in search engine optimization, but some remain. The biggest factors in SEO are on-page optimization and off-page optimization, and we have already covered those chapters. Here I want to explain some more basic chapters of SEO that many SEOs ignore.

I will deal with server-side factors and some general SEO topics. On the server side, the main topics are 301 redirects and header status codes.

We face this problem many times. The main problem is page redirection, also known as 301 URL redirection. There are big questions on this topic: should a URL be redirected or removed? What is the purpose of redirection? Will the redirection work?

Yes, these are all the right questions. From my point of view, removing a URL and redirecting a URL each have their own importance. We should remove URLs that have no visitors and no importance from the search engine’s point of view; on the other hand, a page with high PR or high traffic should be redirected. Redirection is not the last step: the redirects should also be submitted in Google Webmaster Tools so that Google’s crawler knows about them.
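As a minimal sketch of a 301 redirect, assuming an Apache server (the paths are invented), one line in the .htaccess file is enough:

    Redirect 301 /old-page.html http://www.example.com/new-page.html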

Another thing is the server status code. You should monitor the status code each page returns. Is it 200 or a 404 error? You have to check both cases and act on them.
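For reference, the status code is the first line of the server’s HTTP response, for example:

    HTTP/1.1 200 OK
    HTTP/1.1 404 Not Found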

Many times we also face large page sizes, which make pages slow to load. The page size should not exceed about 110 KB.

There are many browsers, so your page should render properly in every one of them; everything should look right in each browser.

Some other things to check: an appropriate style sheet, appropriate templates and metadata, standard navigation, HTML and XML sitemaps, and the analytics code.

All of these are SEO basics that should be handled properly by SEOs; they help with crawling and ranking.