
How to Make Effective Meta Tag Descriptions? May 15, 2010

Posted by kishosingh in seo.

The title of a web page is not everything; it cannot convey the whole content of the page. It is only the name of the page, targeted with popular keywords. Keywords identify the page so that we can find it in search engines. The meta description is the introduction of the page: it gives an overall synopsis of the page's products or content.

Like the meta title and meta keywords, the meta description appears in the head section of the web page; it is also an HTML tag. Meta descriptions are useful because they define the page. For any search, engines show the title, URL and description: the title draws attention first, the description second and the URL third. We have already discussed the importance of keywords in search engine optimization.
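As an illustration, a head section carrying these tags might look like this (the site name and wording are invented for the example):

```html
<head>
  <title>SEO News India - Search Engine Optimisation Tips</title>
  <meta name="description" content="Get news about SEO in India. Search engine optimisation news and tips.">
  <meta name="keywords" content="seo news, seo tips, india">
</head>
```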

Description Tag:

I think most visitors come to a site after reading the description shown for their search term, because the description says what exactly they are looking for, and that description comes from our pages. About meta descriptions I have seen many mistakes: people do not write them on the basis of the information actually given; they also include things that are not on the web page. For example, suppose we have an SEO news site. A description like "Get news about SEO in India. Search engine optimisation news and tips from this web page" is right. But a description like "SEO, online marketing, Web 2.0 and link building services in India. News and reviews about SEO and SEO services from this web site" is not right, in my view.

What is the mistake? They include other things in the description even though their site already has separate pages about online marketing and link building.

Another thing: descriptions should be descriptive and short. Repetition of keywords should be avoided; use synonyms instead of repeating.

Reciprocal Links and How to Get Reciprocal or Two Way Links April 7, 2010

Posted by kishosingh in Link Building.

Reciprocal links depend on mutual understanding between webmasters. A reciprocal link is a contract between two web pages that are related in theme and topic; you can say it is an exchange of links between two sites on the mutual understanding of their webmasters.

I have already explained in my previous post what three way links are and how to get them. Here I will write about reciprocal links and, from my experience of link exchange, how to get them.

What Are Reciprocal Links?

Reciprocal links are two way links, also known as "link exchange", "link partners" or simply "two way links". Site A adds a link to site B and site B adds a link to site A; this is called a reciprocal or two way link.

How To Get Reciprocal Or Two Way Links?

To get reciprocal links we normally follow the concept of link exchange: we send a request to a webmaster to add our link in return for a link from our website.

Before sending a link exchange request, we gather prospects from search engines, directories, the partner pages of other websites or from site references. To extract this data we use some search shortcuts.

Forums are also a way of getting reciprocal links, after a discussion in the link sections of the forums.

You can use Google Talk, Yahoo or MSN Messenger to talk with webmasters and exchange links on mutual understanding.

You can also find reciprocal link prospects and fill out their link exchange forms.

What Should You Care About For Reciprocal Links?

If you are doing reciprocal link exchange, you should care about PageRank.

The site should not be dead or exist only for link exchange.

The site from which you are getting links should not be penalized.

The site's content should not be illegal or scraped.

The site should be related to your theme.

The site's outgoing links should not point to a bad neighborhood.

The site should have a good number of indexed pages.

The site should have a good number of backlinks.

A link within content will benefit your site even if the page has no PageRank, but the content should be related to your theme.

Links should be placed on static web pages.

Links should come from different IPs.

Don't exchange links in bulk, whether reciprocal or one way.
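As a rough sketch of how to check whether a partner actually links back, you can scan their page's HTML for your URL with Python's standard html.parser (the HTML string and URLs below are made up for illustration):

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collects the href values of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def links_back(page_html, my_url):
    """Return True if any link on the page points at my_url."""
    finder = LinkFinder()
    finder.feed(page_html)
    return any(my_url in href for href in finder.hrefs)
```

For example, `links_back('<a href="http://www.example.com/">SEO</a>', "example.com")` returns True. In practice you would first fetch the partner's page HTML, which is not shown here.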

How To Request Reciprocal Links?

You can prepare a link exchange template and send requests to webmasters for link exchange.

You can also offer link exchanges via forums or Google Talk, as I mentioned above.

Importance of Reciprocal Links:

Many people think reciprocal links have no importance, because we also provide a link in return from our own website and so gain nothing extra. But I think they still matter if you do the work smartly; they are still of great value for getting higher rankings in search engines. Google discounts reciprocal links that are not related to the theme.

Some Rules For Reciprocal or Two Way Links:

If your site is dedicated to SEO, don't give any external reciprocal link with the anchor text "SEO".

Always look for good PR when you get links from resource pages.

Try to get links from content pages.

Don't give a higher PageRank page in return for a lower one.

Always exchange reciprocal links with theme-related sites.

Avoid reciprocal linking if the site has outgoing links to a bad neighborhood.

In a reciprocal exchange, the page's outgoing links should not number more than 20.

Reciprocal links are not very effective for gaining PageRank, but they are effective for getting higher rankings in the major search engines.

3 (Three) Way Links or Triangular Links Process March 30, 2010

Posted by kishosingh in Link Building.

3 way links are also known as one way links. It is a link exchange process in which three websites participate in a contract to get links. For example, suppose you have a website related to SEO and you send a link exchange request to a webmaster, asking him to add your link on his SEO related site. The webmaster has two SEO related sites; he is willing to place your link on site A, but in return he wants a link back to site B from your website. This is called a 3 way link exchange.

The idea of 3 way or triangular links developed after search engine updates. Before 3 way links there was the concept of reciprocal or 2 way linking, but sites soon got penalties, especially from Google. After those updates, most sites began to create 3 way links so that they would appear as one way links in the eyes of search engines.

Most sites got penalties for reciprocal links because they exchanged links with irrelevant sites and bad neighborhoods. Sites that were not involved in those activities still got higher rankings; that is why the concept of reciprocal links is not dead, but it should be practiced in an ethical way.

3 way links were very effective, but they too were soon traced by search engines, because sister sites on the same IP were involved in the exchanges. The IPs were traced, and bulk linking and spam linking were penalized.

In the end it was proved that spam links and bulk links with irrelevant sites and bad neighborhoods are not beneficial for a site.

That is the history of 3 way or triangular linking, but my subject here is 3 way links and how to get them. To get 3 way links we can use all the tips for reciprocal links, except filling out reciprocal link forms.

My opinion on 3 way links is positive. You can use them as one way links, but in an ethical way. If you have more than one site, you can exchange links three ways.

An Opinion about Static URL and Dynamic URL February 24, 2010

Posted by kishosingh in seo.

We can say that static URLs have a physical existence and display the same information to all users; the page really exists on the server. Basically, HTML files are known as static URLs.

Benefits of Static URLs:

These URLs are cache and search engine friendly.

They get higher rankings in search engines.

They are ideal for promotion in search engines.

Disadvantages of Static URLs:

If a site is very large, it is very difficult to maintain static URLs.

Portals, big sites and e-commerce sites are not practical with static URLs.

Dynamic URLs:

These are not like static URLs: a dynamic URL has no existence of its own; the page is generated by the server. .ASP, .PHP, .JSP, .ASPX etc. are the well known extensions of dynamic URLs.

Benefits of Dynamic URLs:

For portals and e-commerce sites these are the best.

You can easily maintain thousands of URLs and update them dynamically.

Disadvantages of Dynamic URLs:

Dynamic URLs are sometimes not cache friendly.

They are long and create query string URLs.

Whether to use dynamic or static URLs depends on your business requirements. If your business is big and deals in customer oriented products, then you need to develop dynamic web pages or a dynamic site. Now most sites are built on dynamic scripts. Dynamic pages use client side script and server side script to render a web page.

In a dynamic web page the content, images, scripts and web design are kept separately on the server, and when a user requests an ID, all the pieces come together in the browser.

For informative and small sites a static web page is right.

URL of static pages looks like this: http://www.example.com/seo.html

URL of dynamic web pages looks like this: http://www.example.com/seo?id=102
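One common way to give a dynamic URL a static face is server-side URL rewriting. On Apache with mod_rewrite enabled, a .htaccess sketch might look like this (the file names and the id parameter are hypothetical):

```
RewriteEngine On
# Serve the dynamic script when the static-looking URL is requested
RewriteRule ^seo\.html$ seo.php?id=102 [L,QSA]
```

With a rule like this, visitors and crawlers see http://www.example.com/seo.html while the server quietly runs the dynamic script.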

Now the concepts of dynamic URLs and static URLs are very clear: dynamic URLs contain long queries and parameters, while static URLs are simple file names.

Importance of Keywords in Search Engine Optimization December 1, 2009

Posted by kishosingh in seo.

Recently, news came to light about the keywords meta tag: Google announced that it doesn't consider meta keywords for web ranking. Google has written an article about the meta keywords tag on its Webmaster Central blog. I also read a fabulous article about the meta keyword tag on Search Engine Land.

What is a keyword?

For users – keywords are the search terms used to find appropriate results in a search engine.

For SEO – keywords are the topics of the web page, used to get a higher ranking in search engines for the targeted terms.

Importance of Meta Keywords:

For users: nothing. The user can't see the tag on the web page.

For SEO: it was everything, but now it is nothing, because Google has declared its position on it.

Importance of Keywords:

For users: users always need information related to whatever they searched for.

For SEO: it is everything, but there is a proper way to utilize it. Yes, the keyword is everything for an SEO, but how to use it on a web page is the biggest question for any SEO.

How to Use Keywords on Web Page:

Keyword research is the biggest chapter in the SEO field, so an SEO should know what the most searched keywords are. We as SEOs should recognize keyword trends. Who the customer is, what the product is and which geography we have to target are the other aspects of keyword research.

Keywords are used in the title tag, so an SEO should understand keyword density in the title. The title is quite different from the keywords, so don't confuse the two.

Keywords have great importance on the web page: within the content and product copy, in the alt tag, in the header tags etc. So you have to know the proper keyword density within the content. Right placement of keywords within the content is everything.
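As a small illustration of measuring keyword density, here is a sketch in Python (the sample sentence is invented):

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Percentage of words in `text` that are exactly `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    count = Counter(words)[keyword.lower()]
    return 100.0 * count / len(words)
```

For example, `keyword_density("SEO tips and SEO news for every SEO", "seo")` gives 37.5, since 3 of the 8 words are the keyword.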

Overall, the meta keywords tag means nothing for web page ranking in search engines, but keywords remain a major factor in how any search engine ranks a web page.

An Opinion about the Robots.txt File October 29, 2009

Posted by kishosingh in seo.

The chapter of robots.txt is not new, but it recently came to light again when Matt Cutts talked about it on his blog. The robots.txt file is used to restrict search engine robots from crawling certain pages of our site.

Use of the Robots.txt File:

The robots.txt file disallows search engines from crawling the pages that are restricted. Now there are some questions: why should we restrict search engine crawlers from certain pages, and is it helpful to restrict them?

How to Use the Robots.txt File:

The most basic thing about robots.txt is how to use it, and many people still don't know. It is used on both dynamic and static sites, and it always goes in the root directory.

Robots.txt is a text file that contains directives disallowing pages from being indexed. You need not mention the allowed pages; you should list only the disallowed ones in the text file.

Importance of the Robots.txt File:

Robots.txt is the best option for controlling your own site's pages: pages that are not necessary for search engines can be restricted with the robots.txt file.

For example:

Suppose you have a dynamic site selling a product with 200 related offers. You keep 20 products per page, so you make 10 pages to show the offers and products. You write a title for the first page, but the same title carries over to the other 9 pages as well. Now there is a problem for the search engine in ranking your pages: it sees different URLs but the same title on the other 9 pages. You can use robots.txt to disallow those other 9 pages.

Another example:

If your site generates session IDs during login, you can also restrict the session ID URLs using the robots.txt file.

How to Make a Robots.txt File:

Open Notepad and write directives like these:

Example of a robots.txt file:
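The original example image is not reproduced here, but a minimal robots.txt along these lines (the paths are hypothetical) would be:

```
# Block duplicate offer pages and login/session URLs
User-agent: *
Disallow: /offers/page-2.html
Disallow: /offers/page-3.html
Disallow: /login/
```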

You can write many more directives to disallow your site's pages, and you can validate your robots.txt after making it.

A Light on Sitemaps September 22, 2009

Posted by kishosingh in sitemap.

In the SEO field, sitemaps are a big chapter, and there is still much discussion of the subject. There is still much confusion among SEOs about sitemaps. What is a sitemap? How does it work? What is its importance in SEO? These are the basic questions about sitemaps.

First, I want to share my thoughts on sitemaps. I think there are two types: the XML Sitemap and the plain sitemap. You will ask what the difference between them is; yes, that is the right question. First we should know what sitemaps are: a sitemap is a list of URLs and pages that tells search engines what to crawl. If a search engine crawler is not able to discover all the pages of a site, the sitemap gives it a way to discover all the pages in a single file. The XML sitemap is the one written with a capital "S": the Sitemap tells Google or another search engine crawler how to find all the pages of a site that are not otherwise discoverable.

Importance of Sitemaps:

Sitemaps are helpful for dynamic sites. If your site is based on dynamic pages, a sitemap helps ensure all the pages are crawled by search engine crawlers.

If your site uses AJAX or images, a Sitemap lets the search engine crawler discover all the pages.

If your site has a large archive, a Sitemap is necessary for the crawler to discover all the pages.

Basic Rules for Sitemaps:

Search engines don't guarantee to crawl all the pages listed in a Sitemap. You should not put image URLs in a Sitemap; however, you can add the URLs of the pages on which your images appear. Google adheres to Sitemap Protocol 0.9, so you should create your Sitemap using Sitemap Protocol 0.9.

URL Guidelines for Sitemaps:

A Sitemap always contains a list of URLs. A Sitemap can't contain more than 50,000 URLs and can't be larger than 10 MB uncompressed. A Sitemap index file can list no more than 50,000 Sitemaps. You should never include session IDs in URLs. Be consistent about the host: if your site's URLs begin with the bare domain, don't mix in www URLs. Never use image URLs in Sitemaps.
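A minimal Sitemap following Sitemap Protocol 0.9 looks like this (the URL and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/seo.html</loc>
    <lastmod>2009-09-22</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```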

You can create Sitemaps manually, with a Sitemap generator or through a third party. Most blogs already have RSS or Atom feeds, which can be used as Sitemaps; Google understands RSS 2.0 and Atom 1.0 feeds. You should know that these feeds show only recent post URLs, whereas a Sitemap can show all the URLs of your site.

Some Basic Aspects of Search Engine Optimization August 30, 2009

Posted by kishosingh in seo.

We have already discussed various topics in search engine optimization, but some topics still remain. The biggest factors in SEO are on-page optimization and off-page optimization, and we have already covered those chapters. I need to explain some more basic chapters of SEO that are ignored by many SEOs.

Here I will deal with server side factors and some general SEO topics. On the server side, the main topics are 301 redirects and header status codes.

We often face this problem. The main issue is page redirection, also known as 301 URL redirection. There are big questions on this topic: should a URL be redirected or removed? What is the purpose of redirection? Will redirection work?

Yes, these are all the right questions. From my point of view, removing a URL and redirecting a URL both have their own importance. We should remove URLs that have no visitors and no importance from a search engine point of view; on the other hand, a page with high PR or high traffic should be redirected. Redirection is not the last step: there is also a need for webmaster submission. The redirections should be submitted in Google Webmaster Tools so that the Google crawler knows about them.
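On Apache servers, a single-page 301 redirect can be declared in .htaccess with mod_alias, for example (both paths are hypothetical):

```
# Permanently redirect the old page to its new location
Redirect 301 /old-page.html http://www.example.com/new-page.html
```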

Another thing is the server status code. You should monitor your pages' server status codes. What are they returning: a 200, or a 404 error? In either situation you have work to do.
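As a small sketch of reading the status code out of an HTTP status line, for instance while scanning server logs, consider this Python helper (the status lines below are examples):

```python
def parse_status(status_line):
    """Split an HTTP status line like 'HTTP/1.1 404 Not Found' into (code, reason)."""
    version, code, reason = status_line.split(" ", 2)
    return int(code), reason

def needs_attention(code):
    """2xx responses are fine; 404s and other non-success codes need work."""
    return not (200 <= code < 300)
```

For example, `parse_status("HTTP/1.1 404 Not Found")` gives `(404, "Not Found")`, which `needs_attention` flags as a problem.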

We also often face large page sizes, which make pages very slow to load. So the page size should not exceed about 110 KB.

There are many browsers, so your page should render properly in each of them; everything should look appropriate in every browser.

Some other things to check: an appropriate style sheet, appropriate templates and metadata, standard navigation, HTML and XML sitemaps, and analytics code.

So these are the basics of SEO that should be handled properly by SEOs; they help with crawling and ranking.

Off-Page Factors in Search Engine Optimization July 22, 2009

Posted by kishosingh in Link Building.

Off-page factors have been the biggest part of search engine optimization since the beginning of search engines. Off-page work is also known as link building or promotion of web pages. Off-page factors gained a dominant position in the age of Google, which introduced PageRank and the backlink concept to give web pages an honorable status. It was truly a revolution in the search engine optimization field.

Both backlink creation and getting a higher PageRank require link building. One more factor in link building, or off-page work, is higher ranking for targeted keywords. It means: more backlinks, higher PageRank; more relevant backlinks, higher rankings in search engines for the targeted keywords. Here the theory of link building changes again: these days only links that are relevant, not crap, count as backlinks. So piling up backlinks is no longer possible if the links are not relevant. For higher ranking, relevancy is now a must in the link building era.

I have noticed many times that people confuse link building with relevant link building, and they are confused about off-page factors. Again, I want to make clear that link building alone is not the whole of off-page SEO. Off-page has a broader aspect and sense: basically, off-page is promotional work, whether online or offline. Any type of promotional work counts as an off-page activity, such as link building, RSS subscription, forum participation, social bookmarking, social promotion through social networking sites, blogging, directory submission, press releases, article submission, social participation, buzzing etc.

All these activities are effective only when they are used in context and naturally. For example, for social bookmarking, submitting only the home page URL is unethical; in directory submission, a deep link has no value. I have many examples that are unethical from a search engine point of view and have no value as backlinks.

In off-page work, proper anchor text matched to the landing page is the most valuable and important thing. People are crazy about the home page URL only, but deep link backlinks should make up at least 25% relative to home page URL backlinks.

How to Get Your Site Higher Rankings in Organic Searches for Targeted Keywords June 19, 2009

Posted by kishosingh in seo.

Today in the search engine optimization industry this question is very hot: how do you get a site higher rankings in organic searches for targeted keywords? It is the basic question of the industry. We as SEOs can't guarantee the #1 position, but if we are ethical we always try to reach #1.

Unethical practice can get us to #1, but only temporarily, not permanently. We as SEOs analyze, research and study search engine algorithms and try to follow them. We know that no search engine reveals its algorithm, but like a doctor we try to read the ups and downs in this industry.

I have always said that getting higher rankings for targeted keywords is not a vast task, but let me also say it another way: it is not an easy task either. There is only one key to higher rankings: keyword research, and placing those keywords on the web page in the right way.

If we work as an SEO on a website, then we should understand our work very clearly. Our work starts with URL research and proper URL navigation; these are the biggest tasks for a site.

If you understand the difference between dynamic URLs and static URLs, you will soon succeed. You have to present your dynamic URLs as static pages.

Once the URL work is done, you should proceed with keyword research and the placement of keywords within the content, the title and the overall web page in the right way. This is the basic and final work of an SEO, and it is what will get your web page higher rankings in organic searches for targeted keywords.

Along the way we also have to cover some technical work, like the use of h1, h2…h6, fresh and unique content related to the site's theme, alt attributes, anchor text etc.
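Put together, those on-page elements might look like this in a page's markup (all text and file names are invented for the example):

```html
<h1>Search Engine Optimization Tips</h1>
<h2>Keyword Research</h2>
<p>Fresh, unique content related to the site's theme goes here.</p>
<img src="ranking-chart.png" alt="keyword ranking chart">
<a href="http://www.example.com/seo.html">search engine optimization tips</a>
```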

Nowadays in the SEO industry there is a big problem with unique and fresh content, and another big problem of producing optimized pages and optimized content. If we solve these basic problems, then surely we will get any web page into a higher position in organic searches for targeted keywords.