
An Opinion about Static URLs and Dynamic URLs February 24, 2010

Posted by kishosingh in seo.

We can say that a static URL has a physical existence and displays the same information to every user: the page really exists on the server. Plain HTML files are the classic examples of static URLs.

Benefits of Static URLs:

These URLs are cache-friendly and search engine friendly.

They tend to get higher rankings in search engines.

They are ideal for promotion in search engines.

Disadvantages of Static URLs:

If a site is very large, static URLs are very difficult to maintain.

Portals, big sites and e-commerce sites are not practical with static URLs alone.

Dynamic URLs:

These are not like static URLs. A dynamic URL has no fixed existence of its own; the page is generated by the server on request. Extensions such as .asp, .php, .jsp and .aspx are the well-known signs of dynamic URLs.

Benefits of Dynamic URLs:

For portals and e-commerce sites, these are the best choice.

You can easily maintain thousands of URLs and update them dynamically.

Disadvantages of Dynamic URLs:

Dynamic URLs are sometimes not cache-friendly.

They are long and produce query-string URLs.

The choice between dynamic and static URLs depends on your business requirements. If your business is big and deals in customer-oriented products, then you need to develop dynamic web pages or a dynamic site. Most sites today run on dynamic scripts. Dynamic pages use client-side and server-side scripts to render a web page.

In a dynamic web page, the content, images, scripts and design are kept separately on the server; when a user requests an ID, all of these pieces are assembled together in the browser.

For small, informative sites, a static web page is the right choice.

The URL of a static page looks like this: http://www.example.com/seo.html

The URL of a dynamic page looks like this: http://www.example.com/seo?id=102
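
As a rough sketch of how such a dynamic URL is served, here is a minimal PHP example. The file name seo.php, the database credentials and the products table are all hypothetical, and a real page would also handle missing IDs:

    <?php
    // seo.php - builds the page for a request like /seo.php?id=102
    $id = (int) $_GET['id'];  // read the product id from the query string

    // hypothetical database and table names
    $db = new mysqli('localhost', 'user', 'password', 'shop');
    $result = $db->query('SELECT title, body FROM products WHERE id = ' . $id);
    $row = $result->fetch_assoc();

    // the same script assembles different content for every id
    echo '<h1>' . htmlspecialchars($row['title']) . '</h1>';
    echo '<p>' . htmlspecialchars($row['body']) . '</p>';
    ?>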

Now the concept of dynamic URLs and static URLs should be very clear: dynamic URLs carry long queries and parameters, while a static URL is a simple file name.

An Opinion about the Robots.txt File October 29, 2009

Posted by kishosingh in seo.

The robots.txt file is not a new chapter, but it recently came into the light again when Matt Cutts talked about it on his blog. The robots.txt file is used to restrict search engine robots from crawling certain pages of a site.

Use of the Robots.txt File:

The robots.txt file disallows search engines from crawling the pages that are restricted in it. Now there are some questions: why should we restrict search engine crawlers from crawling pages, and is it actually helpful to restrict pages?

How to Use the Robots.txt File:

The most basic thing about the robots.txt file is how to use it; many people still don't know its uses. It is used on both dynamic and static sites, and it is always placed in the root directory.

The robots.txt file is a plain text file that contains directives disallowing pages from being indexed. You need not mention the pages you allow; you should write only the disallowed pages in the text file.

Importance of the Robots.txt File:

Robots.txt is the best option for keeping control over your own site's pages. You can restrict the pages that are not necessary for search engines with the robots.txt file.

For example:

Suppose you have a dynamic site. You are selling a product that has 200 related products. You keep 20 products on a page, so you make 10 pages to show the offers and products. You write a title for the first page, but the same title carries over to the other nine pages too. Now there is a problem for search engines in ranking your pages: a search engine sees different URLs but the same title on the other nine pages. You can use robots.txt to disallow those other nine pages.

Another example:

If your site generates session IDs during login, you can restrict those session-ID URLs by using the robots.txt file as well.

How to Make a Robots.txt File:

Open Notepad and write directives like these:

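Here is a minimal sketch of such a file. The paths are hypothetical, and the wildcard rule is an extension that Google supports:

    User-agent: *
    # keep script directories out of the crawl
    Disallow: /cgi-bin/
    # block the duplicate-title pages from the example above
    Disallow: /products/page-2.html
    Disallow: /products/page-3.html
    # block URLs that carry session IDs (wildcard supported by Google)
    Disallow: /*?sessionid=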

You can write many more directives to disallow your site's pages. You can validate your robots.txt after making it.

A Light on Sitemaps September 22, 2009

Posted by kishosingh in sitemap.

In the SEO field, the sitemap is a big chapter, and there is still a big discussion on this subject. There is also much confusion among SEOs about sitemaps. What is a sitemap? How does it work? What is the importance of a sitemap in SEO? These are the basic questions about sitemaps.

At first, I want to share my thoughts about sitemaps. I think there are two types: the XML Sitemap and the plain sitemap. You will ask what the difference between them is; yes, that is the right question. First we should know what sitemaps are. A sitemap is a list of a site's URLs and pages that tells search engines to crawl them. That means if a search engine crawler is not able to discover all the pages of a site, the sitemap gives it a way to discover all the pages in a single file. The XML sitemap is the one written with a capital "S": the Sitemap tells the Google crawler, or any other search engine crawler, about pages of a site that are otherwise not discoverable.

Importance of Sitemaps:

Sitemaps are helpful for dynamic sites. If your site is built on dynamic pages, a Sitemap helps ensure that all the pages are crawled by the search engine crawlers.

If your site uses AJAX or images, a Sitemap helps search engine crawlers discover all the pages.

If your site has a large archive, a Sitemap is necessary for search engine crawlers to discover all the pages.

Basic Rules for Sitemaps:

Search engines do not guarantee that every page in a Sitemap will be crawled. You should not put image URLs in a Sitemap; however, you can add the URLs of the pages on which your images appear. Google adheres to Sitemap Protocol 0.9, so you should create your Sitemap using Sitemap Protocol 0.9.

URL Guidelines for Sitemaps:

A Sitemap always contains a list of URLs. A Sitemap can't contain more than 50,000 URLs, and the file can't be larger than 10 MB uncompressed. A Sitemap index file can list Sitemaps, with no more than 50,000 of them. You should never include session IDs in the URLs. Be consistent about the host name: if your site's URLs begin with http://example.com, don't list them with www. Never use image URLs in Sitemaps.
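
For reference, a minimal Sitemap following Sitemap Protocol 0.9 looks like this. The URL is a placeholder; only the <loc> tag is required, and the other tags are optional:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/seo.html</loc>
        <lastmod>2009-09-22</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
      <!-- one <url> entry per page, up to 50,000 per file -->
    </urlset>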

You can create Sitemaps manually, with a Sitemap generator or through a third party. Most blogs already have RSS and Atom feeds, which can be used as Sitemaps; Google is familiar with RSS 2.0 and Atom 1.0 feeds. You should know that these feeds show only recent post URLs, whereas a Sitemap can show the whole set of URLs of your site.

Some Basic Aspects of Search Engine Optimization August 30, 2009

Posted by kishosingh in seo.

We have already discussed various topics in Search Engine Optimization, but some topics still remain. The biggest factors in SEO are on-page optimization and off-page optimization, and we have already discussed those chapters. I need to explain some more basic chapters of SEO that are being ignored by many SEOs.

Here I will deal with server-side factors and some general topics of SEO. Among server-side factors, the main topics are 301 redirects and header status codes.

We face this problem many times. The main problem is page redirection, also known as 301 URL redirection. There are big questions on this topic. Should a URL be redirected or removed? What is the purpose of redirection? Will the redirection work?

Yes, all these are the right questions. From my point of view, removing a URL and redirecting a URL each have their own importance. We should choose to remove those URLs that have no visitors and no importance from a search engine point of view. On the other hand, a page with high PageRank or high traffic should be redirected. Redirection is not the last step, either: the changes should also be submitted in Google Webmaster Tools so that the Google crawler knows about the redirection.
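
On an Apache server, for example, such a redirection can be set up with a single line in the .htaccess file; the two paths here are placeholders:

    # .htaccess - permanently (301) redirect the old page to its replacement
    Redirect 301 /old-page.html http://www.example.com/new-page.html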

Another thing is the server status code. You should monitor your pages' server status codes. What is each page returning: a 200 or a 404 error? In either situation, you have to verify the response and work on those pages.
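
A simple way to check a page's status code is a HEAD request with curl; the first line of the response shows what the server answers (the output lines below are only illustrative):

    curl -I http://www.example.com/seo.html
    # HTTP/1.1 200 OK          <- healthy page
    # HTTP/1.1 404 Not Found   <- broken page: fix it or redirect it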

Many times we also face large page sizes, which make pages very slow to load. A proper page size should not exceed 110 KB.

There are many browsers, so your page should render properly in each browser. Everything should appear appropriately in each of them.

Some other things to check are an appropriate style sheet, appropriate templates and metadata, standard navigation, HTML and XML sitemaps, and analytics code.

So all these are basic things of SEO that should be handled properly by SEOs. They are helpful for both crawling and ranking.

Off-Page Factors in Search Engine Optimization July 22, 2009

Posted by kishosingh in Link Building.

Off-page work has been the biggest factor in search engine optimization since the beginning of search engines. It is also known as link building or web page promotion. Off-page factors gained a dominant position in the age of Google, which brought in PageRank and the backlink concept to give web pages an honorable status. It was a real revolution in the search engine optimization field.

Both backlink creation and a higher PageRank require link building. One more factor belongs to link building and off-page work: higher ranking for targeted keywords. It means more backlinks bring higher PageRank, and more relevant backlinks bring higher rankings in search engines for targeted keywords. Here the theory of link building changes again. These days, only relevant, non-spam links are counted as backlinks, so simply accumulating backlinks is no longer possible if the links are not relevant. Now, for higher rankings, relevancy is a must in the link building era.

I have seen many times that people confuse link building with relevant link building, and they are confused about off-page factors. Here I want to make it clear again that link building alone does not make up the off-page factors in SEO. Off-page factors have a broader aspect and sense. Basically, off-page is promotional work, whether online or offline. So any type of promotional work counts as an off-page activity, such as link building, RSS subscription, forum participation, social bookmarking, social promotion through social networking sites, blogging, directory submission, press releases, article submission, social participation, buzzing and so on.

All these activities are effective only when they are used in context and naturally. For example, submitting only the home page URL in social bookmarking is unethical, and in directory submission a deep link has no value. I have many examples that are unethical from the search engine point of view and have no value as backlinks.

Among off-page factors, using proper anchor text suited to the landing page is the most valuable and important thing. People are crazy about the home page URL only, but deep-link backlinks should make up a minimum of 25% alongside the home page URL links.

On-Page Factors Are the Biggest Challenge for SEOs May 7, 2009

Posted by kishosingh in seo.

We have written many times about each and every factor of SEO. In my previous posts I have written about search engine optimization and a beginners' guide for SEOs, but here I especially want to explore something different: the on-page factors that are the biggest challenge for SEOs.

Recently I have been facing a big problem with URLs. After 3.5 years of working at an online marketing company as an SEO, I have now joined a new company that has never done SEO for its website. The website was developed by developers and designers on a contract basis. The URLs of the website I am working on are in different technologies, such as .html, .asp and .aspx. Some URLs open in pop-ups, some in JavaScript and some in frames. The URL structure relates neither to the users' point of view nor to the search engines'.

During this work, I learned that the proper structure of URLs is the biggest challenge for SEOs. Now I think that for a better site it is necessary to research and analyze the URL structure.

URLs should follow proper naming conventions. Other problems are related to the navigational structure. How do you optimize navigation? The basic approach is to go through the top 20 sites in the related industry. A better navigation structure brings loyal visitors and returning visitors, increases leads and decreases the bounce rate.

Title tags are the keys to a website. Unique, descriptive titles and meta tags related to the web page's theme are also among the biggest challenges for SEOs.

We have analyzed and seen that most sites do not include H1, H2 and H3 tags because their owners don't know the tags' importance. These tags tell users and search engines that everything on the page is organized in a proper, optimized way.

On most websites, different URLs carry the same title and meta tags. In most cases the title tags are not proper sentences: most people put only keywords into the title tag, which looks like spam and nothing else.
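
As a sketch, a well-optimized page pairs a unique, descriptive title and meta description with matching headings; all the text here is only illustrative:

    <head>
      <title>SEO Tips for Small Business Websites | Example.com</title>
      <meta name="description" content="Practical on-page SEO advice for small business sites, from title tags to heading structure.">
    </head>
    <body>
      <h1>SEO Tips for Small Business Websites</h1>
      <h2>Why title tags matter</h2>
      <p>...</p>
    </body>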

Some SEOs don't know the uses of robots.txt either. The file tells search engines which pages to index and which not to.

These are some of the on-page factors that are a challenge for SEOs. If we complete this work properly, most web pages can get higher rankings for their targeted keywords.

Develop a Site from an SEO Point of View March 11, 2009

Posted by kishosingh in seo.

I have already covered the design aspects of a site in my previous post. There is another aspect, called development. Here are some topics for developing your site from an SEO's point of view to get better rankings in the major search engines.

In the development aspect you have to check for junk code, which has no importance on the web page. Junk code makes a web page larger and is not useful from the search engines' point of view. Junk code mostly gets onto the page from the designer's or developer's end, so SEOs should check for unnecessary code from time to time. Web pages can also contain unnecessary spaces, which can be harmful from a ranking point of view. So during the development of a website you need to check very carefully for junk code and unnecessary spaces and remove them.

Another thing is the URL structure. During web page development you need to create web folders and URLs. Take care that a URL never includes _ (underscore) or -- (double hyphens); from the search engine and user points of view, the words in a URL should be separated by - (single hyphens). For example, prefer http://www.example.com/seo-tips.html over http://www.example.com/seo_tips.html.

If your site provides a login option, then you need to disallow session IDs from being crawled by bots. You can use a robots.txt file to let bots crawl your site without the session IDs.

You always need to check your site in different browsers and check its load time. The load time should not be excessive: if it is high, your site's bounce rate will be higher.

Tracking code should be installed on every static page of your website. This analytics code tracks your site's behaviour; it will let you know how to improve your site, what sources your visitors come from, and so on.
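
As a sketch, the Google Analytics snippet of that era was a small script placed on each page; UA-XXXXXX-X is a placeholder for your own property ID:

    <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXXX-X']);  // your Analytics property ID
      _gaq.push(['_trackPageview']);              // record a pageview for this URL
      (function() {
        // load ga.js asynchronously so it does not block page rendering
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);
      })();
    </script>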

Most people add a meta refresh tag to their web pages. It should not be added: a meta refresh tag prevents the original content from being shown when a visitor requests the website.
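
This is the tag to avoid; if a page genuinely has to move, a server-side 301 redirect is the better choice (the URL is a placeholder):

    <!-- avoid this: it sends the visitor elsewhere after 0 seconds, hiding the original content -->
    <meta http-equiv="refresh" content="0; url=http://www.example.com/new-page.html">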

Designing Aspects in SEO February 15, 2009

Posted by kishosingh in Designing.

Design is the basic aspect of a website, but it can't be the first step in SEO. SEO and design are interrelated in the promotion of a website. If a website's design is strong, an SEO can easily promote that site into the top 10 for targeted keywords. Design is a basic aspect, but it should come after the SEO's market research and competitor analysis.

If an SEO analyzes competitor sites before launching a new site, his next basic work is to prepare the synopsis and design of that site, which should be perfect from the search engine point of view and user friendly.

Design aspects in SEO:

First, we should proceed with the URL structure, which should be based on market research, because it has great importance from the search engine, ranking and user points of view. If your site is in .html or .asp it is OK, but if your site is in .aspx it is very hard to manage the URL structure along with the site header, so make sure about the URLs at first. This is the basics of SEO.

The second level of the design aspect begins with validating all pages with the W3C. All your pages should validate with minimal errors to satisfy search engines.

The third level of the design aspect is checking a site's error pages. From my point of view, designers often make two or three URLs for every page, and that is a great mistake: multiple URLs for one page create a canonicalization problem. And if such a page is uploaded to your site and cached by Google, then deleting it afterwards means you are creating 404 errors for your site.

Always check your site for broken links, which are called 404 errors. If you find such errors on your site, remove them immediately from the site and from Webmaster Tools as well. You should check the broken link errors in your Webmaster Tools daily. Broken links can be both external and internal; you can't do anything about external broken link errors, but you can do everything about internal ones.

The fourth level of the design aspect in SEO starts with checking for hidden text on a page. Many times designers write text to mark changes on a page that is not related to the page, and sometimes we find single-pixel text on a page that is not meant for users. We should delete those texts that are hidden and not useful from the user's point of view.

The fifth level starts with the navigational structure of a site. In the process of building a site we should always keep in mind the user who comes to the site to buy something. The navigation should always let people know about your products and lead them to add items to the cart and buy them.

The sixth level of design is also related to the navigational structure of a site. Sometimes we see a "toggle" (+) in a site's navigation links; a toggle hides many links behind the "+" sign. From my point of view, navigation links should be free of toggles.

The seventh level of the design aspect in SEO is removing .js and .css code from the page. JavaScript and CSS files are important for a site, but they should not sit on the page itself: all the .js and .css should be kept in separate files. If we don't call these files separately, they will increase the load time and make your pages heavy.
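
In practice this means replacing inline <style> and <script> blocks with references to external files, which the browser can then cache across pages (the file names are placeholders):

    <head>
      <link rel="stylesheet" type="text/css" href="/css/site.css">
      <script type="text/javascript" src="/js/site.js"></script>
    </head>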

The eighth level is related to the coding of the page. From my point of view, a site should be built with DIVs rather than TR and TD. Search engines treat DIVs like text, and they are flexible; DIVs also take less time to load and are lighter than TR-TD table markup.
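
As a small illustration, here is the same two-column layout in table markup and in DIV markup; the class names are hypothetical, and the positioning would live in the external style sheet:

    <!-- table-based layout: heavier, nested markup -->
    <table><tr><td>menu</td><td>content</td></tr></table>

    <!-- div-based layout: leaner markup, styled from the CSS file -->
    <div class="menu">menu</div>
    <div class="content">content</div>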

The ninth aspect is related to the link structure. Most of the time we use images for site navigation, but to get pages ranked in search engines the navigation should be in text links.

The tenth aspect is the reality and future of SEO and web design. If we use Flash on our pages boldly and in a creative way to show a particular thing, it can lead your business very effectively.

These are my design aspects for any SEO who wants to get a site to a higher ranking in any search engine for targeted keywords.