A Light on Sitemaps
September 22, 2009 · Posted by kishosingh in sitemap.
Tags: crawler, rss, search, search engine, search engines, seo, sitemap, url, xml
In the SEO field, the sitemap is a big topic, and there is still much discussion and confusion about it among SEOs. What is it? How does it work? How important is a sitemap in SEO? These are the basic questions about sitemaps.
First, I want to share my thoughts on sitemaps. I think there are two types: the XML Sitemap and the plain sitemap. You may ask what the difference between them is, and that is the right question. First, we should understand what sitemaps are. A sitemap is a list of a site's URLs and pages that tells search engines which pages to crawl. If a search engine crawler cannot discover all the pages of a site on its own, the sitemap lets it discover them all in a single file. The XML version is the one written with a capital "S": a Sitemap tells Google and other search engine crawlers about the pages of a site that are not otherwise discoverable.
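To make the idea concrete, here is a minimal sketch of building a Sitemap file with Python's standard library. The URLs are hypothetical placeholders, not from any real site:

```python
# Build a minimal XML Sitemap (Sitemap Protocol 0.9) from a list of page URLs.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return Sitemap XML listing the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical example pages.
sitemap_xml = build_sitemap(["http://example.com/", "http://example.com/about.html"])
print(sitemap_xml)
```

The resulting file is what you place at your site root (typically as sitemap.xml) so crawlers can find every page in one place.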
Importance of Sitemaps:
Sitemaps are helpful for dynamic sites. If your site is built from dynamic pages, a Sitemap helps ensure that search engine crawlers reach all of them.
If your site makes heavy use of AJAX or images, a Sitemap helps search engine crawlers discover pages they would otherwise miss.
If your site has a large archive, a Sitemap is necessary for search engine crawlers to discover all the pages.
Basic Rules for Sitemaps:
Search engines don't guarantee to crawl every URL listed in a Sitemap. You should not put image-file URLs in a Sitemap; instead, add the URLs of the pages on which your images appear. Google adheres to Sitemap Protocol 0.9, so you should create your Sitemap using Sitemap Protocol 0.9.
URL Guidelines for Sitemaps:
A Sitemap always contains a list of URLs. A Sitemap can't contain more than 50,000 URLs, and the file can't be larger than 10 MB uncompressed. A Sitemap index file can list multiple Sitemaps, but no more than 50,000 of them. Never include session IDs in URLs. Be consistent about the host name: if your site lives at a URL without www, don't list URLs with www. Never use image URLs in Sitemaps.
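The two limits above can be checked mechanically before you publish a Sitemap. This is a small sketch of such a check; the constants come straight from the limits stated above:

```python
# Check a Sitemap against the protocol limits: at most 50,000 URLs
# and at most 10 MB uncompressed.
MAX_URLS = 50_000
MAX_BYTES = 10 * 1024 * 1024  # 10 MB

def needs_splitting(urls, sitemap_xml):
    """Return True if this Sitemap exceeds the limits and must be
    split into several files referenced by a Sitemap index."""
    too_many = len(urls) > MAX_URLS
    too_big = len(sitemap_xml.encode("utf-8")) > MAX_BYTES
    return too_many or too_big

print(needs_splitting(["http://example.com/"], "<urlset/>"))  # False: well within limits
```

When this returns True, the fix is to split the URLs across several Sitemap files and list those files in a Sitemap index.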
You can create Sitemaps manually, with a Sitemap generator, or through a third party. Most blogs already have RSS and Atom feeds, which can be used as Sitemaps; Google understands RSS 2.0 and Atom 1.0 feeds. Keep in mind that these feeds only list recent post URLs, whereas a Sitemap can list every URL on your site.
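The limitation of feeds is easy to see if you pull the links out of one. This sketch parses a tiny hand-written RSS 2.0 feed (the feed content and URLs are made up for illustration):

```python
# Extract the post URLs from an RSS 2.0 feed. A feed only covers
# recent items, while a Sitemap can list every URL on the site.
import xml.etree.ElementTree as ET

rss = """<rss version="2.0"><channel>
  <title>Example blog</title>
  <item><link>http://example.com/post-2</link></item>
  <item><link>http://example.com/post-1</link></item>
</channel></rss>"""

root = ET.fromstring(rss)
recent_urls = [item.findtext("link") for item in root.iter("item")]
print(recent_urls)  # only the recent posts, not the whole archive
```

Those few links are all a crawler gets from the feed; older archive pages would need a full Sitemap.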
Some Basic Aspects of Search Engine Optimization
August 30, 2009 · Posted by kishosingh in seo.
Tags: code, html, off page, on page, optimization, search, search engine, search engine optimization, seo, sitemap, url, xml
We have already discussed various topics in Search Engine Optimization, but some still remain. The biggest factors in SEO are on-page optimization and off-page optimization, and we have covered those chapters. Here I want to explain some more basic aspects of SEO that many SEOs ignore.
I will deal with server-side factors and some general SEO topics. Among server-side factors, the main topics are 301 redirects and HTTP header status codes.
We often face problems here. The main one concerns page redirection, also known as 301 URL redirection. There are big questions on this topic: should a URL be redirected or removed? What is the purpose of redirection? Will the redirection work?
Yes, these are the right questions. In my view, removing a URL and redirecting a URL each have their own importance. We should remove those URLs that have no visitors and no importance from a search engine's point of view. On the other hand, a page with high PR or high traffic should be redirected. Redirection is not the last step, either: the redirections should also be submitted in Google Webmaster Tools, so the Google crawler knows about them.
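The redirect-or-remove decision can be sketched as a simple lookup table the server consults for each request. The paths and the keep/remove choices below are hypothetical examples, not a real configuration:

```python
# Pages worth keeping (traffic, PR) get a 301 to their new location;
# worthless pages are simply removed and return 404.
REDIRECTS = {
    "/old-page.html": "/new-page.html",  # high traffic: redirect it
}
REMOVED = {"/dead-page.html"}            # no visitors: let it 404

def respond(path):
    """Return the (status, location) the server should send for a path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    if path in REMOVED:
        return 404, None
    return 200, None

print(respond("/old-page.html"))  # (301, '/new-page.html')
```

In practice the same table would live in your server configuration (for example, Apache rewrite rules) rather than application code, but the decision logic is the same.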
Another thing is the server status code. You should monitor the status codes your pages return. Is a page returning 200 OK, or an error such as 404? Either way, you need to check and act on anything unexpected.
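A simple way to monitor this is to classify each status code into an action. This is a minimal sketch; the 301/302 case is my addition for completeness, since redirects also turn up in such audits:

```python
# Turn an HTTP status code into a monitoring verdict.
def classify(status):
    if status == 200:
        return "ok"
    if status in (301, 302):
        return "redirect"
    if status == 404:
        return "broken: fix or redirect this URL"
    return "check manually"

for code in (200, 301, 404):
    print(code, classify(code))
```

Feeding the codes from your server logs through a check like this quickly surfaces the URLs that need work.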
We also often face large page sizes, which make pages very slow to load. A proper page size should not exceed about 110 KB.
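Checking a page against that ceiling is a one-line measurement. The 110 KB figure below is the author's suggested limit from the text, not a universal standard:

```python
# Flag pages that exceed the suggested 110 KB size ceiling.
LIMIT_KB = 110

def page_too_big(html):
    """Return True if the page's UTF-8 size exceeds the limit."""
    size_kb = len(html.encode("utf-8")) / 1024
    return size_kb > LIMIT_KB

print(page_too_big("<html>" + "x" * 200_000 + "</html>"))  # True: ~195 KB
```

Running this across your templates' rendered output points out which pages need trimming or compression.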
There are many browsers, so your pages should render properly in each of them. Everything should look and work right in every browser.
Other things to check are appropriate style sheets, appropriate templates, metadata, standard navigation, HTML and XML sitemaps, and analytics code.
So, these are the basic aspects of SEO that SEOs should get right. They are helpful for both crawling and ranking.