Tags: google, search engines, seo, web, website
Search Engine Optimization is the biggest challenge for today’s online businesses. If you want to grow your online business, you need to make your website search-engine friendly.
First, think about your business goals before proceeding with website optimization. If your business goal is clear, you can achieve great success in your business.
Second comes optimization itself. Optimization for whom? That should be the basic question before you begin. The answer is clear: optimization for search engines, in order to draw an audience and meet your business goals.
Google is the biggest search engine, as we already know. It receives around 500 million search requests every day, and its index covers approximately 110 million unique websites and more than 1 trillion URLs.
In this situation, the biggest challenge for SEOs is to put a website at the top of the search results, ensuring it is seen and visited by a large audience.
SEO plays a vital role in modern advertising. With that in mind, here are some tips to grow your online business through SEO:
Make a Hierarchical Architecture for Your Website:
Architecture is the root of any business site. It is like a plan: the idea of how to present your products on the website or webpage.
Write a Meaningful Title with the Most Searched Keywords:
The title is the biggest factor in search-engine ranking. It should be meaningful, short and optimized with the most searched keywords.
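As a sketch, a short, keyword-optimized title tag might look like this (the site name and keywords are only placeholders):

```html
<head>
  <!-- Short, meaningful title with the most searched keywords first -->
  <title>Buy Running Shoes Online | Example Store</title>
</head>
```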
Write Unique, Relevant Content:
Content is always known as king on the web for search engines. It should be related to your products and written with an analytical approach.
An Opinion about Static URL and Dynamic URL (February 24, 2010). Posted by kishosingh in seo.
Tags: html, search engine, search engines, seo, url, web
We can say that static URLs have a physical existence and display the same information to all users. The page really exists on the server. HTML files are the classic example of static URLs.
Benefits of Static URLs:
They are cache- and search-engine friendly.
They tend to get higher rankings in search engines.
They are ideal for promotion in search engines.
Disadvantages of Static URLs:
If the site is very large, static URLs are very difficult to maintain.
Portals, big sites and e-commerce sites are not practical with static URLs alone.
Dynamic URLs are different: they have no existence of their own; the pages are generated by the server. .ASP, .PHP, .JSP, .ASPX and similar pages are the well-known examples of dynamic URLs.
Benefits of Dynamic URLs:
For portals and e-commerce sites these are the best choice.
You can easily maintain thousands of URLs and update them dynamically.
Disadvantages of Dynamic URLs:
Dynamic URLs are sometimes not cache friendly.
They are long and contain query strings.
Whether to use dynamic or static URLs depends on your business requirements. If your business is big and deals in customer-oriented products, you need to develop dynamic web pages or a dynamic site. Most sites today run on dynamic scripts. Dynamic pages use client-side and server-side scripts to render a web page.
In a dynamic web page, the content, images, scripts and design are stored separately on the server; when a user requests an ID, they are assembled together in the browser.
For small, informative sites, static web pages are the right choice.
A static page URL looks like this: http://www.example.com/seo.html
A dynamic page URL looks like this: http://www.example.com/seo?id=102
Now the difference should be very clear: dynamic URLs contain long queries and parameters, while a static URL is a simple file name.
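One common way to present dynamic pages behind static-looking URLs is a rewrite rule on the server. As a sketch, assuming an Apache server with mod_rewrite enabled (the paths and parameter name are only illustrative):

```apache
# .htaccess: serve the static-looking URL /seo/102.html
# from the real dynamic script /seo?id=102
RewriteEngine On
RewriteRule ^seo/([0-9]+)\.html$ /seo?id=$1 [L]
```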
An Opinion about the Robots.txt File (October 29, 2009). Posted by kishosingh in seo.
Tags: code, directory, robots, search, search engine, search engines, site, url, web pages
The chapter of robots.txt is not new, but it recently came to light again when Matt Cutts discussed it on his blog. The robots.txt file is used to restrict search-engine robots from crawling certain pages of our site.
Use of the Robots.txt File:
The robots.txt file disallows search engines from crawling the pages that are restricted. That raises some questions: why should we restrict search-engine crawlers from certain pages, and is it helpful to do so?
How to Use the Robots.txt File:
The most basic thing about robots.txt is how to use it; many people still don’t know. It works on both dynamic and static sites, and it always goes in the root directory.
Robots.txt is a plain text file containing rules that disallow indexing. You need not mention the pages you allow; list only the pages you want to disallow.
Importance of the Robots.txt File:
Robots.txt is the best way to keep control over your own site’s pages. Pages that are not useful to search engines can be restricted with it.
Suppose you have a dynamic site selling a product line of 200 items. You keep 20 products on a page, so you make 10 pages to show the offers and products. You write a title for the first page, but the same title carries over to the other 9 pages as well. Now there is a problem for the search engine in ranking your pages: it sees different URLs but the same title on the other 9 pages. You can use robots.txt to disallow those 9 pages.
If your site issues session IDs during login, you can restrict the session-ID URLs with robots.txt as well.
How to Make a Robots.txt File:
Open Notepad and write rules like these:
You can add more rules to disallow other pages on your site, and you should validate your robots.txt after making it.
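For example, a minimal robots.txt that blocks the duplicate offer pages and session-ID URLs described above could look like this (save it as robots.txt in the root directory; the paths are only placeholders):

```
# Applies to all crawlers
User-agent: *
# Block the duplicate offer pages
Disallow: /offers/page-2.html
Disallow: /offers/page-3.html
# Block session-ID URLs (the * wildcard is supported by the major engines)
Disallow: /*?sessionid=
```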
Some Basic Aspects of Search Engine Optimization (August 30, 2009). Posted by kishosingh in seo.
Tags: code, html, off page, on page, optimization, search, search engine, search engine optimization, seo, sitemap, url, xml
We have already discussed various topics in Search Engine Optimization, but some still remain. The biggest factors in SEO are on-page optimization and off-page optimization, and we have already covered those chapters. Here I want to explain some more basic chapters of SEO that many SEOs ignore.
I will deal with server-side factors and some general SEO topics. On the server side, the main topics are 301 redirects and header status codes.
We often face a problem with page redirection, also known as 301 URL redirection. Big questions arise here: should a page be redirected or removed? What is the purpose of redirection? Will redirection work?
These are all fair questions. In my view, removing a URL and redirecting a URL each have their own place. Choose to remove URLs that have no visitors and no importance from a search-engine point of view; a page with high PR or high traffic should be redirected instead. Redirection is not the last step, either: the redirections should also be submitted in Google Webmaster Tools so that Google’s crawler knows about them.
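As a sketch, on an Apache server a 301 redirect can be set in .htaccess (the URLs here are placeholders):

```apache
# Permanently redirect the old high-PR page to its new location
Redirect 301 /old-page.html http://www.example.com/new-page.html
```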
Another thing is the server status code. You should monitor each page’s status code. Is it returning 200 (OK) or 404 (Not Found)? In either case, verify that the response is the one you expect and fix anything unexpected.
We also often face very large pages, which are slow to load. A page should not exceed about 110 KB.
There are many browsers, so your page should render properly in each of them; everything should look right in every browser.
Other things to check are an appropriate style sheet, appropriate templates and metadata, standard navigation, HTML and XML sitemaps, and analytics code.
All of these are SEO basics that SEOs should verify properly; they help with crawling and ranking.
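Of the items above, the XML sitemap is the easiest to sketch. A minimal one looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/seo.html</loc>
    <lastmod>2009-08-30</lastmod>
  </url>
</urlset>
```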
Tags: keywords, search engine, search engine optimization, search engines, seo, title, Title Tag, web, web pages, web site
Today the hot question in the search engine optimization industry is: how does a site get a higher ranking in organic searches for its targeted keywords? It is the basic question of this industry. As SEOs we can’t guarantee the #1 position, but if we work ethically we always try to reach it.
Unethical practices can also reach #1, but only temporarily, not permanently. As SEOs we analyze, research and study search-engine algorithms and try to follow them. No search engine publishes its algorithm, but, like a doctor reading symptoms, we try to infer it from the ups and downs of the industry.
I have always said that ranking higher for targeted keywords is not a vast task, but let me put it another way: it is not an easy task either. There is really one path to higher ranking: keyword research, and placing those keywords in the right way on the web page.
If we work as SEOs on a website, we should understand our work very clearly. Our work starts with URL research and proper URL navigation, which is the biggest piece of work for a site.
If you understand the difference between dynamic and static URLs, you will succeed sooner. You have to present your dynamic URLs within static-looking pages.
Once the URL work is done, proceed with keyword research and the placement of keywords in the content, the title and the overall page in the right way. This is the basic and final work of an SEO, and it is what earns a page higher ranking in organic searches for its targeted keywords.
Along the way we also have to handle some technical work: the use of h1, h2 ... h6 headings, fresh and unique content related to the site’s theme, alt attributes, anchor text and so on.
Nowadays the SEO industry has a big problem with unique, fresh content, and another big problem with producing properly optimized pages and content. If we solve these basic problems, surely we can get any web page into a higher position in organic searches for its targeted keywords.
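Those on-page elements can be sketched in HTML like this (all file names and text are placeholders):

```html
<h1>Main Keyword Phrase</h1>          <!-- one h1 per page -->
<h2>Supporting Topic</h2>             <!-- h2...h6 for sub-sections -->
<img src="shoes.jpg" alt="red running shoes">   <!-- descriptive alt attribute -->
<a href="/running-shoes.html">running shoes</a> <!-- keyword-rich anchor text -->
```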
Content Factors in Search Engine Optimization (April 20, 2009). Posted by kishosingh in seo.
Tags: article, blog, content, keywords, optimization, search engine, search engine optimization, search engines, seo, title, web, web pages
In search engine optimization, content matters more to a web page than anything else; content is known as king of the web page. But what should the criteria of the best content be? Search engines always look for better, unique information on a page so they can guide users in the right direction.
Better content should meet some criteria: it should be theme-related, unique, fresh, informative and grammatically correct.
Now, the content on the page should also be optimized. For content optimization, SEOs should analyze the theme of the page and match the content with related keywords. Keywords should be placed in the right way within the content, and keyword density should not exceed 5-7%. Duplicate content is not allowed for better ranking, so it should be avoided.
For better ranking, it is best to place the content in the main body after the H1. Scrappy content drags down the whole page, so keep all these factors in mind before placing it. There should be one standard for web-page content, article content and blog content alike: fully grammatically correct, fresh, unique, informative and theme-related.
If a page offers well-established content, there is no real need for a meta description or meta keywords, but the title should be unique and related to that content. These two things, content and title, can guarantee better ranking of a web page in any search engine.
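Keyword density is simply the keyword’s share of the total word count. A minimal sketch in Python (the function name is mine, not a standard library call):

```python
def keyword_density(text, keyword):
    """Return the keyword's percentage share of all words in the text."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

# 2 occurrences of "seo" out of 5 words -> 40.0%
print(keyword_density("seo tips for seo beginners", "seo"))
```

Anything above the 5-7% range mentioned here would be a sign of keyword stuffing.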
Develop a Site from an SEO Point of View (March 11, 2009). Posted by kishosingh in seo.
Tags: analytic, codes, design, develop, robots, search engines, seo, site, url, web, web pages, web site
I have already covered the design aspects of a site in my previous post. There is another aspect, called development. Here are some topics for developing your site from an SEO’s point of view to get better ranking in the major search engines.
On the development side, you have to check for junk code, which has no importance on the page. Junk code makes a page larger and is useless from a search-engine point of view; it mostly creeps in from the designer’s or developer’s end, so SEOs should check for unnecessary code from time to time. Pages can also contain unnecessary whitespace, which can be harmful from a ranking point of view. So during development you need to check very carefully for junk code and unnecessary spaces and remove them.
Another thing is URL structure. During development you need to create web folders and URLs. Take care never to include underscores (_) or double hyphens (--) in a URL; from both the search engines’ and the users’ point of view, separate words with single hyphens (-).
If your site provides a login, you need to disallow the session-ID URLs from being crawled by bots. A robots.txt file lets bots crawl your site without touching the session IDs.
You should always check your site in different browsers and measure its load time. Load time should not be excessive: if it is high, your site’s bounce rate will be higher.
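This URL rule can be sketched in a few lines of Python (the function name is mine):

```python
import re

def make_slug(title):
    """Lowercase a page title and join words with single hyphens,
    avoiding underscores and double hyphens."""
    slug = title.lower()
    # Collapse any run of non-alphanumeric characters into one hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(make_slug("SEO Tips_and Tricks"))  # seo-tips-and-tricks
```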
Tracking code should be installed on every static page of your website. This analytics code tracks your site’s behaviour; it will tell you how to improve the site, what sources your visitors come from, and so on.
Many people add a meta refresh tag to their pages. It should not be added: a meta refresh stops the original content from being shown when a visitor requests the page.
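For reference, this is the meta refresh tag the paragraph warns against (the target URL is a placeholder); a server-side 301 redirect is the better choice:

```html
<!-- Avoid: redirects the visitor via the browser after 0 seconds -->
<meta http-equiv="refresh" content="0; url=http://www.example.com/new-page.html">
```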