Before a site can appear in search results, it has to be indexed by a search engine. Indexation means that a search engine robot visits and analyzes the site's pages and adds the information to the search engine's database. If a page is not indexed, the search engine does not know about it and will never display information from it.
Site promotion in search engines is an ongoing process of optimizing page text, increasing the number of quality inbound links, and so on. However, the very first step the webmaster of a newly created web resource should take is to check that the site has been indexed by the major search engines.
It is normally not difficult to get into a search engine's index. In some cases you do not need to take any special action: Google, for example, will automatically visit and index your site within a few days, provided there are inbound links from pages it has already indexed.
With other search engines it is not always so simple. In some cases you must submit the new web resource to a search engine manually for the site to be indexed. There is no need to submit every page; submitting the main page is sufficient, since the other pages will be found by following its links.
Most search engines let you examine the indexation of a site with the help of special query operators, such as the site: operator. Seo Administrator allows you to check site indexation in Google, Yahoo, MSN and other search engines.
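As a quick illustration of the operator-based approach, the sketch below builds a Google search URL that uses the site: operator to list indexed pages of a domain. The function name is hypothetical; the query is simply opened in a browser to inspect the results by hand.

```python
from urllib.parse import quote_plus

def site_query_url(domain: str) -> str:
    """Build a Google search URL using the site: operator, which
    restricts results to pages indexed from the given domain.
    (Hypothetical helper for manual checking, not an official API.)"""
    return "https://www.google.com/search?q=" + quote_plus(f"site:{domain}")

print(site_query_url("www.netfirms.com"))
# -> https://www.google.com/search?q=site%3Awww.netfirms.com
```

Opening that URL in a browser shows roughly how many of the domain's pages the engine currently holds in its index.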
While Seo Administrator checks the indexation of the site, it also obtains the Google PageRank value of each checked page. Having PageRank available at the indexation stage is very useful, especially when analyzing large and complex sites.
Search engine databases are updated continuously: records may change, disappear and reappear. That is why the indexation of your website pages should be checked regularly; once or twice a month is a reasonable interval.
The most common reason for an indexation failure is server unavailability: the search robot could not access your site at the particular moment it visited.
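A minimal sketch of the kind of availability check you can run yourself, using only the Python standard library: it distinguishes a server that answers (even with an error status) from one that cannot be reached at all, which is the situation a search robot faces during downtime.

```python
import urllib.error
import urllib.request

def is_reachable(url: str, timeout: float = 10.0) -> bool:
    """Return True if the server answers with a successful HTTP response,
    False otherwise. (Illustrative sketch; a crawler visiting during an
    outage would hit one of the failure branches below.)"""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded, but with an error status (4xx/5xx);
        # a robot may retry the page later.
        return False
    except (urllib.error.URLError, OSError):
        # No response at all: DNS failure, refused connection, or timeout.
        return False
```

Running such a check periodically helps rule out downtime as the cause before investigating other indexation problems.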
A search robot's main route into your website is via inbound links, so the more inbound links you have, the more often search robots will visit your site. You can control robot indexing behavior with a robots.txt file.
The image shows an indexation report for the site www.netfirms.com in the Google search engine.