Search engines such as Google, Yahoo, and Bing employ crawlers, sometimes also referred to as bots, to find and index web pages that feed their search results. However, these crawlers do not index every page of a website.
The reasons a page goes unindexed range from simple to complex. Something as simple as a broken link can leave a page unindexed, and several of these simple problems can combine into a complex obstacle for crawlers.
This is where search engine optimization (SEO) steps in: an SEO specialist identifies why a page is unindexed and makes the necessary changes. However, on-site fixes attract crawlers only to a certain extent. To help ensure indexing, SEO practitioners also rely on submitting the website to search engines and directories.
Submitting a website to search engine directories creates inbound links, which in turn enable crawlers to find your pages. Once the crawlers arrive, they crawl and index the rest of the site by following its internal links. (Now do you understand the value of having a sitemap? :))
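To make the internal-link step concrete, here is a minimal sketch in Python of the kind of link extraction a crawler performs on each page it fetches. The `LinkExtractor` class and the sample HTML are purely illustrative, not any real search engine's code.

```python
# Sketch of how a crawler discovers new pages: parse a fetched page's
# HTML and collect every link it points to, using only the stdlib.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Illustrative page content; a crawler would queue these URLs next.
html = '<a href="/about">About</a> <a href="/contact">Contact</a>'
parser = LinkExtractor()
parser.feed(html)
# parser.links is now ["/about", "/contact"]
```

A page with no internal links contributes nothing to this queue, which is exactly why orphaned or broken-link pages can remain unindexed.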
The scenario above is a simplified illustration. In truth, even if crawlers do index your homepage, there is no guarantee they will crawl every page. (There are many reasons why this happens, but they are beyond the scope of this article.)
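One common safeguard against pages being missed is an XML sitemap, which explicitly lists the pages you want crawled rather than leaving discovery to internal links alone. A minimal sketch following the standard sitemaps.org format (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled; URLs are placeholders -->
  <url>
    <loc>https://www.example.com/</loc>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>
```

Placing this file at the site root and referencing it from robots.txt (`Sitemap: https://www.example.com/sitemap.xml`) gives crawlers a direct list of pages, independent of how well your internal links are wired up.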
To conclude, getting a website fully indexed can be an arduous process, often best left to experts.