Some content and links that are visible to visitors on a web page may not actually be visible to the search engines (Flash-based content is a common example). Part of your SEO research should therefore include tools that let you check that spiders see what they are supposed to see. I recently came across a tool that simulates a search engine by displaying the contents of a web page exactly as a search engine spider would see it. It also displays the hyperlinks that a search engine will follow (crawl) when it visits that page, along with the meta tags and description.
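To make the idea concrete, here is a minimal sketch of what such a spider-simulator does under the hood, using only Python's standard library. It is an illustration, not the tool's actual implementation: it strips markup, skips script and style blocks, and collects the indexable text, the hyperlinks, and the meta tags. Note that the text of the page survives, but the Flash `<object>` contributes nothing a spider can read. The sample HTML is invented for the demo.

```python
# Sketch of a "spider view" extractor using only the standard library.
# Real crawlers are far more sophisticated; this just shows the idea.
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Collects the plain text, hyperlinks and meta tags of a page."""
    def __init__(self):
        super().__init__()
        self.text = []    # visible text fragments a spider can index
        self.links = []   # href values of <a> tags it will follow
        self.meta = {}    # meta name -> content (e.g. description)
        self._skip = 0    # nesting depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip -= 1

    def handle_data(self, data):
        # Only keep text outside script/style that is not pure whitespace.
        if not self._skip and data.strip():
            self.text.append(data.strip())

# Invented sample page: some text, a link, a meta tag and a Flash object.
html = """<html><head><meta name="description" content="Demo page">
<script>var hidden = 1;</script></head>
<body><h1>Welcome</h1><a href="/about">About us</a>
<object data="movie.swf"></object></body></html>"""

view = SpiderView()
view.feed(html)
print(view.text)   # the text a spider can index
print(view.links)  # the links it will crawl
print(view.meta)   # the meta description it reads
```

Run against the sample page, the Flash object simply disappears from the spider's view, which is exactly the blind spot the tool above helps you find.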
When you first set up your site, it will not be crawled by Google until the ‘spider’ (or ‘bot’) arrives there via a link from another page. You can of course request a crawl in Google Webmaster Tools, but in my experience it is much quicker to put a link on a page that you know is already being crawled: the spider finds the link and, hey presto, your content is added. This is why it makes sense to ensure there are valid links between your pages. Google Webmaster Tools also lets you submit either an XML sitemap or a simple text list of the URLs you want spidered, to ensure that all of your site is included.
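As a rough sketch of the sitemap option, the snippet below generates an XML sitemap in the sitemaps.org format that Google Webmaster Tools accepts. The URLs are invented examples; real sitemaps can also carry optional fields such as last-modified dates, which are omitted here for brevity.

```python
# Minimal sketch: build a sitemaps.org-format XML sitemap from a URL list.
# The example.com URLs are placeholders, not real pages.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemap XML string for the given list of page URLs."""
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(u) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries +
        "\n</urlset>\n"
    )

sitemap = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/about",
])
print(sitemap)
```

The plain-text alternative mentioned above is even simpler: one fully qualified URL per line in a UTF-8 text file.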
You can use the form below to run the check: just enter the URL you want tested. The results include all the text that the Google spiders will be able to see and index, and highlight EVERY link that has been identified. Seeing your site displayed this way gives you confidence that the right keywords and links are being seen.