A web crawler (also known as a web spider or web robot) is a program or automated script that browses the World Wide Web in a methodical, automated manner. This process is called web crawling or spidering. Many legitimate sites, search engines in particular, use spidering to keep their data up to date.
For demonstration purposes, this program crawls a single pre-specified web page. Please be aware that crawling websites can raise copyright issues; for this reason the demo is limited to one site, which we host and maintain and have permission to crawl.
The program itself places no limit on the number of websites it can crawl.
The program can also crawl sites that require sessions and referrers.
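As a rough illustration of how session and referrer support can work, the sketch below (using only the Python standard library; all function names and the example URLs are hypothetical, not part of this demo's actual code) keeps cookies across requests with a cookie jar and attaches a Referer header naming the page the crawler arrived from:

```python
# Minimal sketch of a crawler helper that keeps session cookies across
# requests and sends a Referer header. Names and URLs are illustrative.
import http.cookiejar
import urllib.request


def make_session():
    """Build an opener that stores cookies between requests (a 'session')."""
    jar = http.cookiejar.CookieJar()
    return urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))


def build_request(url, referrer=None):
    """Create a request, optionally carrying the referring page's URL."""
    headers = {"User-Agent": "demo-crawler/0.1"}
    if referrer:
        # The HTTP header is historically spelled "Referer".
        headers["Referer"] = referrer
    return urllib.request.Request(url, headers=headers)


# Usage: fetch a page as if we had navigated to it from its parent page.
# opener = make_session()
# req = build_request("https://example.com/page", referrer="https://example.com/")
# html = opener.open(req).read()
```

Because the opener reuses one cookie jar, any session cookie set by a login or landing page is sent automatically on subsequent requests, which is what sites that "require sessions" typically check for.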