A crawler is a program used by search engines to collect data from the internet. It visits a website and stores the content it finds in a database (the index). Crawling is the process by which Googlebot visits new and updated pages so they can be added to the Google index.
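As a rough sketch of the idea (using only Python's standard library, and a made-up in-memory site instead of real HTTP fetching, so the names and pages here are just illustrative assumptions), a crawler loop can look like this: fetch a page, store it, extract its links, and queue them for visiting.

```python
from html.parser import HTMLParser
from collections import deque

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch):
    """Breadth-first crawl: visit each page once, store its HTML
    in the index, and queue every link found on the page."""
    index = {}                      # our "databank": url -> page content
    queue = deque([start_url])
    while queue:
        url = queue.popleft()
        if url in index:
            continue                # already visited, skip
        html = fetch(url)
        index[url] = html           # store the page
        parser = LinkExtractor()
        parser.feed(html)
        queue.extend(parser.links)  # follow links to new pages
    return index

# Tiny in-memory "web" so the sketch runs without network access.
SITE = {
    "/home":    '<a href="/about">About</a> <a href="/contact">Contact</a>',
    "/about":   '<a href="/home">Home</a>',
    "/contact": 'No links here.',
}

index = crawl("/home", SITE.get)
print(sorted(index))  # → ['/about', '/contact', '/home']
```

A real crawler would fetch over HTTP, respect robots.txt, and rate-limit itself, but the visit-store-follow loop is the same.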
@Maria,
Do you have any doubts?