Googlebot is Google's crawler: it automatically scans websites by following links from one page to another and adds the pages it finds to Google's index based on their relevance.
Googlebot, as the name suggests, is Google's crawler (also called a bot or spider) in charge of crawling websites. Googlebot is the general name given to Google's two main crawler types: Googlebot Smartphone and Googlebot Desktop.
The purpose of Googlebot is to crawl and scan websites and their pages by following links: Googlebot discovers new or updated content, reads it, and adds it to the index according to its relevance.
Visits from Google's crawlers may show up in your referrer logs, and their user-agent tokens can be targeted with rules in your robots.txt file.
Googlebot crawls as many pages from a site as possible, while respecting the disallow rules that the website owner has set in robots.txt.
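As a sketch of how such rules behave, the snippet below uses Python's standard-library `urllib.robotparser` to check which URLs Googlebot may fetch under a hypothetical robots.txt (the domain and paths are illustrative, not from any real site):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for an example site: Googlebot is barred
# from /private/, while all other crawlers are barred from /tmp/.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot obeys the group addressed to it by name.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))   # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))   # False
```

Note that only the most specific matching group applies: because a `User-agent: Googlebot` group exists, Googlebot follows those rules and ignores the `User-agent: *` group.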
To ask Google to (re)crawl and index a website, keep these general guidelines in mind: