Discussion in 'SEO & Traffic Discussion' started by oorjaclinic, Mar 20, 2017.
What are Spiders, Robots and Crawlers and what are their functions?
There are countless different bots, but as far as I know, spiders and crawlers are designed to scan websites and extract data from them.
Spiders, Robots and Crawlers are essentially the same thing: automated software programs that search engines use to stay up to date with web activity, finding new links and content to index in their databases. Search engines need to keep their databases current, so they built automated programs that go from site to site, discovering new pages and also collecting information about what each page is about.
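The core of that "site to site" step is just parsing a page and pulling out its links. Here is a minimal sketch of that link-discovery step using only Python's standard library; the example.com URLs and the sample page are made up for illustration, and a real crawler would also fetch pages, respect robots.txt, and avoid revisiting URLs:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects absolute URLs from <a href> tags, the way a crawler
    discovers new pages to visit."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

# Hypothetical page content, just to show the idea
page = '<a href="/about">About</a> <a href="https://example.org/">Elsewhere</a>'
print(extract_links(page, "https://example.com/"))
# → ['https://example.com/about', 'https://example.org/']
```

A real search-engine crawler repeats this in a loop: fetch a URL, extract its links, add unseen ones to a queue, and hand the page content to the indexer.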
Spiders and Crawlers are specific types of Robots (bots). "Bot" is the general name for any automated program.
It would help if you told us why you're asking; that way we can give you the best answer.