Saturday, 10 November 2012

internet bots

Internet bots, also known as web robots, WWW robots or simply bots, are software applications that run automated tasks over the Internet. Typically, bots perform tasks that are both simple and structurally repetitive, at a much higher rate than would be possible for a human alone. The largest use of bots is in web spidering, in which an automated script fetches, analyzes and files information from web servers at many times the speed of a human. Each server can have a file called robots.txt, containing rules for the spidering of that server that the bot is supposed to obey.
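As a minimal sketch of how a polite bot honors those rules, Python's standard-library `urllib.robotparser` can parse a robots.txt and answer whether a given URL may be fetched. The rules, the `example.com` URLs, and the `MyBot` user-agent name below are illustrative assumptions, not from the post:

```python
from urllib import robotparser

# A sample robots.txt as a server might publish it (illustrative content):
# every bot is asked to stay out of /private/
rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)  # parse the rules directly instead of fetching over HTTP

# can_fetch() applies the parsed rules for the given user agent
print(rp.can_fetch("MyBot", "https://example.com/index.html"))  # True
print(rp.can_fetch("MyBot", "https://example.com/private/x"))   # False
```

In a real spider, the bot would first download `https://<host>/robots.txt` (for example with `rp.set_url(...)` followed by `rp.read()`) and consult `can_fetch()` before requesting each page.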