A bot is a software robot. On the Net, robots have taken on a new form of life. Bots are normally used for digging through large amounts of data: you give a bot directions and it brings back answers; search engines are the classic example. Bots can also perform repetitive tasks on the Net, such as the chores handled by IRC bots.
Since all Web servers are connected, robot-like software is the perfect way to perform the methodical searches needed to find information. For example, Web search engines send out robots that crawl from one server to another, compiling the enormous lists of URLs that are the heart of every search engine. Shopping bots compile enormous databases of products sold at online stores.
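The crawling loop those search-engine robots perform can be sketched in a few lines: fetch a page, extract its links, and queue any URL not yet seen. The sketch below is a minimal, hypothetical illustration (the page-fetching function is passed in, so no real network access is assumed), not the code any particular search engine runs.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collects absolute URLs from the anchor tags on one page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links


def crawl(fetch, seed, limit=100):
    """Breadth-first crawl; `fetch` maps a URL to its HTML text."""
    seen, queue = {seed}, deque([seed])
    while queue and len(seen) < limit:
        url = queue.popleft()
        for link in extract_links(fetch(url), url):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen
```

Fed a real `fetch` function, `crawl` would hop from server to server exactly as described above, and the `seen` set becomes the list of URLs at the heart of the engine.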
The term bot has become interchangeable with agent, indicating that the software can be sent out on a mission, usually to find information and report back. Strictly speaking, an agent is a bot that goes out on a mission. Some bots operate in place; for example, a bot in Microsoft FrontPage automates work on a Web page.
Bots have great potential in data mining, the process of finding patterns in enormous amounts of data. Because data mining often requires a series of searches, bots can save labor as they persist in a search, refining it as they go along. Intelligent bots can make decisions based on past experience, an important tool for data miners trying to perfect complex searches that delve into billions of data points.
Bots were not invented on the Internet, however. Robotic software is generally believed to have originated with Eliza, one of the first public demonstrations of artificial intelligence. Eliza is a computer program that can engage a human in conversation: Eliza asks the user a question, then uses the answer to formulate yet another question (for more on Eliza, see Don Barker's review). Artificial intelligence is an advanced branch of computer science that aims to develop software capable of processing information on its own, without the need for human direction.
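Eliza's trick of turning an answer into the next question rests on simple pattern matching plus pronoun reflection ("I am..." becomes "you are..."). The toy below is a hypothetical sketch of that technique, not Joseph Weizenbaum's original script; the patterns and responses are invented for illustration.

```python
import re

# Swap first and second person so the user's words can be echoed back.
REFLECTIONS = {"i": "you", "am": "are", "my": "your", "you": "I", "your": "my"}


def reflect(text):
    return " ".join(REFLECTIONS.get(word, word) for word in text.lower().split())


# Each rule pairs a pattern with a question template; the last is a catch-all.
RULES = [
    (re.compile(r"i am (.*)", re.I), "Why do you say you are {}?"),
    (re.compile(r"i feel (.*)", re.I), "What makes you feel {}?"),
    (re.compile(r".*"), "Please tell me more."),
]


def eliza(utterance):
    """Return a question built from the first rule that matches."""
    for pattern, template in RULES:
        match = pattern.match(utterance)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
```

For example, `eliza("I am worried about my job")` yields "Why do you say you are worried about your job?", and anything unmatched falls through to the catch-all prompt.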
At times, Webmasters regard some robots as a nuisance. A spider may uncover information the Webmaster would prefer to keep private; occasionally, a bot misbehaves as it crawls through a Web site, requesting URLs over and over and slowing the server's performance. As a result, search engine developers have formed standards on how robots should behave and how they can be excluded from Web sites.
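The best-known of those standards is the robots exclusion protocol: a Webmaster publishes a `robots.txt` file, and well-behaved bots read it before crawling. Python ships a parser for it, shown below against a hypothetical `robots.txt` (the file contents and URLs are invented for illustration).

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt fencing off a private directory from all bots.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A polite crawler checks before fetching each URL.
print(parser.can_fetch("MyBot", "http://example.com/private/data.html"))
print(parser.can_fetch("MyBot", "http://example.com/index.html"))
```

A crawler that consults `can_fetch` before every request avoids exactly the misbehavior described above.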
Links & References
- "The Spot for All Bots on the Net". BotSpot classifies bots and intelligent agents by subject. Most of the bots discussed at BotSpot can be downloaded and used on your own computer; some require a fee for permanent registration, while others are completely free.