Description:
Programming Bots, Spiders, and Intelligent Agents in Microsoft Visual C++ showcases the basics of creating autonomous Internet programs such as bots and agents. Besides offering an introduction to writing "intelligent" software, this book provides excellent background material on Internet programming in general. The first part of the book examines the differences between simple bots (such as Web crawlers) and more advanced, intelligent agents (which communicate more directly with users). The author discusses security and access issues for creating well-behaved bots that Webmasters will find more acceptable. Author David Pallmann's four powerful custom Microsoft Foundation Class (MFC) classes for building bots are at the heart of this book. After introducing them, he covers scheduling bot activity and logging the results, and includes samples for tracking changes to Web sites. Information on mapping Web sites with a real Web crawler follows, including a discussion of multithreading for improved performance. Pallmann offers plenty of material on design issues with agents, which should be as unobtrusive, reliable, and flexible as possible. Samples include an agent that monitors weather information on the Web and one that hunts down stock information. Final chapters look at some of the issues inherent in processing HTML programmatically. (The book has some good tips for managing ill-formed HTML pages.) There is also an excellent review of the basics of Internet programming for HTTP and FTP in MFC (and even a quick tour of Open Database Connectivity [ODBC] database programming). In all, this book provides some great sample code along with a thorough introduction to what goes into building today's intelligent Internet bots and agents. --Richard Dragan
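
As a rough illustration of the kind of MFC Internet programming the review mentions (this is not code from the book), the minimal sketch below fetches a page using MFC's standard WinInet wrapper classes; the URL and user-agent string are placeholders.

// Illustrative only: a bare-bones page fetch with MFC's WinInet wrapper
// classes. The URL and user-agent string are placeholders, not values
// taken from the book.
#include <afx.h>
#include <afxinet.h>
#include <tchar.h>

int main()
{
    // Initialize MFC for a console program
    if (!AfxWinInit(::GetModuleHandle(NULL), NULL, ::GetCommandLine(), 0))
        return 1;

    CInternetSession session(_T("SampleBot/1.0"));   // hypothetical agent name
    CString line, page;
    try
    {
        // OpenURL returns a CStdioFile* that can be read line by line
        CStdioFile* pFile = session.OpenURL(_T("http://example.com/"));
        while (pFile->ReadString(line))
            page += line + _T("\n");
        pFile->Close();
        delete pFile;
    }
    catch (CInternetException* e)
    {
        e->ReportError();   // explain why the request failed
        e->Delete();
    }
    session.Close();

    _tprintf(_T("Fetched %d characters\n"), page.GetLength());
    return 0;
}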