[ Hear the podcast: Audio | Transcript ]

The promise (and peril) of a world composed of bits is that it opens the door to automation: letting computers do on their own what we cannot, or prefer not to, do ourselves. Computer automation takes many shapes. It can be as basic as a program that spares us the tedium of deleting spam from our email, or as sophisticated as one that simulates human interaction to answer customer service questions.

Software programs that operate independently with little or no human intervention vary in form and function and go by several names — agents, bots, recommender systems, collaborative filters, avatars, chatterbots, and more — but all can be characterized generally as digital automata [au•tom•ata], to use the Latin word for self-operating machines. As remarkable as progress with digital automata has been, much more remains to be done. Now that computers communicate with each other across the Internet, opportunities to automate tasks abound. But to realize the full potential of automata on the Web, further work is needed to make information understandable to machines. This is the goal of the Semantic Web, an ambitious project to create machine-readable protocols that convey meaning about data (metadata and ontologies) and allow computers to act upon that data independently to perform tasks on our behalf.
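
The Semantic Web's core idea, expressing metadata as machine-readable statements, can be illustrated with a toy triple store. The sketch below is plain Python, not the actual RDF stack; the movie facts and the prefixed names (dc:, rdf:, schema:) are invented for illustration, though they echo real vocabularies.

```python
# Toy illustration: metadata expressed as subject-predicate-object
# triples, the core data model behind RDF, which lets a program act
# on data without human interpretation.

triples = [
    ("movie:Metropolis", "dc:creator", "Fritz Lang"),
    ("movie:Metropolis", "dc:date", "1927"),
    ("movie:Metropolis", "rdf:type", "schema:Movie"),
]

def query(triples, subject=None, predicate=None, obj=None):
    """Return every triple matching the given pattern (None = wildcard)."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# An agent can now answer "who created Metropolis?" mechanically:
print(query(triples, subject="movie:Metropolis", predicate="dc:creator"))
```

Because the meaning is carried by the predicates rather than by page layout, any program that understands the vocabulary can answer such questions without a human reading the page.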

Another front in the advancement of computer automation is autonomic computing. Here the goal is to develop a computer that is capable of managing itself to a significant degree without human intervention. Researchers envision a machine that could configure itself, discover and correct faults, monitor and optimize its own functioning, and identify threats and protect itself from attack, among other things.
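
The self-repair part of that vision reduces to a loop: run, monitor for faults, and act to restore health. The sketch below is a hypothetical simulation of that cycle; the Service class and its random failure model are invented, but the detect-and-repair pattern is the essence of the idea.

```python
import random

# Minimal sketch of an autonomic (self-managing) control loop.
# The managed element occasionally fails; the loop detects the
# fault and repairs it without human intervention.

class Service:
    def __init__(self):
        self.healthy = True

    def tick(self):
        # Simulate an occasional fault in the managed element.
        if random.random() < 0.3:
            self.healthy = False

    def restart(self):
        self.healthy = True

def autonomic_loop(service, cycles=10):
    """Run the service, self-repairing on each detected fault."""
    repairs = 0
    for _ in range(cycles):
        service.tick()              # the managed element runs
        if not service.healthy:     # monitor: detect the fault
            service.restart()       # execute: self-repair
            repairs += 1
    return repairs

random.seed(0)  # fixed seed so the simulation is reproducible
print(f"self-repairs performed: {autonomic_loop(Service())}")
```

Real autonomic systems elaborate every step of this loop (monitoring, analysis, planning, execution), but the division of labor between a managed element and its supervisor is the same.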

Not all automation is helpful. A troublesome trend in security is the proliferation of automated attack tools that can cause widespread harm to computer networks. Distributed denial of service (DDoS) attacks use automated techniques, such as worms, to compromise and seize control of thousands of computers, a network known as a botnet, and then use them to shut down targeted Web sites. Spambots troll the Internet to harvest email addresses from Web pages. Indeed, spam itself is made possible largely through the use of automated tools that send out millions of messages each day. Sometimes those messages carry viruses and worms, which replicate themselves and can very quickly infect millions of computers across a network. Worse still, automated attack tools can be simple enough for novices to use with devastating effect.


Digital automata can be grouped by the kinds of tasks they perform:
Collect: Programs that gather and store information in databases, such as search engine crawlers. [Googlebot]

Filter: Programs that sort and classify information based on a given set of criteria. [SpamAssassin]

Recommend: Programs that analyze information such as preferences to make recommendations. [MovieLens]

Monitor: Programs that scan information to detect changes and respond, for example, by issuing an alert. [Copernic]

Find: Programs that autonomously conduct searches based on a set of given criteria.

Process: Programs that perform regularly scheduled processes, such as with basic computer maintenance. [Macaroni]

Transact: Programs that perform automated transactions, such as with auction bidding. [Auction Sentry]

Simulate: Programs that simulate human activity such as speech, facial expressions, or motion. The representations may be realistic or stylized. [CyberBuddy, Lauren]

Learn: Programs that adapt their behavior as new information arrives, so that learning occurs over time. [Creature Labs]

Negotiate: Programs that negotiate with other programs in a multi-agent system.
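
As one concrete instance of the patterns above, the "Recommend" entry can be sketched as a tiny user-based collaborative filter. The users, movies, and ratings below are invented for illustration; real recommenders such as MovieLens operate on millions of ratings, but the similarity-weighted scoring is the same basic idea.

```python
# Toy collaborative filtering: recommend an unseen item favored by
# the users most similar to you. All data here is invented.

ratings = {
    "ann":  {"Alien": 5, "Brazil": 4, "Casablanca": 1},
    "bob":  {"Alien": 5, "Brazil": 5, "Duel": 4},
    "carl": {"Casablanca": 5, "Brazil": 1},
}

def similarity(a, b):
    """Agreement between two users on the items both have rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    return 1.0 / (1.0 + sum(abs(a[m] - b[m]) for m in common))

def recommend(user):
    """Suggest the unseen item best liked by the most similar users."""
    scores = {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        sim = similarity(ratings[user], theirs)
        for movie, score in theirs.items():
            if movie not in ratings[user]:
                scores[movie] = scores.get(movie, 0.0) + sim * score
    return max(scores, key=scores.get)

print(recommend("ann"))
```

Ann agrees closely with Bob (they rate Alien and Brazil almost identically), so Bob's high rating of a movie Ann has not seen carries the most weight in her recommendation.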

Things to read:

The Vision of Autonomic Computing
Jeffrey Kephart and David Chess | 01.06.2003

Patterns for e-Commerce Agent Architectures
Michael Weiss | 08.02.2001

Electronic Commerce Recommender Applications
J. Ben Schafer, Joseph A. Konstan and John Riedl | 07.18.2000

Hungry minds:

Know your Enemy: Tracking Botnets
Honeynet Project

Is There an Intelligent Agent in Your Future?
James Hendler

Agents and Other 'Intelligent Software' for E-Commerce
Maria Gini

Intelligent Agents (Chapter 2 from Artificial Intelligence: A Modern Approach)
Stuart Russell and Peter Norvig

Pricing, Agents, Perceived Value and the Internet
Phillip G. Bradford, Herbert E. Brown and Paula M. Saunders

Intelligent Agents: A Primer
Susan Feldman and Edmund Yu

Electronic Commerce Recommender Applications
J. Ben Schafer, Joseph A. Konstan, and John Riedl

Software Agents: An Overview
Hyacinth S. Nwana

Look it up:

Application programming interface (API)

Autonomic computing

Collaborative filtering

Internet bot

Recommender system

Semantic Web

Software agent

Places to visit:

CMU Software Agent Lab


IBM Autonomic Computing

UMBC AgentWeb

Previous topic:

Digital Markets

Course information:

© 2009 Michael Rappa