Rethinking Artificial Intelligence
Overview
In September 1997, the MIT Artificial Intelligence Laboratory and the MIT
Industrial Liaison Program sponsored a briefing for senior technical
management and corporate strategists on the future business impact of
accumulating Artificial Intelligence technology. This document is a
summary of the most salient points, as seen from the perspective of the
briefing chair, Patrick H. Winston.
The Big Messages
- AI is back. Development leaders from Microsoft, Netscape, General
Electric, and Disney discussed numerous examples of AI-enabled
products and product enhancements.
- Today's AI is about new ways of connecting people to
computers, people to knowledge, people to the physical world, and
people to people.
- Today's AI is enabled in part by technical advances and in part by
hardware and infrastructure advances. The World Wide Web is here and
truly gigantic high-resolution displays are coming.
- Today's AI invites investment in systems that:
- Save mountains of money, through applications such as resource allocation,
fraud detection, database mining, and training.
- Increase competitiveness, through applications that provide, for
example, on-screen help and low-cost rationale capture.
- Create new capabilities and new revenue streams, in areas such as medicine and
information access.
- AI is no longer a one-horse field. Newer technical ideas, with labels such as
agents, Bayes nets, neural nets, and genetic algorithms, combine with
older ideas, such as rule chaining, to form a powerful armamentarium.
All are important; none, by itself, is the answer.
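The rule chaining mentioned above can be illustrated with a minimal forward-chaining sketch: rules fire when all their antecedent facts are known, adding new facts until nothing changes. This is a deliberately tiny illustration of the general idea, not any vendor's implementation; the rules and facts are invented for the example.

```python
def forward_chain(rules, facts):
    # Each rule is (antecedents, conclusion); a rule fires when all
    # of its antecedents are already known facts.
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, conclusion in rules:
            if set(antecedents) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    (("has-fur", "gives-milk"), "mammal"),
    (("mammal", "eats-meat"), "carnivore"),
]
print(forward_chain(rules, {"has-fur", "gives-milk", "eats-meat"}))
# The second rule fires only after the first has added "mammal".
```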
Details
Session 1: The AI Business: Past, Present, and Future
During the 80s, AI experts developed systems for solving problems ranging
from chemical-plant optimization to oil-well log analysis. Some of these
systems were spectacular successes, with payback measured in hours. But in
spite of such early successes, Esther Dyson, editor of the influential
trade publication Release 1.0, predicted that AI would not become truly
important commercially until AI became embedded in mainstream,
strategically important systems like raisins in a loaf of raisin bread.
Time has proven Dyson's prediction correct. In the 90s, AI, as a field, is
becoming more important as emphasis shifts away from replacing expensive
human experts with stand-alone expert systems toward mainstream computing
systems that create strategic advantage. Accordingly, many of today's AI
systems are connected to large databases, deal with legacy data, talk to
networks, handle noise and data corruption with style and grace, are
implemented in popular languages, and run on standard operating systems.
Moreover, human users usually are important contributors to the total
solution.
In this session, the speakers explained how the shifted emphasis has put
AI to work in such industries as defense, transportation, manufacturing, and entertainment.
Patrick Winston
Former Director, AI Laboratory and
Ascent Technology
What I learned about business after I thought I knew everything.
- Leading edge technology buyers are more excited by new revenue than by
saving money.
- Hence, the commercial value of AI lies in the direction of new revenue
building rather than replacing people.
Philip Brou
Ascent Technology
Transportation Industry
Situation assessment, allocating resources, and the application that a
former DARPA director said "justified much of the money spent on AI
research."
- AI makes it possible for organizations to make better decisions.
- Hence, AI can help save large amounts of money in the world of complex situation
assessment and expensive resource allocation.
- AI is often a small part, but certainly an essential part, of the total solution to
the customer's problem.
Joseph Mundy
General Electric
Manufacturing Industry
How computer vision is developed and applied for a broad range of
applications and what constitutes a viable application.
- Applications of computer vision must evolve to mirror company
diversity. General Electric, like many large companies, has become
extremely diverse, with interests not only in manufacturing but also
in broadcasting, medical imaging, and financial services.
- Government-funded research through DARPA, NIST, and other agencies has
become an important factor in developing basic technology, including
migration to application prototypes.
- University collaborations also are an important factor in developing
applications of computer vision technology.
Eric Horvitz
Microsoft
Role of AI in the future of user-friendly software
- Personal computing will be significantly enhanced by intelligent systems.
- Research underway in user interfaces and operating systems will
use AI in significant roles.
Ted Dintersmith
Charles River Ventures
Venture Capital Industry
The AI Business as seen from today's venture capital community.
- AI actually can work, but applications have to be focused on solving
customer problems.
- Many important AI companies have been built.
- AI's early set-backs are not atypical.
- Rapid IT and AI advances will enable the creation of important new companies.
Session 2: Information Access and Presentation: Making People Smarter
Because better information means better decisions, decision makers
naturally want access to large quantities of information expressed in
diverse forms. But the World Wide Web, new sensor technology, and other
information sources have combined to create such quantity and diversity
that it has become increasingly difficult to provide decision makers with
the right information, at the right time, in the right quantity, in the
right form.
Of course, practical difficulty means research opportunity, and many
of today's AI research efforts focus on the development of systems that
anticipate information needs, find needed information, distill needed
information appropriately, and display distilled information in new ways.
Some such systems help decision makers locate and query information
sources---human or computer---via the World Wide Web. Other systems
distill tidal waves of information into simple presentations that engage
human problem-solving capabilities. Still other systems bring the
computers into our world, so that we can dispense with the small screens,
awkward keyboards, and distracting mice that the computers of today insist
we use.
In this session, the speakers explained how AI is shaping the
future via AI-enabled information access, AI-enabled human-computer
interaction, and AI-driven advances in interface infrastructure.
How vision research led to a system that enables surgeons to do their work
in one-third less time, with great benefit to the patient, and to undertake
operations that would have been too risky otherwise.
- In medical applications, computer-enabled overlays can increase the
power of human vision, reducing time and cost and enabling more
accurate, lower-risk work.
- In medical applications, computer-enabled overlays can improve care
while reducing costs, thus providing a competitive edge to companies
that provide surgical assistance services, clinical evaluation
services, and radiation treatment delivery.
- Computer-enabled overlays for surgeons provide an existence proof that
physical work can be made more productive by enhanced reality
technology.
Thomas Knight
MIT and various spinoffs
The Human-Computer Interaction Project
Creating the Infrastructure: Wall-sized displays.
- New display technology will be an enabler for all sorts of new
business and entertainment applications.
- Essentially unlimited amounts of computation are almost free.
Thus, engineers should look for ways to use computation to turn an
expensive problem into a problem readily solved, inexpensively, by
large amounts of computation.
How hundreds of thousands of Web users access text, pictures,
images, maps, tables, video, and everything else.
- Without access technology, much online information might as well be
in a black hole.
- Today's technology can take us a great leap forward from systems
limited to keyword analysis.
- The great leap forward enables new business opportunities for
companies that provide, for example, tailored news services, help desk
services, and access to corporate information ranging from benefits
policies to material normally distributed only in annual reports.
Ramanathan Guha
Netscape
Role of AI in the future of information access.
- The World Wide Web has provided access to so much knowledge that people
need help in organizing increasingly large personal information libraries.
- To provide that help, the creators of aggregation tools must take
advantage of AI work in the area of knowledge representation.
- Knowledge representation is not used to mimic human reasoning, but to
provide a robust and extensible framework with which the user's world
can be modeled.
Marc Raibert
Formerly MIT and now
Boston Dynamics
Simulation systems that enable medical students and doctors
to feel what it is like to suture vessels and learn new procedures.
- Simulated reality is a reality.
- Virtual worlds are not just for fun and games---they are hugely useful
for all sorts of training, especially training in manipulative skills.
- When developing cutting-edge applications, it is imperative to keep
the number of high hurdles low.
Howard Shrobe
MIT
The Intelligent Room: a basis for intelligent collaborative problem
solving
- Intelligent systems for Human Computer Interaction (HCI) will enter
into the environment of knowledge workers as capable assistants who
respond to problems both by presenting relevant information and, more
significantly by finding other members of the organization who possess
true expertise relevant to the problem at hand.
- Intelligent HCI systems will not only establish but also support the
interactions between the collaborating problem solvers; in particular,
they will have models of interaction styles relevant to different
stages of a project and will help at each stage to facilitate
interactions with the greatest likelihood of benefit.
- Intelligent HCI systems will lead to a highly fluid organization in
which teams can be established and disbanded as needed. Valuable
expertise will be brought to bear on a problem as needed but only as
long as needed.
Session 3: Beyond Expert Systems: Making Computers Smart Enough
The rise of rule-based expert systems in the 1980s was predicated on the
idea that computers could do what human experts could do, only less
expensively. Those interested in trying out the then-new expert-system
technology were told that the first problem tackled should be doable by a
person in more than an hour and less than a week.
Today, the emphasis is not on doing what people do. Instead, the
emphasis is on exploiting opportunities to do tasks that people cannot do
alone.
In this session, the speakers provided examples of how AI can, in fact,
do what people cannot do alone, visiting a variety of applications, with
goals that include digging regularity out of data in the search for new
pharmaceuticals, capturing design rationale and exploiting captured
rationale to improve product design, and working through tediously complex
calculations to better guess what a computer user needs to know.
Rodney Brooks
Director, AI Laboratory and
IS Robotics
Building robots: From theories of intelligence to cleaning up land mines
and exploring the surface of Mars.
- Embodied AI systems can do useful physical work today in places where
it is impossible to put people or teleoperated machines.
- Many AI technologies can effectively run on the commodity microprocessors
that are today routinely embedded in consumer products, leading to
enhanced capabilities with hands-free, brains-free user interfaces.
David Waltz
NEC Research Institute
Database Mining: Experiences and Potential
Why database mining is important, and experiences in introducing
data-mining technologies.
- AI research has developed new ways of analyzing heaps of data that
complement traditional statistical methods.
- Database mining can help to realize value from data assets.
- Database mining is compute-intensive, and can benefit from
ever-cheaper, powerful hardware.
- User interface issues---how to express goals and constraints,
and how to display high-dimensional results---are important areas
where progress is needed (and expected).
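As a concrete, deliberately tiny illustration of digging regularity out of heaps of data, the sketch below counts co-occurring item pairs across transaction records and keeps those that meet a support threshold, the first step of classic frequent-itemset mining. The data and threshold are invented for the example, not drawn from any system described in the briefing.

```python
from itertools import combinations
from collections import Counter

def frequent_pairs(transactions, min_support):
    # Count how often each pair of items appears together in a
    # transaction, then keep pairs meeting the support threshold.
    counts = Counter()
    for items in transactions:
        for pair in combinations(sorted(items), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

transactions = [
    {"bread", "milk"},
    {"bread", "milk", "eggs"},
    {"milk", "eggs"},
]
print(frequent_pairs(transactions, min_support=2))
```

Real database mining works at vastly larger scale, which is why, as noted above, it is compute-intensive and benefits from ever-cheaper hardware; the counting logic, however, is the same in spirit.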
Randall Davis
MIT and various spinoffs
Rationale capture: preserving the thought as well as the conclusion.
- Today's technology has reached the stage where more natural and
familiar modes of communication will soon make the keyboard obsolete.
- That technology can be used to capture much of the process
involved in design and decision making, as well as capturing the final
outcome.
- The additional information---about rationale, alternatives considered
and rejected, and the expertise underlying the result---is an
enormously valuable commodity, perhaps the most valuable asset of a
corporation in the information age.
David Kirsh
University of California, San Diego
Knowledge Management: Maximizing Intellectual Capital
- Best practices transfer well only when properly supported by help,
training, and experienced people.
- Technologies of collaboration are emerging---telepresence,
interactive learning environments, and virtual collaboratories.
- Reward examples of information sharing, else information hoarding is rational.
- To design workable knowledge management systems, make sure HCI design
is sensitive to work practice behavior.
David Barrett
Director, Walt Disney Imagineering
The role of entertainment as an
information-technology driver.
- Disney uses AI technology in many areas, such as in the development of
animal robots for use, for example, in 101 Dalmatians and in the
development of software-aided animation, as used, for example, in The
Lion King.
- AI can be used to make entertainment much more
interactive and tailored to the interests of individuals.
- AI-enabled development in Disney is growing rapidly.