For those wondering when artificial intelligence will truly take root, here's a bulletin: it already has.
Artificial intelligence is now a regular academic discipline. It is already embedded in many everyday products. And it helps businesses sort through and make sense of huge databases.
Even so, a controversy erupts with each step toward the day when machines might be said to surpass humans in intelligence -- a day that some say will trumpet progress for humanity, but that others say will court disaster.
"The concept scares people," said Jordan B. Pollack, a Brandeis University researcher who found himself in the limelight last year as co-creator of the first machine to design and manufacture other machines with virtually no human help.
But Mr. Pollack knows better than most that although yesterday's release of the futuristic film "A.I." is likely to stir up such fears, artificial intelligence has already seeped into many corners of daily life.
Last year, Mr. Pollack branched out from his research roots to found Thinmail, a venture that provides users of wireless devices with an intelligent electronic assistant capable of tasks like translating documents into simple text and diverting bulky attachments to fax machines.
Thinmail is just one of countless businesses that use machine intelligence, whether to guide missiles, detect credit card fraud, diagnose medical problems, or make toys more entertaining. Automated money management and trading systems manage an estimated $1 trillion in pension funds and other investments. While few people see any connection between all this and the machines Mr. Pollack is researching in his lab, much less the robots of science fiction, their variety and the pace of their development are clearly shaping social and economic life.
"An A.T.M. is not very intelligent, but it puts bank tellers out of work," Mr. Pollack said. "It earns a living."
Just how much of a living is anybody's guess. "I haven't seen an estimate of the market value of A.I. products in years because it's become part of the landscape," said Curt Hall, a software analyst with the Cutter Consortium, a research firm in Arlington, Mass., who has followed the technology since the 1980's.
The numbers are elusive in part because artificial intelligence is spread across a variety of disciplines that overlap and, in some cases, start from conflicting premises about how humans think. The capabilities they pursue include understanding and manipulating language; making sense of what can be seen, heard or felt; finding useful patterns in data; and drawing conclusions based on rules and experience. Others include responding to environmental changes without human intervention and evolving by selecting the best results from random mutation.
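The last capability mentioned, evolving by keeping the best results of random mutation, can be illustrated with a deliberately tiny sketch. The target function, mutation size, and parameters below are invented for illustration and are not drawn from any system described in the article.

```python
import random

def fitness(x):
    # A toy score that peaks at x = 3; evolution should converge there.
    return -(x - 3.0) ** 2

def evolve(generations=500, seed=42):
    """Keep one candidate, mutate it randomly, and retain the
    mutation only when it scores better -- selection in miniature."""
    rng = random.Random(seed)
    best = rng.uniform(-10, 10)
    for _ in range(generations):
        candidate = best + rng.gauss(0, 0.5)   # random mutation
        if fitness(candidate) > fitness(best):  # keep only improvements
            best = candidate
    return best

print(evolve())  # converges near the optimum at 3.0
```

Real evolutionary systems, like the robot-designing machines in Mr. Pollack's lab, maintain whole populations of candidates rather than a single one, but the mutate-and-select loop is the same idea.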
No product puts everything in one package the way science fiction repeatedly envisions it, but researchers are increasingly stacking several of them together, said Joseph Sirosh, executive director of the Advanced Technology Solutions.
Most computer experts expect public fascination with artificial intelligence to surge with the fanfare surrounding the release of "A.I.: Artificial Intelligence," the sci-fi film vision of Stanley Kubrick and Steven Spielberg that some have described as Pinocchio meets robotics. Some suspect the impact will be comparable to that of 1968, when Kubrick's "2001: A Space Odyssey" made HAL the world-famous symbol of deadly thinking machines.
Today's theatergoers will emerge with a reality check all around them, in the form of numerous university programs that teach artificial intelligence and the many businesses that embed such concepts in their products. But the artificial-intelligence world is so fragmented that some experts fear the film will leave many people with mistaken notions that could slow development.
"My fear is that the movie will make the subject too cute," said Ronald R. Yager, director of the Machine Intelligence Institute at Iona College. "Serious people would become afraid to associate themselves with the technology."
Similarly, some vendors of intelligent software fear that an explosion of interest could produce a flood of shaky business schemes and products dressed up as artificial intelligence. "When you get the circus, you get the clowns," said Konrad Feldman, head of American operations of Searchspace, a British vendor of multimillion-dollar software agents that continually examine databases and online activity.
Others are more hopeful, suggesting that "A.I." might awaken venture capitalists to the commercial potential of research projects in controversial areas like the emotional dimensions of machine intelligence. The film asks what would become of a childlike robot programmed to love a human mother. As unnerving as the results depicted are for both the robots and humans, researchers said "A.I." could build support for today's more mundane goals of using programmed emotional capabilities to make Web sites, tutorial software and products like cars more responsive and engaging.
"The movie could propel what I'm doing at an exponential rate," said Cindy Mason, a researcher at the University of California at Berkeley who has been developing programming techniques to represent attitudes, moods, temperament and other emotional states.
Many researchers and entrepreneurs with artificial-intelligence products say they hope the movie will be the occasion for a national crash course on how far the technology has come. The technology had a notably rocky commercial debut in the 1970's and 1980's. Fortunes were invested and lost in robotics, in machine vision systems and in software known as expert systems that tried to reduce human expertise to collections of rules that machines could follow.
In some cases, like Digital Equipment's software package XCON (for expert configurer), which helped customers choose among many options for computer systems, programs initially hailed as great successes proved to be embarrassingly limited and expensive as the number of rules they juggled swelled. Automakers were dismayed to discover that expert systems they developed to help manufacture cars had to go through extensive overhauls every time the models changed.
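The expert-system approach those companies bet on can be sketched in a few lines: facts plus if-then rules, applied repeatedly until no new conclusions emerge. The configuration rules below are invented for illustration and do not reproduce XCON's actual rule base, which grew to thousands of rules, which is precisely why maintenance became so costly.

```python
# Each rule: if all conditions are among the known facts,
# add the conclusion as a new fact (forward chaining).
RULES = [
    ({"needs_graphics"}, "add_video_card"),
    ({"add_video_card"}, "add_larger_power_supply"),
    ({"needs_storage", "add_larger_power_supply"}, "add_second_disk"),
]

def forward_chain(facts):
    """Apply rules until a full pass adds nothing new."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(sorted(forward_chain({"needs_graphics", "needs_storage"})))
# prints ['add_larger_power_supply', 'add_second_disk',
#         'add_video_card', 'needs_graphics', 'needs_storage']
```

The brittleness the automakers hit is visible even here: change the product line, and every rule touching the old options must be rewritten by hand.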
The Nobel Prize-winning economist Herbert Simon predicted in 1965 that by 1985 "machines will be capable of doing any work man can do." When that year rolled around, though, a period of diminished expectations known as "A.I.'s winter" had set in. Several start-ups failed, and some big corporations reduced their programs.
Some technology managers still associate artificial intelligence with the hype of that era, said Steven A. Ward, founder and chief executive of the Ward Systems Group in Frederick, Md. Ward Systems began marketing a form of artificial intelligence software known as neural networks in 1988. Such products, which try to mimic human learning, make projections about, say, how markets will move or when a manufacturing process will break down.
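A neural network "learns" by nudging numeric weights until its outputs match training examples. The single artificial neuron below, trained by gradient descent to reproduce the logical OR function, is a deliberately tiny stand-in for the multi-layer products Ward Systems sells; the data, learning rate, and epoch count are illustrative choices, not anything from their software.

```python
import math
import random

# Training examples: inputs and the target output (logical OR).
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(epochs=5000, lr=1.0, seed=0):
    """Fit one neuron's weights and bias by squared-error
    gradient descent over the training data."""
    rng = random.Random(seed)
    w = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in DATA:
            out = sigmoid(w[0] * x1 + w[1] * x2 + b)
            grad = (out - target) * out * (1 - out)  # error through the sigmoid
            w[0] -= lr * grad * x1
            w[1] -= lr * grad * x2
            b -= lr * grad
    return w, b

w, b = train()
for (x1, x2), target in DATA:
    print((x1, x2), round(sigmoid(w[0] * x1 + w[1] * x2 + b)))
```

This is also why Mr. Ward's alternative label fits: the trained model really is just a multivariable nonlinear function fitted to data.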
"When engineers who want to buy the product tell us that terms like A.I. and neural network are raising red flags with their superiors, we tell them to call it multivariable nonlinear modeling," Mr. Ward said. "Their superiors won't have the foggiest notion of what it is, but it sounds traditional."
On the other hand, a growing number of businesses, led by the video game industry, appear to view the use of artificial intelligence as a selling point. After all, their mostly young customers have no memories of the disappointments in the 1980's. Richard Stottler, whose San Mateo, Calif., company, Stottler Henke Associates, helps clients add artificial intelligence to their products, said that a marriage counseling firm, a company that reviews highway designs and a trouble-shooting service for computer network operators all plan to stress this feature in their marketing.
Veterans of the long push to expand and commercialize machine intelligence say that efforts to market it have often been confounded by experts' tendency to keep raising the threshold of what they consider true artificial intelligence.
Raymond Kurzweil, a researcher and entrepreneur whose involvement began when he was a teenager in the 1960's, said, "Once a technique works, it's no longer considered A.I."