Interest in artificial intelligence (AI) as it relates to IT seems to be growing. Back in December, Network World published an article titled National Science Foundation commands artificial intelligence revolution. Apparently, NSF funds AI research to the tune of $90 million annually as part of its Information and Intelligent Systems (IIS) grant program. Researchers can apply for as little as $500,000 or as much as $3 million for projects related to AI, robotics, machine learning, and other key areas.
In January, Computerworld followed suit with an article entitled Future Watch: A.I. comes of age. The article includes some fascinating information about robotics and machine learning. What is most interesting to me is that it describes AI as able to solve real-world problems by analyzing complex input and learning to adapt accordingly. I was intrigued because, while AI remains a somewhat esoteric subject in the non-IT world, years ago enterprise management products were among the first AI products to become commercially viable.
In the Beginning
Starting with the original network management tools made possible by the standardization of the Simple Network Management Protocol (SNMP) in the 1980s, up to and including today's sophisticated service-oriented architecture (SOA) products, management solutions were among the earliest commercial offerings to productize AI. Admittedly, early products were primarily static systems programmed to display errors based on SNMP traps and, at most, to perform Y in response to X. Very soon, however, they incorporated correlation capabilities with the innate intelligence to isolate the most likely cause of a network disruption and to notify support specialists about problems.
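The early "perform Y in response to X" model can be pictured as little more than a lookup table from trap identifiers to canned responses. The sketch below uses the standard SNMPv2 generic trap OIDs (linkDown, linkUp, coldStart); the response strings are illustrative placeholders, not drawn from any actual product.

```python
# Static rule table: SNMP trap OID -> canned response.
# Trap OIDs are the standard SNMPv2 generic traps; the actions are
# hypothetical examples of early, non-adaptive management behavior.
STATIC_RULES = {
    "1.3.6.1.6.3.1.1.5.3": "notify: interface down on reporting device",   # linkDown
    "1.3.6.1.6.3.1.1.5.4": "clear: interface restored",                    # linkUp
    "1.3.6.1.6.3.1.1.5.1": "notify: device cold start (possible reboot)",  # coldStart
}

def handle_trap(trap_oid: str) -> str:
    """Look up a fixed response for a trap; no learning or correlation."""
    return STATIC_RULES.get(trap_oid, "log: unrecognized trap " + trap_oid)

print(handle_trap("1.3.6.1.6.3.1.1.5.3"))  # notify: interface down on reporting device
```

Everything here is hard-coded: an unlisted trap is merely logged, and nothing about the system's behavior changes over time, which is exactly the limitation correlation and learning later addressed.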
Today’s solutions have evolved to the point where they feature much higher levels of machine learning and adaptation. Particularly in the past two years, that evolution seems to have accelerated, and many of today’s products are capable of supplementing the expertise of in-house technology specialists very effectively. I discussed some of these products and capabilities in last month’s article on predictive analytics. What does this mean for the CIO? Products with high levels of embedded expertise offer significant opportunities for cost efficiencies.
Products with adaptive capabilities “learn” the normal characteristics of an IT environment by “observing” infrastructure and applications during execution, and by collecting and storing data. This data, often as granular as time of day, day of the week, and time of the month, is used to analyze real-time execution in the context of historical trends. Based on complex algorithms that are often patented, some solutions can maintain rolling baselines, predict application issues before they become full-blown problems, or report whether an infrastructure change has made things better or worse.
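To make the rolling-baseline idea concrete, here is a minimal sketch that keeps a per-hour-of-day history of a metric and flags samples that stray too far from the learned norm. The window size, warm-up count, and three-standard-deviation tolerance are illustrative assumptions, not anyone's patented algorithm.

```python
from collections import defaultdict, deque

class RollingBaseline:
    """Learn per-hour norms for a metric and flag deviations."""

    def __init__(self, window: int = 30, tolerance: float = 3.0):
        self.tolerance = tolerance  # allowed deviation, in standard deviations
        # One bounded history per hour of day; old samples roll off.
        self.history = defaultdict(lambda: deque(maxlen=window))

    def observe(self, hour: int, value: float) -> bool:
        """Record a sample; return True if it deviates from the baseline."""
        bucket = self.history[hour]
        anomalous = False
        if len(bucket) >= 5:  # require a minimal baseline before judging
            mean = sum(bucket) / len(bucket)
            var = sum((v - mean) ** 2 for v in bucket) / len(bucket)
            std = var ** 0.5 or 1e-9  # avoid dividing judgment by zero spread
            anomalous = abs(value - mean) > self.tolerance * std
        bucket.append(value)
        return anomalous

baseline = RollingBaseline()
for day in range(10):               # normal 9 a.m. load hovers near 100
    baseline.observe(9, 100 + day % 3)
print(baseline.observe(9, 250))     # far outside the 9 a.m. norm -> True
```

Real products layer far more sophistication on top (seasonality, trend removal, correlation across metrics), but the core contrast with the static-rule era is visible even here: the threshold is learned from observation rather than programmed in advance.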
As higher levels of intelligence become embedded in enterprise management solutions over time, we will be able to offload a growing percentage of application management tasks to automation. In fact, as I talk with vendors about products and their capabilities, the picture I have in the back of my mind is of an autonomic computing system. These initiatives go by different names depending on the vendor: autonomic computing from IBM and others, Dynamic Systems from Microsoft, and the Adaptive Enterprise from HP. These terms refer to systems—and by systems I mean collections of technology, not stand-alone devices—that incorporate AI for self-management or to manage other systems, to self-configure or self-heal, or to perform other functions incorporating high levels of expertise.
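Self-healing behavior of this kind is commonly described as a monitor-analyze-plan-execute control loop. The sketch below shows that loop's shape; the metric, the policy threshold, and the "restart" remedy are hypothetical placeholders, not any vendor's implementation.

```python
# A sketch of the monitor/analyze/plan/execute loop behind autonomic,
# self-healing management. All names and values here are illustrative.

def monitor() -> dict:
    """Collect current state; here, a canned metric reading."""
    return {"service": "orders", "error_rate": 0.12}

def analyze(state: dict) -> bool:
    """Decide whether the state violates policy (threshold is illustrative)."""
    return state["error_rate"] > 0.05

def plan(state: dict) -> str:
    """Choose a remedy; a real system would weigh several options."""
    return f"restart {state['service']}"

def execute(action: str) -> str:
    """Carry out the remedy; here we only report what would be done."""
    return f"executed: {action}"

def control_loop() -> str:
    state = monitor()
    if analyze(state):
        return execute(plan(state))
    return "no action needed"

print(control_loop())  # executed: restart orders
```

What separates this from the old static-rule model is that each stage can be made adaptive independently: the analyze step can use learned baselines like the one above, and the plan step can choose among remedies based on what worked before.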