IBM Global Services
Executive Tek Report
March 22, 2002

Autonomic computing: Can it help to manage the increasingly complex information environment?

Executive Summary

Information systems that respond to changing conditions and regulate and repair themselves are the dream behind autonomic computing. This initiative depends on bringing together advanced technologies and management techniques in a holistic way. Autonomic computing isn't just an academic "grand challenge." It is an answer to a looming crisis in our ability to manage an ever more complex information environment, where the costs and scarcity of trained personnel threaten to halt progress. Autonomic computing can help ease this increasing complexity by moving it into systems and using system resources to manage the problem. To be successful, autonomic computing depends on pioneering work in architectural design, use of standards, agent software, context detection and much more.

Your firm has just signed a contract to provide retirement investment support for a global chemical company. There are four key elements in this contract: a portfolio of investment options, employee education, legal qualification and automated payroll services. Your portfolio is long established through partnerships with brokers worldwide. The biggest change will be a dramatic increase in the number of transactions. That can be handled by the FastFarm server farm, which will allocate additional resources as needed. You'll venture into new business areas with employee education, subcontracting with a specialist to provide both simulation and just-in-time (JIT) education. The JIT systems will interface directly with your client's normal knowledge management system, so employees will get advice on an investment bond as easily as they get information on how to store chemicals in the lab.
To fulfill your commitment on legal qualification, the application must run transparently, using both profiles of employees and links to government systems to ensure that restrictions, tax estimates and other obligations are accurate and localized. A key feature is the generation of alerts for employees when new regulations make a significant difference to their current investments. Finally, the automated payroll services in this contract allow employee-initiated changes to be handled directly. While the software handshakes do not present a challenge, this element has a high level of security controls and requires bonded experts to safeguard the combined operation.

How this happens.

Today, to troubleshoot an application installation, get different platforms to work together or manage storage resources, we usually rely on human intervention. The complexity created by increasing computing power, higher bandwidth communications, greater connectivity and a proliferation of devices is handled by highly trained experts who diagnose, design and build solutions. This is enormously expensive, ever more challenging and, ultimately, a losing game, as the number of people required to manage the system grows and grows. At current rates, it is estimated that, globally, demand for skilled IT workers will increase by over 100 percent in the next six years.

Putting the complexity into the system and letting it bear most of the burden of solving these problems is the idea behind autonomic computing. By taking advantage of new and emerging technologies and management approaches, the future may include systems that function well with limited intervention and that provide a simplified user experience. According to the paper Autonomic Computing: IBM's Perspective on the State of Information Technology, this depends on at least eight elements.
1. To be autonomic, a computing system needs to "know itself" and comprise components that also possess a system identity. To manage itself effectively, a system must first have a full inventory of all the software, hardware and connections it "owns," including those that it can rent or borrow. It needs specific policies on claiming and relinquishing resources. In short, it has to know its own boundaries. This is the biggest set of tasks currently handled by humans under contractual obligations, but tools can help. For instance, a software tool could be used to discover and inventory the applications and devices that are connected to the system. System management software (SMS) can provide realtime indications of the use of shared resources. Architecture tools can establish the roles of components and how they work together. Agents can constantly monitor the system to determine when new devices are linked, when applications are added or updated, and when users turn on new capabilities. In the future, sensors might be able to provide context by determining the real-world environment.

2. An autonomic computing system must configure and reconfigure itself under varying and unpredictable conditions. Every time something is added to or subtracted from a complex system, many elements of the system are affected. This is visible today when you install or uninstall a program, and the system asks what you want to do about updating or removing shared files. The dependencies and options multiply rapidly as systems become larger and more interconnected. Success here is highly dependent on sophisticated, flexible architectural designs, which may be explored and validated through simulation. Modular software, adapters, development of and adherence to industry standards, and use of Extensible Markup Language (XML) are also keys to success here. Many aspects of grid computing are directly applicable.

3. An autonomic computing system never settles for the status quo -- it always looks for ways to optimize its workings. Users need flexibility and dependability as their priorities shift. Business managers need to have confidence that they can make decisions based on the opportunities they face, rather than the limits of their IT environment. Systems should be as large as they need to be, and no bigger. Ultimately, optimization keeps the costs of hardware and maintenance from spiraling out of control and improves the speed and efficiency of the system. Here, management tools that handle capacity and demand and can quickly respond to emerging problems are essential. Control theory and decision support can also help make this level of optimization a reality. The most significant results in optimization have come from deep computing, where processing power and algorithms have been put to work to generate and choose options for the best use of resources.

4. An autonomic computing system must perform something akin to healing -- it must be able to recover from routine and extraordinary events that might cause some of its parts to malfunction. Rebooting is not the ideal solution to a malfunctioning system. Communications systems already detect trouble spots, route around them and activate other resources to maintain system reliability. Computing systems need to be able to do the same thing. Error-checking software already exists to detect, and even anticipate and correct, emerging problems, but much still needs to be done to automate root cause analysis. Once problems are discovered, there is a need to reallocate resources on the fly, take advantage of any redundancy, find capacity that can be rented or borrowed, and identify the best options for the system as a whole. Sensors and agents may help here. Actual self-repair is likely in the future.
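The self-optimization and self-healing described in elements 3 and 4 amount to a continuous monitor-analyze-act loop. The toy Python sketch below illustrates that loop; the Component and AutonomicManager classes and the load thresholds are hypothetical names chosen for illustration, not part of any IBM product. The manager restarts only the failed part rather than rebooting everything, and grows capacity only when load demands it.

```python
# Toy sketch of an autonomic monitor-analyze-act loop (elements 3 and 4).
# Component, AutonomicManager and the thresholds are hypothetical,
# illustrative names, not an actual IBM API.

class Component:
    """A managed resource that reports its health and load."""
    def __init__(self, name):
        self.name = name
        self.healthy = True
        self.load = 0.0  # fraction of capacity in use, 0.0-1.0

class AutonomicManager:
    """Heals failed components and grows capacity only as needed."""
    HIGH_LOAD = 0.8  # above this, allocate additional capacity
    LOW_LOAD = 0.2   # below this, capacity could be released

    def __init__(self, components):
        self.components = components
        self.log = []

    def monitor_and_act(self):
        # Iterate over a snapshot so newly added spares are not revisited.
        for c in list(self.components):
            if not c.healthy:
                # Self-healing: restart the one failed part, not the system.
                c.healthy = True
                c.load = 0.0
                self.log.append(f"restarted {c.name}")
            elif c.load > self.HIGH_LOAD:
                # Self-optimizing: be as large as needed, and no bigger.
                self.components.append(Component(c.name + "-spare"))
                self.log.append(f"added capacity for {c.name}")
            elif c.load < self.LOW_LOAD:
                self.log.append(f"{c.name} idle; capacity could be released")

# One pass of the loop over a degraded two-component system.
parts = [Component("db"), Component("web")]
parts[0].healthy = False  # simulate a malfunction
parts[1].load = 0.9       # simulate an overload
mgr = AutonomicManager(parts)
mgr.monitor_and_act()
```

In a real system, the monitoring step would draw on the agents and sensors discussed above, the actions would invoke actual provisioning and restart operations, and control theory would tune the thresholds rather than fixing them by hand.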
5. A virtual world is no less dangerous than a physical one, so an autonomic computing system must be an expert in self-protection. In today's connected world, the potential for system intrusion and virus attacks must be addressed. The threats are getting more sophisticated, spreading more quickly and causing greater damage. Some of the tools available to control access and to anticipate and deal with malefactors are immune systems, pattern recognition, social network analysis, visualization, single sign-on and biometrics. In addition, open source approaches may help by leveraging a large, interested community to find and close security holes before they do harm. Ultimately, security features that are built into the system from the beginning provide the most hope.

6. An autonomic computing system knows its environment and the context surrounding its activity, and acts accordingly. Two things are at work here. One context is the computing environment, which specifies which resources might become available and the extent of sharing that is permissible. The other is the user's context, which determines what information is of interest and how it might best be presented (such as on a PDA, cell phone or workstation). For example, a global positioning system (GPS) might make it clear that a temperature query pertaining to the inside of a house located in the U.S. is best answered in Fahrenheit. Technologies including grid computing, sensors, user profiles, portals, workflow and decision support help provide contextual information and make the information received valuable.

7. An autonomic computing system cannot exist in a hermetic environment. This is a fundamental concept. Today, information systems depend on connectivity for most of their value. There is an ecosystem of relationships, and new devices and applications must arrive adapted to this ecosystem, not with a cluster of needs that must be met before they can fit into the ecosystem.
Standards, nonproprietary software, open source software, grid computing, adapters and agents all have roles to play here.

8. Perhaps most critical for the user, an autonomic computing system will anticipate the optimized resources needed while keeping its complexity hidden. Instead of people having to learn and adapt to computing systems, the systems should learn and adapt to us. At a minimum, the design should take advantage of what is known about humans so the system can be intuitive. Raising the bar on ease-of-use needs to take place at every level, from the casual user all the way to an executive managing the information needs of dynamic partnering. Advanced interfaces, anticiparallelism (where the system performs, in a parallel fashion, anticipatory work to gauge what the user wants next), visualization and instinctual computing are just some of the technologies on the horizon to help achieve this.

What this means to you.

Autonomic computing can assist in reducing market concerns over security, reliability, service level agreements and return on investment. Most of the focus in autonomic computing to date has been centered on technology. Given the seriousness of the challenges, this is appropriate today. However, for the vision to become a reality, focus must shift to user needs and to the services that will be required as the vision is realized. Interfaces need to adapt more to user experience and context. IT planning and strategy must become more future-focused and work beyond the impediments to autonomic computing progress. IT experts
will need to handle new levels of flexibility and an environment with very different security risks. Outsourcing models will change, and disciplines -- such as IT architecture -- will need to be directed at solving problems for larger, more complex systems.

The importance of autonomic computing to the future of business is clear. Advances in autonomic computing can enable the evolution of computing to move forward faster than today, as IT people leave more and more time-consuming tasks to the system and concentrate their efforts on new development. Much of the growth and many of the products and services envisioned for the future can only be realized if the benefits autonomic computing promises arrive.

Two industries: Retail and Finance.

For the retail industry, autonomic computing may enable practical dynamic partnering. This would allow for rapid changes in relationships all along the supply chain, without losing data, so that operations can be optimized. In addition, as complexity is hidden, a richer array of information devices can be put to use to view, sample and purchase commodities. Both the contexts of transactions and the profiles of buyers have the potential to become more important components of the retail experience.

For financial services firms, autonomic computing may force more capability-based business designs by enabling rapid morphing of structure without impact to the infrastructure and -- particularly -- its security features. As confidence grows, firms may be able to reach beyond known sources of data, creating new categories of information that can be put to use within risk parameters. This will provide broader, more real-world views and help enable better decision-making.

Tek to watch

Open source
XML
Encryption
Agents
Sensors
Grid computing
Simulation
Pervasive computing
References

Horn, Paul. (October 2001). Why autonomic computing will reshape IT. Retrieved March 5, 2002, from http://www.control.com/sqposting_html?pid=1003266873

Meehan, Michael. (November 2001). Users eye self-healing systems management software. Computerworld. Retrieved March 5, 2002, from http://www.computerworld.com/itresources/rcstory/0,4167,key18_sto65313,00.html

Various. (November 2001). Autonomic computing: A grand challenge. EvoNet. Retrieved March 5, 2002, from http://evonet.dcs.napier.ac.uk/evoweb/news_events/news_features/nf_article165.html

Various. Autonomic computing: IBM's perspective on the state of information technology. Retrieved March 5, 2002, from http://www.research.ibm.com/autonomic/manifesto/

Other sites of interest

Deep computing: http://www.research.ibm.com/dci/
Grid computing: http://www.ibm.com/services/insights/etr_grid.html
O'Reilly and Associates, Inc.: http://www.xml.com/
Single sign-on: http://www.ibm.com/services/insights/etr_singlesign-on.html
About this publication

Executive Tek Report is a monthly publication intended as a heads-up on emerging technologies and business ideas. All the technological initiatives covered in Executive Tek Report have been extensively analyzed using a proprietary IBM methodology. This involves not only rating the technologies based on their functions and maturity, but also doing quantitative analysis of the social, user and business factors that are just as important to their ultimate adoption. From these data, the timing and importance of emerging technologies are determined. Barriers to adoption and hidden value are often revealed, and what is learned is viewed within the context of five technical themes that are driving change:

Knowledge Management: capturing a company's collective expertise wherever it resides -- in databases, paper, people's minds -- and distributing it to where it can produce the big payoffs

Pervasive Computing: combining communications technologies and an array of computing devices (including PDAs, laptops, pagers and servers) to allow users continual access to data, communications and information services

Realtime: "a sense of ultracompressed time and foreshortened horizons, [a result of technology] compressing to zero the time it takes to get and use information, to learn, to make decisions, to initiate action, to deploy resources, to innovate" (Regis McKenna, Real Time, Harvard Business School Publishing, 1997)

Ease-of-Use: using user-centric design to make the experience with IT intuitive, less painful and possibly fun

Deep Computing: using unprecedented processing power, advanced software and sophisticated algorithms to solve complex problems and derive knowledge from vast amounts of data

This analysis is used to form the explanations, projections and discussions in each Executive Tek Report issue so that you not only find out what technologies are emerging, but how and why they'll make a difference to your business.
If you would like to explore how IBM can help you take advantage of these new concepts and ideas, please contact us at insights@us.ibm.com. To browse through other resources for business executives, please visit: ibm.com/services/insights

Executive Tek Report is written by Peter Andrews, an IBM emerging technology analyst, and is published as a service of IBM Corporation.

Copyright 1999-2002 IBM Corporation. All rights reserved. IBM and the e-business logo are trademarks or registered trademarks of International Business Machines Corporation in the United States, other countries, or both. Other company, product and service names may be trademarks or service marks of others. References in this publication to IBM products and services do not imply that IBM intends to make them available in all countries in which IBM operates.

G510-1659-00