Autonomic Computing in Today's Computing Landscape

Asst. Prof. Dr. Noor D. Al Shakarchy
Asst. Prof. Dr. Hiba Jabbar Aleqabie
Computer Science and Information Technology

Autonomic Computing
Autonomic computing is a computing model in which computer systems and software applications are designed to self-manage, self-configure, self-heal, and self-optimize with minimal human intervention. It seeks to minimize the need for human supervision by building systems that can operate autonomously and adapt to changing circumstances. The term "autonomic" is borrowed from the human body's autonomic nervous system, which regulates involuntary processes such as breathing and heart rate without conscious effort; autonomic computing aims to create systems that manage themselves in much the same way.
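This self-managing behaviour is commonly structured as a closed control loop; IBM's autonomic computing blueprint describes it as the MAPE-K loop (Monitor, Analyze, Plan, Execute, over shared Knowledge). The following Python sketch is a minimal illustration of that loop; the simulated CPU metric, the threshold, and the "add capacity" action are assumptions made for this example, not part of any real product.

import random
import time

class AutonomicManager:
    """A minimal MAPE-K control loop. All names and thresholds here
    are illustrative; a real manager would read live sensors and
    drive real effectors."""

    def __init__(self):
        # Shared knowledge base used by every phase of the loop.
        self.knowledge = {"cpu_limit": 0.80, "history": []}

    def monitor(self):
        # Monitor: collect metrics (simulated CPU utilization here).
        return {"cpu": random.uniform(0.0, 1.0)}

    def analyze(self, metrics):
        # Analyze: compare observations against the knowledge base.
        self.knowledge["history"].append(metrics)
        return metrics["cpu"] > self.knowledge["cpu_limit"]

    def plan(self, overloaded):
        # Plan: choose a corrective action for the detected symptom.
        return "add_capacity" if overloaded else "no_action"

    def execute(self, action):
        # Execute: apply the plan through an effector (simulated).
        if action == "add_capacity":
            print("Provisioning an extra worker (simulated).")

    def run(self, cycles=5, interval=0.5):
        for _ in range(cycles):
            self.execute(self.plan(self.analyze(self.monitor())))
            time.sleep(interval)

AutonomicManager().run()

The four phases map directly onto the self-* properties listed in the next section: the analyze/plan pair is where self-optimization, self-healing, and self-protection decisions are made.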
Characteristics of Autonomic Computing
The main characteristics and features of autonomic computing include:
Self-configuration: Autonomic computing systems can configure themselves automatically based on changing conditions or requirements without human intervention.
Self-optimization: Autonomic computing systems can continuously monitor their performance and adjust their behaviour to optimize their efficiency, speed, and resource utilization.
Self-healing: Autonomic computing systems can detect and diagnose faults or errors in their operation and take corrective actions to restore normal operation without human intervention (see the sketch after this list).
Self-protection: Autonomic computing systems can detect and respond to security threats or attacks automatically without human intervention.
Adaptability: Autonomic computing systems can adapt to changing environments or conditions, such as changes in user behaviour, system load, or network topology.
Resilience: Autonomic computing systems are designed to be resilient to failures, errors, and attacks and can recover quickly from disruptions.
Scalability: Autonomic computing systems can scale their resources up or down automatically to handle changing workloads or demands.
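To make the self-healing property concrete, here is a minimal Python sketch of a supervisor that notices a crashed worker process and restarts it a bounded number of times. The worker, its injected fault, and the restart policy are all illustrative assumptions.

import time
from multiprocessing import Process

def worker():
    """Stand-in workload; a real service would run here."""
    time.sleep(2)
    raise RuntimeError("simulated fault")  # fault injected for the demo

def supervise(max_restarts=3):
    # Self-healing loop: watch the worker, detect abnormal exits,
    # and restart it up to max_restarts times.
    restarts = 0
    proc = Process(target=worker)
    proc.start()
    while True:
        proc.join(timeout=1.0)      # poll the worker once per second
        if proc.is_alive():
            continue                # worker is healthy, keep watching
        if proc.exitcode == 0:
            print("Worker finished cleanly.")
            return
        if restarts >= max_restarts:
            print(f"Giving up after {restarts} restarts.")
            return
        restarts += 1
        print(f"Worker failed (exit code {proc.exitcode}); restart #{restarts}")
        proc = Process(target=worker)
        proc.start()

if __name__ == "__main__":
    supervise()

A production system would add diagnosis and escalation (alerting a human after repeated failures) rather than blind restarts, but the detect-diagnose-repair shape of the loop is the same.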

Autonomic computing in today's computing landscape
There are many reasons why autonomic computing is needed in today's computing landscape, including:
Complexity: Computer systems and applications have become increasingly complex and difficult to manage, with multiple layers of software and hardware interacting in complex ways. Autonomic computing can help simplify the management of these systems by automating many of the tasks involved.
Scale: With the growth of cloud computing and other large-scale systems, managing computing resources and applications has become a daunting task. Autonomic computing can help manage these systems by automating many of the tasks involved, such as resource allocation, fault detection, and performance optimization (a minimal autoscaling sketch follows this list).
Efficiency: Traditional management approaches rely on human operators to manually perform tasks, such as provisioning resources or configuring systems. This can be time-consuming and error-prone. Autonomic computing can help improve efficiency by automating these tasks and reducing the need for human intervention.
Resilience: Computer systems and applications are vulnerable to failures, errors, and attacks. Autonomic computing can help improve resilience by detecting and responding to these issues automatically, without human intervention.
Security: Cybersecurity threats are becoming increasingly sophisticated, and traditional security approaches may not be sufficient to protect against them. Autonomic computing can help improve security by detecting and responding to threats automatically without human intervention.
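As a concrete illustration of the automated resource allocation mentioned under Scale above, the sketch below implements a simple proportional autoscaling rule in Python, similar in spirit to the rule used by Kubernetes' Horizontal Pod Autoscaler. The target utilization and replica bounds are illustrative defaults; a real controller would read live metrics and call a cloud provider's API instead of printing.

import math

def autoscale(current_replicas, cpu_utilization,
              target=0.60, min_replicas=1, max_replicas=10):
    """Return the desired replica count for the observed CPU load.

    Proportional rule: desired = ceil(current * observed / target),
    clamped to [min_replicas, max_replicas]. All defaults are
    illustrative, not values from any particular platform."""
    desired = math.ceil(current_replicas * cpu_utilization / target)
    return max(min_replicas, min(max_replicas, desired))

# Example: 4 replicas at 90% CPU against a 60% target -> scale to 6.
print(autoscale(4, 0.90))

Running such a rule in a periodic control loop gives the system the self-optimization and scalability properties described earlier, with no operator in the loop for routine load changes.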
Autonomic computing applications
Autonomic computing can be applied to any complex system or application that requires intelligent, adaptive, and self-managing capabilities. By automating management tasks and minimizing the need for human intervention, autonomic computing can improve system performance, reliability, and security, while reducing costs and complexity.
Autonomic computing can be applied to a wide range of systems and applications, including:
Cloud computing: Autonomic computing can be used to manage cloud-based infrastructure and services, such as virtual machines, storage, and network resources.
Internet of Things (IoT): Autonomic computing can be used to manage large-scale IoT networks, devices, and data, including device provisioning, data collection, and analysis.
Cybersecurity: Autonomic computing can be used to detect, prevent, and respond to security threats and attacks, including malware, phishing, and denial-of-service attacks.
Big data analytics: Autonomic computing can be used to manage and optimize large-scale data processing and analysis tasks, including data ingestion, processing, and storage.
Robotics: Autonomic computing can be used to manage and control robots and autonomous systems, including navigation, sensor data processing, and task scheduling.
Telecommunications: Autonomic computing can be used to manage and optimize telecommunications networks, including traffic management, resource allocation, and fault detection.
Early real-world uses of this idea include:
Client relationship management: A self-optimizing control loop can be used to handle the complexity that arises every day in lead execution.
Finance: Algorithms can manage and analyse financial errors in areas such as resource allocation, budgeting, revenue generation, and recurring costs.
Policy management: Autonomic computing assists in managing pricing policies via a variety of plans that take the marketplace, competition, and other factors into account.
Inventory management: Variables including stock levels, market data, and supplier information are examined regularly so that the system can adapt its ordering optimally (see the sketch after this list).
IT infrastructure management: Other applications of autonomic computing include server load balancing, task distribution, memory error correction, automatic software and driver updates, pre-failure warning, and automated device backup and recovery.
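To illustrate the self-adapting inventory loop mentioned above, here is a small Python sketch that recomputes a reorder point from recent demand and supplier lead time. The formula is the standard reorder-point rule with safety stock; the service factor and all data values are illustrative assumptions.

import statistics

def reorder_point(daily_demand, lead_time_days, service_factor=1.65):
    """Reorder point = expected demand over the lead time + safety stock.

    Safety stock is approximated as service_factor * demand standard
    deviation * sqrt(lead time); a factor of 1.65 corresponds roughly
    to a 95% service level. All inputs here are illustrative."""
    mean = statistics.mean(daily_demand)
    stdev = statistics.pstdev(daily_demand)
    safety_stock = service_factor * stdev * (lead_time_days ** 0.5)
    return mean * lead_time_days + safety_stock

# Re-run whenever new sales or supplier data arrives, so the
# threshold adapts itself to changing conditions.
demand_last_week = [12, 15, 9, 14, 11, 13, 10]
rop = reorder_point(demand_last_week, lead_time_days=3)
print(f"Reorder when stock falls below {rop:.0f} units")

An autonomic inventory manager would run this recalculation continuously and place orders automatically when stock crosses the threshold, embodying the self-configuration and self-optimization properties described earlier.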
The Future of Autonomic Computing
Computer-related problems are expanding along with the computing sector, and they are becoming increasingly dynamic. Uncertainty has grown to the point where expert assistance is routinely required, which in turn has increased the demand for self-managing systems that can carry out computational tasks without human help.
Autonomic computing architectures promise to automate the management of computing devices, and this capability could serve as the foundation for even more efficient cloud computing. Its applications include server load balancing, process allocation, memory error correction, power supply control, automatic software and driver updates, pre-failure alerts, and automated device backup and recovery.
In conclusion, autonomic computing is a computing model that allows computer systems and software applications to self-manage, self-configure, self-heal, and self-optimize. It is needed because of the complexity and scale of today's computing landscape, and it can be applied to a wide variety of systems and applications. It is expected to automate the management of computing devices and to serve as a foundation for cloud computing. Overall, the main goal of autonomic computing is to reduce the complexity and cost of managing computer systems and applications while improving their reliability, performance, and security.
