The Computing Continuum: Evolution and Implications
In today’s interconnected world, computing has become an integral part of our lives. From smartphones and laptops to cloud computing and artificial intelligence, we constantly rely on a wide range of computing devices and systems. The concept of the computing continuum captures this evolution of computing and the diverse paradigms that have emerged over time.
In the mid-20th century, integrated electronics became a fundamental part of technological innovation [1]. The dominant paradigm at the time was centralized computing: a single powerful machine performed all processing and storage tasks, and computing resources were finite and costly to expand.
The rise of personal computers in the 1970s brought computing to individuals, although at first PCs were standalone devices with limited connectivity. The spread of the Internet in the 1990s allowed computers to connect to one another, exchange data, and access shared resources. The proliferation of smartphones and mobile devices in the 2000s ushered in an era of pervasive connectivity, with devices constantly online. At the same time, cloud computing [2] emerged as a new service model in which resources could be accessed on demand over the Internet. Instead of owning physical infrastructure, organizations could rent access to vast pools of configurable computing resources: storage, servers, databases, software, and more.
Finally, the advent of low-cost sensors, microcontrollers, and wireless connectivity enabled the rise of the Internet of Things [3], with billions of devices now connected to the Internet and to each other. This rapid progression, driven by technological advances and falling costs, has transformed isolated personal computers into a globally connected ecosystem of intelligent devices. Meanwhile, cloud computing has evolved into an “as-a-service” model in which customers pay only for the resources they consume, scaling up or down as needed; its main benefits are scalability, accessibility, reliability, and reconfigurability of resources.
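As a rough illustration of the pay-per-use principle, the following sketch compares renting capacity on demand with provisioning for peak load. Every price and demand figure is an invented assumption for the example, not data from the cited sources.

# Back-of-the-envelope comparison of the pay-per-use model with owning fixed
# capacity. All prices and demand figures are invented for illustration.

hourly_demand = [2, 2, 3, 10, 10, 2]   # servers needed in each hour of a bursty workload
price_per_server_hour = 0.10           # assumed on-demand price in dollars

# Pay-per-use: rent exactly what each hour needs, nothing more.
on_demand_cost = sum(hourly_demand) * price_per_server_hour

# Fixed capacity: own enough servers for the peak hour and pay for them every hour.
peak = max(hourly_demand)
fixed_cost = peak * len(hourly_demand) * price_per_server_hour

print(f"on-demand: ${on_demand_cost:.2f}, peak provisioning: ${fixed_cost:.2f}")
# prints: on-demand: $2.90, peak provisioning: $6.00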
Ubiquitous computing [4] is a concept dating from the end of the last millennium. It envisions many types of computers being available anywhere at any time, often embedded in everyday objects in the physical world and therefore effectively invisible to end users.
Today we can identify two main computational elements, sketched in the example after this list:
- A powerful cloud, accessible at all times, which allows quick execution of computationally intensive tasks.
- A pervasive network of smart devices with low-power computing units, very close to end users.
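To make the split concrete, here is a minimal, purely illustrative sketch of how a device at the edge of the continuum might choose between running a task locally and offloading it to the cloud. All names, capacities, and thresholds are assumptions made for this example, not part of any particular platform.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    estimated_flops: float   # rough compute demand of the task
    payload_bytes: int       # data that would have to be uploaded if offloaded

LOCAL_FLOPS_PER_SEC = 1e9    # assumed capability of the on-device processor
UPLINK_BYTES_PER_SEC = 1e6   # assumed bandwidth of the link to the cloud

def place(task):
    """Return 'device' or 'cloud' depending on which tier finishes sooner."""
    local_time = task.estimated_flops / LOCAL_FLOPS_PER_SEC
    # Offloading pays an upload cost; cloud compute time is assumed negligible.
    cloud_time = task.payload_bytes / UPLINK_BYTES_PER_SEC
    return "device" if local_time <= cloud_time else "cloud"

tasks = [
    Task("read_sensor", estimated_flops=1e4, payload_bytes=512),
    Task("train_model", estimated_flops=1e13, payload_bytes=5_000_000),
]
for t in tasks:
    print(f"{t.name}: run on {place(t)}")
# prints: read_sensor: run on device
#         train_model: run on cloud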
Recent advances in edge computing and the Internet of Things have further expanded the continuum. Edge computing brings computational capabilities closer to data sources, while IoT connects billions of devices that generate massive amounts of data.
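One common consequence of placing compute near data sources is that raw IoT streams can be reduced at the edge before anything reaches the cloud. The sketch below shows this pattern under made-up assumptions; the window size and sensor values are invented for illustration.

from collections import deque
from statistics import mean

WINDOW = 60                  # assumed number of raw samples summarized per upload
readings = deque(maxlen=WINDOW)

def on_sensor_sample(value):
    """Buffer one raw sample; return a compact summary once the window is full."""
    readings.append(value)
    if len(readings) < WINDOW:
        return None
    summary = {"count": len(readings), "mean": mean(readings),
               "min": min(readings), "max": max(readings)}
    readings.clear()
    return summary           # only this small record is sent upstream to the cloud

# Example: 60 simulated temperature readings collapse into a single summary record.
for i in range(WINDOW):
    result = on_sensor_sample(20.0 + (i % 5) * 0.1)
print(result)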
The computing continuum enables seamless integration across devices and platforms. Users can accomplish ever more with their devices as computational power is continually added throughout the continuum, and emerging technologies such as artificial intelligence (AI), machine learning (ML), and quantum computing build on this evolution. As technology advances, the continuum will continue to expand, transforming our digital experiences.
Author: Riccardo Cavadini
References
[1] Moore, G. E. (1965). Cramming more components onto integrated circuits. Electronics, 38(8).
[2] Armbrust, M., Fox, A., Griffith, R., Joseph, A. D., Katz, R., Konwinski, A., … & Zaharia, M. (2010). A view of cloud computing. Communications of the ACM, 53(4), 50-58.
[3] Perera, C., Liu, C. H., Jayawardena, S., & Chen, M. (2014). A survey on internet of things from industrial market perspective. IEEE Access, 2, 1660-1679.
[4] Weiser, M. (1991). The Computer for the 21st Century. Scientific American, 265(3), 94-105.
Keywords
Computing continuum, edge computing, cloud computing, Internet of Things, ubiquitous computing