Understand the computing technology that will power a connected future
The explosive growth of the Internet of Things (IoT) in recent years has revolutionized virtually every area of technology. It has also driven a drastically increased demand for computing power, as traditional cloud computing proved insufficient in terms of bandwidth, latency, and privacy. Edge computing, in which data is processed at the edge of the network, closer to where it's generated, has emerged as an alternative which meets the new data needs of an increasingly connected world.
Edge Computing offers a thorough but accessible overview of this cutting-edge technology. Beginning with the fundamentals of edge computing, including its history, key characteristics, and use cases, it describes the architecture and infrastructure of edge computing and the hardware that enables it. The book also explores edge intelligence, where artificial intelligence is integrated into edge computing to enable smaller, faster, and more autonomous decision-making. The result is an essential tool for any researcher looking to understand this increasingly ubiquitous method for processing data.
Edge Computing readers will also find:
Edge Computing is ideal for students, professionals, and enthusiasts looking to understand one of technology's most exciting new paradigms.
Lanyu Xu, PhD, is Assistant Professor in the Department of Computer Science and Engineering, Oakland University, Michigan, where she leads the Edge Intelligence System Laboratory. Her research intersects edge computing and deep learning, emphasizing the development of efficient edge intelligence systems. Her work explores optimization frameworks, intelligent systems, and AI applications to address challenges in efficiency and real-world applicability of edge systems across various domains.
Weisong Shi, PhD, is an Alumni Distinguished Professor and Chair of the Department of Computer and Information Sciences at the University of Delaware, where he leads the Connected and Autonomous Research Laboratory. He is an internationally renowned expert in edge computing, autonomous driving, and connected health. His pioneering paper, "Edge Computing: Vision and Challenges," has been cited more than 8000 times in eight years. He is an IEEE Fellow.
What is edge computing? Why has it become so popular since it was proposed? How is edge computing related to IoT and cloud computing? In this chapter, we will answer these three questions by introducing the background, the evolutionary history, and the concept of edge computing.
To answer the questions posed in this chapter, let us go back to when edge computing was first proposed: the big data era, when the Internet of Things (IoT) and cloud computing were booming.
IoT technology [3] aims to connect physical objects to the Internet via agreed communication protocols, using technologies such as RFID (radio-frequency identification), wireless data communication, and GPS (the Global Positioning System). This enables information exchange for intelligent identification, positioning, tracking, monitoring, and management of Internet resources. IoT has expanded significantly with advances in computer and network communication technologies. It now encompasses the integration of almost all information technologies with computer and network technologies, enabling real-time data sharing between objects and intelligent real-time data collection, transmission, processing, and execution. The concept of "computer information perception without human intervention" has gradually been applied to fields such as wearable devices, smart homes, environmental sensing, intelligent transportation systems, and smart manufacturing [18, 36]. Key technologies involved in IoT include:
Later, with the rapid development of IoT and the widespread adoption of 4G/5G wireless networks, the era of the Internet of Everything (IoE) [11] arrived. Cisco introduced the concept of IoE in December 2012. It represents a new network architecture for future Internet connectivity and the evolution of IoT, enhancing the network's intelligent processing and security features. IoE employs a distributed structure, integrating application-centric networking, computing, and storage on a new platform. It is driven by IP connectivity, globally higher-bandwidth access, and IPv6, supporting hundreds of millions of edge terminals and devices connected to the Internet. Compared to IoT, IoE involves not only "thing-to-thing" connections but also a higher level of "human-to-thing" connectivity. Its distinguishing feature is that any "thing" will possess contextual awareness, enhanced computing capabilities, and sensing abilities.
By integrating humans and information into the Internet, the network will comprise billions or even trillions of connected nodes. The IoE is built on top of the physical network and enhances network intelligence to achieve integration, coordination, and personalization among the "things" on the Internet.
Application services based on the IoE platform require shorter response times and will generate large amounts of data involving personal privacy. For example, sensors and cameras installed on autonomous vehicles capture road condition information in real time; one car with five cameras can generate more than 24 terabytes (TB) of data per day [17]. According to the Insurance Institute for Highway Safety, there will be 3.5 million self-driving vehicles on U.S. roads by 2025 and 4.5 million by 2030 [21]. The Boeing 787 generates about 5 gigabytes (GB) of data per second and requires real-time processing of that data. In Beijing, China, the electric vehicle monitoring platform provides continuous real-time monitoring for 10,000 electric vehicles and forwards data to various enterprise platforms at a rate of one data point every 10 seconds per vehicle. In terms of public security, the United States has deployed over 30 million surveillance cameras, generating more than 4 billion hours of video data each week. China's "Skynet" surveillance network, used for crime prevention, has installed over 20 million high-definition surveillance cameras nationwide, monitoring and recording pedestrians and vehicles in real time.
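To put these figures in perspective, the back-of-envelope sketch below derives per-device rates from the numbers quoted above. It is an illustrative calculation only; the even split across cameras and the use of decimal terabytes are simplifying assumptions, not figures from the cited sources.

```python
# Back-of-envelope rates derived from the figures quoted above.
# The per-camera split and decimal terabytes are simplifying assumptions.

TB = 10**12  # decimal terabyte, in bytes

# Autonomous vehicle: five cameras, more than 24 TB per day [17]
car_bytes_per_day = 24 * TB
car_bytes_per_sec = car_bytes_per_day / 86_400      # ~278 MB/s for the whole car
per_camera_bytes_per_sec = car_bytes_per_sec / 5    # ~56 MB/s per camera (assumed even split)

# Beijing EV monitoring: 10,000 vehicles, one data point every 10 s per vehicle
ev_points_per_day = 10_000 * (86_400 // 10)         # 86,400,000 data points per day

# US surveillance: 30 million cameras, 4 billion hours of video per week
hours_per_camera_per_week = 4e9 / 30e6              # ~133 hours per camera per week

print(f"Per-car sensor stream: {car_bytes_per_sec / 1e6:,.0f} MB/s")
print(f"Per-camera stream:     {per_camera_bytes_per_sec / 1e6:,.0f} MB/s")
print(f"EV platform ingest:    {ev_points_per_day:,} data points/day")
print(f"Surveillance footage:  {hours_per_camera_per_week:.0f} h per camera per week")
```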
Since the concept was proposed in 2005, cloud computing has been widely adopted, changing how people work and live. SaaS (Software as a Service) is widely deployed in the data centers of major IT companies such as Google, Twitter, Facebook, and Baidu. The scalable infrastructures and processing engines that support cloud services, such as the Google File System (GFS), the MapReduce programming model, Hadoop (a distributed system developed by the Apache Foundation), and Spark (the in-memory computing framework designed by the AMPLab at the University of California, Berkeley), have significantly shaped application services. However, in the context of IoT and similar applications, data is geographically dispersed and demands shorter response times and stronger security. Although cloud computing provides an efficient platform for big data processing, network bandwidth growth cannot keep up with data growth. The cost of network bandwidth falls much more slowly than that of hardware resources such as CPU and memory, and the complexity of the network environment makes it difficult to significantly improve network latency. The traditional cloud computing model will therefore struggle to support IoE-based application services efficiently and in real time, and solutions are needed to address the bandwidth and latency bottlenecks.
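As a rough illustration of the bandwidth bottleneck, the sketch below estimates how long uploading one car-day of raw sensor data (the 24 TB figure from the earlier example) would take; the uplink speeds are illustrative assumptions, not measurements.

```python
# Time to move one car-day of raw sensor data (24 TB, from the example above)
# to the cloud over two assumed uplink speeds. Link speeds are illustrative
# assumptions, not measurements.

data_bits = 24 * 10**12 * 8  # 24 TB expressed in bits

for label, bits_per_sec in [("100 Mbit/s uplink", 100e6), ("1 Gbit/s uplink", 1e9)]:
    seconds = data_bits / bits_per_sec
    print(f"{label}: {seconds / 3600:,.1f} hours ({seconds / 86_400:,.1f} days)")

# Even at 1 Gbit/s the upload takes more than two days per day of driving,
# which is why processing data near its source becomes attractive.
```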
With the rapid development and widespread application of the IoE, edge devices are transitioning from primarily serving as data consumers to serving as both data producers and consumers. At the same time, network edge devices are becoming capable of using the collected real-time data for pattern recognition, predictive analysis or optimization, and intelligent processing. In the edge computing model, computing resources are closer to the data source, and network edge devices have sufficient computational power to process raw data locally and send only the results to the cloud computing center. The edge computing model not only reduces the bandwidth pressure of network transmission, speeding up data analysis and processing, but also lowers the risk of privacy leaks for sensitive terminal data.
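The following minimal sketch illustrates this division of labor: raw readings are aggregated on the edge device and only compact summaries are sent upstream. The names used here (sensor_stream, summarize, CLOUD_ENDPOINT) are hypothetical placeholders for illustration, not an API described in this book.

```python
# Minimal sketch of the edge pattern described above: raw readings are
# aggregated locally and only compact summaries are forwarded to the cloud.
# sensor_stream, summarize(), and CLOUD_ENDPOINT are hypothetical placeholders.
import json
import urllib.request
from statistics import mean

CLOUD_ENDPOINT = "https://cloud.example.com/ingest"  # placeholder URL


def summarize(window):
    """Reduce a window of raw readings to a small result record."""
    return {"count": len(window), "mean": mean(window), "max": max(window)}


def send_to_cloud(result):
    """Upload only the summary; the raw readings never leave the device."""
    payload = json.dumps(result).encode()
    request = urllib.request.Request(
        CLOUD_ENDPOINT, data=payload, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(request)  # would need a real endpoint to succeed


def run_edge_node(sensor_stream, window_size=100):
    """Consume raw readings at the edge and ship periodic summaries upstream."""
    window = []
    for reading in sensor_stream:  # raw data is processed where it is produced
        window.append(reading)
        if len(window) == window_size:
            send_to_cloud(summarize(window))
            window.clear()
```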
Currently, big data processing is shifting from the centralized processing era centered on cloud computing (we refer to the years from 2005 to 2015 as the centralized big data processing era) to the edge computing era centered on the IoE (we refer to it as the edge-based big data processing era). During the centralized big data processing era, the focus was on centralized storage and processing of big data, achieved by building cloud computing centers and leveraging their powerful computing capabilities to solve computational and storage issues centrally. In contrast, in the edge-based big data processing era, network edge devices generate massive amounts of real-time data. In 2018, Cisco's Global Cloud Index estimated that nearly 850 zettabytes (ZB) would be generated by all people, machines, and things by 2021. Only around 10% of this is classed as useful data, yet even that useful fraction is predicted to be about four times global data center traffic (21 ZB per year) [10]. From 2018 to 2023, the average number of devices owned per person worldwide increased from 2.4 to 3.6; in North America, one person owned eight devices on average in 2018 and 13 devices in 2023 [38]. According to Statista, the number of IoT devices connected to the network was 15.14 billion in 2023 and will reach 29.42 billion in 2030 [35]. This mismatch between data production and data consumption calls for an alternative to cloud-based data centers. Instead of relying purely on cloud computing, data can be stored, processed, and analyzed at the network edge. These edge devices will be deployed on edge computing platforms that support real-time data processing, providing users with numerous service or function interfaces, which users can invoke to obtain the...