Paradoxes of edge computing
More and more cities around the world are embracing the digital revolution and adopting new technologies to become smarter. At the Smart City Expo World Congress 2018 alone, over 700 cities presented projects to develop and improve smart city solutions.
Smart cities use data and technology to help improve the quality of life for their citizens. The arrival of fifth-generation wireless technology (5G), artificial intelligence (AI) and faster Internet of Things (IoT) connectivity promises to entice even more cities to embrace the power of digital data to improve people’s lives.
More cities will be enticed to embrace the digital revolution and adopt new technologies to improve people’s lives. Photo Credit: Seagate
Why edge computing is crucial to the development of smart cities
Fast-expanding IoT sensor networks, 5G’s blazing-fast data rates and last-mile bandwidth, and AI applications promise to help realise the dream of smart cities. And as new data types and sources continue to emerge in pursuit of the smart city vision, the global datasphere looks set to grow exponentially.
Global digital transformation is expected to generate up to 175 zettabytes of data by 2025, making it imperative to process data closer to its source (at the edge) so that smart city services can be delivered to citizens quickly.
But as the fourth industrial revolution unfolds at breakneck speed, misunderstandings can arise and cloud perceptions of edge computing.
Here are five myths about edge computing, explained and debunked to help you and other businesses deploy the technology more efficiently and ultimately, improve people’s lives.
Myth 1: The edge will eat the cloud
The cloud generally refers to centralised data centres used for storing, accessing and processing large amounts of data over the internet. Edge computing processes smaller-scale data near the edge of a network, closer to where the data is generated. In a nutshell, edge computing improves response times (lower latency) and saves bandwidth (more data can be carried over a network).
Research and advisory company Gartner predicts that by 2020, the number of connected devices will reach 20.8 billion, compared to 6.4 billion connected devices in 2016. Consumers have come to expect the best customer experience and fast responses to their issues from anyone (chatbots included), anywhere and at any time.
As edge computing allows the processing of information to be performed closer to the end-user (or device), some experts have predicted that edge computing will supplant the cloud.
In reality, the edge and cloud complement each other. Data will still be stored in a centralised cloud in an increasingly connected world. The edge provides an interim point for data transfer and computing to speed up the delivery of digital services. The cloud still works great for applications such as large-scale archiving (such as for traffic data or image libraries), application storage (such as for map applications and graphic-intensive games), and rapid prototyping (a method of quickly creating a scale model of a physical part).
An example of how cloud and edge support each other is the usage of smart sensors (IoT devices) in an intensive care unit (ICU) of a hospital. The sensors monitor acutely ill patients and alert doctors and nurses to changes in the patients’ conditions immediately. This is the result of edge computing at work. Meanwhile, the data is also fed into the cloud for long-term storage and additional analysis in order to provide further insights into a patient’s conditions. This is where cloud computing capabilities come into play.
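The ICU example above can be sketched in a few lines of code. This is a hypothetical illustration, not any vendor's implementation: the `EdgeNode` class, the heart-rate thresholds and the `cloud_queue` list (standing in for an upload pipeline to a real cloud service) are all assumptions made for the sake of the sketch.

```python
# Hypothetical sketch of the ICU example: an edge node checks each
# sensor reading immediately (the low-latency edge path), while every
# reading is also queued for the cloud for long-term storage and
# deeper analysis (the cloud path).

HEART_RATE_LIMITS = (40, 140)  # beats per minute treated as safe (illustrative)

class EdgeNode:
    def __init__(self):
        self.cloud_queue = []  # readings awaiting upload to the cloud

    def ingest(self, patient_id, heart_rate):
        """Process one reading at the edge; return an alert string if needed."""
        self.cloud_queue.append((patient_id, heart_rate))  # cloud path: keep everything
        low, high = HEART_RATE_LIMITS
        if heart_rate < low or heart_rate > high:          # edge path: react instantly
            return f"ALERT: patient {patient_id} heart rate {heart_rate}"
        return None

node = EdgeNode()
print(node.ingest("P-17", 72))   # within the safe range: no alert
print(node.ingest("P-17", 180))  # out of range: immediate alert at the edge
print(len(node.cloud_queue))     # both readings are still destined for the cloud
```

The point of the split is visible in the last line: the edge acts on each reading immediately, yet nothing is lost to the cloud's long-term record.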
The Takeaway: Understanding how cloud and edge can work together will help businesses find the right balance of being able to process data in real-time (edge) and access stored data remotely for analysis (cloud) to enhance products and services in the future.
Myth 2: There is only one edge
According to Seagate, “the edge” as a singular term refers to an ecosystem in which data is processed near where it’s created. It can be any point along a network where data is being created and processed, often next to the source device or end-user. For example, in the manufacturing sector an edge can be a machine on the factory floor that is sending signals to a larger network.
Hence, there can be more than one edge in any network.
If a factory owner wants to control a robotic arm (an edge) based on information from another machine, or an edge, these two edges can share data and communicate via the cloud. Data analysis will help improve the decision-making process to optimise production.
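The factory scenario above can be sketched as two edges exchanging results through the cloud. This is a simplified illustration under stated assumptions: the `Cloud` class is a stand-in for a real message broker or cloud data store, and the sensor/arm functions and topic names are invented for the example.

```python
# Hypothetical sketch of two edges coordinating via the cloud: a vision
# sensor (one edge) publishes an inspection result, and a robotic arm
# (another edge) reads it and makes a local, low-latency decision.

class Cloud:
    """Minimal shared store that relays messages between edges."""
    def __init__(self):
        self.messages = {}

    def publish(self, topic, value):
        self.messages[topic] = value

    def read(self, topic):
        return self.messages.get(topic)

def sensor_edge(cloud, part_id, defect_detected):
    # Edge 1: inspect the part locally, share only the result.
    cloud.publish(f"inspection/{part_id}", defect_detected)

def arm_edge(cloud, part_id):
    # Edge 2: act on the shared result without reprocessing raw data.
    return "divert" if cloud.read(f"inspection/{part_id}") else "pass"

cloud = Cloud()
sensor_edge(cloud, "part-42", defect_detected=True)
print(arm_edge(cloud, "part-42"))   # divert
sensor_edge(cloud, "part-43", defect_detected=False)
print(arm_edge(cloud, "part-43"))   # pass
```

Only the small inspection result crosses the network, while each edge keeps its raw data and its decision-making local, which is what keeps traffic and latency down as more edges are added.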
The Takeaway: As the number of sensors in smart cities continues to grow, more edges will help to reduce network traffic. By reducing latency with multiple edges, businesses can improve application performance and streamline efficiencies.
Myth 3: The edge is just a little cloud
Because an edge is a point that data flows to, and because storing and processing data (functions of the cloud) also happen there, some people confuse edges with clouds, referring to them as little clouds.
This is a misconception. The edge is not a little cloud.
Driven entirely by data, the edge is defined by use cases that produce and process data closer to the end-user. For example, autonomous cars require real-time processing of data at the edge to make almost instantaneous decisions, such as applying the brakes when a moving object is detected in the vehicle’s path.
Autonomous cars require real-time processing of data at the edge to make almost instantaneous decisions. Photo Credit: Seagate
Conversely, there is no edge when the car is entirely controlled by a driver, because data isn’t being processed in real time. Data such as mileage, fuel efficiency and the life cycle of car parts is simply recorded for the driver’s reference.
The Takeaway: Unlike the cloud, which is a centralised, general-purpose, hyperscale data centre hub, each edge focuses on solving a specific problem. The edge is a remote, lights-out, automated operation marked by physical proximity to the user. Businesses operating at the edge can also look forward to lower operating costs: reduced bandwidth requirements and lower data centre costs translate into smaller operational and data management expenses.
Myth 4: AI will replace jobs
The automation of tasks such as data entry, driving and packing, as well as the presence of human-like robots in museums and restaurants have given rise to concerns that robots are replacing humans in many jobs.
This is a myth. AI-powered robots have taken over mundane, repetitive tasks in some industries, but they have yet to replicate human expertise and experience.
While AI helps businesses make more accurate decisions via predictions, classifications and clustering, logic, meaning, judgement and empathy still require that human touch. Take the use of AI imaging in healthcare as an example: a chest X-ray application based on AI can detect diseases faster than radiologists, but AI cannot replace a radiologist’s job of observing and dealing with unusual cases, which takes experience and sometimes, even intuition.
The Takeaway: Automating parts of a job will increase workers’ productivity and the quality of their output by complementing their skills with machines and computers, freeing them to focus on higher-value and more satisfying aspects of their jobs.
Myth 5: 5G is all about speed
With 5G set to be the enabler of more impressive and speedier IoT applications, it’s easy to think of 5G as being fast. Period.
However, 5G is more than just speed. 5G networks are expected to cut end-to-end latency (the time information takes to get from one place to another) to as little as 5 milliseconds. This is due to faster transmission speeds – 5G can achieve up to 10 gigabits per second – and the ability to overcome the ‘last-mile’ technology barrier in remote places lacking access to cable internet infrastructure.
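A back-of-the-envelope calculation shows why speed and latency are different things: throughput determines how long a large transfer takes, while latency is the fixed delay before any data arrives at all. The sketch below uses the 10 Gbps and 5 ms figures mentioned above; the file size is purely illustrative.

```python
# Throughput vs latency, using the article's 5G figures.
THROUGHPUT_BPS = 10 * 10**9      # 10 gigabits per second
LATENCY_S = 0.005                # 5 milliseconds end-to-end

# Large transfer: total time is dominated by throughput.
file_bits = 2 * 8 * 10**9        # a 2 GB file, expressed in bits
transfer_time = LATENCY_S + file_bits / THROUGHPUT_BPS
print(round(transfer_time, 3))   # 1.605 seconds

# Tiny message (64 bytes): total time is dominated by latency.
ping_time = LATENCY_S + 64 * 8 / THROUGHPUT_BPS
print(round(ping_time, 6))       # roughly 0.005 seconds
```

For the real-time control loops that smart city and autonomous-vehicle use cases depend on, it is the second number – the latency floor – that matters most, which is why 5G is about more than raw speed.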
The Takeaway: Compared with 4G, the fifth generation of wireless technology can carry vastly more data streams simultaneously, supporting more device connections to drive a wider range of use cases.
According to Data at the Edge, a 2019 report published by Seagate, organisations should start thinking of ways to take advantage of new edge-powered opportunities. As billions of devices continue to come online, capturing and churning out zettabytes of data, today’s centralised cloud environments will need support from a new and robust IT architecture at the edge.
The time has arrived. As more businesses start adopting AI and 5G technologies, the demand for edge devices and applications will increase, and enterprises will need to be ready to withstand the huge increases in data volume and costs down the line.
To learn more about building the new IT architecture, please download the Data at the Edge report here.