The Case for Edge Computing


Do you know how much data is generated every day, how much was generated in 2023, and why that matters when we talk about Edge Computing?

In recent years, we have seen a huge increase in the amount of data being generated: roughly 330 million terabytes are now produced every day. In 2023, data creation grew by 23.71%, from 97 zettabytes in 2022 to 120 zettabytes.

Some forecasts suggest that in 2024, this number will reach 147 zettabytes, and by 2025, it will be 181 zettabytes.
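
For readers who like to check the figures, here is a quick back-of-the-envelope calculation in Python (the 365-day year and decimal unit conversions are simplifying assumptions on our part):

```python
# Rough sanity check of the figures above; not an authoritative source.
zb_2022, zb_2023 = 97, 120  # zettabytes produced per year

growth = (zb_2023 - zb_2022) / zb_2022 * 100
print(f"2022 -> 2023 growth: {growth:.2f}%")  # ~23.71%

daily_tb = zb_2023 * 1e9 / 365  # 1 ZB = 1e9 TB (decimal units)
print(f"Average per day: ~{daily_tb / 1e6:.0f} million terabytes")  # ~329 million
```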

To better understand why this information is important, let’s recall Cloud Computing, the topic that began our journey of exploring and educating our readers about the roles, benefits, and drawbacks of modern technologies that largely deal with producing and exchanging vast amounts of data.

We often wonder: where will it all end up? Will data overwhelm us?

We mention Cloud Computing because this post is about Edge Computing, and readers might easily confuse the two without understanding the real difference.

These are two different approaches to processing data and running applications. Cloud Computing processes and stores data on remote servers in data centers, commonly known as “the Cloud”, while Edge Computing processes and stores data closer to its source: on smart devices, IoT devices, or local servers. Because data stays close to where it is generated, Edge Computing can reduce processing latency and improve application response times. It also means transmitting smaller amounts of data, which leads to less network congestion and fewer delays. Additionally, because Edge Computing processes data locally, it provides greater control over data security and privacy.
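
To make the difference concrete, here is a minimal toy sketch in Python (the readings, byte counts, and function names are ours, purely for illustration): the cloud approach ships every raw sample across the network, while the edge approach ships only the computed result.

```python
RAW_READINGS = list(range(10_000))  # pretend sensor samples

def cloud_approach(readings):
    bytes_sent = len(readings) * 8          # every raw sample crosses the network
    result = sum(readings) / len(readings)  # computed in the remote data center
    return result, bytes_sent

def edge_approach(readings):
    result = sum(readings) / len(readings)  # computed on the local device
    bytes_sent = 8                          # only the aggregate crosses the network
    return result, bytes_sent

for name, approach in [("cloud", cloud_approach), ("edge", edge_approach)]:
    result, sent = approach(RAW_READINGS)
    print(f"{name}: result={result}, bytes over the network={sent:,}")
```

The computation is identical in both cases; what changes is where it runs and how much data has to travel.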

The Cloud is for global access to resources, while Edge is for local access to resources and processing. Neither approach is perfect; both have their advantages and disadvantages. However, it’s important to understand their purposes so they can be used for the right reasons to improve business efficiency. Since today’s topic is Edge Computing, we will focus more on this subject to explain and define Edge Computing, as well as its current approaches and business examples.

According to Grand View Research, the global Edge Computing market is expected to grow from $53.6 billion in 2023 to $111.3 billion by 2028, with an annual growth rate of 15.7% during the forecast period.

As the data and research above show, we are increasingly turning to a more modern way of living built on new digital tools and their continuous development. You’re probably familiar with voice assistants, face filters, and autonomous-vehicle apps, and if you drive, you’re likely aware of the sensors and cameras in your car. Did you know that, thanks to Edge Computing, those same sensors and cameras process data locally in just a few milliseconds, helping to prevent accidents?

Edge Computing isn’t just about face filters; it’s much more than that. It’s used in manufacturing, agriculture, the oil and gas industry, healthcare, public institutions, and defense.

Before we continue, let’s define Edge Computing.

What is Edge Computing?

Edge Computing is a term often used in technology circles, especially in connection with Internet of Things (IoT) devices. It refers to placing compute and storage resources where data is generated: at the data source. In other words, it is decentralized data processing. Edge computing provides a foundation not only for mobile computing but also for IoT technologies. Within edge computing, data from mobile devices, local computers, or servers is processed directly, without needing to be transferred to a central data center.

For example, it’s possible to collect and process data from wind turbines or railway stations on-site by deploying a certain amount of compute and storage resources there. The results of that processing can then be sent on to a central data center for analysis, archiving, and integration with other data.
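
As a sketch of what that wind-turbine scenario might look like in practice (the sensor names, values, and one-minute window are hypothetical):

```python
import json
import statistics

def read_turbine_sensors():
    """Hypothetical on-site sampling of one turbine."""
    return {"rpm": 14.2, "vibration_mm_s": 2.1, "temp_c": 41.5}

def summarize(window):
    """Reduce a window of raw samples to the small result that leaves the site."""
    return {
        "samples": len(window),
        "avg_rpm": round(statistics.mean(s["rpm"] for s in window), 1),
        "max_vibration_mm_s": max(s["vibration_mm_s"] for s in window),
    }

window = [read_turbine_sensors() for _ in range(60)]  # one minute of samples
report = summarize(window)
print(json.dumps(report))  # only this compact report goes to the data center
```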

When defining Edge Computing, it’s useful to differentiate it from Internet of Things (IoT) and Cloud Computing.

Edge Computing refers to a topology- and location-sensitive architecture, not a specific technology. This architecture is usually associated with IoT devices rather than Cloud devices. Therefore, this distinction should be clearly emphasized when defining Edge Computing.

How did Edge Computing develop?

It started evolving from content delivery networks (CDNs) in the late 1990s. These networks were designed to deliver web and video content to users faster and more efficiently, using edge servers located close to users to reduce latency and improve performance.

By the early 2000s, these networks began to host applications and their components on their edge servers. This was the first commercial use of edge computing, hosting applications such as shopping carts, ad insertion engines, real-time data aggregators, and store locators. All of these were hosted locally on edge servers, giving end users faster access and better performance. This development shows how edge computing evolved from simple content distribution to hosting complete applications at the edge, benefiting users with local data processing and reduced latency.

How does Edge Computing work?

Edge computing emphasizes the importance of location in processing data. In a typical business setting, data is generated on end-user devices, such as employees’ computers. From there it travels across the company’s local network (LAN) and the internet to the company’s applications, where it is stored and processed, and the results are then sent back to the end users. Edge computing shortens this cycle: by placing storage and processing close to where the data is generated, most of that round trip is no longer needed.
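
A toy simulation of the two round trips (the 80 ms network delay and 2 ms processing cost are invented numbers, used only to show the shape of the difference):

```python
import time

NETWORK_RTT = 0.080  # assumed round trip to a remote data center, in seconds
LOCAL_COST = 0.002   # assumed on-device processing time, in seconds

def respond_via_datacenter(event):
    time.sleep(NETWORK_RTT + LOCAL_COST)  # data travels out, is processed, returns
    return f"handled remotely: {event}"

def respond_at_edge(event):
    time.sleep(LOCAL_COST)  # processed right where it was generated
    return f"handled locally: {event}"

for handler in (respond_via_datacenter, respond_at_edge):
    start = time.perf_counter()
    handler("sensor-event")
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{handler.__name__}: {elapsed_ms:.0f} ms")
```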

What are the advantages and disadvantages of Edge Computing?

Like anything else, Edge Computing has certain benefits, but also risks.

Edge Computing brings several advantages that contribute to more efficient data processing and increased security. Processing data at the edge reduces network load and speeds up the exchange of information. Connected devices also maintain functionality during internet disruptions or delays in reaching the Cloud, ensuring operational continuity. Furthermore, eliminating the need to transfer sensitive data to the Cloud improves security: sensitive data remains on the local device or network, reducing the risk of unauthorized access or data leaks. These features make Edge Computing an attractive choice for organizations aiming for faster and more secure data processing.
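
The “keeps working when the connection is down” property usually comes from a store-and-forward pattern, as in the minimal sketch below (the buffer size, stubbed connectivity check, and Celsius-to-Fahrenheit “processing” are our own illustrative choices):

```python
from collections import deque

outbox = deque(maxlen=10_000)  # bounded local buffer for results awaiting upload

def cloud_is_reachable() -> bool:
    """Stub connectivity check; a real device might probe a health endpoint."""
    return False  # simulate an internet disruption

def send_upstream(item) -> None:
    """Stub for the eventual upload once connectivity returns."""
    print("synced:", item)

def handle_reading(celsius: float) -> float:
    fahrenheit = celsius * 1.8 + 32  # local processing continues regardless
    outbox.append(fahrenheit)        # store-and-forward: queue for later sync
    return fahrenheit

handle_reading(21.5)  # still works with the link down
while cloud_is_reachable() and outbox:
    send_upstream(outbox.popleft())
```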

The disadvantages of Edge Computing relate to capacity limitations, especially when large amounts of data must be processed or archived long-term. The memory and computing resources required are often hard to predict in advance, which makes it challenging to ensure adequate resources are available when needed. The approach also demands increased control and protection of endpoint devices against misuse and failure. This additional layer of control and security can be complex to implement and maintain, especially in complex network environments, posing a further challenge in managing and ensuring system stability.

Examples of Edge Computing in Industries

Industry 1: Mining

If you’re wondering what the most dangerous job in the world is and you answered mining, you’re right. But what does Edge Computing have to do with mining? Underground, a lack of monitoring can be deadly, and setting up a Wi-Fi network in a mine is difficult and expensive, so the Cloud isn’t the best option here. Safety nevertheless remains the industry’s top priority, and that is hard to achieve without some automation. Automation in this context means devices that collect and process data at the edge of the network, and such devices can be controlled and adjusted remotely.

Industry 2: Healthcare

In modern healthcare, which relies heavily on data, around 10 billion IoT medical devices are in use (pacemakers, insulin pumps, heart rate monitors, and so on). These devices generate and process ever more data, and software-driven analysis of that data is now used to provide better care. Alongside better care, the speed with which data is processed and analyzed is crucial for reacting quickly in emergencies.

Security and latency are key aspects of Edge Computing in the healthcare industry. Because patient data is processed at the point of generation instead of being sent to remote cloud servers, there is less chance of security breaches or unauthorized access to that data. Local processing also reduces latency, which in turn lowers the risk of errors caused by stale or delayed data.
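
As a sketch of point-of-generation processing on a wearable (the thresholds are illustrative only, not clinical logic): the raw heart-rate stream never leaves the device, and at most a small alert event does.

```python
def check_heart_rate(bpm_window: list[int]) -> str | None:
    """Runs on the device itself; raw samples are never transmitted."""
    avg = sum(bpm_window) / len(bpm_window)
    if avg > 120 or avg < 40:  # illustrative limits, not medical guidance
        return f"ALERT: average {avg:.0f} bpm outside expected range"
    return None  # nothing to send; no network traffic at all

print(check_heart_rate([72, 75, 70, 74, 71]))  # None -> data stays local
print(check_heart_rate([130, 128, 135, 140]))  # small alert event goes upstream
```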

Maintaining compliance with HIPAA standards, which are essential for protecting patient privacy, is easier because data is processed locally rather than passing through the Cloud, where the exposure is greater. Additionally, the Virtual Private Networks (VPNs) used for secure connections to the Cloud can be slow and problematic, while Edge Computing relies on local data processing, reducing the need for such network protocols and thereby increasing system security and efficiency.

Edge Computing can also lower costs for healthcare organizations. Implementing big data applications at the edge gives organizations detailed business insights that help identify areas for improvement and reduce losses. This in turn lowers the cost per patient, ensuring more efficient use of resources and the financial sustainability of healthcare facilities without compromising the standard of care provided to patients.

Conclusion

Edge computing is still in its early stages, and it is expected to complement Cloud computing rather than replace it. It has already found applications across industries, and as the technology develops, its architecture is becoming more practical, delivering faster results at lower cost.

Its development is expected to involve working on more complex applications in areas such as artificial intelligence, machine learning, and robotics.

What we know today is that data collection and analysis already take place on users’ smart devices, at the “edge” of the network.
