EDGEOPS: The Future of Edge Computing

Himanshu Pant | Panigrahi Prasad
17 August 2022

Edge computing is a model in which data processing happens near the site where the data is generated and collected, rather than on a central server or in the cloud. Sensors are synchronized with edge servers to securely process real-time information and to connect devices, such as PCs and cell phones, to the network.

Modern organizations need edge computing because it offers new and better strategies for improving operational efficiency, boosting performance, strengthening security, automating core business activities, and maintaining constant availability. The approach is widely used to drive the digital transformation of business practices. The additional computing power it offers allows organizations to build a foundation for more autonomous systems, improving their performance and efficiency while freeing individuals to focus on higher-value tasks.

EdgeOps combines the benefits of edge processing with AI/ML inferencing, execution, and control at the edge, offering three levels of significant value that build upon each other:

  1. real-time insight, visualization, and analysis with EdgeOps
  2. fast, flexible deployment of sophisticated models and applications
  3. adaptive control that empowers machines to develop self-healing and self-optimizing capabilities.

Figure 1: EdgeOps Architecture (Source: Capgemini Engineering)

Challenges

Deploying modern workloads such as machine learning and AI applications close to the edge brings several challenges, including:

Edge Machine Learning

Gartner expects the number of devices to grow by 15 billion by 2029. As the volume and use of collected data increase, deploying machine learning could unleash the full potential of edge analytics. However, edge locations have limited resources for machine learning tasks: developing a model sometimes requires sending data to the cloud for additional validation, and the reduced computational power at the edge makes this hybrid environment harder to manage. Deploying models at locations with low network visibility and limited bandwidth can also be problematic in terms of latency, connectivity, and related constraints.

Machine learning algorithms are built on extensive linear algebra operations and on vector and matrix data processing. A specialized processing flow is required to meet the low-latency requirements of use cases such as self-driving vehicles and Unmanned Aerial Vehicles (UAVs), but traditional architectures are not optimized for such edge intelligence, which requires customized hardware support to deploy machine learning workloads. In addition, models usually need to store and access a large set of parameters that describe how they operate: neural network architectures may touch a massive number of memory locations for each classification. Deploying machine learning algorithms on resource-constrained devices is therefore challenging; memory accesses must be minimized and data kept local (e.g., through data reuse) to avoid costly reads and writes to external memory.
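
To make the memory constraint concrete, the sketch below shows one widely used mitigation: post-training dynamic quantization, which stores weights as 8-bit integers instead of 32-bit floats before a model ships to an edge device. PyTorch is used purely for illustration; the toy model and its layer sizes are hypothetical.

# A minimal sketch: shrinking a model's memory footprint with post-training
# dynamic quantization in PyTorch before deploying it to an edge device.
import torch
import torch.nn as nn

# Toy classifier standing in for a real edge model (hypothetical sizes).
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Quantize the Linear layers' weights from 32-bit floats to 8-bit integers,
# cutting their parameter memory roughly 4x and reducing costly external
# memory traffic on a resource-constrained device.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Inference runs the same way; only the storage/compute precision changed.
sample = torch.randn(1, 128)
print(quantized(sample).shape)  # torch.Size([1, 10])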

Power Efficiency

AI techniques such as neural networks normally come at the price of high compute and memory requirements. In typical application scenarios today, these neural networks run on powerful GPUs that dissipate a huge amount of power. For the practical deployment of neural networks on mobile devices, there is a significant need to improve not only the efficiency of the underlying operations performed by the neural networks but also their structure. They must also be amenable to resource-efficient implementations that maximize performance while reducing power consumption and minimizing the physical footprint required.
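
As one illustration of restructuring a network for efficiency, the sketch below compares a standard convolution with a depthwise-separable one, a technique popularized by mobile-oriented architectures such as MobileNet. The layer shapes are hypothetical, and PyTorch is used only as a convenient notation.

# A minimal sketch of a structural efficiency technique: replacing a standard
# convolution with a depthwise-separable one to cut parameters and the
# power-hungry multiply-accumulate operations they imply.
import torch.nn as nn

def count_params(module: nn.Module) -> int:
    return sum(p.numel() for p in module.parameters())

# Standard 3x3 convolution: 32 -> 64 channels.
standard = nn.Conv2d(32, 64, kernel_size=3, padding=1)

# Depthwise-separable equivalent: per-channel 3x3 filter, then a 1x1 pointwise mix.
separable = nn.Sequential(
    nn.Conv2d(32, 32, kernel_size=3, padding=1, groups=32),  # depthwise
    nn.Conv2d(32, 64, kernel_size=1),                        # pointwise
)

print(count_params(standard))   # 18,496
print(count_params(separable))  # 320 + 2,112 = 2,432 (~7.6x fewer)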

Security and Accessibility

The rapid growth of IoT devices is modernizing the world, as seen in the automobile industry, construction, healthcare, and many other sectors. Gartner predicts that by 2025, 75% of enterprise-generated data will be created and processed outside a traditional centralized data center or cloud. This proliferation could expose edge devices to various security risks: some of these devices are deployed outside the centralized infrastructure, making their software and hardware harder to monitor and manage. Security risks include:

  • Data storage and protection
  • Authentication and passwords
  • Data sprawl
  • Lateral attacks
  • Personal data and account theft
  • DDoS attacks and entitlement theft

Mitigation Approach

Modern organizations deploying technology to edge locations need security solutions that keep their data secure. The following approaches can improve EdgeOps capabilities and security:

Edge Machine Learning Security

EdgeOps combined with AI technology provides capabilities such as efficiency, adaptive control, machine autonomy, and many more. Machine learning yields highly accurate results when its models are trained on large data sets, so applying Continuous Integration and Continuous Deployment (CI/CD) to model training can be a way to produce better algorithms. Monitoring and analyzing real-time data, and applying machine learning to it, makes it possible to detect threats and attacks and to act on the resulting predictions. A distributed architecture for storing, processing, and analyzing the generated data in real time reduces latency and retains data at the edge cost-efficiently.
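
A minimal sketch of such a CI/CD gate appears below, assuming a scikit-learn workflow: the pipeline retrains the model and fails the build if accuracy falls below a threshold, blocking deployment of a regressed model. The dataset, model choice, and threshold value are all hypothetical placeholders.

# A minimal sketch of a CI/CD quality gate for model training: the pipeline
# retrains the model and fails the build on an accuracy regression.
import sys

from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

ACCURACY_THRESHOLD = 0.95  # hypothetical gate value set by the team

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = RandomForestClassifier(random_state=42).fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
print(f"validation accuracy: {accuracy:.3f}")

# A non-zero exit status fails the CI job, so a worse model never ships.
sys.exit(0 if accuracy >= ACCURACY_THRESHOLD else 1)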

Using container orchestration technology, we can limit resource usage and power consumption by scaling clusters up or down. We can also harden container security by controlling user access, granting root access only sparingly, reducing the installed OS components, using namespaces, forbidding containers from running as root by default, checking container health, and monitoring metrics, logs, and container runtimes.
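
Below is a minimal sketch of how some of these controls can be expressed with the official Kubernetes Python client: a pod spec carrying CPU/memory limits and a hardened security context. The workload name, image, namespace, and limit values are hypothetical.

# A minimal sketch using the Kubernetes Python client to apply resource
# limits and a non-root security context to an edge workload.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a cluster

container = client.V1Container(
    name="edge-inference",  # hypothetical workload name
    image="registry.example.com/edge-inference:1.0",  # hypothetical image
    resources=client.V1ResourceRequirements(
        requests={"cpu": "250m", "memory": "128Mi"},
        limits={"cpu": "500m", "memory": "256Mi"},  # cap resource/power draw
    ),
    security_context=client.V1SecurityContext(
        run_as_non_root=True,               # forbid running as root
        allow_privilege_escalation=False,
        read_only_root_filesystem=True,
    ),
)

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="edge-inference"),
    spec=client.V1PodSpec(containers=[container]),
)
client.CoreV1Api().create_namespaced_pod(namespace="edge", body=pod)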

Implement Zero Trust Edge Access

A key solution to the edge computing security problem is to apply a “zero trust” or “least access” policy to all edge devices. In this scenario, cyber security professionals grant each device only the minimal access it needs to do its job. IoT devices commonly serve a specific purpose and communicate with only a few servers or devices, so a narrow set of security policies is straightforward to apply. Using an access control policy to manage your device network means you can give users access only to the resources they need; if one device is compromised, a hacker is much less likely to reach additional resources.
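
The sketch below illustrates the least-access idea in its simplest form: a deny-by-default allowlist mapping each device to the only endpoints it may reach. The device IDs and endpoints are hypothetical.

# A minimal sketch of a least-access policy check: each device may reach only
# the endpoints its job requires; everything else is denied by default.
ALLOWED_ENDPOINTS = {
    "temp-sensor-01": {"https://ingest.example.com/telemetry"},
    "camera-edge-07": {"https://ingest.example.com/frames",
                       "https://ota.example.com/firmware"},
}

def is_allowed(device_id: str, endpoint: str) -> bool:
    # Deny by default: unknown devices and unlisted endpoints get no access.
    return endpoint in ALLOWED_ENDPOINTS.get(device_id, set())

print(is_allowed("temp-sensor-01", "https://ingest.example.com/telemetry"))  # True
print(is_allowed("temp-sensor-01", "https://ota.example.com/firmware"))      # False
print(is_allowed("rogue-device", "https://ingest.example.com/telemetry"))    # False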

Ensure Physical Security of Connected Devices

Edge deployments typically reside outside the central data infrastructure, making physical security a crucial component. Organizations must implement controls against attackers physically tampering with devices, injecting malware into them, swapping or substituting devices, and creating rogue edge data centers. Security personnel should know how to tamper-proof edge devices and should employ safeguards such as a hardware root of trust, crypto-based identity, encryption for data in flight and at rest, and automated patching.
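
As one illustration of crypto-based identity, the sketch below uses an Ed25519 challenge-response with the Python cryptography library; in a real deployment the private key would live in a hardware root of trust such as a secure element or TPM, not in application memory as it does here.

# A minimal sketch of crypto-based device identity: the server issues a random
# challenge, the device signs it, and the server verifies the signature
# against the public key recorded at enrollment.
import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Enrollment: in practice the private key is generated and kept inside a
# secure element / TPM; only the public key leaves the device.
device_key = Ed25519PrivateKey.generate()
enrolled_public_key = device_key.public_key()

# Attestation: server challenges, device signs, server verifies.
challenge = os.urandom(32)
signature = device_key.sign(challenge)

try:
    enrolled_public_key.verify(signature, challenge)
    print("device identity verified")
except InvalidSignature:
    print("device rejected: tampered or unknown identity")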

Conclusion

EdgeOps is the key to enabling AI at scale on embedded devices. It has the potential to provide enhanced security and reduced costs while maintaining performance compared to cloud-based processing, and it can enable new capabilities for companies and individuals alike. As the discussion above shows, significant changes are needed to improve not only the capabilities of the computing infrastructure but also the underlying architecture. Co-optimizing machine learning algorithms, DevSecOps principles, and hardware architecture with security in mind can produce highly intelligent, resource-efficient systems that realize the vision of EdgeOps.

Authors

Himanshu Pant

Product Services and Support team, Capgemini Engineering GBL
Himanshu is part of the Product Services and Support team at Capgemini Engineering GBL. He focuses on developing DevOps and Cloud solutions and delivering them to customers.

Panigrahi Prasad

DevOps Engineer, Capgemini Engineering 
Panigrahi Prasad is part of the Product Services and Support team at Capgemini Engineering GBL. He focuses on developing DevOps and Cloud solutions and delivering them to customers.