Powering The Edge With AI In An IoT World

In today’s digital world, AI and IoT are profoundly transforming many aspects of our lives. The number of IoT devices connected to the network is exploding: according to IDC, we will have over 41 billion connected devices by 2025.

With the growing number of connected devices, the volume of data flowing back to the cloud is also increasing exponentially. Pumping all of this data back to the cloud for processing is ultimately not a scalable model; it would push network bandwidth requirements to the limit. Data centers are already finding it difficult to guarantee transfer rates and response times.

As the CEO of a company that has worked with a leading security agency to leverage AI-based video analytics for real-time intrusion and tailgating detection in highly secure bank vaults, I believe it is imperative that we move more data processing to the edge. This is the next frontier waiting to be harnessed, and it has immense potential to spawn new businesses around the edge computing world.

Bringing Intelligence To The Edge

Data is the new oil, but ironically, very few companies are able to extract value from it despite having petabytes of IoT data all around them. This is because the true value lies in combining datasets from different IoT devices and understanding patterns that can predict future trends. This is where AI on the edge has immense potential to restore true value to data.

The edge needs more processing power. This will enable enterprises to run AI models at the edge, thereby bringing more intelligence to the edge.


Nowadays, many IoT edge devices come with built-in compute power in the form of a GPU, TPU or VPU. For example, some high-end security cameras now feature GPU cards, which enable them to run AI-based image recognition models on the edge itself instead of sending all the HD video back to the cloud for processing. Moving the processing to the edge ensures better response times and reduced bandwidth usage.


Let me give you an example from the field. In an oil and gas refinery with 1,000 GPU-enabled edge cameras, one would want to deploy different AI models on different camera nodes based on their location and the anomalies the models are trying to detect. A red zone within a refinery is an area where an H2S gas leak carries a high chance of fatality, so people entering a red zone have to wear protective gear. Cameras focused on a red zone might detect HSE noncompliance, such as entering without an emergency breathing apparatus (EBA), and trigger real-time alarms, thereby saving lives.
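To make the idea concrete, here is a minimal sketch of the edge-side decision logic such a camera node might run. The labels, confidence threshold and zone flag are illustrative assumptions, not a real product API; a production system would apply a trained detector and a per-camera zone mask first.

```python
# Hypothetical sketch of a red-zone compliance check on an edge camera node.
# Label names ("person", "eba_apparatus") and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # class predicted by the on-device model
    confidence: float  # model confidence in [0, 1]
    in_red_zone: bool  # whether the detection falls inside the zone mask

def red_zone_violations(detections, min_confidence=0.6):
    """Return people inside the red zone when no breathing apparatus is seen."""
    people = [d for d in detections
              if d.label == "person" and d.in_red_zone
              and d.confidence >= min_confidence]
    has_eba = any(d.label == "eba_apparatus" and d.in_red_zone
                  for d in detections)
    # Raise an alarm only when a person is present without protective gear.
    return [] if has_eba else people

frame = [Detection("person", 0.92, True), Detection("helmet", 0.80, True)]
print("trigger alarm:", bool(red_zone_violations(frame)))
```

Because this check runs on the camera itself, only the alarm event, not the HD video stream, needs to travel back to the cloud.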


AI on the edge will help us make much better sense of our data. Its uses are broad and span numerous verticals, including patient monitoring in healthcare, evaluating crop health in agriculture, identifying and rescuing injured people during natural disasters, and more.

Manage The AI Life Cycle On The Edge

Running AI models on the edge has to be well thought out. Once you have loaded the edge with your AI models, that’s when the easy part ends. You can’t load and forget; the models need to be continuously monitored for performance and optimized for various scenarios.

The heterogeneous nature of devices on the edge in an IoT world brings its own set of challenges. Remote deployment of models and monitoring the edge for performance is another big area with immense potential. One has to have a robust mechanism to deploy and fine-tune AI models remotely. It’s also critical to keep a close eye on the health of the hardware.

Continuously monitoring the performance of these models is also a tall order. Managing the continuous deployment, debugging and fine-tuning of AI models on the edge is an area in which few companies have made real advancements.
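As a simple illustration of what such monitoring can look like, the sketch below tracks a rolling window of prediction confidences on a node and flags the model for re-tuning when the average drifts below a baseline. The baseline, tolerance and window size are invented for the example; real fleets would also track hardware health and ground-truth accuracy where available.

```python
# Illustrative edge-side model health monitor (thresholds are assumptions).
from collections import deque

class ModelHealthMonitor:
    def __init__(self, baseline=0.85, tolerance=0.10, window=100):
        self.baseline = baseline      # expected mean confidence when healthy
        self.tolerance = tolerance    # allowed drift before flagging
        self.scores = deque(maxlen=window)  # rolling window of confidences

    def record(self, confidence):
        self.scores.append(confidence)

    def needs_retuning(self):
        """True when the rolling mean drifts below baseline minus tolerance."""
        if not self.scores:
            return False
        mean = sum(self.scores) / len(self.scores)
        return mean < self.baseline - self.tolerance

monitor = ModelHealthMonitor()
for confidence in [0.9, 0.88, 0.6, 0.55, 0.5]:
    monitor.record(confidence)
print("flag for re-tuning:", monitor.needs_retuning())
```

A central management plane could poll this flag from each node and schedule a remote redeployment only for the nodes that report drift.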

For businesses that are just embarking on a journey to leverage AI power on the edge, I would suggest a few things to keep in mind:

1. It’s important to select an apt use case that will provide a direct benefit to the business.

2. Select a good tool to automate the deployment and monitoring process for the edge services. Eclipse Foundation’s project ioFog is making waves in this space.


3. While selecting the edge hardware, keep in mind a road map of three to five years based on future requirements and the ability to expand hardware capabilities.

Security On The Edge

Naturally, security on the edge is another important factor that cannot be ignored. Bringing processing closer to the edge puts more pressure on having rock-solid security in and around the edge. Security at the edge has to be a multipronged strategy that ensures the safety of both the hardware and software stacks. You need to remain vigilant to detect rogue nodes entering the edge network; once detected, they must be isolated and barred from the network.

One way is to leverage a hardware root of trust to secure the operation of an edge computing system. Have runtime application verification and authorization in place to prevent rogue applications. There also needs to be trust of data from the device to the cloud: maintain full control of data flow by making sure data reaches only authorized nodes.
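The "only authorized nodes" idea can be sketched with a simple admission check: each node must be on an allowlist and must prove possession of a key provisioned at enrollment (standing in for the hardware root of trust) before data is routed to it. The key, node IDs and token scheme below are all invented for illustration.

```python
# Minimal sketch of node admission control for an edge network.
# The enrollment key and node IDs are hypothetical examples.
import hashlib
import hmac

ENROLLMENT_KEY = b"provisioned-at-enrollment"    # assumed per-fleet secret
AUTHORIZED_NODES = {"camera-017", "gateway-02"}  # allowlist of enrolled nodes

def node_token(node_id: str) -> str:
    """HMAC over the node ID, proving possession of the enrollment key."""
    return hmac.new(ENROLLMENT_KEY, node_id.encode(), hashlib.sha256).hexdigest()

def admit_node(node_id: str, presented_token: str) -> bool:
    """Admit a node only if it is enrolled AND presents a valid token."""
    if node_id not in AUTHORIZED_NODES:
        return False  # rogue node: not enrolled, isolate it
    return hmac.compare_digest(node_token(node_id), presented_token)

print(admit_node("camera-017", node_token("camera-017")))  # enrolled node
print(admit_node("rogue-99", node_token("rogue-99")))      # rogue node rejected
```

In a real deployment, the token would come from secure hardware (a TPM or secure element) rather than a shared software key, and mutual TLS would protect the device-to-cloud channel itself.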


Powering the edge with AI is the next big gold mine waiting to be tapped, with immense potential to bring real value to enterprises. AI on the edge in an IoT world will help deliver intelligent, real-time decisions for the business cost-effectively and with low latency.

Originally published at: https://www.forbes.com/
