Looking to the future with edge computing

Edge devices (and edge computing) are the future. Although this sounds like a cliché, it is the truth. The edge computing industry is growing as quickly as technology can support it, and it looks like we will need it to. The IoT (Internet of Things) industry alone will have put 15 billion new IoT devices into operation by the year 2020, according to a recent Forbes article titled “10 Charts That Will Challenge Your Perspective of IoT’s Growth”. And IoT devices are only part of the picture: the total population of connected edge devices also includes security devices, phones, sensors, retail point-of-sale devices, and industrial and home automation equipment.

The sheer number of devices brings the security and bandwidth implications into perspective. The amount of data that will need to be transmitted and processed by all of these devices will be massive, and every business owner and automation engineer needs to consider how that data will be moved and where it will be processed.
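To give a feel for the scale, here is a rough back-of-the-envelope sketch in Python. The 15 billion device count comes from the Forbes projection above; the per-device data rate is an illustrative assumption, not a measured figure.

```python
# Back-of-the-envelope estimate of daily data volume for a large device fleet.
# The per-device rate below is an illustrative assumption, not a measured figure.

DEVICES = 15_000_000_000          # 15 billion IoT devices (the Forbes projection)
AVG_BYTES_PER_DAY = 1_000_000     # assume a modest 1 MB/day per device

total_bytes_per_day = DEVICES * AVG_BYTES_PER_DAY
petabytes_per_day = total_bytes_per_day / 1e15

print(f"Estimated fleet traffic: {petabytes_per_day:,.0f} PB/day")
# ~15 PB/day; even at a tiny 1 MB/device/day, shipping raw data to a
# central location quickly becomes a bandwidth and storage problem.
```

Even with deliberately conservative numbers, the aggregate is measured in petabytes per day, which is exactly the pressure that pushes processing toward the edge.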

As the number of edge devices in the marketplace, and their use among consumers and businesses, continues to rise, central server architectures will no longer be able to handle the data they produce. We are talking about hundreds of billions and eventually trillions of devices. According to IHS Markit researchers, there were 245 million CCTV cameras in operation worldwide. One has to imagine there are at least 25% as many access control devices, roughly 61.25 million, based on a market valued at $344 million, also per IHS Markit’s researchers.

If all the other edge devices mentioned earlier are added in, it becomes clear that routing them all through central servers for processing is going to become difficult, if it hasn’t already. Arguably it already has: witness the popularity of cloud-based solutions among businesses that run large numbers of edge devices or process large volumes of data on a constant basis.

The question is this: is cloud computing still the most effective and efficient solution as the IoT industry grows and edge devices become so numerous? My belief is that it is not. Take the example of a $399 USD device just larger than a pack of cards that runs a CPU benchmarked at the level of a mid-size desktop. This device has 8GB of RAM and 64GB of eMMC storage built in, a GPU that comfortably drives a 4K signal at 60Hz, and support for NVMe SSDs as add-on storage. That would have been unbelievable five years ago.

As the price of edge computing goes down, as it has dramatically over the last 10 years, the cost of maintaining a central server capable of processing the output of all the new devices being introduced to the world (a flood enabled by the low cost of entry for edge device manufacturers) keeps climbing. That all but guarantees a point at which it becomes cheaper for businesses and consumers alike to do the bulk of their processing at the edge rather than in central server architectures.
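A simple model makes that crossover concrete. Every dollar figure and the decline rate below are hypothetical placeholders chosen to illustrate the shape of the argument, not real market prices.

```python
# Hypothetical sketch of the crossover described above: edge hardware prices
# fall year over year while per-device central processing costs stay roughly
# flat. Every number here is an illustrative assumption.

edge_cost = 40.0            # assumed per-device edge compute cost today ($)
EDGE_PRICE_DECLINE = 0.20   # assumed 20% hardware price drop per year
CENTRAL_COST = 22.0         # assumed flat per-device central processing cost ($)

for year in range(10):
    winner = "edge" if edge_cost < CENTRAL_COST else "central"
    print(f"year {year}: edge ${edge_cost:5.2f} vs central ${CENTRAL_COST:5.2f} -> {winner}")
    edge_cost *= (1 - EDGE_PRICE_DECLINE)
# With these placeholder numbers the crossover lands around year 3; the exact
# year depends entirely on the assumed prices and decline rate.
```

The precise year of the crossover is anyone’s guess, but as long as edge hardware keeps getting cheaper faster than central processing does, the crossover itself is inevitable.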

There is a plethora of articles discussing and detailing the opposition between the two sides of the computing technology coin: cloud computing and edge computing. The gist is that “cloud computing” was the hot new buzzword three years ago and is now being overtaken by “edge computing.” The truth is that cloud computing is simply a central server architecture hosted at someone else’s location, while edge computing processes data at the edge of the network, in the devices themselves, so that fewer resources are required at a central location. There is certainly a use case for both; however, the shift toward edge computing among the general public and small to mid-sized businesses will not surprise anyone who has been paying attention.
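Here is a minimal sketch of what “processing at the edge” means in practice: rather than streaming every raw reading to a central server, the device summarises locally and uploads only the result. The function names, the summary fields, and the upload stand-in are all hypothetical.

```python
# Minimal sketch of edge-side preprocessing: reduce raw sensor readings to a
# small summary locally and ship only the aggregate upstream. The upload
# function is a hypothetical stand-in for whatever backend is in use.

from statistics import mean

def summarise(readings: list[float]) -> dict:
    """Reduce a batch of raw readings to a small summary record."""
    return {
        "count": len(readings),
        "mean": mean(readings),
        "min": min(readings),
        "max": max(readings),
    }

def upload(summary: dict) -> None:
    # Placeholder for a real network call (e.g., an HTTPS POST to a collector).
    print(f"uploading {summary}")

raw = [21.4, 21.6, 22.0, 21.9, 35.2, 21.5]   # one batch of local sensor data
upload(summarise(raw))   # a few dozen bytes upstream instead of the full stream
```

The central location only ever sees the summary, which is what lets it get by with far fewer resources.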

One article, “Next Big Thing In Cloud Computing Puts Amazon And Its Peers On The Edge” from Investor’s Business Daily, takes the stance that edge computing is going to completely displace centralised cloud computing, and even coins the phrase “cloud computing, decentralised” to describe it. As it currently stands, most residential users can achieve at most a 1Gbps WAN (internet) connection, and small to medium-sized businesses can’t get much more, and often get much less. Once more than 1Gbps of data needs to be processed, cloud computing becomes very expensive, and owning a central server or utilising edge computing become the better options.
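It is worth seeing how quickly that 1Gbps ceiling is reached. The per-camera bitrate below is an assumption (a plausible figure for a compressed 4K stream), not a number from the article.

```python
# How quickly a 1 Gbps WAN link fills up with raw video: a rough calculation.
# The per-camera bitrate is an assumption typical of compressed 4K streams.

LINK_GBPS = 1.0
CAMERA_MBPS = 25.0        # assumed ~25 Mbps per compressed 4K camera stream

max_streams = int(LINK_GBPS * 1000 / CAMERA_MBPS)
print(f"A {LINK_GBPS:.0f} Gbps link saturates at ~{max_streams} raw 4K streams")
# ~40 cameras: a modest deployment. Past that point, either the raw data
# stops leaving the building (edge processing) or the WAN bill climbs.
```

Roughly forty cameras, a single mid-sized site, is enough to saturate the best WAN link most businesses can buy, which is the bandwidth argument for the edge in a nutshell.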
Then look at total cost of ownership: once edge computing costs less than maintaining central server architectures, it becomes the single best option. So edge devices (and edge computing) are the future.