
Formerly a new technology trend to watch, cloud computing has become mainstream, with major players AWS (Amazon Web Services), Microsoft Azure and Google Cloud Platform dominating the market. The adoption of cloud computing is still growing as more and more businesses migrate to cloud solutions. But it’s no longer the emerging technology trend. Edge is.

As the quantity of data organizations deal with continues to increase, they have realized the shortcomings of cloud computing in some situations. Edge computing is designed to solve some of those problems by bypassing the latency that cloud computing introduces when data must travel to a distant datacenter for processing. Instead, processing can live “on the edge,” if you will, closer to where the computing needs to happen.

For this reason, edge computing can be used to process time-sensitive data in remote locations with limited or no connectivity to a centralized location. In those situations, edge computing can act like mini datacenters. 

Edge computing will increase as use of Internet of Things (IoT) devices increases. By 2022, the global edge computing market is expected to reach almost $7 billion. This new technology trend is only expected to grow, creating various jobs, primarily for software engineers.

One great driver for edge computing is the speed of light. If Computer A needs to ask Computer B, half a globe away, before it can do anything, the user of Computer A perceives this delay as latency. The brief moments after you click a link, before your web browser starts to actually show anything, are due in large part to the speed of light. A good example is multiplayer video games, which implement numerous elaborate techniques to mitigate the true and perceived delay between you shooting at someone and knowing, for certain, that you missed.
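A rough back-of-the-envelope calculation shows why distance matters. The sketch below is illustrative only: it assumes half the Earth’s circumference (~20,000 km) as the “half a globe away” distance, and that light in optical fiber travels at roughly two-thirds of its vacuum speed. Real networks add routing, queueing, and processing delays on top of this physical floor.

```python
# Theoretical minimum round-trip latency imposed by the speed of light alone.
# Assumed constants: ~20,000 km for half the globe, light at ~2/3 c in fiber.

SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in a vacuum, km/s
FIBER_FACTOR = 2 / 3            # light travels slower inside optical fiber
HALF_GLOBE_KM = 20_000          # half of Earth's ~40,000 km circumference

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds over fiber."""
    one_way_seconds = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_seconds * 1000

print(f"Server half a globe away: {round_trip_ms(HALF_GLOBE_KM):.0f} ms round trip")
print(f"Edge node 100 km away:    {round_trip_ms(100):.2f} ms round trip")
```

Even in this best case, a round trip to the far side of the planet costs around 200 ms before a single byte of useful work is done, while an edge node a hundred kilometers away answers in about a millisecond. That gap is the physical argument for moving computation closer to the user.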

It might be weird to think of it this way, but the security and privacy features of an iPhone are a widely accepted example of edge computing. Simply by doing encryption and storing biometric information on the device, Apple offloads a ton of security concerns from the centralized cloud to its scattered users’ devices.

But the other reason this feels like edge computing, not personal computing, is that while the compute work is distributed, the definition of the compute work is managed centrally. You didn’t have to bash together the hardware, software, and security best practices to keep your iPhone secure. You just pay your monthly contract and train it to recognize your face.
