A version of this essay was originally published at Tech.pinions, a website dedicated to informed opinions, insight and perspective on the tech industry.
These days, it’s easy to fall into the trap of thinking that anything really important in tech only happens in the cloud. After all, that’s where all the excitement, investment and discussion seem to be. And there are indeed innumerable efforts to not only build software for the cloud, but also to use the cloud to completely reinvent companies or even industries.
As important as these cloud-based developments may be, however, they shouldn’t supersede many of the equally exciting capabilities being brought to life on the edge of today’s networks. While these endpoints, or edge devices, used to be limited to smartphones, PCs and tablets, there’s now an explosion of new options for creating, manipulating, viewing, analyzing and storing data. From VR headsets to smart digital assistants to intelligent tractors, the range of edge devices is enormous and shows no signs of slowing down anytime soon.
In addition, we’re starting to see the appearance of entirely new types of distributed computing architectures that can break up large workloads across different elements. Admittedly, some of this can get pretty messy fast, but suffice it to say that many types of modern applications, such as voice-based computing, big (and little!) data analytics, factory automation, and real-time document collaboration tools all require the efforts and coordination of several different layers of computing, including pieces that live out on the edge.
On the industrial side of this work, there’s a relatively new industry group called the OpenFog Consortium — originally organized by companies like Cisco, ARM, Dell and Microsoft — that has been working to standardize some of these elements and define how they can be used in these types of modern applications. The group gets its somewhat confusing name from the concept of applying cloud-like computing principles close to the ground (i.e., near the edge or endpoint) — similar to how clouds near the ground are perceived as fog in the real world.
In many fog computing applications, sensor data from an endpoint device is fed straight into a simple server-like computer (sometimes called a “gateway”), which acts on that data to trigger certain actions or perform certain types of tasks. After that, the data is also forwarded on up the chain to more powerful servers that typically do live in the cloud for advanced data analysis.
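The gateway pattern described above can be sketched in a few lines of code. This is purely an illustrative toy, not any real OpenFog API: the class and function names, the threshold value, and the list standing in for a cloud upload are all invented for the example.

```python
# Illustrative sketch of the fog/gateway pattern: act on sensor data
# locally for low-latency responses, then forward it upstream for
# deeper analysis in the cloud. All names here are hypothetical.

def needs_local_action(reading: float) -> bool:
    """Latency-sensitive decision made at the edge (hypothetical threshold)."""
    return reading > 75.0

class Gateway:
    def __init__(self):
        self.cloud_queue = []  # stands in for an upload link to cloud analytics
        self.local_alerts = 0

    def handle(self, reading: float) -> None:
        # 1. Act on the data immediately, without a round trip to the cloud.
        if needs_local_action(reading):
            self.local_alerts += 1
        # 2. Forward the raw reading up the chain for advanced analysis.
        self.cloud_queue.append(reading)

gw = Gateway()
for r in [42.0, 80.5, 61.2, 90.1]:
    gw.handle(r)
# Two readings exceed the threshold and trigger local action;
# all four are queued for the cloud.
```

The key design point the example captures is the split responsibility: time-critical decisions stay at the edge, while the full data stream still reaches the cloud for the heavier analysis the article describes.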
Probably the best example of an advanced-edge computing element is a connected autonomous (or even semi-autonomous) car. Thanks to a combination of enormous amounts of sensor data, critical local processing power, and an equally essential need to connect back to more advanced data analysis tools in the cloud, autonomous cars are seen as the poster child of advanced-edge computing. Throw in the wide range of different types of computing elements required for assisted or autonomous driving, and it’s easy to see why so many companies are making major acquisitions in this space. Intel’s plan to purchase Mobileye, announced yesterday, for example, is just the latest in a string of key developments in this market, and it’s not likely to be the last. Mobileye’s components will bring computer vision and other critical elements of connected car-based computing to Intel’s rapidly growing grab bag of complementary technologies.
On the semiconductor side, companies like Intel, Nvidia, Qualcomm and ARM, as well as system integrators like Harman (recently purchased by Samsung), all see connected cars as essentially “the” computing device of the next decade or so, just as smartphones have been for the last decade. That’s another reason why there’s so much excitement — and so many battles looming — in and around car tech. Add in the carriers, network providers, car OEMs, other tier-one suppliers and a raft of startups, and the stage is set for an intricate and complex competitive dance for years to come.
While it’s tempting to long for the simpler days of computing devices, where everything occurred locally, or even a pure cloud-based world, where everything happens in remote data centers, the simple truth is that today’s advanced applications require much more sophisticated hybrid designs. Building out a cloud-based infrastructure and cloud-based software tools was a critical step along this computing evolution chain, but it’s clear that the most interesting and exciting developments moving forward are going to be pushing advanced computing out onto the edge.
Bob O’Donnell is the founder and chief analyst of Technalysis Research LLC, a technology consulting and market research firm that provides strategic consulting and market research services to the technology industry and professional financial community. Reach him @bobodtech.
This article originally appeared on Recode.net.