Distributed intelligence and IoT fog


By this point, anyone who is connected to the Internet knows how many ‘things’ will be online by 2020. Numbers of connected devices, first calculated by IBM and recounted many times over with some variation on Kevin Ashton’s Internet of Things theme – Cisco’s Internet of Everything (IoE), which throws people and process into the mix, is one prominent example – do not bear repetition. A more interesting question, at least from the perspective of defining market opportunity, is what kinds of devices will talk to each other and to their owners? On this point, the upper limit appears to be the proverbial sky, as prophets of the Internet of Things (IoT) speculate on innovation that will take existing hardware, sensor, cloud and communications technologies and fashion them into new and compelling IoT or IoE use cases.

If it’s difficult to comprehend the seemingly infinite variety of connected items comprising IoT/IoE, one way of addressing this issue is to consider data density: to separate the ‘smart’ from the ‘dumb’ by looking at the amount of data generated by a particular device.

In IoT speak, it has been conventional to understand a ‘smart’ device as one that connects to the Internet, and likely to management and control applications in the cloud; however, specific device attributes – especially the amount of data generated by a device – will likely be an important determinant of interconnection capability and need. In smart home applications, for example, the low data transfer rates of many devices are a factor that, in theory at least, would encourage cloud connectivity for advanced device integration and management; yet the onramp for most homes to the Internet, WiFi, does not yet work well with the low power, low processing capabilities of many simple sensors and devices. Another consideration is the use case. Does a thermal sensor need to connect to the cloud, or simply transfer that information to a local control device? In the connected home, it is likely that many devices will interconnect in M2M fashion through other radio protocols, and many will independently filter and analyze sensor data streams through embedded intelligence at the local level.
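The thermal sensor question above can be made concrete with a small sketch. The class and parameter names here (ThermalSensor, LocalController, the setpoint values) are invented for illustration; the point is simply that a control loop can run entirely on a local device, with no cloud round trip.

```python
# Hypothetical sketch: a thermostat acting on a thermal sensor locally.
# Names and values are illustrative, not any vendor's API.

class ThermalSensor:
    """Simulates a simple low-power temperature sensor."""
    def __init__(self, readings):
        self._readings = iter(readings)

    def read(self):
        return next(self._readings)

class LocalController:
    """Acts on sensor data locally; nothing leaves the device."""
    def __init__(self, setpoint_c, hysteresis_c=0.5):
        self.setpoint = setpoint_c
        self.hysteresis = hysteresis_c
        self.heating = False

    def step(self, temp_c):
        # Bang-bang control with hysteresis: heat on below
        # setpoint - h, off above setpoint + h, else hold state.
        if temp_c < self.setpoint - self.hysteresis:
            self.heating = True
        elif temp_c > self.setpoint + self.hysteresis:
            self.heating = False
        return self.heating

sensor = ThermalSensor([19.0, 20.4, 21.8, 21.0])
controller = LocalController(setpoint_c=21.0)
states = [controller.step(sensor.read()) for _ in range(4)]
print(states)  # [True, True, False, False]
```

Nothing in this loop requires Internet connectivity, which is the argument for M2M interconnection of simple devices at the local level.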

This type of ‘smart’ device, which lives at the edge, is not limited to the connected home; it is a probable scenario in many industrial and commercial applications where distributed intelligence makes more sense than a centralized system for cloud processing of IoT or IoE data. The amount of data that needs to be managed, the type of data generated and the associated connectivity requirements, then, are likely to combine with the potential for data analysis that produces real business or user benefits to determine high-level IoT data architectures.

Rob Lloyd, president, development and sales, Cisco


This tension between centralized and distributed information processing is not a common discussion point in IoT circles, as the unifying threads in the IoT framework are typically thought to be Internet connectivity and cloud-based analytics. Additionally, the centralized/decentralized distinction presents challenges to providers of data centre infrastructure who are relying on brute force in device numbers to drive new IoT opportunity. As Rob Lloyd, president, Cisco development and sales, explained at this year’s CiscoLive event in Las Vegas, “if you look at how the industry has built IP networks around the world, it’s been a tiered system where you brought data from the access aggregation up to a core that was in an enterprise network that served as an intelligent communications centre. The question here is with networks of the future for IoT, which will be very distributed. You won’t bring all of the data into the data centre, but you’ll be bringing trillions and trillions instead of billions of programmable events from sensors and industrial devices that will begin to connect with IP.”

This approach makes some sense, and a good analogy can be drawn with the storage industry, where plummeting storage costs have in fact been offset by the data deluge, which has engendered ongoing and exponentially increasing demand. But just in case there’s more to the story, Cisco is hedging its bets with “fog,” a platform designed to push Cisco to the intelligent edge. Lloyd continued: “Those events will not all come to a data centre, they’re not all going to traverse the network because they simply can’t – there’s no network that will scale to that volume of transactions. So the network will have storage, compute and some intelligence at the edge, which will store just the transactions and send only the anomalies…But this is not a counter-networking architecture, it’s exactly what we’ve done – build the [required] componentry and the systems, the edge technology and the solid state memory, UCS and routers, and the mobile networks. I would argue the opposite – that we’re going to accelerate with our portfolio that ability to take all the new data onto the Internet protocol and allow it to be architected in such a way that it will deliver value because you can’t bring those transactions all up into one data centre.”
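Lloyd’s “store just the transactions and send only the anomalies” pattern can be sketched in a few lines. The EdgeNode class and its thresholds below are invented for illustration; the idea is only that routine readings stay in local storage while out-of-band values are queued for the uplink.

```python
# Hypothetical sketch of edge filtering: buffer routine readings
# locally, forward only anomalies. Names and bounds are illustrative.

class EdgeNode:
    def __init__(self, low, high):
        self.low, self.high = low, high
        self.local_store = []   # routine transactions stay at the edge
        self.uplink = []        # only anomalies traverse the network

    def ingest(self, reading):
        if self.low <= reading <= self.high:
            self.local_store.append(reading)
        else:
            self.uplink.append(reading)

node = EdgeNode(low=10.0, high=30.0)
for r in [21.5, 22.0, 95.3, 21.8, -4.0]:
    node.ingest(r)

print(len(node.local_store), node.uplink)  # 3 [95.3, -4.0]
```

Of the five readings, only the two out-of-range values would traverse the network, which is the scaling argument Lloyd is making.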

Key to this effort is the IOx Framework, otherwise known as ‘fog,’ created by Todd Baker, head of product management for the IOx Framework, Internet of Things at Cisco. IOx is a new open environment for distributed applications and SDK gateway services that can run on Cisco hardware or be embedded into third-party hardware. According to Baker, the massive volume of data spun out of smart devices means that “it is completely impractical to think that we’re going to stay in a model that is basically end points and the cloud.” Instead, Baker’s team is banking on fog, which leverages the IP resources that have been built into modern switches and routers “that are no longer monolithic, single core and processor platforms” to add intelligence to IOx, and combines this with compute capability through the integration of other open source operating systems, such as Linux, to incorporate heterogeneous smart devices that will run at the edge. Integration between devices and an analytics environment, he explained, will be achieved through “bring your own application” or “bring your own interface” approaches in which partner/customer code can be embedded to connect to and innovate on the IOx platform. Through this interface, data can be extracted via USB, Ethernet or serial port and converted into IP packets for storage, or for transmission and analysis elsewhere.
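The extract-and-packetize step Baker describes can be illustrated with a short sketch. This is not the IOx SDK; the device identifier, the 16-bit sample format and the JSON framing are all assumptions made for the example, standing in for whatever a serial-attached sensor actually emits.

```python
# Hypothetical sketch: raw bytes from a serial-style interface are
# framed into self-describing payloads suitable for IP transport.
# The sample format and JSON schema are invented for illustration.

import json
import struct

def frame_readings(device_id, raw_bytes):
    """Unpack big-endian 16-bit sensor samples and wrap each in a
    JSON payload that an upstream collector could route over IP."""
    count = len(raw_bytes) // 2
    samples = struct.unpack(f">{count}H", raw_bytes[:count * 2])
    return [
        json.dumps({"device": device_id, "seq": i, "value": v})
        for i, v in enumerate(samples)
    ]

# Bytes as they might arrive from a serial port: three samples.
raw = struct.pack(">3H", 512, 513, 40000)
packets = frame_readings("sensor-01", raw)
print(packets[0])  # {"device": "sensor-01", "seq": 0, "value": 512}
```

The design choice worth noting is that once the payload is self-describing, the same collector can handle heterogeneous devices, which is the point of the “bring your own interface” approach.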

And in the case of self-contained devices that are capable of making decisions or invoking action based on anomalies identified in local data analysis, Baker noted: “a single, smart sensor that is processing all of its own data out at the edge becomes infrastructure, but the reality is that the way that most of these are actually done is through aggregation sites. When you start to get beyond the singular and into multiple sensors at the edge, you will need to get some correlation between the sensors to get better situational awareness. Many smart sensor cases are very mission focused, and expanding beyond their original intent is difficult. So we’re giving them more of an open platform to leverage the data that may begin there and use it for a slightly different purpose in order to build more intelligence out at the edge.” In other words, through connections enabled via the Cisco platform, a single sensor can take on additional functionality when its data is aggregated up to a hub that transfers information onward, a design that differs from collecting all sensor data and transmitting it up to the cloud data centre. “Doing this whole process [data analysis] on a hardened platform with embedded storage and compute is interesting in a unit of one, but actually has real value in a unit of a thousand, or ten thousand. So truly making it manageable at scale, doing application lifecycle management at scale, zero touch deployment for ‘things’ that may be out at the end of an oil pipeline which may be a week’s drive away from a human is really critical stuff.”
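Baker’s point about correlating multiple sensors for better situational awareness can be sketched as a simple quorum rule at the aggregation hub: an event is only forwarded when several co-located sensors agree. The function name, thresholds and sensor labels below are invented for illustration.

```python
# Hypothetical sketch of an aggregation hub: raise an alert only when
# at least `quorum` sensors exceed `threshold`, so one noisy sensor
# isn't enough. Names and values are illustrative.

def correlated_alert(readings, threshold, quorum):
    """readings: {sensor_id: value}. Returns (alert?, exceeding ids)."""
    exceeding = [s for s, v in readings.items() if v > threshold]
    return len(exceeding) >= quorum, exceeding

# One sensor spikes alone: likely noise, no alert.
print(correlated_alert({"a": 80, "b": 22, "c": 21}, threshold=50, quorum=2))
# (False, ['a'])

# Two of three agree: corroborated event, alert fires.
print(correlated_alert({"a": 80, "b": 77, "c": 21}, threshold=50, quorum=2))
# (True, ['a', 'b'])
```

Moving this correlation to the hub, rather than to each mission-focused sensor or up to the cloud, is the middle layer Baker is arguing for.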

“Fog,” Baker explained, “is not an acronym, it’s literally cloud that is closer to the ground” – a mid-layer between endpoint and cloud for managing distributed intelligence. It can bring developers closer to the mountains of new IoT data being created at the edge, much of which may never have been examined in the past, and in this way inspire innovation: new services that Cisco believes will drive both new types and higher volumes of networking opportunity.
