@ThingsExpo | @Numerex CTO Discusses #M2M Needs

Numerex has been called an “IoT company to watch,” and that seems a reasonable assessment. The company does its work in the cloud, of course, delivering its IoT DNA to devices, networks, and application tools for M2M customers. We had a few questions for Numerex’s Chief Innovation & Technology Officer Jeffrey O. Smith, and here’s what he told us:

IoT Journal: Numerex has been around for about 20 years, with a background in telemetry, as I understand it. How has that migrated to what’s known as M2M today?

Jeffrey O. Smith: I have referred to a Maslow’s hierarchy of telemetry needs as a tongue-in-cheek way of helping people understand the evolution the M2M market is going through. “Just get me the data, stupid” is the first layer, and the hierarchy’s top layer is optimization.

Supply chain is a good example of that evolution. Initial supply chain deployments were “dots on a map”; most of our deployments lately are aimed at optimizing things like dwell time in the supply chain, using location data with higher-level data analytics and visualization.

IoT Journal: How much data will M2M be producing within a few years, and how should IT managers plan for it?

Jeffrey: Most M2M applications do not create that much data, with the possible exception of fleet management, which generates location reports continuously. But who really needs to know the address of a delivery van or other fleet asset at 15-second resolution from six months ago?
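That retention argument suggests thinning location history as it ages. The sketch below is purely illustrative; the specific tiers (15 seconds for a week, then 5 minutes) are assumptions, not a Numerex policy.

```python
# Illustrative age-based downsampling of fleet location reports.
# The retention tiers here are assumed for the example only.
from datetime import datetime, timedelta
from typing import Optional

def keep_report(report_time: datetime, now: datetime,
                prev_kept: Optional[datetime]) -> bool:
    """Keep full 15 s resolution for recent data, thin older data to 5 min."""
    age = now - report_time
    min_gap = (timedelta(seconds=15) if age < timedelta(days=7)
               else timedelta(minutes=5))
    return prev_kept is None or report_time - prev_kept >= min_gap
```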

By contrast, I have said that control and BSS/OSS data and metadata will far outweigh the data from a device. Part of the reason for this is that devices which generate large amounts of data will have local intelligence.

For example, cavitation detection in large pumps generates a tremendous amount of real-time vibration, pressure, and other data. But that data is usually analyzed locally using high-level frequency analysis, and only one bit of information needs to be transmitted: are we in cavitation or not? You will be able to drill down to the device to get at a particular moment's raw data, but it should not be transmitted and analyzed in the cloud if the work can be done at the edge.
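To make that concrete, here is a minimal sketch of edge-side frequency analysis that reduces a vibration stream to the single cavitation bit. The sampling rate, frequency band, and threshold are assumptions for illustration, not parameters from Numerex's implementation.

```python
# Minimal edge-analytics sketch: reduce raw vibration data to one bit.
# SAMPLE_RATE_HZ, CAVITATION_BAND, and the threshold are assumed values;
# in practice they would be tuned per pump.
import numpy as np

SAMPLE_RATE_HZ = 10_000           # assumed sensor sampling rate
CAVITATION_BAND = (1_000, 3_000)  # assumed band where cavitation shows up
ENERGY_RATIO_THRESHOLD = 0.4      # assumed fraction of spectral energy

def in_cavitation(vibration: np.ndarray) -> bool:
    """Return the single bit worth transmitting: cavitating or not."""
    spectrum = np.abs(np.fft.rfft(vibration)) ** 2
    freqs = np.fft.rfftfreq(len(vibration), d=1.0 / SAMPLE_RATE_HZ)
    band = (freqs >= CAVITATION_BAND[0]) & (freqs <= CAVITATION_BAND[1])
    ratio = spectrum[band].sum() / spectrum.sum()
    return ratio > ENERGY_RATIO_THRESHOLD
```

Only the boolean result needs to cross the wireless link; the raw samples stay on the device for on-demand drill-down.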

A second example is video. In this case, it can be filtered locally (motion detection and capture) or transmitted through an out-of-band medium such as a landline.
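A minimal sketch of that local filtering, using simple frame differencing on grayscale frames. The change thresholds are assumed; a real deployment would likely use a vision library or a DSP pipeline.

```python
# Illustrative local motion filter: upload a frame only if it differs
# enough from the previous one. Thresholds are assumptions.
import numpy as np

PIXEL_DELTA = 25        # assumed: ~10% brightness change on 0-255 scale
MOTION_FRACTION = 0.02  # assumed: fraction of pixels that must change

def has_motion(prev_frame: np.ndarray, frame: np.ndarray) -> bool:
    """Decide on-device whether a frame is worth capturing/uploading."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed_fraction = (diff > PIXEL_DELTA).mean()
    return changed_fraction > MOTION_FRACTION
```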

IoT Journal: I wanted to suggest that massive dataflows are of no value without monitoring, analysis, collaborative tools, and a strategy, and you've pointed out that the non-local dataflows may not even be that massive. In any case, how does Numerex address these issues with its customers?

Jeffrey: Numerex has taken a "big data for real-time streaming" approach for several years. We have PhDs who specialize in this area.

Most of our initial success has been in internal projects: identifying areas for cost savings and detecting anomalous behavior of network elements and devices. We use some pretty leading-edge technologies to do machine learning and real-time classification on streaming data (in terabytes), and we use the cloud to process this data in real time.
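As one illustration of anomaly detection on a metric stream, here is a constant-memory online detector based on Welford's running mean/variance. This is a generic sketch, not a description of Numerex's production pipeline, and the z-score threshold is an assumption.

```python
# Streaming anomaly detection sketch using Welford's online algorithm.
# Memory stays constant regardless of how many observations arrive.
import math

class StreamingAnomalyDetector:
    def __init__(self, z_threshold: float = 4.0):  # assumed threshold
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z_threshold = z_threshold

    def update(self, x: float) -> bool:
        """Ingest one observation; return True if it looks anomalous."""
        anomalous = False
        if self.n > 1:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) / std > self.z_threshold:
                anomalous = True
        # Welford update of running mean and variance.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous
```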

We have only recently begun commercializing these capabilities. The most interesting opportunity is what we have done lately in deep packet content analysis of live streams in coordination with network and billing data.

IoT Journal: What similarities and what differences do you find in M2M challenges within the large vertical markets that you serve?

Jeffrey: The differences are usually in how the data is presented or in the action that is taken. The similar challenges are on the device side, which our new NX platform can address through licensing and off-the-shelf solutions, for example by eliminating or reducing certification work.

Other similarities that we leverage are services like LBS. Indoor location is needed across verticals, and integrating capabilities like advanced Kalman filtering over disparate real-time data to infer good indoor location is helpful in most domains.
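For readers unfamiliar with the technique, here is a minimal one-dimensional constant-velocity Kalman filter that smooths noisy position fixes. The "advanced Kalman filtering" referenced above would be considerably richer; the noise parameters below are assumed and would be tuned per radio environment.

```python
# Minimal 1-D constant-velocity Kalman filter for noisy position fixes.
# meas_var and accel_var are assumed noise parameters for illustration.
import numpy as np

def kalman_track(measurements, dt=1.0, meas_var=4.0, accel_var=0.5):
    """Smooth a sequence of noisy 1-D position measurements."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition
    H = np.array([[1.0, 0.0]])                   # observe position only
    Q = accel_var * np.array([[dt**4 / 4, dt**3 / 2],
                              [dt**3 / 2, dt**2]])  # process noise
    R = np.array([[meas_var]])                   # measurement noise
    x = np.array([[measurements[0]], [0.0]])     # initial [position, velocity]
    P = np.eye(2) * 10.0                         # initial uncertainty
    smoothed = []
    for z in measurements:
        x = F @ x                                # predict
        P = F @ P @ F.T + Q
        y = np.array([[z]]) - H @ x              # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        x = x + K @ y                            # update
        P = (np.eye(2) - K @ H) @ P
        smoothed.append(float(x[0, 0]))
    return smoothed
```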
