This article is an excerpt from a piece originally published on DZone; the full article is available at the link at the bottom of the page.
Organizations can’t put IoT technology and devices into their systems fast enough to capture vital data that would otherwise go untapped. System architects are not to blame. Standards are not to blame. Nobody is to blame. Still, organizations are reaching an impasse on IoT, often over concerns about exactly what IoT can achieve. Whatever the cause, it is becoming apparent that IoT without a system to manage operational performance is a futile effort.
Companies across industries are increasingly adopting performance management technology to manage not only the performance but also the output of systems of all kinds, and much of that data is created by IoT devices. Operations Performance Management (OPM) focuses on improving the responsiveness, throughput, quality, cost, and efficiency of workflows that rely on data from production or service systems. Think of it as an overlay to IoT that unlocks business value.
Most organizations struggle with complex workflows, siloed data sources, a shortage of relevant data, an absence of predictive analytics, and a fundamental lack of a people-centric approach. That’s where OPM comes into play.
Historically, even alerting and fault-detection systems have fallen short. Why? Because alerts are isolated outputs; without a system that correlates them with other system indicators and makes the result visible on a single operations console, finding the root cause of a problem depends on the domain experience of the people responsible for diagnosing it.
In a typical business case, the impact of over-alerting is that IT personnel have no means of correlating multiple alerts or indicators. When that happens, they have little recourse but to trace each alert to a likely bad actor, confirm or eliminate that actor as the culprit, and continue this hunt-and-peck operation, hoping to isolate the cause.
But the inescapable conclusion is that following trails one by one rarely succeeds.
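To make the contrast concrete, here is a minimal sketch, assuming a hypothetical alert feed (the Alert fields, asset names, and five-minute window are invented for illustration), of how an OPM-style layer might group related alerts by shared asset and time window so operators see one correlated incident instead of many isolated trails.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Alert:
    timestamp: float   # seconds since epoch
    source: str        # device or subsystem that raised the alert
    asset: str         # upstream asset the device depends on
    message: str

def correlate(alerts, window_seconds=300):
    """Group alerts that share an upstream asset and fall inside the same
    time window, so operators see one incident instead of many alerts."""
    incidents = defaultdict(list)
    for alert in sorted(alerts, key=lambda a: a.timestamp):
        bucket = int(alert.timestamp // window_seconds)
        incidents[(alert.asset, bucket)].append(alert)
    # An incident with many distinct sources pointing at the same asset
    # is a stronger root-cause candidate than any single alert.
    return sorted(incidents.values(),
                  key=lambda group: len({a.source for a in group}),
                  reverse=True)

alerts = [
    Alert(1000.0, "pump-7/vibration", "compressor-A", "vibration high"),
    Alert(1042.5, "pump-7/temp", "compressor-A", "temperature rising"),
    Alert(1100.0, "valve-3/pressure", "compressor-A", "pressure drop"),
    Alert(9000.0, "hvac-2/filter", "air-handler-B", "filter clogged"),
]

for group in correlate(alerts):
    sources = {a.source for a in group}
    print(f"{group[0].asset}: {len(group)} related alerts "
          f"from {len(sources)} distinct sources")
```

The design point is the grouping itself: instead of chasing each alert in turn, the operator starts from the asset that the largest number of independent indicators already implicate.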
OPM is emerging as companies and their technology suppliers continue to grapple with the challenges of improving the customer experience. In many industries, the gap in serving the needs of people has been dubbed the “last mile.” Yet everything else is irrelevant if you can’t improve the satisfaction of people, whether they are operators, users, or consumers. In truth, they sit at the beginning of the first mile, and how companies serve them is the paramount concern in maximizing the customer experience.
As suppliers of OPM technology grapple with the customer experience challenge, IoT comes into the spotlight. In a recent 451 Research report, 41 percent of companies cited IoT’s lack of perceived ROI as a barrier to adoption. That seems a bit topsy-turvy: IoT is a limited set of technologies (devices, software, connectivity, and so on), and it can’t deliver the ROI people expect until applications exist to correlate the information spewed out by hundreds or even thousands of IoT devices.
Arguably, the limitations of IoT have been the chief impetus for the growth of OPM. Gartner shares the point of view that generic IoT systems will never penetrate deeply or create value because the solutions are being created without domain experience. OPM is what brings the domain experience, and vertically focused OPM solutions have become critical in organizations across industries.
OPM systems earn extra credit if they employ digital-twin technology, which essentially creates a virtual model of a process, product, or service. This pairing of the virtual and physical worlds allows organizations to analyze data and monitor systems to head off problems before they occur, prevent downtime, uncover new opportunities, and even plan for the future using simulations.
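As a rough illustration of the idea, the sketch below models a digital twin as nothing more than an in-memory mirror of one physical asset’s telemetry plus a simple what-if simulation; the asset name, fields, heating coefficient, and temperature limit are all assumptions made for the example, not a real OPM or digital-twin API.

```python
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """Minimal digital twin: a virtual mirror of one physical pump."""
    asset_id: str
    temperature_c: float = 20.0
    flow_lpm: float = 0.0
    history: list = field(default_factory=list)

    def ingest(self, reading: dict) -> None:
        """Update the virtual state from a real telemetry reading."""
        self.temperature_c = reading.get("temperature_c", self.temperature_c)
        self.flow_lpm = reading.get("flow_lpm", self.flow_lpm)
        self.history.append(reading)

    def simulate_load_increase(self, extra_flow_lpm: float) -> float:
        """What-if: estimate temperature if flow rises, using a crude
        linear heating assumption (0.05 degrees C per extra L/min)."""
        return self.temperature_c + 0.05 * extra_flow_lpm

twin = PumpTwin("pump-7")
twin.ingest({"temperature_c": 68.0, "flow_lpm": 410.0})

projected = twin.simulate_load_increase(extra_flow_lpm=150.0)
if projected > 75.0:   # hypothetical safe operating limit
    print(f"{twin.asset_id}: projected {projected:.1f} C exceeds limit; "
          "schedule maintenance before raising throughput")
```

In a production system the twin would be fed continuously by the IoT telemetry pipeline, and the simulation would come from a physics-based or data-driven model rather than a hard-coded coefficient.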
Digital-twin technology captures the real-time state of workflows and people, as well as data from a variety of disparate devices and systems. Complex event processing, a reasoning engine, and real-time process orchestration operationalize the digital twin by presenting it in the right context for each stakeholder.
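A hedged sketch of that last step, with event names, thresholds, and stakeholder roles invented for illustration: a toy complex-event-processing rule watches the twin’s event stream for a pattern (a temperature rise followed by a vibration spike) and routes the resulting finding to the stakeholder who can act on it.

```python
from collections import deque

# Route findings to the stakeholder who can act on them (illustrative roles).
ROUTES = {"maintenance_due": "plant operator", "quality_drift": "quality lead"}

def detect_maintenance_due(events, window=5):
    """Toy CEP rule: a temperature rise and a vibration spike within the
    last `window` events suggests imminent mechanical wear."""
    recent = deque(events, maxlen=window)
    kinds = [e["kind"] for e in recent]
    if "temp_rise" in kinds and "vibration_spike" in kinds:
        return {"finding": "maintenance_due", "evidence": list(recent)}
    return None

stream = [
    {"kind": "temp_rise", "asset": "pump-7"},
    {"kind": "flow_normal", "asset": "pump-7"},
    {"kind": "vibration_spike", "asset": "pump-7"},
]

finding = detect_maintenance_due(stream)
if finding:
    print(f"notify {ROUTES[finding['finding']]}: {finding['finding']} "
          f"on {finding['evidence'][0]['asset']}")
```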
Read the full article, with more on OPM and the Digital Twin, on DZone.