Recently, we hosted a webinar with industry experts Paul M. Maximuk from Newcomb & Boyd, Emmanuel Daniel from Microsoft, and Charles Whiteley from ExxonMobil. Below is the second of three excerpts from our expert Q&A. In theme two, our experts discuss the importance of a modern-day portfolio data strategy. Read, listen, or watch the Q&A discussion below:
Q: What makes a Digital Twin ideal for being the “new” standard of data models for an operational building?
A: Emmanuel: We try to standardize the way buildings are defined. When you look at a portfolio like ours, we have buildings all across the world. We have different types of buildings, buildings categorized by function, buildings categorized by the people who work inside them. But underlying the assets in the building, the core infrastructure that goes into the building is somewhat similar. We have different manufacturers, different types, different specs, but on a broad level, they are still the same. That typically means they are all collecting similar data points. They may represent the data differently, format it differently, or send and store it differently. And that is where the importance of having a data strategy fits in. If I have an HVAC system operating out of the US versus one operating out of India or Europe, what parameters do I need from that system so that I can standardize and manage all the systems from different suppliers and manufacturers across my entire portfolio? The same thing applies to access management or parking as well. Going down and defining the meta-model of what these assets are, how they perform, what their operating conditions are, and what they deliver in terms of the impact they have on experiences, then standardizing that across the various classes of assets and sharing that data across multiple systems through an integration layer, mandates a data strategy; without one, it is not going to work. That is why it is not about reliance on 3D models, or reliance on visualization, or reliance on BIM models; it is reliance on the data that feeds those models. Without the appropriate data in a BIM model, what value am I going to drive? And a large portion of our portfolio is older buildings as well.
They may not have BIM files or CAD files, but they are part of the portfolio, and they still have to deliver that experience, that operational layer, and that efficiency. Hence, the definition of data models and data structures for the assets inside your building, and the value they create, is of the utmost importance.
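The asset meta-model Emmanuel describes can be pictured as a small schema that is identical regardless of manufacturer or region. The sketch below is a minimal illustration, not Microsoft's actual data model; the class names, field names, and point names are all hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class AssetPoint:
    """One standardized data point exposed by a building asset."""
    name: str   # portfolio-standard point name, e.g. "supply_air_temp"
    unit: str   # normalized engineering unit, e.g. "degC"
    value: float

@dataclass
class Asset:
    """A portfolio-level asset record, independent of manufacturer or region."""
    asset_id: str
    asset_class: str   # e.g. "HVAC", "access_control", "parking"
    site: str          # building or campus identifier
    points: list[AssetPoint] = field(default_factory=list)

# Two HVAC units from different regions and vendors map onto the same model:
us_ahu = Asset("AHU-01", "HVAC", "site-us-01",
               [AssetPoint("supply_air_temp", "degC", 13.5)])
india_ahu = Asset("AHU-17", "HVAC", "site-in-03",
                  [AssetPoint("supply_air_temp", "degC", 14.2)])

# Portfolio-wide queries become trivial once every system shares the schema:
hvac_assets = [a for a in (us_ahu, india_ahu) if a.asset_class == "HVAC"]
```

The point is not the specific fields but the shape: once every system, old or new, reports into one shared schema, the same query works across the whole portfolio.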
Q: What are some things that you see as being a challenge in implementing Digital Twin in an existing building versus a new building?
A: Emmanuel: In a new building you start with a clean slate. It is much easier to define the systems that go in, the data that is going to flow out of those systems, and the value you want to drive. Older buildings have been in operation for a long time. The challenge is the systems may not have been implemented the way they should have been. The data points may not have been named or categorized the way they should have been. And the unification of data across different systems may not exist. The other challenge is that you don’t want to create something that can’t be replicated in an older building, because then, all of a sudden, you have people saying, “I don’t want to work in the older buildings anymore, put me in a new building,” and that creates another challenge. For us, it is merging and unifying data, and upgrading how that data is extracted and standardized in the old as well as the new, and that’s going to be critical for us.
A: Charles: We even went as far as to define and develop a kind of mapping engine for some of our older sites. You can go out and crawl the BACnet network, identify all the data endpoints being generated, tie those to your equipment master, if you will, and then leverage some type of transformation or translation engine to convert those to your standards. Now you’re in a situation where you’re rapidly deploying advanced analytics, regardless of whether the site is brownfield or greenfield. And the only way to build that linkage and create that continuity between the two different types of systems is to have a defined data strategy in place.
A: Paul: You have to look at this from an enterprise view across your whole organization. At a greenfield site, it’s a clean slate. That’s a great time to make those changes and implement them in your new design. When you go to brownfield sites, it’s challenging because you don’t know what you have until you evaluate your systems. You can go to five different buildings, for example, let’s use a BMS, and five different programmers have named the data in five different ways, and they’re all the same data points. But they’re not connected in a database, so you have to get in there, rework a lot of that database, and be able to extract that data. So, bringing those systems in, and then bringing in the complexity of systems coming with new technologies, you start to say we need to standardize. Create the standards: how you’re going to tag the data, how you’re going to have a naming convention for your data points, an architecture that’s going to work, and the ports and protocols that are necessary for the system, standardized across your enterprise. Then get to your communication protocols and make sure you don’t have fifty different communication protocols: bring in standard APIs, and bring in standardized BACnet, Modbus, and OPC points. But get away from all those locked-down proprietary network protocols, because that creates a huge issue when you try to update your systems and bring your system into a Digital Twin.
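Paul's point about standardizing ports and protocols across the enterprise can be made concrete with a tiny compliance check. The approved list and device names below are hypothetical, though the default ports shown (BACnet/IP 47808, Modbus TCP 502, OPC UA 4840) are the well-known defaults for those open protocols:

```python
# Hypothetical enterprise standard: approved open protocols and their
# well-known default ports.
APPROVED_PROTOCOLS = {
    "bacnet-ip": 47808,   # BACnet/IP default UDP port (0xBAC0)
    "modbus-tcp": 502,    # Modbus TCP default port
    "opc-ua": 4840,       # OPC UA default TCP port
}

def check_device(name: str, protocol: str, port: int) -> list[str]:
    """Return findings for one device measured against the enterprise standard."""
    findings = []
    if protocol not in APPROVED_PROTOCOLS:
        findings.append(f"{name}: '{protocol}' is not an approved open protocol")
    elif port != APPROVED_PROTOCOLS[protocol]:
        findings.append(f"{name}: non-standard port {port} for {protocol}")
    return findings

# A proprietary device is flagged; a standards-compliant one passes clean.
issues = (check_device("chiller-3", "vendor-x-proprietary", 9001)
          + check_device("ahu-12", "bacnet-ip", 47808))
```

An audit like this, run against a device inventory, is one simple way to surface the "fifty different protocols" problem before it blocks a Digital Twin integration.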
Q: From your perspective, what would you consider the greatest challenges when migrating to a new data strategy?
A: Paul: There are three things I see as the biggest issues. Number one is data integrity. I can guarantee that anybody on this webinar right now has data issues. The data is either not even connected anymore, systems are down, the data somehow got skewed, the values are wrong, the original configuration was wrong, or something else has happened. So, how do you offset that? You have to go back, really look at your systems, and recommission some of them to validate that data. Data silos are another issue. There are servers all over the place; some people are starting to incorporate data lakes, which makes it easier to centralize your data source. Or it’s on somebody’s PC in a room on an isolated network that nobody has access to but that one individual. But my favorite topic is cybersecurity, because everybody has a major heartache with this. It’s an ongoing process that never ends. There are always going to be new threats to your data; there are going to be people trying to skew your data, or hold your data for ransom through a hacking attack. So if I prioritize these, I would probably say cybersecurity is number one, data integrity is number two, and data silos are number three.
A: Charles: To piggyback off that, I agree wholeheartedly with everything Paul mentioned. If I were to add any additional insight, I think it’s around the context of that data. When we talk about cybersecurity, one of the biggest challenges we’ve had is from a network traffic perspective. For example, segregation of the IT and OT networks, where our applications, historians, and databases sit within our broader ecosystems at ExxonMobil. When you start thinking about the scale and the types of data you’re sending through the corporate network, in most cases you’re hopping a couple of firewalls to get to the cloud, where you’re running your advanced analytics, building your models and algorithms, and then pointing back to the edge. One of the things we really sought to address and cover within our meta-model was this concept of context. From a cyber perspective, sending a random temperature reading to the cloud shouldn’t be a big issue. From an outbound traffic perspective, that should be a clean shot directly to the cloud. But when you start to add things like temperature, sensor, and flow rate readings, or vibration, or, if we’re talking about one of our wells in East Texas, when you start adding a well name, flow rate, product, and the contractor that’s working that well, then it becomes a different conversation. You have to understand that within the data flow and traffic you’re sending to the cloud for analysis. The sum is always more important than the individual parts, so just thinking about the data and what it represents has allowed us to have much more meaningful conversations with our IT organization. So this is something to think about as you work through a data strategy.
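Charles's idea that combined fields are more sensitive than any single one can be sketched as a payload classifier run before data leaves for the cloud. This is a toy illustration of the principle, not ExxonMobil's policy; the field names, tiers, and sample values are all made up:

```python
# Hypothetical field-level sensitivity tags; a real classification would come
# from an organization's data governance policy, not this sketch.
SENSITIVE_FIELDS = {"well_name", "product", "contractor"}

def classify(payload: dict) -> str:
    """Classify an outbound payload by how many sensitive fields it combines."""
    hits = SENSITIVE_FIELDS & payload.keys()
    if not hits:
        return "low"         # e.g. an anonymous temperature reading
    if len(hits) == 1:
        return "review"      # one sensitive field may be acceptable
    return "restricted"      # combined fields reveal far more than each part

# A lone reading is a "clean shot" to the cloud; the full well report is not.
lone_reading = {"temperature_c": 71.3}
well_report = {"temperature_c": 71.3, "flow_rate": 18.2,
               "well_name": "WELL-A", "product": "crude", "contractor": "ACME"}
```

Routing payloads through a check like this at the IT/OT boundary is one way to turn "the sum is more important than the parts" into an enforceable rule rather than a judgment call.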
As an organization that serves many industries such as healthcare, commercial real estate, and corporate real estate organizations, we empathize deeply with what our customers are going through at this moment. We’re a technology organization, but for us people have always come first. If you need advice on where to start or how you can make the changes your organization needs to be prepared for the future, get in touch. Community is important now more than ever.