Building your Asset Analytics Strategy for Industry 4.0

Industry 4.0 is powered by data. As companies make applications like industrial AI/ML, digital twins, and operational orchestration central to their operations, they are exploring new operating models, guided by the core principles of Industry 4.0: interconnectivity, cloud computing, AI/ML, big data management, and user adoption.

In this blog, I’ll cover how thinking about process-intensive assets and systems in terms of their data requirements sets organizations up for future success and flexibility. It is an important exercise as companies establish their asset analytics strategy.

Anchoring Asset Data Governance in the Asset Administrative Shell

When I talk with prospects and customers about asset performance management use cases that support their digital and sustainability initiatives, the discussion usually turns to data availability, and to the time and availability of subject matter experts to consolidate that data. They need broad, secure access to data with the right domain context from various sources to establish and support these initiatives. That work can be costly: on average, data scientists report spending 39% of their time wrangling data.

I use this slide to explain some of the differentiating capabilities of our Uptake Fusion product and how it makes asset analytics possible and cost-effective. This graphic depicts Industry 4.0’s Asset Administrative Shell.

The Asset Administrative Shell with submodels for Industry 4.0

The asset administrative shell describes the many types of data and visibility provided by a connected asset in process-intensive environments like oil and gas, manufacturing, energy and utilities, and mining and metals.

The submodels reference categories of collected data to support various asset lifecycle functions such as engineering, operations, maintenance, reliability, and energy. These submodels are virtual representations stored in the cloud as canonical models for various analytics, often shown as hierarchies or graphs.

Submodels represent key value levers for measuring, controlling, and managing strategic goals, which are rooted in key performance indicators (KPIs) like overall equipment effectiveness (OEE).
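To make that concrete, here is a quick sketch of how OEE rolls up from its three component ratios; the production figures below are made up purely for illustration:

```python
# Minimal illustration of how OEE rolls up three KPI components.
# All figures are invented for the example.

def oee(availability: float, performance: float, quality: float) -> float:
    """OEE = Availability x Performance x Quality, each as a 0-1 ratio."""
    return availability * performance * quality

availability = 880 / 960         # run time / planned production time (~0.917)
performance = (1.0 * 790) / 880  # (ideal cycle time x total count) / run time (~0.898)
quality = 760 / 790              # good count / total count (~0.962)

print(f"OEE: {oee(availability, performance, quality):.1%}")  # ~79.2%
```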

At the operational submodel level, the focus might be on production and quality. At the maintenance submodel level, the focus is aptly on maintenance, availability, and cost.

Each submodel organizes asset data from various sources and references a combination of design and manufacturing technical specifications, sensor data, alarms and events, and maintenance and work order data.
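As a rough illustration, a maintenance submodel might group those data categories like the sketch below. The `Submodel` class and its field names are hypothetical, not a standard schema:

```python
# A hypothetical sketch of a submodel grouping asset data by category.
# The class and field names are illustrative, not a standard API.
from dataclasses import dataclass, field

@dataclass
class Submodel:
    asset_id: str
    function: str                                            # e.g. "maintenance", "operations"
    spec_refs: list[str] = field(default_factory=list)       # design/manufacturing specs
    sensor_tags: list[str] = field(default_factory=list)     # time-series sources
    event_sources: list[str] = field(default_factory=list)   # alarms and events
    work_order_system: str | None = None                     # maintenance/work-order data

maintenance_view = Submodel(
    asset_id="PUMP-101",
    function="maintenance",
    spec_refs=["datasheet/pump-101-rev3.pdf"],
    sensor_tags=["PUMP101.VIB.DE", "PUMP101.TEMP.BRG1"],
    event_sources=["historian/alarms/PUMP101"],
    work_order_system="CMMS",
)
```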

The need for data access extends beyond maintenance and reliability decisions.

A key benefit of establishing a submodel strategy per the Asset Administrative Shell is that it gives organizations a better structure for asset data governance and stewardship. That structure, in turn, lets them explore new business models with their analytics suppliers.

These are models whereby original equipment manufacturers, specialized service providers, and even technology providers such as Uptake maintain submodels as part of a SaaS-enabled managed service offering.

Shared Data Access for Industry 4.0

As Industry 4.0 technologies like digital twins and AI/ML gain more traction, asset-intensive companies that scale their initiatives will need to decentralize data management. Leveraging the submodels is key to making disparate data consumable by various stakeholders across the organization.

Since managing these submodels involves various stakeholders inside and outside the company, data access and consumption will need to be flexible as well.

And going forward, as companies eye autonomous or self-operating assets, decentralized data management will be fundamental. Without a human go-between, the asset administrative shell functions as the store of data (knowledge) about an asset or system. These submodels (or the asset administrative shell as a whole) also enable the adoption of other emerging technologies, including AR/VR, to enhance human interaction with these assets, whether from remote locations or from the field.

Bringing Data Together with Uptake Fusion

Even before submodels can be organized to enable Industry 4.0 technologies, data governance begins with bringing operational technology (OT) and information technology (IT) data together in the cloud. The cloud provides the scale of data consumption that enterprises are after.

We designed Uptake Fusion as a flexible foundation for extracting and storing OT and IT data, managing these submodels, and, in turn, powering industrial data analytics.

If you’re not familiar with Uptake Fusion, the video below offers the highlights.


At a high level, Uptake Fusion securely extracts industrial data, including from on-premises systems, and moves it to the cloud for organization and curation with context (like maintenance work orders or metadata).
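The general pattern looks something like the sketch below: read a raw reading from an on-premises source, enrich it with business context, and hand it off to cloud storage. This is a simplified illustration of the pattern, not Uptake Fusion’s actual API; all names are hypothetical:

```python
# A generic, hypothetical sketch of extract-and-contextualize for OT data.
# This illustrates the pattern only; it is not Uptake Fusion's API.
import json
from datetime import datetime, timezone

def contextualize(reading: dict, metadata: dict, open_work_orders: list) -> dict:
    """Enrich a raw historian reading with asset metadata and maintenance context."""
    return {
        **reading,
        "asset_metadata": metadata,            # e.g. model, site, hierarchy path
        "open_work_orders": open_work_orders,  # maintenance context at read time
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

raw = {"tag": "PUMP101.VIB.DE", "ts": "2022-03-01T06:00:00Z", "value": 4.2}
record = contextualize(
    raw,
    {"site": "Plant A", "model": "HM-6x9"},
    [{"wo": "WO-4821", "type": "bearing inspection"}],
)
cloud_payload = json.dumps(record)  # hand off to the cloud ingestion endpoint
```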

This cloud integration has been par for the course in IT for some time. For OT, due to complexity and costs, not so much.

With Uptake Fusion, data consumption isn’t licensed per tag or per deployment environment; it’s based on usage. The organization’s cloud environment allows authorized users to develop the industrial intelligence they need to improve operational performance, as well as environmental, social, and corporate governance (ESG) initiatives. In that way, it’s a key component of the data backbone for digital transformation and sustainability.

Data Traceability & Asset Lifecycle Changes: Pumps as an Example

Traceability in this shared framework of data management is especially important. As asset lifecycle changes take effect, where these submodels are located in the cloud environment and who modified them become relevant.

As an example, take a horizontal multistage centrifugal pump system used for onshore crude oil transfer.

Pump maintenance and reliability are a priority for many organizations.

Over the course of the pump’s lifecycle, repairs, refurbishments, and replacements influence performance. For any authorized user to understand the productivity of, and risk to, oil transfer associated with these changes, this activity should be recorded and preserved.

With improvement and reporting needs spanning businesses, organizations depend on a contextualized, on-demand view of asset data to support decisions. If the motor attached to a pump were replaced with a newer technology, the change would impact the pump’s efficiency and performance. It could also influence energy consumption, quality, and yield.

Most organizations replace the motor within a monolithic asset model without tracking the change or making it available. As a result, many of their reports cannot be trusted once their time span extends past the replacement, because a different motor was in place for part of that window. Analytics suffer too: data biased by the previous motor skews models for the new motor, along with overall pump predictions and risk profiles.
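One simple way to picture the fix is an append-only log of lifecycle events that analytics consults before training or reporting. The event structure and helper below are hypothetical, a minimal sketch of the idea rather than a prescribed implementation:

```python
# A hypothetical sketch of lifecycle traceability: record component changes
# as events, then restrict analytics to data from the currently installed
# motor so the previous motor's behavior doesn't bias new models.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LifecycleEvent:
    asset_id: str
    component: str
    action: str          # "installed", "removed", "refurbished"
    timestamp: datetime
    modified_by: str     # who made the change, for traceability

events = [
    LifecycleEvent("PUMP-101", "motor-sn-001", "installed", datetime(2018, 5, 1), "j.doe"),
    LifecycleEvent("PUMP-101", "motor-sn-002", "installed", datetime(2022, 3, 15), "a.lee"),
]

def current_motor_since(events: list[LifecycleEvent]) -> datetime:
    """Return the install time of the most recently installed motor."""
    installs = [e for e in events if e.action == "installed"]
    return max(installs, key=lambda e: e.timestamp).timestamp

cutoff = current_motor_since(events)
# train_df = sensor_df[sensor_df["ts"] >= cutoff]  # exclude the previous motor's data
```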

Launch your Asset Analytics Strategy for Industry 4.0

As Industry 4.0 skills like data and digital expertise become fundamental to operating an industrial business, companies should also note the tradeoff in skills. People running site rounds and inspections are looking to their employers to capture and develop subject matter expertise and best practices.

Subject matter experts and AI/ML-proficient resources are limited, and they need to be allocated to the high-priority items that impact critical production assets. Building your asset analytics strategy for Industry 4.0 means splitting the problem and delegating activities to third-party data consumers that can provide insights to your team.

Uptake provides asset analytics for ancillary assets (known as "balance of plant" in power engineering) such as pumps and transformers. Maintaining an in-house focus on critical assets lets teams reallocate expert time to the highest-priority activities concerning production assets, while Uptake maximizes that time by delivering simple insights and supporting evidence that enhance decisions and the prioritization of activities.

The dynamic of decentralized data consumption will be key as companies collaborate to close the knowledge gap, which includes equipment and system maintenance. Digitized subject matter expertise unlocks and accelerates the potential of digital initiatives to improve throughput, cut unplanned downtime in half, and substantially improve labor productivity. It’s also a gap that the asset administrative shell, as part of the backbone for digital transformation and sustainability, can help fill.

Ready to put your pump data to work?

TALK TO AN EXPERT
