Data management best practices in pulp and paper: five tips for success

Process Optimization

Effective process data management is the foundation for successful digitalization.

When data can be put to effective use, more informed decisions can be made, which can in turn boost productivity and, ultimately, profitability. Despite the promise of such favorable outcomes, it's estimated that 85% of big data projects fail.

The reasons are manifold, but in the pulp and paper industry, they fall mainly into two categories: suboptimal use of process data, and choosing the wrong process data management platform (and partner) to support your digitalization journey.

Here are five key tips that can help pulp and paper manufacturers avoid these common pitfalls.

1. Use your data

IBM Research suggests that up to 88% of Industrial Internet of Things data is unused. So, while pulp and paper customers are increasingly recognizing the value of a holistic view of their data assets throughout the business, this is by no means yet the industry norm. Just as with sales and marketing, it is important to align your data and business strategies. The aim should be to bring structure and standardization to all levels of data captured: from enterprise-level systems (ISA-95 level 4), through manufacturing operations management (level 3), down to automation systems (level 2).

2. Structure and standardization

The Economist indicates that only 3% of manufacturing data is tagged and analyzed, but structuring and standardizing data – which mills may have amassed over many years – is vital to transforming it into something of significant business value. For example, standardizing data tags that are tied to specific production phases or equipment types enables them to be grouped into units that align with a particular strategic business or process function – e.g. all data needed to analyze the equipment health of electric drives.
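As a minimal sketch of why standardized tagging pays off, the snippet below parses tag names that follow an invented naming convention (`<area>.<equipment-type+number>.<signal>`) and groups them by equipment type – so, for example, all electric-drive tags fall into one unit ready for a health analysis. The convention and tag names are illustrative, not from any specific mill system.

```python
import re
from collections import defaultdict

# Hypothetical tag-naming convention: <area>.<equipment-type+number>.<signal>
# e.g. "PM1.DRV03.MotorTemp" — an electric-drive temperature on paper machine 1.
TAG_PATTERN = re.compile(r"^(?P<area>\w+)\.(?P<type>[A-Z]+)(?P<unit>\d+)\.(?P<signal>\w+)$")

def group_by_equipment_type(tags):
    """Group standardized tag names by their equipment-type code."""
    groups = defaultdict(list)
    for tag in tags:
        match = TAG_PATTERN.match(tag)
        if match:
            groups[match.group("type")].append(tag)
    return dict(groups)

tags = [
    "PM1.DRV03.MotorTemp",
    "PM1.DRV03.Vibration",
    "PM2.DRV11.MotorTemp",
    "PM1.PMP07.FlowRate",
]
by_type = group_by_equipment_type(tags)
# All drive-related tags are now one unit for an equipment-health analysis:
drive_tags = by_type["DRV"]
```

With unstandardized, free-form tag names, the same grouping would require per-mill lookup tables maintained by hand – which is exactly the cost that up-front standardization avoids.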

3. Value data modeling integrity

The integrity and consistency of data models is vital: these models will be used throughout your organization to harmonize data and interfaces, and as the basis for providing usable insights for smarter decision making. An example from one of our customers illustrates this. The organization wanted to transfer a sensor application between two of its mills, but sensor measurements were only being tracked at one site; the other site provided no set values for its process data algorithms to optimize against. With robust data modeling, this does not present a challenge: process-data-powered applications can be implemented easily across multiple sites.
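One common pattern behind such cross-site portability is to write applications against logical signal names and resolve those names to each site's physical tags through a per-site mapping. The sketch below assumes invented site and tag names purely for illustration.

```python
# A per-site mapping resolves logical signal names to the physical tags
# actually configured at each mill. Sites, signals and tags are invented.
SITE_TAG_MAPS = {
    "mill_a": {"headbox_pressure": "MA.HBX01.Press", "stock_flow": "MA.STK02.Flow"},
    "mill_b": {"headbox_pressure": "MB.PT_4711", "stock_flow": "MB.FT_0815"},
}

def resolve(site, logical_name):
    """Translate a logical signal name into the site's physical tag."""
    try:
        return SITE_TAG_MAPS[site][logical_name]
    except KeyError:
        raise KeyError(f"{site!r} has no tag mapped for {logical_name!r}")

# The same optimization application can now run at either mill unchanged:
assert resolve("mill_a", "headbox_pressure") == "MA.HBX01.Press"
assert resolve("mill_b", "headbox_pressure") == "MB.PT_4711"
```

The design choice here is that the application never hard-codes a site-specific tag; a missing mapping fails loudly at lookup time rather than silently feeding the algorithm no data.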

4. Choose the right implementation partner

The choice of implementation partner is crucial. There are plenty of digital options on the market, but a solution's staying power depends on how well it reflects the complexities of the industry. Any partner under consideration should therefore have both domain-specific expertise and knowledge of IT and OT infrastructure to ensure their solution delivers real value. It's also worth finding out whether prospective partners appreciate the data collection challenges posed by real-world mill conditions, such as the need for robust and reliable sensors, and how to integrate different legacy systems – including those not designed to support modern, open or standard connectivity protocols.

5. Choose the right process data management platform

Choosing the right process data management platform is likely to be linked with your choice of partner, so before finalizing your decision, consider the following platform checklist:

Does it allow the integration of third-party applications – and support the additional functionality that these bring to your digital environment?

Can it be configured hierarchically – i.e. at control system, site and enterprise level – saving time and implementation costs as the platform expands?

Does it support a range of open interfaces and application programming interfaces (APIs) for streaming and pulling data, such as REST API, .NET SDK, OData, ODBC, OPC UA and OPC DA? This is important for making the data available for new use cases as they emerge.
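To make the "pulling data over open interfaces" point concrete, here is a hedged sketch of composing a REST query for one tag's resampled history. The base URL, path and parameter names are hypothetical – real platforms each define their own endpoint shapes – but the pattern of addressing data by tag and time range is typical.

```python
from urllib.parse import urlencode

# Hypothetical REST endpoint for pulling historical process data; the base
# URL, path and parameter names are illustrative, not a real product API.
BASE_URL = "https://historian.example.com/api/v1"

def build_history_query(tag, start, end, interval_s=60):
    """Compose a URL requesting resampled history for one tag."""
    params = urlencode({
        "tag": tag,
        "start": start,          # ISO 8601 timestamps
        "end": end,
        "interval": interval_s,  # resampling interval, seconds
    })
    return f"{BASE_URL}/history?{params}"

url = build_history_query(
    "PM1.DRV03.MotorTemp", "2024-01-01T00:00:00Z", "2024-01-02T00:00:00Z"
)
```

An application built on such an open interface can be pointed at new tags or time ranges as new use cases emerge, without any change to the platform itself.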

Does it support data streaming in real time – both between levels, and to and from the Cloud? An application may be able to calculate the optimal set values for a process, but if it cannot communicate those values to the process, it becomes but an interesting experiment with no useful outcome.

The strength of your process data management system defines the success of your digitalization journey. That is why it is so important to get this foundation right.


Source: ABB