New Wine in Old Wineskins: The Challenge for Data-Centric Design Tools

‘Data-centric’ engineering design tools have been on the market for over a decade now.  These tools became the next generation of CAD (2D/3D) tools, not only producing the engineering drawing deliverables but also managing the data underlying those documents.  They helped reduce effort, improve reliability in the design process, and keep the underlying design data in sync.  These tools were initially adopted by Engineering, Procurement and Construction (EPC) firms to improve the efficiency of engineering design development.  The savings were often compelling enough that many EPCs standardized on these tools for their internal work processes even when their clients continued to insist on classical drawing file formats for the final deliverables.

Enter the Owner/Operator

More recently, the owner/operator community has recognized that they are leaving behind a wealth of information by requiring that deliverables be ‘dumbed down’ to suit their existing information management processes.  The rich datasets that the EPCs were already creating could significantly improve the operational use of data for decades.  So there has been a trend to require the use, and eventual handover, of data-centric design tools along with the rich datasets they contain.  The strategic goal is to own and leverage these datasets for future projects and thereby evolve the facility design information progressively at the data level, independent of the contractor who created or modified it.

Handover and Operations

However, the adoption of data-centric tools is not as seamless as one might think.  It is not as simple as a contract clause to the effect of, ‘oh, and by the way, please hand over the databases as well.’  While it is important to take possession of the databases, the project team and the operating company must be prepared for the impact on custody-transfer verification at handover, on ongoing maintenance, and on future transfers to new EPCs.  The disconnect begins with the handover of information from the EPC to the project team.  For example, typical cycles of project review and approval tend to be performed using ‘new old-fashioned’ soft-copy, document-based approval processes, if not hard-copy markups.  The databases themselves are not reviewed or approved during this phase, because they are technically a work-in-progress owned by the EPC.  Without data quality assurance procedures that validate the databases against the signed-off project deliverables, discrepancies between data and documents can go undetected.

After the handover, the operating company must determine how to leverage the data for operational advantage and manage its ongoing upkeep for the life of the plant.  When the time comes to seed a project with As-Built data, the project team must ensure the EPCs are able to receive and leverage a reliable data asset.  The complexity of version control in document-centric information is quickly eclipsed by that of a data-centric information system, especially with multiple ongoing projects at different stages of planning.  Any doubt about the quality of the data can taint the reliability of the entire dataset.  It can impact the safety of the operating facility and jeopardize the prospect of As-Builting the entire dataset for the next turnaround.
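
To make the idea of such data quality assurance concrete, here is a minimal sketch of one possible validation check: comparing a tag register exported from the design database against a tag register extracted from the signed-off documents.  Everything in it is an assumption for illustration; the CSV exports, the file names, the ‘tag’ column, and the compare logic are hypothetical, not features of any particular tool.

```python
import csv

def load_tags(path, column="tag"):
    """Read a tag register exported as CSV and return the set of tag names."""
    with open(path, newline="") as f:
        return {row[column].strip().upper()
                for row in csv.DictReader(f) if row.get(column)}

def compare_registers(db_csv, doc_csv):
    """Report tags present in the database but absent from the signed-off
    documents, and vice versa -- the discrepancies a handover QA check
    would flag for resolution before custody transfer."""
    db_tags = load_tags(db_csv)
    doc_tags = load_tags(doc_csv)
    return {
        "in_database_only": sorted(db_tags - doc_tags),
        "in_documents_only": sorted(doc_tags - db_tags),
        "matched": len(db_tags & doc_tags),
    }

if __name__ == "__main__":
    # Hypothetical export files; a real project would pull these from the
    # design tool's reporting interface and the document control system.
    report = compare_registers("database_tags.csv", "document_tags.csv")
    print(f"Matched tags: {report['matched']}")
    print(f"In database only: {report['in_database_only']}")
    print(f"In documents only: {report['in_documents_only']}")
```

Even a check this simple makes the point: data quality assurance is a deliberate work process with its own artifacts and sign-offs, not a by-product of document approval.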

New Processes Needed

This is a classic case of technology's impact on processes and people, further complicated by antiquated contractual paradigms among the contractor, the project team and the operating company.  The desired end result, a consistent and rich dataset, is powerful, but the pathway to a practical solution is not so obvious.  The challenge in assimilating data-centric tools typically results from continuing to use traditional work processes for design review, approval and handover without accounting for the implications of the new technology.  Modified paradigms and processes are necessary both for the information handover and for ongoing use and engineering.  The operating company must clearly define the operational work processes that can take advantage of the new data-centric information sources and their upkeep.  Very often, this work-process analysis leads to requirements that must be proactively fed back into the capital project phase.  The analysis must also account for the adoption dynamics as the workforce takes full advantage of the new technology and reaps the returns on the investment.  Obtaining data-centric tools without the mechanisms to use them properly is like installing fire alarms that are never tested: they can give a false sense of security, and that can be treacherous.

Wineskins

So what has all this to do with wineskins?  In the days of the Roman Empire, containers made of goatskin were used to store and transport fine wines.  Fresh wineskins had the elasticity to expand with the ongoing fermentation without bursting.  Old wineskins were never reused for a new batch of wine because they had no more room for expansion.

As new technology promises new benefits, the old business processes may no longer suffice to contain them or to take best advantage of them.  Changes to the business processes will certainly impact people.  The good news is that a minimally disruptive adoption of a well-tailored business process can leverage new technology successfully for many years.  The bad news is that force-fitting new technology into old processes is a financial and safety incident waiting to happen, just as new wine can burst old wineskins.  There is no single solution to this problem.  But with proper guidance, every company can determine how far it wishes to integrate into the data revolution that is already here!

© Philip Simon, Neulogix Solutions LLC
