Author: David M. Rivers, E.I.T., Staff Engineer II
In the world of mechanical integrity (MI), data quality is critical to the total lifecycle management process. Every day, critical operations decisions are made and resources are spent based on historical data collected and stored within the inspection data management system (IDMS). The quality of the data within this IDMS directly shapes the effectiveness of MI decisions.
Many factors can undermine the reliability of the data within the IDMS: changes made by many different users, shifts in methodology and data recording practices, database migration projects, large-scale implementation work (such as risk-based inspection [RBI] studies), and routine importing and exporting of data. These data quality issues have been problematic for many owner-users, especially at a time when many organizations are looking to leverage smarter reports and artificial intelligence (AI) to make more informed MI decisions. E2G has found that these issues span the spectrum from readily apparent to highly obscure, such as the examples below (one such check is sketched after the list):
- Condition monitoring location (CML) Tmins driven by outdated component information
- Historical events applied to the wrong asset
- Assets in the unit completely missing from the IDMS
- Assets with operating data higher than design data
- Improperly assigned damage mechanisms
- Incorrect fluid assignments
- Bad thickness readings driving CMLs
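To make these checks concrete, the minimal sketch below automates one item from the list: flagging assets whose operating data exceed their design data. The record layout and field names (`design_pressure`, `operating_temp`, etc.) are illustrative assumptions, not the schema of any particular IDMS.

```python
def flag_operating_over_design(assets):
    """Return (asset id, issue) flags for assets operating above design."""
    flags = []
    for asset in assets:
        if asset["operating_pressure"] > asset["design_pressure"]:
            flags.append((asset["id"], "operating pressure exceeds design"))
        if asset["operating_temp"] > asset["design_temp"]:
            flags.append((asset["id"], "operating temperature exceeds design"))
    return flags


# Hypothetical records; a real check would read these from an IDMS export.
assets = [
    {"id": "V-101", "design_pressure": 250.0, "operating_pressure": 275.0,
     "design_temp": 400.0, "operating_temp": 350.0},
    {"id": "E-201", "design_pressure": 600.0, "operating_pressure": 580.0,
     "design_temp": 650.0, "operating_temp": 620.0},
]

for asset_id, issue in flag_operating_over_design(assets):
    print(f"{asset_id}: {issue}")  # -> V-101: operating pressure exceeds design
```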
The Solution to Poor IDMS Data Quality
To raise the standard and quality of the historical data within the IDMS, database quality assurance (QA) and quality control (QC) have proven to be reliable, repeatable means of identifying critical shortcomings within an owner-user’s IDMS. Furthermore, database QA/QC can help maintain a high level of quality within the IDMS and minimize the slow, natural regression in data reliability.
- But where does one start?
- How does one verify compliance with procedures and guidelines?
- How does one know that a vendor is not overlooking mistakes or issues with the data?
This is where E2G has leveraged its expertise with developing QA/QC processes and automated QA tools to raise the bar on IDMS data quality.
It All Begins with Implementation Guidance Documentation and Proper IDMS Configuration
IDMS data QA/QC truly begins with understanding the implementation guidance documentation the organization has used to date and how that guidance may have changed over time. One of the first steps to building and maintaining accurate data within an IDMS is having appropriate implementation guidance and a robust IDMS configuration methodology that together help ensure consistent, accurate data entry for new and existing assets. Some of the largest owner-users in industry have utilized API guidance documents such as API 510, API 570, API 580, and API 581, and collaborated with E2G’s MI subject matter experts (SMEs) to create company-specific procedures and implementation strategies. Ideally, this same team also creates a fit-for-purpose IDMS configuration guidance document that covers the corporation’s standard for implementation into the IDMS. Together, these implementation and guidance manuals represent the complete procedure for building, documenting, and managing an asset’s data both inside and outside the IDMS, and they often cover the following topics (a sketch of how such guidance can be codified follows the list):
- Roles and responsibilities
- Meeting scheduling and documentation
- Mechanical, process, and inspection data collection
- Damage mechanism review procedure
- Fluid modeling requirements
- Data assumptions
- Inspection grading guidelines and inspection effectiveness
- Specialty risk assessment guidelines
- QA/QC review
- Inspection planning and scheduling
- Evergreening procedures
- Reassessment procedures
- IDMS: inspection schedule type
- IDMS: damage mechanism configuration
- IDMS: inspection events configuration
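As a hedged illustration of how such guidance can be codified, the sketch below expresses a few hypothetical business rules as data and validates an asset record against them. The rule names, asset types, and required fields are invented for illustration; they are not any owner-user’s actual standard.

```python
# Illustrative business rules derived from (hypothetical) implementation
# guidance. A real rule set would mirror the corporation's own standard.
BUSINESS_RULES = {
    "required_fields": {
        "pressure_vessel": ["design_pressure", "design_temp",
                            "material", "corrosion_allowance"],
        "piping_circuit": ["material", "schedule", "diameter", "fluid"],
    },
    "allowed_schedule_types": {"time_based", "rbi"},
}


def check_required_fields(asset, rules=BUSINESS_RULES):
    """Flag any guidance-required field that is missing or blank."""
    required = rules["required_fields"].get(asset["type"], [])
    return [field for field in required if not asset.get(field)]


asset = {"type": "pressure_vessel", "design_pressure": 250.0,
         "design_temp": 400.0, "material": "SA-516-70"}
print(check_required_fields(asset))  # -> ['corrosion_allowance']
```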
E2G utilizes these procedures and guidance documents as a starting point when creating the QA/QC process and tools needed to verify compliance. The goal of creating these QA processes and tools is not only to flag systemic issues within the IDMS but also to remain robust enough to accommodate changes to the implementation procedures over time.
The “How” of Database QA/QC
Performing database QA/QC can seem daunting at first glance, especially for organizations with an older IDMS. However, QA/QC can take various forms and depths of analysis. The first step is to identify QA checks that can be created from business rules derived from the implementation guidance. The next step is to build a QA tool (or system of tools) that can validate compliance with these business rules in a repeatable fashion. E2G’s QA team typically starts with a standard list of QA checks, connects them to the IDMS, and then works with the owner-user to build new QA checks into the tool based on the implementation process and guidelines. Examples of QA checks that have been built to find database issues are shown below, followed by a sketch of how such checks can be codified:
- IDMS:
  - Master asset list gaps and asset modeling
  - Missing/incorrect damage mechanism (DM) assignments
  - Inspection event scheduling (time-based and RBI parameters)
  - CML Tmin and measured thickness
  - CML source data and asset linking
- Mechanical Checks:
  - Missing cladding information
  - Bad assumption values
  - Improper asset modeling
  - Mixed material of construction in circuits
- Operating:
  - Assets operating over design conditions
  - Fluid modeling
- Inspection Histories:
  - Measured thicknesses exceeding design thicknesses
  - Measured thicknesses vs Tmin
  - RBI components with incorrect insulation designation
  - Measured thickness growths
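The sketch below illustrates how a handful of these checks might be codified into a repeatable tool: each business rule becomes a small check function, a registry collects the checks, and the tool runs every check against every record. The record structure and field names are assumptions for illustration.

```python
CHECKS = []


def qa_check(func):
    """Register a function as a QA check."""
    CHECKS.append(func)
    return func


@qa_check
def measured_exceeds_design(cml):
    # Inspection history check: a reading thicker than the design
    # thickness usually indicates a data-entry or CML-linking error.
    if cml["measured_thickness"] > cml["design_thickness"]:
        return "measured thickness exceeds design thickness"


@qa_check
def measured_below_tmin(cml):
    # CML check: readings at or below Tmin demand immediate review.
    if cml["measured_thickness"] <= cml["tmin"]:
        return "measured thickness at or below Tmin"


def run_qa(records):
    """Run every registered check; return (record id, issue) flags."""
    return [(r["cml_id"], issue)
            for r in records
            for check in CHECKS
            if (issue := check(r))]


cmls = [
    {"cml_id": "CML-01", "design_thickness": 0.500,
     "measured_thickness": 0.620, "tmin": 0.250},
    {"cml_id": "CML-02", "design_thickness": 0.500,
     "measured_thickness": 0.240, "tmin": 0.250},
]
print(run_qa(cmls))
```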
Once the QA tool has been created, the first run will likely reveal a significant number of flags, and corrective action will need to be taken. Because so many items are flagged initially, the QA tool and review process often must be run several times to fully clean up the database. E2G typically runs two to three passes on the database, depending on the structure of the QA project and the number of flags found. Historically, most of the issues identified in IDMS databases involve mechanical modeling, CML thickness readings, and inspection history records, especially for RBI assets.
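As a process illustration only, the sketch below mimics this multi-pass cycle: run the QA tool, correct the flagged data, and rerun until the flags clear or a pass limit is reached. Both `run_qa` and `apply_corrections` here are simplified stand-ins; in practice the cleanup between passes is largely human review.

```python
def run_qa(cmls):
    """Stand-in QA tool: flag CMLs measured at or below Tmin."""
    return [c for c in cmls if c["measured_thickness"] <= c["tmin"]]


def apply_corrections(flags):
    """Stand-in for cleanup: replace a bad reading with a verified one."""
    for cml in flags:
        cml["measured_thickness"] = cml["verified_thickness"]


def run_qa_passes(cmls, max_passes=3):
    """Repeat QA until no flags remain or the pass limit is reached."""
    for n in range(1, max_passes + 1):
        flags = run_qa(cmls)
        print(f"pass {n}: {len(flags)} flag(s)")
        if not flags:
            break
        apply_corrections(flags)


cmls = [{"cml_id": "CML-07", "measured_thickness": 0.180,
         "verified_thickness": 0.480, "tmin": 0.250}]
run_qa_passes(cmls)  # -> pass 1: 1 flag(s); pass 2: 0 flag(s)
```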
The “What” of Database QA/QC: Vendor RBI Implementation
In a recent RBI implementation QA project, an owner-user wanted to understand the quality of the data feeding the RBI module of the IDMS since the implementation work was performed by two separate vendors at two different refineries. The QA analysis found that most of the issues identified in the study were centered on the mechanical, operating, and inspection history data. E2G found that corrosion allowances were missing, assets were modeled with incorrect component types, inventory groups were missed entirely for one site (leading to a significant underestimation of risk), and measured-thickness credit was assigned to inspection histories with poor inspection effectiveness. Left uncorrected, these types of errors can have profound effects on the calculated risk results. In this case, there was a clear compliance issue: the vendors had not followed the RBI implementation manual. Armed with this knowledge, the owner-user was able to make an informed decision regarding corrective action by the vendors and the IDMS cleanup needed at these sites.
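As an illustration of how such findings might surface programmatically, the sketch below flags two of the issue types from this project: components with no corrosion allowance and assets with no inventory group assignment. The record fields are assumptions, not the vendors’ or owner-user’s actual data model.

```python
def audit_rbi_inputs(components):
    """Flag RBI inputs with missing corrosion allowance or inventory group."""
    flags = []
    for comp in components:
        if comp.get("corrosion_allowance") is None:
            flags.append((comp["id"], "missing corrosion allowance"))
        if not comp.get("inventory_group"):
            flags.append((comp["id"], "no inventory group assigned"))
    return flags


components = [
    {"id": "D-301", "corrosion_allowance": 0.125, "inventory_group": "IG-4"},
    {"id": "T-105", "corrosion_allowance": None, "inventory_group": ""},
]
print(audit_rbi_inputs(components))
# -> [('T-105', 'missing corrosion allowance'),
#     ('T-105', 'no inventory group assigned')]
```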
The “What” of Database QA/QC: Large-Scale Implementation
IDMS QA/QC is also highly effective when performed during a large-scale implementation rollout with multiple vendors covering a multitude of facilities. In fact, QA plays one of the largest roles in these types of projects due to the complexity of site-specific needs, the obligation to uphold local regulatory requirements, and the need to ensure that the implementation vendors adhere to the implementation guidance. E2G has worked with several large owner-users to develop, deploy, and maintain QA tools and procedures that flagged critical data issues throughout the whole IDMS. In one such project, the predominant data issues flagged by the QA tool included improper inspection schedule settings for RBI and special-emphasis assets, CMLs with inappropriate overrides applied, CMLs with Tmin issues, CMLs with no assigned component, inspection histories with measured-thickness issues, assets modeled as operating above design conditions, and assets with corrosion rate issues or inappropriate cracking susceptibilities. The data in this project came from three units at two different sites, implemented by two different vendors. The sheer number of potential data quality issues is striking, even when the owner-user starts a new implementation with good supporting documentation; had the issues found not been corrected, the resulting inspection plans would have been erroneous and invalid.
The “What” of Database QA/QC: Outlier Database QA
One unique use case to which E2G applied QA was for an owner-user in the upstream industry that was looking to perform RBI on their piping once the data was migrated to a new IDMS. This project was unique in that the owner-user’s initial data resided in a very simplistic, older asset management system created in-house that stored a limited amount of information, such as asset mechanical data and CML data, and not much else. The data used to build the RBI database had to be sourced from many other external systems, but E2G was able to compile the data, apply QA/QC against the load sheet once the business rules were defined, and build the database. Although the customization of the QA tool for this type of project was more nuanced, the tool found data issues such as piping circuits with mixed materials of construction (MOC), circuits with incorrect consequence of failure (COF) calculations based on location, piping circuits with little to no remaining thickness at the RBI calculation date, and piping with mechanical issues around diameter, schedule, and Tmin. The QA/QC in this case not only helped the team correct many of the issues found in the newly created database but also informed decisions to adjust the circuitization data and drawings.
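As one example of a load-sheet check of this kind, the sketch below groups rows by piping circuit and flags circuits that report more than one material of construction. The column names are illustrative assumptions.

```python
from collections import defaultdict


def flag_mixed_moc(rows):
    """Group load-sheet rows by circuit; flag circuits with >1 material."""
    materials = defaultdict(set)
    for row in rows:
        materials[row["circuit"]].add(row["material"])
    return [circuit for circuit, mats in materials.items() if len(mats) > 1]


rows = [
    {"circuit": "PC-001", "material": "CS"},
    {"circuit": "PC-001", "material": "SS-316"},  # mixed MOC -> flag
    {"circuit": "PC-002", "material": "CS"},
    {"circuit": "PC-002", "material": "CS"},
]
print(flag_mixed_moc(rows))  # -> ['PC-001']
```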
QA/QC Takeaways
To recap, several keys make a QA/QC process successful and sustainable. A clearly defined implementation strategy and IDMS configuration guidelines make the QA/QC creation process much more straightforward. After reviewing the implementation procedures and configuration requirements, QA checks can be built to validate the business rules for the owner-user’s organization. With QA, an organization equips itself to make the right decisions in the daily operation of its facilities. Performing QA on the database is invaluable to managing the life of the aging infrastructure within a process facility, and E2G’s RBI team has the experience and development tools needed to help with any QA and data cleanup assessments an owner-user may need.