Asset management: From data to decision
Organizations are constantly gathering information and data across every element of their supply chains, from oil fields and pipelines to refineries, power stations and manufacturing plants. It has been estimated that 90% of the data in the world today has been created in the last two years alone, yet there is concern that data overload is becoming a barrier to the effective use of this information.
Big Data, Big Deal
The global oil and gas industry is continually evolving. The forecast decline in investment, production and the price of oil, coupled with ageing assets operating beyond their original design life, has fuelled demand for improved performance, operational uptime and safety through the exploitation of big data.
Data exists in many forms. It may be structured or unstructured, generated by systems and machines or by people. Systems and assets that communicate data directly are giving rise to the ‘internet of things’. Big data is fundamentally about data on a huge scale, a scale beyond normal levels of analysis.
Big data is a big deal when it comes to asset management, and so is the speed with which it is expanding.
According to EMC’s ‘Digital Universe of Opportunities’ study, with research and analysis by IDC, the digital universe is doubling in size every two years and by 2020 will reach 44 zettabytes, or 44 trillion gigabytes.
To put data volumes into context: a jet engine can generate 10TB of data in 30 minutes. With around 25,000 flights per day, this single data source alone runs into petabytes. Smart meters and heavy industrial equipment in refineries and oil rigs generate similar volumes. This brings both opportunity and threat: the chance to uncover previously hidden truths about asset performance, and the risk that those truths are lost in the noise created by sheer volume.
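As a rough, back-of-the-envelope illustration of that scale (assuming, purely for the sake of the arithmetic, that each of the 25,000 daily flights contributes a single 30-minute, 10TB engine recording), the daily total from this one source alone already reaches hundreds of petabytes:

```python
# Back-of-envelope estimate of daily engine-data volume.
# All figures are illustrative assumptions, not measured values.
TB_PER_30_MIN = 10        # data generated by one jet engine in 30 minutes
FLIGHTS_PER_DAY = 25_000  # approximate number of global daily flights

# Assume, for simplicity, one 30-minute engine recording per flight.
daily_tb = TB_PER_30_MIN * FLIGHTS_PER_DAY
daily_pb = daily_tb / 1_000  # 1 PB = 1,000 TB (decimal convention)

print(f"{daily_tb:,} TB/day, i.e. roughly {daily_pb:,.0f} PB/day")
```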
When data is used effectively it can provide insight and visibility of emerging trends, moving from measuring ‘what is’ to predicting ‘what if’. Used wisely, it can help organizations to achieve operating cost savings, higher production rates and enhanced safety.
Big data is commonly defined by four V’s, which capture the challenges every organization faces when trying to extract insight and value from information for decision making: velocity, the rate at which data is gathered; variety, spanning structured and unstructured data from images and audio to sensor readings and logs; volume, from kilobytes to yottabytes, generated at scales unheard of even five years ago; and veracity, the trustworthiness of the data itself.
Data has been identified as a significant asset in helping organizations achieve their objectives. Real-time analogue and digital data, collected across both onshore and offshore assets through sensors, monitoring equipment, machine data and logs, supports daily operational as well as long-term strategic decisions.
At the recent Institute of Asset Management Annual Lecture, Professor Richard Clegg, Managing Director of the Lloyd’s Register Foundation, delivered a presentation on big data to over 200 senior executives and directors, focusing on data as “the new asset class”.
A survey at the lecture highlighted the broad spectrum of asset management decisions to which organizations apply data, including cost control, risk management, preventative maintenance, root cause analysis and investment decisions.
91% of professionals surveyed agreed that the range and types of data their business uses will expand in the next three years. The remaining 9% said the data they use will expand only ‘slightly’ over that period, citing “pushing back due to data overload” and a “lack of clarity over its use” as reasons for not exploiting the ever-increasing bank of facts and figures.
Data overload
There seems to be little apprehension over how data is collected, with many organizations declaring they have a wealth of data from many sources in a variety of forms; however, concerns were raised over the quality, authentication and relevance of both historical and new data.
Although data has long supported decision making, the abundance of data streams and the rapidly changing digital landscape are proving overwhelming for many organizations.
One executive at the lecture stated, “We have lots of historical and new data, but not much information. We need to improve how we extract the insight and actually use it to develop better solutions for our customers and drive our business forward.” This opinion was echoed by a number of organizations and underlines the need for appropriately skilled personnel and enhanced analytics capabilities.
As digital technology continues to advance swiftly, there is a fear that organizations will “drown in data” and not fully realize the benefits of effective data management. This challenge is widely recognized; however, most organizations indicated they are poised to tackle the volume of data.
“We will get there; it’s a shift in process, resource and reliance. The collection of data needs to be run like any other project with the initial objectives defined, so we know what, why and how we’re collecting data as well as how it will be used to influence our operational and commercial asset management decisions,” stated a director from a leading power and water transmission company.
A concern was also raised that the oil and gas industry has a mentality of moving too quickly, leaving too little time to analyze data given its sheer volume and questions over its veracity. By contrast, if operators were willing to slow production or shut down for a short period in response to data on the condition of equipment, costly delays could be mitigated and disasters such as the Macondo blowout could be avoided.
This was supported by findings from the Health and Safety Executive (HSE) Energy Division’s (ED’s) Key Programme 4 (KP4) report, which investigated the ageing and life extension (ALE) challenges facing hydrocarbon exploration and production installations on the UK Continental Shelf (UKCS). The report highlighted evidence of missing data and insufficient data trending, recommending improvements to help anticipate potential future failures and degradation, predict non-conformance and support obsolescence management.
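As a minimal sketch of the kind of data trending the KP4 report calls for, the snippet below fits a linear corrosion trend to periodic wall-thickness readings for a single, hypothetical pressure vessel and extrapolates when the trend would cross a minimum allowable thickness; the readings, intervals and limit are invented for illustration, and real integrity programs would use far more sophisticated degradation models.

```python
import numpy as np

# Hypothetical wall-thickness inspection history for one pressure vessel.
years     = np.array([0.0, 3.0, 6.0, 9.0, 12.0])      # years since installation
thickness = np.array([12.0, 11.6, 11.1, 10.7, 10.2])  # measured thickness, mm

T_MIN = 9.0  # assumed minimum allowable wall thickness, mm

# Fit a straight-line trend: thickness ≈ slope * years + intercept.
slope, intercept = np.polyfit(years, thickness, 1)

if slope < 0:
    # Extrapolate the point at which the trend crosses the allowable limit.
    years_at_limit = (T_MIN - intercept) / slope
    print(f"Apparent thinning rate: {-slope:.3f} mm/year")
    print(f"Trend reaches {T_MIN} mm after roughly {years_at_limit:.1f} years in service")
else:
    print("No thinning trend detected in this inspection history")
```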
Digital Oil Field
Many organizations are already managing their data successfully, and the concept of the ‘Digital Oil Field’, built on the manipulation of data across the energy industry, is not a new theme. BP introduced its ‘Field of the Future’ technologies to collect, manage and analyze data to deliver operating benefits, whilst Chevron has attributed hundreds of millions in cost savings and improved output to its ‘i-field’ digital oil field vision.
Technologies and software are required to transform raw data, which must be examined using algorithms and analytical tools, supported by human capability and expertise, to identify valuable insights and add context to the data.
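As one hedged illustration of what such algorithms can look like in practice, the sketch below applies a simple rolling-baseline check to a simulated vibration signal and flags readings that jump well outside recent experience; the data, threshold and fault are invented, and production analytics would combine far richer models with the human expertise described above.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Simulated vibration readings (arbitrary units): steady operation followed by
# a step change at sample 450 standing in for a developing equipment fault.
signal = np.concatenate([
    rng.normal(1.0, 0.05, 450),
    rng.normal(1.5, 0.05, 50),
])

WINDOW = 50  # number of recent samples used as the rolling baseline

alerts = []
for i in range(WINDOW, len(signal)):
    baseline = signal[i - WINDOW:i]
    mean, std = baseline.mean(), baseline.std()
    # Flag readings more than three standard deviations above the recent baseline.
    if signal[i] > mean + 3 * std:
        alerts.append(i)

first = alerts[0] if alerts else "none"
print(f"Flagged {len(alerts)} readings; first alert at sample {first}")
```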
Lloyd’s Register Energy offers customers a software solution for data management with its Reliability Based Mechanical Integrity (RBMI) program, a fully integrated Risk Based Inspection (RBI) software package enabling customers to manage their RBI processes and data. RBMI can be applied to all equipment types and assets, including pressure vessels, piping, storage tanks, heat exchangers (including bundles), relief devices, subsea structures, pipelines and user-configurable assets.
It is anticipated that as the challenges of big data are addressed and overcome, the complexity and volume of data gathered and used will result in ‘design for data’ becoming a common theme, similar to current concepts such as ‘design for maintenance’ and ‘design for decommissioning’.
Data to Decision
There is no denying that organizations which prioritize big data, and combine technical and performance data across all lifecycle stages, will reap the benefits of sustainability, delivery and profitability. They will discover unexpected truths, manage risks and move increasingly towards predictive asset management.
Amongst other benefits, asset and integrity data will help organizations determine and mitigate potential future risks, forecast equipment failure, predict periods of non-conformance and allow for proactive equipment maintenance. It will provide lessons to feed into the design and construction of new assets, and give insights into completely new ways of working. It will improve inspection programs, prevent downtime and unplanned outages, and optimize production, safely.


