
The running joke in analytics is that if you torture data long enough, it will confess to anything, and the data generated by the Oil and Gas industry is vast enough to confess to almost anything. For an industry that has decisively turned towards eking out every last ounce of production from its existing investments, making sense of this data is of paramount importance. Sustained low oil prices have only further emphasised the need to get more out of what you have.

Beyond the buzz, there are significant challenges to extracting the all-important actionable insights and foresights from data. To start with, the different teams and assets within the enterprise generate many times the data that is currently aggregated. A McKinsey report [1] suggests that of the 40,000-odd data tags found on a single rig, few are connected or used. Apart from sensor data, there is seismic data, equipment maintenance information, failure reports, contracts data, and a plethora of other information that can help improve production efficiency. The report also highlights how inconsistencies can lead to situations where data is not available, available data is not analysed, analysed data is not communicated, and communicated data is not used in decision-making, for the simple reason that decision-makers do not trust it.

Enter Master Data Management

When data is captured, cleansed, stored and shared across the enterprise methodically and systematically, it is far more trustworthy than a random aggregate of data from diverse sources. Master Data is the data from rigs, contracts, supply chain, inventory, assets and production processes, aggregated in the Enterprise Data Warehouse using a clearly defined data acquisition, management and access strategy. It offers a single version of the truth, with users across the board accessing the same data, thus eliminating the inadequacies and anomalies that arise from maintaining multiple versions of the data in different departmental silos. This data, however, needs to be maintained and managed efficiently so that users get actionable insights that translate into tangible efficiencies. It is no surprise, then, that data management is forecast to grow at 28.4% into a $21 billion industry by 2020 [2].
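
To make the "single version of truth" idea concrete, here is a minimal Python sketch of consolidating records for one asset from two departmental silos into a single golden record. It is illustrative only: the EquipmentRecord fields, the asset id and the simple survivorship rule are assumptions for the example, not a reference to any particular MDM product.

```python
# Minimal sketch (not a production MDM pipeline): consolidating equipment
# records from hypothetical departmental silos into a single "golden record".
# All field names, ids and sources are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class EquipmentRecord:
    asset_id: str                          # shared key across silos
    description: Optional[str] = None
    last_maintained: Optional[str] = None
    contract_ref: Optional[str] = None

def merge_records(primary: EquipmentRecord, secondary: EquipmentRecord) -> EquipmentRecord:
    """Survivorship rule: keep the primary silo's value, fall back to the secondary's."""
    return EquipmentRecord(
        asset_id=primary.asset_id,
        description=primary.description or secondary.description,
        last_maintained=primary.last_maintained or secondary.last_maintained,
        contract_ref=primary.contract_ref or secondary.contract_ref,
    )

# One record per silo for the same physical asset.
from_maintenance = EquipmentRecord("PUMP-017", description="Mud pump, rig 4",
                                   last_maintained="2016-08-30")
from_contracts = EquipmentRecord("PUMP-017", description="Mud pump",
                                 contract_ref="CN-2214")

golden = merge_records(from_maintenance, from_contracts)
print(golden)   # the single version of truth shared across the enterprise
```

In a real warehouse the survivorship rules would be richer (source trust scores, recency, stewardship overrides), but the principle is the same: one agreed record per asset, fed from every silo.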


What to watch out for

While the statistics point to significant benefits, the road to an analytics-driven enterprise is not without obstacles. One significant pitfall in implementing analytics is failing to ask the right questions to get the desired answers; this is a known challenge, typically addressed by the mantra of thinking big and starting small. An even bigger challenge is acquiring the right data for analytics. Some of the potential pain areas are:

  • Quality of data – The data acquired must be of consistently high quality and trusted by the organization.
  • Data security – Once acquired, the data must be secured against wilful and unintended tampering.
  • Data cleansing – Done periodically to eliminate duplication and errors, cleansing ensures the organization's master data stays up to date and trustworthy as a source of insights.
  • Role-based access – Protecting data from prying eyes is another challenge, addressed by implementing role-based access to both the data and the analytics tools (a minimal sketch of the last two points follows this list).
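
To illustrate the cleansing and role-based access points, here is a minimal Python sketch of duplicate elimination on sensor readings and a simple access check in front of the master data. The roles, dataset names and sensor tags are invented for the example; a real deployment would delegate both concerns to the data platform's own tooling.

```python
# Minimal sketch of two of the pain areas above, using illustrative names:
# (a) periodic cleansing that drops exact-duplicate sensor readings, and
# (b) a role-based access check in front of the master data store.
from typing import Dict, List, Set, Tuple

ROLE_PERMISSIONS: Dict[str, Set[str]] = {     # assumed roles and dataset scopes
    "production_engineer": {"sensor_data", "maintenance"},
    "contracts_analyst": {"contracts"},
}

def deduplicate(readings: List[Tuple[str, str, float]]) -> List[Tuple[str, str, float]]:
    """Keep the first occurrence of each (tag, timestamp) pair."""
    seen, cleaned = set(), []
    for tag, timestamp, value in readings:
        if (tag, timestamp) not in seen:
            seen.add((tag, timestamp))
            cleaned.append((tag, timestamp, value))
    return cleaned

def can_access(role: str, dataset: str) -> bool:
    """Role-based access: a role may only read datasets in its scope."""
    return dataset in ROLE_PERMISSIONS.get(role, set())

raw = [("WHP-01", "2016-09-20T10:00", 182.4),
       ("WHP-01", "2016-09-20T10:00", 182.4),          # duplicate feed
       ("WHP-01", "2016-09-20T10:05", 183.1)]
print(len(deduplicate(raw)))                           # 2
print(can_access("contracts_analyst", "sensor_data"))  # False
```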

How to get there

Aggregating Master Data starts with an open-minded approach to understanding the problems at hand first: the questions the organisation would like answered, in other words, the use cases. A simple approach to analytics would be:

  • Identify requirements: The ideal approach is to arrive at a wishlist of use cases, start with a small pilot, and subsequently enhance the scope of Master Data with additional data sets based on priority and budget.
  • Implement DMS/EDMS: Implementing an electronic document management system can open the door to scores of analytics and data-modelling opportunities by digitizing historical data, unlocking insights that were previously hiding in plain sight.
  • Choose the right solution: This is the tricky part. The solutions are many, each with distinct benefits as well as limitations. Look for a solution that scales to current as well as future requirements; any data management solution that integrates well with both structured and unstructured data is a strong candidate (a sketch of linking the two follows this list).
  • Choose the right partner: As Svetlana Sicular of Gartner put it [3], learning Hadoop is easier than learning a company's business. That your partner understands your business and its unique challenges is of paramount importance; it spells the difference between your investments turning data into gold, or the other way around.
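
As a rough illustration of how digitized, unstructured documents and structured master data can come together, the sketch below tags a hypothetical digitized failure report to the asset it mentions. The asset id, report text and simple matching rule are assumptions made purely for this example, not a description of any specific EDMS.

```python
# Minimal sketch (illustrative only) of linking structured master data with
# unstructured documents: a failure report digitized by an EDMS is tagged
# back to the asset it mentions, enriching that asset's master record.
import re
from typing import Dict, List

assets = {"PUMP-017": {"type": "mud pump", "rig": "Rig 4"}}   # structured side

failure_report = """Digitized report, 2016-09-12.
Vibration alarm on PUMP-017 during drilling; bearing replaced on site."""

def link_documents_to_assets(doc: str, asset_ids: List[str]) -> Dict[str, List[str]]:
    """Attach a document to every asset id it mentions."""
    links: Dict[str, List[str]] = {a: [] for a in asset_ids}
    for asset_id in asset_ids:
        if re.search(re.escape(asset_id), doc):
            links[asset_id].append(doc)
    return links

linked = link_documents_to_assets(failure_report, list(assets))
print(len(linked["PUMP-017"]))   # 1: the report now enriches the master record
```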

With crude prices trading very close to their five-year low and no consensus on production caps in sight, it is imperative that O&G enterprises improve their production and distribution efficiency. Against this backdrop, the value of collating and managing master data is undeniable, and investment trends in Master Data Management have only underscored the market sentiment. It is thus a safe forecast that concerted efforts in Master Data Management will yield consistent cost savings.

[1] http://www.mckinsey.com/industries/oil-and-gas/our-insights/digitizing-oil-and-gas-production#0

[2] http://www.marketsandmarkets.com/Market-Reports/oil-gas-data-management-market-85567816.html

[3] http://blogs.gartner.com/svetlana-sicular/data-scientist-mystified/

Written by Shruthi Shastry

on 20 Sep 2016