Digital oil: tools to aid operators amid tightened margins
A move towards digital oilfields could protect offshore operators from the slump in energy prices and equip them with integrated systems that optimise gains from an eventual price recovery, according to speakers at the Data-Driven Production Optimisation Conference held by Upstream Intelligence in Houston on June 16-17.
The two-day conference covered the latest developments in data-driven production and how firms are applying new techniques to improve the productivity, efficiency and reliability of their existing assets and operations.
The concept of digital oil, or intelligent energy, has the potential to turn real-time data from an offshore operation into analytic information, and to use that information to optimise operations.
One account of what digital oilfields mean in practice was given by Brock Mayer, a continuous improvement production engineer with Hess Corporation.
Mayer explained how his company had instituted a lean production system derived from the software development methods of technology firms such as Google and Facebook. These firms work in development loops no more than a month long, which minimises the consequences of mistakes in the planning and analysis process.
As an example, he used predicted decline curves for a given oil field.
In the past, the company concentrated on devising algorithms that could draw the curves and make more accurate production forecasts. But when actual production deviated from the forecast, engineers found it difficult to understand why: their data dealt with output at the broad asset level and was held in spreadsheets that were difficult to access.
“We could say that field production was off by 5% last month, but we didn’t know which wells led to that change, whether it was a best-day forecast problem, or a problem with planned or unplanned downtime,” Mayer said.
The response was to use the “sprint” principle to execute monthly checks to compare the expected decline curve with the actual output figures.
“What we’re trying to do is build a system that may use different forecasting methodologies but we want to get all the data stored in a central system at an atomic level – at the day level and at the well level so you can carry out a rigorous comparison between the results and the plan,” Mayer said.
“When anybody asks how we’re doing compared to budget and we’re down by 3,000 barrels a day, with this system we can, at the click of a mouse, say which wells are affected, we can drill down to the asset level across the corporation, and we can use TIBCO Spotfire [intelligent analytics] visualisation to explain at the high managerial level exactly which events happened or didn’t happen that affected our results.”
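The atomic-level comparison Mayer describes can be sketched in a few lines of code. The following is a minimal illustration, not Hess’s actual system: well names, dates and figures are invented. Daily production records are keyed by well and day, so a field-level shortfall can be attributed to individual wells at the click of a mouse.

```python
from collections import defaultdict

# Hypothetical records at the "atomic" level: one value per well per day.
forecast = {
    ("well-A", "2015-06-01"): 1200.0,  # barrels/day expected from the decline curve
    ("well-B", "2015-06-01"): 900.0,
    ("well-C", "2015-06-01"): 1500.0,
}
actual = {
    ("well-A", "2015-06-01"): 1150.0,
    ("well-B", "2015-06-01"): 900.0,
    ("well-C", "2015-06-01"): 1000.0,  # say, unplanned downtime on this well
}

def variance_by_well(forecast, actual):
    """Attribute the field-level shortfall to individual wells."""
    deltas = defaultdict(float)
    for key, planned in forecast.items():
        well, _day = key
        deltas[well] += actual.get(key, 0.0) - planned
    return dict(deltas)

deltas = variance_by_well(forecast, actual)
field_delta = sum(deltas.values())   # total shortfall across the field
worst = min(deltas, key=deltas.get)  # the well driving most of the miss
```

Because the data is stored per well and per day rather than aggregated at the asset level, the same records support both the high-level budget question and the drill-down into which wells caused the deviation.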
An ideally integrated system brings together all the processes taking place on an oil field so they can be automated and optimised. In reality, however, the data and control systems on most operations have been built up in an unplanned, unstandardised way.
Steve Randolph, director of production technology at Anadarko, explained how his company was working to remedy this situation by putting in place an enterprise-wide IT architecture.
The aim is to capture more data and then allow it to flow around the company as information, and this means logging files and making them available in a virtual centralized database.
Benefits accrue when data is transferred out of an information silo, the term for an insular system that does not fully integrate with other systems. Randolph gave an example of the use of real-time analytics in drilling systems.
This information, he said, is presently obtained from vendors and cobbled together in a way that makes analysis “a lot more difficult than it has to be”. The company is now bringing all its drilling data in-house and is planning to mine it for information.
The construction of the enterprise-wide system is now going ahead based on interviews with engineers who have explained what they would like to be able to access.
The data will then be delivered by applications that teams can configure differently, as each team wants to display its information in slightly different ways.
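The de-siloing Randolph describes can be illustrated with a toy sketch. The vendor names, record shapes and field names below are invented for illustration; the point is that each vendor delivers data in its own format, and a normalising layer maps everything into one common schema so enterprise-wide queries become possible.

```python
# Hypothetical vendor record shapes; real vendor formats will differ.

def normalise_vendor_a(record):
    # Vendor A reports imperial units under its own field names.
    return {
        "well": record["well_id"],
        "depth_m": record["dpth_ft"] * 0.3048,
        "rop_m_per_hr": record["rop_ft_hr"] * 0.3048,
    }

def normalise_vendor_b(record):
    # Vendor B is already metric, but uses different keys.
    return {
        "well": record["wellName"],
        "depth_m": record["depthMeters"],
        "rop_m_per_hr": record["ratePenetration"],
    }

NORMALISERS = {"vendor_a": normalise_vendor_a, "vendor_b": normalise_vendor_b}

def ingest(source, records):
    """Bring vendor data in-house in a single, queryable shape."""
    return [NORMALISERS[source](r) for r in records]

rows = ingest("vendor_a", [{"well_id": "W-7", "dpth_ft": 10000, "rop_ft_hr": 60}])
```

Once all drilling data lands in one schema, teams can layer their own differently configured views on top of the shared store rather than each cobbling vendor files together by hand.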
Reliability through data capture
The use of data has also been harnessed to minimise non-productive time on expensive assets such as drilling rigs.
Trigpoint Solutions is one company that has developed automated asset and operations management systems that gather data through radio-frequency identification and various forms of remote capture.
This is supplemented by data gathered from other sources to predict when each component on a drilling rig will need to be repaired or replaced, which allows a contractor to build up a maintenance projection based on an understanding of when individual components will need attention.
Data analysis also allows operators to select a contractor based on its maintenance performance, and can even identify which rigs are least likely to suffer unplanned downtime, judged on compliance with maintenance plans and other factors such as crew turnover.
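A maintenance projection of the kind described above can be sketched as follows. This is a simplified illustration, not Trigpoint’s product: the component names, run-hours and service intervals are invented, and a real system would derive the intervals from historical failure data rather than take them as givens.

```python
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    hours_run: float         # accumulated usage, e.g. from RFID / remote capture
    service_interval: float  # estimated hours between services, from history

def maintenance_projection(components, horizon_hours):
    """List components expected to need service within the horizon,
    soonest first, with the hours remaining until each is due."""
    due = []
    for c in components:
        remaining = c.service_interval - (c.hours_run % c.service_interval)
        if remaining <= horizon_hours:
            due.append((c.name, remaining))
    return sorted(due, key=lambda item: item[1])

rig = [
    Component("top drive", hours_run=4800, service_interval=5000),
    Component("mud pump", hours_run=1200, service_interval=2000),
    Component("drawworks", hours_run=950, service_interval=1000),
]
plan = maintenance_projection(rig, horizon_hours=250)
# drawworks (50h remaining) and top drive (200h) fall within the horizon
```

Scheduling the work before the predicted failure is what converts unplanned downtime into planned, and aggregating the same projections across rigs is what lets an operator compare contractors on maintenance performance.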