
Data and the new oil industry

Author: George Walker, Novotek UK and Ireland

17 November 2020

As far back as 2006, mathematician Clive Humby boldly stated that “data is the new oil.” Not everyone agrees with this, but there’s no denying that data is invaluable, or that it plays an increasingly important role in modern industrial applications. Here, George Walker, Managing Director of Novotek UK and Ireland, explains why this ‘new oil’ is integral to ushering in a new era of downstream oil and gas.


Despite its importance to the global economy, and sometimes because of it, the oil and gas industry is placed under a great deal of pressure to operate efficiently and effectively. So, it is no surprise that oil and gas has been among the most active adopters of modern technologies, from remotely operated vehicles (ROVs) for inspecting subsea pipelines to thermal imaging drones for easier inspection of tank internals.

There is a clear trend here. Most of the assets that engineers are responsible for maintaining and overseeing in the oil industry are static assets, like pipelines, flare stacks and tanks. Unfortunately, these assets are difficult to access and have traditionally not been fitted with sensors to relay performance information to a central control system. So, unmanned and robotic systems are playing a key role.

Further downstream, industrial automation and control software is more prevalent, with tasks such as managing a refinery’s crude slate now largely handled by these systems. However, the downstream sector shares a problem with upstream: maintenance remains one of the biggest challenges for engineers, due to the sheer number of complex assets to be maintained. This is exacerbated by the need to minimise downtime.

According to an ARC survey of senior executives and engineering, operations and maintenance managers in oil and gas, 3-5% of production is lost due to unplanned downtime. Maintenance can be seen as a preventative measure to combat this unplanned downtime and maximise uptime… in theory. The challenge is that system complexity often means maintenance has to be carried out using planned downtime — something that still costs the refinery production time and money, but is begrudgingly accounted for rather than being a surprise.

For these reasons, more downstream oil businesses are looking at ways of reducing the frequency of planned downtime and maintenance to save costs and maximise productivity. This has driven an interest in the idea of predictive, preventative maintenance, as well as the process-data collection that underpins it.

Prediction and prevention

Traditionally, maintenance in most oil refineries has been scheduled on a rota-style basis. A manager keeps a record of when a piece of equipment was last serviced, and routine maintenance is arranged in accordance with the directions of the equipment’s manufacturer. This fails to account for abnormalities that can accelerate a decline in performance, and it leaves the manager and the engineer at the mercy of circumstance, because any response to an unexpected fault is purely reactive.

This is changing as the industry digitalises. With more sophisticated sensors connected to equipment and assets, engineers can remotely view the performance data of equipment in use. The sensors transmit information about key parameters, such as the operating temperature of motors or the pressure in a centrifugal compressor, to a central control system, such as a SCADA system or manufacturing execution system (MES).
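
To make the data flow concrete, the sketch below is a minimal, hypothetical Python example of a field device relaying a reading to a central gateway over MQTT, a messaging protocol commonly used in IIoT deployments. The broker address, topic name and read_compressor_pressure() helper are illustrative assumptions, not a reference to any particular vendor's system.

    import json
    import time
    import random

    import paho.mqtt.publish as publish  # pip install paho-mqtt

    BROKER = "scada-gateway.example.local"  # hypothetical central gateway
    TOPIC = "site/refinery1/compressor3/pressure"

    def read_compressor_pressure() -> float:
        """Stand-in for a real sensor read; returns pressure in bar."""
        return 12.0 + random.uniform(-0.5, 0.5)

    while True:
        reading = {"timestamp": time.time(),
                   "pressure_bar": read_compressor_pressure()}
        # Publish one reading; the SCADA/MES side subscribes to this
        # topic and writes the value into its historian.
        publish.single(TOPIC, payload=json.dumps(reading), hostname=BROKER)
        time.sleep(5)  # polling interval in seconds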

MESs have been used in downstream oil and gas for much of the past decade, so most engineers will be familiar with their benefits and functions. With modern MESs, paired with sensors that can measure a wide range of metrics and communicate over the latest networks and protocols, managers can receive all of their operational and process data in near real-time.

Real-time access to data means that engineers can visualise data and create a snapshot of any given moment of an operation. With this, it becomes far easier to identify when equipment is underperforming.

However, the vast volumes of data generated by these operations can easily be too much for an engineer to manage effectively, which is why a growing number of MESs and industrial internet of things (IIoT) platforms incorporate machine learning algorithms.

Machine learning, artificial intelligence and algorithms in general have become equal parts buzzword and breakthrough in recent years. Owing in no small part to the high number of start-ups that claim to use AI without actually using it, there is a lot of confusion surrounding exactly what these terms mean and the benefits they provide.

In simple terms, artificial intelligence is any software or algorithm that functions in a way that simulates human thinking. Consider how human beings learn: much of our knowledge is acquired through our experiences and observations, as well as those of the people around us. Machine learning (ML) brings a similar concept to software, where algorithms are ‘trained’ on data so that they can establish connections between data sets.
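
As a deliberately simplified illustration of what ‘training’ means in practice, the Python sketch below fits a model to invented historical readings so it learns the connection between a motor's load and its operating temperature; every figure in it is an assumption made for the example.

    import numpy as np
    from sklearn.linear_model import LinearRegression  # pip install scikit-learn

    # Invented historical readings: motor load (%) against temperature (deg C).
    load_pct = np.array([[20], [40], [60], [80], [100]])
    temp_c = np.array([45.0, 52.0, 60.5, 68.0, 77.0])

    # 'Training' means fitting the model so that it captures the
    # relationship between the two data sets.
    model = LinearRegression().fit(load_pct, temp_c)

    # The trained model can now estimate the temperature to expect at a
    # load it has never seen, which is the basis for spotting readings
    # that deviate from expectation.
    print(model.predict(np.array([[70]])))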

Analysing industrial equipment performance data is a matter of understanding the wider context of the presented data and spotting patterns. It is something that lends itself to automation quite nicely, and the benefit of handing it to computers is that the system can parse thousands of data sets far more quickly than a human can.

This technology plays a key role in GE Digital’s Predix IIoT platform and MES. The platform uses ML algorithms that are trained on thousands of sets of industrial process data, so it can be integrated easily and run quickly, while using data collected by existing historian software to teach the algorithm what ‘normal’ looks like for a specific site’s operations. The implications and potential of this are far-reaching; the only real limitations are the number and sophistication of the sensors a business has monitoring its assets.

For example, it might be that a compressor is operating at too high an RPM or is vibrating abnormally while running. Even though the compressor could be an integral part of an oil refining operation, the warning signs might be overlooked in the deluge of operational data until the compressor trips on overspeed or high vibration and the system starts flaring more than usual.

In a network of equipment monitored by a platform containing an ML algorithm, the software would detect the erroneous performance data and alert the most relevant maintenance engineer to attend to the compressor before a trip occurred. Potentially, this preventative maintenance could help the business avoid unexpected downtime, a rather understated accomplishment when you consider that unplanned downtime costs offshore oil and gas companies an average of $49 million annually.
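
The article does not detail how Predix implements this, but the general technique can be sketched: train an anomaly detector on historian data that represents normal running, then flag live readings that fall outside that envelope. The following Python sketch uses an isolation forest for the purpose; the figures and the notify_engineer() hook are illustrative assumptions.

    import numpy as np
    from sklearn.ensemble import IsolationForest  # pip install scikit-learn

    # Invented historian data: (rpm, vibration mm/s) rows from normal running.
    rng = np.random.default_rng(0)
    normal_history = np.column_stack([
        rng.normal(3000, 30, 5000),  # speed around the design RPM
        rng.normal(2.0, 0.2, 5000),  # vibration around a healthy baseline
    ])

    # Learn what 'normal' looks like for this specific machine.
    detector = IsolationForest(contamination=0.01, random_state=0)
    detector.fit(normal_history)

    def notify_engineer(reading):
        # Hypothetical hook into a maintenance alerting system.
        print(f"ALERT: abnormal compressor reading {reading}")

    # A live reading showing elevated speed and vibration.
    live = np.array([[3220, 3.4]])
    if detector.predict(live)[0] == -1:  # -1 marks an anomaly
        notify_engineer(live[0])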

This is a particularly time-sensitive example. For other applications, such as managing and mitigating corrosion in pipework using sensors that measure the total acid number (TAN) of fluids, the algorithms can recognise early symptoms of problems and automatically adjust maintenance schedules accordingly. This is where businesses can introduce effective predictive maintenance regimens to hydrocarbon engineering, minimising downtime across operations.
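
The scheduling adjustment itself can be simple once a trend is detected. The short Python sketch below, using invented TAN figures and an illustrative threshold, pulls the next inspection forward when acidity is rising faster than expected.

    from datetime import date, timedelta

    # Invented TAN readings (mg KOH/g) taken one week apart.
    tan_readings = [0.8, 0.9, 1.1, 1.4]
    ALERT_RATE = 0.15  # illustrative threshold, mg KOH/g per week

    # Estimate the weekly rate of increase from the latest two samples.
    weekly_rate = tan_readings[-1] - tan_readings[-2]

    next_inspection = date(2020, 12, 15)  # originally scheduled date
    if weekly_rate > ALERT_RATE:
        # Corrosion risk is climbing faster than expected, so bring the
        # inspection forward rather than waiting for the rota.
        next_inspection -= timedelta(weeks=4)

    print(next_inspection)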

Although this application of industrial data can prove invaluable to oil engineers, it is just the beginning of the ways in which process data improves performance. Not only can the proper collection and analysis of data change the strategic and planning side of maintenance, it can also greatly enhance the process of conducting maintenance itself.

A new perspective

The trouble with increasingly complex industrial equipment is that maintenance becomes more difficult for engineers to carry out. In many cases, a period of planned downtime, albeit brief, is necessary to maintain systems. So, even when site managers introduce AI-supported predictive maintenance schedules with the best of intentions, the maintenance itself can still eat into productivity.

However, this doesn’t need to be the case. One of the biggest advantages of collecting the vast volumes of process data produced during operations is that it allows for increasingly advanced visualisation. This supports two interesting possibilities: digital twinning and augmented reality (AR).

Digital twinning software allows engineers to create a virtual representation of their site and operations using the collected data and real-time inputs. This twin can simulate the impact of actions on processes and operations. If a maintenance manager wants to test a new schedule for maintaining equipment, they can run a simulation to see what the repercussions of that decision might be.

Similarly, a procurement team member curious about the implications of modernising certain key assets can see a data-driven estimation of how operations will be affected.
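
A real digital twin is far richer than any short example, but the core idea of testing a decision against a model instead of the live plant can be shown in miniature. The Python sketch below, built entirely on invented wear and downtime figures, compares the expected availability of two candidate maintenance intervals.

    import random

    def simulate_availability(service_interval_days, days=365, seed=1):
        """Toy twin: a unit accumulates wear; servicing resets it.
        Returns the fraction of days the unit was available."""
        rng = random.Random(seed)
        wear, downtime = 0.0, 0
        for day in range(days):
            if day > 0 and day % service_interval_days == 0:
                wear = 0.0
                downtime += 1            # one day of planned downtime
            elif rng.random() < 0.001 * wear:
                downtime += 3            # an unplanned failure costs three days
                wear = 0.0
            else:
                wear += 1.0              # invented daily wear rate
        return 1 - downtime / days

    # Compare two candidate schedules on the model before touching the plant.
    for interval_days in (30, 90):
        print(interval_days, round(simulate_availability(interval_days), 3))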

Again, this serves as a useful planning tool that leverages collected data, and it shows how the detail contained in the data can allow for new levels of insight. The same principle of visualisation also helps explain how AR works in industrial settings.

Using a mixed reality headset such as Microsoft HoloLens, or even a modern smartphone, engineers can access industrial data through certain platforms, such as PTC’s ThingWorx. This real-time data updates a virtual representation of each machine and component in a plant, and engineers can see the live data, as well as the location and easiest access point for components, while looking at a machine.

This is only possible if the AR application can properly access the real-time operational data from the system. In this instance, the engineer could use the AR functionality of PTC’s ThingWorx 8 IIoT platform.

By using a purpose-built AR application, the engineer can view real-time system data from the ThingWorx IIoT platform and see which components are performing inefficiently. In addition, the functionality allows more senior engineers to accelerate the training of new staff, providing further long-term value to the company.

This will not entirely eliminate the planned downtime from maintenance, but it will significantly reduce the amount of time needed for routine check-ups, as engineers can easily identify the most effective and efficient way to approach the task.

It’s fitting, then, that data has been referred to as the new oil. Like oil, data is not inherently valuable: it must be refined and applied effectively before it yields its worth.

