
Research Article

Evaluation of next generation emission measurement technologies under repeatable test protocols

Authors:

Clay S. Bell ,

Colorado State University, Energy Institute, Fort Collins, Colorado, US

Timothy Vaughn,

Colorado State University, Energy Institute, Fort Collins, Colorado, US

Daniel Zimmerle

Colorado State University, Energy Institute, Fort Collins, Colorado, US

Abstract

Twelve next generation emission measurement (NGEM) technologies completed single-blind testing at the Methane Emissions Technology Evaluation Center in 2018. This is the first series of tests to evaluate a wide variety of NGEM solutions including handheld, mobile, and continuous monitoring methods using comparable, repeatable protocols. Results assess performance of detection, localization and quantification, albeit with limited statistical significance due to a low number of tests. Overall, a higher detection rate is observed for handheld and mobile solutions than for continuous monitoring solutions. Compared to when a single emission source is present, a decline in detection rate is observed across all methods when multiple, steady emission sources are present. Localization by handheld and mobile solutions is more accurate than by continuous monitoring solutions. These results support the common perception that detections by continuous monitoring systems will need to be confirmed and pinpointed by a follow-up inspection. Finally, this and other controlled release experiments have been performed across a limited range of environmental conditions. To develop the robust probability of detection curves needed for demonstrating the emission reduction potential of leak detection and repair programs, new protocols are needed to evaluate methods across a wide range of meteorological conditions and emission scenarios in a cost-effective manner.

Knowledge Domain: Atmospheric Science
How to Cite: Bell, C.S., Vaughn, T. and Zimmerle, D., 2020. Evaluation of next generation emission measurement technologies under repeatable test protocols. Elem Sci Anth, 8(1), p.32. DOI: http://doi.org/10.1525/elementa.426
Published on 13 Jul 2020
Accepted on 15 Jun 2020
Submitted on 02 Jan 2020
Domain Editor-in-Chief: Detlev Helmig; Institute of Alpine and Arctic Research, University of Colorado Boulder, US
Guest Editor: Brian Lamb; Washington State University, US


Introduction

Inventory and model-based estimates of methane emissions from the US natural gas supply chain vary and often disagree. For example, the U.S. greenhouse gas inventory estimates 6,700 Gg of methane emissions from natural gas systems in 2015 (1.2% of gross U.S. gas production), while Alvarez et al. developed a model-based estimate of 13,000 Gg (2.3% of gross U.S. gas production) (US EPA, 2019; Alvarez et al., 2018). Improving these estimates is critical to our understanding of the climate implications of switching to natural gas as a low-carbon fuel, since methane is a potent greenhouse gas, 84 times more potent than carbon dioxide on a 20-year time span, and these emissions offset potential near-term climate benefits (Pachauri and Meyer, 2014). Emission sources span a large range of magnitudes, are temporally variable, include many source types such as combustion exhaust, process vents, and fugitive leaks, and are spatially distributed across many facilities and facility types nationwide. The variety of source and facility types, the range and temporal variability of emission rates, and the spatial extent of the system make emission detection and measurement challenging and have necessitated the use of many methods in recent studies (Bell et al., 2017; Harriss et al., 2015; Marchese et al., 2015; Robertson et al., 2017; Schwietzke et al., 2017; Subramanian et al., 2015; Vaughn et al., 2018, 2017).

In 2015 the U.S. Department of Energy Advanced Research Project Agency – Energy (ARPA-E) funded the development of 11 next generation emission measurement (NGEM) technologies under the Methane Observation Networks with Innovative Technology to Obtain Reductions (MONITOR) program (U.S. Department of Energy, 2014). These and other NGEM technologies include a wide range of sensors (acoustic monitors, in-situ samplers, open path measurements, infrared, multispectral and hyperspectral imaging), deployment platforms (handheld systems, stationary sensor networks, unmanned aerial vehicles, piloted aircraft, and satellites) and use cases (voluntary monitoring, monitoring for environmental compliance, and critical safety applications) (Fox et al., 2019a). Many of these solutions aim to be implemented in Leak Detection and Repair (LDAR) programs established by operators to reduce emissions from their assets and to comply with state and federal regulations (Colorado Department of Public Health and Environment, Air Quality Control Commission, 2019; U.S. EPA, 2015). However, regulatory compliance of these programs requires the use of approved methods to perform leak detection. Many new solutions may be capable of achieving equivalent or better emission reductions, but demonstrating this equivalency remains a barrier to widespread adoption across the industry. A framework to demonstrate equivalent emissions reduction potential of LDAR programs was recently developed and has received widespread support across stakeholders (Fox et al., 2019b). This framework identified the need for performance testing of technologies under standardized protocols (the focus of this paper), coupled with modeling and field trials to achieve a full approval. Currently, such a standardized test protocol to evaluate the performance of methane detection technologies and allow direct comparisons of different solutions does not exist.

In 2016 Colorado State University was selected by ARPA-E to design, construct and operate the Methane Emissions Technology Evaluation Center (METEC) as a proving ground to evaluate technologies developed by MONITOR. Two rounds of blind tests were performed at METEC by the ARPA-E MONITOR complete solutions. The first round (R1) was performed in Spring 2017 and was intended to provide a proof of concept for the ARPA-E MONITOR performers in a simplified field setting. The second round (R2) was performed in Spring/Summer 2018 and was intended to provide more realistic emission scenarios by introducing larger facilities, the presence of multiple sources, and unsteady emission sources. Testing following the R2 protocols was offered to other non-MONITOR technologies in Summer 2018. There were no screening or selection criteria; however, non-MONITOR participants were required to pay for testing time at METEC under a standardized fee structure. In total, six MONITOR and six non-MONITOR technologies (herein collectively referred to as Performers) participated using the R2 test protocols.

This paper discusses the R2 protocols and test results, and investigates the test requirements for evaluation of detection curves required by simulation models in the proposed equivalency framework.

Methods

Methane Emission Technology Evaluation Center

All experiments for R2 testing were performed at METEC (SM – Section 1). METEC was designed to include representations of natural gas facilities and equipment including production well pads, a small gathering facility, and buried pipelines similar to those found in gathering or distribution systems. This equipment was outfitted with release points where the flowrate can be controlled to simulate emission sources observed in field measurements. Emission release points included representations of process emission sources (e.g. gas operated pneumatic controllers) and fugitive emission sources (flanges, fittings, instrument ports, valve packings, etc.). An electronic control system was designed to allow emission release points to operate at steady or unsteady rates. The system allowed multiple emissions on each pad to be operated simultaneously and controlled independently.

The control system at METEC used precision orifices to control flowrates at emission locations throughout the facility. The pressure to each orifice-based flow controller was manually set by the operator before each test to achieve the target flowrates. Flowrates were metered using thermal mass flow meters. A pre-test calibration was used to assess emission flowrates if multiple sources were downstream of a single flow meter during a test. All tests were performed with compressed natural gas (CNG), a mixture of methane, ethane, propane, and carbon dioxide. Gas composition was assessed prior to testing using a gas chromatograph and quantification results were analyzed in terms of methane emission rates.
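As a minimal sketch of the calibration-based apportionment described above, the snippet below splits one metered total across downstream orifices in proportion to per-orifice flow coefficients. The orifice IDs and coefficient values are hypothetical illustrations, not METEC's actual calibration data or procedure.

```python
# Sketch: apportion one metered flowrate across several orifices downstream
# of a single flow meter, using hypothetical pre-test calibration factors
# (per-orifice flow coefficients at the set supply pressure).

def apportion(total_scfh, cal_factors):
    """Split a metered total among orifices in proportion to their
    calibrated flow coefficients."""
    total_coeff = sum(cal_factors.values())
    return {oid: total_scfh * c / total_coeff
            for oid, c in cal_factors.items()}

# A 2:1 coefficient ratio splits 12 scfh as 8 and 4 scfh.
split = apportion(12.0, {"orifice-A": 2.0, "orifice-B": 1.0})
```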

Description of tests and protocols

Test protocols were developed with three main objectives: (1) to enable testing of solutions as they would be deployed in the field, in either pad-by-pad surveys or as continuous monitors; (2) to include grades of test complexity to allow evaluation of performers at a range of technical readiness levels, from early stage technologies to well-developed solutions; and (3) to produce repeatable emission rates and emission locations in the series of tests, allowing testing to be completed at different times for each performer.

Solution deployment methods

Two test protocols (SM – Section 2) were developed to support two distinct modes of solution deployment: (1) survey solutions in which a handheld or otherwise mobile instrument is deployed within the fence line of a production well pad to assess emissions during a routine, scheduled visit. Survey methods typically move pad-by-pad through an operational region and assess a snapshot of emissions at the time of the visit; and (2) continuous monitoring solutions in which an instrument is deployed as a fixed installation to monitor emissions from a system for an extended period, typically months or years. Emission detection methods work at a variety of scales or resolutions. Leak detections from either survey or continuous monitoring methods may be localized to the facility (the facility does/does not have emissions), to a unit of major equipment (emissions detected somewhere on a unit of equipment) or to a single component. Emission detection and sizing can range from binary presence/absence of emissions – i.e. leak detection only – to estimates of the emission rate at varying levels of precision. Finally, solutions may be designed to monitor one or multiple nearby facilities using a single instrument.

While deployment modes varied, test protocols for both survey and continuous monitoring solutions followed a similar testing process: The METEC operator established an emissions configuration on an assigned pad, the performer completed measurements on the pad, and reported results to METEC. Measurement teams were allowed time to post-process data prior to submitting their results. Test conditions were not disclosed to the performer until they had reported results to METEC.

Test complexity

Individual tests were developed using three levels:

  • A – Single emission source operated continuously at a steady flowrate for the duration of the test. This complexity level provides a basic test configuration in which to detect emissions, but is rare in field conditions.
  • B – Multiple emission sources each operated continuously at a steady flowrate for the duration of the test, with a maximum of one single emission source on a single unit of major equipment (a well head, separator, or tank). This complexity level tests the ability of solutions to discriminate between multiple sources, while not introducing the complexity of varying flowrates or sources in close proximity to each other.
  • C – Multiple emission sources operated intermittently or continuously at steady flowrates. More than one emission source may be located on the same unit of equipment. This complexity level is intended to approach realistic field conditions by introducing intermittent emission sources in addition to steady emission sources.
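The three complexity levels can be captured by a small classifier. The sketch below is illustrative only: the field names, and the treatment of any configuration containing an intermittent source as complexity C, are our assumptions rather than the protocol's formal schema.

```python
# Sketch: classify a test configuration into complexity level A, B, or C
# from the rules above. Field names are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class EmissionSource:
    equipment_id: str   # unit of major equipment, e.g. "separator-1"
    rate_scfh: float    # target methane emission rate
    intermittent: bool  # intermittent sources appear only at complexity C

def complexity(sources):
    """Return 'A', 'B', or 'C' for a list of EmissionSource objects."""
    if any(s.intermittent for s in sources):
        return "C"                      # intermittent sources -> C
    if len(sources) == 1:
        return "A"                      # single steady source
    units = [s.equipment_id for s in sources]
    if len(units) == len(set(units)):
        return "B"                      # at most one source per unit
    return "C"                          # multiple sources share a unit
```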

Test repeatability

To allow comparisons to be made between performers tested at different times, a set of predefined test conditions was developed. Each test was driven by computer software which opened and closed valves on the METEC facility according to a predefined schedule. This allowed emission sources to be generated at the same locations, using the same valve and orifice combinations, and on the same temporal schedule for each performer. A pressure setpoint was defined and manually set by the operator for each test to achieve repeatable emission rates through the orifice-based flow control system (SM – Section 3).
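A deterministic, schedule-driven controller of the kind described above might look like the following sketch; the valve IDs, timings, and stubbed actuation are hypothetical, not METEC's actual control software.

```python
# Sketch: replay a predefined valve schedule identically for each performer.
# Events are (seconds from test start, valve id, open?); IDs are made up.

SCHEDULE = [
    (0,   "pad2-valve07", True),
    (300, "pad2-valve07", False),  # e.g. a 5-minute intermittent cycle
    (300, "pad2-valve12", True),
    (900, "pad2-valve12", False),
]

def replay(schedule):
    """Return the actuation sequence in time order (hardware I/O stubbed)."""
    return [(t, valve, "open" if state else "close")
            for t, valve, state in sorted(schedule)]
```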

Interpretation of emission detections

For each test, performers were required to report the leak location using GPS coordinates or a 0.25 m grid system. If the solution indicated the capability to quantify emissions, performers also reported an emission rate. Reported locations were translated into detection categories (SM – Section 4) using the reported pad, equipment type and equipment ID, relative to the actual location of the emission source, resulting in four categories of detection:

  • Equipment detect – An emission source was reported on the same unit of equipment as an actual emission source.
  • Group detect – An emission source was reported on the same group of equipment where an emission source was present, however the reported location was not on the same unit of equipment as the actual source.
  • False negative – No emission sources were reported on the group of equipment where an emission source was present during the test.
  • False positive – An emission source was reported on a group of equipment where no emission sources were present during the test.
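Under assumed field names (not the study's actual reporting schema, which is given in SM – Section 4), the matching rule behind these four categories might be sketched as:

```python
# Sketch: categorize one reported emission against one actual source.
# Each location is a dict with 'group' and 'equipment' keys, or None when
# nothing was reported / no source was present. Keys are hypothetical.

def classify(reported, actual):
    if reported is None:
        return "false negative"   # source present, nothing reported
    if actual is None:
        return "false positive"   # report on a group with no source
    if reported["equipment"] == actual["equipment"]:
        return "equipment detect"
    if reported["group"] == actual["group"]:
        return "group detect"
    return "false positive"       # reported location matches no source
```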

Limitations of testing

In general, continuous monitoring solutions were at a lower technology readiness level than survey solutions, making it more difficult to design tests that avoid biasing results. Three aspects of the test protocol proved particularly challenging for continuous monitoring solutions:

  1. Some continuous monitoring solutions have focused development primarily on detecting large emitters, due, in part, to the performance of the low-cost sensors (<$100 per unit) used in most solutions. The emission rates used here were taken from field measurements; the range of selected rates covered over 90% of 299 source measurements at production facilities by Bell et al., 2017. Therefore, the rates used here may fall below the lower detection limit of some continuous monitoring solutions, indicating that these solutions would not find most typical leaks seen in field studies of production emissions. However, an ‘always-on’ continuous monitor that detects large emission sources quickly could enable significant emissions reduction by identifying failures quickly – an operational modality that was not tested in this study.
  2. Since, by definition, the positioning of the sensor(s) used in continuous monitoring solutions is fixed, most of these solutions must acquire data across a range of wind directions to detect emissions. The time allotted for each test in the protocol may have been shorter than this data acquisition time in some cases, providing insufficient data for detection and localization algorithms to converge to a more accurate result. That said, performers were given the option to test for longer periods prior to the controlled trials and/or request more operational time; none did. While this limitation can be overcome in future tests by simply allotting more time per test, extended test durations increase overall testing costs and time. Additionally, the emission profile of oil and gas facilities is known to include many temporally variable sources, and increasing test times may make the testing less realistic. Future testing protocols could allow for two operational modes – a long duration mode to support algorithm testing and development, and a realistic operation mode, where emission patterns and variability match those seen in field studies.
  3. Some continuous monitoring solutions are designed to only provide detections localized to the equipment group, or, in some solutions, to the well pad. Therefore, it is important to consider both equipment- and group-detections as positive results. In practical deployments, either type of detection would likely be followed by a handheld survey to identify the equipment and component where the emissions were released.

Performers

Performers who completed testing are listed in Table 1. Seven solutions were tested using the survey protocol and five using the continuous monitoring protocol. Performers completing testing under the survey protocol were classified as (1) handheld – where a handheld instrument is used by the performer, and (2) mobile – where an instrument is mounted on a drone, vehicle or other means of mobility. Note that both handheld and mobile systems were deployed on-pad in this study, and mobile methods here are distinctly different from “screening” methods as discussed by Fox et al. (2019a). Two detection-only solutions reported data in a different format than requested, which prevented their inclusion in the analysis below.

Table 1

Performers and measurement methods included in study. DOI: https://doi.org/10.1525/elementa.426.t1

Performer Description1 Deployment Mode Data Type

Aeris Technologies(ARPA-E MONITOR Performer) The Aeris R2 ARPA-E testing comprised a fixed Aeris Pico MIR CH4/C2H6 laser absorption spectrometer, a sonic anemometer, and neural-network based advanced analysis algorithm to identify leak location and size. Continuous Monitoring Detection, localization, quantification
Alert Plus2 Alert Plus Aegis can be implemented with a range of sensor types as a continuous emission data collection system. The CSU round up utilized point infrared sensors distributed throughout the equipment and connected to the 4 channels available on the Aegis system. Precise placement is a key logistic for maximum performance. Continuous Monitoring Detection
Heath Consulting2 The Remote Emission Monitor (RMLD-REM) is based on the TDLAS approach. The REM is a fixed deployed open path laser which is continuously measuring the methane level. Cloud based analytics processes, store and notifies when a leak is detected. Continuous Monitoring Detection
IBM T.J. Watson Research Center (ARPA-E MONITOR Performer) Mesh network of Metal-Oxide-Semiconductor sensors placed around the well pad perimeter at 1.5 m height above the ground measure time synchronized methane and wind condition. System designed for 24/7 remote operations and data acquisition rate of 1 Hz with integrated analytics for localization, emission rate estimates, and threshold warning system. Continuous Monitoring Detection, localization, quantification
University of Colorado, Boulder (ARPA-E MONITOR Performer) Stationary long-distance laser-based in-situ sensor coupled with dispersion modeling to detect, localize, and quantify specific methane emissions across multiple square kilometer regions. The dual-frequency comb laser spectrometer uses 100,000+ distinct light wavelengths to measure methane with high precision and accuracy. A telescope transceiver directs the laser beams to open lines-of-sight downwind of potential sources. Continuous Monitoring Detection, localization, quantification
Bridger Photonics(ARPA-E MONITOR Performer) Bridger’s Gas Mapping LiDAR™ (GML) locates (equipment level) and quantifies (industry leading levels) methane leaks throughout the entire natural gas value chain. GML uses a proprietary laser-based aerial remote sensor. For METEC Round 2 we deployed on a drone to image/quantify gas plumes, provide 3D topography and capture aerial photography. Mobile Detection, localization, quantification
Fluke Prototype, handheld, long-wave IR OGI camera utilizing an uncooled microbolometer focal plane array (FPA) with modified optics to ensure sensitivity to methane in the 7.3–8.1 micron wavelength range. Handheld Detection, localization
Gas Detection Services TDLAS mounted on pickup truck monitoring air around assets, analyzing methane concentrations (PPM) on real time, and creating a digital map with color-coded (green, yellow, red) data points for better visualization. Analytics are an extra feature. Mobile Detection, localization
Heath Consulting The Remote Methane Leak Detector (RMLD) instrument is based on the TDLAS approach. The RMLD is handheld and was used as part of a walking survey. Handheld Detection, localization
LaSen ALPIS™ is a mid-IR Differential Absorption LIDAR (DIAL) airborne sensor unit mounted under a Bell Jet Ranger helicopter. The main sensor enclosure houses an eye-safe laser, computer, receiver optics, detectors, high-resolution imaging cameras, and a GPS receiver. Mobile Detection, localization, quantification
Physical Sciences Inc. (ARPA-E MONITOR Performer) ARPAe R2: Downward-looking backscatter-TDLAS flies aboard a small (0.6 m, 1.3 kg) UAV. Semi-autonomous flight patterns scan areas of interest, creating quantitative plume maps overlain upon aerial camera images. Leak source locations are deduced from the imagery. Mass-balance analysis of the measured downwind plume concentration yields leak rate. Mobile Detection, localization, quantification
Rebellion Photonics (ARPA-E MONITOR Performer) With combined physics-based and data-driven analytics on edge device, the Gas Cloud Imaging (GCI) Technology is uniquely capable of processing real-time hyperspectral and visual data, identifying and quantifying hydrocarbons in real-time video. The miniGCI camera was mounted on a tripod or on a lift in a vehicle for this study. Mobile Detection, localization, quantification

1 Solution descriptions were written and provided by the performer, not by the authors.

2 Solution results did not conform to provided format and have been omitted from analysis.

Data aggregation and blinding

Due to confidentiality agreements in place at the time of testing, we do not analyze the performance of individual solutions in this paper. Instead we present results aggregated by test protocol and by solution deployment mode to blind the dataset. Therefore, it is important to note that the aggregated results do not illustrate the performance of any individual solution. Additionally, the analysis presented was performed solely by the authors and does not represent the opinions of the performers.

Protocol comparison to mobile monitoring challenge

Testing in this program had three primary differences from that performed in the Mobile Monitoring Challenge (MMC) (Ravikumar et al., 2019):

  1. Experiments designed to assess detection in the MMC included only a single emission source, resulting in four possible outcomes for each experiment (true positive, false positive, true negative, and false negative). When multiple emission sources were present as part of the MMC quantification testing, different conventions were applied to interpret detections. For example, a scenario where three leaks were present and a single emission rate was reported was interpreted as three true positive detections. In contrast, in this study each reported emission was matched to a single controlled release. The equipment detect and group detect in our analysis are similar to the Level-1 and Level-2 true positive interpretation used by the Mobile Monitoring Challenge, however a true positive Level-3 detection would be identified as a false positive in our analysis.
  2. Testing under the Mobile Monitoring Challenge included only cases where emission sources operated at a steady flowrate, and did not analyze experiments where a single release was present separately from those where multiple releases were present. Experiments in this paper identify the test complexity as an independent parameter to investigate the impact of single, multiple, and intermittent emission sources on solution performance as measured by controlled release experiments.
  3. This paper includes testing of mobile solutions, similar to those tested in Mobile Monitoring Challenge but also includes handheld solutions, as well as continuous monitoring solutions. We test handheld and mobile leak detection solutions under the same protocol since these methods would be applied in a similar manner where a team moves through a region from facility to facility performing emission surveys and recording leaks detected. We test continuous monitoring solutions under a separate protocol since these systems would not be deployed in a similar way. Both protocols test solutions under a range of emission rates and test complexity, and the same rules are applied to interpret emission detection reports.

Test results and discussion

A total of 192 tests were performed by the 10 solutions analyzed. An empirical cumulative distribution function (CDF) of emission rates is shown in Figure 1 for survey experiments (handheld and mobile solutions) and continuous monitoring experiments. We present emission rates in standard cubic feet per hour (scfh) of methane using the Compressed Gas Association standard conditions of 70°F and 14.7 psia because the U.S. gas industry typically measures flow rates in standard cubic feet. Under this standard, 1 scfh of methane is equal to approximately 18.8 g·h⁻¹. Tests were designed to target the ARPA-E MONITOR flowrate metric of 6 scfh, however the mean emission rate of emission points in the survey protocol (7.5 scfh, σ = 5.6 scfh) was slightly higher than the mean emission rate in the continuous monitoring protocol (5.2 scfh, σ = 4.0 scfh). A CDF of 299 direct measurements of emission sources at production sites in the Fayetteville shale, AR, is shown for comparison (mean emission rate = 11.1 scfh), with 94% of measurements found within the range of emission rates included in this series of tests (Bell et al., 2017). Although the mean of direct measurements was higher than the mean of the emission rates in this study, a larger fraction of sources was measured below 6 scfh by Bell et al. than was included in the survey and continuous monitoring protocols.
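The quoted conversion factor follows directly from the ideal gas law at the stated standard conditions:

```python
# Verify that 1 scf of methane at CGA standard conditions (70 °F, 14.7 psia)
# is about 18.8 g, so 1 scfh of CH4 is roughly 18.8 g/h.

R = 8.314                          # J/(mol*K)
T = (70 - 32) * 5 / 9 + 273.15     # 70 °F -> 294.26 K
P = 14.7 * 6894.757                # psia -> Pa
V = 0.0283168                      # 1 ft^3 in m^3
M_CH4 = 16.04                      # g/mol

moles = P * V / (R * T)            # ideal gas law: n = PV/RT
grams_per_scf = moles * M_CH4      # ~18.8 g per standard cubic foot
```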

Figure 1 

Empirical cumulative distribution function of emission rates included in tests under survey and continuous monitoring protocols. Protocols were designed to target the ARPA-E MONITOR flowrate metric of 6 scfh with mean emission rate of 7.5 scfh in survey protocol and 5.2 scfh in continuous monitoring protocol. 94% of 299 direct measurements made at production facilities by Bell et al. (2017) are within the range of emission rates included in this set of experiments. DOI: https://doi.org/10.1525/elementa.426.f1

Detection by emission rate

When results are aggregated by deployment mode, we observe increasing detection rates with increasing emission rates (Figure 2, left). The fraction of emission sources detected increases approximately linearly across the range of emission rates tested.

Figure 2 

Detection rates versus emission rate (left) and test complexity (right). Bin counts are shown above each bin. Detection rate of all solutions increases with increasing leak rates and decreases with increasing test complexity. Handheld solutions, with a human operator confirming detections during survey, exhibit highest detection rates. Mobile solutions, where the operator is supervising the survey but not directly confirming detections, exhibit slightly lower detection rates than handheld solutions, particularly as test complexity is increased. Continuous monitoring solutions exhibit much lower detection rates than handheld and mobile survey methods for similar emission rates, however detection rates of continuous monitoring solutions improve considering group detections, reflecting the fact that many may be designed to detect at the equipment group-level and may not be intended to provide more accurate localization. DOI: https://doi.org/10.1525/elementa.426.f2

Handheld solutions, which include a human operator moving through the pad and confirming detected emissions as part of the detection process, exhibited the highest detection rates among the deployment modes tested, and are the only category which achieved 100% detection of emission sources greater than 10 scfh (22 detections). Note that the survey protocol evaluates the performance of the solution plus the human operator, and is intended to evaluate the solution performance as deployed in the field. The performance of the sensor plus operator is notably different from the performance of the sensor alone in a controlled laboratory environment.

Mobile solutions exhibited overall detection rates (85%) comparable to handheld detectors (90%); however, they did not achieve 100% detection of sources greater than 10 scfh as handheld detectors did. Mobile solutions also include a human operator who typically remains at the pad edge and may complete additional data collection using the solution if a leak is suspected; however, the operator typically does not confirm detections directly.

Considering detections in Figure 2 which identified emissions from the correct equipment group, continuous monitoring solutions also exhibit increasing detection rates with increasing emission rates. However, the detection rates observed are lower than those of survey solutions, particularly at emission rates below 6 scfh. The drop in detection rate at 8–10 scfh can be partially attributed to a low overall count in that bin (11 emission sources) and to the absence of test complexity A experiments with emission sources between 8–10 scfh. We acknowledge the limitations of testing discussed earlier for continuous monitors, particularly the detection limits and the time allotted per experiment.
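The binning behind detection-rate curves like Figure 2 (left) can be computed as below; the example data are made up for illustration and are not the study's results.

```python
# Sketch: empirical detection rate per emission-rate bin.
# results: (emission_rate_scfh, detected) pairs; edges: ascending bin edges.

def detection_rate_by_bin(results, edges):
    counts = [[0, 0] for _ in range(len(edges) - 1)]  # [detections, total]
    for rate, detected in results:
        for b in range(len(edges) - 1):
            if edges[b] <= rate < edges[b + 1]:
                counts[b][0] += int(detected)
                counts[b][1] += 1
                break
    # (detection fraction, or None if the bin is empty; bin count) per bin
    return [(d / t if t else None, t) for d, t in counts]
```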

Detection by test complexity

As test complexity was increased, detection rate generally decreased (Figure 2, right). The detection rate, including same group detections, decreased from 94% when only a single steady emission source was present (Test Complexity A) to 79% when multiple steady emission sources were present (Test Complexity B). This suggests that solutions found it more difficult to detect and isolate multiple sources from one another than to simply detect the presence of an emission. This result has important implications with respect to field deployment since oil and gas facilities often include process emissions from pneumatic controllers, pressure relief valves, and compressor exhaust gases. This is particularly true for continuous monitoring solutions, which will need to separate fugitive emission sources from a temporally variable background caused by local process emissions to provide actionable data in the form of emission rate estimates, location estimates, and/or alarms to an operator.

When intermittent emission sources were introduced (Test Complexity C) the overall detection rate further decreased to 63%; however, intermittent emission sources had an uneven effect on different deployment modes. Detection rates of handheld solutions increased from complexity B to complexity C. This may be due to the human operator hearing the actuation of solenoid valves during the test and deducing that an emission source had started or stopped. Note that this is analogous to a pneumatic controller actuation on a real facility, which also produces an audible signal. In contrast, detection rates decreased when intermittent sources were included for mobile and continuous monitoring solutions, where the operator is more removed from the equipment during the test and would not be able to deduce the start or stop of an emission source from an audible actuation. It is important to note that only 35% of emission sources in complexity C were intermittent, and therefore one would expect only a small change relative to test complexity B.

The testing performed here – including the ‘C’ complexity tests – represented a less-complex emission environment than is typical in field conditions at most well pads. Key differences include: (a) the timing of intermittent vents – intermittent emissions used here were more frequent (5-minute cycle time) and more regular (±1 second) than is typical of intermittent vents on field equipment; (b) no variable equipment failures – the tests did not include emissions that persist for minutes or hours before stopping or changing rate; and (c) no variable leak rates – in field locations, leak rates from some sources, such as tank vents, may vary with the cycling of other equipment on the well pad. Given the decrease in detections with increased complexity seen in this study, it is reasonable to expect that testing with full facility complexity will likely reduce the probability of detection further.

False positives

False positive detections were reported at all test complexity levels and by all deployment modes (Figure 3). Mobile solutions had the lowest false positive rate as a fraction of the total number of reported emission sources, including zero false positives in Test Complexities B (67 reported emissions) and C (28 reported emissions). Handheld solutions had low false positive rates (<5% of reported sources) in Test Complexities A and B; however, the rate increased to 25% (5 false positives among 20 reported emission sources) in Test Complexity C. Continuous monitoring solutions had the highest false positive rate, with 35% of reported sources identified as false positives. This implies that if an LDAR program were to use a continuous monitoring solution to vector a repair team, then under this series of tests the repair team would locate an actual emission source only 65% of the time. Since test times were limited in this study, longer tests and improved analytics, using more data collected over longer periods and under more variable wind conditions, may improve the performance of continuous monitoring solutions. In addition, the strength of continuous monitoring solutions may lie in finding large emitters quickly, and large emitters were not tested in this study.
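The false positive rate used here is simply the share of reported sources that did not correspond to a controlled release, and its complement is the chance a dispatched repair team finds a real source. A minimal sketch using the handheld Complexity C figures from the text (5 of 20):

```python
def false_positive_rate(reported, false_positives):
    """False positives as a fraction of all reported emission sources."""
    if reported == 0:
        return 0.0
    return false_positives / reported

# Handheld, Test Complexity C: 5 false positives among 20 reported sources.
handheld_c = false_positive_rate(20, 5)          # 0.25
# Complement of the 35% continuous-monitoring rate: share of follow-ups
# that would find an actual source.
find_rate = 1.0 - false_positive_rate(100, 35)   # 0.65
```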

Figure 3 

False positives versus test complexity for each deployment mode. (a) In aggregate, 17% of reported emissions were false positives. (b) Handheld solutions included false positives in all test complexities, with a 9% false positive rate overall. (c) Mobile solutions had the lowest overall false positive rate of 1.5%, with no false positive reports in Test Complexities B and C. (d) 35% of emission sources reported by continuous monitoring solutions were false positives. DOI: https://doi.org/10.1525/elementa.426.f3

Localization

Distances between the actual emission location and the emission location reported by performers were calculated using GPS coordinates for equipment detects and group detects. Handheld solutions reported 50% of emission sources to within 1 m of the actual emission location, compared with 82% for mobile solutions (Figure 4). Some portion of this difference may be due to reporting. Manual solutions reported coordinates by reading GIS map layers (kml files provided by METEC) for horizontal coordinates and measuring distance from the ground for the vertical coordinate (SM – Section 2). This type of localization was unfamiliar to the teams and could have introduced some error. However, it is also indicative of field performance, since future automated systems will likely require this type of reporting. In contrast, mobile solutions often included algorithms for pinpointing the emission source relative to the sensor's position using an onboard GPS sensor. For these solutions, the positioning is intrinsic to the solution and represents the performance of the full method as implemented.
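A 2D ground distance between a reported and an actual GPS fix can be computed several ways; the equirectangular approximation below is adequate at well-pad scales (tens of meters) and is offered as an illustrative sketch, not necessarily the calculation used in the study:

```python
import math

def distance_2d_m(lat1, lon1, lat2, lon2):
    """Approximate 2D ground distance in meters between two GPS fixes.

    Equirectangular approximation: accurate to well under 1% over the
    tens-of-meters separations relevant to a well pad.
    """
    R = 6371000.0  # mean Earth radius, m
    dlat = math.radians(lat2 - lat1)
    # Scale longitude difference by cos(latitude) to get east-west meters.
    dlon = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2.0))
    return R * math.hypot(dlon, dlat)
```

For example, two points 0.001° of latitude apart evaluate to roughly 111 m, the familiar "one degree of latitude is about 111 km" scale.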

Figure 4 

2D distance between reported emission location and actual emission location vs metered emission rate (left) and empirical cumulative distribution function of 2D distance between reported emission location and actual emission location (right) by deployment type. DOI: https://doi.org/10.1525/elementa.426.f4

Location accuracy of continuous monitoring solutions was much lower than that of mobile and handheld solutions under this series of tests, with 18% of detected emission sources reported within 1 m of the actual source, and 51% reported more than 4 m from the actual emission source. Note that the continuous monitoring protocol included tests only on the larger (45 m × 60 m) Pad 4. This reflects the detection results (Figure 2), where continuous monitoring solutions often reported the emission location in the correct equipment group but not on the correct unit of equipment. This result is also consistent with the use case, where upon detection continuous monitoring solutions provide an alarm to the operator, who will then deploy a team with a handheld or mobile system to pinpoint and repair the emission source.

These results identify a key learning for integrating NGEM solutions into operator workflows: While current manually operated leak detection methods track detections by a component or location description (e.g. "the dump valve actuator on the east separator"), most NGEM solutions will likely report coordinates relative to a reference location at the facility, and these coordinates must be translated into work order instructions for follow-up teams. Further, reporting in this manner is often less subject to operator error and more amenable to long-term tracking and analysis. However, since most facilities currently have no such absolute coordinate system, additional work will be required to establish these reference points and to incorporate detection results into operational workflows.

Quantification

Mass flow rate estimates were reported for 143 detected emission sources by seven of the solutions under test (SM – Section 5). No handheld solutions reported quantification estimates of emission mass flow rates. Quantification error was calculated for each detection as the reported mass flow rate minus the metered mass flow rate (Error = ṁreported – ṁmetered). The distribution of quantification error from mobile methods was nearly centered around zero (Figure 5, left), with 43% of estimates lower (negative error) and 57% of estimates higher (positive error) than metered emission rates. Mean error from mobile solutions was 1.3 scfh (17% of the mean emission rate in the survey experiments). Measurement error in estimates from continuous monitoring solutions was generally biased high (Figure 5, right), with 25% of estimates lower (negative error) and 75% of estimates higher (positive error) than metered emission rates. Mean error from continuous monitoring solutions was 8.8 scfh (167% of the mean emission rate in the continuous monitoring experiments). Quantification error of individual performers shows solutions with better performance (narrower error bounds, illustrated by a steeper CDF, and higher accuracy, illustrated by a central value or mean error closer to zero), as well as solutions with worse performance (lower accuracy, illustrated by an error CDF shifted left or right) than the average results across all solutions.
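The error definition above (reported minus metered) and the mean-error-as-percent-of-mean-rate figures reduce to a few lines. A sketch using hypothetical flow-rate pairs, not the study's measurements:

```python
# Hypothetical (reported, metered) mass flow rate pairs in scfh.
pairs = [(10.0, 8.0), (5.0, 6.0), (12.0, 9.0), (7.0, 7.5)]

def quantification_error_stats(pairs):
    """Per-detection error (reported - metered), the mean error, and the
    mean error as a fraction of the mean metered emission rate."""
    errors = [reported - metered for reported, metered in pairs]
    mean_error = sum(errors) / len(errors)
    mean_metered = sum(metered for _, metered in pairs) / len(pairs)
    return errors, mean_error, mean_error / mean_metered

errors, mean_error, relative_bias = quantification_error_stats(pairs)
```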

Figure 5 

Quantification error of mobile solutions (left) and continuous monitoring solutions (right) for detected emission sources. Mobile methods produced quantification estimates that averaged 17% higher than the mean actual emission rate in testing, while continuous monitoring solutions overestimated emissions more substantially, by 167% of the mean actual emission rate. Quantification error of individual solutions (anonymized and shown as lighter lines) illustrates that individual performance may be biased lower or higher (shifted left or right) relative to the performance of all solutions combined. DOI: https://doi.org/10.1525/elementa.426.f5

These results indicate that quantification estimates from NGEM methods are likely to produce estimates of total emissions with high uncertainty and, for the solutions tested here, a positive bias of 17–167%. This suggests that more extensive testing focused on the accuracy and uncertainty of quantification methods is needed if these solutions are to inform operators or regulators of overall emissions from oil and gas operations.

Detection probability curves

Recent work to establish a "pathway to equivalency" has identified a modeling approach to demonstrate equivalent emission reduction potential of programs which use alternative methods for leak detection (Fox et al., 2019b). The modeling approach will rely on a method-specific detection probability curve, where the probability of detecting a given leak is a function of the source characteristics (emission rate, gas composition, component type, etc.) and/or environmental conditions (temperature, wind speed, wind direction, etc.). The R2 tests are insufficient to evaluate this probability curve, since they neither include significant variability in emission rates or environmental conditions, nor comprise enough tests to develop statistically significant characterizations of these metrics for each solution tested.

This discussion highlights several key points. First, this study, and all other controlled tests of NGEM solutions known to the authors, have performed testing in a narrow band of environmental conditions, insufficient to derive robust detection probability curves. Since the detection rate of most NGEM solutions is well understood to depend on environmental conditions, larger batteries of tests, over longer periods including more variable weather conditions, would be required to develop robust curves. Second, characterizing performance will likely require the construction of a probability-of-detection surface, with additional variables for environmental conditions (e.g. weather, terrain, vegetation). Key variables will likely differ between solutions, reflecting the technological constraints of each. This suggests new protocols are needed to evaluate methods across a wide range of meteorological conditions and emission scenarios in a cost-effective manner.
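Detection probability curves of the kind referenced above are commonly fit with a logistic model in source characteristics and environmental covariates. The sketch below uses hypothetical trial data and a minimal gradient-ascent fit; it illustrates the general technique, not any tested solution's data or the equivalency framework's specific method:

```python
import math

# Hypothetical trials: (emission rate in scfh, wind speed in m/s, detected 0/1).
trials = [
    (1.0, 2.0, 0), (2.0, 1.5, 0), (4.0, 3.0, 1), (6.0, 2.5, 1),
    (0.5, 4.0, 0), (8.0, 1.0, 1), (3.0, 3.5, 1), (1.5, 5.0, 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_pod_logistic(trials, lr=0.05, steps=2000):
    """Fit P(detect) = sigmoid(b0 + b1*rate + b2*wind) by gradient ascent
    on the log-likelihood. A minimal stand-in for a proper statistical fit."""
    b = [0.0, 0.0, 0.0]
    n = len(trials)
    for _ in range(steps):
        g = [0.0, 0.0, 0.0]
        for rate, wind, detected in trials:
            x = (1.0, rate, wind)
            p = sigmoid(b[0] * x[0] + b[1] * x[1] + b[2] * x[2])
            for j in range(3):
                g[j] += (detected - p) * x[j]
        for j in range(3):
            b[j] += lr * g[j] / n
    return b

def pod(b, rate, wind):
    """Predicted probability of detection at a given rate and wind speed."""
    return sigmoid(b[0] + b[1] * rate + b[2] * wind)

b = fit_pod_logistic(trials)
```

Extending this to a probability-of-detection surface amounts to adding terms for further covariates (terrain, sensor spacing, etc.), which is precisely why far more trials across varied conditions are needed for statistically meaningful coefficients.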

Field deployment

During this testing, solutions were tasked with detecting emissions, and were not tasked with discriminating between leaks (unplanned emissions) and vents (planned emissions). Leaks are generally equipment failures which release gas. Venting refers to emissions from equipment in the normal operation of that equipment. Emissions from gas-pneumatic controllers, pneumatic pumps, tank flash, and similar emissions are typically classified as venting. In field deployments at most onshore natural gas facilities, it will be necessary to distinguish between leaked and vented emissions to avoid unnecessary follow-up actions. Future protocols need to consider, if not implement, such testing.

Data Accessibility Statement

Data used in this analysis are available in the supplemental material.

Supplemental files

The supplemental files for this article can be found as follows:

Acknowledgements

We acknowledge the participation of all performing technology groups in this testing.

Funding information

Funding for the design, construction and operations of the Methane Emissions Technology Evaluation Center, including testing of MONITOR technology performers, was provided by the US Department of Energy ARPA-E MONITOR program. Non-MONITOR performers included in this testing paid to participate in the CSU Methane Solution Roundup.

Competing interests

The authors have no competing interests to declare.

Author contributions

  • Contributed to conception and design: CB, TV, DZ
  • Contributed to acquisition of data: CB, TV
  • Contributed to analysis and interpretation of data: CB, DZ
  • Drafted and/or revised the article: CB, TV, DZ
  • Approved the submitted version for publication: CB, TV, DZ

References

  1. Alvarez, RA, Zavala-Araiza, D, Lyon, DR, Allen, DT, Barkley, ZR, Brandt, AR, Davis, KJ, Herndon, SC, Jacob, DJ, Karion, A, Kort, EA, Lamb, BK, Lauvaux, T, Maasakkers, JD, Marchese, AJ, Omara, M, Pacala, SW, Peischl, J, Robinson, AL, Shepson, PB, Sweeney, C, Townsend-Small, A, Wofsy, SC and Hamburg, SP. 2018. Assessment of methane emissions from the U.S. oil and gas supply chain. Science, eaar7204. DOI: 10.1126/science.aar7204

  2. Bell, C, Vaughn, T, Zimmerle, D, Herndon, S, Yacovitch, T, Heath, G, Pétron, G, Edie, R, Field, R, Murphy, S, Robertson, A and Soltis, J. 2017. Comparison of methane emission estimates from multiple measurement techniques at natural gas production pads. Elem Sci Anth 5. DOI: 10.1525/elementa.266

  3. Colorado Department of Public Health and Environment, Air Quality Control Commission. 2019. Regulation Number 7: Control of Ozone Via Ozone Precursors and Control of Hydrocarbons Via Oil and Gas Emissions (Emissions of Volatile Organic Compounds and Nitrogen Oxides).

  4. Fox, TA, Barchyn, TE, Risk, D, Ravikumar, AP and Hugenholtz, CH. 2019a. A review of close-range and screening technologies for mitigating fugitive methane emissions in upstream oil and gas. Environ. Res. Lett. 14: 053002. DOI: 10.1088/1748-9326/ab0cc3

  5. Fox, TA, Ravikumar, AP, Hugenholtz, CH, Zimmerle, D, Barchyn, TE, Johnson, MR, Lyon, D and Taylor, T. 2019b. A methane emissions reduction equivalence framework for alternative leak detection and repair programs. Elem Sci Anth 7: 30. DOI: 10.1525/elementa.369

  6. Harriss, R, Alvarez, RA, Lyon, D, Zavala-Araiza, D, Nelson, D and Hamburg, SP. 2015. Using Multi-Scale Measurements to Improve Methane Emission Estimates from Oil and Gas Operations in the Barnett Shale Region, Texas. Environ. Sci. Technol. 49: 7524–7526. DOI: 10.1021/acs.est.5b02305

  7. Marchese, AJ, Vaughn, TL, Zimmerle, DJ, Martinez, DM, Williams, LL, Robinson, AL, Mitchell, AL, Subramanian, R, Tkacik, DS, Roscioli, JR and Herndon, SC. 2015. Methane Emissions from United States Natural Gas Gathering and Processing. Environ. Sci. Technol. 49: 10718–10727. DOI: 10.1021/acs.est.5b02275

  8. Pachauri, R and Meyer, L. 2014. IPCC, 2014: Climate Change 2014: Synthesis Report. Contribution of Working Groups I, II and III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. Geneva, Switzerland: IPCC.

  9. Ravikumar, AP, Sreedhara, S, Wang, J, Englander, J, Roda-Stuart, D, Bell, C, Zimmerle, D, Lyon, D, Mogstad, I, Ratner, B and Brandt, AR. 2019. Single-blind inter-comparison of methane detection technologies – results from the Stanford/EDF Mobile Monitoring Challenge. Elem Sci Anth 7: 37. DOI: 10.1525/elementa.373

  10. Robertson, AM, Edie, R, Snare, D, Soltis, J, Field, RA, Burkhart, MD, Bell, CS, Zimmerle, D and Murphy, SM. 2017. Variation in Methane Emission Rates from Well Pads in Four Oil and Gas Basins with Contrasting Production Volumes and Compositions. Environ. Sci. Technol. 51: 8832–8840. DOI: 10.1021/acs.est.7b00571

  11. Schwietzke, S, Pétron, G, Conley, S, Pickering, C, Mielke-Maday, I, Dlugokencky, EJ, Tans, PP, Vaughn, T, Bell, C, Zimmerle, D, Wolter, S, King, CW, White, AB, Coleman, T, Bianco, L and Schnell, RC. 2017. Improved Mechanistic Understanding of Natural Gas Methane Emissions from Spatially Resolved Aircraft Measurements. Environ. Sci. Technol. DOI: 10.1021/acs.est.7b01810

  12. Subramanian, R, Williams, LL, Vaughn, TL, Zimmerle, D, Roscioli, JR, Herndon, SC, Yacovitch, TI, Floerchinger, C, Tkacik, DS, Mitchell, AL, Sullivan, MR, Dallmann, TR and Robinson, AL. 2015. Methane Emissions from Natural Gas Compressor Stations in the Transmission and Storage Sector: Measurements and Comparisons with the EPA Greenhouse Gas Reporting Program Protocol. Environ. Sci. Technol. 49: 3252–3261. DOI: 10.1021/es5060258

  13. U.S. Department of Energy. 2014. ARPA-E | MONITOR [WWW Document]. ARPA-E Programs. URL https://arpa-e.energy.gov/?q=programs/monitor (accessed 6.5.19).

  14. U.S. EPA. 2015. Title 40, Part 60, Subpart OOOOa – Standards of Performance for Crude Oil and Natural Gas Facilities for which Construction, Modification or Reconstruction Commenced After September 18, 2015. Electronic Code of Federal Regulations.

  15. U.S. EPA, O. 2019. Inventory of U.S. Greenhouse Gas Emissions and Sinks: 1990–2017 [WWW Document]. US EPA. URL https://www.epa.gov/ghgemissions/inventory-us-greenhouse-gas-emissions-and-sinks-1990-2017 (accessed 6.5.19).

  16. Vaughn, TL, Bell, CS, Pickering, CK, Schwietzke, S, Heath, GA, Pétron, G, Zimmerle, DJ, Schnell, RC and Nummedal, D. 2018. Temporal variability largely explains top-down/bottom-up difference in methane emission estimates from a natural gas production region. Proc. Natl. Acad. Sci. 201805687. DOI: 10.1073/pnas.1805687115

  17. Vaughn, TL, Bell, CS, Yacovitch, TI, Roscioli, JR, Herndon, SC, Conley, S, Schwietzke, S, Heath, GA, Pétron, G and Zimmerle, D. 2017. Comparing facility-level methane emission rate estimates at natural gas gathering and boosting stations. Elem Sci Anth 5. DOI: 10.1525/elementa.257
