Evaluation of next generation emission measurement technologies under repeatable test protocols



Introduction
Inventory- and model-based estimates of methane emissions from the US natural gas supply chain vary and often disagree. For example, the U.S. greenhouse gas inventory estimates 6,700 Gg of methane emissions from natural gas systems in 2015 (1.2% of gross U.S. gas production), while Alvarez et al. developed a model-based estimate of 13,000 Gg (2.3% of gross U.S. gas production) (US EPA, 2019; Alvarez et al., 2018). Improving these estimates is critical to our understanding of the climate implications of switching to natural gas as a low-carbon fuel: methane is a potent greenhouse gas, 84 times more potent than carbon dioxide on a 20-year time span, and these emissions offset potential near-term climate benefits (Pachauri and Meyer, 2014). Emission sources span a large range of magnitudes, are temporally variable, include many source types such as combustion exhaust, process vents and fugitive leaks, and are spatially distributed across many facilities and facility types nationwide. The variety of source and facility types, the range and temporal variability of emission rates, and the spatial extent of the system make emission detection and measurement challenging and have necessitated the use of many methods in recent studies (Harriss et al., 2015; Marchese et al., 2015; Robertson et al., 2017; Schwietzke et al., 2017; Subramanian et al., 2015; Vaughn et al., 2017, 2018).
In 2015 the U.S. Department of Energy Advanced Research Projects Agency-Energy (ARPA-E) funded the development of 11 next generation emission measurement (NGEM) technologies under the Methane Observation Networks with Innovative Technology to Obtain Reductions (MONITOR) program (U.S. Department of Energy, 2014). These and other NGEM technologies include a wide range of sensors (acoustic monitors, in-situ samplers, open path measurements, infrared, multispectral and hyperspectral imaging), deployment platforms (handheld systems, stationary sensor networks, unmanned aerial vehicles, piloted aircraft, and satellites) and use cases (voluntary monitoring, monitoring for environmental compliance, and critical safety applications) (Fox et al., 2019a). Many of these solutions aim to be implemented in Leak Detection and Repair (LDAR) programs established by operators to reduce emissions from their assets and to comply with state and federal regulations (Colorado Department of Public Health and Environment, Air Quality Control Commission, 2019; U.S. EPA, 2015). However, regulatory compliance of these programs requires the use of approved methods to perform leak detection. Many new solutions may be capable of achieving equivalent or better emission reductions, but demonstrating this equivalency remains a barrier to widespread adoption across the industry. A framework to demonstrate equivalent emissions reduction potential of LDAR programs was recently developed and has received widespread support across stakeholders (Fox et al., 2019b). This framework identified the need for performance testing of technologies under standardized protocols (the focus of this paper), coupled with modeling and field trials to achieve a full approval. Currently, such a standardized test protocol to evaluate the performance of methane detection technologies and allow direct comparisons of different solutions does not exist.
In 2016 Colorado State University was selected by ARPA-E to design, construct, and operate the Methane Emissions Technology Evaluation Center (METEC) as a proving ground to evaluate technologies developed under MONITOR. Two rounds of blind tests of the ARPA-E MONITOR complete solutions were performed at METEC. The first round (R1) was performed in Spring 2017 and was intended to provide a proof of concept for the ARPA-E MONITOR performers in a simplified field setting. The second round (R2) was performed in Spring/Summer 2018 and was intended to provide more realistic emission scenarios by introducing larger facilities, the presence of multiple sources, and unsteady emission sources. Testing following the R2 protocols was offered to other, non-MONITOR technologies in Summer 2018. There were no screening or selection criteria; however, non-MONITOR participants were required to pay for testing time at METEC under a standardized fee structure. In total, six MONITOR and six non-MONITOR technologies (herein collectively referred to as Performers) participated using the R2 test protocols. This paper discusses the R2 protocols and test results, and investigates the test requirements for evaluation of detection curves required by simulation models in the proposed equivalency framework.

Methane Emission Technology Evaluation Center
All experiments for R2 testing were performed at METEC (SM - Section 1). METEC was designed to include representations of natural gas facilities and equipment including production well pads, a small gathering facility, and buried pipelines similar to those found in gathering or distribution systems. This equipment was outfitted with release points where the flowrate could be controlled to simulate emission sources observed in field measurements. Emission release points included representations of process emission sources (e.g. gas-operated pneumatic controllers) and fugitive emission sources (flanges, fittings, instrument ports, valve packings, etc.). An electronic control system was designed to allow emission release points to operate at steady or unsteady rates. The system allowed multiple emissions on each pad to be operated simultaneously and controlled independently.
The control system at METEC used precision orifices to control flowrates at emission locations throughout the facility. The pressure to each orifice-based flow controller was manually set by the operator before each test to achieve the target flowrates. Flowrates were metered using thermal mass flow meters. A pre-test calibration was used to assess emission flowrates if multiple sources were downstream of a single flow meter during a test. All tests were performed with compressed natural gas (CNG), a mixture of methane, ethane, propane, and carbon dioxide. Gas composition was assessed prior to testing using a gas chromatograph and quantification results were analyzed in terms of methane emission rates.
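As an illustrative sketch only, the ideal choked-flow relation shows why a single upstream pressure setpoint yields a repeatable flowrate through a critical orifice: below the critical pressure ratio, mass flow depends linearly on upstream pressure and not on downstream conditions. The orifice dimensions, discharge coefficient, and gas properties below are assumed values, not METEC calibration data.

```python
import math

def choked_mass_flow(p0_pa, t0_k, d_orifice_m, gamma=1.32, mw_kg_mol=0.0164, cd=0.8):
    """Ideal choked (critical) mass flow through an orifice, kg/s.

    Valid when downstream pressure is below ~0.54 * p0 for a methane-like gas.
    All parameter values here are illustrative assumptions.
    """
    r_specific = 8.314 / mw_kg_mol           # specific gas constant, J/(kg K)
    area = math.pi * (d_orifice_m / 2) ** 2  # orifice area, m^2
    term = (2 / (gamma + 1)) ** ((gamma + 1) / (2 * (gamma - 1)))
    return cd * area * p0_pa * math.sqrt(gamma / (r_specific * t0_k)) * term

# Doubling the upstream pressure setpoint doubles the choked flowrate, which is
# why a single manual pressure adjustment can set a repeatable emission rate.
low = choked_mass_flow(2e5, 293.15, 2.5e-4)
high = choked_mass_flow(4e5, 293.15, 2.5e-4)
```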

Description of tests and protocols
Test protocols were developed with three main objectives: (1) to enable testing of solutions as they would be deployed in the field, in either pad-by-pad surveys or as continuous monitors; (2) to include grades of test complexity to allow evaluation of performers at a range of technology readiness levels, from early-stage technologies to well-developed solutions; and (3) to produce repeatable emission rates and emission locations in the series of tests, allowing testing to be completed at different times for each performer.

Solution deployment methods
Two test protocols (SM -Section 2) were developed to support two distinct modes of solution deployment: (1) survey solutions in which a handheld or otherwise mobile instrument is deployed within the fence line of a production well pad to assess emissions during a routine, scheduled visit. Survey methods typically move pad-by-pad through an operational region and assess a snapshot of emissions at the time of the visit; and (2) continuous monitoring solutions in which an instrument is deployed as a fixed installation to monitor emissions from a system for an extended period, typically months or years. Emission detection methods work at a variety of scales or resolutions. Leak detections from either survey or continuous monitoring methods may be localized to the facility (the facility does/does not have emissions), to a unit of major equipment (emissions detected somewhere on a unit of equipment) or to a single component. Emission detection and sizing can range from binary presence/absence of emissions -i.e. leak detection only -to estimates of the emission rate at varying levels of precision. Finally, solutions may be designed to monitor one or multiple nearby facilities using a single instrument.
While deployment modes varied, test protocols for both survey and continuous monitoring solutions followed a similar testing process: The METEC operator established an emissions configuration on an assigned pad, the performer completed measurements on the pad, and reported results to METEC. Measurement teams were allowed time to post-process data prior to submitting their results. Test conditions were not disclosed to the performer until they had reported results to METEC.

Test complexity
Individual tests were developed using three levels of complexity:

• A - A single emission source operated continuously at a steady flowrate for the duration of the test. This complexity level provides a basic test configuration in which to detect emissions, but is rare in field conditions.
• B - Multiple emission sources, each operated continuously at a steady flowrate for the duration of the test, with a maximum of one emission source on a single unit of major equipment (a wellhead, separator, or tank). This complexity level tests the ability of solutions to discriminate between multiple sources, while not introducing the complexity of varying flowrates or sources in close proximity to each other.
• C - Multiple emission sources operated intermittently or continuously at steady flowrates. More than one emission source may be located on the same unit of equipment. This complexity level is intended to approach realistic field conditions by introducing intermittent emission sources in addition to steady emission sources.

Test repeatability
To allow comparisons to be made between performers tested at different times, a set of predefined test conditions was developed. Each test was driven by computer software which opened and closed valves on the METEC facility according to a predefined schedule. This allowed emission sources to be generated at the same locations, using the same valve and orifice combinations, and on the same temporal schedule for each performer. A pressure setpoint was defined and manually set by the operator for each test to achieve repeatable emission rates through the orifice-based flow control system (SM - Section 3).
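A predefined release schedule of this kind can be sketched as follows. The data structure, emission point identifiers, and timings are hypothetical; this is not METEC's actual control software.

```python
from dataclasses import dataclass

@dataclass
class ScheduledRelease:
    emission_point: str   # e.g. "Pad4.Wellhead.ValvePacking" (hypothetical ID)
    start_s: float        # seconds after test start when the valve opens
    stop_s: float         # seconds after test start when the valve closes

def active_points(schedule, t_s):
    """Emission points whose valves should be open at time t_s."""
    return {r.emission_point for r in schedule if r.start_s <= t_s < r.stop_s}

# A steady source plus an intermittent source cycling on a fixed schedule,
# so every performer sees the same locations and timing.
schedule = [
    ScheduledRelease("Pad4.Wellhead.ValvePacking", 0, 1200),
    ScheduledRelease("Pad4.Tank.ThiefHatch", 0, 300),
    ScheduledRelease("Pad4.Tank.ThiefHatch", 600, 900),
]
```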

Interpretation of emission detections
For each test, performers were required to report the leak location using GPS coordinates or a 0.25 m grid system. If the solution indicated the capability to quantify emissions, performers also reported an emission rate. Reported locations were translated into detection categories (SM - Section 4) using the reported pad, equipment type and equipment ID, relative to the actual location of the emission source, resulting in four categories of detection:

• Equipment detect - An emission source was reported on the same unit of equipment as an actual emission source.
• Group detect - An emission source was reported on the same group of equipment where an emission source was present; however, the reported location was not on the same unit of equipment as the actual source.
• False negative - No emission sources were reported on the group of equipment where an emission source was present during the test.
• False positive - An emission source was reported on a group of equipment where no emission sources were present during the test.
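The four categories can be sketched as a small classification helper. The (group, equipment) tuples used here are a simplification of the reported pad/equipment-type/equipment-ID hierarchy, for illustration only.

```python
def classify_report(reported, actuals):
    """Translate one reported location into a detection category.

    `reported` and each element of `actuals` are (group_id, equipment_id)
    tuples; the identifier scheme is hypothetical, not METEC's actual one.
    """
    if any(reported == a for a in actuals):
        return "equipment detect"
    if any(reported[0] == group for group, _ in actuals):
        return "group detect"
    return "false positive"

def false_negatives(reports, actuals):
    """Actual sources whose equipment group received no report at all."""
    reported_groups = {group for group, _ in reports}
    return [a for a in actuals if a[0] not in reported_groups]
```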

Limitations of testing
In general, continuous monitoring solutions were at a lower technology readiness level than survey solutions, making testing more difficult to design without biasing results. Three aspects of the test protocol proved particularly challenging for continuous monitoring solutions:

1) Some continuous monitoring solutions have focused development primarily on detecting large emitters, due in part to the performance of the low-cost sensors (<$100 per unit) used in most solutions. The emission rates used here were taken from field measurements; the range of selected rates covered over 90% of 299 source measurements at production facilities by Bell et al. (2017). Therefore, the rates used here may fall below the lower detection limit of some continuous monitoring solutions, indicating that these solutions would not find most typical leaks seen in field studies of production emissions. However, an 'always-on' continuous monitor that detects large emission sources quickly could enable significant emissions reduction by identifying failures quickly - an operational modality that was not tested in this study.

2) Since, by definition, the positioning of the sensor(s) used in continuous monitoring solutions is fixed, most of these solutions must acquire data across a range of wind directions to detect emissions. The time allotted for each test in the protocol may have been shorter than this data acquisition time in some cases, providing insufficient data for detection and localization algorithms to converge to a more accurate result. That said, performers were given the option to test for longer periods prior to the controlled trials and/or to request more operational time; none did. While this limitation can be overcome in future tests by simply allotting more time per test, extended test durations increase overall testing costs and time. Additionally, the emission profile of oil and gas facilities is known to include many temporally variable sources, and increasing test times may make the testing less realistic. Future testing protocols could allow for two operational modes - a long-duration mode to support algorithm testing and development, and a realistic operation mode, where emission patterns and variability match those seen in field studies.

3) Some continuous monitoring solutions are designed to provide detections localized only to the equipment group or, in some solutions, to the well pad. Therefore, it is important to consider both equipment- and group-detections as positive results. In practical deployments, either type of detection would likely be followed by a handheld survey to identify the equipment and component where the emissions were released.

Performers
Performers who completed testing are listed in Table 1. Seven solutions were tested using the survey protocol and five solutions were tested using the continuous monitoring protocol. Performers completing testing under the survey protocol were classified as (1) handheld - where a handheld instrument is used by the performer, and (2) mobile - where an instrument is mounted on a drone, vehicle or other means of mobility. Note that both handheld and mobile systems were deployed on-pad in this study, and mobile methods here are distinctly different from "screening" methods as discussed by Fox et al. (2019a). Two detection-only solutions reported data in a different format than requested, which prevented their inclusion in the analysis below.

Data aggregation and blinding
Due to confidentiality agreements in place at the time of testing, we do not analyze the performance of individual solutions in this paper. Instead we present results aggregated by test protocol and by solution deployment mode to blind the dataset. Therefore, it is important to note that the aggregated results do not illustrate the performance of any individual solution. Additionally, the analysis presented was performed solely by the authors and does not represent the opinions of the performers.

Protocol comparison to the Mobile Monitoring Challenge
Testing in this program had three primary differences from that performed in the Mobile Monitoring Challenge (MMC) (Ravikumar et al., 2019):

1) Experiments designed to assess detection in the MMC included only a single emission source, resulting in four possible outcomes for each experiment (true positive, false positive, true negative, and false negative). When multiple emission sources were present as part of the MMC quantification testing, different conventions were applied to interpret detections. For example, a scenario where three leaks were present and a single emission rate was reported was interpreted as three true positive detections. In contrast, in this study each reported emission was matched to a single controlled release. The equipment detect and group detect in our analysis are similar to the Level-1 and Level-2 true positive interpretations used by the MMC; however, a Level-3 true positive detection would be identified as a false positive in our analysis.

2) Testing under the MMC included only cases where emission sources operated at a steady flowrate, and did not analyze experiments where a single release was present separately from those where multiple releases were present. Experiments in this paper identify the test complexity as an independent parameter to investigate the impact of single, multiple, and intermittent emission sources on solution performance as measured by controlled release experiments.

3) This paper includes testing of mobile solutions, similar to those tested in the MMC, but also includes handheld solutions, as well as continuous monitoring solutions. We test handheld and mobile leak detection solutions under the same protocol since these methods would be applied in a similar manner, where a team moves through a region from facility to facility performing emission surveys and recording detected leaks. We test continuous monitoring solutions under a separate protocol since these systems would not be deployed in a similar way. Both protocols test solutions under a range of emission rates and test complexity, and the same rules are applied to interpret emission detection reports.

Test results and discussion
A total of 192 tests were performed by the 10 solutions analyzed. An empirical cumulative distribution function (CDF) of emission rates is shown in Figure 1 for survey experiments (handheld and mobile solutions) and continuous monitoring experiments. We present emission rates in standard cubic feet per hour (scfh) of methane using the Compressed Gas Association standard conditions of 70°F and 14.7 psia because the U.S. gas industry typically measures flow rates in standard cubic feet. Under this standard, 1 scfh of methane is equal to approximately 18.8 g·h⁻¹. Tests were designed to target the ARPA-E MONITOR flowrate metric of 6 scfh; however, the mean emission rate of emission points in the survey protocol (7.5 scfh, σ = 5.6 scfh) was slightly higher than the mean emission rate in the continuous monitoring protocol (5.2 scfh, σ = 4.0 scfh). A CDF of 299 direct measurements of emission sources at production sites in the Fayetteville shale, AR, is shown for comparison (mean emission rate = 11.1 scfh), with 94% of measurements found within the range of emission rates included in this series of tests (Bell et al., 2017). Although the mean of direct measurements was higher than the mean of the emission rates in this study, a larger fraction of sources was measured below 6 scfh by Bell et al. than included in the survey and continuous monitoring protocols.
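The stated conversion can be checked with the ideal gas law at the CGA standard conditions. Pure methane is assumed here, although the tests used a CNG blend.

```python
# Ideal-gas check of the conversion 1 scfh CH4 ~= 18.8 g/h at Compressed Gas
# Association standard conditions (70 deg F, 14.7 psia).
R = 8.314            # universal gas constant, J/(mol K)
P = 14.7 * 6894.76   # 14.7 psia in Pa
T = (70 - 32) * 5 / 9 + 273.15   # 70 deg F in K
V = 0.0283168        # one cubic foot in m^3
MW_CH4 = 16.04       # molar mass of methane, g/mol

moles_per_scf = P * V / (R * T)
grams_per_scf = moles_per_scf * MW_CH4   # mass of methane in one scf
```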

Detection by emission rate
When results are aggregated by deployment mode, we observe increasing detection rates with increasing emission rates (Figure 2, left). The fraction of emission sources detected increases approximately linearly across the range of emission rates tested. Handheld solutions, which include a human operator moving through the pad and confirming detected emissions as part of the detection process, exhibited the highest detection rates among the deployment modes tested, and are the only category which achieved 100% detection of emission sources greater than 10 scfh (22 detections). Note that the survey protocol evaluates the performance of the solution plus the human operator, and is intended to evaluate the solution performance as deployed in the field. The performance of the sensor plus operator is notably different from the performance of the sensor alone in a controlled laboratory environment.
Mobile solutions exhibited overall detection rates (85%) comparable to handheld detectors (90%); however, they did not achieve 100% detection of sources greater than 10 scfh as handheld detectors did. Mobile solutions also include a human operator, who typically remains at the pad edge and may complete additional data collection using the solution if a leak is suspected; however, the operator typically does not confirm detections directly.
Considering detections in Figure 2 which identified emissions from the correct equipment group, continuous monitoring solutions also exhibit increasing detection rates with increasing emission rates. However, the detection rates observed are lower than those of survey solutions, particularly at emission rates below 6 scfh. The drop in detection rate at 8-10 scfh can be partially attributed to a low overall count in that bin (11 emission sources) and zero experiments in test complexity A which included emission sources between 8-10 scfh. We acknowledge the limitations of testing discussed earlier for continuous monitors, particularly the detection limits and the time allotted per experiment.
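Aggregating detections by emission-rate bin, as in Figure 2 (left), amounts to computing a detected fraction per bin. The outcomes below are synthetic, for illustration only; they are not study data.

```python
from collections import defaultdict

def detection_fraction_by_bin(results, bin_width_scfh=2.0):
    """Fraction of controlled-release sources detected per emission-rate bin.

    `results` is a list of (rate_scfh, detected) pairs; the example data
    below are made up for illustration.
    """
    counts = defaultdict(lambda: [0, 0])   # bin index -> [detected, total]
    for rate, detected in results:
        b = int(rate // bin_width_scfh)
        counts[b][0] += int(detected)
        counts[b][1] += 1
    return {b: hit / n for b, (hit, n) in sorted(counts.items())}

results = [(1.0, False), (1.5, True), (5.0, True), (5.5, True), (11.0, True)]
fractions = detection_fraction_by_bin(results)
```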

Detection by test complexity
As test complexity was increased, detection rate generally decreased (Figure 2, right). The detection rate, including same-group detections, decreased from 94% when only a single steady emission source was present (Test Complexity A) to 79% when multiple steady emission sources were present (Test Complexity B). This suggests that solutions found it more difficult to detect and isolate multiple sources from one another than to simply detect the presence of an emission. This result has important implications with respect to field deployment since oil and gas facilities often include process emissions from pneumatic controllers, pressure relief valves, and compressor exhaust gases. This is particularly true for continuous monitoring solutions, which will need to separate fugitive emission sources from a temporally variable background caused by local process emissions to provide actionable data in the form of emission rate estimates, location estimates, and/or alarms to an operator.
When intermittent emission sources were introduced (Test Complexity C), the overall detection rate further decreased to 63%; however, intermittent emission sources had an uneven effect on different deployment modes. Detection rates of handheld solutions increased from complexity B to complexity C. This may be due to the human operator hearing the actuation of solenoid valves during the test and deducing that an emission source had started or stopped. Note that this is analogous to a pneumatic controller actuation on a real facility, which also produces an audible signal. In contrast, detection rates decreased when intermittent sources were included for mobile and continuous monitoring solutions, where the operator is more removed from the equipment during the test and would not be able to deduce the start or stop of an emission source from an audible actuation. It is important to note that only 35% of emission sources in complexity C were intermittent, and therefore one would expect only a small change relative to test complexity B.
The testing performed here - including the 'C' complexity tests - represented a less complex emission environment than is typical of field conditions at most well pads. Key differences include: (a) the timing of intermittent vents - intermittent emissions utilized here were more frequent (5 minute cycle time) and more regular (±1 second) than is typical of intermittent vents on field equipment; (b) no variable equipment failures - intermittent emissions that persist for minutes or hours before stopping or changing rate; and (c) variable leak rates - in field locations, leak rates from some sources, like tank vents, may vary with the cycling of other equipment on the well pad. Given the decrease in detections with increased complexity seen in this study, it is reasonable to expect that testing with full facility complexity would likely reduce the probability of detection further.

False positives
False positive detections were reported at all test complexity levels and by all deployment modes (Figure 3). Mobile solutions had the lowest false positive rate as a fraction of the total number of reported emission sources, including zero false positives in test complexities B (67 reported emissions) and C (28 reported emissions). Handheld solutions had low false positive rates (<5% of reported sources) in test complexities A and B; however, the false positive rate increased to 25% (5 false positives in 20 reported emission sources) in test complexity C. Continuous monitoring solutions had the highest false positive rate, with 35% of reported sources identified as false positives.
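The false-positive share translates directly into the chance that a repair crew dispatched to a reported location finds a real source, i.e. the precision of reported detections. The counts below are illustrative, not study data.

```python
def locate_success_rate(n_reported, n_false_positive):
    """Probability that a reported source corresponds to an actual emission
    (precision): true reports divided by all reports."""
    return (n_reported - n_false_positive) / n_reported

# If 35% of reported sources are false positives, a crew vectored to a
# reported location finds a real source 65% of the time.
rate = locate_success_rate(100, 35)
```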
This implies that if an LDAR program were to use a continuous monitoring solution to vector a repair team, then under this series of tests the repair team would locate an actual emission source only 65% of the time. Since test times were limited in this study, longer test times and improved analytics, using more data collected over longer periods and more variable wind conditions, may improve the performance of the continuous monitoring solutions. Also, the strength of continuous monitoring solutions may be in finding large emitters quickly, and large emitters were not tested in this study.

[Figure 2. Detection rate of all solutions increases with increasing leak rate and decreases with increasing test complexity; handheld solutions exhibit the highest detection rates, mobile solutions slightly lower, and continuous monitoring solutions much lower at similar emission rates, improving when group detections are considered. DOI: https://doi.org/10.1525/elementa.426.f2]

Localization
Distances between the actual emission location and the emission location reported by performers were calculated using GPS coordinates for equipment detects and group detects. Handheld solutions reported 50% of emission sources to within 1 m of the actual emission location, compared with 82% for mobile solutions (Figure 4). Some portion of this difference may be due to reporting. Manual solutions reported coordinates by reading GIS map layers (kml files provided by METEC) for horizontal coordinates and measuring distance from the ground for the vertical coordinate (SM - Section 2). This type of localization was unfamiliar to the teams and could have introduced some error. However, it is also indicative of field performance, as future automated systems will likely require this type of reporting. In contrast, mobile solutions often included algorithms for pinpointing the emission source relative to the sensor's position using an onboard GPS sensor. For these solutions, the positioning is intrinsic to the solution and represents the performance of the full method as implemented. Location accuracy of continuous monitoring solutions was much lower than that of mobile and handheld solutions under this series of tests, with 18% of detected emission sources reported within 1 m of the actual source and 51% reported greater than 4 m from the actual emission source. Note that the continuous monitoring protocol included tests only on the larger (45 m × 60 m) Pad 4. This reflects the detection results (Figure 2), where continuous monitoring solutions often reported the emission location in the correct equipment group but not on the correct unit of equipment. This result is also consistent with the use case, where upon detection continuous monitoring solutions provide an alarm to the operator, who will then deploy a team with a handheld or mobile system to pinpoint and repair the emission source.
These results identify a key learning for integrating NGEM solutions into operator workflows: while current manually operated leak detection methods track detections by a component or location description (e.g. "the dump valve actuator on the east separator"), most NGEM solutions will likely report using coordinates relative to a reference location at the facility, and these coordinates must be translated into work order instructions for subsequent teams. Further, reporting in this manner is often less subject to operator error and more amenable to long-term tracking and analysis. However, since most facilities currently have no such absolute coordinate system, additional work will be required to establish these reference points and to incorporate detection results into operational workflows.
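Localization error from a pair of GPS coordinates, as in Figure 4, can be computed with a standard great-circle (haversine) distance. This is a generic formula adequate at well-pad scales, not necessarily the exact calculation used in the study.

```python
import math

def localization_error_m(lat1, lon1, lat2, lon2):
    """Haversine distance in metres between a reported and an actual
    emission location given as (latitude, longitude) in decimal degrees."""
    r_earth = 6371000.0  # mean Earth radius, m
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r_earth * math.asin(math.sqrt(a))
```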

Quantification
Mass flow rate estimates were reported for 143 detected emission sources by seven of the solutions under test (SM - Section 5). No handheld solutions reported quantification estimates of emission mass flow rates. Quantification error was calculated for each detection as the reported mass flowrate minus the metered mass flowrate (Error = ṁ_reported - ṁ_metered). The distribution of quantification error from mobile methods was nearly centered around zero (Figure 5, left), with 43% of estimates lower (negative error) and 57% of estimates higher (positive error) than metered emission rates. Mean error from mobile solutions was 1.3 scfh (17% of the mean emission rate in the survey experiments). Estimates from continuous monitoring solutions were generally biased high (Figure 5, right), with 25% of estimates lower (negative error) and 75% of estimates higher (positive error) than metered emission rates. Mean error from continuous monitoring solutions was 8.8 scfh (167% of the mean emission rate in the continuous monitoring experiments). Quantification errors of individual performers show solutions with better performance (narrower error bounds illustrated by a steeper CDF, and higher accuracy illustrated by a central value or mean error closer to zero), as well as solutions with worse performance (less accuracy illustrated by an error CDF shifted left or right) than the average results across all solutions.
These results indicate that quantification estimates from NGEM methods are likely to produce an estimate of total emissions that has a high uncertainty, and for the solutions tested here, a positive bias of 17-170%. This suggests more extensive testing focused on the accuracy and uncertainty of quantification methods is needed if these solutions are to inform operators or regulators of overall emissions from oil and gas operations.
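The per-detection error metric and the under/over-estimate fractions above can be sketched as follows. The reported and metered rates below are made-up examples, not study data.

```python
def error_summary(reported_scfh, metered_scfh):
    """Per-detection quantification error (reported - metered), the fraction
    of underestimates, and the mean error."""
    errors = [r - m for r, m in zip(reported_scfh, metered_scfh)]
    n = len(errors)
    frac_low = sum(e < 0 for e in errors) / n   # share of negative errors
    mean_err = sum(errors) / n                  # positive mean => high bias
    return errors, frac_low, mean_err

reported = [4.0, 9.0, 3.0, 12.0]   # illustrative performer estimates, scfh
metered  = [5.0, 6.0, 4.0, 6.0]    # illustrative metered release rates, scfh
errors, frac_low, mean_err = error_summary(reported, metered)
```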

Detection probability curves
Recent work to establish a "pathway to equivalency" has identified a modeling approach to demonstrate equivalent emission reduction potential of programs which use alternative methods for leak detection (Fox et al., 2019b). The modeling approach will rely on a method-specific detection probability curve where the probability of detecting a given leak is a function of the source characteristics (emission rates, gas composition, component type, etc.) and/or environmental conditions (temperature, wind speed, wind direction, etc.). The R2 tests are insufficient to evaluate this probability curve since they do not include significant variability in emission rates or environmental conditions, nor enough tests to develop statistically significant results to characterize these metrics for each solution tested. This discussion highlights several key points. First, this study, and all other controlled tests of NGEM solutions known to the authors, have performed testing in a narrow band of environmental conditions, insufficient to derive robust detection probability curves. Since it is well understood that the detection rate of most NGEM solutions will depend upon environmental conditions, larger batteries of tests, over longer periods including more variable weather conditions, would be required to develop robust curves. Second, characterizing performance will likely require the construction of a probability-of-detection surface, with additional variables for environmental conditions (i.e. weather, terrain, vegetation, etc.). Key variables will likely differ between solutions, reflecting the technological constraints of each. This suggests new protocols are needed to evaluate methods across a wide range of meteorological conditions and emission scenarios in a cost-effective manner.
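As a sketch of the simplest single-variable case, a detection probability curve can be fit as a logistic function of emission rate. The outcomes below are synthetic; a real probability-of-detection surface would add wind speed and other environmental covariates, and would require far more tests per solution.

```python
import math

def fit_logistic_pod(rates, detected, lr=0.05, iters=20000):
    """Fit P(detect) = 1 / (1 + exp(-(a + b*rate))) by gradient ascent on
    the Bernoulli log-likelihood. A minimal sketch, not the equivalency
    framework's actual fitting procedure."""
    a, b = 0.0, 0.0
    n = len(rates)
    for _ in range(iters):
        ga = gb = 0.0
        for x, y in zip(rates, detected):
            p = 1 / (1 + math.exp(-(a + b * x)))
            ga += (y - p)        # gradient w.r.t. intercept
            gb += (y - p) * x    # gradient w.r.t. slope
        a += lr * ga / n
        b += lr * gb / n
    return a, b

# Synthetic controlled-release outcomes: detection becomes likely above ~5 scfh.
rates    = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
detected = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1]
a, b = fit_logistic_pod(rates, detected)
```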

Field deployment
During this testing, solutions were tasked with detecting emissions, and were not tasked with discriminating between leaks (unplanned emissions) and vents (planned emissions). Leaks are generally equipment failures which release gas. Venting refers to emissions from equipment in the normal operation of that equipment. Emissions from gas-pneumatic controllers, pneumatic pumps, tank flash, and similar emissions are typically classified as venting.
In field deployments at most onshore natural gas facilities, it will be necessary to distinguish between leaked and vented emissions to avoid unnecessary follow-up actions. Future protocols need to consider, if not implement, such testing.

Data Accessibility Statement
Data used in this analysis are available in the supplemental material.