Oil and gas extraction infrastructure in the Permian Basin

MethaneMapper is poised to solve the problem of underreported methane emissions

A central difficulty in controlling greenhouse gas emissions to slow down climate change is finding them in the first place.

Such is the case with methane, a colorless, odorless gas that is the second most abundant greenhouse gas in the atmosphere today, after carbon dioxide. Although it has a shorter atmospheric lifetime than carbon dioxide, according to the U.S. Environmental Protection Agency it is more than 25 times as potent as CO2 at trapping heat over a century, and over a 20-year period it is estimated to trap 80 times more heat than CO2.

 For that reason, curbing methane has become a priority, said UC Santa Barbara researcher Satish Kumar, a doctoral student in the Vision Research Lab of computer scientist B.S. Manjunath.

“Recently, at the 2022 International Climate Summit, methane was actually the highlight because everybody is struggling with it,” he said.

Even with reporting requirements in the U.S., methane’s invisibility means its emissions are likely underreported. In some cases the discrepancies are vast, as with the Permian Basin, an 86,000-square-mile oil and natural gas extraction field in Texas and New Mexico that hosts tens of thousands of wells. Independent methane monitoring of the area has revealed that the site emits eight to 10 times more methane than reported by the field’s operators.

 In the wake of the COP27 meetings, the U.S. government is now seeking ways to tighten controls over these types of “super emitting” leaks, especially as oil and gas production is expected to increase in the country in the near future. To do so, however, there must be a way of gathering reliable fugitive emissions data in order to assess the oil and gas operators’ performance and levy appropriate penalties as needed.

Enter MethaneMapper, an artificial intelligence-powered hyperspectral imaging tool that Kumar and colleagues have developed to detect methane emissions in real time and trace them to their sources. The tool works by processing hyperspectral data gathered during overhead, airborne scans of the target area.

“We have 432 channels,” Kumar said. Using survey images from NASA’s Jet Propulsion Laboratory, the researchers capture data at wavelengths from 400 nanometers up to 2,500 nanometers, a range that encompasses the spectral signatures of hydrocarbons, including methane. Each of the 432 channels corresponds to a narrow range of wavelengths called a “spectral band,” so every pixel in an image contains a full spectrum. From there, machine learning sifts through the enormous volume of data to distinguish methane from the other hydrocarbons captured in the imaging process. The method reveals not just the magnitude of a plume, but also its source.
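To make the idea concrete, here is a minimal sketch, not the MethaneMapper code itself, of how a hyperspectral “cube” is organized and queried. It uses a synthetic NumPy array; the 2,200–2,400 nm window (around methane’s strong absorption feature near 2,300 nm) and the simple band-difference score are illustrative assumptions, whereas the real system applies a learned model.

```python
import numpy as np

# A hyperspectral cube holds one spectrum per pixel across 432 channels
# spanning 400-2500 nm, as described in the article.
N_CHANNELS = 432
wavelengths = np.linspace(400, 2500, N_CHANNELS)  # nm, one per spectral band

# Synthetic cube for illustration: height x width x channels
rng = np.random.default_rng(0)
cube = rng.random((64, 64, N_CHANNELS))

# Methane absorbs strongly near 2300 nm; select the bands in that window.
band_mask = (wavelengths >= 2200) & (wavelengths <= 2400)

# A crude per-pixel score: absorption depth in the methane window relative
# to a nearby reference window. (An assumption for illustration only;
# MethaneMapper uses a trained machine learning model instead.)
ref_mask = (wavelengths >= 2000) & (wavelengths <= 2150)
score = cube[..., ref_mask].mean(axis=-1) - cube[..., band_mask].mean(axis=-1)

print(f"{band_mask.sum()} of {N_CHANNELS} bands fall in the 2200-2400 nm window")
print("score map shape:", score.shape)  # one score per pixel
```

The point of the sketch is the data layout: every pixel carries hundreds of values, one per band, which is why differentiating methane from other hydrocarbons is a natural machine learning problem.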

 Hyperspectral imaging for methane detection is a hot field, with companies jumping into the fray with equipment and detection systems. What makes MethaneMapper stand out is the diversity and depth of data collected from various types of terrain that allows the machine learning model to pick out the presence of methane against a backdrop of different topographies, foliage and other backgrounds.

“A very common problem with the remote sensing community is that whatever is designed for one place won’t work outside that place,” Kumar explained. Thus, a remote sensing program will often learn what methane looks like against a certain landscape — say, the dry desert of the American Southwest — but pit it against the rocky shale of Colorado or the flat expanses of the Midwest, and the system might not be as successful.

 “We curated our own data sets, which cover approximately 4,000 emissions sites,” Kumar said. “We have the dry states of California, Texas and Arizona. But we have the dense vegetation of the state of Virginia too. So it’s pretty diverse.” According to him, MethaneMapper’s performance accuracy currently stands at 91%.


Satish Kumar's presentation on MethaneMapper

The current version of MethaneMapper relies on airplanes for the scanning component of the system. But the researchers have set their sights on a satellite-enabled program, which could scan wider swaths of terrain repeatedly, without the greenhouse gases that airplanes emit. The major tradeoff between planes and satellites is resolution, Kumar said.

“You can detect emissions as small as 50 kg per hour from an airplane,” he said. With a satellite, the threshold rises to about 1,000 kg, or one metric ton, per hour. But for the purpose of monitoring emissions from oil and gas operations, which tend to emit in the thousands of kilograms per hour, it’s a small price to pay for the ability to scan larger parts of the Earth, including places that might not be on the radar, so to speak.

“The most recent case, I think seven or eight months ago, were emissions from an oil rig off the coast somewhere toward Mexico,” Kumar said, “which was emitting methane at a rate of 7,610 kilograms per hour for six months. And nobody knew about it.”

Satellite detection could not only track methane emissions on a global scale, but also direct subsequent airplane-based scans for higher-resolution investigations.

Ultimately, Kumar and colleagues want to bring the power of AI and hyperspectral methane imaging to the mainstream, making it available to a wide variety of users even without expertise in machine learning.

“What we want to provide is an interface through a web platform such as BisQue, where anyone can click and upload their data and it can generate an analysis,” he said. “I want to provide a simple and effective interface that anyone can use.”

The MethaneMapper project is funded by National Science Foundation award SI2-SSI #1664172. The project is part of the Center for Multimodal Big Data Science and Healthcare initiative at UC Santa Barbara, led by Prof. B.S. Manjunath. Additionally, MethaneMapper will be featured as a Highlight Paper at the 2023 Computer Vision and Pattern Recognition (CVPR) Conference — the premier event in the computer vision field — to be held June 18–22 in Vancouver, British Columbia.

Media Contact

Sonia Fernandez

Senior Science Writer

(805) 893-4765

sonia.fernandez@ucsb.edu
