Key takeaways from ICML 2019

Tackling climate change challenges with AI through collaboration by Andrew Ng

Continuing our article series on the 2019 International Conference on Machine Learning, held in June, I’ll dive into my thoughts on the presentation titled Tackling climate change challenges with AI through collaboration by Andrew Ng. I highlight Andrew Ng’s talk not just because he is one of the most well-known names in machine learning, but because his talk was one of the best of the week and its topic overlaps with my previous professional career as an Atmospheric Scientist.

Rather than focusing on a single application of machine learning, Ng gave short summaries of his research projects that address different aspects of climate change. I will go over three of them below.

Andrew Ng presenting Tackling climate change challenges with AI through collaboration.

Project #1 – Predictions of methane gas emissions from wetlands

The first project he discussed aims to predict methane gas emissions from wetlands. Methane is one of the strongest greenhouse gases in our atmosphere: more potent than carbon dioxide, though less abundant. Methane has many sources, both natural and anthropogenic, and understanding them is key to better understanding its role in Earth’s energy budget and global temperature.

Wetlands are one of the major natural sources of methane to the atmosphere, but monitoring this flux is difficult. Currently, there are only a handful of monitoring stations (I think they said ~150) around the globe. These stations also track local meteorological conditions (e.g., temperature, wind, humidity). Ng and his colleagues trained a machine learning model (a random forest) on this limited data at 30-minute resolution to predict methane fluxes. With local meteorological conditions as predictors, the model can then make predictions over wetland areas that have no methane monitoring.
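To make the setup concrete, here is a minimal sketch of how such a model might be trained. The feature names, file names, and train/test split are my own illustrative assumptions; the talk did not specify these details.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Hypothetical 30-minute tower records: meteorological drivers + measured flux.
df = pd.read_csv("wetland_tower_30min.csv")  # assumed file name
features = ["air_temp", "soil_temp", "wind_speed", "humidity", "pressure"]
X, y = df[features], df["methane_flux"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestRegressor(n_estimators=500, random_state=42)
model.fit(X_train, y_train)
print("R^2 on held-out data:", r2_score(y_test, model.predict(X_test)))

# Once trained, the same meteorological variables (available from weather
# models or nearby stations) can drive flux predictions at unmonitored wetlands.
unmonitored = pd.read_csv("unmonitored_wetland_met.csv")  # assumed file name
predicted_flux = model.predict(unmonitored[features])
```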

Project #2 – Renewable energy and identifying the location of wind turbines

The second project looked at renewable energy, specifically identifying the locations of wind turbines. The team took a dataset of known wind turbine locations and trained a neural network on visible satellite images labeled with the correct turbine locations. The model was trained on 100k images (with 50k positives from a USGS dataset).
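As a rough sketch, this can be framed as binary patch classification (turbine / no turbine). The architecture and image size below are my guesses, not details from the talk:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Small CNN that scores a satellite image patch for turbine presence.
model = models.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(128, 128, 3)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(1, activation="sigmoid"),  # P(patch contains a turbine)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# train_images: (100000, 128, 128, 3) satellite patches; train_labels: 0/1,
# with the 50k positives drawn from the USGS turbine dataset.
# model.fit(train_images, train_labels, epochs=10, validation_split=0.1)
```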

Wind turbine detection was then run on 1.8M images. They used the model to uncover recently constructed wind turbines and to detect when turbines no longer exist; they even found locations where the USGS dataset identified a turbine but there wasn’t one. This methodology could be useful when analyzing images from planet.com. For example, the same type of method could be used to detect features that pose harm to the environment (e.g., natural gas mines or oil rigs).
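Continuing the sketch above, the large-scale inference pass amounts to scoring every tile and flagging disagreements with the known-turbine database. The function names and threshold here are illustrative, not from the talk:

```python
import numpy as np

def scan_tiles(model, tiles, known_turbine_mask, threshold=0.5):
    """tiles: (N, 128, 128, 3) image patches; known_turbine_mask: (N,) bool.

    Returns indices where the model and the database disagree.
    """
    probs = model.predict(tiles, batch_size=256).ravel()
    detected = probs >= threshold
    new_turbines = np.where(detected & ~known_turbine_mask)[0]      # built recently?
    missing_turbines = np.where(~detected & known_turbine_mask)[0]  # removed, or database error?
    return new_turbines, missing_turbines
```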

Project #3 – Improving climate model parameterization

Ng also discussed his work to improve climate models by obtaining better parameterizations for certain model components. “Model parameterization” is used in global climate models to estimate processes that can’t be resolved at the typically coarse spatial resolution of state-of-the-art climate models (e.g., a model can’t resolve the physics that go on within clouds, so it needs a simplified representation of the system). Ng didn’t go into too much detail here, but he mentioned how he helped improve the parameterization of moist convection within atmospheric models (i.e., cloud microphysics). This component has long been difficult to model, and it is a very important process to predict accurately within climate models: it controls precipitation and cloud cover in the low latitudes, both of which have large impacts on Earth’s energy budget (i.e., global temperatures).
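One common way machine learning enters this picture (and an illustration of the general idea, not necessarily Ng’s exact method) is to train a small network to emulate the subgrid tendencies that a high-resolution simulation would produce, given the coarse model state in a grid column. All names and shapes below are my own assumptions about how such an emulator might be set up:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

N_LEVELS = 30  # vertical levels in the hypothetical host climate model

# Inputs: temperature and humidity profiles for one column; outputs: the
# heating and moistening tendencies the convection parameterization must supply.
emulator = models.Sequential([
    layers.Input(shape=(2 * N_LEVELS,)),
    layers.Dense(256, activation="relu"),
    layers.Dense(256, activation="relu"),
    layers.Dense(2 * N_LEVELS),  # dT/dt and dq/dt at each level
])
emulator.compile(optimizer="adam", loss="mse")

# Training pairs would come from a trusted high-resolution (cloud-resolving)
# simulation: coarse-grained columns as inputs, resolved tendencies as targets.
# emulator.fit(coarse_columns, resolved_tendencies, epochs=20)
```

The appeal of this design is speed: once trained, the emulator can stand in for a computation the coarse model could never afford to run directly.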