NASA Funds New Information Systems Technologies to Advance Earth Science
Elizabeth Goldbaum, October 2019, elizabeth.f.goldbaum@nasa.gov

A new collection of ESTO-funded projects plans to revolutionize the way we observe and understand Earth.

The Advanced Information Systems Technology Program, in support of the NASA Earth Science Division, selected 22 project proposals to fund and manage over a two-year timeframe.

The projects fall under one of two thrusts: the New Observing Strategy or the Analytic Center Framework. The former develops new ways to design and carry out Earth observing missions, while the latter seeks new ways to process and understand large, often messy, Earth science data. Both thrusts aim to reduce the overall risk and cost of achieving NASA’s goals to effectively monitor and understand our planet.

New Observing Strategy

Every Earth observing mission begins with a mission design. The New Observing Strategy helps develop and evolve new ways of designing an overall Earth observation system to incorporate technological advances, like smaller satellites and smarter sensors, and information gathered from space-, air- and ground-based sources. Whether the aim is to monitor fire outbreaks or reveal snowfall patterns, these new projects, led by researchers and their teams from academia, government and private industry, are here to help.

Janice Coen and Ethan Gutman, both scientists at the National Center for Atmospheric Research, will be developing systems to better understand fire and snow, respectively. Fire seasons are longer and more intense now than they’ve been in the recent past, according to the U.S. Forest Service. To create a more complete picture of wildland fires, Coen will be developing a system that can simulate how fires move and change over time. Gutman, meanwhile, aims to help NASA plan and operate a more cost-effective snow mission able to obtain higher-resolution snow data than what is currently available. Snow acts as a reservoir of freshwater that sustains billions of people – having a more detailed view of snowpack helps decision makers identify and react to changes on a local scale.

Bart Forman, a professor and researcher at the University of Maryland, is also interested in better understanding snow and water cycling. His project will create a mission planning tool that can pinpoint an ideal combination of satellites to better understand terrestrial snow, as well as soil moisture and vegetation.

In contrast to crackling fires and bright snow, wind is visible only through the movement it creates. Winds control how the ocean and atmosphere interact where they meet, which influences weather. To help us better understand wind, James Carr, the president and CEO of Carr Astronautics Corporation, is designing a new tool, called StereoBit, to help small satellites send their data back to Earth more quickly and efficiently. Once StereoBit is operational, it could be used in many other science areas, too.

Three scientists in California are exploring new ways to design more effective Earth observing missions to drive scientific exploration more broadly: Derek Posselt, a scientist with NASA’s Jet Propulsion Laboratory, Mahta Moghaddam, a professor and researcher at the University of Southern California, and Sreeja Nag, a scientist with NASA Ames Research Center. Posselt’s goal is to create an easy-to-use, scalable toolkit that helps mission designers consider all possible instrument configurations, including CubeSats and other small satellites. Moghaddam’s project, called SPCTOR, will develop a concept for coordinating the operations of ground-based and unmanned aerial vehicle-based sensors to deliver ground truth information to a diverse set of end users. Nag is building a tool called D-SHIELD, which includes software that will help future constellations of satellites strategize how to obtain the most valuable scientific measurements. For instance, D-SHIELD could anticipate a short-term event, like a storm, and direct a sensor to take a one-time measurement, or anticipate a long-term event, like a drought, and schedule repeated measurements.
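To make that scheduling idea concrete, here is a minimal sketch of the kind of decision rule the article describes: short-lived events trigger a single targeted measurement, while long-lived events trigger repeated measurements over their expected duration. The `Event` structure, the `plan_observations` helper and the three-day threshold are hypothetical illustrations, not D-SHIELD’s actual design.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """A forecast phenomenon the scheduler should respond to (hypothetical)."""
    name: str
    duration_days: float  # expected duration of the phenomenon

def plan_observations(event: Event, revisit_days: float = 1.0) -> list[float]:
    """Return observation times (days from now) for an event.

    Short-lived events (e.g., a storm) get one targeted measurement;
    long-lived events (e.g., a drought) get repeated measurements over
    their expected duration. This mirrors the scheduling idea described
    above, not D-SHIELD's actual algorithms.
    """
    if event.duration_days <= 3:  # hypothetical cutoff for "short-term"
        return [0.0]              # single, one-time measurement
    times, t = [], 0.0
    while t <= event.duration_days:
        times.append(t)           # repeated measurements at the revisit cadence
        t += revisit_days
    return times

print(plan_observations(Event("storm", duration_days=1)))     # [0.0]
print(plan_observations(Event("drought", duration_days=30)))  # 31 daily observation times
```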

Analytic Center Framework

Once an Earth observing mission is in operation, with satellites orbiting in space, we can expect a lot of data back on land. Data from different missions often arrive in different formats, and when combined with ground-based and airborne-derived data, uncertainties can appear. Scientists will soon be able to look to these Analytic Center Framework projects, which incorporate software tools like machine learning, to help them use these data in their research.

Jennifer Swenson, a professor and scientist with Duke University, and Walter Jetz, a professor and scientist with Yale University, are both interested in exploring biodiversity through remote sensing data. Swenson plans to create a software tool that incorporates lidar and hyperspectral imagery to reveal canopy structure, which many birds, mammals and insects call home. Jetz plans to create an online database that includes analysis products, visualizations and more from remotely sensed biodiversity data. These tools will lower the barrier to entry for anyone seeking to learn, for example, where monarch butterflies live or how sugar maples are faring. They will also support NASA’s research and analysis of global hyperspectral data for active geologic processes, algal biomass and other topics.

Philip Townsend, a professor and scientist at the University of Wisconsin, is also exploring ways to share biodiversity data. He will be creating an On-Demand Geospatial Spectroscopy Processing Environment on the Cloud (GeoSPEC), which will help scientists obtain customized information on terrestrial vegetation, among other variables. It will also aid the analysis of global hyperspectral data for biological and geological science.

To help predict and mitigate the impacts of harmful algal blooms, Stephanie Schollaert Uz, a scientist at NASA Goddard Space Flight Center, will use artificial intelligence to detect poor water quality in the Chesapeake Bay. This information will help shellfish aquaculture managers understand and anticipate impacts on their operations.

Pollutants also degrade air quality, a persistent problem in Los Angeles, California. Jeanne Holm, the Senior Technology Advisor to the Mayor of the City of Los Angeles, will develop machine learning algorithms that combine ground-based information and space-based remote sensing observations to help create models that can predict air quality, which is strongly linked to public health.

Daven Henze, a professor and engineer at the University of Colorado, Boulder, and Randall Martin, a professor and engineer at Washington University in St. Louis, are both looking to improve access to atmospheric chemistry data. Henze will build a system that incorporates machine learning and modeling to better understand ozone’s link to climate change, among other topics. Martin is looking to integrate atmospheric chemistry models into Earth system models to provide more relevant and accessible data for research into issues like air quality.

Riley Duren, an engineer at the University of Arizona and JPL, will create a tool to process atmospheric methane data. Methane is one of the most potent greenhouse gases in Earth’s atmosphere. The Multi-scale Methane Analytic Framework will help extend the capabilities of existing methane analysis systems, reduce the overall cost of methane data analyses and provide easier access for researchers and decision makers.

To help enhance precipitation data, John Beck, a scientist at the University of Alabama in Huntsville, will develop a Cloud-based Analytic framework for Precipitation Research (CAPRi). This framework will generate large volumes of high-quality training data to aid deep learning models. These models will then help enhance the resolution of data from the Global Precipitation Measurement mission.
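As a rough illustration of what such training data could look like, the sketch below pairs each high-resolution precipitation field with a block-averaged, coarser version of itself – the sort of (coarse, fine) pairs a deep learning super-resolution model can be trained on. The function name and the synthetic fields are assumptions for illustration only; this is not the CAPRi pipeline itself.

```python
import numpy as np

def make_training_pairs(high_res_fields, factor=4):
    """Build (coarse, fine) pairs for training a super-resolution model.

    Each high-resolution precipitation field is block-averaged down by
    `factor` to mimic a coarser product; a model can then learn to recover
    the fine-scale field. A generic illustration, not the CAPRi workflow.
    """
    pairs = []
    for fine in high_res_fields:
        h, w = fine.shape
        hc, wc = h - h % factor, w - w % factor          # crop so blocks divide evenly
        cropped = fine[:hc, :wc]
        coarse = cropped.reshape(hc // factor, factor,
                                 wc // factor, factor).mean(axis=(1, 3))
        pairs.append((coarse, cropped))
    return pairs

# Example: ten synthetic 64x64 precipitation fields (arbitrary units)
fields = [np.abs(np.random.randn(64, 64)) for _ in range(10)]
pairs = make_training_pairs(fields)
print(pairs[0][0].shape, pairs[0][1].shape)  # (16, 16) (64, 64)
```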

To reveal other Earth processes on the ground and in the water, Andrea Donnellan, a scientist at JPL, will create a uniform crustal deformation reference model for the active plate margin of California to improve earthquake forecasts. John Moisan, a scientist at NASA Wallops Flight Facility, will create a framework to help scientists easily assemble algorithms to better understand satellite and in situ data. Although the framework can be used for a variety of Earth science phenomena, the team will focus their proof of concept on ocean chlorophyll a, which is often used to assess ocean health.

For a more general approach to the Analytic Center Framework, Anthony Ives, a scientist with the University of Wisconsin, will develop new statistical tools to analyze large, remotely sensed datasets to better support scientific findings. Beth Huffer, a philosopher and the president and founder of Lingua Logica, will develop an automated metadata pipeline that integrates machine learning and ontologies to generate consistent metadata records that are findable, accessible, interoperable and reusable (FAIR). Jia Zhang, a professor and engineer with Carnegie Mellon University, will develop a technique to help Earth scientists design data analytics workflows to address scientific questions. Hook Hua, a scientist at JPL, will build upon the Advanced Rapid Imaging and Analysis science data system to address issues arising from the complexity of large-scale algorithm development.

You can find a full list of the projects and their abstracts here.