2011 AIST Projects Awarded


NASA’s Science Mission Directorate Awards Funding for 18 Projects Under
the Advanced Information Systems Technology (AIST)
ROSES A.41 Solicitation NNH11ZDA001N-AIST
Research Opportunities in Space and Earth Sciences

02/10/2012 – NASA’s Science Mission Directorate, NASA Headquarters, Washington, DC, has selected proposals for the Advanced Information Systems Technology Program (AIST-11) in support of the Earth Science Division (ESD). The AIST-11 awards will provide technologies to reduce the risk and cost of evolving NASA information systems to support future Earth observation missions and to transform observations into Earth information, as envisioned by the National Research Council (NRC) decadal survey.

Through ESD’s Earth Science Technology Office, a total of 18 proposals will be awarded over a 3-4 year period. Two of the selected proposals will receive additional funding from ESD’s Applied Sciences Program to extend the work to enhance a decision-making activity in a relevant application area. The total amount of all the awards is roughly $23 million. The Advanced Information Systems Technology (AIST) program sought proposals for technology development activities leading to new systems for sensor support, advanced data processing, and management of data services in support of the Science Mission Directorate’s Earth Science Division. The objectives of the AIST Program are to identify, develop, and demonstrate advanced information system technologies that reduce the risk, cost, size, and development time of Earth Science Division space-based and ground-based information systems and increase the accessibility and utility of science data.

A total of 88 proposals were received.

The awards are as follows (project abstracts follow the list):

Alexander Berk, Spectral Sciences Incorporated
Plume Tracer: Interactive Mapping of Atmospheric Plumes via GPU-based Volumetric Ray Casting

Jeffrey Beyon, Langley Research Center
High-Speed On-Board Data Processing for Science Instruments (HOPS)

Yehuda Bock, Scripps Institution of Oceanography
Next-Generation Real-Time Geodetic Station Sensor Web for Natural Hazards Research and Applications

Amy Braverman, Jet Propulsion Laboratory
Multivariate Data Fusion and Uncertainty Quantification for Remote Sensing

Thomas Clune, Goddard Space Flight Center
Automated Event Service: Efficient and Flexible Searching for Earth Science Phenomena

Melba Crawford, Purdue University
An Advanced Learning Framework for High Dimensional Multi-Sensor Remote Sensing Data

Andrea Donnellan, Jet Propulsion Laboratory
QuakeSim: Multi-Source Synergistic Data Intensive Computing for Earth Science

Svetla Hristova-Veleva, Jet Propulsion Laboratory
Fusion of hurricane models and observations: Developing the technology to improve the forecasts

Hook Hua, Jet Propulsion Laboratory
Advanced Rapid Imaging & Analysis for Monitoring Hazards (ARIA-MH)

Stephan Kolitz, Draper Laboratory
EPOS for Coordination of Asynchronous Sensor Webs

Daniel Mandl, Goddard Space Flight Center
A High Performance, Onboard Multicore Intelligent Payload Module for Orbital and Suborbital Remote Sensing Missions

Mahta Moghaddam, University of Michigan
Land Information System for SMAP Tier-1 and AirMOSS Earth Venture-1 Decadal Survey Missions: Integration of SoilSCAPE, Remote Sensing, and Modeling

Ramakrishna Nemani, Ames Research Center
Semi-Automatic Science Workflow Synthesis for High-End Computing on the NASA Earth Exchange

Christa Peters-Lidard, Goddard Space Flight Center
A Mission Simulation and Evaluation Platform for Terrestrial Hydrology Using the NASA Land Information System (LIS)

Paula Pingree, Jet Propulsion Laboratory
On-Board Processing (OBP) to Advance the PanFTS Imaging System for GEO-CAPE

Bo-Wen Shen, Goddard Space Flight Center
Integration of the NASA CAMVis and Multiscale Analysis Package (CAMVis-MAP) For Tropical Cyclone Climate Study

Simone Tanelli, Jet Propulsion Laboratory
Unified Simulator for Earth Remote Sensing (USERS)

Wei-Kuo Tao, Goddard Space Flight Center
Empowering Cloud Resolving Models Through GPU and Asynchronous IO

 



Alexander Berk, Spectral Sciences Incorporated
Plume Tracer: Interactive Mapping of Atmospheric Plumes via GPU-based Volumetric Ray Casting

Timely quantification of volcanic gaseous and particulate releases is important for a number of applications. For example, recognizing rapid increases in SO2 gaseous emissions may signal an impending eruption, characterizing ash clouds is critical for safe and efficient commercial aviation, and quantifying volcanic aerosols is necessary for assessing their impact on climate forcing. The basic information needed for these applications (i.e., gaseous SO2 and particulate concentrations) can be retrieved from remotely sensed Thermal InfraRed (TIR) spectral imagery. JPL has developed state-of-the-art algorithms, embedded in its analyst-driven MAP_SO2 toolkit, for performing these retrievals. While MAP_SO2 provides very accurate results, analyzing multi-spectral imagery typically takes several days of analyst time. The bottleneck in this process is the relatively slow but accurate FORTRAN-based MODTRAN atmospheric and plume radiative transfer (RT) model, developed by Spectral Sciences, Inc. (SSI). To overcome this bottleneck, SSI, in collaboration with JPL, proposes to port these slow RT algorithms onto massively parallel, relatively inexpensive and commercially available GPUs. Plume Tracer, the final integrated product, will be an interactive toolkit for retrieving and mapping the 3D composition of atmospheric plumes from remotely measured TIR radiance spectra. Data sources include current satellite assets, such as ASTER, MODIS and AIRS, and planned future missions, such as HyspIRI. We anticipate a 100-fold increase in the computational speed of the TIR data analysis. This will yield several key benefits, including improved analyst productivity; improved quantity, quality and turnaround time of the analysis results; and expanded technical capabilities of the analysis software.

This proposal targets the Advanced Data Processing category of the ROSES-11 AIST (A.41) solicitation. As part of the proposed 3-year effort, we will exploit GPU technology to enable real-time interaction with RT models, exploration of the parameter space of these models, and visualization of 3D variations in the composition of atmospheric plumes. GPU computing, MODTRAN and MAP_SO2 are mature technologies, but the adaptation of RT models to the GPU and the integration of data import, interactive steering and exploration, and visualization tools are challenges that place our entry TRL at 2 (formulation of concept). At the conclusion of this project we will promote Plume Tracer to TRL 6 with an end-to-end system-level demonstration featuring the import of satellite data, retrieval of plume composition, and visualization of these retrievals.

Plume Tracer will be based on volumetric ray tracing, casting rays through 3D atmospheres to compute TIR radiance along optical paths from the sensor to the surface, as defined by the sensor position, view angles and surface topography. We will exploit the multi-channel GPU architecture to simultaneously cast multiple rays and model multiple spectral frequencies.
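
For illustration only (this is a sketch, not the project's code): the idea behind volumetric ray casting can be shown in a few lines of NumPy, marching a nadir ray through a 3D grid of absorption coefficients and compositing emission and attenuation cell by cell with a discrete Beer-Lambert step. Every number below (grid size, absorption, source and background radiance) is an invented toy value; on a GPU, each ray and each spectral channel would run as its own thread.

    import numpy as np

    # Toy 3D plume: absorption coefficient per cell (1/m) on a coarse grid.
    nx, ny, nz = 32, 32, 16
    kappa = np.zeros((nx, ny, nz))
    kappa[12:20, 12:20, 4:10] = 5e-4      # a blob of absorbing "SO2 plume"
    source = 8.0                          # toy per-cell emission (radiance units)
    background = 100.0                    # toy surface radiance entering the ray
    cell_len = 50.0                       # path length through one cell (m)

    def cast_ray(ix, iy):
        """March one nadir ray up column (ix, iy), compositing emission and
        attenuation cell by cell (a discrete radiative transfer step)."""
        radiance = background
        for iz in range(nz):                         # surface-to-sensor order
            tau = kappa[ix, iy, iz] * cell_len       # optical depth of this cell
            trans = np.exp(-tau)
            radiance = radiance * trans + source * (1.0 - trans)
        return radiance

    # On a GPU every ray (and every spectral channel) runs in parallel;
    # here we simply loop over the image plane.
    image = np.array([[cast_ray(ix, iy) for iy in range(ny)] for ix in range(nx)])
    print(image.min(), image.max())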

Plume Tracer will find immediate use in Dr. Realmuto’s current projects to study volcanic plumes with ASTER, AIRS, MODIS and MISR data, which are funded by the Earth Surface and Interior (ESI) and Terra/Aqua Science programs. The remote sensing of volcanic plumes is a high priority for the Hyperspectral and Infrared Imager (HyspIRI) Decadal Survey mission. The Plume Tracer prototype will focus on the remote sensing of volcanic plumes, but the techniques will be applicable to additional 3D chemical or particulate plumes, such as those emanating from industrial flares and forest fires. In addition, the compensation/correction for atmospheric absorption and emission is a major area of research and development at SSI and fundamental to any effort to estimate the geophysical properties of objects or materials from remote measurements of scene TIR radiance. The development of GPU-based RT modeling and interactive steering of RT models will benefit the entire remote sensing community.

 



Jeffrey Beyon, Langley Research Center
High-Speed On-Board Data Processing for Science Instruments (HOPS)

Processing performance and data throughput are key factors that determine whether a mission can make meaningful science measurements from space. Estimated data processing and transfer requirements for the ASCENDS mission are principal examples of the high data rate challenge posed when measuring global wind profiles or CO2. We propose to advance the state-of-the-art in high performance on-board data processing to accommodate the demands in data rate, size, and processing speed of advanced science instruments. We propose a hybrid modular architecture that supports both general-purpose (serial) processing and a reconfigurable fabric for high-speed parallel computation. In order to handle the high-rate input, we will implement a high-speed interconnection fabric. Computationally expensive multi-dimensional science algorithms will be used both to drive the design and to validate the final architecture. Advanced electronics, chosen for a path to spaceflight, will be employed to reduce mass and power while improving performance. The architecture will be modular and scalable to support ASCENDS and a broad class of future science missions.
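
As a toy sketch of the serial-plus-parallel split described above (not the HOPS hardware design itself), the snippet below lets a serial loop handle sequencing while a Python process pool stands in for the reconfigurable parallel fabric; the block size and the FFT-based stand-in algorithm are assumptions made purely for illustration.

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def process_block(block):
        """Stand-in for a computationally expensive science algorithm
        (here: the power spectrum of one block of lidar-like samples)."""
        return np.abs(np.fft.rfft(block)) ** 2

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        stream = rng.standard_normal((64, 4096))   # 64 blocks of raw samples
        # The serial general-purpose side sequences the data; the pool plays
        # the role of the parallel fabric working on many blocks at once.
        with ProcessPoolExecutor() as pool:
            spectra = list(pool.map(process_block, stream))
        print(len(spectra), spectra[0].shape)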

 



Yehuda Bock, Scripps Institution of Oceanography
Next-Generation Real-Time Geodetic Station Sensor Web for Natural Hazards Research and Applications

The primary objective of our collaborative project is to better forecast, assess, and mitigate natural hazards, including earthquakes, tsunamis, and extreme storms and flooding through development and implementation of a modular technology for the next-generation in-situ geodetic station, and a Geodetic Sensor Web to support the flow of information from multiple stations to scientists, mission planners, decision makers, and first responders. Meaningful warnings save lives when issued within 1-2 minutes for destructive earthquakes, several tens of minutes for tsunamis, and up to several hours for extreme storms and flooding, and can be provided by on-site fusion of multiple data types and generation of higher-order data products: GPS and accelerometer measurements to estimate point displacements, and GPS and meteorological measurements to estimate moisture variability in the free atmosphere. By operating semi-autonomously, each station can provide low-latency, high-fidelity and compact data products within the constraints of narrow communications bandwidth that often accompanies natural disasters.
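
To make the on-site, multi-sensor fusion idea concrete, here is a deliberately simplified sketch (not the project's algorithm) that blends noisy GPS displacements with double-integrated accelerometer data through a complementary filter: the inertial path supplies the high-frequency motion while GPS bounds the low-frequency drift. The sampling rates, noise levels, and filter gain are invented toy values.

    import numpy as np

    dt = 0.01                                    # toy 100 Hz sampling
    t = np.arange(0, 10, dt)
    true_disp = 0.05 * np.cos(2 * np.pi * 0.5 * t)            # 5 cm oscillation
    true_accel = -0.05 * (2 * np.pi * 0.5) ** 2 * np.cos(2 * np.pi * 0.5 * t)

    rng = np.random.default_rng(1)
    gps = true_disp + 0.01 * rng.standard_normal(t.size)      # noisy GPS positions
    accel = true_accel + 0.02 * rng.standard_normal(t.size)   # noisy accelerometer

    # Complementary filter: trust the double-integrated accelerometer at high
    # frequency and GPS at low frequency (alpha near 1 favors the inertial path).
    alpha = 0.99
    disp = np.zeros(t.size)
    vel = 0.0
    for k in range(1, t.size):
        vel += accel[k] * dt
        pred = disp[k - 1] + vel * dt            # inertial prediction
        disp[k] = alpha * pred + (1 - alpha) * gps[k]

    print("rms error (m):", np.sqrt(np.mean((disp - true_disp) ** 2)))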

The project encompasses the following tasks, including hardware and software components:

(1) Development of a power-efficient, low-cost, plug-in Geodetic Module (~$2,000) for fusion of data from in situ sensors including GPS, a strong-motion accelerometer module (~$500), and a meteorological sensor package (~$500), for deployment at 26 existing continuous GPS stations in southern California. The low-cost modular design is scalable to the many existing continuous GPS stations worldwide.

(2) Estimation of new on-the-fly data products with 1 mm precision and accuracy, including three-dimensional broadband displacements and precipitable water, by new software embedded in the Geodetic Module’s processor, rather than at a central processing facility.

(3) Development of a Geodetic Sensor Web to allow the semi-autonomous sensors to transmit and receive information in real time by means of redundant sensor proxy servers and message broker networks to allow for robust sensor control, flow of data, data products, models and alarms, and to avoid single points of failure during emergencies.

The proposed period of performance is four years, including a three-year technology development phase and a one-year technology infusion phase with the same team of investigators throughout. The project will increase Technology Readiness Levels from 4 to 7 in the first phase, starting with existing hardware and software components developed as part of earlier NASA and NOAA projects, and will further increase to TRL 8 in the fourth year. The team from SIO and JPL will interact with users (Co-I’s) at the two National Weather Service Weather Forecast Offices in southern California (San Diego and Los Angeles/Oxnard) and NOAA’s Earth System Research Laboratory in Boulder to provide tropospheric signal delays and precipitable water vapor estimates for forecasting severe storms and flooding. Broadband displacements for earthquake and tsunami early warning and rapid response will be made available to users in the geophysics community through the Southern California Earthquake Data Center, directed by the Caltech PI. There is a natural and obvious pathway to infusing this technology into NASA’s Space Geodesy Project, which is developing the next generation of collocated space geodetic fiducial stations, envisioned for real-time and autonomous operations. Next-generation geodetic stations can also supply real-time calibration information to NASA space missions such as the NPOESS Preparatory Project mission with the CrIMSS instrument suite to produce accurate temperature, water vapor, and pressure profiles, as part of a demonstration of the next-generation weather satellite (with NOAA), and the future DESDynI mission.

 



Amy Braverman, Jet Propulsion Laboratory
Multivariate Data Fusion and Uncertainty Quantification for Remote Sensing

Carbon dioxide is a crucial factor in climate change and global warming, but as much as forty percent of anthropogenic carbon dioxide remains unaccounted for in the Earth system. NASA and the rest of the scientific community are engaged in vigorous data collection, analysis, and modeling efforts to better characterize and understand the distribution and evolution of atmospheric carbon dioxide. The Atmospheric Infrared Sounder (AIRS) and Japan’s Greenhouse Gases Observing Satellite (GOSAT) are already producing data products, the Orbiting Carbon Observatory (OCO-2) will launch in less than two years, and the Decadal Survey mission ASCENDS is slated for launch by the end of the decade. However, no remote sensing instrument observes everywhere all the time, so combining information from multiple missions is necessary to achieve the most complete and accurate picture possible of the state of the atmosphere. Each instrument has unique observing characteristics, the strengths of which can complement each other if that fusion of information is done in a way that yields accurate estimates with minimum uncertainties.

The overall objective of this project is to develop and implement statistical algorithms for fusing heterogeneous remote sensing data sets to produce optimal fused estimates of an underlying geophysical field, with probabilistically defined uncertainties associated with them. In particular, we focus here on fusion to estimate multivariate quantities such as atmospheric profiles. The benefits of this technology derive from the ability to estimate profiles, at any location and time, from input data that have different footprint geometries, measurement error characteristics, observation times and locations, and patterns of missingness. As a guiding use-case, consider the need for an observational data set that can be used to diagnose and improve CO2 atmospheric transport models. One of the primary limitations of these models is their representation of vertical mixing. An observational data set of the full, three-dimensional atmospheric CO2 field over time would contribute to better understanding of vertical transport and to improving these models. No single instrument provides this information at all locations and times, but we can infer this field by fusing, for example, total column CO2 from GOSAT with mid-tropospheric and stratospheric CO2 measurements from AIRS, and total column or profile information from OCO-2. There are a wide variety of other applications for this technology, but we focus here on CO2 because there is a concrete, well-articulated need that motivates a focused effort.

The proposed work has two main components. First, we will extend the Spatio-Temporal Data Fusion (STDF) technology developed under our AIST-08 project, Geostatistical Data Fusion for Remote Sensing Applications, to the case of fusing profiles rather than fusing scalars. Second, we will characterize the computational performance of our algorithm and assess its output. To characterize computational performance, we will build a simulation testbed for generating synthetic “truth” data sets, derive synthetic instrument data, fuse them, and compare the results with this “truth”. To assess the fused estimates produced from real remote sensing inputs, we will compare those estimates to available in-situ and model-generated data sources.
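
The statistical heart of such fusion can be illustrated with a deliberately tiny example (inverse-variance weighting of two independent estimates of a profile); the project's spatio-temporal methodology is far richer, and the instruments, error levels, and profile below are invented.

    import numpy as np

    levels = np.linspace(1000, 100, 10)          # toy pressure levels (hPa)
    truth = 390 + 0.01 * (1000 - levels)         # toy CO2 profile (ppm)

    rng = np.random.default_rng(2)
    sigma_a, sigma_b = 2.0, 1.0                  # assumed 1-sigma errors (ppm)
    inst_a = truth + sigma_a * rng.standard_normal(levels.size)
    inst_b = truth + sigma_b * rng.standard_normal(levels.size)

    # Inverse-variance weighting: the minimum-variance unbiased combination
    # of independent estimates, with its own (smaller) posterior uncertainty.
    w_a, w_b = 1 / sigma_a**2, 1 / sigma_b**2
    fused = (w_a * inst_a + w_b * inst_b) / (w_a + w_b)
    fused_sigma = np.sqrt(1 / (w_a + w_b))

    print("fused 1-sigma (ppm):", fused_sigma)   # ~0.89 < min(sigma_a, sigma_b)
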
The period of performance for this project is May 1, 2012 through April 30, 2015. The entry TRL for this project is 3, and the planned exit TRL is 6.

 



Thomas Clune, Goddard Space Flight Center
Automated Event Service: Efficient and Flexible Searching for Earth Science Phenomena

Motivation/Problem Statement
A large portion of Earth Science investigations is phenomenon- or event-based, such as the studies of Rossby waves, mesoscale convective systems, and tropical cyclones. However, except for a few high-impact phenomena, e.g., tropical cyclones, comprehensive records are absent for the occurrences or events of these phenomena. Phenomenon-based studies therefore often focus on a few prominent cases while the lesser ones are overlooked. Without an automated means to gather the events, comprehensive investigation of a phenomenon is at least time-consuming if not impossible.

Proposed Solution
We therefore propose an Automated Event Service (AES) system that methodically mines custom-defined events in the reanalysis data sets of atmospheric general circulation models. Our AES will enable researchers to specify their custom, numeric event criteria using a user-friendly web interface to search the reanalysis data sets. Searches can also be performed using our Event Specification Language (ESL) to afford more flexibility and versatility. Investigators will be able to subscribe to event searches and get notified of new results when data sets are updated with the latest additions. Moreover, we will implement a social component to enable dynamic formation of collaboration groups for researchers to cooperate on event definitions of common interest.
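
As a toy stand-in for such an event criterion (the actual ESL syntax and the Hadoop-backed search are not shown here), the sketch below scans an invented gridded field for grid cells where a variable exceeds a threshold for several consecutive time steps, which is the flavor of query AES is meant to answer at scale.

    import numpy as np

    rng = np.random.default_rng(3)
    # Toy "reanalysis" field with shape (time, lat, lon).
    field = 0.1 * rng.standard_normal((365, 18, 36)).cumsum(axis=0)

    def find_events(data, threshold, min_duration):
        """Return (lat, lon, start, length) for runs where data exceeds
        the threshold for at least min_duration consecutive time steps."""
        events = []
        exceed = data > threshold
        for i in range(data.shape[1]):
            for j in range(data.shape[2]):
                run = 0
                for k in range(data.shape[0]):
                    if exceed[k, i, j]:
                        run += 1
                    else:
                        if run >= min_duration:
                            events.append((i, j, k - run, run))
                        run = 0
                if run >= min_duration:
                    events.append((i, j, data.shape[0] - run, run))
        return events

    print(len(find_events(field, threshold=2.0, min_duration=5)), "events found")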

Research Strategy
Our primary implementation strategy is to construct AES via a sequence of small increments driven by use cases and test data, i.e., the agile development methodology. Initially we will pursue a functional, but narrowly focused, system built from relatively mature components including high-end computing technologies such as Map-Reduce and Hadoop/HBase for efficiently accessing and searching large collections of data. This baseline system will provide a robust foundation from which to pursue the less mature aspects of our research goals: the ESL and advanced search technologies. Either of these requires analysis of a representative suite of event scenarios to choose among competing design trade-offs. For ESL the trade-offs are between simplicity and capability, while for the advanced search technologies, we must evaluate a number of operational parameters including dimensionality and variant indexing schemes for performance and storage requirements.

Significance
The AES, with its Event Web Service and compatibility with the NASA Earth Observing System Clearinghouse (ECHO), will streamline event identification and the subsequent discovery of concurrent satellite observations from the vast stores of NASA Earth Science remote sensing data. This previously unavailable capability will enable more systematic, comprehensive, and efficient investigations into Earth Science phenomena, thereby significantly improving Earth Science productivity. The AES, which provides nearly instantaneous event search results through distributed parallel processing and a novel indexing scheme, combined with a sophisticated collaborative social component and aimed at breaking through a bottleneck in the Earth Science research process, represents a significantly innovative use and integration of information technologies.

Relevance to the Program Element
AES is a technology “intended to improve the science value of the data at minimal cost.” Thus Advanced Data Processing is the relevant program element. Specifically, AES is relevant to the following sub-elements:
* Data mining and visualization to enable analysis.
* Processing techniques to enable multi-source data fusion across models, satellites, and in situ sensors.
* Techniques to exploit cloud computing technologies for large-scale on-demand data processing, mining, distribution, and provenance.

 



Melba Crawford, Purdue University
An Advanced Learning Framework for High Dimensional Multi-Sensor Remote Sensing Data

Advances in optical remote sensing technology over the past two decades have enabled a dramatic increase in the spatial, spectral, and temporal data now available to support Earth science research and applications, necessitating equivalent developments in signal processing and data exploitation algorithms. Although this increase in the quality and quantity of diverse multi-source data can potentially facilitate improved understanding of fundamental scientific questions, the conventional data processing and analysis algorithms currently available to scientists are designed for single-sensor, low dimensional data. These methods cannot fully exploit existing and future Earth observation data. Capability to extract information from high dimensional data sets acquired by hyperspectral sensors, textural features derived from multispectral and polarimetric SAR data, and vertical information represented in full waveform LIDAR is particularly lacking. New algorithms are specifically needed for analysis of existing data from NASA airborne instruments (e.g. AVIRIS, LVIS, and G-LiHT) and the EO-1 Hyperion and ALI instruments, and of future data acquired by the upcoming Landsat Data Continuity Mission, ICESat-II, and HyspIRI. New paradigms are also required to effectively extract useful information from the high dimensional feature space resulting from multi-source, multi-sensor, and multi-temporal data.

Classification of remotely sensed data is an integral part of the analysis stream for land cover mapping in diverse applications. High dimensional data provide the capability to make significant improvements in classification results, particularly in environments with complex spectral signatures which may overlap or where textural features provide discriminative information. In this project, a new classification framework will be developed and implemented for robust analysis of high dimensional, multi-sensor/multi-source data in small training sample size conditions (limited in-situ/ground reference data). The framework will employ a multi-kernel Support Vector Machine, ensemble classification, and decision fusion to effectively exploit a diverse collection of potentially disparate features derived either from the same sensor (e.g. spatial-spectral analysis tasks) or from different sensors (e.g. LIDAR and hyperspectral data). The system will significantly advance the reliability of accurate mapping of remotely sensed data, particularly for scenarios with very little reference data to train the classification model. An active learning (AL) component will be integrated in the multi-source/multi-sensor environment to mitigate the impact of limited training data, effectively closing the loop between image analysis and field collection to acquire the most informative samples for the classification task.
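
A minimal sketch of the multi-kernel idea, assuming scikit-learn is available: two feature sources each get their own RBF kernel, and a fixed convex combination of the two (any convex combination of kernels is itself a valid kernel) feeds a precomputed-kernel SVM. The features, labels, and kernel weight are invented toy values, and the project's ensemble and active-learning machinery is not shown.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.metrics.pairwise import rbf_kernel

    rng = np.random.default_rng(4)
    n = 200
    spectral = rng.standard_normal((n, 50))    # toy hyperspectral features
    spatial = rng.standard_normal((n, 10))     # toy texture/LIDAR features
    y = (spectral[:, 0] + spatial[:, 0] > 0).astype(int)
    train, test = np.arange(150), np.arange(150, n)

    def composite_kernel(a1, a2, b1, b2, w=0.6):
        """Weighted sum of two RBF kernels, one per feature source."""
        return w * rbf_kernel(a1, a2) + (1 - w) * rbf_kernel(b1, b2)

    K_train = composite_kernel(spectral[train], spectral[train],
                               spatial[train], spatial[train])
    K_test = composite_kernel(spectral[test], spectral[train],
                              spatial[test], spatial[train])

    clf = SVC(kernel="precomputed").fit(K_train, y[train])
    print("toy accuracy:", clf.score(K_test, y[test]))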

The proposed project will have an entry level of TRL-2. During Year 1, the team will incorporate the multi-kernel SVM into the classification system for hyperspectral and multi-source high dimensional data. Multi-view active learning methods previously developed by the PI will also be implemented in the multi-source environment. Efforts in Year 2 will focus on extending the spatial-spectral feature extraction capability, incorporating multi-sensor classification via the multi-kernel SVM ensemble model, and integrating active learning in the multi-sensor environment. Application of the methods will be initiated using a test-bed of multispectral, hyperspectral, and LIDAR data. In Year 3, integration of the proposed framework will be completed, and extensive testing and validation will be conducted using multiple relevant scenarios. The prototype system will be implemented on the Purdue HUB computational platform to enable a TRL-4 system at the end of the study.

 



Andrea Donnellan, Jet Propulsion Laboratory
QuakeSim: Multi-Source Synergistic Data Intensive Computing for Earth Science

We will develop a multi-source, synergistic, data-intensive computing system to support modeling earthquake faults individually and as complex interacting systems. This will involve information technology research and development for data management and data-centric cloud computing. Numerous and growing online data sources from NASA, USGS, NSF, and other resources provide researchers with an exceptional opportunity to integrate varied data sources to support comprehensive efforts in data mining, analysis, simulation, and forecasting. The primary focus of this project is to extend this infrastructure to support fault modeling with a focus toward earthquake forecasting and response, but the developed technology can support a wide array of science and engineering applications.

This project will: 1) Develop bridging services within the QuakeSim service-oriented architecture that will integrate data from multiple sources, including interferograms, GPS position and velocity measurements, and seismicity; 2) Develop a fundamental cloud computing framework to support fault model optimization through the integration of multiple data types; 3) Develop the cyberinfrastructure within the QuakeSim science gateway to handle the computing requirements of the optimization framework; 4) Improve the QuakeTables fault database to handle issues of model contribution, provenance, version tracking, commenting, rating, etc., for fault models produced by the optimization framework; and 5) Use the improved fault models in downstream earthquake hazard assessments and forecasts, such as by the SCEC simulations group.
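
At the core of fault model optimization is the inversion of surface deformation for slip on fault patches. The sketch below is a toy, damped least-squares version with an invented linear Green's function matrix; real inversions use elastic dislocation models and the cloud-scale optimization described above.

    import numpy as np

    rng = np.random.default_rng(5)
    n_obs, n_patches = 60, 12     # surface observations, fault patches

    # Invented Green's functions: surface response per unit slip on each patch.
    obs_x = np.linspace(0, 1, n_obs)
    patch_x = np.linspace(0, 1, n_patches)
    G = np.exp(-np.abs(np.subtract.outer(obs_x, patch_x)) / 0.15)

    true_slip = np.zeros(n_patches)
    true_slip[4:8] = 1.5          # meters of slip on the central patches
    d = G @ true_slip + 0.05 * rng.standard_normal(n_obs)   # noisy GPS/InSAR data

    # Damped least squares: Tikhonov regularization stabilizes the inversion.
    lam = 0.1
    slip_hat = np.linalg.solve(G.T @ G + lam * np.eye(n_patches), G.T @ d)
    print(np.round(slip_hat, 2))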

Understanding crustal deformation and fault behavior leads to improved forecasting, emergency planning, and disaster response. Accurate fault models are supported through complementary information such as geologic observations, crustal deformation from InSAR and GPS, and seismicity. Fault models are subject to both known and unknown uncertainties that propagate through any analysis and downstream applications. Providing better constraints on the models by integrating multiple data collections, delivering these models through flexible, Web-based catalog services, and validating these models with numerous downstream applications will improve our understanding of earthquake processes. Analysis of crustal deformation data often indicates the existence of otherwise unknown faults. This project provides the computing infrastructure to identify, characterize, model, and consider the consequences of unknown faults.

Handling large volumes of InSAR data and integrating the data with model applications is necessary for optimizing the utility of NASA’s DESDynI-R mission, which will produce tremendous volumes of InSAR data products. All developed capabilities will be made available through QuakeSim’s science gateway infrastructure. The project concludes with a deployment of selected project components at appropriate production facilities, including the Alaska Satellite Facility and UNAVCO.

Infusion will proceed through several collaborations and will support disaster response. Through collaboration with the NASA-funded E-DECIDER project, we will deliver tools to emergency response communities. We will infuse the crustal deformation modeling tools into the civil engineering community for analysis of fluid flow in reservoirs. We will work closely with the US Geological Survey to develop deformation and aftershock assessment tools that are coupled to the QuakeCatcher early warning network. QuakeSim simulations will feed into the Southern California Earthquake Center Simulations group, which in turn will contribute to new versions of the Uniform California Earthquake Rupture Forecast (versions 3 and 4).

This four-year project has a period of performance from February 2012 through January 2016. The entry TRL for the project is 2, with an exit TRL of 5. The entry TRL for the infusion part of the project is 2 and the planned exit TRL is 7.

 



Svetla Hristova-Veleva, Jet Propulsion Laboratory
Fusion of hurricane models and observations: Developing the technology to improve the forecasts

Recognizing an urgent need for more accurate hurricane forecasts, the National Oceanic and Atmospheric Administration (NOAA) recently established the multi-agency 10-year Hurricane Forecast Improvement Project (HFIP). The two critical pathways to hurricane forecast improvement are: (1) validation and improvement of hurricane models through the use of satellite data; and (2) development and implementation of advanced techniques for assimilation of satellite observations inside the hurricane precipitating core. Despite the significant volume of satellite observations available today, these data are still underutilized in hurricane research and operations. Our work will bring the unique expertise of NASA and JPL to bear and will result in significant contributions in both areas.
The proposed work will build upon two very successful NASA-funded projects, the JPL Tropical Cyclone Information System (TCIS) (https://tropicalcyclone.jpl.nasa.gov and https://grip.jpl.nasa.gov) and the Instrument Simulator Suite for Atmospheric Remote Sensing (ISSARS).

We will develop three critical new extensions of the TCIS portal that will allow the fusion of observations and model simulations to improve our understanding and forecasting of hurricane processes. The three components fall under the "1.3.1 Advanced Data Processing" section of the call. They are:

1) the coupling of the instrument simulator with operational hurricane forecast models and the incorporation of simulated satellite observables into the existing database of satellite and airborne observations. As part of this integration we will develop tools for model-observation fusion (e.g. data mining to determine when and what satellite observations are available inside the model domain; model sub-sampling in accordance with the time and space coverage of the satellite/airborne overpasses). A toy sketch of such sub-sampling follows item 3 below.

2) the development of a set of analysis tools that will enable users to calculate joint statistics, produce composites, compare modeled and observed quantities, and apply advanced strategies to assimilate remote sensing observations into meso-scale models.

3) the development of data immersion techniques to enable real-time interaction with the models and visualization of highly complex systems. We will build upon the approach we have developed to visualize a comprehensive set of satellite observations (see https://grip.jpl.nasa.gov). Under this effort we will develop new approaches to include the visualization of the time-series 3D model data.
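
The toy sketch referenced in item 1, assuming nothing about the real TCIS code: it sub-samples an invented hourly model field at the nearest model time and grid cell for each (also invented) satellite observation, so that modeled and observed quantities share the same time and space coverage.

    import numpy as np

    rng = np.random.default_rng(6)
    # Toy hourly model output on a small grid: (time, lat, lon).
    model = rng.standard_normal((24, 30, 30))
    model_times = np.arange(24.0)                  # hours

    # Toy satellite overpass: observation times and grid positions.
    obs_t = np.array([3.2, 3.2, 14.7])
    obs_i = np.array([10, 11, 20])
    obs_j = np.array([5, 5, 22])

    # Match each observation to the nearest model output time, then pull
    # the model value at the observed grid cell.
    nearest = np.abs(model_times[None, :] - obs_t[:, None]).argmin(axis=1)
    matched = model[nearest, obs_i, obs_j]
    print(matched)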

The developed technology will be infused into NOAA’s Hurricane Research Division (HRD) operations and will be used in conjunction with the hurricane models that are used operationally, the Hurricane Weather Research and Forecasting (HWRF) model and the associated Hurricane Ensemble Data Assimilation System (HEDAS), to validate and improve their forecasts, thus addressing the goal of the HFIP effort.

The proposed work will provide the missing components that are needed to fully realize the potential of NASA’s satellite and airborne observations to validate and improve hurricane forecasts, demonstrating to the public the high value of NASA’s satellite data in monitoring and accurately predicting extreme weather events with high societal impact.

We will integrate the above three independent subsystems into one end-to-end hurricane model and observational data system. We will advance the TRL from 2 to 6 by the end of the three-year technology development phase and from 6 to 7 by the end of the fourth-year technology infusion phase.

 



Hook Hua, Jet Propulsion Laboratory
Advanced Rapid Imaging & Analysis for Monitoring Hazards (ARIA-MH)

Volcanic eruptions, landslides, and levee failures are some examples of hazards that can be more accurately forecasted with sufficient monitoring of precursory ground deformation, such as the high-resolution measurements from GPS and InSAR. In addition, coherence and reflectivity change maps can be used to detect surface change due to lava flows, mudslides, tornadoes, floods, and other natural and man-made disasters. However, it is difficult for many volcano observatories and other monitoring agencies to process GPS and InSAR products in the automated fashion needed for continual monitoring of events. Additionally, numerous interoperability barriers exist in multi-sensor observation data access, preparation, and fusion to create actionable products. Combining high spatial resolution InSAR products with high temporal resolution GPS products, and automating this data preparation and processing across global-scale areas of interest, presents an untapped science and monitoring opportunity.
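
A toy sketch of why the combination pays off (all rates and noise levels are invented, and nothing here reflects the actual ARIA-MH algorithms): precise but infrequent InSAR epochs are interpolated in time and blended with noisier daily GPS by inverse-variance weighting, yielding a deformation series that is both dense in time and anchored by InSAR precision.

    import numpy as np

    # Toy deformation at one pixel: GPS daily, InSAR every 24 days.
    days = np.arange(0, 240)
    true = 0.002 * days                                  # 2 mm/day of motion

    rng = np.random.default_rng(7)
    gps = true + 0.004 * rng.standard_normal(days.size)  # noisy daily GPS
    insar_days = days[::24]
    insar = true[::24] + 0.001 * rng.standard_normal(insar_days.size)

    # Interpolate the sparse InSAR epochs to daily sampling, then blend with
    # GPS by inverse-variance weights (InSAR variance inflated to account
    # for interpolation error).
    insar_daily = np.interp(days, insar_days, insar)
    w_gps, w_insar = 1 / 0.004**2, 1 / 0.002**2
    combined = (w_gps * gps + w_insar * insar_daily) / (w_gps + w_insar)

    print("rms vs truth (m):", np.sqrt(np.mean((combined - true) ** 2)))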

Objectives:
We will develop an advanced service-oriented architecture for hazard monitoring, leveraging NASA-funded algorithms and data management, to enable both science and decision-making communities to monitor areas of interest via seamless data preparation, processing, and distribution. Our objectives:
* Enable high-volume and low-latency automatic generation of NASA Solid Earth science data products (InSAR and GPS) to support hazards monitoring.
* Facilitate NASA-USGS collaborations to share NASA InSAR and GPS data products, which are difficult to process at high volume and low latency, for decision support.
* Enable interoperable discovery, access, and sharing of NASA observations and derived actionable products, and between the observation and decision-making communities.
* Enable improved understanding of these observations and products through visualization, mining, and cross-agency sharing.

Expected Benefits:
We will enable greater understanding of surface processes leading up to, during, and after natural and man-made disasters. The global coverage offered by satellite-based SAR observations, and the rapidly expanding GPS networks, can provide orders of magnitude more data on these hazardous events if we have a data system that can efficiently and effectively analyze the voluminous data and provide users the tools to access data from their regions of interest. Currently, combined GPS & InSAR time series are primarily generated for specific research applications; they are not implemented to run on large continuous data sets, nor delivered to decision-making communities.

Methodology:
We will leverage our team’s prototype Advanced Rapid Imaging & Analysis for Earthquakes data system that automates the generation of geodetic imaging products. Existing InSAR & GPS processing packages and other software will be integrated for generating geodetic decision support monitoring products. We will employ semantic and cloud-based data management and processing techniques for handling large data volumes, reducing end product latency, codifying data system information with semantics, and deploying interoperable services for actionable products to decision-making communities.

Entry TRL: 3, Exit TRL: 6

Infusion Plan:
Though ARIA-MH can be used for generic deformation hazard monitoring, we will focus on volcano monitoring as a first infusion point. Working with USGS Volcano Science Center and the Hawaiian Volcano Observatory, we will deploy interoperable data and services to infuse InSAR and InSAR/GPS combined products into the USGS Volcano Analysis and Visualization Environment (VALVE) to assist in decision-making at HVO. Exit TRL: 7
Period of Performance: 2012-04 to 2016-03

 



Stephan Kolitz, Draper Laboratory
EPOS for Coordination of Asynchronous Sensor Webs

We propose to enhance and extend the capabilities of Draper’s Earth Phenomena Observation System (EPOS), developed under previous AIST (99, 02 and 05) funding, to increase science data utility through new information product development for: 1) disaster management, with a focus on wildfires, and 2) sensor web design and operations.

We propose to address the problems of wildfire mitigation, warning and response by providing improved situational awareness/assessment and optimized planning. Improved situation awareness/assessment will include the near-term (i.e., over the next few days) prediction of wildfire location and size. Optimized planning will use these predictions and other input (e.g., cloud predictions) to produce optimized wildfire observation plans.

These planning capabilities will also provide support for: 1) optimized design of new sensor systems (including sensors on satellites, aircraft, ground vehicles, and in situ); 2) optimized observation planning (including asynchronous distributed sensor systems); and 3) optimized calibration and science campaigns (including the planning of simultaneous observations from multiple sensor systems on satellites, aircraft and ground vehicles).
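
One minimal way to picture the optimized observation planning described above (purely a toy, with invented values and a greedy rule standing in for EPOS's actual optimization): in each time slot, point the sensor at the unobserved site with the best science value weighted by the forecast probability of a cloud-free view.

    import numpy as np

    rng = np.random.default_rng(8)
    n_slots, n_sites = 8, 5
    value = rng.uniform(1, 10, (n_slots, n_sites))       # science value per look
    p_clear = rng.uniform(0.2, 1.0, (n_slots, n_sites))  # cloud-free probability

    # Greedy plan: per slot, choose the unobserved site maximizing the
    # expected value (value x probability the scene is cloud-free).
    observed = set()
    plan = []
    for slot in range(n_slots):
        expected = value[slot] * p_clear[slot]
        for s in observed:
            expected[s] = -np.inf                        # no revisits
        best = int(expected.argmax())
        observed.add(best)
        plan.append((slot, best))
        if len(observed) == n_sites:
            break
    print(plan)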

 



Daniel Mandl, Goddard Space Flight Center
A High Performance, Onboard Multicore Intelligent Payload Module for Orbital and Suborbital Remote Sensing Missions

This research effort seeks to raise the Technology Readiness Level (TRL) of onboard multicore processing for use on the Intelligent Payload Module (IPM) which is targeted for use on the HyspIRI NASA Decadal Survey Mission. Although the IPM is targeted for HyspIRI, it is also applicable to other NASA Decadal missions such as GeoCape and airborne remote sensing missions that use Global Hawk, Sierra or the ER-2. As a secondary benefit, this effort sets precedent for the use of the NASA airborne program for testbed platforms to infuse technology into NASA Decadal missions.

The purpose of the IPM is to provide a secondary science processor for high data rate science missions. It will provide access to subsets of the instrument data in real time and then provide the ability either to send the subset of data rapidly to the ground or to process the data onboard to build quicklook products useful to low-latency users such as emergency workers. The ability to realize this operational concept is predicated on the ability to capture a subset of data from a data stream of close to 1 Gbps and then to process it in near real time. That capability does not exist at present on spaceflight-ready computer systems. This effort seeks to demonstrate this capability using a multicore processor system that targets the combined use of the Maestro and SpaceCube architectures. The Maestro architecture is being developed under a NASA/DoD collaboration and is based on the non-radiation-hardened multi-tiled Tilera architecture. SpaceCube is an architecture developed at NASA/GSFC based on FPGA and Virtex processor technologies.

The plan builds on efforts that were begun with a GSFC Internal Research and Development (IRAD) effort during FY 11 and funding from the HyspIRI project to build a Tilera/Maestro testbed that runs typical HyspIRI applications. The flight testbed components would be further developed into an integrated flight box that would be installed on various airborne vehicles, including, but not limited to, the King Air B200, the ER-2, and the Global Hawk Unmanned Aerial Vehicle (UAV). Furthermore, an instrument similar to the instruments on HyspIRI, such as the Enhanced MODIS Airborne Simulator (eMAS) imaging spectrometer, would be interfaced to the IPM for these flights to provide actual data to the IPM. The vision is to obtain two to three flight demonstrations with increasingly complex scenarios to emulate the HyspIRI operations concept for low-latency users.

The key research components to enable the IPM concept to work are: (1) demonstrating that the selected hybrid multicore processor architecture (Maestro/Tilera and SpaceCube) can utilize parallel processing to keep up with the high data rates; (2) demonstrating that the various processing levels, which include Level 0, Level 1, Atmospheric Correction, Geocorrection and Level 2 processing, can be allocated to the various cores/tiles; (3) demonstrating viable optimization of the processing by allocating various phases of the processing to appropriate processors, for example allocating band stripping to the FPGA in the SpaceCube processor; and (4) demonstrating user control from the ground via a standardized interface called the Web Coverage Processing Service (WCPS) to allow users to specify in near real time the algorithms they want the IPM to implement.
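
Component (3)'s band stripping can be pictured with a toy snippet (the band indices, cube dimensions, and NDVI-style quicklook are invented stand-ins, not the IPM's actual product chain): only the two channels a low-latency user asked for are pulled from the full hyperspectral cube and reduced to a small product fit for rapid downlink.

    import numpy as np

    rng = np.random.default_rng(9)
    # Toy hyperspectral frame: 210 bands of 128 x 128 pixels.
    cube = rng.uniform(0.0, 1.0, (210, 128, 128)).astype(np.float32)

    RED, NIR = 30, 47          # invented band indices for the toy sensor

    def quicklook_ndvi(frame):
        """Band-strip two channels and form a small NDVI-style quicklook,
        the kind of low-latency subset product the IPM would send first."""
        red, nir = frame[RED], frame[NIR]
        return (nir - red) / (nir + red + 1e-6)

    product = quicklook_ndvi(cube)
    print(product.shape, float(product.mean()))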

Multicore onboard processing is currently at TRL 3, and the target is to raise it to TRL 6 by the end of the 3-year research project. This will be accomplished by demonstrating the complete set of capabilities using the target chip sets, WCPS, and the ability to manage the parallel processing allocated to the multiple cores/tiles.

 



Mahta Moghaddam, University of Michigan
Land Information System for SMAP Tier-1 and AirMOSS Earth Venture-1 Decadal Survey Missions: Integration of SoilSCAPE, Remote Sensing, and Modeling

This project seeks to develop information technologies for enhancing science return and science data quality from two Decadal Survey (DS) missions by (1) developing a generalized framework for large-scale ground-based sensor web technologies for near-real-time validation of, and product generation from, coarse satellite observations in heterogeneous landscapes and (2) building a generalized science information system framework for harmonizing the life cycles of the variety of data and models needed to generate and validate NASA mission products and to address key driving science questions of the DS missions. The technologies proposed here directly apply to, and will be prototyped for, the Soil Moisture Active Passive (SMAP) Tier 1 and the Airborne Microwave Observatory of Subcanopy and Subsurface (AirMOSS) Earth Venture-1 missions. These goals specifically target the “Sensor Web Systems” and the “Data Services Management” focus areas of the current AIST program announcement. The entry technology readiness level (TRL) is 4-5 and the exit TRL is 7. The period of performance of this project is 3 years, with an anticipated start in February 2012.

 



Ramakrishna Nemani, Ames Research Center
Semi-Automatic Science Workflow Synthesis for High-End Computing on the NASA Earth Exchange

The goal of this project is to enhance the capabilities for collaborative data analysis and modeling in Earth sciences. We propose to develop a system that will enable capture and management of research in Earth sciences through semi-automated workflow generation, archiving, and management, including seamless migration of workflow execution between commodity and high-end computing resources at NASA. We propose to develop a set of components for identifying processing steps during execution of scientific codes, converting these steps into VisTrails workflow components, assisting the user in completing the workflow, and enabling workflow processing in an HEC (High-End Computing) environment. We will integrate this system with the NASA Earth Exchange (NEX), a collaboration platform for the Earth science community that provides a mechanism for scientific collaboration, knowledge and data sharing, together with direct access to almost 1 PB of Earth science data and a 10,000-core processing system.

The project will be developed in several stages, each addressing a separate challenge: process identification, automatic workflow generation, and execution of workflows within NASA's MPI-based HEC environment. We will first develop the capability to identify processes during execution of user scripts; the identified processes will then be translated into modules within the VisTrails data and provenance management software. We will also provide automatic reasoning about workflow sequencing, so that we minimize user involvement in the process. Nevertheless, we will give the user the ability to edit and modify the workflow so that it fits their needs. In order to deploy the workflows on NASA's supercomputers, we will develop a new workflow execution engine for VisTrails that will be able to interact with the MPI-based supercomputing environment.
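
The captured steps naturally form a directed acyclic graph keyed by data dependencies. As a toy illustration (the step names are invented and VisTrails itself is not shown), Python's standard-library topological sorter produces a valid execution order of the kind a workflow engine would dispatch to HEC resources.

    from graphlib import TopologicalSorter   # Python 3.9+ standard library

    # Toy "captured" workflow: each step lists the steps whose outputs it
    # reads, as a provenance recorder might infer from watching a script run.
    steps = {
        "ingest_modis": set(),
        "ingest_landsat": set(),
        "regrid": {"ingest_modis", "ingest_landsat"},
        "compute_ndvi": {"regrid"},
        "trend_analysis": {"compute_ndvi"},
    }

    # A workflow engine would dispatch each ready step to a compute resource;
    # here we simply print one valid execution order.
    for step in TopologicalSorter(steps).static_order():
        print("run:", step)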

The project is proposed under the “Data Services Management” category of this NRA. The direct benefit of this project will be to enhance the ability of research teams on NEX to collaborate more effectively and to significantly speed up their development efforts. They will be able to leverage existing data and process provenance infrastructure as well as NASA supercomputing capabilities without the additional burden of learning new tools or techniques. Scientists will be able to seamlessly migrate the execution of their codes between the NEX test/development facility and the HEC supercomputing facility. This will significantly improve the rate at which global, multidisciplinary studies can be executed end-to-end while decreasing the cost for both science and operations support teams. This study will also provide a better foundation for scaling NEX to a larger number of users while maintaining relatively fixed overhead. This development will provide early-stage benefits to next-generation Earth science missions (OCO-2, SMAP, HyspIRI, ASCENDS, SCLP) as well as improved science data analysis at later stages for a number of other missions, including NPP and LDCM.

The period of performance of the project is three years, with an estimated start date of March 1, 2012. However, the exact start date is not critical for this project and can be readily adjusted.

We estimate the entry TRL of the effort at 3, and we will deliver a system with an exit TRL of 6. The detailed TRL justification is provided in the proposal.

 



Christa Peters-Lidard, Goddard Space Flight Center
A Mission Simulation and Evaluation Platform for Terrestrial Hydrology Using the NASA Land Information System (LIS)

Objectives and benefits: The proposed investigation addresses NRA topic area 1: Advanced Data Processing. The primary focus of this work is the development of a mission simulation and evaluation platform using the NASA Land Information System (LIS) and Land surface Verification Toolkit (LVT). LIS includes several advanced subsystems that enable multi-scale ensemble land surface modeling, data assimilation, optimization, and uncertainty estimation in addition to coupled land surface and atmospheric radiative transfer, weather forecasting and end-use terrestrial hydrology application modeling environments. LVT provides a comprehensive framework for evaluating outputs from various LIS subsystems using a wide range of metrics. A key benefit of our proposed LIS-based platform is the ability to conduct enhanced observation system simulation experiments (OSSEs) to demonstrate and quantify the impact of remotely sensed observations for improving both terrestrial hydrologic science and societal applications.

The LIS data assimilation, optimization, uncertainty estimation, and application subsystems will allow us to fully exploit the information content of current (e.g., TRMM, EOS-Aqua, GRACE, Aquarius) and future (e.g., SMAP, GPM, GRACE Follow-on, GRACE-II) terrestrial observations in OSSEs that go beyond the "classic" assimilation-only OSSEs. The proposed work is relevant to several technology areas identified in the NRA, including (1) tools to manage the validation and assessment of model data inter-comparisons and (2) tools to broaden the applicability and reduce the cost of simulations (e.g., OSSEs).

Outline of proposed work and methodology: The proposed work will include the development of a new mission simulation OSSE platform by employing the extensive modeling and computational capabilities of LIS. In parallel, LVT will be further developed to process the results from OSSEs, yielding metrics needed to answer common mission simulation study questions. The first phase of the work will involve the integration and configuration of various LIS subsystems and LVT to develop the OSSE platform, advancing the TRL of the system from 2 to 3. During the second phase, the OSSE platform will be applied to OSSEs related to the SMAP and GPM missions. The demonstration of the platform for the SMAP OSSEs will advance the TRL to 4, and the subsequent application for GPM OSSEs will advance the TRL to 5. During the third phase of the project, the OSSE platform will be applied to conduct experiments for a future mission: GRACE-II. The completion of this phase will advance the TRL of the OSSE platform and LIS subsystems to 6.
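
The OSSE logic itself fits in a few lines, shown here as a toy (the dynamics, noise levels, and nudging rule are invented and bear no relation to LIS internals): a synthetic "nature run" supplies truth, a satellite-like sampling of it with added noise supplies observations, and the skill of an imperfect model is compared with and without assimilating them.

    import numpy as np

    rng = np.random.default_rng(10)
    n = 200
    truth = np.zeros(n)                  # toy "nature run" (e.g., soil moisture)
    for k in range(1, n):
        truth[k] = 0.95 * truth[k - 1] + 0.1 * rng.standard_normal()

    # Synthetic observations: sample the nature run every 3rd step, add noise.
    obs_idx = np.arange(0, n, 3)
    obs = truth[obs_idx] + 0.2 * rng.standard_normal(obs_idx.size)
    obs_map = dict(zip(obs_idx.tolist(), obs))

    def run_model(assimilate):
        """Imperfect forecast model, optionally nudged toward observations."""
        x = np.zeros(n)
        for k in range(1, n):
            x[k] = 0.90 * x[k - 1]                    # biased toy dynamics
            if assimilate and k in obs_map:
                x[k] += 0.5 * (obs_map[k] - x[k])     # simple nudging
        return x

    for flag in (False, True):
        rmse = np.sqrt(np.mean((run_model(flag) - truth) ** 2))
        print("assimilation" if flag else "open loop  ", "rmse:", round(rmse, 3))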

Period of performance: 1 Feb 2012–31 Jan 2015. Entry TRL: 2. Planned exit TRL: 6.

 



Paula Pingree, Jet Propulsion Laboratory
On-Board Processing (OBP) to Advance the PanFTS Imaging System for GEO-CAPE

The NRC Earth Science Decadal Survey recommended the GEO-CAPE mission to study changes in atmospheric composition using imaging spectrometers to acquire high-resolution spatial and spectral measurements with high temporal coverage. The simultaneous measurements of all the atmospheric trace species called out in the Decadal Survey will require instruments with high throughput, broadband coverage, and high resolution.

Imaging Fourier Transform Spectrometers (FTS), containing focal plane array detectors with high frame rates, uniquely allow these missions to effectively accomplish their science objectives, but generate raw data at rates that will challenge the downlink capabilities likely available in the timeframe of these missions. An FTS uses both axes of the FPA to retrieve spatial information, with the spectrum (in the form of an interferogram) recorded in the time domain. This leads to two important advantages for GEO-CAPE that will enhance the mission scientific return: (1) the measurement of many trace gases simultaneously and (2) the retrieval of vertical profiles rather than just total column abundances for most of the trace gases. Vertical profiles are especially important in the troposphere where comparisons with air quality models at several altitudes are required to carry out operational forecasting. The FTS frame rates, coupled with the continuous observation performed by the instrument, will produce enormous amounts of data. There is a strong need for a significant advance in the data handling technology for an imaging FTS to fully realize its inherent advantages. Our proposed on-board processing technology will establish the readiness of the Earth science information system by reducing these data volumes onboard the spacecraft. Onboard conversion of the time-domain interferograms to spectra will reduce the data set size by at least a factor of twenty and will allow downlink via the available Ka-band communication technology at a substantial reduction in cost.

We propose to implement an innovative approach to manage the high data rates (12 Gbit/s for 7 FPAs of 128 x 128 pixels each) from an imaging FTS. Each focal plane array is serviced by a pipeline of FPGAs (field programmable gate arrays), implemented both for data handling and for timing control. The FPGAs receive the sampled fringes of the metrology laser that monitors the path difference in the FTS interferometer and perform the conversion from time-domain interferograms to spectra. Our onboard processing system will take advantage of the high-performance logic, connectivity, digital signal processing, and embedded processing capabilities of the Xilinx Virtex-5 FPGA (a rad-hard version is now available).
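
The time-domain-to-spectrum conversion that drives the data reduction can be sketched in a few lines (the fringe frequencies, array length, and retained band are invented toy values; the flight implementation lives in FPGA logic, not NumPy): an FFT turns the interferogram into a spectrum, and only the spectral bins of interest are kept for downlink.

    import numpy as np

    n = 4096
    x = np.arange(n)
    # Toy interferogram: two emission lines appear as two cosine fringes.
    igram = (np.cos(2 * np.pi * 300 * x / n) +
             0.5 * np.cos(2 * np.pi * 420 * x / n))

    spectrum = np.abs(np.fft.rfft(igram))     # time domain -> spectral domain

    # Onboard, only the spectral band of interest is retained, which is
    # where the large reduction in downlinked data volume comes from.
    band = spectrum[250:500]
    print("raw samples:", n, "-> downlinked bins:", band.size)
    print("strongest bins:", np.sort(250 + np.argsort(band)[-2:]))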

During the first year of this task, we will use synthetic Earth spectra to simulate the expected FPA data and demonstrate our FTS algorithms in a software environment. We will then perform algorithm validation with 4×4 and 8×16 arrays. In the second year, we will develop the FPGA-based on-board processing subsystem, refine our algorithms for larger arrays, and evaluate the performance of the integrated subsystem in the laboratory using the 128 x 128 in-pixel FPA with the Read-Out Integrated Circuit (ROIC). This ROIC was developed under ACT-08 and will be hybridized and flown on a CubeSat in 2013 (the ESTO-funded GRIFEX task). During the third year, we will integrate our complete OBP system with the PanFTS EM spectrometer (developed under IIP-10) at the California Laboratory for Atmospheric Remote Sensing (CLARS, Mount Wilson) and demonstrate real-time production of atmospheric spectra of the LA Basin.

Our entry TRL is 3 and planned exit TRL is 5. This proposal team is uniquely experienced and qualified to meet the milestones and deliveries of this proposed effort.

 



Bo-Wen Shen, Goddard Space Flight Center
Integration of the NASA CAMVis and Multiscale Analysis Package (CAMVis-MAP) For Tropical Cyclone Climate Study

This proposal addresses the support for the Earth Science missions described in the Decadal Survey (DS) report (National Research Council, NRC, 2007), which include, but are not limited to, the Aerosol-Cloud-Ecosystems (ACE), Precipitation and All-Weather Temperature and Humidity (PATH), Soil Moisture Active-Passive (SMAP), Extended Ocean Vector Winds (XOVWM), and Three-dimensional Tropospheric winds from Space-based Lidar (3D-winds) missions. Among the scenarios in these DS missions, the advanced data processing group at the ESTO AIST PI workshop identified "Extreme Event Warning" and "Climate Projections" as two of the top priority scenarios. Previously, we (e.g., Shen et al., 2010a,b; 2011a,b) have addressed the first by successfully developing the NASA Coupled Advanced global multiscale Modeling and concurrent Visualization systems (CAMVis) on NASA supercomputers and demonstrating the great potential for extending the lead time (up to 10-20 days) for tropical cyclone (TC) prediction with improved multi-scale interactions between a TC and large-scale environmental conditions such as African Easterly Waves (AEWs) and the Madden-Julian Oscillation (MJO). To increase our confidence in long-term TC prediction, and thus TC climate projection, it is important to further examine and verify the predictive relationships between large-scale tropical waves and TC formation, namely to discover hidden predictive relationships between meteorological and climatological events within massive model and satellite data sets. For example, TC genesis processes, the accompanying downscaling (from large-scale events) and upscaling (from small-scale events) processes, and their subsequent non-linear interactions need to be analyzed. Our approach is to (1) develop a scalable Multiscale Analysis Package (MAP) that includes the NASA state-of-the-art Hilbert-Huang Transform (HHT) and improved multi-dimensional ensemble empirical mode decomposition (EEMD, e.g., Wu and Huang, 2009); (2) integrate the MAP with the models and satellite data modules of the CAMVis (CAMVis-MAP); and (3) apply the coupled system to conduct multiscale time-frequency and/or space-wavenumber analysis on long-term satellite and/or model data with the aim of studying TC climate.
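
The HHT flavor of analysis can be hinted at with a toy signal and SciPy's Hilbert transform (this shows only the final Hilbert-spectral step; the EMD/EEMD sifting that precedes it in a real HHT, and anything specific to MAP, is not shown, and all signal parameters are invented): the analytic signal yields instantaneous amplitude and frequency, recovering a slow modulating envelope of the kind large-scale waves impose on faster activity.

    import numpy as np
    from scipy.signal import hilbert

    fs = 100.0
    t = np.arange(0, 20, 1 / fs)
    # Toy multiscale signal: a slow envelope modulating a faster oscillation,
    # standing in for, e.g., an MJO-like envelope over faster variability.
    envelope = 1 + 0.5 * np.sin(2 * np.pi * 0.05 * t)
    signal = envelope * np.sin(2 * np.pi * 2.0 * t)

    analytic = hilbert(signal)
    inst_amp = np.abs(analytic)
    inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)

    print("mean instantaneous frequency (Hz):", round(float(inst_freq.mean()), 2))
    print("envelope correlation:",
          round(float(np.corrcoef(inst_amp, envelope)[0, 1]), 3))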

The potential major outcomes from our proposed tasks, which fall mainly into the category of "advanced data processing" and partially into the category of "data services management", include: (1) the development of the MAP, which will enable the integration of NASA multi-core supercomputing, visualization and HHT technologies for efficient multiscale time-frequency and/or space-wavenumber analysis of multi-year satellite data from current (e.g., TRMM) and future (e.g., SMAP) missions and high-resolution global model outputs; (2) integrative information for TC climate statistics at different temporal scales, derived from data exploitation activities with the MAP; (3) improved understanding of the impact of small-scale processes that are resolved in high-resolution model outputs or satellite data, providing feedback on the development or improvement of model dynamics, physics parameterizations and satellite retrieval algorithms; (4) a large-scale scalable MAP that can take advantage of next-generation supercomputers, enhancing its productivity in high-end computing facilities; (5) innovative visualization that can provide a simplified view of sophisticated interactive processes at multiple scales; and (6) progress in fostering interdisciplinary collaborations and experience sharing among the Earth science, computer science and computational science disciplines.

Return to Top

Simone Tanelli, Jet Propulsion Laboratory
Unified Simulator for Earth Remote Sensing (USERS)

We propose to implement the integrated Unified Simulator for Earth Remote Sensing (USERS), which will enable modeling of instruments designed for the observation of Earth's surface properties. USERS will take geophysical scenarios from models and datasets as input and produce simulated measurements from actual or hypothetical instruments and missions, together with their associated error characterization. This proposal builds upon our current AIST project ISSARS (Instrument Simulator Suite for Atmospheric Remote Sensing, ESTO/AIST'08, JPL/GSFC/LaRC, Tanelli et al. 2011), which focuses on active and passive instruments for remote sensing of the atmosphere and whose architecture was designed to facilitate the proposed expansions. USERS will preserve all current ISSARS capabilities.

We plan to augment the ISSARS architecture and add underlying models to:

O-1) accommodate a new library of state-of-the-art models for the scattering and emission properties of layered surfaces (e.g., soil moisture, vegetation, snow and ice, subsurface layers), as sketched below;

O-2) implement a direct, Automated OSSE (Observing System Simulation Experiment) Interface to enable process-to-process functionality via web services; and

O-3) expand its processing capabilities by enabling it to function in cloud computing environments.
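
As a deliberately simplified picture of what O-1's pluggable surface models imply for the simulator architecture, the Python sketch below wires a toy surface emissivity model into a toy radiometer: a scenario goes in, a simulated measurement and its error come out. Every class, name, and number here is our own illustration, not the actual USERS/ISSARS API.

```python
from dataclasses import dataclass
from typing import Optional, Protocol

import numpy as np

@dataclass
class Scene:
    """A geophysical scenario from a model or dataset (one column)."""
    soil_moisture: float   # volumetric soil moisture (m^3/m^3)
    atmos_opacity: float   # nadir atmospheric optical depth

class SurfaceModel(Protocol):
    """O-1-style plug-in: any scattering/emission model for layered surfaces."""
    def emissivity(self, scene: Scene) -> float: ...

class ToySoilModel:
    def emissivity(self, scene: Scene) -> float:
        # Toy relation: wetter soil -> lower microwave emissivity.
        return 0.95 - 0.4 * scene.soil_moisture

@dataclass
class Radiometer:
    """An actual or hypothetical instrument with a simple noise figure."""
    surface: SurfaceModel
    nedt_k: float = 1.0    # noise-equivalent delta-T (kelvin)

    def simulate(self, scene: Scene, t_surf: float = 290.0,
                 t_atm: float = 270.0,
                 rng: Optional[np.random.Generator] = None):
        """Return (simulated brightness temperature, 1-sigma error)."""
        if rng is None:
            rng = np.random.default_rng()
        trans = np.exp(-scene.atmos_opacity)   # atmospheric transmission
        e = self.surface.emissivity(scene)
        tb = trans * e * t_surf + (1.0 - trans) * t_atm
        return tb + rng.normal(0.0, self.nedt_k), self.nedt_k

# Any surface model can be swapped in without touching the instrument code.
tb, sigma = Radiometer(surface=ToySoilModel()).simulate(Scene(0.25, 0.05))
```

The design point is the separation of concerns: instruments depend only on the surface-model interface, so a new scattering model reaches every simulated mission at once.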

USERS aims to fill a current gap: there is no unified and adaptable multi-mission simulator bridging missions that focus on the atmosphere (but whose measurements are affected by surface properties) and missions that focus on the surface (but whose measurements are affected by atmospheric properties). The synergy achievable through contextual analysis of data from diverse missions is limited by the lack of a thorough, self-consistent tool capable of rapidly and accurately providing the forward simulations that are the fundamental building block of any retrieval, data assimilation, or sensitivity study needed to design new missions and assess mission capabilities. USERS will fill this need. O-1 primarily addresses the science needs of NASA missions such as Aquarius, SMAP, SCLP, SWOT, and GPM, but upon completion the new simulator suite (inclusive of O-2 and O-3) will be suitable for a wider range of NASA Decadal Survey missions (e.g., ACE, PATH, and ASCENDS) because of the inherited ISSARS capabilities. O-2 primarily addresses the need for a general OSSE framework, but the developed technologies are expected to facilitate the integration of USERS into a wider variety of tools for Earth science remote sensing data analysis and interpretation (e.g., tools for model-versus-observation comparison and model-supported retrieval algorithms).

The main value of USERS to NASA falls into the same categories as that of ISSARS (the main difference being the wider range of missions and science communities served):

1) broaden the applicability and reduce the cost of simulations and studies (e.g., OSSEs) for evaluating instruments, missions, sensor networks, and field campaigns;

2) manage the validation and assessment of model-data inter-comparisons (e.g., to more easily evaluate new algorithms and/or quantify data and product uncertainty);

3) accelerate the transition of new scattering and radiative transfer models to higher technology readiness levels; and

4) enable multi-source data fusion across models, satellites, and in situ sensors.

USERS is an Advanced Data Processing project. Upon completion, it will belong to a broader collaborative set of tools to address wider Data Management and Sensor-Web needs.

Return to Top

Wei-Kuo Tao, Goddard Space Flight Center
Empowering Cloud Resolving Models Through GPU and Asynchronous IO

A multi-scale modeling system, consisting of a cloud-system resolving model (CRM), a regional-scale model, and a coupled cloud-general circulation model with unified physics packages, has been developed at NASA Goddard. A key aspect of this system is the proper representation of aerosols, clouds, precipitation, radiation, and their interaction. It has been used to study a variety of weather and climate processes, from thunderstorms, snowstorms, and hurricanes to the indirect effect of aerosols on radiation. It has also been coupled to the Goddard Earth Satellite Simulator (Matsui et al. 2009; Masunaga et al. 2011) to evaluate the modeling system's physics packages via direct comparisons with EOS satellite measurements and to support NASA satellite missions (e.g., A-Train, TRMM, GPM, and ACE) by providing virtual satellite measurements for the development of physically based precipitation retrieval algorithms.

The kernel model in this system is the Goddard Cumulus Ensemble model (GCE), a CRM. Recently, the GCE was augmented with two-moment and spectral-bin microphysical schemes that can explicitly simulate cloud and ice particle size distributions. These schemes capture the essence of aerosol-cloud-radiation interaction, which the 2007 IPCC (Intergovernmental Panel on Climate Change) report identified as the largest uncertainty in climate modeling. However, the bin scheme requires more than 200 additional prognostic variables per grid cell compared with the conventional microphysics schemes used in CRMs and NWP models, a challenge for current computational technology.
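
To make the storage implication of that figure concrete, here is a back-of-the-envelope calculation in Python; the domain size and precision are our own hypothetical assumptions, and only the ~200-extra-variable figure comes from the text above.

```python
# Illustrative memory cost of the spectral-bin scheme's extra state.
nx, ny, nz = 512, 512, 80        # hypothetical CRM domain (grid cells)
extra_vars = 200                 # additional prognostic variables per cell
bytes_per_value = 8              # double precision

extra_bytes = nx * ny * nz * extra_vars * bytes_per_value
print(f"extra prognostic state: {extra_bytes / 2**30:.1f} GiB")  # ~31.2 GiB
```

Roughly 31 GiB of additional state per time level for even a modest domain, before any halo exchanges or output buffers, which is why both the computation and the I/O paths need acceleration.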

The GCE run with bin microphysics requires significant computation as well as I/O (input/output). Tests have shown that computational time increases by a factor of 226 and I/O time by a factor of 200 compared with the base scheme. Although massively parallel computing at the NCCS and NAS has already improved bin model performance significantly, additional technology is required to fully apply the scheme to weather and climate studies, especially those involving the effects of aerosols on weather and climate.

In this proposal, we will (1) port the computationally intensive components of the GCE (i.e., the microphysics and long- and short-wave radiation) to graphics processing units (GPUs) to accelerate their performance, (2) develop an asynchronous I/O tool that offloads output data from compute nodes to reduce the idle time of computing processors, and (3) develop a data compression mechanism to further empower the asynchronous I/O tool in (2). The accelerated GCE will greatly broaden not only its own scientific applications but also those of the Weather Research and Forecasting Model (WRF) and the Goddard Multi-scale Modeling Framework (MMF; Tao et al. 2009) via their shared physics packages, breaking the bottleneck that currently prevents large-domain, long-term integrations. In addition, the proposed compression-enhanced asynchronous output tool will be usable in other output-intensive applications.
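
Items (2) and (3) together describe a producer-consumer pattern: the compute loop hands each output field to a background writer that compresses and writes while computation continues. The Python sketch below is our own minimal illustration of that pattern (the actual GCE tool is presumably Fortran/MPI-based, likely with dedicated I/O server processes); the file name, framing format, and class name are invented.

```python
import queue
import threading
import zlib

import numpy as np

class AsyncCompressedWriter:
    """Offload output from the compute loop to a background writer thread."""

    def __init__(self, path, max_pending=4):
        self._q = queue.Queue(maxsize=max_pending)  # bounds buffered memory
        self._f = open(path, "wb")
        self._thread = threading.Thread(target=self._drain, daemon=True)
        self._thread.start()

    def _drain(self):
        while True:
            array = self._q.get()
            if array is None:                       # sentinel: shut down
                break
            payload = zlib.compress(array.tobytes(), level=1)  # item (3)
            self._f.write(len(payload).to_bytes(8, "little"))  # record framing
            self._f.write(payload)

    def write(self, array):
        # Copy so the compute loop can reuse its buffer immediately.
        self._q.put(np.ascontiguousarray(array).copy())

    def close(self):
        self._q.put(None)
        self._thread.join()
        self._f.close()

# Usage: the compute loop blocks on disk only if the queue fills up.
writer = AsyncCompressedWriter("gce_output.bin")
state = np.zeros((128, 128, 40))
for step in range(10):
    state += 0.1                                    # stand-in for a model time step
    writer.write(state)
writer.close()
```

The bounded queue is the key design choice: it caps how far output can fall behind computation, trading a little potential stalling for predictable memory use on the compute nodes.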

This task will have a performance period of three years. During this time, we plan to take the Technology Readiness Level from an entry level of 2 (concept) to an exit level of 7 (system prototype in an operational setting). The rapid development is possible because much of the core infrastructure currently exists. We also have access to the GPU hardware hosted by the NCCS.