Session Title | Speaker | Type | Materials | Year
---|---|---|---|---
Probabilistic Data Synthesis to Provide a Defensible Risk Assessment for Army Munition | Kevin Singer | Breakout | | 2019

Abstract: Military-grade energetics are, by design, required to operate under extreme conditions. As such, warheads in a munition must demonstrate a high level of structural integrity to ensure safe and reliable operation by the Warfighter. In this example, which involved an artillery munition, a systematic, analytics-driven approach synthesized physical test results with probabilistic analysis, non-destructive evaluation, modeling and simulation, and comprehensive risk analysis tools to determine the probability of a catastrophic event. Once severity, probability of detection, and occurrence were synthesized, a model was built to determine the risk of a catastrophic event during firing, accounting for defect growth resulting from rough handling. This comprehensive analysis provided a defensible, credible, and dynamic snapshot of risk while allowing for a transparent assessment of each input's contribution to risk through sensitivity analyses. This paper illustrates the intersection of product safety, reliability, systems-safety policy, and analytics, and highlights the impact of a holistic, multidisciplinary approach. The benefits of this rigorous assessment included quantifying risk to the user, supporting effective decision-making, improving the resultant safety and reliability of the munition, and supporting triage and prioritization of future Non-Destructive Evaluation (NDE) screening efforts by identifying at-risk subpopulations.

Decentralized Signal Processing and Distributed Control for Collaborative Autonomous Sensor Networks | Ryan Goldhahn | Breakout | | 2019

Abstract: Collaborative autonomous sensor networks have recently been used in many applications including inspection, law enforcement, search and rescue, and national security. They offer scalable, low-cost solutions which are robust to the loss of multiple sensors in hostile or dangerous environments. While often comprised of less capable sensors, the performance of a large network can approach the performance of far more capable and expensive platforms if nodes are effectively coordinating their sensing actions and data processing. This talk will summarize work to date at LLNL on distributed signal processing and decentralized optimization algorithms for collaborative autonomous sensor networks, focusing on ADMM-based solutions for detection/estimation problems and sequential greedy optimization solutions which maximize submodular functions, e.g., mutual information.

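As a generic illustration of the sequential greedy approach mentioned in the abstract (and not LLNL's algorithm), the sketch below greedily selects sensors to maximize a log-determinant objective, a standard monotone submodular stand-in for the mutual information criterion; the synthetic covariance matrix and budget are placeholders.

```python
import numpy as np

def greedy_sensor_selection(cov, budget):
    """Greedily add the sensor with the largest marginal gain in 0.5*logdet
    of the selected sensors' covariance block (a submodular objective)."""
    n = cov.shape[0]
    selected = []
    for _ in range(budget):
        current = 0.5 * np.linalg.slogdet(cov[np.ix_(selected, selected)])[1] if selected else 0.0
        best_gain, best_j = -np.inf, None
        for j in range(n):
            if j in selected:
                continue
            trial = selected + [j]
            gain = 0.5 * np.linalg.slogdet(cov[np.ix_(trial, trial)])[1] - current
            if gain > best_gain:
                best_gain, best_j = gain, j
        selected.append(best_j)
    return selected

rng = np.random.default_rng(0)
A = rng.normal(size=(12, 12))
cov = A @ A.T + 12 * np.eye(12)   # synthetic positive-definite sensor covariance
print(greedy_sensor_selection(cov, budget=4))
```
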
Design and Analysis of Experiments for Europa Clipper’s Science Sensitivity Model | Amy Braverman | Breakout | | 2019

Abstract: The Europa Clipper Science Sensitivity Model (SSM) can be thought of as a graph in which the nodes are mission requirements at ten levels in a hierarchy, and edges represent how requirements at one level of the hierarchy depend on those at lower levels. At the top of the hierarchy, there are ten nodes representing ten Level 1 science requirements for the mission. At the bottom of the hierarchy, there are 100 or so nodes representing instrument-specific science requirements. In between, nodes represent intermediate science requirements with complex interdependencies. Meeting, or failing to meet, bottom-level requirements depends on the frequency of faults and the lengths of recovery times on the nine Europa Clipper instruments and the spacecraft. Our task was to design and analyze the results of a Monte Carlo experiment to estimate the probabilities of meeting the Level 1 science requirements based on parameters of the distributions of time between failures and of recovery times. We simulated an ensemble of synthetic missions in which failures and recoveries were random realizations from those distributions. The pass-fail status of the bottom-level instrument-specific requirements was propagated up the graph for each of the synthetic missions. Aggregating over the collection of synthetic missions produced estimates of the pass-fail probabilities for the Level 1 requirements. We constructed a definitive screening design and supplemented it with additional space-filling runs, using JMP 14 software. Finally, we used the vectors of failure and recovery parameters as predictors and the pass-fail probabilities of the high-level requirements as responses, and built statistical models to predict the latter from the former. In this talk, we will describe the design considerations and review the fitted models and their implications for mission success.

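A minimal sketch of the Monte Carlo idea described above is shown below; the three-instrument requirement graph, the all-children-must-pass aggregation rule, and the fault/recovery distributions and outage threshold are invented placeholders, not the actual SSM.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy requirement graph: one top-level requirement depends on two intermediate
# requirements, which depend on instrument-level (bottom) requirements.
graph = {
    "L1": ["mid_A", "mid_B"],
    "mid_A": ["inst_1", "inst_2"],
    "mid_B": ["inst_2", "inst_3"],
}
instruments = ["inst_1", "inst_2", "inst_3"]

def passes(node, leaf_status):
    if node in leaf_status:                                   # bottom-level requirement
        return leaf_status[node]
    return all(passes(child, leaf_status) for child in graph[node])

def simulate_mission(mtbf_days, mean_recovery_days, mission_days=1200, max_outage=30):
    """One synthetic mission: a bottom-level requirement fails if its
    instrument's cumulative outage exceeds a placeholder threshold."""
    status = {}
    for inst in instruments:
        n_faults = rng.poisson(mission_days / mtbf_days)
        outage = rng.exponential(mean_recovery_days, size=n_faults).sum()
        status[inst] = outage <= max_outage
    return passes("L1", status)

runs = [simulate_mission(mtbf_days=200, mean_recovery_days=5) for _ in range(5000)]
print("Estimated P(L1 requirement met):", np.mean(runs))
```
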
Categorical Data Analysis | Christopher Franck, Virginia Tech | Short Course | Materials | 2019

Abstract: Categorical data is abundant in the 21st century, and its analysis is vital to advance research across many domains. Thus, data-analytic techniques that are tailored for categorical data are an essential part of the practitioner’s toolset. The purpose of this short course is to help attendees develop and sharpen their abilities with these tools. Topics covered in this short course will include logistic regression, ordinal regression, and classification, and methods to assess the predictive accuracy of these approaches will be discussed. Data will be analyzed using the R software package, and course content will loosely follow Alan Agresti’s excellent textbook An Introduction to Categorical Data Analysis, Third Edition.

Tutorial: Learning Python and Julia | Douglas Hodson, Associate Professor, Air Force Institute of Technology | Tutorial | | 2019

Abstract: In recent years, the programming language Python, with its supporting ecosystem, has established itself as a significant capability to support the activities of the typical data scientist. Recently, version 1.0 of the programming language Julia was released; from a software engineering perspective, it can be viewed as a modern alternative. This tutorial presents both Python and Julia from both a user and a developer point of view. From a user’s point of view, the basic syntax of each is presented, along with fundamental prerequisite knowledge. From a developer’s point of view, the underlying infrastructure of the programming language / interpreter / compiler is discussed.

Demystifying the Black Box: A Test Strategy for Autonomy | Dan Porter | Breakout | | 2019

Abstract: Systems with autonomy are beginning to permeate civilian, industrial, and military sectors. Though these technologies have the potential to revolutionize our world, they also bring a host of new challenges in evaluating whether these tools are safe, effective, and reliable. The Institute for Defense Analyses is developing methodologies to enable testing systems that can, to some extent, think for themselves. In this talk, we share how we think about this problem and how this framing can help you develop a test strategy for your own domain.

Tuesday Keynote | David Chu, President, Institute for Defense Analyses | Keynote | | 2019

Bio: David Chu serves as President of the Institute for Defense Analyses. IDA is a non-profit corporation operating in the public interest. Its three federally funded research and development centers provide objective analyses of national security issues and related national challenges, particularly those requiring extraordinary scientific and technical expertise. As president, Dr. Chu directs the activities of more than 1,000 scientists and technologists. Together, they conduct and support research requested by federal agencies involved in advancing national security and advising on science and technology issues. Dr. Chu served in the Department of Defense as Under Secretary of Defense for Personnel and Readiness from 2001-2009, and earlier as Assistant Secretary of Defense and Director for Program Analysis and Evaluation from 1981-1993. From 1978-1981 he was the Assistant Director of the Congressional Budget Office for National Security and International Affairs. Dr. Chu served in the U.S. Army from 1968-1970. He was an economist with the RAND Corporation from 1970-1978, director of RAND’s Washington Office from 1994-1998, and vice president for its Army Research Division from 1998-2001. He earned a bachelor of arts in economics and mathematics, and his doctorate in economics, from Yale University. Dr. Chu is a member of the Defense Science Board and a Fellow of the National Academy of Public Administration. He is a recipient of the Department of Defense Medal for Distinguished Public Service with Gold Palm, the Department of Veterans Affairs Meritorious Service Award, the Department of the Army Distinguished Civilian Service Award, the Department of the Navy Distinguished Public Service Award, and the National Academy of Public Administration’s National Public Service Award.

A Statistical Approach for Uncertainty Quantification with Missing Data | Mark Andrews | Breakout | | 2019

Abstract: Uncertainty quantification (UQ) has emerged as the science of quantitative characterization and reduction of uncertainties in simulation and testing. Stretching across applied mathematics, statistics, and engineering, UQ is a multidisciplinary field with broad applications. A popular UQ method to analyze the effects of input variability and uncertainty on system responses is generalized Polynomial Chaos Expansion (gPCE). This method was developed using applied mathematics and does not require knowledge of a simulation’s physics. Thus, gPCE may be used across disparate industries and is applicable to both individual component and system-level simulations. The gPCE method can encounter problems when any of the input configurations fail to produce valid simulation results. gPCE requires that results be collected on a sparse grid Design of Experiments (DOE), which is generated based on probability distributions of the input variables. A failure to run the simulation at any one input configuration can result in a large decrease in the accuracy of a gPCE. In practice, simulation data sets with missing values are common because simulations regularly yield invalid results due to physical restrictions or numerical instability. We propose a statistical approach to mitigating the cost of missing values; it yields accurate UQ results even when simulation failures would otherwise make gPCE methods unreliable. The proposed approach addresses the missing data problem by introducing an iterative machine learning algorithm, which allows gPCE modeling to handle missing values in the sparse grid DOE. The study will demonstrate the convergence characteristics of the methodology in reaching steady-state values for the missing points using a series of simulations and numerical results. Remarks about the convergence rate and the advantages and feasibility of the proposed methodology will be provided. Several examples are used to demonstrate the proposed framework and its utility, including a secondary air system example from the jet engine industry and several non-linear test functions. This is based on joint work with Dr. Mark Andrews at SmartUQ.

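The abstract does not publish the algorithm itself, so the sketch below is only a generic illustration of iteratively imputing failed runs so that a surrogate can be fit on the full design; a polynomial ridge model stands in for the gPCE expansion, and the random design points stand in for a sparse grid.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(60, 3))                 # input configurations (placeholder DOE)
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.5 * X[:, 2]   # "simulation" output
missing = rng.choice(60, size=8, replace=False)      # runs that failed
y_obs = y.copy()
y_obs[missing] = np.nan

y_fill = y_obs.copy()
y_fill[missing] = np.nanmean(y_obs)                  # initial guess for the missing outputs
for _ in range(50):
    surrogate = make_pipeline(PolynomialFeatures(degree=3), Ridge(alpha=1e-3))
    surrogate.fit(X, y_fill)                         # fit on the full design, imputed points included
    new_vals = surrogate.predict(X[missing])
    if np.max(np.abs(new_vals - y_fill[missing])) < 1e-6:
        break                                        # imputed values have reached steady state
    y_fill[missing] = new_vals
print("max imputation error:", np.max(np.abs(y_fill[missing] - y[missing])))
```
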
Using Bayesian Neural Networks for Uncertainty Quantification of Hyperspectral Image Target Detection | Daniel Ries | Breakout | | 2019

Abstract: Target detection in hyperspectral images (HSI) has broad value in defense applications, and neural networks have recently begun to be applied to this problem. A common criticism of neural networks is that they give a point estimate with no uncertainty quantification (UQ). In defense applications, UQ is imperative because the cost of a false positive or negative is high. Users desire high confidence in either “target” or “not target” predictions, and if high confidence cannot be achieved, more inspection is warranted. One possible solution is Bayesian neural networks (BNNs). Compared to traditional neural networks, which are constructed by choosing a loss function, BNNs take a probabilistic approach and place a likelihood function on the data and prior distributions on all parameters (weights and biases), which in turn implies a loss function. Training results in posterior predictive distributions, from which prediction intervals can be computed, rather than only point estimates. Heatmaps show where and how much uncertainty there is at any location and give insight into the physical area being imaged as well as possible improvements to the model. Using the PyTorch and Pyro software packages, we test a BNN on a simulated HSI scene produced using the Rochester Institute of Technology (RIT) Digital Imaging and Remote Sensing Image Generation (DIRSIG) model. The scene geometry, a detailed representation of a suburban neighborhood near Rochester, NY, named “MegaScene,” was also developed by RIT. Target panels were inserted for this effort, using paint reflectance and bi-directional reflectance distribution function (BRDF) data acquired from the Nonconventional Exploitation Factors Database System (NEFDS). The target panels range in size from large to subpixel, with some targets only partially visible. Multiple renderings of this scene are created under different times of day and with different atmospheric conditions to assess model generalization. We explore the uncertainty heatmap for different times and environments on MegaScene as well as individual target predictive distributions to gain insight into the power of BNNs.

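A minimal sketch of the BNN workflow described above, written with PyTorch and Pyro, follows; the network size, priors, spectra, and labels are placeholders, and this is not the authors' model.

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.nn import PyroModule, PyroSample
from pyro.infer import SVI, Trace_ELBO, Predictive
from pyro.infer.autoguide import AutoDiagonalNormal
from pyro.optim import Adam

class BayesianDetector(PyroModule):
    """Tiny Bayesian classifier: Normal priors on all weights and biases."""
    def __init__(self, n_bands, n_hidden=16):
        super().__init__()
        self.fc1 = PyroModule[torch.nn.Linear](n_bands, n_hidden)
        self.fc1.weight = PyroSample(dist.Normal(0., 1.).expand([n_hidden, n_bands]).to_event(2))
        self.fc1.bias = PyroSample(dist.Normal(0., 1.).expand([n_hidden]).to_event(1))
        self.fc2 = PyroModule[torch.nn.Linear](n_hidden, 1)
        self.fc2.weight = PyroSample(dist.Normal(0., 1.).expand([1, n_hidden]).to_event(2))
        self.fc2.bias = PyroSample(dist.Normal(0., 1.).expand([1]).to_event(1))

    def forward(self, x, y=None):
        logits = self.fc2(torch.relu(self.fc1(x))).squeeze(-1)
        with pyro.plate("pixels", x.shape[0]):
            pyro.sample("obs", dist.Bernoulli(logits=logits), obs=y)  # likelihood on labels
        return logits

# Placeholder data: 200 "pixels" with 50 spectral bands and 0/1 target labels.
x = torch.randn(200, 50)
y = (x[:, 0] > 0.5).float()

model = BayesianDetector(n_bands=50)
guide = AutoDiagonalNormal(model)                      # variational approximation to the posterior
svi = SVI(model, guide, Adam({"lr": 1e-2}), loss=Trace_ELBO())
for step in range(2000):
    svi.step(x, y)

# Posterior predictive target probabilities; their spread drives an uncertainty heatmap.
predictive = Predictive(model, guide=guide, num_samples=200, return_sites=["_RETURN"])
probs = torch.sigmoid(predictive(x)["_RETURN"])        # shape: (draws, pixels)
mean, std = probs.mean(0), probs.std(0)
```
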
An Overview of Uncertainty-Tolerant Decision Support Modeling for Cybersecurity | Samrat Chatterjee | Breakout | | 2019

Abstract: Cyber system defenders face the challenging task of continually protecting critical assets and information from a variety of malicious attackers. Defenders typically function within resource constraints, while attackers operate at relatively low costs. As a result, design and development of resilient cyber systems that support mission goals under attack, while accounting for the dynamics between attackers and defenders, is an important research problem. This talk will highlight decision support modeling challenges under uncertainty within non-cooperative cybersecurity settings. Multiple attacker-defender game formulations under uncertainty are discussed with steps for further research.

Wednesday Lunchtime Keynote Speaker | Jared Freeman, Chief Scientist of Aptima and Chair of the Human Systems Division, National Defense Industrial Association | Keynote | | 2019

Bio: Jared Freeman, Ph.D., is Chief Scientist of Aptima and Chair of the Human Systems Division of the National Defense Industrial Association. His research and publications address measurement, assessment, and enhancement of human learning, cognition, and performance in technologically complex military environments.

Multivariate Data Analysis | Doug Steinley, University of Missouri | Short Course | Materials | 2019

Abstract: In this one-day workshop, we will explore five techniques that are commonly used to model human behavior: principal component analysis, factor analysis, cluster analysis, mixture modeling, and multidimensional scaling. Brief discussions of the theory of each method will be provided, along with some examples showing how the techniques work and how the results are interpreted in practice. Accompanying R-code will be provided so attendees are able to implement these methods on their own.

Human in the Loop Experiment Series Evaluating Synthetic Vision Displays for Enhanced Airplane State Awareness | Kathryn Ballard | Breakout | | 2019

Abstract: Recent data from Boeing’s Statistical Summary of Commercial Jet Airplane Accidents shows that Loss of Control – In Flight (LOC-I) is the leading cause of fatalities in commercial aviation accidents worldwide. The Commercial Aviation Safety Team (CAST), a joint government and industry effort tasked with reducing the rate of fatal accidents, requested that the National Aeronautics and Space Administration (NASA) conduct research on virtual day-visual meteorological conditions displays, such as synthetic vision, in order to combat LOC-I. NASA recently concluded a series of experiments using commercial pilots from various backgrounds to evaluate synthetic vision displays. This presentation will focus on the two most recent experiments: one conducted with the Navy’s Disorientation Research Device and one completed at NASA Langley Research Center that utilized the Microsoft HoloLens to display synthetic vision. Statistical analysis was done on aircraft performance data, pilot inputs, and a range of subjective questionnaires to assess the efficacy of the displays.

Satellite Affordability in LEO (SAL) | Matthew Avery | Breakout | | 2019

Abstract: The Satellite Affordability in LEO (SAL) model identifies the cheapest constellation capable of providing a desired level of performance within certain constraints. SAL achieves this using a combination of analytical models, statistical emulators, and geometric relationships. SAL is flexible and modular, allowing users to customize certain components while retaining default behavior in other cases. This is desirable if users wish to consider an alternative cost formulation or different types of payload. Uses for SAL include examining cost tradeoffs with respect to factors like constellation size and desired performance level, evaluating the sensitivity of constellation costs to different assumptions about cost behavior, and providing a first-pass look at what proliferated smallsats might be capable of. At this point, SAL is limited to Walker constellations with sun-synchronous, polar orbits.

Thursday Keynote Speaker II | Michael Little, Program Manager, Advanced Information Systems Technology, Earth Science Technology Office, NASA Headquarters | Keynote | | 2019

Bio: Over the past 45 years, Mike’s primary focus has been on the management of research and development, concentrating on making the results more useful in meeting the needs of the user community. Since 1984, he has specialized in communications, data, and processing systems, including projects in NASA, the US Air Force, the FAA, and the Census Bureau. Before that, he worked on Major System Acquisition Programs in the Department of Defense, including Marine Corps combat vehicles and US Navy submarines. Currently, Mike manages a comprehensive program to provide NASA’s Earth Science research efforts with the information technologies it will need in the 2020-2035 time frame to characterize, model, and understand the Earth. This program addresses the full data lifecycle, from generating data using instruments and models, through the management of the data, to the ways in which information technology can help to exploit the data. Of particular interest today are the ways in which NASA can measure and understand transient and transitional phenomena and the impact of climate change. The AIST Program focuses on the application of applied math and statistics, artificial intelligence, case-based reasoning, machine learning, and automation to improve our ability to use observational data and model output in understanding Earth’s physical processes and natural phenomena. Training and other skills: application of cloud computing; US Government computer security; US Navy nuclear propulsion operations and maintenance on two submarines.

Thursday Lunchtime Keynote Speaker | T. Charles Clancy, Bradley Professor of Electrical and Computer Engineering, Virginia Tech | Keynote | | 2019

Bio: Charles Clancy is the Bradley Professor of Electrical and Computer Engineering at Virginia Tech, where he serves as the Executive Director of the Hume Center for National Security and Technology. Clancy leads a range of strategic programs at Virginia Tech related to security, including the Commonwealth Cyber Initiative. Prior to joining VT in 2010, Clancy was an engineering leader in the National Security Agency, leading research programs in digital communications and signal processing. He received his PhD from the University of Maryland, MS from the University of Illinois, and BS from the Rose-Hulman Institute of Technology. He is co-author of over 200 peer-reviewed academic publications, six books, and over twenty patents, and co-founder of five venture-backed startup companies.

Constructing Designs for Fault Location | Erin Lanus | Breakout | | 2019

Abstract: While fault testing a system with many factors, each appearing at some number of levels, it may not be possible to test all combinations of factor levels. Most faults are caused by interactions of only a few factors, so testing interactions up to size t will often find all faults in the system without executing an exhaustive test suite. Call an assignment of levels to t of the factors a t-way interaction. A covering array is a collection of tests that ensures that every t-way interaction is covered by at least one test in the test suite. Locating arrays extend covering arrays with the additional feature that they not only indicate the presence of faults but locate the faulty interactions when there are no more than d faults in the system. If an array is (d, t)-locating, for every pair of sets of t-way interactions of size d, the interactions do not appear in exactly the same tests. This ensures that the faulty interactions can be differentiated from non-faulty interactions by the results of some test in which interactions from one set or the other, but not both, are tested. When the property holds for t-way interaction sets of size up to d, the notation (d, t̄) is used. In addition to fault location, locating arrays have also been used to identify significant effects in screening experiments. Locating arrays are fairly new, and few techniques have been explored for their construction. Most of the available work is limited to finding only one fault (d = 1). Known general methods require a covering array of strength t + d and produce many more tests than are needed. In this talk, we present Partitioned Search with Column Resampling (PSCR), a computational search algorithm to verify whether an array is (d, t̄)-locating by partitioning the search space to decrease the number of comparisons. If a candidate array is not locating, random resampling is performed until a locating array is constructed or an iteration limit is reached. Algorithmic parameters determine which factor columns to resample and when to add additional tests to the candidate array. We use a 5 × 5 × 3 × 2 × 2 full factorial design to analyze the performance of the algorithmic parameters and provide guidance on how to tune parameters to prioritize speed, accuracy, or a combination of both. Last, we compare our results to the number of tests in locating arrays constructed for the factors and levels of real-world systems produced by other methods.

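To make the covering-array definition concrete, the short sketch below checks whether every t-way interaction is covered by at least one test; the example array and factor levels are illustrative only, and checking the locating property (which compares sets of interactions) is more involved and not shown.

```python
from itertools import combinations, product

def is_covering_array(tests, levels, t):
    """Return True if every t-way interaction (an assignment of levels to t factors)
    appears in at least one test. `tests` is a list of tuples; `levels` gives the
    number of levels for each factor."""
    k = len(levels)
    for factors in combinations(range(k), t):
        needed = set(product(*(range(levels[f]) for f in factors)))
        seen = {tuple(test[f] for f in factors) for test in tests}
        if needed - seen:
            return False        # some t-way interaction is never tested
    return True

# Small 2-way example: three binary factors covered by four tests.
tests = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(is_covering_array(tests, levels=[2, 2, 2], t=2))   # True
```
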
Software Reliability and Security Assessment: Automation and Frameworks | Lance Fiondella | Breakout | | 2019

Abstract: Software reliability models enable several quantitative predictions such as the number of faults remaining, failure rate, and reliability (probability of failure-free operation for a specified period of time in a specified environment). This talk will describe recent efforts in collaboration with NASA, including (1) the development of an automated script for the SFRAT (Software Failure and Reliability Assessment Tool) to streamline application of software reliability methods to ongoing programs, (2) application to a NASA program, (3) lessons learned, and (4) future directions for model and tool development to support the practical needs of software reliability and security assessment frameworks.

SLS Structural Dynamics Sensor Optimization Study | Ken Toro & Jon Stallrich | Breakout | | 2019

Abstract: A crucial step in the design and development of a flight vehicle, such as NASA’s Space Launch System (SLS), is understanding its vibration behavior while in flight. Vehicle designers rely on low-cost finite element analysis (FEA) to predict the vibration behavior of the vehicle. During ground and flight tests, sensors are strategically placed at predefined locations that contribute the most vibration information under the assumption that FEA is accurate, producing points to validate the FEA models. This collaborative work focused on developing optimal sensor placement algorithms to validate FEA models against test data and to characterize the vehicle’s vibration characteristics.

Statistical Methods for Modeling and Simulation Verification and Validation | Dr. Jim Simpson, Dr. Jim Wisnowski, and Dr. Stargel Doane (JK Analytics LLC, Adsurgo LLC, Analytical Arts LLC) | Short Course | Materials | 2019

Abstract: Statistical Methods for Modeling and Simulation Verification and Validation is a 1-day tutorial in applied statistical methods for the planning, designing and analysis of simulation experiments and live test events for the purposes of verifying and validating models and simulations. The course covers the fundamentals of verification and validation of models and simulations, as it is currently practiced and as is suggested for future applications. The first session is largely an introduction to modeling and simulation concepts, verification and validation policies, along with a basic introduction to data visualization and statistical methods. Session 2 covers the essentials of experiment design for simulations and live test events. The final session focuses on analysis techniques appropriate for designed experiments and tests, as well as observational data, for the express purpose of simulation validation. We look forward to your participation in this course.

Statistical Engineering and M&S in the Design and Development of DoD Systems | Doug Ray & Melissa Jablonski | Breakout | | 2019

Abstract: This presentation will use a notional armament system case study to illustrate the use of M&S DOE, surrogate modeling, sensitivity analysis, multi-objective optimization, and model calibration during early lifecycle development and design activities in the context of a new armament system. In addition to focusing on the statistician’s, data scientist’s, or analyst’s role and the key statistical techniques in engineering DoD systems, this presentation will also emphasize the non-statistical / engineering domain-specific aspects of a multidisciplinary design and development process which makes use of these statistical approaches at the subcomponent and subsystem level as well as in end-to-end system modeling. A statistical engineering methodology which emphasizes the use of ‘virtual’ DOE-based model emulators developed at the subsystem level and integrated using a systems-engineering architecture framework can yield a more tractable engineering problem compared to traditional ‘design-build-test-fix’ cycles or direct simulation of computationally expensive models. This supports a more informed prototype design for physical experimentation while providing a greater variety of materiel solutions, thereby reducing development and testing cycles and the time to field complex systems.

Statistical Process Control and Capability Study on the Water Content Measurements in NASA Glenn’s Icing Research Tunnel | Emily Timko | Breakout | | 2019

Abstract: The Icing Research Tunnel (IRT) at NASA Glenn Research Center follows the recommended practice for icing tunnel calibration outlined in SAE’s ARP5905 document. The calibration team has followed a schedule of a full calibration every five years, with a check calibration every six months thereafter. The liquid water content of the IRT has remained stable within the specifications presented to customers, namely that the variation is within +/- 10% of the calibrated target measurement. Following recent measurement and instrumentation errors, a more thorough assessment of error sources was desired. Constructing statistical process control charts provided the ability to determine how the instrument varies over the short, mid, and long term. The control charts offer a view of instrument error, facility error, or installation changes. A shift from the target to the mean baseline was discovered, leading to a study of the overall capability indices of the liquid water content measuring instrument against the specifications defined for the IRT. This presentation describes data processing procedures for the Multi-Element Sensor in the IRT, including collision efficiency corrections, canonical correlation analysis, Chauvenet’s criterion for rejection of data, distribution checks of the data, and the mean, median, and mode used to construct control charts. Further data are presented to describe the repeatability of the IRT with the Multi-Element Sensor and the ability to maintain a stable process over the defined calibration schedule.

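As a rough illustration of the control-chart and capability calculations behind a study like this (not the IRT's actual data processing), the sketch below computes individuals-chart limits and Cp/Cpk against the +/- 10% specification using simulated liquid water content values.

```python
import numpy as np

# Placeholder LWC measurements in g/m^3; target and +/-10% spec limits follow the abstract.
rng = np.random.default_rng(2)
target = 1.0
lwc = rng.normal(loc=0.98, scale=0.02, size=50)
usl, lsl = 1.1 * target, 0.9 * target

# Individuals / moving-range control limits (standard I-MR constants, d2 = 1.128 for n = 2).
mr = np.abs(np.diff(lwc))
sigma_hat = mr.mean() / 1.128
center = lwc.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

# Capability indices against the specification.
cp = (usl - lsl) / (6 * sigma_hat)
cpk = min(usl - center, center - lsl) / (3 * sigma_hat)
print(f"I-chart limits: [{lcl:.3f}, {ucl:.3f}]  Cp = {cp:.2f}  Cpk = {cpk:.2f}")
```
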
Thursday Keynote Speaker I | Wendy Martinez, Director, Mathematical Statistics Research Center, Bureau of Labor Statistics; ASA President-Elect (2020) | Keynote | Materials | 2019

Bio: Wendy Martinez has been serving as the Director of the Mathematical Statistics Research Center at the Bureau of Labor Statistics (BLS) for six years. Prior to this, she served in several research positions throughout the Department of Defense. She held the position of Science and Technology Program Officer at the Office of Naval Research, where she established a research portfolio comprised of academia and industry performers developing data science products for the future Navy and Marine Corps. Her areas of interest include computational statistics, exploratory data analysis, and text data mining. She is the lead author of three books on MATLAB and statistics. Dr. Martinez was elected as a Fellow of the American Statistical Association (ASA) in 2006 and is an elected member of the International Statistical Institute. She was honored by the American Statistical Association when she received the ASA Founders Award at the JSM 2017 conference. Wendy is also proud and grateful to have been elected as the 2020 ASA President.

Exploring Problems in Shipboard Air Defense with Modeling | Ralph Donnelly & Benjamin Ashwell | Breakout | | 2019

Abstract: One of the primary roles of navy surface combatants is defending high-value units against attack by anti-ship cruise missiles (ASCMs). They accomplish this either by launching their interceptor missiles and shooting the ASCMs down with rapid-firing guns (hard kill), or through the use of deceptive jamming, decoys, or other non-kinetic means (soft kill) to defeat the threat. The wide range of hostile ASCM capabilities and the different properties of friendly defenses, combined with the short time-scale for defeating these ASCMs, makes this a difficult problem to study. IDA recently completed a study focusing on the extent to which friendly forces were vulnerable to massed ASCM attacks, and possible avenues for improvement. To do this we created a pair of complementary models with the combined flexibility to explore a wide range of questions. The first model employed a set of closed-form equations, and the second a time-dependent Monte Carlo simulation. This presentation discusses the thought processes behind the models and their relative strengths and weaknesses.

Accelerating Uncertainty Quantification for Complex Computational Models | James Warner | Breakout | | 2019

Abstract: Scientific computing has undergone extraordinary growth in sophistication in recent years, enabling the simulation of a wide range of complex multiphysics and multiscale phenomena. Along with this increase in computational capability is the growing recognition that uncertainty quantification (UQ) must go hand-in-hand with numerical simulation in order to generate meaningful and reliable predictions for engineering applications. If not rigorously considered, uncertainties due to manufacturing defects, material variability, modeling assumptions, etc. can cause a substantial disconnect between simulation and reality. Packaging these complex computational models within a UQ framework, however, can be a significant challenge due to the need to repeatedly evaluate the model when even a single evaluation is time-consuming. This talk discusses efforts at NASA Langley Research Center (LaRC) to enable rapid UQ for problems with expensive computational models. Under the High Performance Computing Incubator (HPCI) program at LaRC, several open-source software libraries are being developed and released to provide access to general-purpose, state-of-the-art UQ algorithms. The common denominator of these methods is that they all expose parallelism among the model evaluations needed for UQ and, as such, are implemented to leverage HPC resources when available to achieve tremendous computational speedup. While the methods and software presented are broadly applicable, they will be demonstrated in the context of applications that have particular interest at NASA, including structural health management and trajectory simulation.

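As a minimal illustration of the parallelism the abstract describes (not LaRC's libraries), the sketch below distributes Monte Carlo model evaluations across processes and summarizes the output statistics; the model function and sample sizes are placeholders.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def expensive_model(params):
    """Stand-in for a costly simulation (e.g., a structural or trajectory code)."""
    x, y = params
    return np.sin(x) * np.cos(y) + 0.1 * x * y

def main():
    rng = np.random.default_rng(3)
    samples = rng.normal(size=(10_000, 2))          # sampled uncertain inputs
    with ProcessPoolExecutor() as pool:             # model evaluations run in parallel
        outputs = np.fromiter(pool.map(expensive_model, samples, chunksize=500), dtype=float)
    print("mean:", outputs.mean(), "std:", outputs.std(ddof=1))

if __name__ == "__main__":
    main()
```
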