Session Title | Speaker | Type | Materials | Year |
---|---|---|---|---|
Breakout NASA’s Human Exploration Research Analog (HERA): An analog mission for isolation, confinement, and remote conditions in space exploration scenarios (Abstract)
Shelley Cazares served as a crewmember of the 14th mission of NASA’s Human Exploration Research Analog (HERA). In August 2017, Dr. Cazares and her three crewmates were enclosed in an approximately 600-sq. ft. simulated spacecraft for an anticipated 45 days of confined isolation at Johnson Space Center, Houston, TX. In preparation for long-duration missions to Mars in the 2030s and beyond, NASA seeks to understand what types of diets, habitats, and activities can keep astronauts healthy and happy on deep space voyages. To collect this information, NASA is conducting several analog missions simulating the conditions astronauts face in space. HERA is a set of experiments to investigate the effects of isolation, confinement, and remote conditions in space exploration scenarios. Dr. Cazares will discuss the application procedure, the pre-mission training process, the life and times inside the habitat during the mission, and her crew’s emergency evacuation from the habitat due to the risk of rising floodwaters in Hurricane Harvey. |
Shelley Cazares NASA |
Breakout | 2018 |
|
Breakout An Overview of Uncertainty-Tolerant Decision Support Modeling for Cybersecurity (Abstract)
Cyber system defenders face the challenging task of continually protecting critical assets and information from a variety of malicious attackers. Defenders typically function within resource constraints, while attackers operate at relatively low costs. As a result, design and development of resilient cyber systems that support mission goals under attack, while accounting for the dynamics between attackers and defenders, is an important research problem. This talk will highlight decision support modeling challenges under uncertainty within non-cooperative cybersecurity settings. Multiple attacker-defender game formulations under uncertainty are discussed with steps for further research. |
Samrat Chatterjee | Breakout | 2019 |
|
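The entry above describes attacker-defender game formulations only at a conceptual level. As one deliberately simplified illustration (not the speaker's models), the sketch below solves a small zero-sum defender-vs-attacker matrix game for the defender's optimal mixed strategy via linear programming. The strategy labels and payoff numbers are hypothetical.

```python
"""Minimal zero-sum attacker-defender game solved by linear programming (illustrative only)."""
import numpy as np
from scipy.optimize import linprog

# Rows: defender strategies, columns: attacker strategies.
# Entries: defender payoff (e.g., mission value preserved) -- hypothetical numbers.
payoff = np.array([
    [3.0, 1.0, 0.5],   # harden perimeter
    [1.5, 2.5, 1.0],   # monitor internal traffic
    [0.5, 1.0, 2.0],   # deploy deception assets
])
m, n = payoff.shape

# Variables: defender mixed strategy x (length m) and game value v.
# Maximize v  <=>  minimize -v, subject to payoff^T x >= v for every attacker column.
c = np.concatenate([np.zeros(m), [-1.0]])
A_ub = np.hstack([-payoff.T, np.ones((n, 1))])   # v - sum_i x_i * payoff[i, j] <= 0
b_ub = np.zeros(n)
A_eq = np.concatenate([np.ones(m), [0.0]]).reshape(1, -1)   # strategy weights sum to 1
b_eq = np.array([1.0])
bounds = [(0, None)] * m + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
x, v = res.x[:m], res.x[m]
print("Defender mixed strategy:", np.round(x, 3), " game value:", round(v, 3))
```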
Contributed Optimizing for Mission Success in Highly Uncertain Scenarios (Abstract)
Optimization under uncertainty increases the complexity of a problem as well as the computing resources required to solve it. As the amount of uncertainty is increased, these difficulties are exacerbated. However, when optimizing for mission-level objectives, rather than component- or system-level objectives, an increase in uncertainty is inevitable. Previous research has found methods to perform optimization under uncertainty, such as robust design optimization or reliability-based design optimization. These are generally executed at a product component quality level, to minimize variability and stay within design tolerances, but they are not tailored to capture the high variability of a mission-level problem. In this presentation, an approach for formulating and solving highly stochastic mission-level optimization problems is described. A case study is shown using an unmanned aerial system (UAS) on a search mission while an “enemy” UAS attempts to interfere. This simulation, modeled in the Unity Game Engine, has highly stochastic outputs, where the time to mission success varies by multiple orders of magnitude, but the ultimate goal is a binary output representing mission success or failure. The results demonstrate the capabilities and challenges of optimization in these types of mission scenarios. |
Brian Chell | Contributed | 2018 |
|
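To make the structure of the problem in the entry above concrete, here is a minimal sketch of optimizing a design variable against a Monte Carlo estimate of the probability of mission success. The `simulate_mission` stub is a made-up stand-in for the expensive Unity-based simulation; all numbers and the design variable are hypothetical.

```python
"""Sketch of mission-level optimization under uncertainty (stand-in simulation, not the talk's model)."""
import numpy as np

rng = np.random.default_rng(0)

def simulate_mission(search_speed: float) -> bool:
    """Binary mission outcome with heavy-tailed time-to-detection (hypothetical model)."""
    time_to_detect = rng.lognormal(mean=3.0 - 0.5 * search_speed, sigma=1.5)
    fuel_limit = rng.normal(loc=40.0, scale=5.0)
    return time_to_detect < fuel_limit  # success if the target is found before fuel runs out

def estimate_success_probability(search_speed: float, n_reps: int = 2000) -> float:
    return float(np.mean([simulate_mission(search_speed) for _ in range(n_reps)]))

# Coarse search over the design variable; a real study would use a DOE plus a surrogate model.
candidates = np.linspace(0.5, 4.0, 8)
estimates = [estimate_success_probability(s) for s in candidates]
best = candidates[int(np.argmax(estimates))]
print("Estimated P(success) by design:", np.round(estimates, 3))
print("Best candidate search speed:", best)
```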
Breakout Testing and Estimation in Sequential High-Dimension Data (Abstract)
Many modern processes generate complex data records not readily analyzed by traditional techniques. For example, a single observation from a process might be a radar signal consisting of n pairs of bivariate data described via some functional relation between reflection and direction. Methods are examined here for detecting changes in such sequences from some known or estimated nominal state. Additionally, estimates of the degree of change (scale, location, frequency, etc.) are desirable and discussed. The proposed methods are designed to take advantage of all available data in a sequence. This can become unwieldy for long sequences of large-sized observations, so dimension reduction techniques are needed. In order for these methods to be as widely applicable as possible, we make limited distributional assumptions and so we propose new nonparametric and Bayesian tools to implement these estimators. |
Eric Chicken Florida State University |
Breakout | Materials | 2017 |
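The entry above describes monitoring sequences of high-dimensional (functional) observations after dimension reduction. The sketch below is one generic pattern in that spirit, not the authors' method: reduce each curve to a few principal-component scores and flag a change when a distance statistic exceeds a threshold calibrated nonparametrically from nominal data. All data are simulated.

```python
"""Sketch: change detection for a stream of curves via PC scores and an empirical threshold."""
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 200)

def curve(shift=0.0):
    return np.sin(2 * np.pi * t) + shift * t + rng.normal(0, 0.1, t.size)

nominal = np.array([curve() for _ in range(100)])                                  # in-control sample
stream = [curve() for _ in range(30)] + [curve(shift=0.8) for _ in range(30)]      # change at obs 30

pca = PCA(n_components=3).fit(nominal)
scores = pca.transform(nominal)
center, scale = scores.mean(axis=0), scores.std(axis=0)

def statistic(x):
    z = (pca.transform(x.reshape(1, -1))[0] - center) / scale
    return float(np.sum(z ** 2))

# Nonparametric threshold: empirical 99th percentile of the nominal statistic.
threshold = np.quantile([statistic(x) for x in nominal], 0.99)

for i, x in enumerate(stream):
    if statistic(x) > threshold:
        print(f"Change signaled at observation {i}")
        break
```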
Breakout Introduction of Uncertainty Quantification and Industry Challenges (Abstract)
Uncertainty is an inescapable reality that can be found in nearly all types of engineering analyses. It arises from sources like measurement inaccuracies, material properties, boundary and initial conditions, and modeling approximations. For example, the increasing use of numerical simulation models throughout industry promises improved design and insight at significantly lower costs and shorter timeframes than purely physical testing. However, the addition of numerical modeling has also introduced complexity and uncertainty to the process of generating actionable results. It has become not only possible, but vital to include Uncertainty Quantification (UQ) in engineering analysis. The competitive benefits of UQ include reduced development time and cost, improved designs, better understanding of risk, and quantifiable confidence in analysis results and engineering decisions. Unfortunately, there are significant cultural and technical challenges which prevent organizations from utilizing UQ methods and techniques in their engineering practice. This presentation will introduce UQ methodology and discuss the past and present strategies for addressing these challenges, making it possible to use UQ to enhance engineering processes with fewer resources and in more situations. Looking to the future, anticipated challenges will be discussed along with an outline of the path towards making UQ a common practice in engineering. |
Peter Chien | Breakout | 2018 |
|
Contributed Bayesian Calibration and Uncertainty Analysis: A Case Study Using a 2-D CFD Turbulence Model (Abstract)
The growing use of simulations in the engineering design process promises to reduce the need for extensive physical testing, decreasing both development time and cost. However, as mathematician and statistician George E. P. Box said, “Essentially, all models are wrong, but some are useful.” There are many factors that determine simulation or, more broadly, model accuracy. These factors can be condensed into noise, bias, parameter uncertainty, and model form uncertainty. To counter these effects and ensure that models faithfully match reality to the extent required, simulation models must be calibrated to physical measurements. Further, the models must be validated, and their accuracy must be quantified before they can be relied on in lieu of physical testing. Bayesian calibration provides a solution for both requirements: it optimizes tuning of model parameters to improve simulation accuracy, and estimates any remaining discrepancy, which is useful for model diagnosis and validation. Also, because model discrepancy is assumed to exist in this framework, it enables robust calibration even for inaccurate models. In this paper, we present a case study to investigate the potential benefits of using Bayesian calibration, sensitivity analyses, and Monte Carlo analyses for model improvement and validation. We will calibrate a 7-parameter k-𝜎 CFD turbulence model simulated in COMSOL Multiphysics®. The model predicts coefficient of lift and drag for an airfoil defined using a 6049-series airfoil parameterization from the National Advisory Committee for Aeronautics (NACA). We will calibrate model predictions using publicly available wind tunnel data from the University of Illinois Urbana-Champaign’s (UIUC) database. Bayesian model calibration requires intensive sampling of the simulation model to determine the most likely distribution of calibration parameters, which can be a large computational burden. We greatly reduce this burden by following a surrogate modeling approach, using Gaussian process emulators to mimic the CFD simulation. We train the emulator by sampling the simulation space using a Latin Hypercube Design (LHD) as the Design of Experiments (DOE), and assess the accuracy of the emulator using leave-one-out Cross Validation (CV) error. The Bayesian calibration framework involves calculating the discrepancy between simulation results and physical test results. We also use Gaussian process emulators to model this discrepancy. The discrepancy emulator will be used as a tool for model validation; characteristic trends in residual errors after calibration can indicate underlying model form errors which were not addressed via tuning the model calibration parameters. In this way, we will separate and quantify model form uncertainty and parameter uncertainty. The results of a Bayesian calibration include a posterior distribution of calibration parameter values. These distributions will be sampled using Monte Carlo methods to generate model predictions, whereby new predictions have a distribution of values which reflects the uncertainty in the tuned calibration parameter. The resulting output distributions will be compared against physical data and the uncalibrated model to assess the effects of the calibration and discrepancy model. We will also perform global, variance-based sensitivity analysis on the uncalibrated and calibrated models, and investigate any changes in the sensitivity indices from uncalibrated to calibrated. |
Peter Chien | Contributed | 2018 |
|
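The abstract above walks through a surrogate-based Bayesian calibration workflow. The sketch below shows that structure at toy scale: a cheap analytic function stands in for the CFD model, a Gaussian-process emulator is trained on a space-filling sample, and a grid-based posterior is computed for one calibration parameter. The discrepancy term is omitted, and all data are synthetic; this is not the authors' implementation.

```python
"""Toy surrogate-based Bayesian calibration: emulate, observe, compute a posterior on one parameter."""
import numpy as np
from scipy.stats import qmc, norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(2)

def simulator(x, theta):
    """Stand-in for the expensive simulation: response at setting x with calibration parameter theta."""
    return np.sin(3 * x) + theta * x ** 2

# 1. Space-filling design over (x, theta) and emulator training.
design = qmc.LatinHypercube(d=2, seed=2).random(60)
x_d, theta_d = design[:, 0], 0.5 + design[:, 1]            # theta ranges over [0.5, 1.5]
y_d = simulator(x_d, theta_d)
gp = GaussianProcessRegressor(ConstantKernel() * RBF([0.2, 0.3]), alpha=1e-6, normalize_y=True)
gp.fit(np.column_stack([x_d, theta_d]), y_d)

# 2. Synthetic "field" data generated at an unknown true parameter value.
theta_true, noise_sd = 1.1, 0.05
x_obs = np.linspace(0.05, 0.95, 10)
y_obs = simulator(x_obs, theta_true) + rng.normal(0, noise_sd, x_obs.size)

# 3. Grid-based posterior for theta using the emulator mean in the likelihood (flat prior).
theta_grid = np.linspace(0.5, 1.5, 201)
log_post = []
for th in theta_grid:
    mu = gp.predict(np.column_stack([x_obs, np.full_like(x_obs, th)]))
    log_post.append(norm.logpdf(y_obs, loc=mu, scale=noise_sd).sum())
log_post = np.asarray(log_post)
post = np.exp(log_post - np.max(log_post))
post /= np.trapz(post, theta_grid)
print("Posterior mean of theta:", round(float(np.trapz(theta_grid * post, theta_grid)), 3))
```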
Short Course Survey Construction and Analysis (Abstract)
In this course, we introduce the main concepts of the survey methodology process – from survey sampling design to analyzing the data obtained from complex survey designs. The course topics include: 1. Introduction to the Survey Process 2. R Tools 3. Sampling Designs – Simple Random Sampling, Cluster Sampling, Stratified Sampling, and more 4. Weighting and Variance Estimation 5. Exploratory Data Analysis 6. Complex Survey Analysis We use a combination of lectures and hands-on exercises using R. Students are expected to have R and associated packages installed on their computers. We will send a list of required packages before the course. We also use data from Department of Defense surveys, where appropriate. |
MoonJung Cho Bureau of Labor Statistics |
Short Course | Materials | 2018 |
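The course above covers sampling designs, weighting, and variance estimation in R. As a language-agnostic illustration of the weighting idea only (this sketch is Python, not the course's R materials), the example below computes a design-weighted mean and a stratified variance estimate from a stratified simple random sample of a simulated population.

```python
"""Sketch: design-based estimation from a stratified simple random sample (simulated population)."""
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical population split into three strata with different means.
strata = {"A": rng.normal(50, 5, 5000), "B": rng.normal(60, 8, 3000), "C": rng.normal(75, 6, 2000)}
n_sample = {"A": 100, "B": 80, "C": 60}

est, var = 0.0, 0.0
N_total = sum(len(v) for v in strata.values())
for h, pop in strata.items():
    N_h, n_h = len(pop), n_sample[h]
    sample = rng.choice(pop, size=n_h, replace=False)
    w_h = N_h / N_total                                               # stratum share of the population
    est += w_h * sample.mean()                                        # design-weighted mean
    var += (w_h ** 2) * (1 - n_h / N_h) * sample.var(ddof=1) / n_h    # stratified variance with fpc

print(f"Estimated population mean: {est:.2f}  (SE {np.sqrt(var):.2f})")
print(f"True population mean:      {np.concatenate(list(strata.values())).mean():.2f}")
```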
Breakout Machine Learning Prediction With Streamed Sensor Data: Fitting Neural Networks using Functional Principal Components (Abstract)
Sensors that record sequences of measurements are now embedded in many products from wearable exercise watches to chemical and semiconductor manufacturing equipment. There is information in the shapes of the sensor stream curves that is highly predictive of a variety of outcomes such as the likelihood of a product failure event or batch yield. Despite this data now being common and readily available, it is often being used either inefficiently or not at all due to lack of knowledge and tools for how to properly leverage it. In this presentation, we will propose fitting splines to sensor streams and extracting features called functional principal component scores that offer a highly efficient low dimensional compression of the signal data. Then, we use these features as inputs into machine learning models like neural networks and LASSO regression models. Once one sees sensor data in this light, answering a wide variety of applied questions becomes a straightforward two stage process of data cleanup/functional feature extraction followed by modeling using those features as inputs. |
Chris Gotwalt | Breakout |
| 2019 |
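The entry above describes a two-stage workflow: compress each sensor stream into functional principal component scores, then feed the scores into a machine-learning model. In the sketch below, ordinary PCA on densely sampled curves stands in for spline-based FPCA (the talk uses splines and JMP), and all data are simulated.

```python
"""Sketch of the two-stage idea: curve -> low-dimensional scores -> supervised model."""
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 300)

def stream(failure: bool):
    """Simulated sensor trace; failing runs drift upward late in the batch."""
    drift = 1.5 * t ** 3 if failure else 0.0
    return np.sin(4 * np.pi * t) + drift + rng.normal(0, 0.15, t.size)

labels = rng.integers(0, 2, size=400).astype(bool)
curves = np.array([stream(f) for f in labels])

# Stage 1: functional feature extraction (PCA scores as a stand-in for FPC scores).
scores = PCA(n_components=5).fit_transform(curves)

# Stage 2: supervised model on the low-dimensional scores.
X_tr, X_te, y_tr, y_te = train_test_split(scores, labels, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("Held-out accuracy predicting failure events:", round(clf.score(X_te, y_te), 3))
```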
Breakout Initial Investigation into the Psychoacoustic Properties of Small Unmanned Aerial System Noise (Abstract)
For the past several years, researchers at NASA Langley have been engaged in a series of projects to study the degree to which existing facilities and capabilities, originally created for work on full-scale aircraft, are extensible to smaller scales – those of the small unmanned aerial systems (sUAS, also UAVs and, colloquially, ‘drones’) that have been showing up in the nation’s airspace. This paper follows an effort that has led to an initial human-subject psychoacoustic test regarding the annoyance generated by sUAS noise. This effort spans three phases: 1. the collection of the sounds through field recordings, 2. the formulation and execution of a psychoacoustic test using those recordings, 3. the analysis of the data from that test. The data suggest a lack of parity between the noise of the recorded sUAS and that of a set of road vehicles that were also recorded and included in the test, as measured by a set of contemporary noise metrics. |
Andrew Christian Structural Acoustics Branch |
Breakout | Materials | 2018 |
Breakout Screening Designs for Resource Constrained Deterministic M&S Experiments: A Munitions Case Study (Abstract)
In applications where modeling and simulation runs are quick and cheap, space-filling designs will give the tester all the information they need to make decisions about their system. In some applications, however, this luxury does not exist, and each M&S run can be time consuming and expensive. In these scenarios, a sequential test approach provides an efficient solution, where an initial screening is conducted, followed by an augmentation to fit specified models of interest. Until this point, no dedicated screening designs existed for UQ applications in resource-constrained situations. The Army’s frequent exposure to this type of situation sparked a collaboration between Picatinny’s Statistical Methods and Analysis group and Professor V. Roshan Joseph of Georgia Tech, in which a new type of UQ screening design was created. This paper provides a brief introduction to the design, its intended use, and a case study in which this new methodology was applied. |
Christopher Drake | Breakout |
| 2019 |
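The new screening design referenced in the entry above is not reproduced here. Purely to illustrate the generic screening-then-augmentation pattern it belongs to, the sketch below builds a small Latin hypercube screening stage and then augments it with points chosen by a greedy maximin rule; the dimensions and run counts are arbitrary.

```python
"""Sketch of a sequential screening-then-augmentation workflow for a deterministic M&S experiment."""
import numpy as np
from scipy.stats import qmc

d = 4                                     # number of M&S input factors (hypothetical)

# Stage 1: small Latin hypercube screening design.
stage1 = qmc.LatinHypercube(d=d, seed=5).random(8)

# Stage 2: augment by greedily picking candidates far from all existing runs (maximin).
candidates = qmc.LatinHypercube(d=d, seed=6).random(200)
design = list(stage1)
for _ in range(8):
    dists = np.array([min(np.linalg.norm(c - p) for p in design) for c in candidates])
    design.append(candidates[int(np.argmax(dists))])

design = np.array(design)
print("Total runs:", len(design))
print("Minimum pairwise distance:",
      round(min(np.linalg.norm(a - b) for i, a in enumerate(design) for b in design[i + 1:]), 3))
```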
Keynote Tuesday Keynote |
David Chu President Institute for Defense Analyses (bio)
David Chu serves as President of the Institute for Defense Analyses. IDA is a non-profit corporation operating in the public interest. Its three federally funded research and development centers provide objective analyses of national security issues and related national challenges, particularly those requiring extraordinary scientific and technical expertise. As president, Dr. Chu directs the activities of more than 1,000 scientists and technologists. Together, they conduct and support research requested by federal agencies involved in advancing national security and advising on science and technology issues. Dr. Chu served in the Department of Defense as Under Secretary of Defense for Personnel and Readiness from 2001-2009, and earlier as Assistant Secretary of Defense and Director for Program Analysis and Evaluation from 1981-1993. From 1978-1981 he was the Assistant Director of the Congressional Budget Office for National Security and International Affairs. Dr. Chu served in the U. S. Army from 1968-1970. He was an economist with the RAND Corporation from 1970-1978, director of RAND’s Washington Office from 1994-1998, and vice president for its Army Research Division from 1998-2001. He earned a bachelor of arts in economics and mathematics, and his doctorate in economics, from Yale University. Dr. Chu is a member of the Defense Science Board and a Fellow of the National Academy of Public Administration. He is a recipient of the Department of Defense Medal for Distinguished Public Service with Gold Palm, the Department of Veterans Affairs Meritorious Service Award, the Department of the Army Distinguished Civilian Service Award, the Department of the Navy Distinguished Public Service Award, and the National Academy of Public Administration’s National Public Service Award. |
Keynote | 2019 |
|
Keynote Opening Keynote |
David Chu President IDA (bio)
David Chu serves as President of the Institute for Defense Analyses. IDA is a non-profit corporation operating in the public interest. Its three federally funded research and development centers provide objective analyses of national security issues and related national challenges, particularly those requiring extraordinary scientific and technical expertise. As president, Dr. Chu directs the activities of more than 1,000 scientists and technologists. Together, they conduct and support research requested by federal agencies involved in advancing national security and advising on science and technology issues. Dr. Chu served in the Department of Defense as Under Secretary of Defense for Personnel and Readiness from 2001-2009, and earlier as Assistant Secretary of Defense and Director for Program Analysis and Evaluation from 1981-1993. From 1978-1981 he was the Assistant Director of the Congressional Budget Office for National Security and International Affairs. Dr. Chu served in the U. S. Army from 1968-1970. He was an economist with the RAND Corporation from 1970-1978, director of RAND’s Washington Office from 1994-1998, and vice president for its Army Research Division from 1998-2001. He earned a bachelor of arts in economics and mathematics, and his doctorate in economics, from Yale University. Dr. Chu is a member of the Defense Science Board and a Fellow of the National Academy of Public Administration. He is a recipient of the Department of Defense Medal for Distinguished Public Service with Gold Palm, the Department of Veterans Affairs Meritorious Service Award, the Department of the Army Distinguished Civilian Service Award, the Department of the Navy Distinguished Public Service Award, and the National Academy of Public Administration’s National Public Service Award. |
Keynote |
| 2018 |
Keynote Thursday Lunchtime Keynote Speaker |
T. Charles Clancy Bradley Professor of Electrical and Computer Engineering Virginia Tech (bio)
Charles Clancy is the Bradley Professor of Electrical and Computer Engineering at Virginia Tech, where he serves as the Executive Director of the Hume Center for National Security and Technology. Clancy leads a range of strategic programs at Virginia Tech related to security, including the Commonwealth Cyber Initiative. Prior to joining VT in 2010, Clancy was an engineering leader in the National Security Agency, leading research programs in digital communications and signal processing. He received his PhD from the University of Maryland, MS from the University of Illinois, and BS from the Rose-Hulman Institute of Technology. He is co-author of over 200 peer-reviewed academic publications, six books, and over twenty patents, and co-founder of five venture-backed startup companies. |
Keynote |
| 2019 |
Breakout Asparagus is the most articulate vegetable ever (Abstract)
During the summer of 2001, Microsoft launched Windows XP, which was lauded by many users as the most reliable and usable operating system at the time. Miami Herald columnist Dave Barry responded to this praise by stating that “this is like saying asparagus is the most articulate vegetable ever.” Whether you agree or disagree with Dave Barry, these users’ reactions are relative (to other operating systems and to other past and future versions). This is due to an array of technological factors that have facilitated human-system improvements. Automation is often cited as improving human-system performance across many domains. It is true that when the human and automation are aligned, performance improves. But what about the times when this is not the case? This presentation will describe the myths and facts about human-system performance and increasing levels of automation through examples of human-system R&D conducted on a satellite ground system. Factors that affect human-system performance and a method to characterize mission performance as it relates to increasing levels of automation will also be discussed. |
Kerstan Cole | Breakout | 2018 |
|
Breakout Case Studies for Statistical Engineering Applied to Powered Rotorcraft Wind-Tunnel Tests (Abstract)
Co-Authors: Sean A. Commo, Ph.D., P.E. and Peter A. Parker, Ph.D., P.E. NASA Langley Research Center, Hampton, Virginia, USA Austin D. Overmeyer, Philip E. Tanner, and Preston B. Martin, Ph.D. U.S. Army Research, Development, and Engineering Command, Hampton, Virginia, USA. The application of statistical engineering to helicopter wind-tunnel testing was explored during two powered rotor entries. The U.S. Army Aviation Development Directorate Joint Research Program Office and the NASA Revolutionary Vertical Lift Project performed these tests jointly at the NASA Langley Research Center. Both entries were conducted in the 14- by 22-Foot Subsonic Tunnel with a small segment of the overall tests devoted to developing case studies of a statistical engineering approach. Data collected during each entry were used to estimate response surface models characterizing vehicle performance, a novel contribution of statistical engineering applied to powered rotor-wing testing. Additionally, a 16- to 47-times reduction in the number of data points required was estimated when comparing a statistically-engineered approach to a conventional one-factor-at-a-time approach. |
Sean Commo NASA |
Breakout | 2016 |
|
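The case study above centers on estimating response surface models from designed wind-tunnel runs. As a generic, purely illustrative example of that modeling step (the factors, response, and data below are simulated, not from the rotorcraft tests), the sketch fits a second-order response surface to a small face-centered grid.

```python
"""Sketch: fitting a second-order response surface model to data from a small designed experiment."""
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(6)

# Hypothetical coded factors (e.g., collective pitch and advance ratio), each in [-1, 1].
X = np.array([[a, b] for a in (-1, 0, 1) for b in (-1, 0, 1)], dtype=float)
y = (5 + 2 * X[:, 0] - 1.5 * X[:, 1] + 0.8 * X[:, 0] * X[:, 1] - 1.2 * X[:, 0] ** 2
     + rng.normal(0, 0.1, len(X)))                                   # simulated "performance" response

quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), y)
for name, coef in zip(quad.get_feature_names_out(["pitch", "mu"]), model.coef_):
    print(f"{name:>12s}: {coef: .3f}")
```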
Breakout Structured Decision Making (Abstract)
Difficult choices are often required in a decision-making process where resources and budgets are increasingly constrained. This talk demonstrates a structured decision-making approach using layered Pareto fronts to prioritize the allocation of funds between munitions stockpiles based on their estimated reliability, the urgency of needing available units, and the consequences if adequate numbers of units are not available. This case study illustrates the process of first identifying appropriate metrics that summarize important dimensions of the decision, and then eliminating non-contenders from further consideration in an objective stage. The final subjective stage incorporates subject matter expert priorities to select the four stockpiles to receive additional maintenance and surveillance funds, based on understanding the trade-offs and robustness to various user priorities. |
Christine Anderson-Cook LANL |
Breakout | Materials | 2017 |
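The objective stage described above peels non-dominated options off successive Pareto fronts before any subjective weighting is applied. The sketch below shows that layering step on a made-up set of stockpiles and metrics (the names and values are hypothetical, not from the case study).

```python
"""Sketch of layered Pareto fronts: keep non-dominated options, then peel successive layers."""
import numpy as np

# Columns (all oriented so larger is better): reliability, urgency, consequence mitigation.
options = {
    "Stockpile 1": [0.92, 0.40, 0.70],
    "Stockpile 2": [0.85, 0.90, 0.60],
    "Stockpile 3": [0.80, 0.30, 0.50],
    "Stockpile 4": [0.95, 0.20, 0.65],
    "Stockpile 5": [0.70, 0.85, 0.90],
}

def pareto_front(items):
    """Return the names of non-dominated options (larger is better on every metric)."""
    front = []
    for name, v in items.items():
        dominated = any(
            all(np.asarray(w) >= np.asarray(v)) and any(np.asarray(w) > np.asarray(v))
            for other, w in items.items() if other != name
        )
        if not dominated:
            front.append(name)
    return front

remaining, layer = dict(options), 1
while remaining:
    front = pareto_front(remaining)
    print(f"Layer {layer}: {front}")
    for name in front:
        remaining.pop(name)
    layer += 1
```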
Breakout B-52 Radar Modernization Test Design Considerations (Abstract)
Inherent system processes, restrictions on collection, or cost may impact the practical execution of an operational test. This study presents the use of blocking and split-plot designs when complete randomization is not feasible in operational test. Specifically, the USAF B-52 Radar Modernization Program test design is used to present tradeoffs of different design choices and the impacts of those choices on cost, operational relevance, and analytical rigor. |
Stuart Corbett AFOTEC |
Breakout | Materials | 2018 |
Breakout How do the Framework and Design of Experiments Fundamentally Help? (Abstract)
The Military Global Positioning System (GPS) User Equipment (MGUE) program is the user segment of the GPS Enterprise—a program in the Deputy Assistant Secretary of Defense for Developmental Test and Evaluation (DASD(DT&E)) Space and Missile Defense Systems portfolio. The MGUE program develops and tests GPS cards capable of using Military-Code (M Code) and legacy signals. The program’s DT&E strategy is challenging. The GPS cards provide new, untested capabilities. Milestone A was approved in 2012, with sole-source contracts released to three vendors for Increment 1. An Acquisition Decision Memorandum directs the program to support a Congressional Mandate to provide GPS M Code-capable equipment for use after FY17. Increment 1 provides GPS receiver form factors for the ground domain interface as well as for the aviation and maritime domain interface. When reviewing the DASD(DT&E) Milestone B (MS B) Assessment Report, Mr. Kendall expressed curiosity about how the Developmental Evaluation Framework (DEF) and Design of Experiments (DOE) help. This presentation describes how the DEF and DOE methods helped produce more informative and more economical developmental tests than were originally under consideration by the test community—decision-quality information with a 60% reduction in test cycle time. It provides insight into how the integration of the DEF and DOE improved the overall effectiveness of the DT&E strategy, illustrates the role of modeling and simulation (M&S) in the test design process, provides examples of experiment designs for different functional and performance areas, and illustrates the logic involved in balancing risks and test resources. The DEF and DOE methods enable the DT&E strategy to fully exploit early discovery, to maximize verification and validation opportunities, and to characterize system behavior across the technical requirements space. |
Luis A. Cortes | Breakout | 2017 |
|
Breakout Cases of Second-Order Split-Plot Designs (Abstract)
The fundamental principles of experiment design are factorization, replication, randomization, and local control of error. In many industries, however, departure from these principles is commonplace. Often in our experiments complete randomization is not feasible, because the factor level settings are hard, impractical, or inconvenient to change, or because the resources available to execute under homogeneous conditions are limited. These restrictions in randomization lead to split-plot experiments. We are also often interested in fitting second-order models, leading to second-order split-plot experiments. Although response surface methodology has grown tremendously since 1951, alternatives for second-order split-plots remain largely unexplored. The literature and textbooks offer limited examples and provide guidelines that often are too general. This deficit of information leaves practitioners ill prepared to face the many roadblocks associated with these types of designs. This presentation provides practical strategies to help practitioners deal with second-order split-plot and, by extension, split-split-plot experiments, including an innovative approach for the construction of a response surface design referred to as the second-order sub-array Cartesian product split-plot design. This new type of design, which is an alternative to other classes of split-plot designs currently in use in defense and industrial applications, is economical, has low prediction variance of the regression coefficients, and has low aliasing between model terms. Based on an assessment using well-accepted key design evaluation criteria, second-order sub-array Cartesian product split-plot designs perform as well as historical designs that have been considered standards up to this point. |
Luis Cortes MITRE |
Breakout | Materials | 2018 |
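The sub-array Cartesian product construction described above is not reproduced here. To show only the analysis side of a second-order split-plot experiment (restricted randomization modeled through a whole-plot random effect), the sketch below fits a quadratic model with REML on simulated data; the factor names and design are hypothetical.

```python
"""Sketch: REML analysis of a simulated second-order split-plot experiment (statsmodels MixedLM)."""
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

rows = []
for wp_id, z in enumerate([-1, -1, 0, 0, 1, 1]):        # hard-to-change factor, one level per whole plot
    wp_error = rng.normal(0, 0.5)                        # whole-plot random error
    for x in (-1, 0, 1):                                 # easy-to-change subplot factor
        y = 10 + 2 * z + 1.5 * x - 0.8 * x ** 2 + 0.5 * z * x + wp_error + rng.normal(0, 0.2)
        rows.append({"whole_plot": wp_id, "z": z, "x": x, "y": y})
data = pd.DataFrame(rows)

# Second-order fixed effects; whole plots enter as a random intercept (split-plot error structure).
model = smf.mixedlm("y ~ z + x + I(x**2) + I(z**2) + z:x", data, groups=data["whole_plot"])
fit = model.fit(reml=True)
print(fit.summary())
```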
Breakout Uncertainty Quantification and Sensitivity Analysis Methodology for AJEM (Abstract)
The Advanced Joint Effectiveness Model (AJEM) is a joint forces model developed by the U.S. Army that is used in vulnerability and lethality (V/L) predictions for threat/target interactions. This complex model primarily generates a probability response for various components, scenarios, loss of capabilities, or summary conditions. Sensitivity analysis (SA) and uncertainty quantification (UQ), referred to jointly as SA/UQ, are disciplines that provide the working space for how model estimates change with respect to changes in input variables. A comparative measure that characterizes the effect of an input change on the predicted outcome was developed and is reviewed and illustrated in this presentation. This measure provides a practical context that stakeholders can better understand and utilize. We show graphical and tabular results using this measure. |
Craig Andres Mathematical Statistician U.S. Army CCDC Data & Analysis Center (bio)
Craig Andres is a Mathematical Statistician at the recently formed DEVCOM Data & Analysis Center in the Materiel M&S Branch, working primarily on the uncertainty quantification, as well as the verification and validation, of the AJEM vulnerability model. He is currently on developmental assignment with the Capabilities Projection Team. He has a master’s degree in Applied Statistics from Oakland University and a master’s degree in Mathematics from Western Michigan University. |
Breakout | 2021 |
|
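AJEM and its actual comparative measure are not public in the abstract above, so the sketch below uses a generic logistic stand-in for a probability-valued model and reports, for each input, the change in predicted probability under a fixed perturbation. The inputs, coefficients, and perturbation sizes are all hypothetical; only the one-at-a-time "effect of an input change" idea is illustrated.

```python
"""Sketch: one-at-a-time sensitivity summary for a probability-valued model (stand-in, not AJEM)."""
import numpy as np

def probability_model(x):
    """Hypothetical model returning a probability of loss of capability."""
    impact_velocity, obliquity, armor_thickness = x
    score = 0.004 * impact_velocity - 0.02 * obliquity - 0.15 * armor_thickness
    return 1.0 / (1.0 + np.exp(-score))

baseline = np.array([800.0, 30.0, 25.0])        # hypothetical nominal inputs
perturbation = np.array([50.0, 5.0, 2.0])       # hypothetical "meaningful change" per input
names = ["impact_velocity", "obliquity", "armor_thickness"]

p0 = probability_model(baseline)
print(f"Baseline probability: {p0:.3f}")
for i, name in enumerate(names):
    x = baseline.copy()
    x[i] += perturbation[i]
    print(f"{name:>16s}: change in probability = {probability_model(x) - p0:+.3f}")
```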
Breakout Multivariate Density Estimation and Data-enclosing Sets Using Sliced-Normal Distributions (Abstract)
This talk focuses on a means to characterize the variability in multivariate data. This characterization, given in terms of both probability distributions and closed sets, is instrumental in assessing and improving the robustness/reliability properties of system designs. To this end, we propose the Sliced-Normal (SN) class of distributions. The versatility of SNs enables modeling complex multivariate dependencies with minimal modeling effort. A polynomial mapping is defined which injects the physical space into a higher dimensional (so-called) feature space on which a suitable normal distribution is defined. Optimization-based strategies for the estimation of SNs from data in both physical and feature space are proposed. The formulations in physical space yield non-convex optimization programs whose solutions often outperform the solutions in feature space. However, the formulations in feature space yield either an analytical solution or a convex program thereby facilitating their application to problems in high dimensions. The superlevel sets of a SN density have a closed semi-algebraic form making them amenable to rigorous uncertainty quantification methods. Furthermore, we propose a chance-constrained optimization framework for identifying and eliminating the effects of outliers in the prescription of such regions. These strategies can be used to mitigate the conservatism intrinsic to many methods in system identification, fault detection, robustness/reliability analysis, and robust design caused by assuming parameter independence and by including outliers in the dataset. |
Luis Crespo | Breakout |
| 2019 |
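The abstract above defines Sliced-Normals by mapping data through a polynomial injection into a feature space and fitting a normal distribution there. The sketch below implements only that feature-space estimation step on simulated data, using the resulting quadratic form as an unnormalized density and as a data-enclosing superlevel set; normalizing constants, the physical-space formulations, and the chance-constrained outlier handling from the talk are omitted.

```python
"""Minimal sketch of the Sliced-Normal feature-space idea on simulated 2-D data."""
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(8)

# Simulated dependent 2-D data (banana-shaped cloud).
x1 = rng.normal(0, 1, 500)
x2 = 0.5 * x1 ** 2 + rng.normal(0, 0.3, 500)
data = np.column_stack([x1, x2])

# Polynomial injection into feature space, then a normal fit there.
phi = PolynomialFeatures(degree=2, include_bias=False)
Z = phi.fit_transform(data)
mu = Z.mean(axis=0)
cov = np.cov(Z, rowvar=False) + 1e-6 * np.eye(Z.shape[1])   # small jitter for invertibility
cov_inv = np.linalg.inv(cov)

def quadratic_form(x):
    """Negative log of the unnormalized Sliced-Normal density (up to constants) at point(s) x."""
    z = phi.transform(np.atleast_2d(x)) - mu
    return np.einsum("ij,jk,ik->i", z, cov_inv, z)

# Data-enclosing set: points whose quadratic form falls below an empirical quantile.
threshold = np.quantile(quadratic_form(data), 0.95)
test_points = np.array([[0.0, 0.0], [3.0, -2.0]])
print("Inside 95% enclosing set:", quadratic_form(test_points) <= threshold)
```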
Keynote Wednesday Keynote Speaker III |
Timothy Dare Deputy Director, Developmental Test, Evaluation, and Prototyping SES OUSD(R&E) ![]() (bio)
Mr. Timothy S. Dare is the Deputy Director for Developmental Test, Evaluation and Prototyping (DD(DTEP)). As the DD(DTEP), he serves as the principal advisor on developmental test and evaluation (DT&E) to the Secretary of Defense, Under Secretary of Defense for Research and Engineering, and Director of Defense Research and Engineering for Advanced Capabilities. Mr. Dare is responsible for DT&E policy and guidance in support of the acquisition of major Department of Defense (DoD) systems, and providing advocacy, oversight, and guidance to the DT&E acquisition workforce. He informs policy and advances leading edge technologies through the development of advanced technology concepts, and developmental and operational prototypes. By working closely with interagency partners, academia, industry and governmental labs, he identifies, develops and demonstrates multi-domain technologies and concepts that address high-priority DoD, multi-Service, and Combatant Command warfighting needs. Prior to his appointment in December 2018, Mr. Dare was a Senior Program Manager for program management and capture at Lockheed Martin (LM) Space. In this role he was responsible for the capture and execution phases of multiple Intercontinental Ballistic Missile programs for Minuteman III, including a new airborne Nuclear Command and Control (NC2) development program. His major responsibilities included establishing program working environments at multiple locations, policies, processes, staffing, budget and technical baselines. Mr. Dare has extensive T&E and prototyping experience. As the Engineering Program Manager for the $1.8B Integrated Space C2 programs for NORAD/NORTHCOM systems at Cheyenne Mountain, Mr. Dare was the Integration and Test lead focusing on planning, executing, and evaluating the integration and test phases (developmental and operational T&E) for Missile Warning and Space Situational Awareness (SSA) systems. Mr. Dare has also been the Engineering Lead/Integration and Test lead on other systems such as the Hubble Space Telescope; international border control systems; artificial intelligence (AI) development systems (knowledge-based reasoning); Service-based networking systems for the UK Ministry of Defence; Army C2 systems; Space Fence C2; and foreign intelligence, surveillance, and reconnaissance systems. As part of the Department’s strategic defense portfolio, Mr. Dare led the development of advanced prototypes in SSA C2 (Space Fence), Information Assurance (Single Sign-on), AI systems, and was the sponsoring program manager for NC2 capability development. Mr. Dare is a graduate of Purdue University and is a member of both the Association for Computing Machinery and Program Management Institute. He has been recognized by the U.S. Air Force for his contributions supporting NORAD/NORTHCOM’s strategic defense missions, and the National Aeronautics and Space Administration for his contributions to the original Hubble Space Telescope program. Mr. Dare holds a U.S. Patent for Single Sign-on architectures. |
Keynote |
| 2019 |
Breakout Sequential Testing for Fast Jet Life Support Systems (Abstract)
The concept of sequential testing has many disparate meanings. Often, for statisticians it takes on a purely mathematical context, while possibly meaning multiple disconnected test events for some practitioners. Here we present a pedagogical approach to creating test designs involving constrained factors using JMP software. Recent experience testing one of the U.S. military’s fast jet life support systems (LSS) serves as a case study and backdrop to support the presentation. The case study discusses several lessons learned during LSS testing, applicable to all practitioners of scientific test and analysis techniques (STAT) and design of experiments (DOE). We conduct a short analysis to specifically determine a test region with a set of factors pertinent to modeling human breathing and the use of breathing machines as part of the laboratory setup. A comparison of several government and industry laboratory test points and regions with governing documentation is made, along with our proposal for determining a necessary and sufficient test region for tests involving human breathing as a factor. |
Darryl Ahner, Steven Thorsen, Sarah Burke & |
Breakout |
| 2019 |
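The entry above centers on defining a constrained test region for breathing-related factors before selecting design points. The sketch below shows that filtering step in generic form; the factor ranges and the minute-ventilation constraint are hypothetical placeholders, not values from the LSS governing documentation.

```python
"""Sketch: building a constrained candidate region for breathing-related test factors."""
import numpy as np
from itertools import product

# Hypothetical factor ranges: tidal volume (L) and breathing frequency (breaths/min).
tidal_volume = np.linspace(0.5, 2.5, 21)
frequency = np.linspace(10, 40, 31)

def feasible(vt, f):
    minute_ventilation = vt * f                  # L/min
    return 10.0 <= minute_ventilation <= 60.0    # assumed physiologically plausible band

candidates = [(vt, f) for vt, f in product(tidal_volume, frequency) if feasible(vt, f)]
print(f"Feasible candidate points: {len(candidates)} of {tidal_volume.size * frequency.size}")

# From here, a space-filling or optimal design would be selected within the feasible region,
# e.g., by passing 'candidates' to a design-construction routine.
```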
Breakout Uncertainty Quantification: Combining Large Scale Computational Models with Physical Data for Inference (Abstract)
Combining physical measurements with computational models is key to many investigations involving validation and uncertainty quantification (UQ). This talk surveys some of the many approaches taken for validation and UQ with large-scale computational models. Experience with such applications suggests classifications of different types of problems with common features (e.g., data size, amount of empiricism in the model, computational demands, availability of data, extent of extrapolation required, etc.). More recently, social and socio-technical systems are being considered for similar analyses, bringing new challenges to this area. This talk will discuss approaches for such problems and will highlight what might be new research directions for application and methodological development in UQ. |
Dave Higdon | Breakout |
![]() | 2019 |
Breakout Toward Real-Time Decision Making in Experimental Settings (Abstract)
Materials scientists, computer scientists and statisticians at LANL have teamed up to investigate how to make near real time decisions during fast-paced experiments. For instance, a materials scientist at a beamline typically has a short window in which to perform a number of experiments, after which they analyze the experimental data, determine interesting new experiments and repeat. In typical circumstances, that cycle could take a year. The goal of this research and development project is to accelerate that cycle so that interesting leads are followed during the short window for experiments, rather than in years to come. We detail some of our UQ work in materials science, including emulation, sensitivity analysis, and solving inverse problems, with an eye toward real-time decision making in experimental settings. |
Devin Francom | Breakout | 2019 |