Session Title | Speaker | Type | Recording | Materials | Year |
---|---|---|---|---|---|
USE OF DESIGN & ANALYSIS OF COMPUTER EXPERIMENTS (DACE) IN SPACE MISSION TRAJECTORY DESIGN (Abstract)
Numerical astrodynamics simulations are characterized by a large input space and complex, nonlinear input-output relationships. Standard Monte Carlo runs of these simulations are typically time-consuming and numerically costly. We adapt the Design and Analysis of Computer Experiments (DACE) approach to astrodynamics simulations to improve runtimes and increase information gain. Space-filling designs such as the Latin Hypercube Sampling (LHS) methods, Maximin and Maximum Projection Sampling, with the surrogate modelling techniques of DACE such as Radial Basis Functions and Gaussian Process Regression, gave significant improvements for astrodynamics simulations, including: reduced run time of Monte Carlo simulations, improved speed of sensitivity analysis, confidence intervals for non-Gaussian behavior, determination of outliers, and identification of extreme output cases not found by standard simulation and sampling methods.

Four case studies are presented on novel applications of DACE to mission trajectory design & conjunction assessments with space debris: 1) Gaussian Process Regression modelling of manoeuvres and navigation uncertainties for commercial cislunar and NASA CLPS lunar missions; 2) development of a surrogate model for predicting collision risk and miss distance volatility between debris and satellites in Low Earth Orbit; 3) prediction of the displacement of an object in orbit using laser photon pressure; 4) prediction of eclipse durations for the NASA IBEX-extended mission.

The surrogate models are assessed by k-fold cross validation. The relative ranking of surrogate model performance is verified by the Root Mean Square Error (RMSE) of predictions at untried points. To improve the sampling of manoeuvre and navigational uncertainties within trajectory design for lunar missions, a maximin LHS was used in combination with the Gates model for thrusting uncertainty. This led to improvements in simulation efficiency, producing a non-parametric ΔV distribution that was processed with Kernel Density Estimation to resolve a ΔV99.9 prediction with confidence bounds.

In a collaboration with the NASA Conjunction Assessment Risk Analysis (CARA) group, the changes in probability of collision (Pc) for two objects in LEO were predicted using a network of 13 Gaussian Process Regression-based surrogate models that determined the future trends in covariance and miss distance volatility, given the data provided within a conjunction data message. This allowed for determination of the trend in the probability distribution of Pc up to three days from the time of closest approach, as well as the interpretation of this prediction in the form of an urgency metric that can assist satellite operators in the manoeuvre decision process.

The main challenge in adapting the methods of DACE to astrodynamics simulations was to deliver a direct benefit to mission planning and design. This was achieved by delivering improvements in confidence and predictions for metrics including the propellant required to complete a lunar mission (expressed as ΔV); statistical validation of the simulation models used; and advice on when a sufficient number of simulation runs have been made to verify convergence to an adequate confidence interval. Future applications of DACE for mission design include determining an optimal tracking schedule plan for a lunar mission, and robust trajectory design for low-thrust propulsion. A minimal sketch of the design-plus-surrogate workflow appears after this entry. |
David Shteinman CEO/Managing Director Industrial Sciences Group (bio)
David Shteinman is a professional engineer and industrial entrepreneur with 34 years’ experience in manufacturing, mining and transport. David has a passion for applying advanced mathematics and statistics to improve business outcomes. He leads The Industrial Sciences Group, a company formed from two of Australia’s leading research centres: The Australian Research Council and the University of NSW. He has been responsible for over 30 projects to date that combine innovative applications of mathematics and statistics to several branches of engineering (astronautics, space missions, geodesy, transport, mining & mineral processing, plant control systems and energy) in Australia, the US and Israel. Major projects in the Space sector include projects with Google Lunar X and Space IL Lunar Mission; NASA Goddard; The Australian Space Agency; Space Environment Research Centre and EOS Space Systems; Geoscience Australia; AGI, the University of Colorado (Boulder), Space Exploration Engineering (contractors to NASA CLPS Missions). |
| 2022 |
||
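The workflow the abstract above describes (a space-filling design, a Gaussian Process surrogate, and k-fold cross validation scored by the RMSE of predictions at untried points) can be sketched in a few lines. This is a minimal illustration under stated assumptions: a toy two-input function stands in for the expensive astrodynamics simulator, and scipy and scikit-learn supply the LHS and GP pieces. It is not the authors' code.

```python
# A minimal sketch, assuming a toy simulator: Latin Hypercube design,
# Gaussian Process surrogate, and k-fold cross validation scored by RMSE
# at untried points, as described in the abstract.
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import KFold

def simulator(X):
    # Hypothetical stand-in for an expensive astrodynamics run.
    return np.sin(3 * X[:, 0]) + X[:, 1] ** 2

# Space-filling design over a 2-D input space scaled to [0, 1].
X = qmc.LatinHypercube(d=2, seed=1).random(n=50)
y = simulator(X)

# Gaussian Process Regression surrogate with a radial basis kernel.
gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True)

# Assess the surrogate by k-fold cross validation.
rmses = []
for train, test in KFold(n_splits=5, shuffle=True, random_state=1).split(X):
    gp.fit(X[train], y[train])
    resid = gp.predict(X[test]) - y[test]
    rmses.append(np.sqrt(np.mean(resid ** 2)))
print(f"mean cross-validated RMSE: {np.mean(rmses):.4f}")
```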
Breakout Uncertainty Quantification: Combining Large Scale Computational Models with Physical Data for Inference (Abstract)
Combining physical measurements with computational models is key to many investigations involving validation and uncertainty quantification (UQ). This talk surveys some of the many approaches taken for validation and UQ with large-scale computational models. Experience with such applications suggests classifications of different types of problems with common features (e.g., data size, amount of empiricism in the model, computational demands, availability of data, extent of extrapolation required). More recently, social and socio-technical systems are being considered for similar analyses, bringing new challenges to this area. This talk will present approaches for such problems and will highlight what might be new research directions for application and methodological development in UQ. |
Dave Higdon | Breakout |
| 2019 |
|
Breakout Sequential Testing for Fast Jet Life Support Systems (Abstract)
The concept of sequential testing has many disparate meanings. For statisticians it often takes on a purely mathematical meaning, while for some practitioners it may mean multiple disconnected test events. Here we present a pedagogical approach to creating test designs involving constrained factors using JMP software. Recent experience testing one of the U.S. military’s fast jet life support systems (LSS) serves as a case study and backdrop to support the presentation. The case study discusses several lessons learned during LSS testing, applicable to all practitioners of scientific test and analysis techniques (STAT) and design of experiments (DOE). We conduct a short analysis to determine a test region with a set of factors pertinent to modeling human breathing and the use of breathing machines as part of the laboratory setup. A comparison of several government and industry laboratory test points and regions with governing documentation is made, along with our proposal for determining a necessary and sufficient test region for tests involving human breathing as a factor. |
Darryl Ahner, Steven Thorsen, Sarah Burke & |
Breakout |
| 2019 |
|
Keynote Wednesday Keynote Speaker III |
Timothy Dare Deputy Director, Developmental Test, Evaluation, and Prototyping SES OUSD(R&E) (bio)
Mr. Timothy S. Dare is the Deputy Director for Developmental Test, Evaluation and Prototyping (DD(DTEP)). As the DD(DTEP), he serves as the principal advisor on developmental test and evaluation (DT&E) to the Secretary of Defense, Under Secretary of Defense for Research and Engineering, and Director of Defense Research and Engineering for Advanced Capabilities. Mr. Dare is responsible for DT&E policy and guidance in support of the acquisition of major Department of Defense (DoD) systems, and providing advocacy, oversight, and guidance to the DT&E acquisition workforce. He informs policy and advances leading-edge technologies through the development of advanced technology concepts, and developmental and operational prototypes. By working closely with interagency partners, academia, industry and governmental labs, he identifies, develops and demonstrates multi-domain technologies and concepts that address high-priority DoD, multi-Service, and Combatant Command warfighting needs. Prior to his appointment in December 2018, Mr. Dare was a Senior Program Manager for program management and capture at Lockheed Martin (LM) Space. In this role he was responsible for the capture and execution phases of multiple Intercontinental Ballistic Missile programs for Minuteman III, including a new airborne Nuclear Command and Control (NC2) development program. His major responsibilities included establishing program working environments at multiple locations, policies, processes, staffing, budget and technical baselines. Mr. Dare has extensive T&E and prototyping experience. As the Engineering Program Manager for the $1.8B Integrated Space C2 programs for NORAD/NORTHCOM systems at Cheyenne Mountain, Mr. Dare was the Integration and Test lead focusing on planning, executing, and evaluating the integration and test phases (developmental and operational T&E) for Missile Warning and Space Situational Awareness (SSA) systems. Mr. Dare has also been the Engineering Lead/Integration and Test lead on other systems such as the Hubble Space Telescope; international border control systems; artificial intelligence (AI) development systems (knowledge-based reasoning); Service-based networking systems for the UK Ministry of Defence; Army C2 systems; Space Fence C2; and foreign intelligence, surveillance, and reconnaissance systems. As part of the Department’s strategic defense portfolio, Mr. Dare led the development of advanced prototypes in SSA C2 (Space Fence), Information Assurance (Single Sign-on), AI systems, and was the sponsoring program manager for NC2 capability development. Mr. Dare is a graduate of Purdue University and is a member of both the Association for Computing Machinery and the Project Management Institute. He has been recognized by the U.S. Air Force for his contributions supporting NORAD/NORTHCOM’s strategic defense missions, and the National Aeronautics and Space Administration for his contributions to the original Hubble Space Telescope program. Mr. Dare holds a U.S. Patent for Single Sign-on architectures. |
Keynote |
| 2019 |
|
Exploring the behavior of Bayesian adaptive design of experiments (Abstract)
Physical experiments in the national security arena, including nuclear deterrence, are often expensive and time-consuming, resulting in small sample sizes that make it difficult to achieve desired statistical properties. Bayesian adaptive design of experiments (BADE) is a sequential design of experiments approach that updates the test design in real time in order to collect data optimally. BADE recommends ending an experiment early when, with sufficiently high probability, the fully completed experiment would have concluded in efficacy or futility. This is done by using the data already collected, marginalizing over the remaining uncollected data, and updating the Bayesian posterior distribution in near real time. BADE has seen successes in clinical trials, resulting in quicker and more effective assessments of drug trials while also reducing ethical concerns. In clinical trials, BADE has typically been used only for futility decisions rather than efficacy decisions, although this paradigm has seen little debate.

BADE has been proposed for testing in the national security space for similar reasons: quicker and cheaper test series. Given the high-consequence nature of the tests performed in the national security space, a strong understanding of new methods is required before they are deployed. The main contribution of this research was to reproduce results seen in previous studies for different aspects of model performance. A large simulation inspired by a real testing problem at Sandia National Laboratories was performed to understand the behavior of BADE under various scenarios, including shifts to the mean, the standard deviation, and the distributional family, as well as the presence of outliers. The results help explain the behavior of BADE under various assumption violations.

Using the results of this simulation, combined with previous work related to BADE in this field, it is argued this approach could be used as part of an “evidence package” for deciding to stop testing early due to futility, or with stronger evidence, efficacy. The combination of expert knowledge with statistical quantification provides the stronger evidence necessary for a method in its infancy in a high-consequence, new application area such as national security. A toy illustration of the early-stopping calculation appears after this entry.

Sandia National Laboratories is a multimission laboratory managed and operated by National Technology & Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration under contract DE-NA0003525. |
Daniel Ries Sandia National Laboratories (bio)
Daniel Ries is a Senior Member of the Technical Staff at Sandia National Laboratories in the Statistics and Data Analytics Department. As an applied research statistician, Daniel collaborates with scientists and engineers in fields including nuclear deterrence, nuclear forensics, nuclear non-proliferation, global security, and climate science. His statistical work spans the topics of experimental design, inverse modeling, uncertainty quantification for machine learning and deep learning, spatio-temporal data analysis, and Bayesian methodology. Daniel completed his PhD in statistics at Iowa State University. |
Session Recording |
Recording | 2022 |
|
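The early-stopping logic described above can be made concrete with a toy model. The sketch below assumes a conjugate Beta-Binomial setup and a hypothetical final efficacy rule (posterior P(p > 0.5) > 0.95 once all runs are complete); neither comes from the talk. It only illustrates marginalizing over the uncollected data to get a predictive probability of the eventual verdict.

```python
# A minimal sketch of BADE-style early stopping under an assumed
# Beta-Binomial model: at an interim look, simulate the unseen remainder
# of the test from the interim posterior and check the final verdict.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_total, n_seen, successes = 40, 15, 11   # hypothetical interim data
a0, b0, p0 = 1.0, 1.0, 0.5                # Beta prior and null rate

def final_efficacy(total_successes):
    # Hypothetical end-of-test rule: posterior P(p > p0) exceeds 0.95.
    post = stats.beta(a0 + total_successes, b0 + n_total - total_successes)
    return post.sf(p0) > 0.95

# Marginalize over the uncollected data: draw p from the interim posterior,
# simulate the remaining runs, and record how often the test ends in efficacy.
p_draws = rng.beta(a0 + successes, b0 + n_seen - successes, size=10_000)
future = rng.binomial(n_total - n_seen, p_draws)
ppos = np.mean([final_efficacy(successes + f) for f in future])

print(f"predictive probability of eventual efficacy: {ppos:.3f}")
if ppos > 0.95:
    print("stop early for efficacy")
elif ppos < 0.05:
    print("stop early for futility")
```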
Poster Nonparametric multivariate profile monitoring using regression trees (Abstract)
Monitoring noisy profiles for changes in behavior can validate whether a process is operating under normal conditions over time. Change-point detection and estimation in sequences of multivariate functional observations is a common method for monitoring such profiles. We propose a nonparametric method that uses Classification and Regression Trees (CART) to build a sequence of regression trees and applies the Kolmogorov-Smirnov statistic to monitor profile behavior. Our novel method compares favorably to existing methods in the literature. A schematic illustration of the tree-plus-KS idea appears after this entry. |
Daniel A. Timme PhD Candidate Florida State University (bio)
Daniel A. Timme is currently a student pursuing his PhD in Statistics at Florida State University. Mr. Timme graduated with a BS in Mathematics from the University of Houston and a BS in Business Management from the University of Houston-Clear Lake. He earned an MS in Systems Engineering with a focus in Reliability and a second MS in Space Systems with focuses in Space Vehicle Design and Astrodynamics, both from the Air Force Institute of Technology. Mr. Timme’s research interests focus primarily on reliability engineering, applied mathematics and statistics, optimization, and regression.
|
Poster | Session Recording |
Recording | 2022 |
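A schematic of the building blocks named in the abstract (a regression tree fit to an in-control reference profile and a Kolmogorov-Smirnov comparison of residuals) might look as follows. This illustrates the named ingredients, not the authors' exact algorithm, and the profiles are simulated.

```python
# A schematic, not the authors' method: fit a CART regression tree to an
# in-control reference profile, then flag a new profile whose residuals
# differ from the reference residuals by a Kolmogorov-Smirnov test.
import numpy as np
from scipy.stats import ks_2samp
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200).reshape(-1, 1)
reference = np.sin(2 * np.pi * x.ravel()) + rng.normal(0, 0.1, 200)

tree = DecisionTreeRegressor(max_depth=5).fit(x, reference)
ref_resid = reference - tree.predict(x)

# Simulated out-of-control profile: same shape, shifted upward.
new_profile = np.sin(2 * np.pi * x.ravel()) + 0.3 + rng.normal(0, 0.1, 200)
new_resid = new_profile - tree.predict(x)

stat, pval = ks_2samp(ref_resid, new_resid)
print(f"KS statistic = {stat:.3f}, p-value = {pval:.4f}")
```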
Tutorial Introducing git for reproducible research (Abstract)
Version control software manages different versions of files, providing an archive of files, a means to manage multiple versions of a file, and possibly distribution. The most popular version control program in the computer science community is arguably git, which serves as the backbone for websites such as GitHub, Bitbucket, and others. In this mini-tutorial we will introduce the basics of version control in general and of git in particular, and explain what role git plays in a reproducible research context. The goal of the course is to get participants started using git. We will create and clone repositories, add and track files in a repository, and manage git branches; a sketch of these core commands appears after this entry. We also discuss a few git best practices. |
Curtis Miller Research Staff Member IDA (bio)
Curtis Miller is a Research Staff Member at the Institute for Defense Analyses in the Operational Evaluation Division, where he is a member of the Test Science and Naval Warfare groups. He obtained a PhD in Mathematics at the University of Utah in 2020, where he studied mathematical statistics. He provides statistical expertise to the rest of OED and works primarily on design of experiments and analysis of modeling and simulation data. |
Tutorial | Session Recording |
Recording | 2022 |
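The core workflow the tutorial walks through (creating a repository, tracking files, committing, and branching) is sketched below. The snippet drives the standard git CLI from Python only to keep a single example language in this document; the repository name and file are made up. In the tutorial itself these commands are typed directly at the command line.

```python
# A sketch of the basic git workflow under stated assumptions: "demo-repo"
# and analysis.py are hypothetical; the git subcommands are standard CLI ones.
import pathlib
import subprocess

def git(*args, cwd="."):
    # Run a git command and raise if it fails.
    subprocess.run(["git", *args], cwd=cwd, check=True)

git("init", "demo-repo")                              # create a repository
git("config", "user.name", "Demo User", cwd="demo-repo")
git("config", "user.email", "demo@example.com", cwd="demo-repo")
pathlib.Path("demo-repo/analysis.py").write_text("# analysis code\n")
git("add", "analysis.py", cwd="demo-repo")            # start tracking a file
git("commit", "-m", "Add analysis script", cwd="demo-repo")
git("checkout", "-b", "experiment", cwd="demo-repo")  # create and switch branch
```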
Breakout Multivariate Density Estimation and Data-enclosing Sets Using Sliced-Normal Distributions (Abstract)
This talk focuses on a means to characterize the variability in multivariate data. This characterization, given in terms of both probability distributions and closed sets, is instrumental in assessing and improving the robustness/reliability properties of system designs. To this end, we propose the Sliced-Normal (SN) class of distributions. The versatility of SNs enables modeling complex multivariate dependencies with minimal modeling effort. A polynomial mapping is defined which injects the physical space into a higher dimensional (so-called) feature space on which a suitable normal distribution is defined. Optimization-based strategies for the estimation of SNs from data in both physical and feature space are proposed. The formulations in physical space yield non-convex optimization programs whose solutions often outperform the solutions in feature space. However, the formulations in feature space yield either an analytical solution or a convex program, thereby facilitating their application to problems in high dimensions. The superlevel sets of an SN density have a closed semi-algebraic form, making them amenable to rigorous uncertainty quantification methods. Furthermore, we propose a chance-constrained optimization framework for identifying and eliminating the effects of outliers in the prescription of such regions. These strategies can be used to mitigate the conservatism intrinsic to many methods in system identification, fault detection, robustness/reliability analysis, and robust design caused by assuming parameter independence and by including outliers in the dataset. A toy version of the feature-space construction appears after this entry. |
Luis Crespo | Breakout |
| 2019 |
|
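A toy version of the feature-space construction described above: map the data through a polynomial feature map, fit a normal distribution there, and define a data-enclosing set as a superlevel set of the resulting (unnormalized) log-density. The degree, data, and quantile threshold are illustrative assumptions, not the paper's estimators.

```python
# A minimal sketch of a Sliced-Normal-style density in feature space,
# assuming illustrative data and a degree-2 polynomial map.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(3)
# Dependent 2-D data (hypothetical stand-in for physical-space samples).
t = rng.normal(0, 1, 500)
X = np.column_stack([t, t**2 + rng.normal(0, 0.2, 500)])

phi = PolynomialFeatures(degree=2, include_bias=False)
Z = phi.fit_transform(X)                      # inject into feature space
mu, cov = Z.mean(axis=0), np.cov(Z, rowvar=False)
prec = np.linalg.inv(cov + 1e-8 * np.eye(Z.shape[1]))

def log_density(x):
    # Unnormalized log-density: normal quadratic form evaluated at phi(x).
    z = phi.transform(np.atleast_2d(x)) - mu
    return -0.5 * np.einsum("ij,jk,ik->i", z, prec, z)

# A data-enclosing set: points whose log-density clears a sample quantile.
level = np.quantile(log_density(X), 0.05)
print("inside:", log_density(np.array([[0.0, 0.5]])) >= level)
```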
Breakout Uncertainty Quantification and Sensitivity Analysis Methodology for AJEM (Abstract)
The Advanced Joint Effectiveness Model (AJEM) is a joint forces model developed by the U.S. Army that is used in vulnerability and lethality (V/L) predictions for threat/target interactions. This complex model primarily generates a probability response for various components, scenarios, loss of capabilities, or summary conditions. Sensitivity analysis (SA) and uncertainty quantification (UQ), referred to jointly as SA/UQ, are disciplines that provide the working space for how model estimates change with respect to changes in input variables. A comparative measure that characterizes the effect of an input change on the predicted outcome was developed and is reviewed and illustrated in this presentation. This measure provides a practical context that stakeholders can better understand and utilize. We show graphical and tabular results using this measure. |
Craig Andres Mathematical Statistician U.S. Army CCDC Data & Analysis Center (bio)
Craig Andres is a Mathematical Statistician at the recently formed DEVCOM Data & Analysis Center in the Materiel M&S Branch, working primarily on the uncertainty quantification, as well as the verification and validation, of the AJEM vulnerability model. He is currently on a developmental assignment with the Capabilities Projection Team. He has a master’s degree in Applied Statistics from Oakland University and a master’s degree in Mathematics from Western Michigan University. |
Breakout | 2021 |
||
Breakout How do the Framework and Design of Experiments Fundamentally Help? (Abstract)
The Military Global Positioning System (GPS) User Equipment (MGUE) program is the user segment of the GPS Enterprise—a program in the Deputy Assistant Secretary of Defense for Developmental Test and Evaluation (DASD(DT&E)) Space and Missile Defense Systems portfolio. The MGUE program develops and tests GPS cards capable of using Military-Code (M Code) and legacy signals. The program’s DT&E strategy is challenging: the GPS cards provide new, untested capabilities. Milestone A was approved in 2012, with sole-source contracts released to three vendors for Increment 1. An Acquisition Decision Memorandum directs the program to support a Congressional Mandate to provide GPS M Code-capable equipment for use after FY17. Increment 1 provides GPS receiver form factors for the ground domain interface as well as for the aviation and maritime domain interfaces. When reviewing the DASD(DT&E) Milestone B (MS B) Assessment Report, Mr. Kendall expressed curiosity about how the Developmental Evaluation Framework (DEF) and Design of Experiments (DOE) help. This presentation describes how the DEF and DOE methods helped produce more informative and more economical developmental tests than what was originally under consideration by the test community—decision-quality information with a 60% reduction in test cycle time. It provides insight into how the integration of the DEF and DOE improved the overall effectiveness of the DT&E strategy, illustrates the role of modeling and simulation (M&S) in the test design process, provides examples of experiment designs for different functional and performance areas, and illustrates the logic involved in balancing risks and test resources. The DEF and DOE methods enable the DT&E strategy to fully exploit early discovery, to maximize verification and validation opportunities, and to characterize system behavior across the technical requirements space. |
Luis A. Cortes | Breakout | 2017 |
||
Breakout Cases of Second-Order Split-Plot Designs (Abstract)
The fundamental principles of experiment design are factorization, replication, randomization, and local control of error. In many industries, however, departure from these principles is commonplace. Often in our experiments complete randomization is not feasible because the factor level settings are hard, impractical, or inconvenient to change, or the resources available to execute under homogeneous conditions are limited. These restrictions in randomization lead to split-plot experiments. We are also often interested in fitting second-order models, leading to second-order split-plot experiments. Although response surface methodology has grown tremendously since 1951, alternatives for second-order split-plots remain largely unexplored. The literature and textbooks offer limited examples and provide guidelines that are often too general. This deficit of information leaves practitioners ill prepared to face the many roadblocks associated with these types of designs. This presentation provides practical strategies to help practitioners deal with second-order split-plot and, by extension, split-split-plot experiments, including an innovative approach for the construction of a response surface design referred to as the second-order sub-array Cartesian product split-plot design. This new type of design, which is an alternative to other classes of split-plot designs currently in use in defense and industrial applications, is economical, has low prediction variance of the regression coefficients, and has low aliasing between model terms. Based on an assessment using well-accepted design evaluation criteria, second-order sub-array Cartesian product split-plot designs perform as well as historical designs that have been considered standards up to this point. A generic illustration of the split-plot structure appears after this entry. |
Luis Cortes MITRE |
Breakout | Materials | 2018 |
|
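The split-plot structure the abstract builds on can be illustrated generically: a hard-to-change whole-plot factor is randomized once per whole plot, the easy-to-change sub-plot factors are randomized within it, and the run list is formed as a Cartesian product of the two arrays. The sketch below shows only this generic structure, not the authors' second-order sub-array construction.

```python
# A generic illustration of the Cartesian-product structure behind
# split-plot designs: randomize whole plots, then randomize sub-plot
# runs within each whole plot. Factor levels here are illustrative.
import itertools
import random

whole_plot_levels = [-1, 0, 1]                                  # hard-to-change
sub_plot_array = list(itertools.product([-1, 0, 1], repeat=2))  # easy-to-change

random.seed(4)
design = []
for w in random.sample(whole_plot_levels, len(whole_plot_levels)):
    # Randomization is restricted: it happens only within each whole plot.
    runs = random.sample(sub_plot_array, len(sub_plot_array))
    design.extend((w, *s) for s in runs)

for run in design[:6]:
    print(run)   # (whole-plot level, sub-plot factor 1, sub-plot factor 2)
```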
Breakout B-52 Radar Modernization Test Design Considerations (Abstract)
Inherent system processes, restrictions on collection, or cost may impact the practical execution of an operational test. This study presents the use of blocking and split-plot designs when complete randomization is not feasible in operational test. Specifically, the USAF B-52 Radar Modernization Program test design is used to present tradeoffs of different design choices and the impacts of those choices on cost, operational relevance, and analytical rigor. |
Stuart Corbett AFOTEC |
Breakout | Materials | 2018 |
|
Breakout Structured Decision Making (Abstract)
Difficult choices are often required in a decision-making process where resources and budgets are increasingly constrained. This talk demonstrates a structured decision-making approach using layered Pareto fronts to prioritize the allocation of funds between munitions stockpiles based on their estimated reliability, the urgency of needing available units, and the consequences if adequate numbers of units are not available. This case study illustrates the process of first identifying appropriate metrics that summarize important dimensions of the decision, and then eliminating non-contenders from further consideration in an objective stage. The final subjective stage incorporates subject matter expert priorities to select the four stockpiles to receive additional maintenance and surveillance funds based on understanding the trade-offs and robustness to various user priorities. A minimal sketch of the Pareto-front layering appears after this entry. |
Christine Anderson Cook LANL |
Breakout | Materials | 2017 |
|
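The objective elimination stage described above rests on identifying nondominated options and peeling them off layer by layer. The sketch below does exactly that for hypothetical stockpile metrics (all oriented so larger is better); the data and metric choices are made up for illustration.

```python
# A minimal sketch of layered Pareto fronts over made-up stockpile metrics.
import numpy as np

def pareto_front(points):
    # A point is nondominated if no other point is >= everywhere
    # and strictly > somewhere.
    keep = []
    for i, p in enumerate(points):
        dominated = any(
            np.all(q >= p) and np.any(q > p)
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

# columns: reliability, urgency, consequence score (hypothetical)
metrics = np.array([[0.90, 0.20, 0.50],
                    [0.80, 0.90, 0.40],
                    [0.70, 0.10, 0.30],
                    [0.95, 0.85, 0.90],
                    [0.60, 0.60, 0.60]])

remaining = list(range(len(metrics)))
layer = 1
while remaining:
    front = [remaining[i] for i in pareto_front(metrics[remaining])]
    print(f"front {layer}: stockpiles {front}")
    remaining = [i for i in remaining if i not in front]
    layer += 1
```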
Breakout Case Studies for Statistical Engineering Applied to Powered Rotorcraft Wind-Tunnel Tests (Abstract)
Co-Authors: Sean A. Commo, Ph.D., P.E. and Peter A. Parker, Ph.D., P.E. NASA Langley Research Center, Hampton, Virginia, USA Austin D. Overmeyer, Philip E. Tanner, and Preston B. Martin, Ph.D. U.S. Army Research, Development, and Engineering Command, Hampton, Virginia, USA. The application of statistical engineering to helicopter wind-tunnel testing was explored during two powered rotor entries. The U.S. Army Aviation Development Directorate Joint Research Program Office and the NASA Revolutionary Vertical Lift Project performed these tests jointly at the NASA Langley Research Center. Both entries were conducted in the 14- by 22-Foot Subsonic Tunnel with a small segment of the overall tests devoted to developing case studies of a statistical engineering approach. Data collected during each entry were used to estimate response surface models characterizing vehicle performance, a novel contribution of statistical engineering applied to powered rotor-wing testing. Additionally, a 16- to 47-times reduction in the number of data points required was estimated when comparing a statistically-engineered approach to a conventional one-factor-at-a-time approach. |
Sean Commo NASA |
Breakout | 2016 |
||
Breakout Asparagus is the most articulate vegetable ever (Abstract)
During the summer of 2001, Microsoft launched Windows XP, which was lauded by many users as the most reliable and usable operating system at the time. Miami Herald columnist Dave Barry responded to this praise by stating that “this is like saying asparagus is the most articulate vegetable ever.” Whether you agree or disagree with Dave Barry, these users’ reactions are relative (to other operating systems and to past and future versions). This is due to an array of technological factors that have facilitated human-system improvements. Automation is often cited as improving human-system performance across many domains. It is true that when the human and automation are aligned, performance improves. But what about the times when this is not the case? This presentation will describe the myths and facts about human-system performance and increasing levels of automation through examples of human-system R&D conducted on a satellite ground system. Factors that affect human-system performance and a method to characterize mission performance as it relates to increasing levels of automation will also be discussed. |
Kerstan Cole | Breakout | 2018 |
||
Keynote Thursday Lunchtime Keynote Speaker |
T. Charles Clancy Bradley Professor of Electrical and Computer Engineering Virginia Tech (bio)
Charles Clancy is the Bradley Professor of Electrical and Computer Engineering at Virginia Tech, where he serves as the Executive Director of the Hume Center for National Security and Technology. Clancy leads a range of strategic programs at Virginia Tech related to security, including the Commonwealth Cyber Initiative. Prior to joining VT in 2010, Clancy was an engineering leader in the National Security Agency, leading research programs in digital communications and signal processing. He received his PhD from the University of Maryland, his MS from the University of Illinois, and his BS from the Rose-Hulman Institute of Technology. He is co-author of over 200 peer-reviewed academic publications, six books, and over twenty patents, and co-founder of five venture-backed startup companies. |
Keynote |
| 2019 |
|
Keynote Tuesday Keynote |
David Chu President Institute for Defense Analyses (bio)
David Chu serves as President of the Institute for Defense Analyses. IDA is a non-profit corporation operating in the public interest. Its three federally funded research and development centers provide objective analyses of national security issues and related national challenges, particularly those requiring extraordinary scientific and technical expertise. As president, Dr. Chu directs the activities of more than 1,000 scientists and technologists. Together, they conduct and support research requested by federal agencies involved in advancing national security and advising on science and technology issues. Dr. Chu served in the Department of Defense as Under Secretary of Defense for Personnel and Readiness from 2001-2009, and earlier as Assistant Secretary of Defense and Director for Program Analysis and Evaluation from 1981-1993. From 1978-1981 he was the Assistant Director of the Congressional Budget Office for National Security and International Affairs. Dr. Chu served in the U. S. Army from 1968-1970. He was an economist with the RAND Corporation from 1970-1978, director of RAND’s Washington Office from 1994-1998, and vice president for its Army Research Division from 1998-2001. He earned a bachelor of arts in economics and mathematics, and his doctorate in economics, from Yale University. Dr. Chu is a member of the Defense Science Board and a Fellow of the National Academy of Public Administration. He is a recipient of the Department of Defense Medal for Distinguished Public Service with Gold Palm, the Department of Veterans Affairs Meritorious Service Award, the Department of the Army Distinguished Civilian Service Award, the Department of the Navy Distinguished Public Service Award, and the National Academy of Public Administration’s National Public Service Award. |
Keynote | 2019 |
||
Keynote Opening Keynote |
David Chu President IDA (bio)
David Chu serves as President of the Institute for Defense Analyses. IDA is a non-profit corporation operating in the public interest. Its three federally funded research and development centers provide objective analyses of national security issues and related national challenges, particularly those requiring extraordinary scientific and technical expertise. As president, Dr. Chu directs the activities of more than 1,000 scientists and technologists. Together, they conduct and support research requested by federal agencies involved in advancing national security and advising on science and technology issues. Dr. Chu served in the Department of Defense as Under Secretary of Defense for Personnel and Readiness from 2001-2009, and earlier as Assistant Secretary of Defense and Director for Program Analysis and Evaluation from 1981-1993. From 1978-1981 he was the Assistant Director of the Congressional Budget Office for National Security and International Affairs. Dr. Chu served in the U. S. Army from 1968-1970. He was an economist with the RAND Corporation from 1970-1978, director of RAND’s Washington Office from 1994-1998, and vice president for its Army Research Division from 1998-2001. He earned a bachelor of arts in economics and mathematics, and his doctorate in economics, from Yale University. Dr. Chu is a member of the Defense Science Board and a Fellow of the National Academy of Public Administration. He is a recipient of the Department of Defense Medal for Distinguished Public Service with Gold Palm, the Department of Veterans Affairs Meritorious Service Award, the Department of the Army Distinguished Civilian Service Award, the Department of the Navy Distinguished Public Service Award, and the National Academy of Public Administration’s National Public Service Award. |
Keynote |
| 2018 |
|
Breakout Screening Designs for Resource Constrained Deterministic M&S Experiments: A Munitions Case Study (Abstract)
In applications where modeling and simulation runs are quick and cheap, space-filling designs will give the tester all the information they need to make decisions about their system. In some applications, however, this luxury does not exist, and each M&S run can be time-consuming and expensive. In these scenarios, a sequential test approach provides an efficient solution: an initial screening is conducted, followed by an augmentation to fit specified models of interest. Until this point, no dedicated screening designs for UQ applications in resource-constrained situations existed. The Army’s frequent exposure to this type of situation sparked a collaboration between Picatinny’s Statistical Methods and Analysis group and Professor V. Roshan Joseph of Georgia Tech, in which a new type of UQ screening design was created. This paper provides a brief introduction to the design, its intended use, and a case study in which this new methodology was applied. |
Christopher Drake | Breakout |
| 2019 |
|
Tutorial An Introduction to Data Visualization (Abstract)
Data visualization can be used to present findings, explore data, and use the human eye to find patterns that a computer would struggle to locate. Borrowing tools from art, storytelling, data analytics, and software development, data visualization is an indispensable part of the analysis process. While data visualization spans multiple disciplines and sectors, most practitioners never receive formal training in the subject. As such, this tutorial will introduce key data visualization building blocks and how best to use those building blocks for different scenarios and audiences. We will also go over tips on accessibility, design, and interactive elements. While this will by no means be a complete overview of the data visualization field, by building a foundation and introducing some rules of thumb, attendees will be better equipped to communicate their findings to their audience. A small worked example appears after this entry. |
Christina Heinich AST, Data Systems NASA (bio)
Chris Heinich is a software engineer focused on data visualization at NASA Langley Research Center. Having worked in a diverse set of fields like web development, software testing, and data analysis, Chris currently acts as the data visualization expert for the Office of Chief Information Officer (OCIO) Data Science Team. In the past they have developed and maintained dashboards for use cases such as COVID case/vaccine tracking or project management, as well as deploying open source data visualization tools like Dash/Plotly to traditional government servers. Currently they lead an agency-wide data visualization community of practice, which provides training and online collaboration space for NASA employees to learn and share experiences in data visualization. |
Tutorial | Session Recording |
Recording | 2022 |
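A small example of the kind of building blocks the tutorial covers: one message per chart, direct labels instead of a legend, a colorblind-safe palette, and reduced non-data ink. The data and styling choices below are illustrative, not from the tutorial materials.

```python
# A minimal sketch of a few data visualization rules of thumb, using
# made-up data and matplotlib.
import matplotlib.pyplot as plt

years = [2018, 2019, 2020, 2021, 2022]
series = {"Method A": [3.1, 2.8, 2.4, 2.0, 1.7],
          "Method B": [3.0, 3.0, 2.9, 2.9, 2.8]}
colors = {"Method A": "#0072B2", "Method B": "#E69F00"}  # Okabe-Ito palette

fig, ax = plt.subplots(figsize=(6, 4))
for name, vals in series.items():
    ax.plot(years, vals, color=colors[name], linewidth=2)
    # Direct labeling at the end of each line instead of a legend.
    ax.annotate(name, (years[-1], vals[-1]), xytext=(5, 0),
                textcoords="offset points", color=colors[name], va="center")
ax.set_title("Error rate declines under Method A")  # title states the takeaway
ax.set_xlabel("Year")
ax.set_ylabel("Error rate (%)")
for side in ("top", "right"):
    ax.spines[side].set_visible(False)               # reduce non-data ink
plt.tight_layout()
plt.show()
```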
Assurance Techniques for Learning Enabled Autonomous Systems which Aid Systems Engineering (Abstract)
It is widely recognized that the complexity and resulting capabilities of autonomous systems created using machine learning methods, which we refer to as learning enabled autonomous systems (LEAS), pose new challenges to systems engineering test, evaluation, verification, and validation (TEVV) compared to their traditional counterparts. This presentation provides a preliminary attempt to map recently developed technical approaches from the LEAS assurance and TEVV literature onto a traditional systems engineering v-model. This mapping categorizes such techniques into three main lifecycle phases: development, acquisition, and sustainment. It reviews the latest techniques to develop safe, reliable, and resilient learning enabled autonomous systems, without recommending radical and impractical changes to existing systems engineering processes. By performing this mapping, we seek to assist acquisition professionals by (i) informing comprehensive test and evaluation planning, and (ii) objectively communicating risk to leaders. The inability to translate qualitative assessments into quantitative metrics that measure system performance hinders adoption. Without understanding the capabilities and limitations of existing assurance techniques, defining safety and performance requirements that are both clear and testable remains out of reach. We accompany recent literature reviews on autonomy assurance and TEVV by mapping such developments to distinct steps of a well-known systems engineering model chosen for its prevalence, namely the v-model. For each of the three top-level lifecycle phases (development, acquisition, and sustainment), a section of the presentation outlines recent technical developments for autonomy assurance. This representation helps identify where the latest methods for TEVV fit in the broader systems engineering process while also enabling systematic consideration of potential sources of defects, faults, and attacks. Note that we use the v-model only to assist the classification of where TEVV methods fit; this is not a recommendation to use one software development lifecycle over another. |
Christian Ellis Journeyman Fellow Army Research Laboratory / University of Mass. Dartmouth (bio)
Christian Ellis is a PhD student in the Department of Electrical and Computer Engineering at the University of Massachusetts Dartmouth, focused on building safe and robust autonomous ground systems for the United States Department of Defense. His research interests include unstructured wheeled ground autonomy, autonomous systems assurance, and safe autonomous navigation from human demonstrations. In 2020, Christian received a student paper award for the paper titled “Software and System Reliability Engineering for Autonomous Systems Incorporating Machine Learning”. |
Session Recording |
Recording | 2022 |
|
Breakout Initial Investigation into the Psychoacoustic Properties of Small Unmanned Aerial System Noise (Abstract)
For the past several years, researchers at NASA Langley have been engaged in a series of projects to study the degree to which existing facilities and capabilities, originally created for work on full-scale aircraft, are extensible to smaller scales – those of the small unmanned aerial systems (sUAS, also UAVs and, colloquially, `drones’) that have been showing up in the nation’s airspace. This paper follows an effort that has led to an initial human-subject psychoacoustic test regarding the annoyance generated by sUAS noise. This effort spans three phases: 1. the collection of the sounds through field recordings, 2. the formulation and execution of a psychoacoustic test using those recordings, 3. the analysis of the data from that test. The data suggests a lack of parity between the noise of the recorded sUAS and that of a set of road vehicles that were also recorded and included in the test, as measured by a set of contemporary noise metrics. |
Andrew Christian Structural Acoustics Branch |
Breakout | Materials | 2018 |
|
Breakout Machine Learning Prediction With Streamed Sensor Data: Fitting Neural Networks using Functional Principal Components (Abstract)
Sensors that record sequences of measurements are now embedded in many products, from wearable exercise watches to chemical and semiconductor manufacturing equipment. There is information in the shapes of the sensor stream curves that is highly predictive of a variety of outcomes, such as the likelihood of a product failure event or batch yield. Despite this data now being common and readily available, it is often used inefficiently or not at all due to a lack of knowledge and tools for how to properly leverage it. In this presentation, we propose fitting splines to sensor streams and extracting features called functional principal component scores that offer a highly efficient low-dimensional compression of the signal data. We then use these features as inputs to machine learning models like neural networks and LASSO regression models. Once one sees sensor data in this light, answering a wide variety of applied questions becomes a straightforward two-stage process: data cleanup and functional feature extraction, followed by modeling using those features as inputs. A minimal sketch of this two-stage workflow appears after this entry. |
Chris Gotwalt | Breakout |
| 2019 |
|
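The two-stage process described above can be sketched end to end: smooth each stream with a spline, compress the smoothed curves to principal component scores (used here as a simple stand-in for functional principal component scores), and feed the scores to a neural network. Data, dimensions, and model settings below are all illustrative assumptions.

```python
# A minimal sketch of the two-stage workflow: spline smoothing, principal
# component compression, then a neural network on the scores.
import numpy as np
from scipy.interpolate import splev, splrep
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 100)
n = 200
shift = rng.normal(0, 0.3, n)                       # hidden curve feature
streams = (np.sin(2 * np.pi * (t[None, :] - shift[:, None]))
           + rng.normal(0, 0.1, (n, 100)))          # noisy sensor streams
y = 2.0 * shift + rng.normal(0, 0.05, n)            # outcome driven by shape

# Stage 1: spline smoothing of each stream, evaluated on a common grid.
smoothed = np.array([splev(t, splrep(t, stream, s=0.5)) for stream in streams])

# Compress the smoothed curves to a few principal component scores.
scores = PCA(n_components=4).fit_transform(smoothed)

# Stage 2: use the scores as inputs to a neural network.
net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net.fit(scores[:150], y[:150])
print("holdout R^2:", round(net.score(scores[150:], y[150:]), 3))
```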
Short Course Survey Construction and Analysis (Abstract)
In this course, we introduce the main concepts of the survey methodology process, from survey sampling design to analyzing the data obtained from complex survey designs. The course topics include: 1. Introduction to the Survey Process; 2. R Tools; 3. Sampling Designs (Simple Random Sampling, Cluster Sampling, Stratified Sampling, and more); 4. Weighting and Variance Estimation; 5. Exploratory Data Analysis; 6. Complex Survey Analysis. We use a combination of lectures and hands-on exercises using R. Students are expected to have R and associated packages installed on their computers; we will send a list of required packages before the course. We also use data from Department of Defense surveys, where appropriate. A small stratified-estimation example appears after this entry. |
MoonJung Cho Bureau of Labor Statistics |
Short Course | Materials | 2018 |
|
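As a taste of course topics 3 and 4 (stratified sampling designs, weighting, and variance estimation), the sketch below computes a design-weighted stratified mean and its standard error. The course itself works these examples in R; Python is used here only to keep one example language, and the strata are made up.

```python
# A small illustration of stratified estimation: each sampled unit carries
# the design weight N_h / n_h, and the variance ignores the finite
# population correction for simplicity.
import numpy as np

# stratum label -> (population size N_h, sampled values)
strata = {
    "A": (1000, np.array([4.2, 3.9, 4.5, 4.1])),
    "B": (4000, np.array([2.1, 2.6, 2.4])),
}

N = sum(N_h for N_h, _ in strata.values())

# Stratified estimate of the population mean.
mean = sum(N_h * y.mean() for N_h, y in strata.values()) / N

# Variance of the stratified mean: sum of (N_h/N)^2 * s_h^2 / n_h.
var = sum((N_h / N) ** 2 * y.var(ddof=1) / len(y)
          for N_h, y in strata.values())

print(f"stratified mean = {mean:.3f}, SE = {np.sqrt(var):.3f}")
```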
Breakout Introduction of Uncertainty Quantification and Industry Challenges (Abstract)
Uncertainty is an inescapable reality that can be found in nearly all types of engineering analyses. It arises from sources like measurement inaccuracies, material properties, boundary and initial conditions, and modeling approximations. For example, the increasing use of numerical simulation models throughout industry promises improved design and insight at significantly lower costs and shorter timeframes than purely physical testing. However, the addition of numerical modeling has also introduced complexity and uncertainty to the process of generating actionable results. It has become not only possible, but vital to include Uncertainty Quantification (UQ) in engineering analysis. The competitive benefits of UQ include reduced development time and cost, improved designs, better understanding of risk, and quantifiable confidence in analysis results and engineering decisions. Unfortunately, there are significant cultural and technical challenges which prevent organizations from utilizing UQ methods and techniques in their engineering practice. This presentation will introduce UQ methodology and discuss the past and present strategies for addressing these challenges, making it possible to use UQ to enhance engineering processes with fewer resources and in more situations. Looking to the future, anticipated challenges will be discussed along with an outline of the path towards making UQ a common practice in engineering. |
Peter Chien | Breakout | 2018 |