Session Title | Speaker | Type | Materials | Year |
---|---|---|---|---|
Breakout Statistical Engineering in Practice (Abstract)
Problems faced in defense and aerospace often go well beyond textbook problems presented in academic settings. Textbook problems are typically very well defined, and can be solved through application of one “correct” tool. Conversely, many defense and aerospace problems are ill-defined, at least initially, and require an overall strategy to attack, one involving multiple tools and often multiple disciplines. Statistical engineering is an approach recently developed for addressing large, complex, unstructured problems, particularly those for which data can be effectively utilized. This session will present a brief overview of statistical engineering, and how it can be applied to engineer solutions to complex problems. Following this introduction, two case studies of statistical engineering will be presented, to illustrate the concepts. |
Roger Hoerl Associate Professor of Statistics Union College (bio)
Dr. Roger W. Hoerl is the Brate-Peschel Associate Professor of Statistics at Union College in Schenectady, NY. Previously, he led the Applied Statistics Lab at GE Global Research. While at GE, Dr. Hoerl led a team of statisticians, applied mathematicians, and computational financial analysts who worked on some of GE’s most challenging research problems, such as developing personalized medicine protocols, enhancing the reliability of aircraft engines, and management of risk for a half-trillion dollar portfolio. Dr. Hoerl has been named a Fellow of the American Statistical Association and the American Society for Quality, and has been elected to the International Statistical Institute and the International Academy for Quality. He has received the Brumbaugh and Hunter Awards, as well as the Shewhart Medal, from the American Society for Quality, and the Founders Award and Deming Lectureship Award from the American Statistical Association. While at GE Global Research, he received the Coolidge Fellowship, honoring one scientist a year from among the four global GE Research and Development sites for lifetime technical achievement. His book with Ron Snee, Statistical Thinking: Improving Business Performance, now in its 3rd edition, was called “the most practical introductory statistics textbook ever published in a business context” by the journal Technometrics. |
Breakout |
Recording | 2021 |
Breakout Statistical Engineering in Practice (Abstract)
Problems faced in defense and aerospace often go well beyond textbook problems presented in academic settings. Textbook problems are typically very well defined, and can be solved through application of one “correct” tool. Conversely, many defense and aerospace problems are ill-defined, at least initially, and require an overall strategy to attack, one involving multiple tools and often multiple disciplines. Statistical engineering is an approach recently developed for addressing large, complex, unstructured problems, particularly those for which data can be effectively utilized. This session will present a brief overview of statistical engineering, and how it can be applied to engineer solutions to complex problems. Following this introduction, two case studies of statistical engineering will be presented, to illustrate the concepts. |
Angie Patterson Chief Consulting Engineer GE Aviation |
Breakout |
Recording | 2021 |
Breakout Certification by Analysis: A 20-year Vision for Virtual Flight and Engine Testing (Abstract)
Analysis-based means of compliance for airplane and engine certification, commonly known as “Certification by Analysis” (CbA), provides a strong motivation for the development and maturation of current and future flight and engine modeling technology. The most obvious benefit of CbA is streamlined product certification testing programs at lower cost while maintaining equivalent levels of safety. The current state of technologies and processes for analysis is not sufficient to adequately address most aspects of CbA today, and concerted efforts to drastically improve analysis capability are required to fully bring the benefits of CbA to fruition. While the short-term cost and schedule benefits of reduced flight and engine testing are clearly visible, the fidelity of analysis capability required to realize CbA across a much larger percentage of product certification is not yet sufficient. Higher-fidelity analysis can help shorten the product development cycle and avoid costly and unpredictable performance and operability surprises that sometimes happen late in the development cycle. Perhaps the greatest long-term value afforded by CbA is the potential to accelerate the introduction of more aerodynamically and environmentally efficient products to market, benefitting not just manufacturers, but also airlines, passengers, and the environment. A far-reaching vision for CbA has been constructed to offer guidance in developing lofty yet realizable expectations regarding technology development and maturity through stakeholder involvement. This vision is composed of four elements: (1) the ability to numerically simulate the integrated system performance and response of full-scale airplane and engine configurations in an accurate, robust, and computationally efficient manner; (2) the development of quantified flight and engine modeling uncertainties to establish appropriate confidence in the use of numerical analysis for certification;
(3) the rigorous validation of flight and engine modeling capabilities against full-scale data from critical airplane and engine testing; and (4) the use of flight and engine modeling to enable Certification by Simulation. Key technical challenges include the ability to accurately predict airplane and engine performance for a single discipline, the robust and efficient integration of multiple disciplines, and the appropriate modeling of system-level assessment. Current modeling methods lack the capability to adequately model conditions that exist at the edges of the operating envelope, where the majority of certification testing generally takes place. Additionally, large-scale engine or airplane multidisciplinary integration has not matured to the level where it can be reliably used to efficiently model the intricate interactions that exist in current or future aerospace products. Logistical concerns center primarily on the future High Performance Computing capability required to perform the large number of computationally intensive simulations needed for CbA. Complex, time-dependent, multidisciplinary analyses will require a computing capacity increase several orders of magnitude greater than is currently available. Developing methods to ensure credible simulation results is critically important for regulatory acceptance of CbA. Confidence in analysis methodology and solutions is examined so that application validation cases can be properly identified. Other means of measuring confidence, such as uncertainty quantification and “validation-domain” approaches, may increase the credibility and trust in the predictions. Certification by Analysis is a challenging long-term endeavor that will motivate many areas of simulation technology development, while driving the potential to decrease cost, improve safety, and improve airplane and engine efficiency. 
Requirements to satisfy certification regulations provide a measurable definition for the types of analytical capabilities required for success. There is general optimism that CbA is a goal that can be achieved, and that a significant amount of flight testing can be reduced in the next few decades. |
Timothy Mauery Boeing (bio)
For the past 20 years, Timothy Mauery has been involved in the development of low-speed CFD design processes. In this capacity, he has had the opportunity to interact with users and provide CFD support and training throughout the product development cycle. Prior to moving to the Commercial Airplanes division of The Boeing Company, he worked at the Lockheed Martin Aircraft Center, providing aerodynamic liaison support on a variety of military modification and upgrade programs. At Boeing, he has had the opportunity to support both future products as well as existing programs with CFD analysis and wind tunnel testing. Over the past ten years, he has been closely involved in the development and evaluation of analysis-based certification processes for commercial transport vehicles, for both derivative programs as well as new airplanes. Most recently he was the principal investigator on a NASA research announcement for developing requirements for airplane certification by analysis. Timothy received his bachelor’s degree from Brigham Young University, and his master’s degree from The George Washington University, where he was also a research assistant at NASA-Langley. |
Breakout |
| 2021 |
Breakout Dashboard for Equipment Failure Reports (Abstract)
Equipment Failure Reports (EFRs) describe equipment failures and the steps taken as a result of these failures. EFRs contain both structured and unstructured data. Currently, analysts manually read through EFRs to understand failure modes and make recommendations to reduce future failures. This is a tedious process in which important trends and information can get lost. This motivated the creation of an interactive dashboard that extracts relevant information from the unstructured (i.e., free-form text) data and combines it with structured data like failure date, corrective action, and part number. The dashboard is an RShiny application that utilizes numerous text mining and visualization packages, including tm, plotly, edgebundler, and topicmodels. It allows the end user to filter to the EFRs they care about and visualize metadata, such as the geographic region where a failure occurred, over time, allowing previously unknown trends to be seen. The dashboard also applies topic modeling to the unstructured data to identify key themes. Analysts are now able to quickly identify frequent failure modes and look at time- and region-based trends in these common equipment failures. |
Robert Cole Molloy Johns Hopkins University Applied Physics Laboratory (bio)
Robert Molloy is a data scientist for the Johns Hopkins University Applied Physics Laboratory’s Systems Analysis Group, where he supports a variety of projects including text mining on unstructured text data, applying machine learning techniques to text and signal data, and implementing and modifying existing natural language models. He graduated from the University of Maryland, College Park in May 2020 with a dual degree in computer science and mathematics with a concentration in statistics. |
Breakout |
| 2021 |
Breakout A Framework for Efficient Operational Testing through Bayesian Adaptive Design (Abstract)
When developing a system, it is important to consider system performance from a user perspective. This can be done through operational testing—assessing the ability of representative users to satisfactorily accomplish tasks or missions with the system in operationally representative environments. This process can be expensive and time-consuming, but is critical for evaluating a system. We show how an existing design of experiments (DOE) process for operational testing can be leveraged to construct a Bayesian adaptive design. This method, nested within the larger design created by the DOE process, allows interim analyses using predictive probabilities to stop testing early for success or futility. Furthermore, operational environments with varying encounter probabilities are used directly in product evaluation. Representative simulations demonstrate how these interim analyses can be used in an operational test setting, and reductions in necessary test events are shown. The method allows for using either weakly informative priors when data from previous testing is not available, or priors built from developmental testing data when it is. The proposed method for creating priors from developmental testing data allows more flexibility in which data can be incorporated into analysis than the current process does, and demonstrates that it is possible to get more precise parameter estimates. This method will allow future testing to be conducted in less time and at less expense, on average, without compromising the ability of the existing process to verify that the system meets the user’s needs. |
Victoria Sieck Student / Operations Research Analyst University of New Mexico / Air Force Institute of Technology (bio)
Victoria R.C. Sieck is a PhD Candidate in Statistics at the University of New Mexico. She is also an Operations Research Analyst in the US Air Force (USAF), with experiences in the USAF testing community as a weapons and tactics analyst and an operational test analyst. Her research interests include design of experiments and improving operational testing through the use of Bayesian methods. |
Breakout |
| 2021 |
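The interim-analysis idea from the abstract above can be illustrated with a minimal Beta-Binomial sketch. This is not the authors' exact procedure: the requirement threshold `p0`, the posterior cutoff `gamma`, the stopping cutoffs, and the trial counts are all invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def pred_prob_success(s, n, n_total, p0=0.8, gamma=0.9,
                      a=1.0, b=1.0, draws=5000):
    """Predictive probability that the completed test will conclude
    P(p > p0 | all data) >= gamma, given s successes in n of n_total
    planned binary mission trials. Beta(a, b) prior on the success
    probability p; remaining trials are simulated from the posterior."""
    n_rem = n_total - n
    p_draw = rng.beta(a + s, b + n - s, size=draws)          # posterior draws
    future_s = rng.binomial(n_rem, p_draw)                   # simulated futures
    final_post = 1.0 - stats.beta.cdf(p0, a + s + future_s,
                                      b + n_total - s - future_s)
    return float(np.mean(final_post >= gamma))

# Interim look: 14 successes in 15 trials of a planned 30 (invented numbers)
pp = pred_prob_success(s=14, n=15, n_total=30)

decision = "continue"
if pp > 0.95:          # illustrative early-stopping cutoffs
    decision = "stop for success"
elif pp < 0.05:
    decision = "stop for futility"
```

A prior built from developmental testing would simply replace the weakly informative `Beta(1, 1)` with parameters reflecting that earlier data.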
Breakout Cybersecurity Metrics and Quantification: Problems, Some Results, and Research Directions (Abstract)
Cybersecurity metrics and quantification is a fundamental but notoriously hard problem. It is one of the pillars underlying the emerging Science of Cybersecurity. In this talk, I will describe a number of cybersecurity metrics and quantification research problems that are encountered in evaluating the effectiveness of a range of cyber defense tools. I will review the research results we have obtained over the past years. I will also discuss future research directions, including those being undertaken in my research group. |
Shouhuai Xu Professor University of Colorado Colorado Springs (bio)
Shouhuai Xu is the Gallogly Chair Professor in the Department of Computer Science, University of Colorado Colorado Springs (UCCS). Prior to joining UCCS, he was with the Department of Computer Science, University of Texas at San Antonio. He pioneered a systematic approach, dubbed Cybersecurity Dynamics, to modeling and quantifying cybersecurity from a holistic perspective. This approach has three orthogonal research thrusts: metrics (for quantifying security, resilience and trustworthiness/uncertainty, to which this talk belongs), cybersecurity data analytics, and cybersecurity first-principle modeling (for seeking cybersecurity laws). His research has won a number of awards, including the 2019 worldwide adversarial malware classification challenge organized by the MIT Lincoln Lab. His research has been funded by AFOSR, AFRL, ARL, ARO, DOE, NSF and ONR. He co-initiated the International Conference on Science of Cyber Security (SciSec) and is serving as its Steering Committee Chair. He has served as Program Committee co-chair for a number of international conferences and as Program Committee member for numerous international conferences. He is/was an Associate Editor of IEEE Transactions on Dependable and Secure Computing (IEEE TDSC), IEEE Transactions on Information Forensics and Security (IEEE T-IFS), and IEEE Transactions on Network Science and Engineering (IEEE TNSE). More information about his research can be found at https://xu-lab.org. |
Breakout | Materials | 2021 |
Breakout An Adaptive Approach to Shock Train Detection (Abstract)
Development of new technology always incorporates model testing. This is certainly true for hypersonics, where flight tests are expensive and testing of component- and system-level models has significantly advanced the field. Unfortunately, model tests are often limited in scope, being only approximations of reality and typically only partially covering the range of potential realistic conditions. In this talk, we focus on the problem of real-time detection of the shock train leading edge in high-speed air-breathing engines, such as dual-mode scramjets. Detecting and controlling the shock train leading edge is important to the performance and stability of such engines, and a problem that has seen significant model testing on the ground and some flight testing. Often, methods developed for shock train detection are specific to the model used. Thus, they may not generalize well when tested in another facility or in flight, as they typically require a significant amount of prior characterization of the model and flow regime. Such characterization data can be difficult or impossible to obtain if the isolator operating regime is large. A successful method for shock train detection needs to be robust to changes in features like isolator geometry, inlet and combustor states, flow regimes, and available sensors. To this end, we propose an approach for real-time detection of the isolator shock train. Our approach uses real-time pressure measurements to adaptively estimate the shock train position in a data-driven manner. We show that the method works well across different isolator models, placements of pressure transducers, and flow regimes. We believe that a data-driven approach is the way forward for bridging the gap between testing and reality, saving development time and money. |
Greg Hunt Assistant Professor William & Mary (bio)
Greg is an interdisciplinary researcher who builds scientific tools. He is trained as a statistician, mathematician, and computer scientist. Currently he works on a diverse set of problems in biology, physics, and engineering. |
Breakout |
| 2021 |
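The core signal-processing idea — locating the shock train leading edge from a row of wall pressure taps — can be sketched simply. The abstract does not specify the authors' estimator; the code below is a basic data-driven proxy (steepest rise of a smoothed axial pressure profile), and the pressure profile is synthetic.

```python
import numpy as np

def shock_front_location(x, p, window=3):
    """Estimate the shock train leading-edge location from wall pressure
    taps: smooth the axial pressure profile and return the tap location
    with the steepest pressure rise. A simple proxy, not the talk's method."""
    kernel = np.ones(window) / window
    p_s = np.convolve(p, kernel, mode="same")        # moving-average smoothing
    grad = np.gradient(p_s, x)                        # axial pressure gradient
    interior = slice(window, len(x) - window)         # skip smoothing edge artifacts
    return x[interior][np.argmax(grad[interior])]

# Synthetic isolator profile: flat upstream, sharp pressure rise near x = 0.6
x = np.linspace(0.0, 1.0, 50)                         # normalized tap positions
p = 1.0 + 2.0 / (1.0 + np.exp(-40.0 * (x - 0.6)))
est = shock_front_location(x, p)
```

A real-time version would apply this to each new pressure scan and track the estimate adaptively as flow conditions change.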
Breakout Estimating Pure-Error from Near Replicates in Design of Experiments (Abstract)
In design of experiments, setting exact replicates of factor settings enables estimation of pure error, a model-independent estimate of experimental error useful for communicating inherent system noise and testing for model lack-of-fit. Often in practice, the factor levels for replicates are precisely measured rather than precisely set, resulting in near-replicates. This can inflate estimates of pure error due to uncompensated set-point variation. In this article, we review previous strategies for estimating pure error from near-replicates and propose a simple alternative. We derive key analytical properties and investigate them via simulation. Finally, we illustrate the new approach with an application. |
Caleb King Research Statistician Developer SAS Institute |
Breakout |
| 2021 |
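With exact replicates, pure error is the within-group variation about each replicate group's mean. A minimal sketch of that baseline computation follows, with invented numbers; when the settings are only near-replicates, this naive estimate absorbs set-point variation and is inflated, which is the problem the talk addresses.

```python
import numpy as np

def pure_error(groups):
    """Pure-error sum of squares and degrees of freedom from replicate
    groups: within-group squared deviations about each group mean."""
    ss = sum(float(np.sum((np.asarray(g) - np.mean(g)) ** 2)) for g in groups)
    df = sum(len(g) - 1 for g in groups)
    return ss, df

# Responses at two (nominally) replicated factor settings -- invented data
groups = [[10.1, 9.8, 10.3], [15.2, 15.0]]
ss, df = pure_error(groups)
mse_pure = ss / df      # model-independent estimate of the error variance
```

In a lack-of-fit test, `mse_pure` is the denominator against which the model's lack-of-fit mean square is compared.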
Breakout Uncertainty Quantification and Sensitivity Analysis Methodology for AJEM (Abstract)
The Advanced Joint Effectiveness Model (AJEM) is a joint forces model developed by the U.S. Army that is used in vulnerability and lethality (V/L) predictions for threat/target interactions. This complex model primarily generates a probability response for various components, scenarios, loss of capabilities, or summary conditions. Sensitivity analysis (SA) and uncertainty quantification (UQ), referred to jointly as SA/UQ, are disciplines that provide the framework for understanding how model estimates change with respect to changes in input variables. A comparative measure characterizing the effect of an input change on the predicted outcome was developed; it is reviewed and illustrated in this presentation. This measure provides a practical context that stakeholders can better understand and utilize. We show graphical and tabular results using this measure. |
Craig Andres Mathematical Statistician U.S. Army CCDC Data & Analysis Center (bio)
Craig Andres is a Mathematical Statistician at the recently formed DEVCOM Data & Analysis Center in the Materiel M&S Branch, working primarily on the uncertainty quantification, as well as the verification and validation, of the AJEM vulnerability model. He is currently on developmental assignment with the Capabilities Projection Team. He has a master’s degree in Applied Statistics from Oakland University and a master’s degree in Mathematics from Western Michigan University. |
Breakout |
| 2021 |
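The abstract above does not define the comparative measure, so the sketch below shows only the generic one-at-a-time (OAT) idea — the change in a predicted probability when a single input shifts from baseline — with an invented logistic response surface standing in for an AJEM probability output.

```python
import numpy as np

def oat_effect(model, x0, i, delta):
    """One-at-a-time comparative measure: change in the predicted
    probability when input i is shifted by delta from the baseline x0."""
    x1 = np.array(x0, dtype=float)
    x1[i] += delta
    return float(model(x1) - model(np.asarray(x0, dtype=float)))

# Invented logistic surface standing in for a probability-of-loss response
model = lambda x: 1.0 / (1.0 + np.exp(-(0.8 * x[0] - 0.3 * x[1])))

x0 = [0.0, 0.0]                                   # baseline input setting
effects = [oat_effect(model, x0, i, 0.5) for i in range(2)]
```

Tabulating such effects across inputs and scenarios is one way to produce the kind of graphical and tabular comparisons the presentation describes.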
Breakout Surrogate Models and Sampling Plans for Multi-fidelity Aerodynamic Performance Databases (Abstract)
Generating aerodynamic coefficients can be computationally expensive, especially for the viscous CFD solvers in which multiple complex models are iteratively solved. When filling large design spaces, utilizing only a high accuracy viscous CFD solver can be infeasible. We apply state-of-the-art methods for design and analysis of computer experiments to efficiently develop an emulator for high-fidelity simulations. First, we apply a cokriging model to leverage information from fast low-fidelity simulations to improve predictions with more expensive high-fidelity simulations. Combining space-filling designs with a Gaussian process model-based sequential sampling criterion allows us to efficiently generate sample points and limit the number of costly simulations needed to achieve the desired model accuracy. We demonstrate the effectiveness of these methods with an aerodynamic simulation study using a conic shape geometry. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. Release Number: LLNL-ABS-818163 |
Kevin Quinlan Applied Statistician Lawrence Livermore National Laboratory |
Breakout |
| 2021 |
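The sequential-sampling idea in the abstract above can be sketched in one dimension. The work combines cokriging across fidelities with space-filling designs; the sketch below shows only the single-fidelity sequential piece — fit a Gaussian process emulator, then run the "expensive" simulator where predictive uncertainty is largest — with an invented 1-D test function in place of a CFD solver.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy "expensive" simulator: one aerodynamic coefficient vs. one input
f = lambda a: np.sin(3.0 * a) + 0.5 * a

X = np.array([[0.0], [0.5], [1.0]])              # small space-filling start
y = f(X.ravel())
cand = np.linspace(0.0, 1.0, 101).reshape(-1, 1)  # candidate design points

# Fixed kernel (optimizer=None) keeps this toy loop deterministic
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                              alpha=1e-8, optimizer=None)
gp.fit(X, y)
_, sd0 = gp.predict(cand, return_std=True)        # initial uncertainty

for _ in range(5):                                # sequential design loop
    gp.fit(X, y)
    _, sd = gp.predict(cand, return_std=True)
    x_new = cand[np.argmax(sd)]                   # simulate where the emulator
    X = np.vstack([X, x_new])                     # is least certain
    y = np.append(y, f(x_new[0]))

gp.fit(X, y)
_, sd_final = gp.predict(cand, return_std=True)   # uncertainty after sampling
```

In the multi-fidelity setting, a cokriging model would replace the plain GP, letting cheap low-fidelity runs inform where the few costly high-fidelity runs are placed.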
2019-06-24