Session Title | Speaker | Type | Recording | Materials | Year |
---|---|---|---|---|---|
*Virtual Speaker* Epistemic and Aleatoric Uncertainty Quantification for Gaussian Processes (Abstract)
One of the major advantages of using Gaussian Processes for regression and surrogate modeling is the availability of uncertainty bounds for the predictions. However, the validity of these bounds depends on using an appropriate covariance function, which is usually learned from the data out of a parametric family through maximum likelihood estimation or cross-validation methods. In practice, the data might not contain enough information to select the best covariance function, generating uncertainty in the hyperparameters, which translates into an additional layer of uncertainty in the predictions (epistemic uncertainty) on top of the usual posterior covariance (aleatoric uncertainty). In this talk, we discuss accounting for both types of uncertainty in UQ, and we quantify them by extending the MLE paradigm using a game-theoretical framework that identifies the worst-case prior under a likelihood constraint specified by the practitioner. |
Pau Batlle PhD Student California Institute of Technology (bio)
Pau is a PhD student in Computing and Mathematical Sciences at Caltech, advised by Houman Owhadi. His main research area is Game-theoretical Uncertainty Quantification (UQ) and Gaussian Process Regression (GPR), both from a theoretical point of view and with applications to the Physical Sciences, including collaboration with scientists from the Machine Learning and Uncertainty Quantification groups at NASA Jet Propulsion Laboratory. Before joining Caltech, he graduated from Universitat Politècnica de Catalunya with a double degree in Mathematics and Engineering Physics as part of the CFIS program and he was a research intern at the Center for Data Science at NYU for 9 months. |
2023 |
|||
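The aleatoric/epistemic distinction above can be illustrated with a small numerical sketch (not the speaker's method or code): the aleatoric part is the usual GP posterior spread at one fitted kernel, while an epistemic layer appears when several hyperparameter values remain plausible given the data. All names, length-scales, and data below are illustrative assumptions.

```python
# Minimal sketch: GP posterior variance (aleatoric) at a fixed length-scale, plus the
# spread of posterior means across equally plausible length-scales (epistemic).
import numpy as np

def rbf(x1, x2, ell, sig=1.0):
    d = x1[:, None] - x2[None, :]
    return sig**2 * np.exp(-0.5 * (d / ell)**2)

def gp_posterior(xtr, ytr, xte, ell, noise=1e-2):
    K = rbf(xtr, xtr, ell) + noise * np.eye(len(xtr))
    Ks = rbf(xtr, xte, ell)
    Kss = rbf(xte, xte, ell)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    v = np.linalg.solve(L, Ks)
    mean = Ks.T @ alpha
    cov = Kss - v.T @ v
    return mean, np.sqrt(np.maximum(np.diag(cov), 0.0))

rng = np.random.default_rng(0)
xtr = rng.uniform(0, 1, 8)
ytr = np.sin(2 * np.pi * xtr) + 0.1 * rng.standard_normal(8)
xte = np.linspace(0, 1, 50)

# Aleatoric: posterior std at the "best" length-scale (stand-in for the MLE fit).
_, sd_mle = gp_posterior(xtr, ytr, xte, ell=0.2)

# Epistemic: spread of posterior means across length-scales the data cannot distinguish.
means = np.array([gp_posterior(xtr, ytr, xte, ell)[0] for ell in (0.1, 0.2, 0.4)])
epistemic = means.max(axis=0) - means.min(axis=0)
print(sd_mle.mean(), epistemic.mean())
```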
Presentation *Virtual Speaker* Using Changepoint Detection and Artificial Intelligence to Classify Fuel Pressure Behaviors in Aerial Refueling (Abstract)
An open question in aerial refueling system test and evaluation is how to classify fuel pressure states and behaviors reproducibly and defensibly when visually inspecting the data stream. This question exists because fuel pressure data streams are highly stochastic, may exhibit multiple types of troublesome behavior simultaneously in a single stream, and may exhibit unique platform-dependent discernable behaviors. These data complexities result in differences in fuel pressure behavior classification determinations between engineers based on experience level and individual judgement. In addition to consuming valuable time, discordant judgements between engineers reduce confidence in metrics and other derived analytic products that are used to evaluate the system's performance. The Pruned Exact Linear Time (PELT) changepoint detection algorithm is an unsupervised machine learning method that, when coupled with an expert system AI, has provided a consistent and reproducible solution in classifying various fuel pressure states and behaviors with adjustable sensitivity. |
Nelson Walker and Michelle Ouellette Mathematical Statistician United States Air Force (bio)
Dr. Walker is a statistician for the United States Air Force at the 412th Test Wing at Edwards AFB, California. He graduated with a PhD in statistics from Kansas State University in 2021. Ms. Ouellette is the Data Science and Statistics Lead at the 418th Global Reach Flight Test Squadron for the United States Air Force at the 412th Test Wing at Edwards AFB, California. She graduated with an M.S. in Statistics from California State University, Fullerton in 2018. |
Presentation | Materials | 2023 |
|
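A minimal sketch of the PELT building block on synthetic data, assuming the open-source ruptures package; the simple mean-based labeling at the end stands in for the expert-system layer described in the abstract and is not the authors' implementation.

```python
import numpy as np
import ruptures as rpt

rng = np.random.default_rng(1)
# Synthetic "fuel pressure" stream: three mean levels plus noise (stand-in data only).
signal = np.concatenate([rng.normal(50, 2, 200),
                         rng.normal(35, 2, 150),
                         rng.normal(55, 2, 250)])

# PELT with an RBF cost; the penalty acts as the adjustable sensitivity knob.
algo = rpt.Pelt(model="rbf", min_size=20).fit(signal.reshape(-1, 1))
breakpoints = algo.predict(pen=10)          # indices where segments end
print(breakpoints)

# A toy "expert system" layer: label each detected segment by its mean pressure.
for start, end in zip([0] + breakpoints[:-1], breakpoints):
    mean = signal[start:end].mean()
    label = "low" if mean < 40 else "nominal" if mean < 52 else "high"
    print(start, end, round(mean, 1), label)
```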
Speed Presentation A Bayesian Approach for Nonparametric Multivariate Process Monitoring using Universal Residuals (Abstract)
In Quality Control, monitoring sequential functional observations for characteristic changes through change-point detection is a common practice to ensure that a system or process produces high-quality outputs. Existing methods in this field often only focus on identifying when a process is out-of-control without quantifying the uncertainty of the underlying decision-making processes. To address this issue, we propose using universal residuals under a Bayesian paradigm to determine if the process is out-of-control and assess the uncertainty surrounding that decision. The universal residuals are computed by combining two non-parametric techniques: regression trees and kernel density estimation. These residuals have the key feature of being uniformly distributed when the process is in control. To test if the residuals are uniformly distributed across time (i.e., that the process is in-control), we use a Bayesian approach for hypothesis testing, which outputs posterior probabilities for events such as the process being out-of-control at the current time, in the past, or in the future. We perform a simulation study and demonstrate that the proposed methodology has a high detection rate and a low false alarm rate. |
Daniel Timme PhD Candidate Florida State University (bio)
Daniel A. Timme is currently a PhD candidate in Statistics at Florida State University. Mr. Timme graduated with a BS in Mathematics from the University of Houston and a BS in Business Management from the University of Houston-Clear Lake. He earned an MS in Systems Engineering with a focus in Reliability and a second MS in Space Systems with focuses in Space Vehicle Design and Astrodynamics, both from the Air Force Institute of Technology. Mr. Timme’s research interest is primarily focused in the areas of reliability engineering, applied mathematics and statistics, optimization, and regression. |
Speed Presentation | Materials | 2023 |
|
Presentation A Bayesian Decision Theory Framework for Test & Evaluation (Abstract)
Decisions form the core of T&E: decisions about which tests to conduct and, especially, decisions on whether to accept or reject a system at its milestones. The traditional approach to acceptance is based on conducting tests under various conditions to ensure that key performance parameters meet certain thresholds with the required degree of confidence. In this approach, data is collected during testing, then analyzed with techniques from classical statistics in a post-action report. This work explores a new Bayesian paradigm for T&E based on one simple principle: maintaining a model of the probability distribution over system parameters at every point during testing. In particular, the Bayesian approach posits a distribution over parameters prior to any testing. This prior distribution provides (a) the opportunity to incorporate expert scientific knowledge into the inference procedure, and (b) transparency regarding all assumptions being made. Once a prior distribution is specified, it can be updated as tests are conducted to maintain a probability distribution over the system parameters at all times. One can leverage this probability distribution in a variety of ways to produce analytics with no analog in the traditional T&E framework. In particular, having a probability distribution over system parameters at any time during testing enables one to implement an optimal decision-making procedure using Bayesian Decision Theory (BDT). BDT accounts for the cost of various testing options relative to the potential value of the system being tested. When testing is expensive, it provides guidance on whether to conserve resources by ending testing early. It evaluates the potential benefits of testing for both its ability to inform acceptance decisions and for its intrinsic value to the commander of an accepted system. This talk describes the BDT paradigm for T&E and provides examples of how it performs in simple scenarios. In future work we plan to extend the paradigm to include the features, the phenomena, and the SME elicitation protocols necessary to address realistic T&E cases. |
James Ferry Senior Research Scientist Metron, Inc. (bio)
Dr. James Ferry has been developing Bayesian analytics at Metron for 18 years. He has been the Principal Investigator for a variety of R&D projects that apply Bayesian methods to data fusion, network science, and machine learning. These projects include associating disparate data types from multiple sensors for missile defense, developing methods to track hidden structures on dynamically changing networks, computing incisive analytics efficiently from information in large databases, and countering adversarial attacks on neural-network-based image classifiers. Dr. Ferry was active in the network science community in the 2010s. He organized a full-day special session on network science at FUSION 2015, co-organized WIND 2016 (Workshop on Incomplete Network Data), and organized a multi-day session on the Frontiers of Networks at the MORS METSM meeting in December 2016. Since then, his focus has been Bayesian analytics and network science algorithms for the Intelligence Community. Prior to Metron, Dr. Ferry was a computational fluid dynamicist. He developed models and supercomputer simulations of the multiphase fluid dynamics of rocket engines at the Center for Simulation of Advanced Rockets at UIUC. He has 30+ technical publications in fluid dynamics, network science, and Bayesian analytics. He holds a B.S. in Mathematics from M.I.T. and a Ph.D. in Applied Mathematics from Brown University. |
Presentation | Materials | 2023 |
|
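For readers new to Bayesian decision theory, a toy sketch of the core idea (not Metron's framework): maintain a posterior over a system parameter, then compare the posterior expected costs of accepting now, rejecting now, or paying for one more test. All costs, thresholds, and test counts are invented for illustration.

```python
import numpy as np
from scipy import stats

a, b = 2.0, 2.0                 # Beta prior over the system's success probability
successes, failures = 17, 3     # test results so far
post = stats.beta(a + successes, b + failures)

threshold = 0.80                # required success probability
c_accept_bad = 100.0            # cost of fielding a system below threshold
c_reject_good = 40.0            # cost of rejecting a system that actually meets it
c_test = 2.0                    # cost of one more test shot

p_below = post.cdf(threshold)
exp_cost_accept = c_accept_bad * p_below
exp_cost_reject = c_reject_good * (1 - p_below)

# Preposterior analysis for one more test: average the best terminal decision over the
# predictive distribution of that test's outcome.
p_success = post.mean()
costs_next = []
for outcome, prob in ((1, p_success), (0, 1 - p_success)):
    nxt = stats.beta(a + successes + outcome, b + failures + (1 - outcome))
    pb = nxt.cdf(threshold)
    costs_next.append(prob * min(c_accept_bad * pb, c_reject_good * (1 - pb)))
exp_cost_test = c_test + sum(costs_next)

print({"accept": exp_cost_accept, "reject": exp_cost_reject, "test again": exp_cost_test})
```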
Speed Presentation A Bayesian Optimal Experimental Design for High-dimensional Physics-based Models (Abstract)
Many scientific and engineering experiments are developed to study specific questions of interest. Unfortunately, time and budget constraints make operating these controlled experiments over wide ranges of conditions intractable, thus limiting the amount of data collected. In this presentation, we discuss a Bayesian approach to identify the most informative conditions, based on the expected information gain. We will present a framework for finding optimal experimental designs that can be applied to physics-based models with high-dimensional inputs and outputs. We will study a real-world example where we aim to infer the parameters of a chemically reacting system, but there are uncertainties in both the model and the parameters. A physics-based model was developed to simulate the gas-phase chemical reactions occurring between highly reactive intermediate species in a high-pressure photolysis reactor coupled to a vacuum-ultraviolet (VUV) photoionization mass spectrometer. This time-of-flight mass spectrum evolves in both kinetic time and VUV energy producing a high-dimensional output at each design condition. The high-dimensional nature of the model output poses a significant challenge for optimal experimental design, as a surrogate model is built for each output. We discuss how accurate low-dimensional representations of the high-dimensional mass spectrum are necessary for computing the expected information gain. Bayesian optimization is employed to maximize the expected information gain by efficiently exploring a constrained design space, taking into account any constraint on the operating range of the experiment. Our results highlight the trade-offs involved in the optimization, the advantage of using optimal designs, and provide a workflow for computing optimal experimental designs for high-dimensional physics-based models. |
James Oreluk Postdoctoral Researcher Sandia National Laboratories (bio)
James Oreluk is a postdoctoral researcher at Sandia National Laboratories in Livermore, CA. He earned his Ph.D. in Mechanical Engineering from UC Berkeley with research on developing optimization methods for validating physics-based models. His current research focuses on advancing uncertainty quantification and machine learning methods to efficiently solve complex problems, with recent work on utilizing low-dimensional representation for optimal decision making. |
Speed Presentation | Materials | 2023 |
|
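A hedged sketch of the expected-information-gain criterion mentioned above, using a toy one-parameter model and a nested Monte Carlo estimator rather than the authors' surrogate-accelerated workflow; the forward model, prior, and candidate designs are stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.1                                  # observation noise std

def g(theta, d):
    return np.exp(-theta * d)                # toy forward model (stand-in simulator)

def eig(d, n_outer=400, n_inner=400):
    theta_out = rng.gamma(2.0, 1.0, n_outer)                 # prior draws
    y = g(theta_out, d) + sigma * rng.standard_normal(n_outer)
    theta_in = rng.gamma(2.0, 1.0, n_inner)
    # log-likelihood under the generating theta, minus the log marginal likelihood
    log_like = -0.5 * ((y - g(theta_out, d)) / sigma) ** 2
    like_in = np.exp(-0.5 * ((y[:, None] - g(theta_in[None, :], d)) / sigma) ** 2)
    log_evid = np.log(like_in.mean(axis=1) + 1e-300)
    return np.mean(log_like - log_evid)

designs = np.linspace(0.1, 3.0, 8)           # candidate experimental conditions
scores = [eig(d) for d in designs]
print(designs[int(np.argmax(scores))], max(scores))
```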
Speed Presentation A Data-Driven Approach of Uncertainty Quantification on Reynolds Stress Based on DNS Turbulence Data (Abstract)
High-fidelity simulation capabilities have progressed rapidly over the past decades in computational fluid dynamics (CFD), resulting in plenty of high-resolution flow field data. Uncertainty quantification remains an unsolved problem due to the high-dimensional input space and the intrinsic complexity of turbulence. Here we developed an uncertainty quantification method to model the Reynolds stress based on Karhunen-Loeve Expansion (KLE) and Projection Pursuit basis Adaptation polynomial chaos expansion (PPA). First, different representative volume elements (RVEs) were randomly drawn from the flow field, and KLE was used to reduce them into a moderate dimension. Then, we built polynomial chaos expansions of Reynolds stress using PPA. Results show that this method can yield a surrogate model with a test accuracy of up to 90%. PCE coefficients also show that Reynolds stress strongly depends on second-order KLE random variables instead of first-order terms. Regarding data efficiency, we built another surrogate model using a neural network (NN) and found that our method outperforms NN in limited data cases. |
Zheming Gou Graduate Research assistant University of Southern California (bio)
Zheming Gou, a Ph.D. student in the Department of Mechanical Engineering at the University of Southern California, is a highly motivated individual with a passion for high-fidelity simulations, uncertainty quantification, and machine learning, especially in high dimensions and rare data scenarios. Currently, Zheming Gou is engaged in research to build probabilistic models for multiscale simulations using tools including polynomial chaos expansion (PCE) and state-of-the-art machine learning methods. |
Speed Presentation | Materials | 2023 |
|
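A greatly simplified stand-in for the KLE-plus-PCE workflow (not the authors' projection pursuit adaptation): reduce high-dimensional inputs with a PCA/KLE step, then fit a low-order polynomial surrogate in the retained coordinates. Data, dimensions, and the quantity of interest are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, dim, k = 400, 200, 5
modes = rng.standard_normal((k, dim))                       # synthetic spatial modes
coeffs = rng.standard_normal((n, k))                        # per-sample mode coefficients
X = coeffs @ modes + 0.05 * rng.standard_normal((n, dim))   # high-dimensional "RVE" inputs
y = np.sin(coeffs[:, 0]) + 0.5 * coeffs[:, 1] ** 2 + 0.05 * rng.standard_normal(n)  # toy QoI

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
surrogate = make_pipeline(PCA(n_components=k),        # KLE-like dimension reduction
                          PolynomialFeatures(degree=3),
                          LinearRegression())          # crude stand-in for a PCE fit
surrogate.fit(Xtr, ytr)
print("test R^2:", surrogate.score(Xte, yte))
```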
Speed Presentation A generalized influence maximization problem (Abstract)
The influence maximization problem is a popular topic in social networks with several applications in viral marketing and epidemiology. One possible way to understand the problem is from the perspective of a marketer who wants to achieve the maximum influence on a social network by choosing an optimum set of nodes of a given size as seeds. The marketer actively influences these seeds, followed by a passive viral process based on a certain influence diffusion model, in which influenced nodes influence other nodes without external intervention. Kempe et al. showed that a greedy algorithm-based approach can provide a (1-1/e)-approximation guarantee compared to the optimal solution if the influence spreads according to the Triggering model. In our current work, we consider a much more general problem in which the goal is to maximize the total expected reward obtained from the nodes that are influenced by a given time (which may be finite or infinite). In this setting, the reward obtained by influencing a set of nodes can depend on the set itself (not necessarily a sum of rewards from the individual nodes) as well as the times at which each node gets influenced; the seeds may be restricted to a subset of the network; multiple units of budget may be assigned to a single node (where the maximum number of budget units that may be assigned to a node can depend on the node); and a seeded node actually gets influenced with a probability that is a non-decreasing function of the number of budget units assigned to that node. We have formulated a greedy algorithm that provides a (1-1/e)-approximation guarantee compared to the optimal solution of this generalized influence maximization problem if the influence spreads according to the Triggering model. |
Sumit Kumar Kar Ph.D. candidate University of North Carolina at Chapel Hill (bio)
I am currently a Ph.D. candidate and a Graduate Teaching Assistant in the department of Statistics & Operations Research at the University of North Carolina at Chapel Hill (UNC-CH). My Ph.D. advisors are Prof. Nilay Tanik Argon, Prof. Shankar Bhamidi, and Prof. Serhan Ziya. I broadly work in probability, statistics, and operations research. My current research interests include (but are not confined to) working on problems in random networks which find interesting practical applications such as in viral marketing or Word-Of-Mouth (WOM) marketing, efficient immunization during epidemic outbreaks, finding the root of an epidemic, and so on. Apart from research, I have worked on two statistical consulting projects with the UNC Dept. of Medicine and Quantum Governance, L3C, CUES. Also, I am extremely passionate about teaching. I have been the primary instructor as well as a teaching assistant for several courses at UNC-CH STOR. You can find more about me on my website here: https://sites.google.com/view/sumits-space/home. |
Speed Presentation | 2023 |
||
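The classical greedy baseline referenced above (Kempe et al.) can be sketched in a few lines; this is not the generalized algorithm from the talk, and it uses an independent-cascade spread (a special case of the Triggering model) with Monte Carlo spread estimates on a toy graph.

```python
import random

def simulate_spread(graph, seeds, p=0.1, runs=200):
    """Monte Carlo estimate of expected spread under an independent cascade."""
    total = 0
    for _ in range(runs):
        active, frontier = set(seeds), list(seeds)
        while frontier:
            new = []
            for u in frontier:
                for v in graph.get(u, []):
                    if v not in active and random.random() < p:
                        active.add(v)
                        new.append(v)
            frontier = new
        total += len(active)
    return total / runs

def greedy_seeds(graph, k, p=0.1):
    """Add, one at a time, the node giving the largest estimated marginal gain."""
    seeds = []
    for _ in range(k):
        best = max((n for n in graph if n not in seeds),
                   key=lambda n: simulate_spread(graph, seeds + [n], p))
        seeds.append(best)
    return seeds

# Toy directed graph as adjacency lists.
graph = {0: [1, 2], 1: [3], 2: [3, 4], 3: [5], 4: [5, 6], 5: [], 6: [0]}
print(greedy_seeds(graph, k=2))
```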
Speed Presentation A Stochastic Petri Net Model of Continuous Integration and Continuous Delivery (Abstract)
Modern software development organizations rely on continuous integration and continuous delivery (CI/CD), since it allows developers to continuously integrate their code in a single shared repository and automates the delivery process of the product to the user. While modern software practices improve the performance of the software life cycle, they also increase the complexity of this process. Past studies make improvements to the performance of the CI/CD pipeline. However, there are fewer formal models to quantitatively guide process and product quality improvement or characterize how automated and human activities compose and interact asynchronously. Therefore, this talk develops a stochastic Petri net model to analyze a CI/CD pipeline to improve process performance in terms of the probability of successfully delivering new or updated functionality by a specified deadline. The utility of the model is demonstrated through a sensitivity analysis to identify stages of the pipeline where improvements would most significantly improve the probability of timely product delivery. In addition, this research provides an enhanced version of the conventional CI/CD pipeline to examine how it can improve process performance in general. The results indicate that the augmented model outperforms the conventional model, and sensitivity analysis suggests that failures in later stages are more important and can impact the delivery of the final product. |
Sushovan Bhadra Master's student University of Massachusetts Dartmouth (bio)
Sushovan Bhadra is a MS student in the Department of Electrical & Computer Engineering at the University of Massachusetts Dartmouth (UMassD). He received his BS (2016) in Electrical Engineering from Ahsanullah University of Science and Technology (AUST), Bangladesh. |
Speed Presentation | Materials | 2023 |
|
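Not the paper's stochastic Petri net model, but a stripped-down Monte Carlo sketch of the quantity it targets: the probability that a serial CI/CD pipeline delivers by a deadline when each stage has a random duration and a chance of failing and rerunning. Stage names, durations, and failure probabilities are invented.

```python
import random

# (stage name, mean duration in minutes, probability the stage fails and must rerun)
stages = [("build", 10, 0.05), ("unit tests", 15, 0.10),
          ("integration tests", 30, 0.15), ("deploy", 5, 0.02)]

def pipeline_time():
    total = 0.0
    for _, mean, p_fail in stages:
        total += random.expovariate(1.0 / mean)
        while random.random() < p_fail:           # retry until the stage passes
            total += random.expovariate(1.0 / mean)
    return total

deadline = 90.0
runs = 20_000
on_time = sum(pipeline_time() <= deadline for _ in range(runs)) / runs
print(f"P(deliver within {deadline} min) ~ {on_time:.3f}")
```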
Mini-Tutorial A Tour of JMP Reliability Platforms and Bayesian Methods for Reliability Data (Abstract)
JMP is comprehensive, visual, and interactive statistical discovery software with a carefully curated graphical user interface. The software is packed with traditional and modern statistical analysis capabilities and many unique innovative features. The software hosts several suites of tools that are especially valuable to the DATAWorks audience, including suites for Design of Experiments, Quality Control, Process Analysis, and Reliability Analysis. JMP has been building its reliability suite for the past fifteen years. The reliability suite in JMP is a comprehensive and mature collection of JMP platforms. The suite empowers reliability engineers with tools for analyzing time-to-event data, accelerated life test data, observational reliability data, competing cause data, warranty data, cumulative damage data, repeated measures degradation data, destructive degradation data, and recurrence data. For each type of data, there are numerous models and one or more methodologies that are applicable based on the nature of the data. In addition to reliability data analysis platforms, the suite also provides capabilities of reliability engineering for system reliability from two distinct perspectives, one for non-repairable systems and the other for repairable systems. The capability of the JMP reliability suite is also at the frontier of advanced research on reliability data analysis. Inspired by the research of Prof. William Meeker at Iowa State University, we have implemented Bayesian inference methodologies for analyzing the three most important types of reliability data. The tutorial will start with an overall introduction to JMP’s reliability platforms. Then the tutorial will focus on analyzing time-to-event data, accelerated life test data, and repeated measures degradation data. The tutorial will present analyzing these types of reliability data using traditional methods, and highlight when, why, and how to analyze them in JMP using Bayesian methods. |
Peng Liu Principal Research Statistician Developer JMP Statistical Discovery (bio)
Peng Liu is a Principal Research Statistician Developer at JMP Statistical Discovery LLC. He holds a Ph.D. in statistics from NCSU. He has been working at JMP since 2007. He specializes in computational statistics, software engineering, reliability data analysis, reliability engineering, time series analysis, and time series forecasting. He is responsible for developing and maintaining all JMP platforms in the above areas. He has broad interest in statistical analysis research and software product development. |
Mini-Tutorial | Materials | 2023 |
|
Presentation Advanced Automated Machine Learning System for Cybersecurity (Abstract)
Florida International University (FIU) has developed an Advanced Automated Machine Learning System (AAMLS) under sponsored research from the Department of Defense - Test Resource Management Center (DOD-TRMC) to provide Artificial Intelligence based advanced analytics solutions in areas including Cyber, IoT, Network, Energy, and Environment. AAMLS is a Rapid Modeling & Testing Tool (RMTT) for developing machine learning and deep learning models in a few steps by subject matter experts from various domains with minimum machine learning knowledge, using auto & optimization workflows. AAMLS allows analysis of data collected from different test technology domains by using machine learning / deep learning and ensemble learning approaches to generate models, make predictions, then apply advanced analytics and visualization to perform analysis. This system enables automated machine learning using the AI based Advanced Analytics and Analytics Control Center platforms by connecting to multiple Data Sources. Artificial Intelligence based Advanced Analytics Platform: This platform is the analytics engine of AAMLS which provides pre-processing, feature engineering, model building and predictions. Primary components of this platform include:
|
Himanshu Dayaram Upadhyay Associate Professor (ECE) Florida International University (bio)
Dr. Himanshu Upadhyay has served Florida International University’s Applied Research Center for the past 21 years, leading the Artificial Intelligence / Cybersecurity / Big Data research group. He is currently working as an Associate Professor in Electrical & Computer Engineering, teaching Artificial Intelligence and Cybersecurity courses. He is mentoring DOE Fellows, AI Fellows, Cyber Fellows, and undergraduate and graduate students supporting multiple cybersecurity & AI research projects from various federal agencies. He brings more than 30 years of experience in artificial intelligence/machine learning, big data, cybersecurity, information technology, management and engineering to his role, serving as co-principal investigator for multimillion-dollar cybersecurity and artificial intelligence projects for the Department of Defense and Defense Intelligence Agency. He is also serving as co-principal investigator for the Department of Energy’s Office of Environmental Management research projects focused on knowledge/waste management, cybersecurity, artificial intelligence and big data technologies. He has published multiple papers in the areas of cybersecurity, machine learning, deep learning, big data, knowledge / nuclear waste management and service-oriented architecture. His current research focuses on artificial intelligence, machine learning, deep learning, cyber security, big data, cyber analytics/visualization, cyber forensics, malware analysis and blockchain. He has architected a range of tiered and distributed application systems to address strategic business needs, managing a team of researchers and scientists building secured enterprise information systems. |
Presentation | 2023 |
||
Speed Presentation An Evaluation Of Periodic Developmental Reviews Using Natural Language Processing (Abstract)
As an institution committed to developing leaders of character, the United States Military Academy (USMA) holds a vested interest in measuring character growth. One such tool, the Periodic Developmental Review (PDR), has been used by the Academy’s Institutional Effectiveness Office for over a decade. PDRs are written counseling statements evaluating how a cadet is developing with respect to his/her peers. The objective of this research was to provide an alternate perspective of the PDR system by using statistical and natural language processing (NLP) based approaches to find whether certain dimensions of PDR data were predictive of a cadet’s overall rating. This research implemented multiple NLP tasks and techniques, including sentiment analysis, named entity recognition, tokenization, part-of-speech tagging, and word2vec, as well as statistical models such as linear regression and ordinal logistic regression. The ordinal logistic regression model concluded that PDRs with optional written summary statements had more predictable overall scores than those without summary statements. Additionally, the writer's relationship to the cadet (Self, Instructor, Peer, Subordinate) held strong predictive value for the overall rating. When compared to a self-reflecting PDR, instructor-written PDRs had a 62.40% probability of a higher overall score, while subordinate-written PDRs had a 61.65% probability. These values were amplified to 70.85% and 73.12%, respectively, when considering only those PDRs with summary statements. These findings indicate that different writer demographics have a different understanding of the meaning of each rating level. Recommendations for the Academy include implementing a forced distribution or providing a deeper explanation of the overall rating in the instructions. Additionally, no written language facets analyzed demonstrated predictive strength, meaning written statements do not introduce unwanted bias and could be made a required field for more meaningful feedback to cadets. |
Dominic Rudakevych Cadet United States Military Academy (bio)
Cadet Dominic Rudakevych is from Hatboro, Pennsylvania, and is a senior studying mathematics at the United States Military Academy (USMA). Currently, CDT Rudakevych serves as the Society for Industrial and Applied Mathematics president at USMA and the captain of the chess team. He is involved in research within the Mathematics department, using artificial intelligence methods to analyze how written statements about cadets impact overall cadet ratings. He will earn a Bachelor of Science Degree in mathematics upon graduation. CDT Rudakevych will commission as a Military Intelligence officer in May and is excited to serve his country as a human-intelligence platoon leader. |
Speed Presentation | Materials | 2023 |
|
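A hedged sketch of the kind of ordinal logistic regression described above, fit to synthetic stand-in data rather than actual PDRs; it assumes the statsmodels package (version 0.12 or later for OrderedModel), and the writer categories and effect sizes are invented.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 500
writer = rng.choice(["self", "instructor", "peer", "subordinate"], n)
has_summary = rng.integers(0, 2, n)

# Synthetic latent score driving an ordered 1-4 overall rating.
latent = (0.6 * (writer == "instructor") + 0.5 * (writer == "subordinate")
          + 0.4 * has_summary + rng.logistic(size=n))
rating = pd.Series(pd.cut(latent, bins=[-np.inf, -0.5, 0.5, 1.5, np.inf],
                          labels=[1, 2, 3, 4], ordered=True))

X = pd.get_dummies(pd.DataFrame({"summary": has_summary, "writer": writer}),
                   columns=["writer"], drop_first=True).astype(float)

model = OrderedModel(rating, X, distr="logit").fit(method="bfgs", disp=False)
print(model.summary())
```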
Presentation An Introduction to Ranking Data and a Case Study of a National Survey of First Responders (Abstract)
Ranking data are collected by presenting a respondent with a list of choices and then asking for the respondent's favorite, second favorite, and so on. The rankings may be complete (the respondent rank orders the entire list) or partial (only the respondent's favorite two or three, etc.). Given a sample of rankings from a population, one goal may be to estimate the most favored choice in the population. Another may be to compare the preferences of one subpopulation to another. In this presentation I will introduce ranking data and probability models that form the foundation for statistical inference for them. The Plackett-Luce model will be the main focus. After that I will introduce a real data set containing ranking data assembled by the National Institute of Standards and Technology (NIST) based on the results of a national survey of first responders. The survey asked about how first responders use communication technology. With this data set, questions such as whether rural and urban/suburban first responders prefer the same types of communication devices can be explored. I will conclude with some ideas for incorporating ranking data into test and evaluation settings. |
Adam Pintar Mathematical Statistician National Institute of Standards and Technology (bio)
Adam earned a Ph.D. in Statistics from Iowa State University in 2010, and has been a Mathematical Statistician with NIST's Statistical Engineering Division since. His primary focus is providing statistical and machine learning expertise and insight on multidisciplinary research teams. He has collaborated with researchers from very diverse backgrounds such as social science, engineering, chemistry, and physics. He is a Past Chair of the Statistics Division of the American Society for Quality (ASQ), he currently serves on the editorial board of the journal Transactions on Mathematical Software, and he is a member of the American Statistical Association and a senior member of ASQ. |
Presentation | Materials | 2023 |
|
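To make the Plackett-Luce model concrete, here is a minimal sketch (not NIST's analysis): each choice has a positive "worth," a ranking is generated by successively picking among the remaining items with probability proportional to worth, and the worths are recovered from synthetic top-2 rankings by maximum likelihood. The item names and data are invented.

```python
import numpy as np
from scipy.optimize import minimize

items = ["radio", "smartphone", "tablet", "laptop"]
true_worth = np.array([2.0, 3.0, 1.0, 0.5])
rng = np.random.default_rng(0)

def sample_ranking(worth, depth=2):
    """Draw a partial (top-`depth`) ranking from the Plackett-Luce model."""
    remaining, order = list(range(len(worth))), []
    for _ in range(depth):
        w = worth[remaining] / worth[remaining].sum()
        pick = int(rng.choice(remaining, p=w))
        order.append(pick)
        remaining.remove(pick)
    return order

data = [sample_ranking(true_worth) for _ in range(300)]

def neg_log_lik(free_log_worth):
    worth = np.exp(np.concatenate(([0.0], free_log_worth)))   # anchor item 0 at worth 1
    ll = 0.0
    for ranking in data:
        remaining = list(range(len(worth)))
        for choice in ranking:
            ll += np.log(worth[choice]) - np.log(worth[remaining].sum())
            remaining.remove(choice)
    return -ll

res = minimize(neg_log_lik, np.zeros(len(items) - 1), method="BFGS")
est = np.exp(np.concatenate(([0.0], res.x)))
print(dict(zip(items, est)))          # estimated worths relative to "radio"
```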
Mini-Tutorial An Introduction to Uncertainty Quantification for Modeling & Simulation (Abstract)
Predictions from modeling and simulation (M&S) are increasingly relied upon to inform critical decision making in a variety of industries including defense and aerospace. As such, it is imperative to understand and quantify the uncertainties associated with the computational models used, the inputs to the models, and the data used for calibration and validation of the models. The rapidly evolving field of uncertainty quantification (UQ) combines elements of statistics, applied mathematics, and discipline engineering to provide this utility for M&S. This mini-tutorial provides an introduction to UQ for M&S geared towards engineers and analysts with little-to-no experience in the field but with some knowledge of probability and statistics. A brief review of basic probability will be provided before discussing some core UQ concepts in more detail, including uncertainty propagation and the use of Monte Carlo simulation for making probabilistic predictions with computational models, model calibration to estimate uncertainty in model input parameters using experimental data, and sensitivity analysis for identifying the most important and influential model input parameters. Examples from relevant NASA applications are included and references are provided throughout to point viewers to resources for further study. |
James Warner Computational Scientist NASA Langley Research Center (bio)
Dr. James (Jim) Warner joined NASA Langley Research Center (LaRC) in 2014 as a Research Computer Engineer after receiving his PhD in Computational Solid Mechanics from Cornell University. Previously, he received his B.S. in Mechanical Engineering from SUNY Binghamton University and held temporary research positions at the National Institute of Standards and Technology and Duke University. Dr. Warner is a member of the Durability, Damage Tolerance, and Reliability Branch (DDTRB) at LaRC, where he focuses on developing computationally-efficient approaches for uncertainty quantification for a range of applications including structural health management, additive manufacturing, and trajectory simulation. Additionally, he works to bridge the gap between UQ research and NASA mission impact, helping to transition state-of-the-art methods to solve practical engineering problems. To that end, he has recently been involved in efforts to certify the xEMU spacesuit and develop guidance systems for entry, descent, and landing for Mars landing. His other research interests include machine learning, high performance computing, and topology optimization. |
Mini-Tutorial | Materials | 2023 |
|
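As a concrete example of the uncertainty-propagation concept in the abstract above (illustrative only, not NASA code or data): draw the uncertain inputs from assumed distributions, run each draw through a cheap stand-in model, and summarize the output distribution instead of reporting a single deterministic answer.

```python
import numpy as np

def model(drag_coeff, mass, velocity):
    # stand-in computational model: drag deceleration of a vehicle
    rho, area = 1.2, 0.8
    return 0.5 * rho * area * drag_coeff * velocity**2 / mass

rng = np.random.default_rng(42)
n = 100_000
drag_coeff = rng.normal(1.1, 0.05, n)       # uncertain inputs (assumed distributions)
mass = rng.normal(1200.0, 20.0, n)
velocity = rng.uniform(240.0, 260.0, n)

decel = model(drag_coeff, mass, velocity)   # Monte Carlo uncertainty propagation
print("mean:", decel.mean(), "95% interval:", np.percentile(decel, [2.5, 97.5]))
```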
Mini-Tutorial An Overview of Methods, Tools, and Test Capabilities for T&E of Autonomous Systems (Abstract)
This tutorial will give an overview of selected methodologies, tools and test capabilities discussed in the draft “Test and Evaluation Companion Guide for Autonomous Military Systems.” This test and evaluation (T&E) companion guide is being developed to provide guidance to test and evaluation practitioners, to include program managers, test planners, test engineers, and analysts with test strategies, applicable methodologies, and tools that will help to improve rigor in addressing the challenges unique to the T&E of autonomy. It will also cover selected capabilities of test laboratories and ranges that support autonomous systems. The companion guide is intended to be a living document contributed by the entire community and will adapt to ensure the right information reaches the right audience. |
Leonard Truett and Charlie Middleton Senior STAT Expert STAT COE (bio)
Dr. Truett has been a member of the Scientific Test and Analysis Techniques Center of Excellence (STAT COE) located at WPAFB, OH since 2012 and is currently the Senior STAT Expert. He began his career as a civilian for the 46th Test Wing supporting Live Fire Test and Evaluation (LFT&E), specializing in fire and explosion suppression for aircraft. He has also worked for the Institute for Defense Analyses (IDA) supporting the Director, Operational Test and Evaluation (DOT&E) in LFT&E and Operational Test and Evaluation (OT&E) for air systems. He holds a Bachelor of Science in Aerospace Engineering and a Master of Science in Aerospace Engineering from the Georgia Institute of Technology, and a Doctorate in Aerospace Engineering from the University of California, San Diego. Charlie Middleton currently leads the Advancements in Test and Evaluation (T&E) of Autonomous Systems team for the OSD STAT Center of Excellence. His responsibilities include researching autonomous system T&E methods and tools; collaborating with Department of Defense program and project offices developing autonomous systems; leading working groups of autonomy testers, staffers, and researchers; and authoring a handbook, reports, and papers related to T&E of autonomous systems. Previously, Mr. Middleton led development of a live-fire T&E risk-based framework for survivability and lethality evaluation for the office of the Director, Operational T&E; led a multi-domain modeling and simulation team supporting Air Force Research Labs future space efforts; and developed a Bayesian reliability analysis toolset for the National Air and Space Intelligence Center. While an active-duty Air Force officer, he was a developmental and operational test pilot leading several aircraft and weapons T&E programs and projects, and piloted 291 combat hours in the F-16 aircraft, employing precision munitions in Close Air Support, Time-Sensitive Targeting, and Suppression of Enemy Air Defense combat missions. Mr. Middleton is a distinguished graduate of the U.S. Naval Test Pilot School, and holds undergraduate and graduate degrees in operations research and operations analysis from Princeton University and the Air Force Institute of Technology. |
Mini-Tutorial | Materials | 2023 |
|
Presentation An Overview of the NASA Quesst Community Test Campaign with the X-59 Aircraft (Abstract)
In its mission to expand knowledge and improve aviation, NASA conducts research to address sonic boom noise, the prime barrier to overland supersonic flight. For half a century civilian aircraft have been required to fly slower than the speed of sound when over land to prevent sonic boom disturbances to communities under the flight path. However, lower noise levels may be achieved via new aircraft shaping techniques that reduce the merging of shockwaves generated during supersonic flight. As part of its Quesst mission, NASA is building a piloted, experimental aircraft called the X-59 to demonstrate low noise supersonic flight. After initial flight testing to ensure the aircraft performs as designed, NASA will begin a national campaign of community overflight tests to collect data on how people perceive the sounds from this new design. The data collected will support national and international noise regulators’ efforts as they consider new standards that would allow supersonic flight over land at low noise levels. This presentation provides an overview of the community test campaign, including the scope, key objectives, stakeholders, and challenges. |
Jonathan Rathsam Senior Research Engineer NASA Langley Research Center (bio)
Jonathan Rathsam is a Senior Research Engineer at NASA’s Langley Research Center in Hampton, Virginia. He conducts laboratory and field research on human perceptions of low noise supersonic overflights. He currently serves as Technical Lead of Survey Design and Analysis for Community Test Planning and Execution within NASA’s Commercial Supersonic Technology Project. He has previously served as NASA co-chair for DATAWorks and as chair for a NASA Source Evaluation Board. He holds a Ph.D. in Engineering from the University of Nebraska, a B.A. in Physics from Grinnell College in Iowa, and completed postdoctoral research in acoustics at Ben-Gurion University in Israel. |
Presentation | Materials | 2023 |
|
Presentation Analysis of Surrogate Strategies and Regularization with Application to High-Speed Flows (Abstract)
Surrogate modeling is an important class of techniques used to reduce the burden of resource-intensive computational models by creating fast and accurate approximations. In aerospace engineering, surrogates have been used to great effect in design, optimization, exploration, and uncertainty quantification (UQ) for a range of problems, like combustor design, spacesuit damage assessment, and hypersonic vehicle analysis. Consequently, the development, analysis, and practice of surrogate modeling is of broad interest. In this talk, several widely used surrogate modeling strategies are studied as archetypes in a discussion on parametric/nonparametric surrogate strategies, local/global model forms, complexity regularization, uncertainty quantification, and relative strengths/weaknesses. In particular, we consider several variants of two widely used classes of methods: polynomial chaos and Gaussian process regression. These surrogate models are applied to several synthetic benchmark test problems and examples of real high-speed flow problems, including hypersonic inlet design, thermal protection systems, and shock-wave/boundary-layer interactions. Through analysis of these concrete examples, we analyze the trade-offs that modelers must navigate to create accurate, flexible, and robust surrogates. |
Gregory Hunt Assistant Professor William & Mary (bio)
Greg is an interdisciplinary researcher that helps advance science with statistical and data-analytic tools. He is trained as a statistician, mathematician and computer scientist, and currently works on a diverse set of problems in engineering, physics, and microbiology. |
Presentation | Materials | 2023 |
|
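A toy illustration in the spirit of the talk (not the authors' test problems): fit a Gaussian process and a global polynomial surrogate to sparse samples of a stand-in "expensive" function and compare accuracy, with the GP also returning a predictive standard deviation. Assumes scikit-learn; the function, sample sizes, and regularization are arbitrary choices.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

def expensive_model(x):                      # stand-in for a costly flow solver
    return np.sin(3 * x) + 0.3 * x**2

rng = np.random.default_rng(1)
x_train = np.sort(rng.uniform(-2, 2, 15))[:, None]
y_train = expensive_model(x_train).ravel()
x_test = np.linspace(-2, 2, 200)[:, None]
y_test = expensive_model(x_test).ravel()

gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(1e-4), normalize_y=True)
gp.fit(x_train, y_train)
gp_mean, gp_std = gp.predict(x_test, return_std=True)

poly = make_pipeline(PolynomialFeatures(degree=6), Ridge(alpha=1e-3))
poly.fit(x_train, y_train)
poly_mean = poly.predict(x_test)

for name, pred in (("GP", gp_mean), ("degree-6 polynomial", poly_mean)):
    print(name, "RMSE:", np.sqrt(np.mean((pred - y_test) ** 2)))
print("GP mean predictive std:", gp_std.mean())
```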
Presentation Analysis Opportunities for Missile Trajectories (Abstract)
Contractor analysis teams are given 725 Monte Carlo-generated trajectories for thermal heating analysis. The analysis team currently uses a reduced-order model to evaluate all 725 options for the worst cases. The worst cases are run with the full model for handoff of predicted temperatures to the design team. Two months later, the customer arrives with yet another set of trajectories updated for the newest mission options. This presentation will question each step in this analysis process for opportunities to improve the cost, schedule, and fidelity of the effort. Using uncertainty quantification and functional data analysis processes, the team should be able to improve the analysis coverage and power with a reduced (or at least similar) number of model runs. |
Kelsey Cannon Research Scientist Lockheed Martin Space (bio)
Kelsey Cannon is a senior research scientist at Lockheed Martin Space in Denver, CO. She earned a bachelor's degree from the Colorado School of Mines in Metallurgical and Materials Engineering and a master's degree in Computer Science and Data Science. In her current role, Kelsey works across aerospace and DoD programs to advise teams on effective use of statistical and uncertainty quantification techniques to save time and budget. |
Presentation | Materials | 2023 |
|
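One way to make the functional-data-analysis suggestion concrete, sketched on synthetic curves rather than the contractor's trajectories: summarize each trajectory with a few functional principal component scores and flag the extremes in that reduced space as candidates for full-fidelity thermal runs.

```python
import numpy as np

rng = np.random.default_rng(0)
n_traj, n_time = 725, 300
t = np.linspace(0, 1, n_time)

# Stand-in trajectories: curves with random amplitude/phase perturbations plus noise.
amp = 1 + 0.1 * rng.standard_normal((n_traj, 1))
phase = 0.2 * rng.standard_normal((n_traj, 1))
trajs = amp * np.sin(2 * np.pi * (t + phase)) + 0.02 * rng.standard_normal((n_traj, n_time))

# Functional PCA via SVD of the centered data matrix.
mean_curve = trajs.mean(axis=0)
U, s, Vt = np.linalg.svd(trajs - mean_curve, full_matrices=False)
scores = U[:, :2] * s[:2]                       # each trajectory reduced to 2 numbers

# Flag trajectories far from the center of the reduced space as candidate worst cases.
dist = np.linalg.norm(scores / scores.std(axis=0), axis=1)
worst = np.argsort(dist)[-10:]
print("candidate worst-case trajectory indices:", worst)
```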
Speed Presentation Application of Recurrent Neural Network for Software Defect Prediction (Abstract)
Traditional software reliability growth models (SRGM) characterize software defect detection as a function of testing time. Many of those SRGM are modeled by the non-homogeneous Poisson process (NHPP). However, those models are parametric in nature and do not explicitly encode factors driving defect or vulnerability discovery. Moreover, NHPP models are characterized by a mean value function that predicts the average number of defects discovered by a certain point in time during the testing interval, but may not capture all changes and details present in the data. More recent studies proposed SRGM incorporating covariates, where defect discovery is a function of one or more test activities documented and recorded during the testing process. These covariate models introduce an additional parameter per testing activity, which adds a high degree of non-linearity to traditional NHPP models, and parameter estimation becomes complex since it is limited to maximum likelihood estimation or expectation maximization. Therefore, this talk assesses the potential use of neural networks to predict software defects due to their ability to remember trends. Three different neural networks are considered, including (i) recurrent neural networks (RNNs), (ii) long short-term memory (LSTM), and (iii) gated recurrent units (GRU), to predict software defects. The neural network approaches are compared with the covariate model to evaluate their predictive ability. Results suggest that GRU and LSTM present better goodness-of-fit measures such as SSE, PSSE, and MAPE compared to RNN and covariate models, indicating more accurate predictions. |
Fatemeh Salboukh phd student University of Massachusetts Dartmouth (bio)
I am a Ph.D. student in the Department of Engineering and Applied Science at the University of Massachusetts Dartmouth. I received my Master's degree in Mathematical Statistics from the University of Allame Tabataba'i (September 2020) and my Bachelor's degree in Applied Statistics from Yazd University (July 2018). |
Speed Presentation | Materials | 2023 |
|
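A small, hedged sketch of the recurrent-network idea on synthetic defect-count data (not the authors' experiments); it assumes TensorFlow/Keras is available, and the LSTM layer can be swapped for GRU or SimpleRNN to mirror the three variants compared in the talk.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
t = np.arange(120, dtype=float)
# Toy cumulative defect curve (Goel-Okumoto-like shape) plus noise, then interval counts.
cumulative = 100 * (1 - np.exp(-0.03 * t)) + rng.normal(0, 1.0, t.size)
counts = np.diff(cumulative, prepend=0.0)

window = 8
X = np.array([counts[i:i + window] for i in range(len(counts) - window)])[..., None]
y = counts[window:]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(16),          # swap for layers.GRU or layers.SimpleRNN
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=50, verbose=0)

print("next-interval defect forecast:", float(model.predict(X[-1:], verbose=0)[0, 0]))
```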
Speed Presentation Application of Software Reliability and Resilience Models to Machine Learning (Abstract)
Machine Learning (ML) systems such as Convolutional Neural Networks (CNNs) are susceptible to adversarial scenarios. In these scenarios, an attacker attempts to manipulate or deceive a machine learning model by providing it with malicious input, which can result in the model making incorrect predictions or decisions with severe consequences in applications such as security, healthcare, and finance. This necessitates quantitative reliability and resilience evaluation of ML algorithms. Failures in the ML algorithm can lead not just to failures in the application domain but also to failures in the system to which it provides functionality, which may have a performance requirement; hence the need for the application of software reliability and resilience. This talk demonstrates the applicability of software reliability and resilience tools to ML algorithms, providing an objective approach to assess recovery after a degradation from known adversarial attacks. The results indicate that software reliability growth models and tools can be used to monitor the performance and quantify the reliability and resilience of ML models in the many domains in which machine learning algorithms are applied. |
Zakaria Faddi Master Student University of Massachusetts Dartmouth (bio)
Zakaria Faddi is a master's student at the University of Massachusetts Dartmouth in the Electrical and Computer Engineering department. He completed his undergraduate degree in Electrical and Computer Engineering, with a concentration in Cybersecurity, at the same institution in the Spring of 2022. |
Speed Presentation | Materials | 2023 |
|
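For context, the classical baseline the talk builds on can be sketched quickly: a Goel-Okumoto NHPP mean value function m(t) = a(1 - e^(-bt)) fit to cumulative failure counts, which can then be re-fit after a degradation to track recovery. The least-squares fit and the data below are illustrative stand-ins, not the authors' tools.

```python
import numpy as np
from scipy.optimize import curve_fit

def mean_value(t, a, b):
    return a * (1.0 - np.exp(-b * t))        # Goel-Okumoto mean value function

# Toy cumulative failure counts observed at the end of each test interval.
t = np.arange(1, 21, dtype=float)
failures = np.array([5, 9, 13, 16, 18, 21, 23, 24, 26, 27,
                     28, 29, 30, 30, 31, 32, 32, 33, 33, 34], dtype=float)

(a_hat, b_hat), _ = curve_fit(mean_value, t, failures, p0=(40.0, 0.1))
print(f"estimated total defects a = {a_hat:.1f}, detection rate b = {b_hat:.3f}")
print("expected remaining defects:", a_hat - mean_value(t[-1], a_hat, b_hat))
```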
Presentation Applications of Network Methods for Supply Chain Review (Abstract)
The DoD maintains a broad array of systems, each one sustained by an often complex supply chain of components and suppliers. The ways that these supply chains are interlinked can have major implications for the resilience of the defense industrial base as a whole, and the readiness of multiple weapon systems. Finding opportunities to improve overall resilience requires gaining visibility of potential weak links in the chain, which requires integrating data across multiple disparate sources. By using open-source data pipeline software to enhance reproducibility, and flexible network analysis methods, multiple stovepiped data sources can be brought together to develop a more complete picture of the supply chain across systems. |
Zed Fashena Research Associate IDA (bio)
Zed Fashena is currently a Research Associate in the Information Technology and Systems Division at the Institute for Defense Analyses. He holds a Master of Science in Statistics from the University of Wisconsin - Madison and a Bachelor of Arts in Economics from Carleton College (MN). |
Presentation | Materials | 2023 |
|
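A toy sketch of the network framing described above (invented data, not DoD supply chains): represent systems, components, and suppliers as a graph, then look for single-sourced components and articulation points whose loss would fragment the network. Assumes the networkx package.

```python
import networkx as nx

G = nx.Graph()
edges = [
    ("System A", "Component 1"), ("System A", "Component 2"),
    ("System B", "Component 2"), ("System B", "Component 3"),
    ("Component 1", "Supplier X"), ("Component 2", "Supplier X"),
    ("Component 3", "Supplier Y"), ("Component 3", "Supplier Z"),
]
G.add_edges_from(edges)

# Components with only one supplier are potential single points of failure.
single_sourced = [n for n in G if n.startswith("Component")
                  and sum(1 for m in G[n] if m.startswith("Supplier")) == 1]
print("single-sourced components:", single_sourced)

# Articulation points: nodes whose loss would disconnect part of the supply network.
print("articulation points:", list(nx.articulation_points(G)))
```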
Short Course Applied Bayesian Methods for Test Planning and Evaluation (Abstract)
Bayesian methods have been promoted as a promising way for test and evaluation analysts to leverage previous information across a continuum-of-testing approach to system evaluation. This short course will cover how to identify when Bayesian methods might be useful within a test and evaluation context, components required to accomplish a Bayesian analysis, and provide an understanding of how to interpret the results of that analysis. The course will apply these concepts to two hands-on examples (code and applications provided): one example focusing on system reliability and one focusing on system effectiveness. Furthermore, individuals will gain an understanding of the sequential nature of a Bayesian approach to test and evaluation, the limitations thereof, and gain a broad understanding of questions to ask to ensure a Bayesian analysis is appropriately accomplished. Additional Information:
|
Victoria Sieck Deputy Director STAT COE/AFIT (bio)
Dr. Victoria R. C. Sieck is the Deputy Director of the Scientific Test & Analysis Techniques Center of Excellence (STAT COE), where she works with major acquisition programs within the Department of Defense (DoD) to apply rigor and efficiency to current and emerging test and evaluation methodologies through the application of the STAT process. Additionally, she is an Assistant Professor of Statistics at the Air Force Institute of Technology (AFIT), where her research interests include design of experiments and developing innovative Bayesian approaches to DoD testing. As an Operations Research Analyst in the US Air Force (USAF), her experiences in the USAF testing community include being a weapons and tactics analyst and an operational test analyst. Dr. Sieck has an M.S. in Statistics from Texas A&M University and a Ph.D. in Statistics from the University of New Mexico. |
Short Course | 2023 |
||
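To give a flavor of the course's reliability example (this is a generic conjugate-prior sketch, not the instructors' materials): place a Gamma prior on an exponential failure rate, update it with test data, and report a credible interval for MTBF. The prior and test numbers are invented.

```python
from scipy import stats

# Prior: roughly equivalent to having observed 2 failures in 400 hours of prior testing.
alpha0, beta0 = 2.0, 400.0

failures, test_hours = 3, 1000.0                     # new test results
alpha_post, beta_post = alpha0 + failures, beta0 + test_hours

rate_post = stats.gamma(a=alpha_post, scale=1.0 / beta_post)   # posterior on failure rate
lo, hi = rate_post.ppf([0.025, 0.975])

mtbf_mean = beta_post / (alpha_post - 1.0)           # posterior mean of 1/rate (MTBF)
print(f"posterior mean MTBF: {mtbf_mean:.0f} hours")
print(f"95% credible interval for MTBF: ({1.0 / hi:.0f}, {1.0 / lo:.0f}) hours")
```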
Short Course Applied Bayesian Methods for Test Planning and Evaluation (Abstract)
Bayesian methods have been promoted as a promising way for test and evaluation analysts to leverage previous information across a continuum-of-testing approach to system evaluation. This short course will cover how to identify when Bayesian methods might be useful within a test and evaluation context, components required to accomplish a Bayesian analysis, and provide an understanding of how to interpret the results of that analysis. The course will apply these concepts to two hands-on examples (code and applications provided): one example focusing on system reliability and one focusing on system effectiveness. Furthermore, individuals will gain an understanding of the sequential nature of a Bayesian approach to test and evaluation, the limitations thereof, and gain a broad understanding of questions to ask to ensure a Bayesian analysis is appropriately accomplished. Additional Information
|
Cory Natoli Applied Statistician Huntington Ingalls Industries/STAT COE (bio)
Dr. Cory Natoli works as an applied statistician at Huntington Ingalls Industries as a part of the Scientific Test and Analysis Techniques Center of Excellence (STAT COE). He received his MS in Applied Statistics from The Ohio State University and his Ph.D. in Statistics from The Air Force Institute of Technology. His emphasis lies in design of experiments, regression modeling, statistical analysis, and teaching. |
Short Course | 2023 |
||
Short Course Applied Bayesian Methods for Test Planning and Evaluation (Abstract)
Bayesian methods have been promoted as a promising way for test and evaluation analysts to leverage previous information across a continuum-of-testing approach to system evaluation. This short course will cover how to identify when Bayesian methods might be useful within a test and evaluation context, components required to accomplish a Bayesian analysis, and provide an understanding of how to interpret the results of that analysis. The course will apply these concepts to two hands-on examples (code and applications provided): one example focusing on system reliability and one focusing on system effectiveness. Furthermore, individuals will gain an understanding of the sequential nature of a Bayesian approach to test and evaluation, the limitations thereof, and gain a broad understanding of questions to ask to ensure a Bayesian analysis is appropriately accomplished. Additional Information
To follow along with the examples, it is recommended you bring a computer that can access the internet and already has RStudio loaded (code will be provided; however, not having RStudio will not be a detriment to those who are interested in understanding the overall process but not executing an analysis themselves). |
Corey Thrush Statistician Huntington Ingalls Industries/STAT COE (bio)
Mr. Corey Thrush is a statistician at Huntington Ingalls Industries within the Scientific Test and Analysis Techniques Center of Excellence (STAT COE). He received a B.S. in Applied Statistics from Ohio Northern University and an M.A. in Statistics from Bowling Green State University. His interests are data exploration, statistical programming, and Bayesian Statistics. |
Short Course | 2023 |
||
Keynote April 26th Morning Keynote |
Mr. Peter Coen Mission Integration Manager NASA (bio)
Peter Coen currently serves as the Mission Integration Manager for NASA’s Quesst Mission. His primary responsibility in this role is to ensure that the X-59 aircraft development, in-flight acoustic validation and community test elements of the Mission stay on track toward delivering on NASA’s Critical Commitment to provide quiet supersonic overflight response data to the FAA and the International Civil Aviation Organization. Previously, Peter was the manager for the Commercial Supersonic Technology Project in NASA’s Aeronautics Research Mission, where he led a team from the four NASA Aero Research Centers in the development of tools and technologies for a new generation of quiet and efficient supersonic civil transport aircraft. Peter’s NASA career spans almost 4 decades. During this time, he has studied technology integration in practical designs for many different types of aircraft and has made technical and management contributions to all of NASA’s supersonics related programs over the past 30 years. As Project Manager, he led these efforts for 12 years. Peter is a licensed private pilot who has amassed nearly 30 seconds of supersonic flight time. |
Keynote | Session Recording | Materials | 2023 |