|Dr. Catherine Warner|
|Institutionalizing a Culture of Statistical Thinking in DoD Testing||September 25th, 2017 1500 EST|
|Dr. Catherine Warner serves as the Science Advisor for the Director, Operational Test and Evaluation within the Office of the Secretary of Defense. Dr. Warner has been involved with operational test and evaluation since 1991, when she became a research staff member at the Institute for Defense Analyses (IDA). In that capacity, Dr. Warner performed and directed analysis of operational tests for Army, Navy, and Air Force systems in support of DOT&E. From 2005 to 2010, Dr. Warner was an Assistant Director at IDA and also served as the lead for the Air Warfare group. Her analysis portfolio included major aircraft systems such as the F-22, F/A-18E/F, V-22, and H-1. Prior to that, Dr. Warner was the lead analyst for Unmanned Aerial Vehicle (UAV) systems including Predator, Shadow, Hunter, and Global Hawk. In 2013, at the request of the Defense Information Systems Agency (DISA), Dr. Warner deployed to Kabul, Afghanistan, for 16 months in support of NATO’s International Security Assistance Force (ISAF) and the US Operation Enduring Freedom (OEF).
Dr. Warner previously worked at the Lawrence Livermore National Laboratory. She grew up in Albuquerque, New Mexico, attended the University of New Mexico and San Jose State University where she earned both B.S. and M.S. degrees in Chemistry. She also earned both M.A. and Ph.D. degrees in Chemistry from Princeton University.
|Dr. Shane Reese|
Professor of Statistics
Brigham Young University
|Modeling and Simulation, Uncertainty Quantification and the Integration of Diverse Information Sources||May 9th, 2017 1200 EST|
|Dr. Reese is a Professor of Statistics at Brigham Young University. He is also a current Technical Staff Member with Los Alamos National Laboratory. Dr. Reese received his PhD from Texas A&M University. He has held multiple academic positions at Brigham Young University, the University of New Mexico, Arizona State University, and Iowa State University. He is a Fellow of the American Statistical Association. He is a past Editor of Chance Magazine (2011-2015) and a past Associate Editor for JASA Reviews (2014-2016) and for the Journal of the American Statistical Association (2006-2014). He is the recipient of the American Statistical Association Excellence in Statistics in Sports Award (2010), as well as the H.O. Hartley Award (Distinguished Alumnus, TAMU) (2007). He was Chair of the ASA Section on Statistics in Sports (2013). He is a member of the National Academy of Sciences Committee on ACWA Monitoring, the National Academy of Sciences Committee on Biological Standoff Detection Systems, and the ASA Committee on Science and Public Affairs.
|Dr. Alyson Wilson|
Professor, Department of Statistics
North Carolina State University
|Bayesian Analysis||January 31st, 2017 1200 EST|
|Dr. Wilson is a Fellow of the American Statistical Association and the American Association for the Advancement of Science, and an elected member of the International Statistical Institute. She is a winner of the Army Wilks Award (2015), the Los Alamos National Laboratory (LANL) Director’s Distinguished Performance Award (2008), and the DOE Defense Programs Award of Excellence (2007). She is a founder and past-chair of the American Statistical Association’s Section on Statistics in Defense and National Security.
One of the most powerful features of Bayesian analyses is the ability to combine multiple sources of information in a principled way to perform inference. For example, this feature can be particularly valuable in assessing the reliability of systems where testing is limited for some reason (e.g., expense, treaty). At their most basic, Bayesian methods for reliability develop informative prior distributions using expert judgment or similar systems. Appropriate models allow the incorporation of many other sources of information, including historical data, information from similar systems, and computer models. I will introduce the approach and then consider examples from defense acquisition and lifecycle extension, focusing on the strengths and weaknesses of the Bayesian analyses.
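The information-combination idea described above can be illustrated, at its most basic, with a conjugate beta-binomial reliability model: an informative prior stands in for expert judgment or data from a similar system, and limited pass/fail test data update it. This is a minimal sketch of the general approach, not the models from the talk; the prior weighting and test counts below are invented for illustration.

```python
# Conjugate beta-binomial update for pass/fail reliability data.
# An informative Beta prior encodes expert judgment (or similar-system data);
# new binomial test results update it in closed form.

def beta_posterior(prior_a, prior_b, successes, trials):
    """Update a Beta(prior_a, prior_b) reliability prior with binomial test data."""
    return prior_a + successes, prior_b + (trials - successes)

def beta_mean(a, b):
    """Point estimate of reliability under a Beta(a, b) distribution."""
    return a / (a + b)

# Illustrative informative prior: expert judgment that reliability is near 0.9,
# given weight equivalent to 20 pseudo-trials -> Beta(18, 2).
prior_a, prior_b = 18.0, 2.0

# Illustrative limited operational test data: 8 successes in 10 trials.
post_a, post_b = beta_posterior(prior_a, prior_b, successes=8, trials=10)

print(beta_mean(prior_a, prior_b))  # prior estimate: 0.9
print(beta_mean(post_a, post_b))    # posterior estimate blending both sources
```

Because only 10 new trials are available, the posterior estimate (26/30 ≈ 0.867) is pulled strongly toward the prior rather than the raw test rate of 0.8, which is precisely the behavior that makes informative priors valuable when testing is limited and risky when the prior is poorly justified.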
|Dr. Anna-Maria Rivas McGowan|
Senior Engineer for Complex Systems Design
National Aeronautics and Space Administration
|Infusing Statistical Thinking and Non-traditional Disciplines in the Design and Engineering of Complex Systems||Presented on January 20th, 2016|
|Dr. Anna-Maria McGowan is NASA’s Senior Engineer for Complex Systems Design, leading the Agency’s initiatives to develop methodologies for designing and engineering complex systems. She serves as an Agency technical advisor and the principal strategist for collaborating with external leaders to advance interdisciplinary research, design, and development of aerospace and non-aerospace methodologies that address increasing complexities in aerospace systems. Dr. McGowan has over 23 years of experience in aerospace research and leadership, conducting research in diverse areas including aeroelasticity, adaptive structures and materials, and design science. She has led and managed several large multidisciplinary aeronautics research projects involving military and commercial aerospace vehicles with advanced technologies such as innovative high lift; micro flow control; tailored, lightweight wing concepts; biologically inspired flight; and extreme short take-off and landing. Dr. McGowan is based at the NASA Langley Research Center and has served as a project manager, NSF visiting scientist, DARPA Agent, NATO consultant, short course instructor, flight test leader, wind-tunnel test engineer, senior researcher, and NASA spokesperson. Dr. McGowan has a B.S. in Aeronautical and Astronautical Engineering from Purdue University, an M.S. in Aerospace Engineering from Old Dominion University, and a Ph.D. in Design Science in Engineering from the University of Michigan.
|The Honorable J. Michael Gilmore|
Director, Operational Test and Evaluation
|Rigorous, Defensible, Efficient Testing||Presented on December 9, 2014|
DOT&E employs rigorous statistical methodologies in its assessments of test adequacy and in its operational evaluations to provide sound analysis of test results to acquisition decision makers. DOT&E has taken the initiative to promote and require these scientific best practices in operational testing and in the analysis of the data gathered from these well-designed tests. The methodologies that I have advocated for not only provide a rigorous and defensible coverage of the operational space; they also allow us to quantify the trade-space between testing more and answering complex questions about system performance. The results have been better test plans, more confidence in the information provided to acquisition and Service officials, and improved test efficiency. These methods encapsulate the need to "do more without more," especially in light of a highly constrained fiscal environment. Indeed, the final result of this policy is to ensure that we optimize scarce resources by neither over-testing nor under-testing weapon systems; provide sound rationale for the level of testing prescribed; and gather the data needed to provide warfighters with a basis for trust in evaluations of the performance of those weapon systems.
While there has been a lot of progress, much work remains. The implementation of these techniques is still far from widespread across all DoD T&E communities. Furthermore, we are currently not leveraging these methods in a sequential fashion to improve knowledge as we move from developmental testing to operational testing. In this talk I will discuss DOT&E’s efforts to institutionalize the use of statistical methods in DoD T&E. I will share case studies highlighting successful examples and areas for improvement.