Experience - Software System Quality Measurement
35 years of experience applying measurement techniques to characterize the capabilities and quality of complex software systems, including large-scale information system architectures, advanced service-oriented architectures for command and control (C2), distributed simulation systems, shipboard C2 systems, C2 systems supporting littoral warfare, soldier information systems, automated support for chemical, biological, and nuclear weapons defense operations, paperless processing for veteran benefits, autonomous robots, large-scale multi-resolution simulation systems, theater-level warfare simulation, simulation of airborne logistics operations, simulations of hazardous aerosol releases through accident or attack, distributed platform-level warfare simulations, semi-automated forces, and human behavior representations.
Developed objective and repeatable techniques for evaluating the quality of models and simulations, information systems, intelligent systems, distributed simulation interoperability, simulation validation processes, information system security, application source code, and software development processes, and applied these techniques to 30 different projects.
Developed and applied techniques for developing acceptability criteria, formulating verification and validation (V&V) test plans based upon design of experiments techniques, evaluating the impacts of interoperability upon simulation representation fidelity, validating simulation systems and general information systems, deriving acceptability recommendations, tailoring V&V approaches to particular systems based upon the risk their users incur, formally characterizing the representational fidelity of simulation systems, and analyzing techniques and systems for information warfare.
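As an illustrative sketch of the design-of-experiments approach to formulating V&V test plans, a full-factorial design simply enumerates every combination of the chosen factor levels; the factor names and levels below are hypothetical, not drawn from any of the projects listed here:

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every combination of factor levels (a full-factorial design)."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

# Hypothetical simulation input factors for a V&V test plan.
factors = {
    "wind_speed_mps": [2, 10],       # low / high wind
    "release_mass_kg": [1, 5, 25],   # three release sizes
    "terrain": ["flat", "urban"],
}

plan = full_factorial(factors)
print(len(plan))  # 2 * 3 * 2 = 12 test cases
```

Fractional-factorial or other screening designs would cut the run count when factors are numerous; the full-factorial case is shown only because it is the simplest to verify.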
Contributed to the US DoD Verification, Validation and Accreditation (VV&A) Recommended Practices Guide (RPG), the Simulation Interoperability Standards Organization (SISO) Fidelity Glossary, the US DoD Essentials of Modeling and Simulation online course, and IEEE Std. 1516.4-2007, IEEE Recommended Practice for Verification, Validation, and Accreditation of a Federation – An Overlay to the High Level Architecture Federation Development and Execution Process.
Acquired and exercised expertise in measuring the quality of models and simulations, information systems, intelligent systems, simulation validation processes, and software development processes.
This experience in complex system modeling and simulation can be organized into several technical areas. The following sections provide more detailed information about the experience in each area, along with citations for the technical publications produced.
- Formulated the SISO Fidelity Conceptual Model by applying set theory to describe simulation representational capabilities [71, 76, 79, 83]
- Developed a tailorable process for evidence-based verification and validation of simulations [80, 109]; applied that process to the verification and validation of the Joint Operational Effects Federation (JOEF)
- Developed a composite model of the activities and tasks for empirical verification and validation of software and simulations [108]; applied that model to the planning of V&V activities for the Joint Effects Model (JEM), Fleet Aerial Support Simulation (FASS), Joint Warning and Reporting Network (JWARN) and Joint Expeditionary Collective Protection (JECP) System Performance Model (SPM)
- Devised a technique for constructing quantitative referents from SME knowledge using survey research techniques; applied that technique to construct two validation referents for JWARS; developed a recommended practices standard for the U.S. Army that describes this technique [92, 94, 100, 102]
- Developed a technique for describing simulation representational capabilities using graph theory; demonstrated that technique on a turbine simulation and human behavior representations [106]
- Developed a process for risk-based verification, validation and accreditation of models and simulations [82]; applied this process to accredit four simulations: JEM, FASS, JWARN, and the Simulation Environment & Response Program Execution Nesting Tool (SERPENT)
- Developed a technique for quantitative accreditation of models and simulations that estimates the uncertainties of the accreditation recommendations; applied that technique to the JECP SPM and JEM accreditations
- Prepared guidance for risk-based accreditation for the US DoD and NATO [108], for objectively validating simulation compositions, and for verifying, validating and accrediting distributed simulations
- Prepared the fidelity section in the DMSO Essentials of Modeling and Simulation online course [103]
- Contributed to the US DoD Modeling and Simulation VV&A RPG [105]; authored and edited core documents for the roles of V&V Agent and Accreditation Agent for new simulation developments and legacy simulations; authored special topic sections on requirements, fidelity, validation, human behavior validation, federation VV&A, simulation credibility, and legacy simulation V&V; edited the VV&A RPG Glossary [105]
- Served as technical editor for the SISO Fidelity Glossary [75] and the IEEE standard for verification, validation and accreditation of simulation federations (IEEE Std. 1516.4-2007) [107]
- Developed a mathematical model that measures and predicts the effectiveness of C3 systems [10, 13]
- Developed techniques for analyzing reconfigurable system designs [59] and for measuring information system performance [74]; applied those techniques to assess Joint Simulation System (JSIMS) architecture composability, Extended Littoral Battlespace C4I architecture options, Advanced Information Technology Services Reference Architecture effectiveness, and JSIMS architecture security [74, 77]
- Developed a standards-based approach to software verification and validation; applied this approach to nine projects for the VBA: Paperless Initiative, Chapter 33 – Automate GI Bill, CWNRS, Veterans On-Line Application, Veterans Benefits Management System (VBMS), VBMS – Rating, Veteran Service Network (VETSNET), Benefits Gateway Service (BGS), and eBenefits
- Developed a technique for representing and analyzing software and system requirements using graph theory; applied this technique to describe simulation representational capabilities [106]
- Developed a technique for quantitatively estimating software quality from its source code; applied this technique to more than 39 samples of source code programmed in C++, Java and PL/SQL from 3 VBA application projects: Chapter 33 – Automate GI Bill, VETSNET and BGS; constructed source code quality baselines for C, C++, Java and PL/SQL from the evaluation results
- Applied concepts from discrete mathematics and irreversible thermodynamics to describe the phenomena underlying information system behavior [87-90]; applied this model to analyze information system security
- Prepared guidance describing the implementation of the standards-based approach to software verification and validation
- Developed objective techniques for rigorously describing robot tasks [12], quantitatively evaluating intelligent systems [70, 81], and validating and verifying knowledge-based simulations and human behavior representations
- Performed analyses to estimate the complexity of automation for nuclear weapons site security (NWSS); formally described the NWSS automation task
- Developed a technique for evaluating the quality of SAFOR representations; applied that technique to evaluate the SAFOR for SIMNET [56, 60]
- Developed a technique for describing the computational requirements of complex robots; applied that technique to assess the current design limitations of complex robot computing systems [54]
- Conducted and published an extensive survey of the technical literature in verification, validation, evaluation and testing of knowledge-based systems (KBSs) [98]
- Prepared guidance and a tutorial to describe the objective KBS V&V process [78, 86, 96]
- Developed a technique for consistently assessing the maturity of simulation validation processes; applied this technique to evaluate the processes for validating nine different simulations: JEM, JWARN, JOEF, SERPENT, Synthetic Theater of War Operations Research Model, Joint Multi-Resolution Model, Joint Theater Level Simulation (JTLS), Joint Conflict and Tactical Simulation, and Joint Live-Virtual-Constructive System [99, 101, 104, 110]
- Developed a technique for characterizing the quality of a source code development process from evaluations of the source code it produces; applied that technique to evaluation of the VETSNET and BGS source code development processes
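The survey-based referent construction described above can be sketched by aggregating independent SME estimates into a central value plus a plausible range; the response values below are hypothetical, and a real referent would also record the survey instrument and SME qualifications:

```python
import statistics

def referent_from_sme(estimates):
    """Aggregate independent SME estimates into a quantitative referent:
    a central value (median) bracketed by the first and third quartiles."""
    quartiles = statistics.quantiles(estimates, n=4)  # Q1, Q2, Q3
    return {
        "central": statistics.median(estimates),
        "low": quartiles[0],
        "high": quartiles[2],
    }

# Hypothetical SME survey responses (e.g., an estimated rate, in percent).
responses = [12, 15, 14, 18, 11, 16, 13, 20]
ref = referent_from_sme(responses)
print(ref["low"], ref["central"], ref["high"])
```

Simulation outputs falling inside the low–high band would then be judged consistent with SME knowledge during validation.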
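The accreditation-uncertainty idea noted above (attaching an uncertainty estimate to an accreditation recommendation) can be illustrated with a standard Wilson score interval on the proportion of acceptability criteria a simulation satisfies; the 18-of-20 figure is purely hypothetical:

```python
import math

def wilson_interval(passes, trials, z=1.96):
    """95% Wilson score confidence interval for the proportion of
    acceptability criteria satisfied -- one simple way to quantify the
    uncertainty behind an accreditation recommendation."""
    p = passes / trials
    denom = 1 + z**2 / trials
    centre = (p + z**2 / (2 * trials)) / denom
    half = z * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2)) / denom
    return centre - half, centre + half

lo, hi = wilson_interval(18, 20)  # hypothetical: 18 of 20 criteria met
print(round(lo, 3), round(hi, 3))
```

A wide interval (few criteria evaluated) would argue for conditional accreditation or more testing, even when the observed pass rate looks high.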
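The graph-theoretic representation of requirements mentioned above can be sketched with a plain adjacency map and a depth-first traversal that traces which lower-level requirements a top-level requirement decomposes into; the requirement identifiers are hypothetical:

```python
# Directed graph of requirement refinements (adjacency map; names hypothetical).
requirements = {
    "R1": ["R1.1", "R1.2"],   # top-level requirement
    "R1.1": ["R1.1.1"],
    "R1.2": [],
    "R1.1.1": [],
}

def reachable(graph, start):
    """Every requirement traceable from `start`, found by depth-first search."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, []))
    return seen

print(sorted(reachable(requirements, "R1")))
```

The same traversal run on the graph's reverse answers the complementary question: which top-level requirements a given capability traces back to.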