System Verification

Proving the Design Solution Satisfies the Requirements
 
 
Academic Press
  • 2nd edition
  • |
  • published May 7, 2016
  • |
  • 414 pages
 
E-book | ePUB with Adobe DRM | system requirements
ISBN 978-0-12-804222-9
 

System Verification: Proving the Design Solution Satisfies the Requirements, Second Edition explains how to determine what verification work must be done, how to break the total task down into verification tasks using six straightforward methods, how to prepare a plan, procedure, and report for each of these tasks, and how to audit the content of those reports for a particular product entity.

This process-centered book is applicable to engineering and computing projects of all kinds, and the lifecycle approach helps all stakeholders in the design process understand how the verification and validation stage is significant to them. In addition to many flowcharts that illustrate the verification procedures involved, the book also includes 14 verification form templates for use in practice.

The author draws on his experience consulting for industry and lecturing to provide a uniquely practical, easy-to-use guide that is essential reading for systems and validation engineers, as well as everyone involved in the product design process.


  • Includes 14 real-life templates for use in verification tasks
  • Explains concepts in the context of the entire design lifecycle, helping all project stakeholders engage
  • Contains a process-focused approach to design model verification that can be applied to all engineering design and software development projects


Jeffrey O. Grady is the owner of JOG System Engineering, a consulting and teaching company, and an adjunct professor at the University of California, San Diego. He was formerly the manager of systems development at GD Space Systems, and is the author of ten books in the systems engineering field. Jeff is an INCOSE Fellow, Founder, and ESEP. He worked as an employee for Librascope, Ryan Aeronautical, General Dynamics Convair, and General Dynamics Space Systems, has consulted in systems engineering for many companies developing military and commercial products, and has taught hundreds of systems engineering courses for universities, short-course companies, and his own company.
  • English
  • San Diego
  • |
  • USA
Elsevier Science
  • 9.34 MB
ISBN-13: 978-0-12-804222-9
ISBN-10: 0-12-804222-2
  • Front Cover
  • System Verification: Proving the Design Solution Satisfies the Requirements
  • Copyright
  • Contents
  • About the author
  • Preface
  • Acknowledgments
  • Acronyms and Abbreviations
  • Glossary
  • Chapter 1: Setting the Stage
  • 1.1. The Enduring Truth That Need Not Be
  • 1.2. Overview of This Chapter
  • 1.3. Introductory Ideas Included in Chapter 1
  • 1.3.1. Our Verification Objective
  • 1.3.1.1. What Is Important?
  • 1.3.1.2. Customer Relationship
  • 1.3.1.3. Our Scope of Interest
  • 1.3.1.4. Verification Phasing
  • 1.3.1.5. Verification Class Composition
  • 1.3.1.6. The Two Vs
  • 1.3.1.7. Who Should Be Responsible?
  • 1.3.1.8. Assembly of the Program Verification Process
  • 1.3.2. Systems and Their Development
  • 1.3.2.1. What Is a System?
  • 1.3.2.2. Systems Development Overview
  • 1.3.2.2.1. What Is Systems Development?
  • 1.3.2.2.2. Three Steps on the Way to Great Systems
  • 1.3.2.2.3. The Key Role of Systems Engineering
  • 1.3.2.3. System Modeling Alternatives
  • 1.3.3. Characteristics of the Enterprise
  • 1.3.3.1. Development Enterprise Organizational Structure
  • 1.3.3.2. The Business of Programs
  • 1.3.3.3. Program Structures
  • 1.3.3.4. Toward a Standard Process
  • 1.3.4. Development Environments
  • 1.3.4.1. Program Phasing Models
  • 1.3.4.1.1. The Waterfall Development Model
  • 1.3.4.1.2. The Spiral Development Model
  • 1.3.4.1.3. The V Development Model
  • 1.3.4.2. Process Discipline Variations
  • 1.3.4.2.1. Rigid Process Compliance
  • 1.3.4.2.2. Rapid Prototyping and Rapid Application Development
  • 1.3.4.3. Number of Systems Delivered Variations
  • 1.3.4.3.1. High-Rate Production Program
  • 1.3.4.3.2. Low-Volume, High-Dollar Production Program
  • 1.3.4.3.3. One-of-a-Kind Production Program
  • 1.3.4.4. Development Environment Integration
  • 1.3.4.5. View from a Product Type Perspective
  • 1.4. Overview of the Rest of the Book
  • 1.4.1. Specifications and the Process of Creating Them
  • 1.4.2. Verification Process Design
  • 1.4.3. The Four Verification Processes
  • 1.4.4. Process Verification
  • 1.4.5. Closure
  • Chapter 2: Specifications and Their Content
  • 2.1. Overview of This Chapter
  • 2.2. Kinds of Program Specifications
  • 2.3. Performance Specification Structure
  • 2.3.1. Performance Specification Section 3 Structure
  • 2.3.2. Performance Specification Section 4 Structure
  • 2.3.2.1. MIL-STD-961E Structure
  • 2.3.2.2. An Alternative Section 4 Structure
  • 2.3.3. Performance Specification Section 6 Structure
  • 2.3.4. Performance Specification Section 2 Structure
  • 2.4. Detail Specification Structure
  • 2.5. Interface Specification Structure
  • 2.6. Parts, Materials, and Processes Specifications
  • 2.7. Specification Guidance Documents
  • 2.8. Paragraph Numbers
  • Chapter 3: Specification Section 3 Preparation
  • 3.1. Overview of This Chapter
  • 3.2. Requirements
  • 3.3. Four Comprehensive Models
  • 3.3.1. The Functional Model
  • 3.3.1.1. The Need and Its Initial Expansion Using Functional Analysis
  • 3.3.1.2. Structured Decomposition Using Functional Analysis
  • 3.3.1.3. Functional Analysis Continuation
  • 3.3.1.4. Performance Requirements Analysis
  • 3.3.1.5. Process Requirements Analysis
  • 3.3.2. Computer Software Derived Models
  • 3.3.2.1. Modern Structured Analysis and Process for System Architecture and Requirements Engineering
  • 3.3.2.2. Unified Modeling Language - System Modeling Language
  • 3.3.2.3. Unified Process for DoDAF/MoDAF
  • 3.3.3. Product and Relationship Entity Identification Models
  • 3.3.4. Design Constraints Analysis
  • 3.3.4.1. Interface Requirements Analysis
  • 3.3.4.2. Specialty Engineering Requirements Analysis
  • 3.3.4.3. Environmental Requirements Analysis
  • 3.4. Transition Point
  • 3.5. An Important Management Responsibility
  • Chapter 4: Specification Section 4 Preparation
  • 4.1. Chapter Overview
  • 4.2. Verification Classes
  • 4.2.1. Parts, Materials, and Processes Class
  • 4.2.2. Item and Interface Qualification Class
  • 4.2.3. Item and Interface Acceptance Class
  • 4.2.4. System Development Test and Evaluation Class
  • 4.3. Items Subject to Qualification and Acceptance
  • 4.4. Verification Directionality
  • 4.5. Product Verification Layering
  • 4.6. Forming Verification Strings
  • 4.7. Verification Methods
  • 4.7.1. Analysis
  • 4.7.2. Demonstration
  • 4.7.3. Examination
  • 4.7.4. Test
  • 4.7.5. Special
  • 4.7.6. None
  • 4.8. Product and Verification Levels
  • 4.9. Verification Requirements Definition Timing
  • 4.10. Verification Requirements Analysis
  • 4.10.1. Selecting the Method
  • 4.10.2. Writing Responsibility and Support
  • 4.10.3. Writing the Verification Paragraph
  • 4.10.3.1. Example of Qualification Examination for Compliance
  • 4.10.3.2. Another Example of Qualification Examination for Compliance
  • 4.10.3.3. Example of Qualification Analysis for Compliance
  • 4.10.3.4. Example of Qualification Test for Compliance
  • 4.10.3.5. Example of Qualification Demonstration for Compliance
  • 4.11. Special Problems with Interface Verification
  • Chapter 5: Establishing and Maintaining Enterprise Readiness for Program Implementation of Verification
  • 5.1. Our Objective in This Chapter
  • 5.2. Deployable Resource Responsibilities
  • 5.3. New Program Resource Base
  • 5.3.1. Personnel
  • 5.3.2. Document Templates and Data Item Descriptions
  • 5.3.2.1. Specification Templates and DID
  • 5.3.2.2. Verification Document Templates and DID
  • 5.3.2.3. An Omitted Document
  • 5.3.3. Requirements and Verification Database
  • 5.4. Assurance of Program Compliance
  • Chapter 6: Verification Process Design for a Program
  • 6.1. Overview of This Chapter
  • 6.2. Systems Verification Process Overview
  • 6.2.1. System Level Planning
  • 6.2.2. Verification Planning During System Requirements Work
  • 6.2.3. First Expansion of Life-cycle Function F124
  • 6.2.3.1. Parts, Materials, and Processes Verification
  • 6.2.3.2. Item and Interface Qualification Verification
  • 6.2.3.3. System Test and Evaluation Verification
  • 6.2.3.4. Item and Interface Acceptance Verification
  • 6.3. Manage Verification
  • 6.3.1. Identification of Entities Requiring Verification
  • 6.3.2. Verification Task Identification
  • 6.3.3. Verification Schedule
  • 6.3.4. Verification Budget
  • 6.4. Feedback into Product Models and Design
  • 6.5. Technical Data Assessment
  • 6.6. Contractual Considerations
  • 6.7. Verification Documentation Approach
  • 6.8. In the Aggregate
  • 6.9. Summary Comparison of Classes
  • Chapter 7: Parts, Materials, and Processes Verification
  • 7.1. Chapter Overview
  • 7.2. Parts, Materials, and Processes Organizational Linkage
  • 7.3. Program Preferred PMP List
  • 7.4. PMP Verification
  • 7.5. Parts and Materials Coordination with Availability Search
  • 7.6. Long Lead List
  • 7.7. Software PMP
  • Chapter 8: Item and Interface Qualification Verification
  • 8.1. Overview of This Chapter
  • 8.2. Overall Item Qualification Process
  • 8.2.1. Where Does the Qualification Verification Work Begin?
  • 8.2.2. Good Works and Integrity Again and Again
  • 8.2.3. Process Flow Diagram Expansion
  • 8.3. Item Qualification Process Design
  • 8.3.1. Qualification Task Assignments and Integration Opportunities
  • 8.3.2. Developing the Task Plans and Procedures
  • 8.3.3. Level of Task Accomplishment
  • 8.3.4. Qualification Verification Process Design Example
  • 8.3.4.1. The Size of the Verification Process Design Chore
  • 8.3.4.2. The Compliance Matrix
  • 8.3.4.3. Task and Item Matrices
  • 8.3.5. Task Principal Engineer Assignment
  • 8.4. Item Qualification Verification Documentation
  • 8.4.1. Qualification Documentation Summary
  • 8.4.2. Verification Documentation Suite
  • 8.4.3. Item Qualification Verification Planning Documentation
  • 8.4.4. Accomplish Item or Interface M Task N
  • 8.4.4.1. Accomplish Item or Interface M Test Task N
  • 8.4.4.2. Accomplish Item or Interface M Demonstration Task N
  • 8.4.4.3. Accomplish Item or Interface M Analysis Task N
  • 8.4.4.4. Accomplish Item or Interface M Examination Task N
  • 8.4.4.5. Accomplish Item M Special Task N
  • 8.4.5. Item Qualification Verification Task Reporting
  • 8.4.6. Item M Qualification Integration and Optimization
  • 8.4.6.1. Assuring Item M Qualification Completeness
  • 8.4.6.2. Interface Qualification Verification Integration
  • 8.5. Recover Item Qualification Task Resources
  • 8.6. Item Qualification Verification Management and Audit
  • 8.7. Item Qualification Verification Variations
  • 8.8. Item Qualification Closeout
  • Chapter 9: Item and Interface Acceptance Verification
  • 9.1. Item Acceptance Test Planning Analysis
  • 9.1.1. The Notion of Acceptance
  • 9.1.2. Where Are the Requirements?
  • 9.1.3. How Does the Detail and Performance Specification Content Differ?
  • 9.1.4. Conversion to Verification Requirements
  • 9.1.5. Acceptance Test Planning, Procedures, and Results Data Collection
  • 9.1.6. Associate Contractor Relationships
  • 9.1.7. Manufacturing or Test and Evaluation Driven Acceptance Testing
  • 9.1.8. Information Management
  • 9.1.9. Coordination between Acceptance Testing and Special Test Equipment
  • 9.1.10. Relationship between Technical Data and Acceptance
  • 9.1.11. Post-Delivery Testing Applications
  • 9.1.12. Intercontinental Acceptance
  • 9.2. Nontest Item Acceptance Methods Coordination
  • 9.2.1. Organizational Responsibilities
  • 9.2.2. The Coordination Task
  • 9.2.3. Acceptance Task Matrix
  • 9.2.4. Examination Cases
  • 9.2.4.1. Quality Acceptance Examinations
  • 9.2.4.2. Engineering Participation in Acceptance Examination
  • 9.2.4.3. Software Acceptance Examination
  • 9.2.5. Demonstration Cases
  • 9.2.5.1. Logistics Demonstrations
  • 9.2.5.2. Flight Demonstration
  • 9.2.6. Analysis Cases
  • 9.3. Product Acceptance Verification Reporting
  • 9.3.1. Program Type
  • 9.3.2. The Two Steps of Acceptance
  • 9.3.3. Significant Differences
  • 9.3.4. Reporting Structures
  • 9.3.5. First-Article Acceptance
  • 9.3.6. Recurring Acceptance Verification
  • 9.4. Product Verification Management and Audit
  • 9.4.1. The Second Stage of Verification
  • 9.4.2. The Beginning of Acceptance Verification
  • 9.4.3. The Basis of Acceptance
  • 9.4.4. Acceptance Documentation
  • 9.4.5. Management of the Work
  • 9.4.6. FRACAS
  • 9.4.6.1. Acceptance Ethics
  • 9.4.6.2. FRACAS Implementation
  • 9.4.7. Physical Configuration Audit
  • 9.4.7.1. PCA Planning and Preparation
  • 9.4.7.2. PCA Implementation
  • 9.4.7.2.1. The Physically Oriented Audit
  • 9.4.7.2.2. The Paper-Dominated Audit
  • 9.4.7.2.3. Combined Mode Audit
  • 9.4.7.3. Post-PCA Activity
  • 9.4.8. Software Acceptance
  • Chapter 10: System Test and Evaluation Verification
  • 10.1. System Verification Planning
  • 10.1.1. The Beginning
  • 10.1.2. System-Level Testing and Analysis Implementation
  • 10.1.2.1. Product Cases
  • 10.1.2.2. System Test Categories
  • 10.1.2.2.1. Development Test and Evaluation
  • 10.1.2.2.2. Operational Test and Evaluation
  • 10.1.2.2.3. Initial or Interim Operational Test and Evaluation
  • 10.1.2.2.4. Follow-On Operational Test and Evaluation
  • 10.1.3. Other Forms of System Testing
  • 10.1.3.1. Quality and Reliability Monitoring
  • 10.1.3.2. System Compatibility Test
  • 10.1.3.3. Certification
  • 10.1.4. The Grand Plan
  • 10.1.5. The Limits of Intellectual Thought
  • 10.1.6. The Details
  • 10.2. System Test and Evaluation Implementation and Management
  • 10.2.1. System Test and Evaluation in the Beginning
  • 10.2.2. Instrumentation Management
  • 10.2.3. Reaching a State of DT&E Readiness
  • 10.2.4. A Normal Evolution
  • 10.2.5. Flexibility in the Face of Obstacles
  • 10.2.6. DT&E Reporting
  • 10.2.7. Logistics Support in DT&E
  • 10.2.8. OT&E
  • 10.2.9. Test Results Applications
  • 10.2.10. System Test and Evaluation Closure
  • 10.3. Item and System Reverification
  • 10.3.1. Reverification Exposed
  • 10.3.2. Project Phase Subsets
  • 10.3.2.1. Preproduction Development
  • 10.3.2.2. Manufacturing
  • 10.3.2.3. Product Use
  • 10.3.3. Temporal Considerations
  • 10.3.4. Process Redesign
  • Chapter 11: Enterprise Process Verification
  • 11.1. Is There a Difference?
  • 11.2. Process Validation
  • 11.2.1. Completeness
  • 11.2.2. Accounting for Planning Risks
  • 11.3. Program Process Design
  • 11.4. Process Verification
  • 11.4.1. Program and Functional Metrics
  • 11.4.2. Use of Cost/Schedule Control System in Process Verification
  • 11.4.3. Progressive Planning Improvements
  • 11.5. Organizational Possibilities
  • Chapter 12: Postscript
  • 12.1. Postscript Plan
  • 12.2. Closure on Meanings
  • 12.3. Hopes for Balanced Treatment
  • 12.4. Tools
  • 12.5. Singling Up the WBS Lines
  • 12.6. Focusing on the Whole
  • Appendix A: System Verification Template and Data Item Descriptions
  • A.1. Introduction
  • A.2. Application
  • A.3. Presentation
  • A.4. Template Update
  • Appendix A1: System DT&E and Item or Interface: Qualification Task Plan Template
  • Note
  • Note
  • Appendix A2: System DT&E and Item or Interface: Qualification Task Report Template
  • Appendix A3: Program Functional Configuration Audit: Report Template
  • Appendix A4: Item or Interface Acceptance Verification: Task Plan Template
  • Note
  • Note
  • Appendix A5: Item or Interface Acceptance Verification: Task Report Template
  • Appendix A6: Program Physical Configuration Audit: Report Template
  • Appendix A7: System Development Test and Evaluation: Master Plan Template
  • Appendix A8: System Development Test and Evaluation: Report Template
  • Appendix A9: Program Parts, Materials, and Processes: Master Plan Template
  • Note
  • Index
  • Back Cover

File format: EPUB
Copy protection: Adobe DRM (Digital Rights Management)

System requirements:

Computer (Windows; macOS; Linux): Install the free Adobe Digital Editions software before downloading (see e-book help).

Tablet/smartphone (Android; iOS): Install the free Adobe Digital Editions app before downloading (see e-book help).

E-book readers: Bookeen, Kobo, PocketBook, Sony, Tolino, and many others (not Kindle)

The EPUB file format is well suited to novels and nonfiction, i.e., to "flowing" text without complex layout. On e-readers and smartphones, line and page breaks adapt automatically to small displays. Adobe DRM applies "hard" copy protection here: if the necessary requirements are not met, you will not be able to open the e-book, so prepare your reading hardware before downloading.

Further information is available in our e-book help.


Download (available immediately)

€93.95
incl. 19% VAT
Download / single-user license
ePUB with Adobe DRM
see system requirements
