Software Testing and Continuous Quality Improvement, Second Edition

Edition: 2nd
Format: Hardcover
Pub. Date: 2004-10-14
Publisher: Auerbach Publications

Summary

Software Testing and Continuous Quality Improvement, Second Edition, presents a quality framework for software testing in both traditional structured and unstructured environments. Section I reviews modern quality assurance principles and best practices. Section II examines testing within the waterfall development methodology. Section III contrasts the waterfall methodology with the rapid application spiral environment. Section IV discusses the fundamental challenges of maintaining and improving existing systems. Section V traces the history of software testing, previews future testing tools, and guides the selection of appropriate tools for various environments, offering examples of some of the most popular products and a detailed methodology for evaluating them.

Table of Contents

SECTION I SOFTWARE QUALITY IN PERSPECTIVE 1(50)
1 Quality Assurance Framework
5(24)
What Is Quality?
5(1)
Prevention versus Detection
6(1)
Verification versus Validation
7(1)
Software Quality Assurance
8(1)
Components of Quality Assurance
9(1)
Software Testing
10(1)
Quality Control
11(1)
Software Configuration Management
12(1)
Elements of Software Configuration Management
12(1)
Component Identification
13(1)
Version Control
14(1)
Configuration Building
14(1)
Change Control
15(1)
Software Quality Assurance Plan
16(1)
Steps to Develop and Implement a Software Quality Assurance Plan
16(3)
Step 1. Document the Plan
16(2)
Step 2. Obtain Management Acceptance
18(1)
Step 3. Obtain Development Acceptance
18(1)
Step 4. Plan for Implementation of the SQA Plan
19(1)
Step 5. Execute the SQA Plan
19(1)
Quality Standards
19(8)
ISO 9000
19(1)
Capability Maturity Model (CMM)
20(3)
Level 1 Initial
21(1)
Level 2 Repeatable
21(1)
Level 3 Defined
22(1)
Level 4 Managed
22(1)
Level 5 Optimized
23(1)
PCMM
23(1)
CMMI
24(1)
Malcolm Baldrige National Quality Award
24(3)
Notes
27(2)
2 Overview of Testing Techniques
29(12)
Black-Box Testing (Functional)
29(1)
White-Box Testing (Structural)
30(1)
Gray-Box Testing (Functional and Structural)
30(2)
Manual versus Automated Testing
31(1)
Static versus Dynamic Testing
31(1)
Taxonomy of Software Testing Techniques
32(9)
3 Quality through Continuous Improvement Process
41(10)
Contribution of W. Edwards Deming
41(1)
Role of Statistical Methods
42(1)
Cause-and-Effect Diagram
42(1)
Flow Chart
42(1)
Pareto Chart
42(1)
Run Chart
42(1)
Histogram
43(1)
Scatter Diagram
43(1)
Control Chart
43(1)
Deming's 14 Quality Principles
43(5)
Point 1: Create Constancy of Purpose
43(1)
Point 2: Adopt the New Philosophy
44(1)
Point 3: Cease Dependence on Mass Inspection
44(1)
Point 4: End the Practice of Awarding Business on Price Tag Alone
44(1)
Point 5: Improve Constantly and Forever the System of Production and Service
45(1)
Point 6: Institute Training and Retraining
45(1)
Point 7: Institute Leadership
45(1)
Point 8: Drive Out Fear
46(1)
Point 9: Break Down Barriers between Staff Areas
46(1)
Point 10: Eliminate Slogans, Exhortations, and Targets for the Workforce
47(1)
Point 11: Eliminate Numerical Goals
47(1)
Point 12: Remove Barriers to Pride of Workmanship
47(1)
Point 13: Institute a Vigorous Program of Education and Retraining
48(1)
Point 14: Take Action to Accomplish the Transformation
48(1)
Continuous Improvement through the Plan, Do, Check, Act Process
48(2)
Going around the PDCA Circle
50(1)
SECTION II LIFE CYCLE TESTING REVIEW 51(46)
4 Overview
53(20)
Waterfall Development Methodology
53(1)
Continuous Improvement "Phased" Approach
54(1)
Psychology of Life Cycle Testing
54(1)
Software Testing as a Continuous Improvement Process
55(3)
The Testing Bible: Software Test Plan
58(2)
Major Steps to Develop a Test Plan
60(1)
1. Define the Test Objectives
60(1)
2. Develop the Test Approach
60(1)
3. Define the Test Environment
60(1)
4. Develop the Test Specifications
61(1)
5. Schedule the Test
61(1)
6. Review and Approve the Test Plan
61(1)
Components of a Test Plan
61(1)
Technical Reviews as a Continuous Improvement Process
61(4)
Motivation for Technical Reviews
65(1)
Types of Reviews
66(1)
Structured Walkthroughs
66(1)
Inspections
66(3)
Participant Roles
69(1)
Steps for an Effective Review
70(1)
1. Plan for the Review Process
70(1)
2. Schedule the Review
70(1)
3. Develop the Review Agenda
71(1)
4. Create a Review Report
71(2)
5 Verifying the Requirements Phase
73(6)
Testing the Requirements with Technical Reviews
74(1)
Inspections and Walkthroughs
74(1)
Checklists
74(1)
Methodology Checklist
75(1)
Requirements Traceability Matrix
76(1)
Building the System/Acceptance Test Plan
76(3)
6 Verifying the Logical Design Phase
79(4)
Data Model, Process Model, and the Linkage
79(1)
Testing the Logical Design with Technical Reviews
80(1)
Refining the System/Acceptance Test Plan
81(2)
7 Verifying the Physical Design Phase
83(4)
Testing the Physical Design with Technical Reviews
83(2)
Creating Integration Test Cases
85(1)
Methodology for Integration Testing
85(2)
Step 1: Identify Unit Interfaces
85(1)
Step 2: Reconcile Interfaces for Completeness
85(1)
Step 3: Create Integration Test Conditions
86(1)
Step 4: Evaluate the Completeness of Integration Test Conditions
86(1)
8 Verifying the Program Unit Design Phase
87(4)
Testing the Program Unit Design with Technical Reviews
87(1)
Sequence
87(1)
Selection
87(1)
Iteration
87(1)
Creating Unit Test Cases
88(3)
9 Verifying the Coding Phase
91(6)
Testing Coding with Technical Reviews
91(1)
Executing the Test Plan
91(1)
Unit Testing
92(1)
Integration Testing
93(1)
System Testing
93(1)
Acceptance Testing
94(1)
Defect Recording
95(2)
SECTION III SOFTWARE TESTING METHODOLOGY 97(1)
10 Development Methodology Overview
99(138)
Limitations of Life Cycle Development
99(1)
The Client/Server Challenge
100(1)
Psychology of Client/Server Spiral Testing
101(5)
The New School of Thought
101(1)
Tester/Developer Perceptions
102(1)
Project Goal: Integrate QA and Development
103(1)
Iterative/Spiral Development Methodology
104(2)
Role of JADs
106(1)
Role of Prototyping
107(1)
Methodology for Developing Prototypes
108(4)
1. Develop the Prototype
108(2)
2. Demonstrate Prototypes to Management
110(1)
3. Demonstrate Prototype to Users
110(1)
4. Revise and Finalize Specifications
111(1)
5. Develop the Production System
111(1)
Continuous Improvement "Spiral" Testing Approach
112(5)
11 Information Gathering (Plan)
117(1)
Step 1: Prepare for the Interview
117(1)
Task 1: Identify the Participants
117(1)
Task 2: Define the Agenda
118(1)
Step 2: Conduct the Interview
118(8)
Task 1: Understand the Project
118(3)
Task 2: Understand the Project Objectives
121(1)
Task 3: Understand the Project Status
121(1)
Task 4: Understand the Project Plans
122(1)
Task 5: Understand the Project Development Methodology
122(1)
Task 6: Identify the High-Level Business Requirements
123(1)
Task 7: Perform Risk Analysis
124(2)
Computer Risk Analysis
124(1)
Method 1 Judgment and Instinct
125(1)
Method 2 Dollar Estimation
125(1)
Method 3 Identifying and Weighting Risk Attributes
125(1)
Step 3: Summarize the Findings
126(4)
Task 1: Summarize the Interview
126(1)
Task 2: Confirm the Interview Findings
127(2)
12 Test Planning (Plan)
129(1)
Step 1: Build a Test Plan
130(19)
Task 1: Prepare an Introduction
130(1)
Task 2: Define the High-Level Functional Requirements (in Scope)
131(1)
Task 3: Identify Manual/Automated Test Types
132(1)
Task 4: Identify the Test Exit Criteria
133(1)
Task 5: Establish Regression Test Strategy
134(2)
Task 6: Define the Test Deliverables
136(1)
Task 7: Organize the Test Team
137(1)
Task 8: Establish a Test Environment
138(1)
Task 9: Define the Dependencies
139(1)
Task 10: Create a Test Schedule
139(3)
Task 11: Select the Test Tools
142(1)
Task 12: Establish Defect Recording/Tracking Procedures
143(2)
Task 13: Establish Change Request Procedures
145(2)
Task 14: Establish Version Control Procedures
147(1)
Task 15: Define Configuration Build Procedures
147(1)
Task 16: Define Project Issue Resolution Procedures
148(1)
Task 17: Establish Reporting Procedures
148(1)
Task 18: Define Approval Procedures
149(1)
Step 2: Define the Metric Objectives
149(5)
Task 1: Define the Metrics
150(1)
Task 2: Define the Metric Points
151(3)
Step 3: Review/Approve the Plan
154(3)
Task 1: Schedule/Conduct the Review
154(1)
Task 2: Obtain Approvals
154(3)
13 Test Case Design (Do)
157(1)
Step 1: Design Function Tests
157(6)
Task 1: Refine the Functional Test Requirements
157(2)
Task 2: Build a Function/Test Matrix
159(4)
Step 2: Design GUI Tests
163(4)
Ten Guidelines for Good GUI Design
164(1)
Task 1: Identify the Application GUI Components
165(1)
Task 2: Define the GUI Tests
165(2)
Step 3: Define the System/Acceptance Tests
167(2)
Task 1: Identify Potential System Tests
167(1)
Task 2: Design System Fragment Tests
168(1)
Task 3: Identify Potential Acceptance Tests
169(1)
Step 4: Review/Approve Design
169(4)
Task 1: Schedule/Prepare for Review
169(1)
Task 2: Obtain Approvals
169(4)
14 Test Development (Do)
173(1)
Step 1: Develop Test Scripts
173(1)
Task 1: Script the Manual/Automated GUI/Function Tests
173(1)
Task 2: Script the Manual/Automated System Fragment Tests
173(1)
Step 2: Review/Approve Test Development
174(4)
Task 1: Schedule/Prepare for Review
174(1)
Task 2: Obtain Approvals
174(3)
15 Test Coverage through Traceability
177(4)
Use Cases and Traceability
178(2)
Summary
180(1)
16 Test Execution/Evaluation (Do/Check)
181(1)
Step 1: Setup and Testing
181(2)
Task 1: Regression Test the Manual/Automated Spiral Fixes
181(1)
Task 2: Execute the Manual/Automated New Spiral Tests
182(1)
Task 3: Document the Spiral Test Defects
183(1)
Step 2: Evaluation
183(1)
Task 1: Analyze the Metrics
183(1)
Step 3: Publish Interim Report
184(3)
Task 1: Refine the Test Schedule
184(1)
Task 2: Identify Requirement Changes
185(2)
17 Prepare for the Next Spiral (Act)
187(1)
Step 1: Refine the Tests
187(2)
Task 1: Update the Function/GUI Tests
187(1)
Task 2: Update the System Fragment Tests
188(1)
Task 3: Update the Acceptance Tests
189(1)
Step 2: Reassess the Team, Procedures, and Test Environment
189(2)
Task 1: Evaluate the Test Team
189(1)
Task 2: Review the Test Control Procedures
189(1)
Task 3: Update the Test Environment
190(1)
Step 3: Publish Interim Test Report
191(4)
Task 1: Publish the Metric Graphics
191(4)
Test Case Execution Status
191(1)
Defect Gap Analysis
191(1)
Defect Severity Status
191(1)
Test Burnout Tracking
192(3)
18 Conduct the System Test
195(5)
Step 1: Complete System Test Plan
195(5)
Task 1: Finalize the System Test Types
195(2)
Task 2: Finalize System Test Schedule
197(1)
Task 3: Organize the System Test Team
197(1)
Task 4: Establish the System Test Environment
197(3)
Task 5: Install the System Test Tools
200(1)
Step 2: Complete System Test Cases
200(11)
Task 1: Design/Script the Performance Tests
200(3)
Monitoring Approach
201(1)
Probe Approach
202(1)
Test Drivers
202(1)
Task 2: Design/Script the Security Tests
203(1)
A Security Design Strategy
203(1)
Task 3: Design/Script the Volume Tests
204(1)
Task 4: Design/Script the Stress Tests
205(1)
Task 5: Design/Script the Compatibility Tests
206(1)
Task 6: Design/Script the Conversion Tests
206(1)
Task 7: Design/Script the Usability Tests
207(1)
Task 8: Design/Script the Documentation Tests
208(1)
Task 9: Design/Script the Backup Tests
208(1)
Task 10: Design/Script the Recovery Tests
209(1)
Task 11: Design/Script the Installation Tests
209(1)
Task 12: Design/Script Other System Test Types
210(1)
Step 3: Review/Approve System Tests
211(1)
Task 1: Schedule/Conduct the Review
211(1)
Task 2: Obtain Approvals
212(1)
Step 4: Execute the System Tests
212(3)
Task 1: Regression Test the System Fixes
212(1)
Task 2: Execute the New System Tests
213(1)
Task 3: Document the System Defects
213(2)
19 Conduct Acceptance Testing
215(1)
Step 1: Complete Acceptance Test Planning
215(3)
Task 1: Finalize the Acceptance Test Types
215(1)
Task 2: Finalize the Acceptance Test Schedule
215(1)
Task 3: Organize the Acceptance Test Team
215(2)
Task 4: Establish the Acceptance Test Environment
217(1)
Task 5: Install Acceptance Test Tools
218(1)
Step 2: Complete Acceptance Test Cases
218(1)
Task 1: Subset the System-Level Test Cases
218(1)
Task 2: Design/Script Additional Acceptance Tests
219(1)
Step 3: Review/Approve Acceptance Test Plan
219(1)
Task 1: Schedule/Conduct the Review
219(1)
Task 2: Obtain Approvals
220(1)
Step 4: Execute the Acceptance Tests
220(3)
Task 1: Regression Test the Acceptance Fixes
220(1)
Task 2: Execute the New Acceptance Tests
220(1)
Task 3: Document the Acceptance Defects
221(2)
20 Summarize/Report Spiral Test Results
223(1)
Step 1: Perform Data Reduction
223(1)
Task 1: Ensure All Tests Were Executed/Resolved
223(1)
Task 2: Consolidate Test Defects by Test Number
223(1)
Task 3: Post Remaining Defects to a Matrix
223(1)
Step 2: Prepare Final Test Report
224(9)
Task 1: Prepare the Project Overview
225(1)
Task 2: Summarize the Test Activities
225(1)
Task 3: Analyze/Create Metric Graphics
225(7)
Defects by Function
225(2)
Defects by Tester
227(1)
Defect Gap Analysis
227(1)
Defect Severity Status
227(1)
Test Burnout Tracking
227(1)
Root Cause Analysis
227(3)
Defects by How Found
230(1)
Defects by Who Found
230(1)
Functions Tested and Not
230(1)
System Testing Defect Types
230(2)
Acceptance Testing Defect Types
232(1)
Task 4: Develop Findings/Recommendations
232(1)
Step 3: Review/Approve the Final Test Report
233(6)
Task 1: Schedule/Conduct the Review
233(3)
Task 2: Obtain Approvals
236(1)
Task 3: Publish the Final Test Report
236(1)
SECTION IV TEST PROJECT MANAGEMENT 237(1)
21 Overview of General Project Management
239(48)
Define the Objectives
239(1)
Define the Scope of the Project
240(1)
Identify the Key Activities
240(1)
Estimate Correctly
240(1)
Design
241(1)
Manage People
241(5)
Leadership
242(1)
Communication
242(1)
Solving Problems
243(1)
Continuous Monitoring
243(1)
Manage Changes
243(2)
22 Test Project Management
245(3)
Understand the Requirements
246(1)
Test Planning
246(1)
Test Execution
247(1)
Identify and Improve Processes
247(1)
Essential Characteristics of a Test Project Manager
248(7)
Requirement Analysis
248(1)
Gap Analysis
248(1)
Lateral Thinking in Developing Test Cases
248(1)
Avoid Duplication and Repetition
249(1)
Test Data Generation
249(1)
Validate the Test Environment
249(1)
Test to Destroy
249(1)
Analyze the Test Results
249(1)
Do Not Hesitate to Accept Help from Others
250(1)
Convey Issues as They Arise
250(1)
Improve Communication
250(1)
Always Keep Updating Your Business Knowledge
250(1)
Learn the New Testing Technologies and Tools
250(1)
Deliver Quality
251(1)
Improve the Process
251(1)
Create a Knowledge Base
251(1)
Repeat the Success
251(2)
23 Test Estimation
253(2)
Finish-to-Start (FS)
255(1)
Start-to-Start (SS)
255(1)
Finish-to-Finish (FF)
255(1)
Start-to-Finish (SF)
255(1)
Critical Activities for Test Estimation
255(2)
Test Scope Document
256(1)
Test Strategy
256(1)
Test Condition
257(1)
Test Case
257(1)
Test Script
257(1)
Execution/Run Plan
257(1)
Factors Affecting Test Estimation
257(1)
Test Planning Estimation
258(1)
Test Execution and Controlling Effort
259(1)
Test Result Analysis
259(1)
Effort Estimation - Model Project
259(4)
24 Defect Monitoring and Management Process
263(1)
Defect Reporting
264(1)
Defect Meetings
265(1)
Defect Classifications
265(1)
Defect Priority
266(1)
Defect Category
266(4)
Defect Metrics
267(2)
25 Integrating Testing into Development Methodology
269(8)
Step 1. Organize the Test Team
270(1)
Step 2. Identify Test Steps and Tasks to Integrate
270(1)
Step 3. Customize Test Steps and Tasks
271(1)
Step 4. Select Integration Points
271(1)
Step 5. Modify the Development Methodology
272(1)
Step 6. Incorporate Defect Recording
272(1)
Step 7. Train in Use of the Test Methodology
272(3)
26 On-Site/Offshore Model
275(1)
Step 1: Analysis
275(1)
Step 2: Determine the Economic Tradeoffs
276(1)
Step 3: Determine the Selection Criteria
276(1)
Project Management and Monitoring
276(1)
Outsourcing Methodology
277(2)
On-Site Activities
278(1)
Offshore Activities
278(1)
Implementing the On-Site/Offshore Model
279(4)
Knowledge Transfer
279(1)
Detailed Design
280(1)
Milestone-Based Transfer
280(1)
Steady State
280(1)
Application Management
280(1)
Relationship Model
281(2)
Standards
282(1)
Benefits of On-Site/Offshore Methodology
283(3)
On-Site/Offshore Model Challenges
285(8)
Out of Sight
285(1)
Establish Transparency
285(1)
Security Considerations
285(1)
Project Monitoring
285(1)
Management Overhead
285(1)
Cultural Differences
285(1)
Software Licensing
285(1)
The Future of Onshore/Offshore
286(1)
SECTION V MODERN SOFTWARE TESTING TOOLS 287(1)
27 A Brief History of Software Testing
289(42)
Evolution of Automated Testing Tools
293(2)
Static Capture/Replay Tools (without Scripting Language)
294(1)
Static Capture/Replay Tools (with Scripting Language)
294(1)
Variable Capture/Replay Tools
295(3)
Functional Decomposition Approach
295(1)
Test Plan Driven ("Keyword") Approach
296(2)
Historical Software Testing and Development Parallels
298(3)
Extreme Programming
299(2)
28 Software Testing Trends
301(4)
Automated Capture/Replay Testing Tools
301(1)
Test Case Builder Tools
302(1)
Advanced Leading-Edge Automated Testing Tools
302(2)
Advanced Leading-Edge Test Case Builder Tools
304(1)
Necessary and Sufficient Conditions
304(1)
Test Data/Test Case Generation
305(6)
Sampling from Production
305(1)
Starting from Scratch
306(1)
Seeding the Data
306(1)
Generating Data Based upon the Database
307(1)
Generating Test Data/Test Cases Based upon the Requirements
308(3)
29 Taxonomy of Testing Tools
311(13)
Testing Tool Selection Checklist
311(1)
Vendor Tool Descriptions
312(1)
When You Should Consider Test Automation
312(8)
When You Should NOT Consider Test Automation
320(3)
30 Methodology to Evaluate Automated Testing Tools
323(1)
Step 1: Define Your Test Requirements
323(1)
Step 2: Set Tool Objectives
323(1)
Step 3a: Conduct Selection Activities for Informal Procurement
324(2)
Task 1: Develop the Acquisition Plan
324(1)
Task 2: Define Selection Criteria
324(1)
Task 3: Identify Candidate Tools
324(1)
Task 4: Conduct the Candidate Review
325(1)
Task 5: Score the Candidates
325(1)
Task 6: Select the Tool
325(1)
Step 3b: Conduct Selection Activities for Formal Procurement
326(1)
Task 1: Develop the Acquisition Plan
326(1)
Task 2: Create the Technical Requirements Document
326(1)
Task 3: Review Requirements
326(1)
Task 4: Generate the Request for Proposal
326(1)
Task 5: Solicit Proposals
326(1)
Task 6: Perform the Technical Evaluation
327(1)
Task 7: Select a Tool Source
327(1)
Step 4: Procure the Testing Tool
327(1)
Step 5: Create the Evaluation Plan
327(1)
Step 6: Create the Tool Manager's Plan
328(1)
Step 7: Create the Training Plan
328(1)
Step 8: Receive the Tool
328(1)
Step 9: Perform the Acceptance Test
329(1)
Step 10: Conduct Orientation
329(1)
Step 11: Implement Modifications
329(1)
Step 12: Train Tool Users
329(1)
Step 13: Use the Tool in the Operating Environment
330(1)
Step 14: Write the Evaluation Report
330(1)
Step 15: Determine Whether Goals Have Been Met
330(1)
APPENDICES 331(1)
A Spiral Testing Methodology
333(1)
B Software Quality Assurance Plan
343(1)
C Requirements Specification
345(1)
D Change Request Form
347(1)
E Test Templates
349(168)
E 1: Unit Test Plan
349(1)
E 2: System/Acceptance Test Plan
349(2)
E 3: Requirements Traceability Matrix
351(2)
E 4: Test Plan (Client/Server and Internet Spiral Testing)
353(2)
E 5: Function/Test Matrix
355(1)
E 6: GUI Component Test Matrix (Client/Server and Internet Spiral Testing)
355(1)
E 7: GUI-Based Functional Test Matrix (Client/Server and Internet Spiral Testing)
356(1)
E 8: Test Case
357(1)
E 9: Test Case Log
357(2)
E10: Test Log Summary Report
359(2)
E11: System Summary Report
361(1)
E12: Defect Report
362(2)
E13: Test Schedule
364(2)
E14: Retest Matrix
366(2)
E15: Spiral Testing Summary Report (Client/Server and Internet Spiral Testing)
368(1)
E16: Minutes of the Meeting
368(2)
E17: Test Approvals
370(1)
E18: Test Execution Plan
371(1)
E19: Test Project Milestones
372(1)
E20: PDCA Test Schedule
373(1)
E21: Test Strategy
374(3)
E22: Clarification Request
377(1)
E23: Screen Data Mapping
378(1)
E24: Test Condition versus Test Case
379(1)
E25: Project Status Report
380(1)
E26: Test Defect Details Report
381(2)
E27: Defect Report
383(1)
E28: Test Execution Tracking Manager
383(1)
E29: Final Test Summary Report
384(3)
F Checklists
387(1)
F 1: Requirements Phase Defect Checklist
388(1)
F 2: Logical Design Phase Defect Checklist
389(1)
F 3: Physical Design Phase Defect Checklist
390(3)
F 4: Program Unit Design Phase Defect Checklist
393(1)
F 5: Coding Phase Defect Checklist
394(2)
F 6: Field Testing Checklist
396(2)
F 7: Record Testing Checklist
398(2)
F 8: File Test Checklist
400(1)
F 9: Error Testing Checklist
401(2)
F10: Use Test Checklist
403(1)
F11: Search Test Checklist
404(1)
F12: Match/Merge Checklist
405(2)
F13: Stress Test Checklist
407(2)
F14: Attributes Testing Checklist
409(2)
F15: States Testing Checklist
411(1)
F16: Procedures Testing Checklist
412(1)
F17: Control Testing Checklist
413(5)
F18: Control Flow Testing Checklist
418(1)
F19: Testing Tool Selection Checklist
419(2)
F20: Project Information Gathering Checklist
421(2)
F21: Impact Analysis Checklist
423(2)
F22: Environment Readiness Checklist
425(2)
F23: Project Completion Checklist
427(2)
F24: Unit Testing Checklist
429(4)
F25: Ambiguity Review Checklist
433(2)
F26: Architecture Review Checklist
435(1)
F27: Data Design Review Checklist
436(1)
F28: Functional Specification Review Checklist
437(5)
F29: Prototype Review Checklist
442(1)
F30: Requirements Review Checklist
443(4)
F31: Technical Design Review Checklist
447(2)
F32: Test Case Preparation Review Checklist
449(2)
G Software Testing Techniques
451(1)
G 1: Basis Path Testing
451(1)
PROGRAM: FIELD-COUNT
451(1)
G 2: Black-Box Testing
452(1)
Extra Program Logic
453(1)
G 3: Bottom-Up Testing
453(1)
G 4: Boundary Value Testing
453(2)
Numeric Input Data
454(1)
Field Ranges
454(1)
Numeric Output Data
454(1)
Output Range of Values
454(1)
Nonnumeric Input Data
454(1)
Tables or Arrays
454(1)
Number of Items
454(1)
Nonnumeric Output Data
454(1)
Tables or Arrays
454(1)
Number of Outputs
454(1)
GUI
454(1)
G 5: Branch Coverage Testing
455(1)
PROGRAM: FIELD-COUNT
455(1)
G 6: Branch/Condition Coverage Testing
455(1)
PROGRAM: FIELD-COUNT
456(1)
G 7: Cause-Effect Graphing
456(4)
Cause-Effect Methodology
457(1)
Specification
458(2)
Causes
458(1)
Effects
458(2)
G 8: Condition Coverage
460(1)
PROGRAM: FIELD-COUNT
460(1)
G 9: CRUD Testing
461(1)
G10: Database Testing
461(31)
Integrity Testing
461(3)
Entity Integrity
462(1)
Primary Key Integrity
462(1)
Column Key Integrity
462(1)
Domain Integrity
463(1)
User-Defined Integrity
463(1)
Referential Integrity
463(1)
Data Modeling Essentials
464(2)
What Is a Model?
465(1)
Why Do We Create Models?
465(1)
Tables - A Definition
466(1)
Table Names
467(1)
Columns
467(1)
Rows
467(1)
Order
467(1)
Entities - A Definition
467(1)
Identification - Primary Key
468(2)
Compound Primary Keys
468(1)
Null Values
468(1)
Identifying Entities
469(1)
Entity Classes
469(1)
Relationships - A Definition
470(6)
Relationship Types
470(1)
One-to-One
470(2)
One-to-Many
472(1)
Many-to-Many
473(2)
Multiple Relationships
475(1)
Entities versus Relationships
475(1)
Attributes - A Definition
476(1)
Domain
477(1)
Domain Names
478(2)
Attributes versus Relationships
478(1)
Normalization - What Is It?
479(1)
Problems of Unnormalized Entities
479(1)
Steps in Normalization
480(8)
First Normal Form (1NF)
480(2)
Second Normal Form (2NF)
482(2)
Third Normal Form (3NF)
484(1)
Model Refinement
485(1)
Entity Subtypes
486(1)
A Definition
486(1)
Referential Integrity
486(2)
Dependency Constraints
488(4)
Constraint Rule
488(1)
Recursion
489(2)
Using the Model in Database Design
491(1)
Relational Design
491(1)
G11: Decision Tables
492(1)
PROGRAM: FIELD-COUNT
492(1)
G12: Desk Checking
493(1)
G13: Equivalence Partitioning
493(1)
Numeric Input Data
494(1)
Field Ranges
494(1)
Numeric Output Data
494(1)
Output Range of Values
494(1)
Nonnumeric Input Data
494(1)
Tables or Arrays
494(1)
Number of Items
494(1)
Nonnumeric Output Data
494(8)
Tables or Arrays
494(1)
Number of Outputs
494(1)
G14: Exception Testing
494(1)
G15: Free Form Testing
494(1)
G16: Gray-Box Testing
495(1)
G17: Histograms
496(1)
G18: Inspections
496(1)
G19: JADs
497(1)
G20: Orthogonal Array Testing
498(1)
G21: Pareto Analysis
499(2)
G22: Positive and Negative Testing
501(1)
G23: Prior Defect History Testing
502(1)
G24: Prototyping
502(5)
Cyclic Models
502(1)
Fourth-Generation Languages and Prototyping
503(1)
Iterative Development Accounting
504(1)
Evolutionary and Throwaway
504(1)
Application Prototyping
505(1)
Prototype Systems Development
505(1)
Data-Driven Prototyping
505(1)
Replacement of the Traditional Life Cycle
506(1)
Early-Stage Prototyping
506(1)
User Software Engineering
507(1)
G25: Random Testing
507(1)
G26: Range Testing
507(2)
G27: Regression Testing
509(1)
G28: Risk-Based Testing
509(1)
G29: Run Charts
510(1)
G30: Sandwich Testing
510(1)
G31: Statement Coverage Testing
511(1)
PROGRAM: FIELD-COUNT
511(1)
G32: State Transition Testing
511(1)
PROGRAM: FIELD-COUNT
512(1)
G33: Statistical Profile Testing
512(1)
G34: Structured Walkthroughs
512(2)
G35: Syntax Testing
514(1)
G36: Table Testing
514(1)
G37: Thread Testing
515(1)
G38: Top-Down Testing
515(1)
G39: White-Box Testing
516(1)
Bibliography 517(6)
Glossary 523(6)
Index 529
