SECTION I SOFTWARE QUALITY IN PERSPECTIVE    1

1 Quality Assurance Framework    5
    Prevention versus Detection    6
    Verification versus Validation    7
    Software Quality Assurance    8
    Components of Quality Assurance    9
    Software Configuration Management    12
        Elements of Software Configuration Management    12
    Software Quality Assurance Plan    16
    Steps to Develop and Implement a Software Quality Assurance Plan    16
        Step 1. Document the Plan    16
        Step 2. Obtain Management Acceptance    18
        Step 3. Obtain Development Acceptance    18
        Step 4. Plan for Implementation of the SQA Plan    19
        Step 5. Execute the SQA Plan    19
    Capability Maturity Model (CMM)    20
    Malcolm Baldrige National Quality Award    24
|
2 Overview of Testing Techniques    29
    Black-Box Testing (Functional)    29
    White-Box Testing (Structural)    30
    Gray-Box Testing (Functional and Structural)    30
    Manual versus Automated Testing    31
    Static versus Dynamic Testing    31
    Taxonomy of Software Testing Techniques    32
|
3 Quality through Continuous Improvement Process    41
    Contribution of W. Edwards Deming    41
    Role of Statistical Methods    42
    Deming's 14 Quality Principles    43
        Point 1: Create Constancy of Purpose    43
        Point 2: Adopt the New Philosophy    44
        Point 3: Cease Dependence on Mass Inspection    44
        Point 4: End the Practice of Awarding Business on Price Tag Alone    44
        Point 5: Improve Constantly and Forever the System of Production and Service    45
        Point 6: Institute Training and Retraining    45
        Point 7: Institute Leadership    45
        Point 8: Drive Out Fear    46
        Point 9: Break Down Barriers between Staff Areas    46
        Point 10: Eliminate Slogans, Exhortations, and Targets for the Workforce    47
        Point 11: Eliminate Numerical Goals    47
        Point 12: Remove Barriers to Pride of Workmanship    47
        Point 13: Institute a Vigorous Program of Education and Retraining    48
        Point 14: Take Action to Accomplish the Transformation    48
    Continuous Improvement through the Plan, Do, Check, Act Process    48
    Going around the PDCA Circle    50

SECTION II LIFE CYCLE TESTING REVIEW    51

    Waterfall Development Methodology    53
    Continuous Improvement "Phased" Approach    54
    Psychology of Life Cycle Testing    54
    Software Testing as a Continuous Improvement Process    55
    The Testing Bible: Software Test Plan    58
    Major Steps to Develop a Test Plan    60
        1. Define the Test Objectives    60
        2. Develop the Test Approach    60
        3. Define the Test Environment    60
        4. Develop the Test Specifications    61
        6. Review and Approve the Test Plan    61
    Components of a Test Plan    61
    Technical Reviews as a Continuous Improvement Process    61
    Motivation for Technical Reviews    65
    Steps for an Effective Review    70
        1. Plan for the Review Process    70
        3. Develop the Review Agenda    71
        4. Create a Review Report    71

5 Verifying the Requirements Phase    73
    Testing the Requirements with Technical Reviews    74
    Inspections and Walkthroughs    74
    Requirements Traceability Matrix    76
    Building the System/Acceptance Test Plan    76

6 Verifying the Logical Design Phase    79
    Data Model, Process Model, and the Linkage    79
    Testing the Logical Design with Technical Reviews    80
    Refining the System/Acceptance Test Plan    81

7 Verifying the Physical Design Phase    83
    Testing the Physical Design with Technical Reviews    83
    Creating Integration Test Cases    85
    Methodology for Integration Testing    85
        Step 1: Identify Unit Interfaces    85
        Step 2: Reconcile Interfaces for Completeness    85
        Step 3: Create Integration Test Conditions    86
        Step 4: Evaluate the Completeness of Integration Test Conditions    86

8 Verifying the Program Unit Design Phase    87
    Testing the Program Unit Design with Technical Reviews    87

9 Verifying the Coding Phase    91
    Testing Coding with Technical Reviews    91

SECTION III SOFTWARE TESTING METHODOLOGY    97

10 Development Methodology Overview    99
    Limitations of Life Cycle Development    99
    The Client/Server Challenge    100
    Psychology of Client/Server Spiral Testing    101
        The New School of Thought    101
        Tester/Developer Perceptions    102
        Project Goal: Integrate QA and Development    103
    Iterative/Spiral Development Methodology    104
    Methodology for Developing Prototypes    108
        2. Demonstrate Prototypes to Management    110
        3. Demonstrate Prototype to Users    110
        4. Revise and Finalize Specifications    111
        5. Develop the Production System    111
    Continuous Improvement "Spiral" Testing Approach    112
|
11 Information Gathering (Plan)    117
    Step 1: Prepare for the Interview    117
        Task 1: Identify the Participants    117
        Task 2: Define the Agenda    118
    Step 2: Conduct the Interview    118
        Task 1: Understand the Project    118
        Task 2: Understand the Project Objectives    121
        Task 3: Understand the Project Status    121
        Task 4: Understand the Project Plans    122
        Task 5: Understand the Project Development Methodology    122
        Task 6: Identify the High-Level Business Requirements    123
        Task 7: Perform Risk Analysis    124
            Method 1: Judgment and Instinct    125
            Method 2: Dollar Estimation    125
            Method 3: Identifying and Weighting Risk Attributes    125
    Step 3: Summarize the Findings    126
        Task 1: Summarize the Interview    126
        Task 2: Confirm the Interview Findings    127
|
|
12 Test Planning (Plan)    129
    Step 1: Build a Test Plan    130
        Task 1: Prepare an Introduction    130
        Task 2: Define the High-Level Functional Requirements (in Scope)    131
        Task 3: Identify Manual/Automated Test Types    132
        Task 4: Identify the Test Exit Criteria    133
        Task 5: Establish Regression Test Strategy    134
        Task 6: Define the Test Deliverables    136
        Task 7: Organize the Test Team    137
        Task 8: Establish a Test Environment    138
        Task 9: Define the Dependencies    139
        Task 10: Create a Test Schedule    139
        Task 11: Select the Test Tools    142
        Task 12: Establish Defect Recording/Tracking Procedures    143
        Task 13: Establish Change Request Procedures    145
        Task 14: Establish Version Control Procedures    147
        Task 15: Define Configuration Build Procedures    147
        Task 16: Define Project Issue Resolution Procedures    148
        Task 17: Establish Reporting Procedures    148
        Task 18: Define Approval Procedures    149
    Step 2: Define the Metric Objectives    149
        Task 1: Define the Metrics    150
        Task 2: Define the Metric Points    151
    Step 3: Review/Approve the Plan    154
        Task 1: Schedule/Conduct the Review    154
|
|
13 Test Case Design (Plan)    157
    Step 1: Design Function Tests    157
        Task 1: Refine the Functional Test Requirements    157
        Task 2: Build a Function/Test Matrix    159
    Step 2: Design GUI Tests    163
        Ten Guidelines for Good GUI Design    164
        Task 1: Identify the Application GUI Components    165
        Task 2: Define the GUI Tests    165
    Step 3: Define the System/Acceptance Tests    167
        Task 1: Identify Potential System Tests    167
        Task 2: Design System Fragment Tests    168
        Task 3: Identify Potential Acceptance Tests    169
    Step 4: Review/Approve Design    169
        Task 1: Schedule/Prepare for Review    169
|
|
14 Test Development (Plan)    173
    Step 1: Develop Test Scripts    173
        Task 1: Script the Manual/Automated GUI/Function Tests    173
        Task 2: Script the Manual/Automated System Fragment Tests    173
    Step 2: Review/Approve Test Development    174
        Task 1: Schedule/Prepare for Review    174
|
15 Test Coverage through Traceability    177
    Use Cases and Traceability    178
|
16 Test Execution/Evaluation (Do/Check)    181
    Step 1: Setup and Testing    181
        Task 1: Regression Test the Manual/Automated Spiral Fixes    181
        Task 2: Execute the Manual/Automated New Spiral Tests    182
        Task 3: Document the Spiral Test Defects    183
    Step 2: Evaluation    183
        Task 1: Analyze the Metrics    183
    Step 3: Publish Interim Report    184
        Task 1: Refine the Test Schedule    184
        Task 2: Identify Requirement Changes    185
|
17 Prepare for the Next Spiral (Act)    187
    Step 1: Refine the Tests    187
        Task 1: Update the Function/GUI Tests    187
        Task 2: Update the System Fragment Tests    188
        Task 3: Update the Acceptance Tests    189
    Step 2: Reassess the Team, Procedures, and Test Environment    189
        Task 1: Evaluate the Test Team    189
        Task 2: Review the Test Control Procedures    189
        Task 3: Update the Test Environment    190
    Step 3: Publish Interim Test Report    191
        Task 1: Publish the Metric Graphics    191
            Test Case Execution Status    191
|
18 Conduct the System Test    195
    Step 1: Complete System Test Plan    195
        Task 1: Finalize the System Test Types    195
        Task 2: Finalize System Test Schedule    197
        Task 3: Organize the System Test Team    197
        Task 4: Establish the System Test Environment    197
        Task 5: Install the System Test Tools    200
    Step 2: Complete System Test Cases    200
        Task 1: Design/Script the Performance Tests    200
        Task 2: Design/Script the Security Tests    203
            A Security Design Strategy    203
        Task 3: Design/Script the Volume Tests    204
        Task 4: Design/Script the Stress Tests    205
        Task 5: Design/Script the Compatibility Tests    206
        Task 6: Design/Script the Conversion Tests    206
        Task 7: Design/Script the Usability Tests    207
        Task 8: Design/Script the Documentation Tests    208
        Task 9: Design/Script the Backup Tests    208
        Task 10: Design/Script the Recovery Tests    209
        Task 11: Design/Script the Installation Tests    209
        Task 12: Design/Script Other System Test Types    210
    Step 3: Review/Approve System Tests    211
        Task 1: Schedule/Conduct the Review    211
    Step 4: Execute the System Tests    212
        Task 1: Regression Test the System Fixes    212
        Task 2: Execute the New System Tests    213
        Task 3: Document the System Defects    213
|
19 Conduct Acceptance Testing    215
    Step 1: Complete Acceptance Test Planning    215
        Task 1: Finalize the Acceptance Test Types    215
        Task 2: Finalize the Acceptance Test Schedule    215
        Task 3: Organize the Acceptance Test Team    215
        Task 4: Establish the Acceptance Test Environment    217
        Task 5: Install Acceptance Test Tools    218
    Step 2: Complete Acceptance Test Cases    218
        Task 1: Subset the System-Level Test Cases    218
        Task 2: Design/Script Additional Acceptance Tests    219
    Step 3: Review/Approve Acceptance Test Plan    219
        Task 1: Schedule/Conduct the Review    219
    Step 4: Execute the Acceptance Tests    220
        Task 1: Regression Test the Acceptance Fixes    220
        Task 2: Execute the New Acceptance Tests    220
        Task 3: Document the Acceptance Defects    221
|
20 Summarize/Report Spiral Test Results    223
    Step 1: Perform Data Reduction    223
        Task 1: Ensure All Tests Were Executed/Resolved    223
        Task 2: Consolidate Test Defects by Test Number    223
        Task 3: Post Remaining Defects to a Matrix    223
    Step 2: Prepare Final Test Report    224
        Task 1: Prepare the Project Overview    225
        Task 2: Summarize the Test Activities    225
        Task 3: Analyze/Create Metric Graphics    225
            System Testing Defect Types    230
            Acceptance Testing Defect Types    232
        Task 4: Develop Findings/Recommendations    232
    Step 3: Review/Approve the Final Test Report    233
        Task 1: Schedule/Conduct the Review    233
        Task 3: Publish the Final Test Report    236

SECTION IV TEST PROJECT MANAGEMENT    237

21 Overview of General Project Management    239
    Define the Scope of the Project    240
    Identify the Key Activities    240
|
22 Test Project Management    245
    Understand the Requirements    246
    Identify and Improve Processes    247
    Essential Characteristics of a Test Project Manager    248
        Lateral Thinking in Developing Test Cases    248
        Avoid Duplication and Repetition    249
        Validate the Test Environment    249
        Do Not Hesitate to Accept Help from Others    250
        Convey Issues as They Arise    250
        Always Keep Updating Your Business Knowledge    250
        Learn the New Testing Technologies and Tools    250

23 Test Estimation    253
    Critical Activities for Test Estimation    255
    Factors Affecting Test Estimation    257
    Test Execution and Controlling Effort    259
    Effort Estimation - Model Project    259
|
24 Defect Monitoring and Management Process    263
|
25 Integrating Testing into Development Methodology    269
    Step 1. Organize the Test Team    270
    Step 2. Identify Test Steps and Tasks to Integrate    270
    Step 3. Customize Test Steps and Tasks    271
    Step 4. Select Integration Points    271
    Step 5. Modify the Development Methodology    272
    Step 6. Incorporate Defect Recording    272
    Step 7. Train in Use of the Test Methodology    272
|
26 On-Site/Offshore Model    275
    Step 2: Determine the Economic Tradeoffs    276
    Step 3: Determine the Selection Criteria    276
    Project Management and Monitoring    276
    Implementing the On-Site/Offshore Model    279
    Benefits of On-Site/Offshore Methodology    283
    On-Site/Offshore Model Challenges    285
    The Future of Onshore/Offshore    286

SECTION V MODERN SOFTWARE TESTING TOOLS    287

27 A Brief History of Software Testing    289
    Evolution of Automated Testing Tools    293
        Static Capture/Replay Tools (without Scripting Language)    294
        Static Capture/Replay Tools (with Scripting Language)    294
        Variable Capture/Replay Tools    295
            Functional Decomposition Approach    295
            Test Plan Driven ("Keyword") Approach    296
    Historical Software Testing and Development Parallels    298
|
28 Software Testing Trends    301
    Automated Capture/Replay Testing Tools    301
    Advanced Leading-Edge Automated Testing Tools    302
    Advanced Leading-Edge Test Case Builder Tools    304
    Necessary and Sufficient Conditions    304
    Test Data/Test Case Generation    305
        Generating Data Based upon the Database    307
        Generating Test Data/Test Cases Based upon the Requirements    308
|
29 Taxonomy of Testing Tools    311
    Testing Tool Selection Checklist    311
    When You Should Consider Test Automation    312
    When You Should NOT Consider Test Automation    320
|
30 Methodology to Evaluate Automated Testing Tools    323
    Step 1: Define Your Test Requirements    323
    Step 2: Set Tool Objectives    323
    Step 3a: Conduct Selection Activities for Informal Procurement    324
        Task 1: Develop the Acquisition Plan    324
        Task 2: Define Selection Criteria    324
        Task 3: Identify Candidate Tools    324
        Task 4: Conduct the Candidate Review    325
        Task 5: Score the Candidates    325
    Step 3b: Conduct Selection Activities for Formal Procurement    326
        Task 1: Develop the Acquisition Plan    326
        Task 2: Create the Technical Requirements Document    326
        Task 3: Review Requirements    326
        Task 4: Generate the Request for Proposal    326
        Task 5: Solicit Proposals    326
        Task 6: Perform the Technical Evaluation    327
        Task 7: Select a Tool Source    327
    Step 4: Procure the Testing Tool    327
    Step 5: Create the Evaluation Plan    327
    Step 6: Create the Tool Manager's Plan    328
    Step 7: Create the Training Plan    328
    Step 8: Receive the Tool    328
    Step 9: Perform the Acceptance Test    329
    Step 10: Conduct Orientation    329
    Step 11: Implement Modifications    329
    Step 12: Train Tool Users    329
    Step 13: Use the Tool in the Operating Environment    330
    Step 14: Write the Evaluation Report    330
    Step 15: Determine Whether Goals Have Been Met    330

APPENDICES    331

A Spiral Testing Methodology    333
B Software Quality Assurance Plan    343
C Requirements Specification    345
|
E 2: System/Acceptance Test Plan    349
E 3: Requirements Traceability Matrix    351
E 4: Test Plan (Client/Server and Internet Spiral Testing)    353
E 5: Function/Test Matrix    355
E 6: GUI Component Test Matrix (Client/Server and Internet Spiral Testing)    355
E 7: GUI-Based Functional Test Matrix (Client/Server and Internet Spiral Testing)    356
E10: Test Log Summary Report    359
E11: System Summary Report    361
E15: Spiral Testing Summary Report (Client/Server and Internet Spiral Testing)    368
E16: Minutes of the Meeting    368
E19: Test Project Milestones    372
E22: Clarification Request    377
E24: Test Condition versus Test Case    379
E25: Project Status Report    380
E26: Test Defect Details Report    381
E28: Test Execution Tracking Manager    383
E29: Final Test Summary Report    384
|
F 1: Requirements Phase Defect Checklist    388
F 2: Logical Design Phase Defect Checklist    389
F 3: Physical Design Phase Defect Checklist    390
F 4: Program Unit Design Phase Defect Checklist    393
F 5: Coding Phase Defect Checklist    394
F 6: Field Testing Checklist    396
F 7: Record Testing Checklist    398
F 9: Error Testing Checklist    401
F11: Search Test Checklist    404
F12: Match/Merge Checklist    405
F13: Stress Test Checklist    407
F14: Attributes Testing Checklist    409
F15: States Testing Checklist    411
F16: Procedures Testing Checklist    412
F17: Control Testing Checklist    413
F18: Control Flow Testing Checklist    418
F19: Testing Tool Selection Checklist    419
F20: Project Information Gathering Checklist    421
F21: Impact Analysis Checklist    423
F22: Environment Readiness Checklist    425
F23: Project Completion Checklist    427
F24: Unit Testing Checklist    429
F25: Ambiguity Review Checklist    433
F26: Architecture Review Checklist    435
F27: Data Design Review Checklist    436
F28: Functional Specification Review Checklist    437
F29: Prototype Review Checklist    442
F30: Requirements Review Checklist    443
F31: Technical Design Review Checklist    447
F32: Test Case Preparation Review Checklist    449
|
G Software Testing Techniques    451
    G 4: Boundary Value Testing    453
    G 5: Branch Coverage Testing    455
    G 6: Branch/Condition Coverage Testing    455
    G 7: Cause-Effect Graphing    456

        Identification - Primary Key    468
        Relationships - A Definition    470
        Entities versus Relationships    475
        Attributes - A Definition    476
        Attributes versus Relationships    478
        Normalization - What Is It?    479
        Problems of Unnormalized Entities    479
        Using the Model in Database Design    491

    G13: Equivalence Partitioning    493
    G20: Orthogonal Array Testing    498
    G22: Positive and Negative Testing    501
    G23: Prior Defect History Testing    502
    G24: Prototyping    502
        Fourth-Generation Languages and Prototyping    503
        Iterative Development Accounting    504
        Evolutionary and Throwaway    504
        Prototype Systems Development    505
        Replacement of the Traditional Life Cycle    506
        User Software Engineering    507
    G31: Statement Coverage Testing    511
    G32: State Transition Testing    511
    G33: Statistical Profile Testing    512
    G34: Structured Walkthroughs    512

Bibliography    517
Glossary    523
Index    529