Course Description
This interactive workshop explains common SQA misconceptions and the six functions SQA should perform to provide far greater value, analyzes why SQA groups have so frequently failed in IS, and presents practical approaches for applying SQA effectively throughout any life cycle to produce high-quality systems. Because some distinguish SQA (reviewing documents) from testing (executing code), key concepts and techniques are presented for reviewing requirements and designs. And because many still expect an SQA course to be about testing, half of this class covers testing content, though more briefly than our Effective Methods of Software Testing Workshop.
Objectives
* What SQA is and why SQA is NOT SQC (testing).
* Reasons for SQA failures and factors critical to success of SQA in IS development.
* The six Proactive Software Quality Assurance™ functions that SQA should perform.
* Proactive methods for more effectively reviewing requirements and designs.
* A structured Proactive Testing™ model of which testing activities should be performed, when, and by whom within the life cycle to maximize testing efficiency and effectiveness.
* Truly agile test planning techniques that prevent showstoppers.
* Designing tests that spot numerous ordinarily overlooked defects in less time.
* Applying risk analysis, reusable testware, and metrics to perform more thorough testing in less time.
* Measuring system quality and SQA/Testing effectiveness.
Agenda
SYSTEM/SOFTWARE QUALITY AND QUALITY ASSURANCE
* Exercise: What is quality, quality assurance
* Quality in the project manager’s triangle
* Quality is free, cost of poor quality
* What we, others mean by quality
* Need for positive common quality definition
* Quality factors and quality dimensions
* Engineered Deliverable Quality
* Quality assurance vs. quality control
* SQA in IEEE Stds. 12207 and 730
* Proactive SQA changes in IEEE Std. 730
* Not just ‘traffic cop’ compliance
SYSTEM/SOFTWARE PROCESSES
* REAL vs. presumed processes, silos
* Exercise: Your software process
* Defect injection, detection, ejection metrics (see the sketch after this list)
* Economics of quality problems in life cycle
* Making the business case for SQA
* Life cycle concepts, waterfall vs. iterative
* Process capability, variation, improvement
* Project, process, product measures
* Direct and indirect process evaluation
* SEI Process Capability Maturity Models
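To make the defect injection/detection metrics topic concrete, here is a minimal Python sketch of one common measure of this kind, defect removal efficiency, plus a count of escapes by injection phase. The phase names and defect records are hypothetical, and the exact metrics taught in the course may differ.

# Hypothetical illustration of defect injection/detection metrics.
# Phase names and defect records are invented for the example.
from collections import defaultdict

# Each record: (phase where the defect was injected, phase where it was detected)
defects = [
    ("requirements", "requirements"),
    ("requirements", "system test"),
    ("design", "design review"),
    ("design", "system test"),
    ("code", "unit test"),
    ("code", "system test"),
    ("code", "production"),
]

PHASE_ORDER = ["requirements", "design", "design review", "code",
               "unit test", "system test", "production"]

def removal_efficiency(defects, before_phase="production"):
    """Share of all known defects detected before the given phase (e.g., release)."""
    cutoff = PHASE_ORDER.index(before_phase)
    found_before = sum(1 for _, detected in defects
                       if PHASE_ORDER.index(detected) < cutoff)
    return found_before / len(defects)

def escapes_by_injection_phase(defects):
    """Count, per injection phase, how many defects escaped to production."""
    escapes = defaultdict(int)
    for injected, detected in defects:
        if detected == "production":
            escapes[injected] += 1
    return dict(escapes)

if __name__ == "__main__":
    print(f"Defect removal efficiency: {removal_efficiency(defects):.0%}")
    print("Escapes to production by injection phase:",
          escapes_by_injection_phase(defects))

Measures like these support the section’s economic argument: the later a defect is detected relative to where it was injected, the more it costs to eject.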
QUALITY ASSURANCE CONCEPTS
* Exercise: Why SQA groups so often fail
* SQA groups’ changes over time
* Common SQA interpretations, issues
* Quality control (QC) testing ‘QA Test’
* Document and procedure compliance
* ‘QA Reviews’ and toll gates
* Standards and procedures manuals
* Staffing and organizational influence
* Reasons for resistance to SQA
* SQA needs broader view of quality
* Proactive SQA™ for effectiveness
* Assuring processes vs. doing it all
* 6 functions of effective software QA
* QA Plans, quality reviews of deliverables
* Exercise: Managing SQA tasks, resources
* Engineering standards, conventions
* Quality controls at all key points
* Project control
* Configuration management, checkpoints
* Recordkeeping and auditing
* Metrics and analysis for improvement
* Exercise: Key product and process metrics
* Promoting awareness and recognition
ACTIVE STATIC TESTING
* Role of requirements in producing quality
* Exercise: ‘Established Requirements’ issues
* Exercise: Reviewing Requirements
* Unrecognized weaknesses of the “Regular Way”
* Why review of requirements fails
* Formal technical reviews, procedures
* Review approaches, formality
* Often overlooked walkthrough limitations
* Why reviews so economically find defects
* Foundation technique, topic guidelines
* Evaluating requirements form, testability
* REAL, business vs. system requirements
* Finding overlooked, incorrect requirements
* Reviewing design suitability and content
* Four powerful design review CAT-Scans
* Exercise: Reviews and Software Process
HOW TESTING CAN CUT EFFORT & TIME
* Testing for correctness vs. testing for errors
* Developer views of testing
* Reactive testing: out of time, but not tests
* Proactive Testing Life Cycle model
* CAT-Scan Approach to find more errors
* Dynamic, passive, and active static testing
* V-model and objectives of each test level
* Developer vs. independent test group testing
* Strategy: create fewer errors, catch more
* Four keys to effective testing
* Need for testing sampling
* Written vs. not written benefits and issues
* Test activities that save the developer’s time
* The “we don’t have time” fallacy
TEST PLANNING: VALUE, NOT BUSYWORK
* Risk elements, relation to testing
* Proactive vs. reactive risk analysis
* IEEE Standard for Test Documentation
* Benefits of the structure
* Enabling manageability, reuse, selectivity
* Test plans vs. test designs, cases, procedures
* Exercise: Anticipating showstoppers
* Risk-based way to define test units
* Letting testing drive development
* Preventing major cause of overruns
* Master Test Plan counterpart to project plan
* Approach, use of automated tools
* Entry/exit criteria, anticipating change

DETAILED TEST PLANNING
* IEEE Standard on Unit Testing
* Functional (Black Box) testing strategy
* 3-level top-down test planning and design
* Exercise: Functionality matrix (see the sketch after this list)
* Detailed Test Plan technical document
* White box structural testing coverage
* Use cases, revealing overlooked conditions
* Exercise: Defining use case test coverage
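As a rough illustration of the functionality matrix exercise, the following Python sketch cross-references functions with the test cases that exercise them so coverage gaps stand out. The feature and test-case names are hypothetical, and the course’s own matrix format may differ.

# Hypothetical functionality (coverage) matrix: features vs. test cases.
# Feature and test-case names are invented for illustration.

features = ["log in", "search catalog", "add to cart",
            "check out", "view order history"]

test_cases = {
    "TC-01 valid login":      ["log in"],
    "TC-02 invalid password": ["log in"],
    "TC-03 keyword search":   ["search catalog"],
    "TC-04 purchase flow":    ["log in", "add to cart", "check out"],
}

def coverage_matrix(features, test_cases):
    """Return {feature: [test case ids that exercise it]}."""
    matrix = {feature: [] for feature in features}
    for tc_id, covered in test_cases.items():
        for feature in covered:
            matrix[feature].append(tc_id)
    return matrix

if __name__ == "__main__":
    for feature, tcs in coverage_matrix(features, test_cases).items():
        status = ", ".join(tcs) if tcs else "NOT COVERED"
        print(f"{feature:20s} -> {status}")

Here the uncovered "view order history" row is exactly the kind of overlooked condition the matrix is meant to reveal before detailed test design begins.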
INTEGRATION/SYSTEM TEST PLANNING
* Graphical technique to simplify integrations
* Integration test plans prevent schedule slips
* Smoke tests; system and special testing (smoke test sketch after this list)
* Daily, top-down and bottom-up builds strategy
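The smoke test topic can be pictured with a minimal Python sketch of the kind of check run against each daily build; the base URL and endpoint paths are hypothetical, and real smoke suites would be driven by the integration test plan.

# Hypothetical smoke test run against each daily build.
# The service URL and endpoints are invented for illustration.
import urllib.request

BASE_URL = "http://localhost:8080"  # hypothetical build under test
SMOKE_CHECKS = ["/health", "/login", "/catalog"]

def smoke_test():
    """Fail fast if any core endpoint of the new build is not answering."""
    for path in SMOKE_CHECKS:
        with urllib.request.urlopen(BASE_URL + path, timeout=5) as resp:
            assert resp.status == 200, f"{path} returned {resp.status}"

if __name__ == "__main__":
    smoke_test()
    print("Smoke test passed: build is stable enough for further testing")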
DESIGNING AND WRITING TEST CASES
* Exercise: Your challenges and issues
* Exercise: Disciplined brainstorming
* Checklists find more overlooked conditions
* Data formats, data and process models
* Business rules, decision tables and trees
* Equivalence classes and boundary values (see the sketch after this list)
* Formal, informal Test Design Specifications
* Leveraging reusable test designs
* Test Case Specifications vs. test data values
* Writing test cases, script/matrix
* Embedding keystroke-level procedural detail
* Exploratory testing applied most effectively
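To illustrate equivalence classes and boundary values, here is a minimal Python sketch for a single numeric input. The business rule and the function under test are invented for the example; the course’s own worked cases will differ.

# Hypothetical example: an order quantity that must be 1..100 inclusive.
# The rule and the function under test are invented for illustration.

def accept_quantity(qty: int) -> bool:
    """Function under test: accepts order quantities from 1 to 100."""
    return 1 <= qty <= 100

# Equivalence classes: below range (invalid), in range (valid), above range (invalid).
# Boundary values sit at and adjacent to each class edge.
boundary_cases = [
    (0,   False),   # just below lower bound
    (1,   True),    # lower bound
    (2,   True),    # just above lower bound
    (50,  True),    # representative mid-range value
    (99,  True),    # just below upper bound
    (100, True),    # upper bound
    (101, False),   # just above upper bound
]

def test_quantity_boundaries():
    for qty, expected in boundary_cases:
        assert accept_quantity(qty) == expected, f"qty={qty}"

if __name__ == "__main__":
    test_quantity_boundaries()
    print(f"{len(boundary_cases)} boundary-value cases passed")

Seven targeted cases cover all three equivalence classes and both boundaries, which is how these techniques find more defects with fewer tests.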
MEASURING AND MANAGING TESTING
* Estimating
* Defect isolation
* Defect reporting, categories and analysis
* Defect reports that prompt suitable action
* Exercise: Measures for managing testing
* Common measures of test status, issues (see the sketch after this list)
* Exercise: Test status report audiences
* Projecting when software is good enough
* Exercise: Measuring testing effectiveness
* Exercise: Post-Implementation Review
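For the test status measures above, a minimal Python sketch of a status summary is shown below; the counts and severity categories are hypothetical, and the measures the course recommends may differ.

# Hypothetical test status summary. Numbers are invented for illustration.
planned, executed, passed = 120, 90, 75
open_defects = {"critical": 2, "major": 7, "minor": 15}

execution_pct = executed / planned
pass_pct = passed / executed

print(f"Executed {executed}/{planned} planned tests ({execution_pct:.0%})")
print(f"Passed   {passed}/{executed} executed tests ({pass_pct:.0%})")
print("Open defects:",
      ", ".join(f"{sev}: {n}" for sev, n in open_defects.items()))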
Comments
Office Policy: In fairness to all participants, anyone arriving more than 30 minutes late will be rescheduled for another class date.
Cancellation Policy:
No Shows: If you are registered for a class, do not attend, and fail to contact our office to cancel or reschedule, a fee equivalent to your daily rate will be applied.
Rescheduling: Productivity Point reserves the right to cancel or reschedule any training course. Should we reschedule a course, a full credit will be applied to the rescheduled course. Productivity Point cannot assume responsibility for any other costs to the student (e.g., non-refundable airline tickets). Class credits are redeemable for up to one year.
Cancellations: There is no charge for cancellations made ten (10) or more business days prior to the scheduled training date. Cancellations made nine (9) business days or fewer before the scheduled training date are considered “late cancellations,” and the full price of the class will be charged. All training cancelled with ten (10) or more business days’ notice will receive a credit on account for the full amount of the purchase. This credit can be applied to any Productivity Point products or services for up to one year from the date of the original transaction. There are no refunds.