Index
A
acceptance criteria, post-implementation analysis, 578
acceptance testing
    administrative procedures, 498
    concerns, 493–494
    data components, 492
    execution, 499
    functionality requirements, 497
    input, 495–496
    interface quality requirements, 497
    objectives, 492–493
    overview, 491
    people components, 492
    performance requirements, 497
    plan contents, 498–499
    programming testing, 324
    project description contents, 498
    RAD testing, 637
    rules, 492
    software quality requirements, 497
    structure components, 492
    tester roles, 44
    use cases, 500–503
    user responsibilities, 498
    validation, 221
    V-concept testing, 158–159
    work papers
        acceptance criteria, 527
        acceptance plan form creation, 544–545
        automated application criteria, 566–567
        change control forms, 546
        checklist, 550–551
        data change forms, 547–548
        deletion instructions, 538–539
        installation phase, 531
        inventory material, 552–553
        production change instructions, 536–537
        program change form completion, 540–541, 549
        program change history, 534–535
        quality control checklist, 560–563
        recovery planning data, 532–533
        system boundary diagram, 528
        system problem form completion, 542–543
        test cases, 530
        training checklist form completion, 558–559
        training failure notification, 568–569
        training module form completion, 556–557
33_598376 bindex.qxp 3/3/06 11:22 PM Page 929
acceptance testing (continued)
    work papers (continued)
        training plan form completion, 554–555
        training quality control checklist, 564–565
        use cases, 504–507, 529
    workbench concept, 494
access control
    best practices, 850
    configuration management, 602
    data warehouse testing, 770–771, 777
    risk assessment, 216
    test factors, 41–42
    unauthorized, 766
    Web-based systems testing, 803
access denied
    security testing, 223
    verification testing, 301, 319
accidental versus intentional losses, 738–739
accuracy, best practices, 849
acquisition, system processing, 76
activities, Project Status reports, 469
ad hoc process level, client/server systems testing, 616–617
adequacy evaluation, software security testing, 761
administration
    administrative components
        acceptance testing, 498
        CM (configuration management), 602–603
        test plan development, 250–251
    barriers, 865–866
    development activities, 753
agile testing process
    DRE (Defect Removal Efficiency), 822
    flexibility, 820–821
    importance of, 819–820
    mature processes, 821
    MTBF (mean time between failures), 822–823
    MTTF (mean time to failure), 822–823
    MTTR (mean time to repair), 822–823
    objective-driven processes, 821
    quality, 821
    team representatives, 820–821
    tester agility, 842–843
    time-compression efforts
        best practices, 825
        calendar-day efficient, 823–824
        challenges, 824
        readiness criteria, 826
        solutions to, 825
        V-concept testing, 826–827
        workbench concept, 834
    variability and, 820
    work papers, 923–927
algebraic specification, 237
alliance, business innovation, 878
allocation of resources, management support needs, 51
annual report preparation, 120
applications
    internal controls testing, 666–669
    risks, verification testing, 304–308
    system activities, 752
assessment objectives, post-implementation analysis, 574–575
assessment questionnaires, baseline development, 10
assessment teams, baseline development, 10
asset value, post-implementation analysis, 579
assistant tool managers, 119
audience criteria, documentation, 179
audit trails
    corrective controls, 665
    inadequate, 766
    programming testing, 327
    risk assessment, 216
    test factors, 40, 42
    validation testing, 435
    verification testing, 319
auditors, roles and responsibilities, 6
audits, configuration management, 602
authorization
    cycle control objectives, 675
    internal controls testing, 658
    programming testing, 326
    risk assessment, 215
    test factors, 40, 42
    transactions, 755
    validation testing, 434
    verification testing, 301, 318
    Web-based systems testing, 803
authors, inspection process responsibilities, 258
automated applications
    operational fit tests, 697
    operational testing, 522–523
    tools, 104, 221
availability, baseline information, 741
Average Age of Uncorrected Defects by Type report, 475
average prioritized objectives, 245
awareness concerns, baseline information, 743
axiomatic specification, 238

B
back-end processing, 611, 800
background and reference data, test plan development, 251
backup data
    data warehouse testing, 773–774, 778
    recovery testing, 223
balance checks, preventive controls, 662
bandwidth access, Web-based systems testing, 805
barrier identification
    communication barriers
        conflict-resolution methods, 880
        effective communication process, 882–884
        how to address, 882
        lines of communication, 881
        management support solutions, 883
        objectives, understanding consensus of, 880
        open channel concept, 884
        quality control, 884
        realistic mechanism solutions, 884
        respect for others, 880
    cultural barriers
        business innovation management, 878
        competencies, 874–876
        discussed, 869
        how to address, 879–880
        manage by fact concept, 876–877
        manage by process concept, 873–874
        management cultures, 870
        people management, 871–872
    improvement barriers
        administrative/organizational barriers, 865–866
        quality control, 869
        red flag/hot button barriers, 864
        root cause, 866–869
        staff-competency barriers, 865
        stakeholder perspective, 861–864
    work papers
        communication barriers, 920–922
        cultural barriers, 919
        stakeholder analysis, 915–918
base case, verification testing, 293
baselines
    accurate and precise information, 741
    analysis, 750
    availability, 741
    awareness concerns, 743
    awareness training, 747
    categories, 739
    collection methods training, 747
    configuration management, 604
    corporate language, adjustments for, 741
    data collection methods, 744–747
    development
        assessment questionnaires, 10
        assessment teams, 10
baselines (continued)
    development (continued)
        capabilities assessment, 13–14
        cause-effect diagram, 9
        drivers, 8
        environment assessment, 8
        footprint charts, 10
        implementation procedures, 9
        results assessments, 11–12
        tester competency assessment, 14–16
        verification, 13
    forms completion training, 747–748
    methods, 743
    objectives, 751
    one-time data collection procedures, 741
    reasons for, 740
    resources protected concerns, 743
    security sources, 740
    status, reporting, 749
    support concerns, 744
    team member selection, 742–743
    training concerns, 743
    what to collect, 740
batch jobs, system processing, 76
batch tests, 248
best practices
    access control, 850
    accuracy, 849
    audits, 850
    capability chart, 847
    communication, 850
    completeness, 849
    consistency, 849
    data commonality, 850
    effectiveness and efficiency measures, 848–849, 852–854
    error tolerance, 849
    execution efficiency, 850
    expandability, 850
    generality, 850
    identifying from best projects, 854–856
    instrumentation, 850
    machine independence, 850
    modularity, 850
    operability, 850
    operational software, 843
    quality control, 856–857
    quality factors, 843–846
    quality in fact perspective, 843
    quality in perspective, 843
    self-descriptiveness, 850
    simplicity, 850
    storage efficiency, 850
    system independence, 850
    tester agility, 842–843
    time-compression efforts, 825
    traceability, 849
    training, 850
    work papers, 906–908
black box testing
    COTS software testing challenges, 689
    discussed, 69
boundary value analysis tools, 105
branch testing, description of, 239
browser compatibility concerns, Web-based systems testing, 800, 805–806
budgets
    administrative/organizational barriers, 865
    client/server testing readiness, 614
    efficiency measures, 853
    facts, managing, 162
    inadequate, 39
    people management through, 871
    post-implementation analysis, 578
    Project Status reports, 468–469
    Summary Status reports, 467
business innovation management, cultural barriers, 878
business logic techniques
    COTS software testing concerns, 692
    testing guidelines, 67–68
business requirements, methodologies, 592
business transaction control points, software security testing, 755–756
C
caching, Web-based systems testing, 805
calculation correctness, Web-based systems testing, 804
capabilities assessment, baseline development, 13–14
capability chart, best practices, 847
Capability Maturity Model Integrated (CMMI), 596–597, 824
capture/playback tools, 105
cause-effect
    diagram, baseline development, 9
    graphing tools, 105
CBOK (Common Body of Knowledge)
    categories, 127–128
    code of ethics, 125
    continuing education, 125
    discussed, 14–15
    tester competency, 125–126
    work papers
        individual competency evaluation, 149
        new information technology, 148
        project management, 135–138
        security procedure assessment, 146–147
        software controls, 146
        test environment, building, 133–135
        test planning, 138–142
        test status and analysis, 143–144
        test team competency evaluation, 150
        testing principles and concepts, 132–133
        user acceptance testing, 144–145
central computer sites, software security testing, 746
central processors, vulnerabilities, 738
Certified Software Tester. See CSTE certificate
champion, barrier/obstacle solution, 868
change characteristics, post-implementation analysis, 576
change control
    development activities, 753
    operational testing, 517–518
change estimation, timelines, 837
change in scope criteria, documentation, 174
change recommendation, tester roles, 44
check procedures
    client/server systems testing, 624
    COTS software testing, 705
    data warehouse testing, 780
    edit checks, 414
    multiplatform environment testing, 726
    operational testing, 522
    organization, 200
    post-implementation analysis, 580
    RAD testing, 642
    results analysis, 482
    software security testing, 762
    test plan development, 262
    validation testing, 439
    variability measures, 834
    verification testing, 330
    Web-based systems testing, 809
checklist
    process preparation, 86
    quality control, 207–209
    testing tactics, 100–101
    tools, 105
CI (configuration identification), 601
clarity of communication, COTS software testing, 700
class, classifying defects by, 258
classification objectives, cycle control objectives, 675
clearness, requirements measures, 594
client/server systems testing
    ad hoc process level, 616–617
    back-end processing, 611
    check procedures, 624
    client needs assessment, 622–623
    concerns, 612–613
    consistent process level, 618–619
client/server systems testing (continued)
    footprint chart, 621
    guidelines, 624
    installations, 622
    maturity levels, 615–616
    measured process level, 619–620
    optimized process level, 620–621
    output, 624
    readiness assessment, 614–615, 621
    repeatable process level, 617–618
    security, 622
    work papers
        client data, 627–628
        footprint chart, 631
        quality control checklist, 632
        readiness results, 630
        security, 626–627
        standards, 628–629
        system installation, 625
    workbench concept, 613
CM (configuration management)
    administrative activities, 602–603
    audits, 602
    baselines, 604
    basic requirements, 600
    CI (configuration identification), 601
    configuration control, 601
    CSA (configuration-status accounting) system, 601–602
    data distribution and access, 602
    document library, 604
    initial release, 604
    interface requirements, 605
    marking and labeling, 604–605
    planning, 602
    software development library, 604
    technical reviews, 603
    WBS (Work Breakdown Structure), 603
CMMI (Capability Maturity Model Integrated), 596–597, 824
code
    comparison tools, 105
    verification concerns, Web-based systems testing, 800
    walkthrough, program verification, 55
code of ethics, CBOK, 125
coding errors, undetected defects, 66
collection methods training, baseline information, 747
combination teams
    advantages/disadvantages, 171
    communication concerns, 170
commands, operations activities, 754
commercial off-the-shelf software. See COTS software testing
Common Body of Knowledge. See CBOK
communication
    barriers/obstacle solutions
        conflict-resolution methods, 880
        effective communication process, 882–884
        how to address, 882
        lines of communication, 881
        management support solutions, 883
        objectives, understanding consensus of, 880
        open channel concept, 884
        quality control, 884
        realistic mechanism solutions, 884
        respect for others, 880
    best practices, 850
    combination team concerns, 170
    communication lines and networks, software security testing, 746
    development activities, 753
    failed, application risks, 308–309
compatibility
    COTS implementation risks, 688
    Web-based systems testing, 804–805, 808–809
competency
    barriers, 865
    baseline development, 14–16
    building software skills, 598
    client needs assessment, 623
    competent-programmer hypothesis, 242
    cultural barriers, 874–876
    efficiency measures, 852–853
competency (continued)
    inadequate, 39
    people management skills, 598
    process-selection skills, 598
    project management skills, 598
    testers
        baseline development, 14–16
        CBOK (Common Body of Knowledge), 125–128
        competent assessment, 128
        CSTE (Certified Software Tester) certificate, 125, 127–128
        fully competent assessment, 128
        job performance roles, 126–127
        negative attitudes, 130
        not competent assessment, 128, 130
        training curriculum development, 128–130
        work papers, 28–32
    of tests, assessing, 14–16
    variability measures, 835
    work papers, 28–32
competition, barrier/obstacle solutions, 868
compiler-based analysis tools, 105
completeness evaluation
    best practices, 849
    client needs assessment, 623
    documentation, 179–180
    preventive controls, 660
    requirements measures, 594
complexity measures, structural analysis, 238
compliance testing
    examples of, 223
    how to use, 228
    with methodology, 299
    objectives, 227
    parallel testing, 229
    post-implementation analysis, 575
    test factors, 41–42
    validation testing, 434
    when to use, 228
component testing, Web-based systems testing, 807
computation faults, error-based testing, 241
computer operations, vulnerabilities, 738
computer processing
    operations activities, 754
    testing without, 669
computer programs, vulnerabilities, 736
computer software, system processing, 77
computer store demonstration, COTS software testing, 700–701
conciseness, best practices, 850
concurrency testing, Web-based systems testing, 804
concurrent software development criteria, documentation, 175
conditional testing, 240
conditions, validation testing results, 436, 438–439
configuration identification (CI), 601
configuration management. See CM
configuration-status accounting (CSA) system, 601–602
confirmation tools, 105
conflict-resolution methods, 880
consensus policies, 46–47
consistency
    best practices, 849
    process advantages, 153
    requirements measures, 594
consistent process level, client/server systems testing, 618–619
constraint method, software cost estimation, 182–183
constraints
    methodologies, 592
    profile information, 214
content criteria, documentation, 179–180
contingency planning
    programming testing, 327
    test plan development, 222
    verification testing, 319
continuing education, CBOK, 125
continuity of processing
    risk assessment, 216
    test factors, 41–42
contracted software
    COTS versus, 686
    test tactics, 75
    testing efforts, 704–705
    vendor reputation importance, 690–691
contracting officers, roles and responsibilities, 591
control flow analysis tools, 105
control testing
    examples of, 229
    how to use, 234
    objectives, 234
    when to use, 234
controllability, requirements measures, 594
controlled demonstration, COTS software testing, 700–701
controls
    corrective, 665
    detective
        control totals, 664
        data transmission, 663
        discussed, 662
        documentation, 664
        logs, 663
        output checks, 664–665
    objectives, verification testing, 303
    preventive
        balance checks, 662
        completeness tests, 660
        control data, 661
        control totals, 660
        data input, 659
        file auto-updating, 661
        hash totals, 661
        input validation, 659–661
        limit tests, 660
        logical tests, 660
        overflow tests, 662
        prenumbered forms, 659
        project checks, 660
        record counts, 661
        self-checking digits, 660
        source-data, 658–659
        turnaround documents, 659
        validity tests, 660
    project status calculation, 191
    risks, 64
conversion, cycle control objectives, 675
core business areas, test plan development, 221
corporate language, adjustments for, 741
corrected conditions, undetected defects, 66, 155
corrective controls, internal controls testing, 665
correctness
    programming testing, 327
    requirements measures, 594
    risk assessment, 217
    software quality factors, 844
    test factors, 40, 42
    tools, 105
    validation testing, 435
    Web-based systems testing, 800, 804
costs
    cost/benefit analysis, 172
    cost-effectiveness of testing, 47–48
    critical success factors, 695
    defects, 67, 154
    internal controls testing, 665
    profile information, 214
    software cost estimation
        inflation, 188
        labor rates, 188
        parametric models, 183–184
        personnel, 187
        prudent person test, 189
        recalculation, 188–189
        resources, 188
        schedules, 187
costs (continued)
    software cost estimation (continued)
        strategies for, 182–183
        validation, 185–189
    tools, 114–116
COTS (commercial off-the-shelf) software testing
    advantages, 687
    challenges, 689–690
    check procedures, 705
    clarity of communication, 700
    concerns, 691–692
    contracted software versus, 686
    CSFs (critical success factors), 695–696
    data compatibility, 698
    demonstrations, 700–701
    disadvantages, 687–688
    disk storage, 704
    ease of use, 700
    functional testing, 702–703
    guidelines, 706
    hardware compatibility, 697–698
    help routines, 701
    input, 693
    knowledge to execute, 701
    management information, 694
    objectives, 691
    operating system compatibility, 698
    operational fit tests, 696–697
    output, 705
    overview, 685
    people fit tests, 701–702
    probability, 694
    products/reports output, 693
    program compatibility, 698
    risk assessment, 688–689
    software functionality, 700
    structural testing, 703–704
    work flow, 698–699
    work papers
        completeness tests, 707–708
        functional testing, 710
        quality control checklist, 712–715
        structural testing, 711
        test of fit, 709
    workbench concept, 692
coupling
    coupling-effect hypothesis, 242
    programming testing, 328
    risk assessment, 218
    test factors, 41, 43
    validation testing, 434
courses, training curriculum, 128–129
crashes, Web-based systems testing, 807
critical criteria
    documentation, 174
    requirements measures, 594
critical path definition, timelines, 836
cross-references, inspection process, 260
CSA (configuration-status accounting) system, 601–602
CSFs (critical success factors), 695–696
CSTE (Certified Software Tester) certificate
    defined, 14–15
    tester competency, 125, 127–128
    uses for, 127
cultural barriers
    business innovation management, 878
    competencies, 874–876
    discussed, 869
    how to address, 879–880
    manage by fact concept, 876–877
    manage by process concept, 873–874
    management cultures, 870
    people management, 871–872
culture, client/server testing readiness, 614
curriculum, training, 128–130
customers
    profile information, 213
    roles of, 6
    satisfaction measures, 852
    and user involvement, lack of, 210
customer-site demonstration, COTS software testing, 700–701
customization, V-concept testing, 160–161
cycle control objectives, internal controls testing, 675
D
data
    accessibility, enterprise-wide requirements, 767
    commonality, best practices, 850
    compatibility, COTS software testing, 698
    components, acceptance testing, 492
    distribution, configuration management, 602
    generation aids, test verification, 55
    improper use of, 766
    incomplete, risk factors, 39
    requirements, document development, 173
data acquisition, system processing, 76
data collection
    baseline information, 744–747
    post-implementation analysis, 577
data dictionary tools, 105, 770
data entry
    errors, undetected defects, 66, 155
    operational fit tests, 697
data exchange issues, test plan development, 222
data flow analysis tools, 105, 238–239
data formats, client needs assessment, 623
data handling areas, vulnerabilities, 738
data input
    preventive controls, 659
    problems, 39
    response time criteria, documentation, 175
data integrity controls
    data warehouse testing, 771–772, 778–779
    programming testing, 326
    verification testing, 318
data transmission, detective controls, 663
data warehouse testing
    access control processes, 770–771, 777
    backup/recovery processes, 773–774, 778
    check procedures, 780
    concerns, 765–766, 768–769
    data integrity processes, 771–772, 778–779
    documentation processes, 769, 771, 776
    enterprise-wide requirements, 767–768
    front-end planning, 770
    guidelines, 780
    input, 767–768
    operations processes, 772–773, 777
    organizational processes, 769, 775–776
    output, 780
    overview, 765
    statistics, 773
    system development processes, 770–771, 779
    work papers
        access control, 785
        activity process, 797
        audit trails, 784
        concerns rating, 788, 795–796
        continuity of processing, 792
        data, placing in wrong calendar period, 787
        documentation, 791
        fraud, 789
        inadequate responsibility assignment, 781
        inadequate service levels, 786
        incomplete data concerns, 782
        management support concerns, 794
        performance criteria, 793
        quality control checklist, 798
        reviews, 790
        update concerns, 783
    workbench concept, 766–767
database management
    multiplatform environment testing, 717
    system processing, 76
databases
    built/used, profile information, 214
    sources, validation testing, 413
databases (continued)
    specifications, document development, 173
    tools, 105
date controls
    enterprise-wide requirements, 768
    Project Status reports, 468
    Summary Status reports, 467
debugging
    programming testing, 325–326
    verification testing, 292
decision and planning aids, system processing, 76
decision tables
    as documentation, 180
    functional testing, 238
Defect Distribution report, 475–476
Defect Removal Efficiency (DRE), 822
defects
    classifying, 258
    corrected conditions, 155
    costs, 67, 154
    data entry errors, 155
    efficiency measures, 853
    error correction mistakes, 155
    extra requirements, 65, 471
    failures versus, 65–66
    hard to find, 66–67
    improperly interpreted requirements, 155
    incorrectly recorded requirements, 155
    inspection process, 258–259
    instructional errors, 155
    missing requirements, 65, 471
    naming, 471
    post-implementation analysis, 576–578, 580
    program coding errors, 155
    program specification, 155
    results analysis, 462
    severity levels, 471
    testing errors, 155
    testing guidelines, 65–67
    undetected, 66
    wrong specifications, 65, 471
Defects Uncovered versus Corrected Gap Timeline report, 473–474
definition, document development, 172
degree of generality criteria, documentation, 174
deliverables
    efficiency measures, 853
    inspection process, 256
    profile information, 213–214
demonstrations, COTS software testing, 700–701
descriptions, system test plan standards, 81–82
design
    bad design problems, 39
    changes, regression testing, 55
    document development, 172
    specifications, undetected defects, 66
    verification, SDLC, 53–54
design phase, verification testing, 296–297
design-based functional testing tools, 105
desk checking tools, 106
desk debugging, 325–326
desk reviews, inspection process, 259–260
destruction, transaction, 756
detective controls, internal controls testing
    control totals, 664
    data transmission, 663
    discussed, 662
    documentation, 664
    logs, 663
    output checks, 664–665
developers
    functional testing phases, 70
    roles and responsibilities, 6
development
    document verification, 171–174
    documentation development phase, 172
    phases, test strategies, 56
development (continued)
    process profiles, 212–215
    software security testing, 753–754
    test process improvement, 44
    tester roles and responsibilities, 6
development project types, V-concept testing, 75
developmental costs criteria, documentation, 174
deviation, validation testing results, 427
diagnostic software, system processing, 76
digital storage facilities, vulnerabilities, 738
disaster planning
    operations activities, 754
    test tools, 106
    validation testing, 436
disk space allocation, stress testing, 223
disk storage
    COTS software testing, 704
    multiplatform environment testing, 724
document library, configuration management, 604
documentation
    audience criteria, 179
    change in scope criteria, 174
    client needs assessment, 623
    combining and expanding document types, 179–180
    completeness evaluation, 179–180
    concurrent software development criteria, 175
    content criteria, 179–180
    cost/benefit analysis, 172
    critical criteria, 174
    data dictionaries, 770
    data input response time criteria, 175
    data requirements, 173
    data warehouse testing, 769, 771, 776
    database specifications, 173
    decision tables, 180
    degree of generality criteria, 174
    detective controls, 664
    development activities, 753
    development phases, 171–174
    developmental costs criteria, 174
    enterprise-wide requirements, 768
    equipment complexity criteria, 174
    feasibility studies, 172
    flexibility criteria, 179
    flowcharts, 180
    formal publication, 178
    format criteria, 180
    forms, 180
    functional requirements, 173
    implementation procedures, 890
    inadequate, 766
    internal, 177–178
    levels, 177–178
    minimal, 177
    multiple programs/files, 180
    need for, 174–175
    operations manual, 174
    organization, 167
    originality required criteria, 174
    personnel assigned criteria, 174
    problems, 513
    program change response time criteria, 174
    program maintenance manuals, 174
    program specification, 173
    programming language criteria, 175
    project requests, 172
    redundancy criteria, 179
    section titles, 180
    size criteria, 179
    software summary, 172
    span of operation criteria, 174
    specification requirements, tester roles, 44
    standards, 770
    stress testing, 223
    system/subsystem specifications, 173
    test analysis reports, 174
    test plans, 174
    timelines, 180–181
documentation (continued)
    tool use, 124
    user manuals, 174
    validation testing, 412
    weighted criteria score, 175–177
    work papers
        documentation completeness, 202
        estimation, 203–205
        quality control checklist, 207–209
        weighted criteria calculation, 201
    working document, 178
domain testing, error-based testing, 241
domino effects, revised testing approach, 50
downloads, Web-based systems testing, 806
DRE (Defect Removal Efficiency), 822
drivers, baseline development, 8
dropped lines, Web-based systems testing, 807
dynamic analysis
    program verification, 55
    programming testing, 324
dynamic page generation, Web-based systems testing, 806

E
ease of operation
    risk assessment, 218
    test factors, 41, 43
    validation testing, 436
ease of use
    client needs assessment, 623
    COTS software testing, 700
    critical success factors, 695
    programming testing, 327
    risk assessment, 217
    scoring success factors, 317
    test factors, 41, 43
    validation testing, 435
e-commerce activities, 878
edit checks, 414
education, barrier/obstacle solutions, 868
effect, validation testing results, 436, 438
effectiveness measures
    best practices, 848–849, 852–854
    communication process, 882–884
    testing guidelines, 65
efficiency
    best practices, 848–849
    results analysis, 463
    software quality factors, 844
e-learning courses, QAI, 129
electronic device reliance, revised testing approach, 50
e-mail functions, Web-based systems testing, 806
end dates, timelines, 838
enterprise-wide requirements
    data warehouse testing, 767–768
    methodologies, 592
environment assessment criteria, 8
environment controls, internal controls testing, 656
equipment complexity criteria, documentation, 174
equivalence partitioning, 236–237
errors
    accidental versus intentional losses, 738–739
    application risks, 305–307
    corrective controls, 665
    data entry, 155
    error correction mistakes, 155
    error guessing tools, 106, 721–722
    error handling testing
        examples of, 229
        how to use, 232
        objectives, 231
        operations activities, 754
        when to use, 232
    error tolerance, best practices, 849
    error-based testing, 241–242
    fault estimation, 241
    fault-based testing, 242–243
    instructional, 155
    perturbation testing, 242
    program coding, 155
errors (continued)
    statistical testing, 241
    undetected defects, 66
estimation
    parametric models, 183–184
    resources, availability, 181–182, 221
    software cost
        inflation, 188
        labor rates, 188
        parametric models, 183–184
        personnel, 187
        prudent person test, 189
        recalculation, 188–189
        resources, 188
        schedules, 187
        strategies for, 182–183
        validation, 185–189
    troubled project characteristics, 181–182
evaluation
    policy criteria, 45
    system test plan standards, 81–82
events
    event control, system processing, 76
    test script development, 431
examination tools, 105
executable spec tools, 106
execution
    acceptance testing, 499
    best practices, 850
    execution testing
        examples of, 223
        how to use, 225
        objectives, 225
        when to use, 225
    validation testing, 434–436
expandability
    best practices, 850
    critical success factors, 695
Expected versus Actual Defects Uncovered Timeline report, 472–473
expenditure, cycle control objectives, 675
expenses, Project Status reports, 468
experience evaluation, staffing, 600
expression testing, description of, 240
extensions, project status calculation, 193–194
external team advantages/disadvantages, 170
external/internal work processes, variability measures, 835
extra requirements
    COTS software testing concerns, 692
    defects, 65, 471

F
facility requirements, test plan development, 222
fact finding tools, 106
fact management, V-concept testing, 162
failures
    COTS software testing concerns, 691
    defects versus, 65–66
fault design problems, bad design decisions, 39
fault estimation, error-based testing, 241
fault-based testing, 242–243
feasibility reviews, 70, 172
file auto-updating, preventive controls, 661
file design, validation testing, 413–414
file downloads, Web-based systems testing, 806
file handling, multiplatform environment testing, 724–725
file integrity
    programming testing, 326–327
    risk assessment, 215
    test factors, 40, 42
    validation testing, 435
    verification testing, 301
file reconciliation, control testing, 229
final planning iteration, RAD testing, 642
final reports, walkthroughs, 314
financial planning objectives, internal controls testing, 674–675
first priority, probable penetration points, 760
flexibility
    agile testing process, 820–821
    documentation criteria, 179
    software quality factors, 844
    test plan development, 262
flowcharts
    as documentation, 180
    tools, 106
follow-ups, inspection process, 261
footprint charts
    baseline development, 10
    client/server systems testing, 621
    work papers, 23
formal publication documentation, 178
format criteria, documentation, 180
forms
    completion training, 747–748
    as documentation, 180
fraud studies, software security testing, 761
front-end planning, data warehouse testing, 770
fully competent assessment, tester competency, 128
functional desk debugging, 326
functional problems, COTS implementation risks, 688
functional requirements, document development, 173
functional testing. See also structural testing
    advantages/disadvantages, 69
    algebraic specification, 237
    analysis, 236
    axiomatic specification, 238
    black box testing, 69
    COTS software testing, 702–703
    decision tables, 238
    equivalence partitioning, 236–237
    functional analysis, 236
    input domain testing, 236
    interface-based, 236–237
    output domain coverage, 237
    reasons for, 69
    special-value testing, 237
    state machines, 238
    syntax checking, 237
    system testing, 70
    user acceptance, 70
    validation techniques, 69–70
    verification techniques, 69–70
functionality requirements, acceptance testing, 497
Functions Working Timeline report, 472
function/test matrix, reporting, 470–472
funding, administrative/organizational barriers, 865

G
generality, best practices, 850
global extent, fault-based testing, 242–243
graphics filters, Web-based systems testing, 805
guidelines, testing
    business logic techniques, 67–68
    defects, uncovering, 65–67
    effective test performance, 65
    functional testing, 69–71
    life-cycle testing, 68
    reasons for, 63–64
    risk reduction, 64–65
    structural testing, 69–71

H
hacking, 757
hardware compatibility
    COTS software testing, 697–698
    Web-based systems testing, 805
hardware configurations, multiplatform environment testing, 717
hardware constraints, operational fit tests, 696
hash totals, preventive controls, 661
help it happen concept, stakeholder perspective, 861–862
help routines
    client needs assessment, 623
    COTS software testing, 701
heuristic models, estimation, 183
hierarchical organization, people management, 871
high positive correlation, scoring success factors, 317
high prioritized objectives, 245
high readiness assessment, client/server systems testing, 621
hot button barriers, 864
hotlines, tool manager duties, 119
HTML tools, Web-based systems testing, 809

I
ICS (internal control specialist), 591
IEEE (Institute of Electrical and Electronics Engineers) standards, 595
image processing, 76
impersonation, vulnerabilities, 736
implementation procedures
    baseline development, 9
    discussed, 891
    documentation approaches, 890
    measurability, 892
    methodologies, 592
    objectives, 892
    obtainable ideas, 886
    planning process, 891
    prioritization, 888–889
    profile information, 214
    quality control, 890, 894
    requisite resources, 893–894
    results, 892–893
    test plan development, 251
    time-compression efforts, 886–888
    user acceptance, 885
improperly interpreted requirements, defects, 155
improvement barriers
    administrative/organizational barriers, 865–866
    quality control, 869
    red flag/hot button barriers, 864
    root cause, 866–869
    staff-competency barriers, 865
    stakeholder perspective, 861–864
improvement planning
    criteria, 16
    processes, 154
    self-assessment practices, 17–18
in control processes, variability measures, 832
inadequate assignment concerns, data warehouse testing, 765
inadequate audit trails, 766
inadequate documentation, 766
inadequate service levels, 766
incomplete data
    data warehouse testing, 766
    risk factors, 39
incorrectly recorded requirements, defects, 155
incremental methodology, 588
industry issues, profile information, 214
inefficient testing, 857–860
inflation, software cost estimation, 188
in-house courses, QAI, 129
initial release, configuration management, 604
initiation, documentation development phase, 171
in-phase agreement, RAD testing, 641
input
    acceptance testing, 495–496
    COTS software testing, 693
    data warehouse testing, 767–768
    documentation, 167
    multiplatform environment testing, 720–721
    organization, 167
    post-implementation analysis, 574
    RAD (rapid application development) testing, 636
    results analysis, 461–463
    software security testing, 735
    test plan development, 212
    timelines, 837
    validation testing, 411
    verification testing, 296–297
    vulnerabilities, 736
    Web-based systems testing, 801–802
input component, workbench concept, 71
input domain testing, 236
inspection process
    author responsibilities, 258
    concerns, 255
    cross-references, 260
    defect classification, 258–259
    design deliverables, 322
    desk reviews, 259–260
    follow-ups, 261
    importance of, 254
    individual preparation, 259–260
    inspectors responsibilities, 258
    meetings, 260–261
    moderator responsibilities, 256–257
    overview sessions, 259
    planning and organizing procedures, 259
    products/deliverables, 256
    reader responsibilities, 257
    recorder responsibilities, 257
    tools, 106
    validation testing, 436
    verification testing, 292
installation phase, operational testing, 508–509
installations
    client/server systems testing, 622
    verification, SDLC, 53, 55
Institute of Electrical and Electronics Engineers (IEEE), 595
instructional errors, 155
instructions coverage, post-implementation analysis, 578
instrumentation
    best practices, 850
    tools, 106
integration
    COTS implementation risks, 689
    revised testing approach, 50
    Web-based systems testing, 800
integration scripting, 431
integration testing
    as functional tests, 70
    RAD testing, 637
    tools, 106
    validation, 221
    Web-based systems testing, 807
integrity, software quality factors, 844
intentional versus accidental losses, 738–739
interfaces
    activities, software security testing, 752
    design complete factor, verification testing, 320
    interface-based functional testing, 236–237
    multiplatform environment testing, 725–726
    profile information, 214–215
    quality, acceptance testing, 497
    requirements, configuration management, 605
Interim Test report, 478
internal control specialist (ICS), 591
internal controls testing
    application background information, 666–667
    application controls, 668–669
    corrective controls, 665
    cost/benefit analysis, 665
    cycle control objectives, 675
    detective controls
        control totals, 664
        data transmission, 663
        discussed, 662
        documentation, 664
        logs, 663
        output checks, 664–665
    environment controls, 656
    financial planning objectives, 674–675
    master records, 671
    mini-company approach, 672
    multiple exposures, 657
    non-effective controls, 678
    objectives, 657
    overview, 655
    password protection, 656
    preventive controls
        balance checks, 662
        completeness tests, 660
        control data, 661
        control totals, 660
        data input, 659
        file auto-updating, 661
        has totals, 661
        input validation, 659–661
        limit tests, 660
        logical tests, 660
        overflow tests, 662
        prenumbered forms, 659
        project checks, 660
        record counts, 661
        self-checking digits, 660
        source-data, 658–659
        turnaround documents, 659
        validity tests, 660
    quality control checklist, 678
    results of, 677
    risk assessment, 668
    strong controls, 677
    system control objectives, 674
    test-data approach, 669–672
    transaction flow testing, 672–673
    weak controls, 678
    without computer processing, 669
    work papers
        documentation, 679
        file control, 683
        input controls, 679–681
        output controls, 682
        program and processing controls, 681–682
    workbench concept, 667
internal documentation, 177–178
internal team
    advantages/disadvantages, 169–170
international standards, 594–595
Internet. See Web-based systems testing
interoperability
    implementation risks, 689
    software quality factors, 844
intersystems testing
    examples of, 229
    how to use, 233
    objectives, 233
    when to use, 233
intrusion concerns, Web-based systems testing, 803
invalid data, tests using, 413
inventory process, reporting, 465
IT operations
    management
        policy criteria, managing, 45
        roles and responsibilities, 6, 591
    vulnerabilities, 736
iterative development, test tactics, 75

J
joint policy development, 47
judgment evaluation approach, post-implementation analysis, 575

K
knowledge to execute, COTS software testing, 701

L
labeling and marking, configuration management, 604
labor rates, software cost estimation, 188
lack of training concerns, 210
LAN (local area network), 800
legal/industry issues, profile information, 214
legend information
    Project Status reports, 470
    Summary Status reports, 468
let it happen concept, stakeholder perspective, 862
licensing issues, COTS implementation risks, 689
life cycle phases
    SDLC, 52–53
    tool selection, 109–111
life-cycle testing, testing guidelines, 68
limit tests, preventive controls, 660
limited space, inspection process concerns, 255
lines of communication, 881
load testing, Web-based systems testing, 808
local area network (LAN), 800
local extent, fault-based testing, 242–243
logging tools, 107
logical tests, preventive controls, 660
logs, detective controls, 663
lose-lose situations, test plan development, 211
lost connections, Web-based systems testing, 807
low prioritized objectives, 245
low readiness assessment, client/server systems testing, 621

M
maintainability
    critical success factors, 695
    risk assessment, 217
    software quality factors, 844
    test factors, 41, 43
    validation testing, 436
maintenance
    test tactics, 75
    verification, SDLC, 53, 55
make it happen concept, stakeholder perspective, 861
manage by fact concept, cultural barriers, 876–877
manage by process concept, cultural barriers, 873–874
manageable processes, 154
management
    COTS software testing, 694
    cultural barriers, 870
    IT operations
        policy criteria, managing, 45
        roles and responsibilities, 6, 591
    management support
        communication barrier solutions, 883
        time-compression readiness criteria, 826
    risk appetite, 38
    roles and responsibilities, 6–7
    support for testing, 50–51
    test manager responsibilities, 167–168
    tool managers
        assistant, 119
        mentors, 119
        need for, 117
        positions, prerequisites to creating, 118
        responsibilities of, 117, 119–120
        skill levels, 118–119
        tenure, 120
manual applications, operational testing, 523–524
manual support testing
    examples of, 229
    how to use, 232–233
    objectives, 232
    when to use, 233
manual tools, 104
mapping tools, 106
marketing, barrier/obstacle solutions, 868
marking and labeling, configuration management, 604
master records, internal controls testing, 671
mathematical models, system processing, 76
mature test processes, agile testing, 821
maturity levels, client/server systems testing, 615–616
MBTI (Myers Briggs Type Indicator), 130
mean time between failures (MTBF), 822–823
mean time to failure (MTTF), 822–823
mean time to repair (MTTR), 822–823
measurability
    implementation procedures, 892
    requirements measures, 594
measured process level, client/server systems testing, 619–620
measurement first, action second concept, 581
measurement units, reporting, 464–465
media libraries, 754
media, vulnerabilities, 736
medium readiness assessment, client/server systems testing, 621
meetings, inspection process, 260–261
memory, Web-based systems testing, 805
mentors, tool managers, 119
message processing, system processing, 76
methodologies, software development
    business requirements, 592
    CMMI (Capability Maturity Model Integrated), 596–597
    competencies required, 598–599
    compliance with, 299
    constraint requirements, 592
    enterprise-wide requirements, 592
    implementation requirements, 592
    incremental methodology, 588
    international standards, 594–595
    overview, 586
    prototyping, 587
    RAD (rapid application development) methodology, 587–588
    risk assessment, 215–218
    SDLC (software development life cycle), 588–589
    SEI (Software Engineering Institute), 596
    self-assessment, 605–606
    spiral methodology, 588
    sponsor responsibilities, 590
    SRS (System Requirements Specifications), 595
    staff experience, 600
    state of requirements, 592
    systems analyst perspective, 593
    user responsibilities, 590
    V-concept testing, 588
    waterfall methodology, 587
    work papers
        analysis footprint, 609
        self-assessment, 607–608
methods, baseline information, 743
milestones
    design verification, 54
    project status calculation, 191
    test plan development, 251
mini-company approach, internal controls testing, 672
minimal documentation, 177
misinterpretation, undetected defects, 66
missing requirements
    COTS software testing concerns, 691
    defects, 65, 471
modeling tools, 106
moderators, inspection process responsibilities, 256–257
modifiability, requirements measures, 594
modularity, best practices, 850
monitors, Web-based systems testing, 805
motivation
    client/server testing readiness, 614
    management support needs, 51
    motivation factors, testers, 51
MTBF (mean time between failures), 822–823
MTTF (mean time to failure), 822–823
MTTR (mean time to repair), 822–823
multimedia support, Web-based systems testing, 805
multiplatform environment testing
    challenges, 718–719
    check procedures, 726
    concerns, 718
    disk storage, 724
    error guessing, 721–722
    file handling, 724–725
    guidelines, 726–727
    hardware configurations, 717
    input, 720–721
    interfaces, 725–726
    needed platforms, listing, 723
    objectives, 718
    output, 726
    overview, 717
    structural testing, 723–725
    test room configurations, 723
    transaction processing events, 724
    V-concept testing, 725–726
    work papers
        concerns, 728
        configurations, 728
        quality control checklist, 731–732
        validity, 729–730
    workbench concept, 719–720
multiple exposures, internal controls testing, 657
mutation testing, 242
Myers Briggs Type Indicator (MBTI), 130

N
navigation correctness, Web-based systems testing, 804
needs gap and risks, 38
negative attitudes, tester competency, 130
network and telephone switching equipment, test plan development, 222
new systems development, project scope, 77
non-effective controls, internal controls testing, 678
non-IT teams
    advantages/disadvantages, 170
    vulnerabilities, 738
no-rework day timelines, 839
Normalized Defect Distribution report, 476–477
not competent assessment, tester competency, 128, 130

O
objectives
    acceptance testing, 492–493
    average prioritized, 245
    compliance testing, 227
    control testing, 234
    COTS testing, 691
    error handling testing, 231
    execution testing, 225
    implementation procedures, 892
    internal controls testing, 657
    intersystems testing, 233
    itemizing, 245
    manual support testing, 232
    multiplatform environment testing, 718
    objectives-driven processes, agile testing, 821
    operations testing, 227
    parallel testing, 234
    priority assignment, 245
    profile information, 213
    RAD testing, 634
    recovery testing, 226
    regression testing, 231
    requirements testing, 230
    security testing, 228
    software security testing, 228
    stress testing, 224
    test plan development, 210, 213
    understanding consensus of, 880
    V-concept testing, 159–160
    verification testing, 293
object-oriented (OO) system developments, 500–501, 802
obsolete data, risk factors, 39
obtainable ideas, implementation procedures, 886
office equipment, software security testing, 746
online system tests, 248
online terminal systems, vulnerabilities, 738
OO (object-oriented) system developments, 500–501, 802
open channel concept, communication barrier solutions, 884
operability, best practices, 850
operating systems
    access and integrity, vulnerabilities, 736
    compatibility, COTS software testing, 697–698
    flaws, application risks, 308
    multiplatform environment testing, 717
operational fit tests, COTS software testing, 696–697
operational needs, COTS software testing concerns, 692
operational profiles, validation testing, 413
operational software, best practices, 843
operational status, parallel testing, 229
operational testing
    automated applications, 522–523
    change control, 517–518
    check procedures, 522
    discussed, 503
    guidelines, 525–526
    installation phase, 508–509
    manual applications, 523–524
    output, 522
    problems, documenting, 513
    production monitoring, 512–513
    software version changes, 509–511
    test data development, 515–517
    test plan development and updates, 514–515
    training failures, 524
    training materials, 519–522
    V-concept testing, 158–159
operations
    activities, software security testing, 754–755
    documentation development, 172
    manuals, document development, 174
    operations testing
        examples of, 223
        how to use, 227
        objectives, 227
        when to use, 227
    processes, data warehouse testing, 772–773, 777
optimized process level, client/server systems testing, 620–621
organization
    check procedures, 200
    document verification, 171–175
    input, 167
    output, 200
    project scope, defining, 168
    project status calculation, 189–193
    software testing model definition, 7
    teams, appointing, 168–171
    test estimation, 181–185
    test managers, appointing, 167–168
    V-concept testing, 157
    workbench concept, 166
organizational barriers, 865–866
organizational processes, data warehouse testing, 769, 775–776
origin, classifying defects by, 258
originality required criteria, documentation, 174
origination, transactions, 755
out of control processes, variability measures, 832
out-of-phase agreement, RAD testing, 642
output
    client/server systems testing, 624
    components, workbench concept, 71
    COTS software testing, 705
    data warehouse testing, 780
    multiplatform environment testing, 726
    operational testing, 522
    organization, 200
    output checks, detective controls, 664–665
    post-implementation analysis, 580–581
    RAD testing, 643
    results analysis, 482
    software security testing, 762
    test plan development, 262
    timelines, 837
    validation testing, 439
    verification testing, 331
    vulnerabilities, 736
    Web-based systems testing, 810
output domain coverage, 237
outside users, revised testing approach, 50
outsourcing, people management, 871
overflow tests, preventive controls, 662
overlooked details, undetected defects, 66
over-reliance, test plan development, 210
overview sessions, inspection process, 259

P
paperwork, administrative/organizational barriers, 865–866
parallel operation tools, 106
parallel simulation tools, 107
parallel testing
    examples of, 229
    how to use, 235
    objectives, 234
    when to use, 235
parametric models
    heuristic models, 183
    phenomenological models, 183
    regression models, 183–184
password protection, internal controls testing, 656
path domains, error-based testing, 241
path testing, 240
paths coverage, post-implementation analysis, 578
pattern processing, 76
payroll application example, validation testing, 416–429
PDCA (plan-do-check-act) cycle, 4–5, 64, 891
peer reviews
    programming testing, 328–330
    tools, 107
penalties, risk assessment, 58
penetration points, software security testing
    business transaction control points, 755–756
    characteristics of, 756
    development activities, 753–754
    first priority, 760
    interface activities, 752
    operations activities, 754–755
    second priority, 760
    staff activities, 751
    third priority, 760
people components, acceptance testing, 492
people fit tests, COTS software testing, 701–702
people management skills
    competencies required, 598
    cultural barriers, 871–872
people needs, COTS software testing concerns, 692
percentage-of-hardware method, software cost estimation, 183
performance
    acceptance testing, 497
    inspection process concerns, 255
    methods, project status calculation, 190–192
    people management through, 871
    programming testing, 328
    RAD testing, 635
    risk assessment, 218
    test factors, 41, 43
    validation testing, 434
    Web-based systems testing, 800, 803–804, 808
personnel
    design reviews, 321
    software cost estimation, 187
personnel assignment criteria, documentation, 174
perturbation testing, error-based testing, 242
phenomenological models, estimation, 183
physical access, vulnerabilities, 736
plan-do-check-act (PDCA) cycle, 4–5, 64, 891
planning
    CM (configuration management), 602
    implementation procedures, 891
platforms. See multiplatform environment testing
playback tools, 105
point system, project status calculation, 192–193
policies
    consensus, 46–47
    criteria for, 45
    development activities, 753
    good practices, 45
    joint development, 47
    management directive method, 46
    methods for establishing, 46–47
portability
    programming testing, 328
    risk assessment, 217–218
    software quality factors, 844
    test factors, 41, 43
    validation testing, 436
    verification testing, 320
post-implementation analysis
    acceptance criteria, 578
    assessment objectives, 574–575
    asset value, 579
    budgets, 578
    change characteristics, 576
    check procedures, 580
    compliance, 575
    concerns, 572
    data collection, 577
    defects, 576–578, 580
    guidelines, 581
    input, 574
    instructions coverage, 578
    judgment evaluation approach, 575
    measurement assignment responsibilities, 575
    measurement first, action second concept, 581
    output, 580–581
    overview, 154, 571
    paths coverage, 578
    rerun analysis, 579
    scale of ten, 580
    schedules, 580
    source code analysis, 579
    startup failure, 579
    system complaints, 576
    termination analysis, 579
    test costs, 578
    test to business effectiveness, 578
    testing metrics, 576–579
    user participation, 578
    user reaction evaluation approach, 576
    V-concept testing, 159
    what to measure, 575
    work papers, 582
    workbench concept, 572–573
potential failures, assessing severity of, 221
pre-implementation, 154
prenumbered forms, preventive controls, 659
presentations
    system processing, 76
    walkthroughs, 313–314
preventive controls, internal controls testing
    balance checks, 662
    completeness tests, 660
    control data, 661
    control totals, 660
    data input, 659
    file auto-updating, 661
    has totals, 661
    input validation, 659–661
    limit tests, 660
    logical tests, 660
    overflow tests, 662
    prenumbered forms, 659
    project checks, 660
    record counts, 661
    self-checking digits, 660
    source-data, 658–659
    turnaround documents, 659
    validity tests, 660
print handling, 805
priorities
    administrative/organizational barriers, 866
    implementation procedures, 888–889
    objectives, 245
privileged users, interface activities, 752
probability, COTS software testing, 694
problems, documenting, 513
procedures
    development activities, 753
    procedure control, system processing, 76
    procedures in place, security testing, 223
    workbench concept, 71
process control, system processing, 76
process management, V-concept testing, 161–162
process preparation checklist, 86
process requirements, reporting, 464–466
processes
    advantages of, 153
    improvements, 44, 65, 154
    manageable, 154
    teachable, 154
process-selection skills, 598
procurement, COTS implementation risks, 689
production
    installation verification, 55
    validation testing, 413
production library control, operations activities, 754
production monitoring, operational testing, 512–513
products
    inspection process, 256
    output, COTS software testing, 693
profiles, project, 212–215
program change response time criteria, documentation, 174
program coding errors, undetected defects, 66, 155
program compatibility, COTS software testing, 698
program errors, application risks, 306–307
program maintenance manuals, document development, 174
program specification
    defects, 155
    document development, 173
program users, interface activities, 752
program verification, SDLC, 53, 55
programming, document development, 172
programming language criteria, documentation, 175
programming phase, verification testing, 297, 323–324
programming skills, tool selection considerations, 111–113
programming testing
    acceptance testing, 324
    complexity of, 323
    desk debugging, 325–326
    dynamic testing, 324
    importance of, 323
    peer reviews, 328–330
    static analysis, 324
    test factor analysis, 326–328
programs, vulnerabilities, 736, 738
project description contents, acceptance testing, 498
project leader assessment
    configuration management plans, 602–603
    verification testing, 317
project management
    roles and responsibilities, 6–7, 590, 910
    skills, competencies required, 598
project phases, V-concept testing, 79
project requests, document development, 172
project scope
    new systems development, 77
    organization techniques, 168
    V-concept testing, 77
project status calculation
    controls, 191
    discussed, 189
    extensions, 193–194
    milestone method, 191
    performance methods, 190–192
    point system, 192–193
    reports, 195
    rolling baseline, 195
    target dates, 194
    tracking systems, 190
project types, V-concept testing, 75
projects
    checks, preventive controls, 660
    profiling, 212–215
    project information
        Project Status reports, 468
        Summary Status reports, 467
protection points, software security testing, 746
prototyping
    methodology, 587
    test tactics, 75
pseudo-concurrency scripting, 431
purchased software, test tactics, 75
pureness, requirements measures, 594

Q
QA specialist roles and responsibilities, 591
QAI (Quality Assurance Institute)
    discussed, 5
    e-learning courses, 129
    in-house courses, 129
    management support for testing, 50–51
    Web site, 15, 125
QFD (quality function deployment), 292
Quality Assurance Institute. See QAI
quality control
    agile testing process, 821
    best practices, 856–857
    checklist, 207–209
    communication barriers, 884
    implementation procedures, 890, 894
    improvement barriers, 869
    internal controls testing, 678
    unit testing, 244
    variability measures, 841
quality factors, best practices, 843–846
quality function deployment (QFD), 292
quality in fact perspective, best practices, 843
quality in perspective, best practices, 843
questionnaires, baseline development, 10
questions/recommendations, responding to, 314
R
RAD (rapid application development) testing
    acceptance testing, 637
    check procedures, 642
    concerns, 634–635
    defined, 587–588
    discussed, 210
    final planning iteration, 642
    guidelines, 643
    in-phase agreement, 641
    input, 636
    integration testing, 637
    objectives, 634
    out-of-phase agreement, 642
    output, 643
    overview, 633
    performance, 635
    spiral testing, 638
    strengths/weaknesses, 639
    subsequent testing iterations, 640–642
    test planning iterations, 640
    testing components, 635
    work papers
        applicability checklist, 644
        conceptual model development, 646–647
        logical data model development, 647
        production system development, 651–652
        production system release, 652
        quality control checklist, 653
        scope and purpose of system definition, 645–646
        specifications, revising, 650–651
        system development, 648–649
        test system release, 652
    workbench concept, 635–636
ratio tools, 107
reactionary environment, people management, 871
readability, test plan development, 262
readers, inspection process responsibilities, 257
readiness assessment
    client/server systems testing, 614–615, 621
    time-compression efforts, 826
realistic mechanism solutions, communication barriers, 884
record counts, preventive controls, 661
recorders, inspection process responsibilities, 257
records
    records-retention program, 753
    undetected defects, 66
recoverability concerns, Web-based systems testing, 807
recovery testing
    data warehouse testing, 773–774, 778
    examples of, 223
    how to use, 226
    importance of, 225
    objectives, 226
    validation testing, 435
    when to use, 226
red flag/hot button barriers, 864
reduction, variability measures, 835
redundancy criteria, documentation, 179
regression models, estimation, 183
regression scripting, 431
regression testing
    COTS software testing challenges, 690
    design changes, 55
    examples of, 229
    how to use, 231
    importance of, 230
    maintenance verification, 55
    objectives, 231
    Web-based systems testing, 808
    when to use, 231
relationship tools, 107
relevance, requirements measures, 594
reliability
    risk assessment, 215
    software quality factors, 844
    test factors, 41, 43
    validation testing, 434
    Web-based systems testing, 806
reliability, critical success factors, 696
reload pages, Web-based systems testing, 805
remote computer sites, software security testing, 746
repeatable process level, client/server systems testing, 617–618
reports. See also results
    Average Age of Uncorrected Defects by Type, 475
    COTS software testing, 693
    Defects Uncovered versus Correlated Gap Timeline, 473–474
    document development, 174
    Expected versus Actual Defects Uncovered Timeline, 472–473
    Functions Working Timeline, 472
    function/test matrix, 470–472
    Interim Test, 478
    inventory process, 465
    measurement teams, 465
    measurement units, 464–465
    Normalized Defect Distribution, 476–477
    preparation, tool manager duties, 120
    process requirements, 464–466
    Project Status, 468–470
    project status calculation, 195
    summaries, 479–480
    Summary Status, 466–468
    system test report example, 480–481
    test verification, 55
    Testing Action, 477
    timeline development, 837
    V-concept testing, 158
    vulnerabilities, 737
    work papers
        defect reporting, 484–485
        quality control, 486–487
        writing guidelines, 488–489
requirement phase, verification testing, 296
requirements reviews, 70
requirements testing
    examples of, 229
    how to use, 230
    objectives, 230
    when to use, 230
requirements tracing, 292, 314–315
requirements verification, SDLC, 53–54
requisite resources, implementation procedures, 893–894
rerun analysis, post-implementation analysis, 579
resentment, inspection process concerns, 255
resources
    allocation for, management support needs, 51
    estimation, 181–182, 221
    resources needed, test plan development, 221
    resources protected concerns, baseline information, 743
    software cost estimation, 188
respect for others, communication barriers, 880
responsibilities
    test managers, 167–168
    of tool managers, 119–120
resubmission, corrective controls, 665
results. See also reports
    analysis
        check procedures, 482
        concerns, 460
        defects, 462
        efficiency, 463
        guidelines, 482
        input, 461–463
        output, 482
        overview, 459
        test scripts, 433
        workbench concepts, 460
    assessments, baseline development, 11–12
    implementation procedures, 892
    internal controls testing, 677
    managing, V-concept testing, 162
    software security testing, 760–761
    test verification, 55
    validation testing
        conditions, 436, 438–439
        deviation, 437
        effect, 436, 438
retesting activity, maintenance verification, 55
retrieval, transaction, 756
reusability, software quality factors, 844
revenue, cycle control objectives, 675
reviews
    feasibility, 70
    method selections, 329–330
    programming testing, 328–330
    requirements, 70
    test plan development, 262
    verification testing, 292
rewards
    barrier/obstacle solutions, 868
    management support needs, 51
rework factors, variability measures, 835, 839
risks
    application, 304–308
    bad design problems, 39
    controls, 64
    data input problems, 39
    faulty design problems, 39
    identifiable, 303
    incomplete data, 39
    needs gap, 38
    risk appetite, 38
    risk assessment
        access control, 216
        audit trails, 216
        authorization, 215
        continuity of processing, 216
        correctness, 217
        COTS (commercial off-the-shelf) software, 688–689
        coupling, 218
        ease of operation, 218
        ease of use, 217
        file integrity, 215
        internal controls testing, 668
        maintainability, 217
        methodology, 216–217
        penalties, 58
        performance, 218
        portability, 217–218
        reliability, 215
        service levels, 216
        software security testing, 752
        test plan development, 27, 215–216
        testing guidelines, 64–65
        V-concept testing, 77–79
        Web-based systems testing, 802–803
        work papers, 61
    risk matrix
        tools, 107
        verification testing, 293, 302
    specifications gap, 38
    team member establishment, 302–303
    test plan development, 215–218
rolling baseline, project status calculation, 195
root cause identification
    improvement barriers, 866–869
    red flag/hot button barriers, 864
    variability measures, 840
rule establishment
    acceptance testing, 492
    programming testing, 329
    walkthroughs, 312–313

S
sampling, scoring success factors, 316
Sarbanes-Oxley Act of 2002, 655
scale of ten, post-implementation analysis, 580
schedules
    COTS software testing challenges, 690
    efficiency measures, 853
    facts, managing, 162
    inadequate, 39
    people management through, 871
    post-implementation analysis, 580
    profile information, 214
    software cost estimation, 187
    test plan development, 222
    tool manager duties, 119
scheduling status
    Project Status reports, 469
    Summary Status reports, 467
scope. See project scope
scoring success factors, verification testing, 316–318
scoring tools, 107
scripts. See test scripts
SDLC (system development life cycle)
    design requirements, 53–54
    economics of testing, 47–49
    installation requirements, 53, 55
    life cycle phases, 52–53
    maintenance requirements, 53, 55
    program requirements, 53, 55
    requirements verification, 53–54
    software development methodologies, 588–589
    test requirements, 53, 55
second priority, probable penetration points, 760
section titles, documentation, 180
security. See also software security testing
    application risks, 304–305
    client/server systems testing, 622
    COTS implementation risks, 688
    critical success factors, 696
    programming testing, 327
    validation testing, 435
    Web-based systems testing, 800, 803
SEI (Software Engineering Institute), 596, 824
self-assessment
    improvement planning, 17–18
    software development methodologies, 605–606
    software testing model definition on test processes, 24–27
    work papers, 909–914
self-checking digits, preventive controls, 660
self-descriptiveness, best practices, 850
senior management roles and responsibilities, 6
sensor processing, 76
service levels
    inadequate, 766
    programming testing, 327
    risk assessment, 216
    test factors, 41–42
    validation testing, 435
    verification testing, 319
seven-step software testing process. See V-concept testing
severity
    classifying defects by, 258
    defect severity levels, 471
signal processing, 76
simplicity, best practices, 850
simulation
    design verification, 54
    software cost estimation, 183
    system processing, 76
site validation tools, Web-based systems testing, 809
size criteria, documentation, 179
skill levels
    tool managers, 118–119
    tool selection considerations, 111–114
snapshot tools, 107
Software Certifications Web site, 125
software cost estimation. See costs
software development library, configuration management, 604
Software Engineering Institute (SEI), 596, 824
software functionality, COTS software testing, 700
software methods, test matrix, 246
software packages, test script development, 431
software quality requirements, acceptance testing, 497
software security testing. See also security
    accidental versus intentional losses, 738–739
    adequacy evaluation, 761
    baseline information
        accurate and precise information, 741
        analysis, 750
        availability, 741
        baseline awareness training, 747
        categories, 739
        collection method training, 747
        corporate language adjustments, 741
        data collection methods, 744–747
        forms completion training, 747–748
        methods, 743
        objectives, 751
        one-time data collection procedures, 741
        reasons for, 740
        resources protected concerns, 743
        security sources, 740
        status, reporting, 749
        support concerns, 744
        team member selection, 742–743
        training concerns, 743
        what to collect, 740
    central computer sites, 746
    check procedures, 762
    communication lines and networks, 746
    examples of, 223
    fraud studies, 761
    guidelines, 762
    hacking, 757
    how to use, 228
    input, 735
    objectives, 228, 734
    office equipment, 746
    output, 762
    overview, 733
    penetration points
        business transaction control points, 755–756
        characteristics of, 756
        developing, 757–760
        development activities, 753–754
        first priority, 760
        interface activities, 752
        operations activities, 754–755
        second priority, 760
        staff activities, 751
        third priority, 760
    protection points, 746
    remote computer sites, 746
    results, 760–761
    risk assessment, 752
    storage areas, 746
    vulnerabilities
        central processors, 738
        computer operations, 738
        computer programs, 736–737
        data and report preparation facilities, 737
        data handling areas, 738
        digital storage facilities, 738
        discussed, 735
        impersonation, 737
        input data, 736
        IT operations, 736
        media, 737
        non-IT areas, 738
        online terminal systems, 738
        operating system access and integrity, 737
        output data, 736
        physical access, 736
        programming offices, 738
        test processes, 736
    when to use, 228
    work papers, 763
    workbench concept, 734–735
software summary, document development, 172
software systems, V-concept testing, 76–77
software version changes, operational testing, 509–511
source code analysis, post-implementation analysis, 579
source-data, preventive controls, 658–659
space allocation, monitoring, 773
span of operation criteria, documentation, 174
special-value testing, 237
specifications
    implementing, risks associated with, 38–39
    system test plan standards, 81–82
    variance from, 65
spiral testing
    methodology, 588
    RAD testing, 638
sponsor responsibilities, 590
SRS (System Requirements Specifications), 595
SSS (system security specialist), 591
staff
    administrative/organizational barriers, 865
    client/server testing readiness, 615
    competency
        barriers, 865
        efficiency measures, 852–853
        profile information, 214
    experience evaluation, 600
    people management through, 871
    software security testing, 751
    verification testing concerns, 294
stakeholder perspective, improvement barriers, 861–864
standards
    development activities, 753
    documentation process, 770
    policy criteria, 45
    system test plan, 79–82
    unit test plan, 83
start dates
    Project Status reports, 468
    timelines, 838
start time delays, inspection process concerns, 255
starting early, test plan development, 262
startup failure, post-implementation analysis, 579
state machines, functional testing, 238
state of requirements, 592
state of the art technology, tool use, 108
statements, testing of, 239
static analysis
    program verification, 55
    programming testing, 324
    verification testing, 293
statistics
    operations process, 773
    profile information, 215
    statistical testing, 241
status. See project status calculation
stop it from happening concept, stakeholder perspective, 862
storage areas, software security testing, 746
storage efficiency, best practices, 850
strategies
    business innovation, 878
    converting to testing tactics, 83–85
    risk assessment, 56–57
    system development phases, identifying, 56
    test factors, selecting and ranking, 56
    V-concept testing, 74–75
strengths, building on, 857–860
stress testing
    examples of, 223
    how to use, 224
    objectives, 224
    test data for, 430
    tools associated with, 104
    validation testing, 435
    Web-based systems testing, 804, 808
    when to use, 224
stress/performance scripting, 431
strong controls, internal controls testing, 677
structural desk debugging, 325
structural testing. See also functional testing
    advantages/disadvantages, 69
    branch testing, 239
    conditional testing, 234
    COTS software testing, 703–704
    defined, 69
    expression testing, 240
    feasibility reviews, 70
    multiplatform environment testing, 723–725
    path testing, 240
    reasons for, 69
    requirements reviews, 70
    statement testing, 239
    stress testing
        examples of, 223
        how to use, 224
        objectives, 224
        test data for, 430
        tools associated with, 104
        validation testing, 435
        Web-based systems testing, 804, 808
        when to use, 224
    structural analysis, 238–239
    types of
        validation techniques, 69–70
        verification techniques, 69–70
    work papers, 87–91
structure components, acceptance testing, 492
substantiation objectives, cycle control objectives, 675
success factors
    effectiveness measures, 852
    people management, 871
    verification testing, 293
summaries, report, 479–480
Summary Status reports, 466–468
symbolic execution tools, 107, 239
syntactical desk debugging, 325
system access, application risks, 304
system analyst perspective, software development methodologies, 593
system boundary diagrams, 500–501
system building concepts, workbench concept, 72
system chains, revised testing approach, 50
system component identification, test plan development, 221
system control objectives, internal controls testing, 674
system development life cycle. See SDLC
system development processes, data warehouse testing, 770–771, 779
system independence, best practices, 850
system log tools, 107
system maintenance, test tactics, 75
System Requirements Specifications (SRS), 595
system security specialist (SSS), 591
system skills, tool selection considerations, 111, 113–114
system testing
    as functional tests, 70
    validation, 221
    Web-based systems testing, 807
system/subsystem specifications, document development, 173

T
talking the talk, management support, 51
target dates, project status calculation, 194, 468
teachable processes, 154
teams
    agile testing process, 820–821
    appointing, 168–171
    assessment
        baseline development, 10
        verification testing, 317
    baseline information gathering, 742–743
    combination, 170–171
    composition, 169
    cultural barriers, 875
    design reviews, 321
    external, 170
    internal, 169–170
    non-IT, 170
    report measurement, 465
    risk team establishment, 302–303
    tester agility, 842
    walkthroughs, 313
technical interface activities, 752
technical reviews, configuration management, 603
technical risk assessment, work papers, 93–96
technical skills, tool selection considerations, 111, 114
technical status, Summary Status reports, 467
techniques versus tools, 103
technological developments, revised testing approach, 50
technology issues, COTS software testing challenges, 690
telephone and network switching equipment, test plan development, 222
tenure, tool managers, 120
termination analysis, post-implementation analysis, 579
test case generators, Web-based systems testing, 809
test data generator tools, 107
test factors
    programming testing, 326–328
    selecting and ranking, 56
    verification testing, 299–302
    work papers, 62
test managers, 167–168
test matrix
    batch tests, 248
    defined, 245
    online system test, 248
    risk assessment, 57
    software methods, 246
    structural attributes, 246
    test matrix example, 246
    verification tests, 248–249
test plan development
    administrative components, 250–251
    automated test tools, 221
    check procedures, 262
    constraints, 214
    contingency plans, 222
    core business areas and processes, 221
    cost/schedules, 214
    customer and user involvement, lack of, 210
    data exchange issues, 222
    databases built/used, 214
    deliverables, 213
    facility requirements, 222
    flexibility, 262
    implementation, 214, 251
    input, 212
    inspection, 254–258
    interfaces, to other systems, 214
    lack of testing tools, 210
    lack of training concerns, 210
    legal/industry issues, 214
    lose-lose situations, 211
    objectives, 210, 213, 245
    output, 262
    over-reliance, 210
    potential failures, assessing severity of, 221
    projects, profiling, 212–215
    RAD (rapid application development), 210
    readability, 262
    resources needed, 221
    reviews, 262
    risk assessment, 27, 215–218
    schedule issues, 222
    staff competency, 214
    starting early, 262
    statistics, 214
    system component identification, 221
    telephone and network switching equipment, 222
    test matrix development, 245–248
    testing concerns matrix, 219–220
    testing technique selection, 222–223
    unit testing and analysis, 235
    us versus them mentality, 210
    validation strategies, 221
    V-concept testing, 157–158
    work papers
        administrative checkpoint, 273–274
        batch tests, 267
        general information, 271
        inspection report, 276–280
        milestones, 272
        moderator checklist, 275
        objectives, 264
        online system tests, 268
        quality control, 283–287
        software module, 265
        structural attribute, 266
        test matrix, 270
        verification tests, 269
    workbench concept, 211
    written example, 252–254
test processes
    vulnerabilities, 736
    work papers, 24–27
test room configurations, multiplatform environment testing, 723
test scripts
    developing, 430–432
    discussed, 104
    execution, 433
    integration scripting, 431
    maintaining, 434
    pseudo-concurrency scripting, 431
    regression scripting, 431
    results analysis, 433
    stress/performance scripting, 431
    tools, 107
    unit scripting, 431
test to business effectiveness, 578
test verification, SDLC, 53, 55
testability, software quality factors, 844
test-data approach, internal controls testing, 669–672
testers
    acceptance testing roles, 44
    change recommendation roles, 44
    competency
        baseline development, 14–16
        CBOK (Common Body of Knowledge), 125–128
        competent assessment, 128
        CSTE (Certified Software Tester) certificate, 125, 127–128
        fully competent assessment, 128
        job performance roles, 126–127
        negative attitudes, 130
        not competent assessment, 128, 130
        training curriculum development, 128–130
        work papers, 28–32
    developmental improvement roles, 44
    documentation specification requirement roles, 44
    motivation factors, 51
    process improvement roles, 44
    requirements measures, 594
    risk appetite, 38
tester’s workbench. See workbench concept
Testing Action report, 477
testing guidelines. See guidelines, testing
testing methodology cube, 83, 85
testing metrics, post-implementation analysis, 578
testing strategies. See strategies
testing tactical dashboard indicators, 7
third priority, probable penetration points, 760
throughput testing, Web-based systems testing, 804
time improvement ideas, variability measures, 841
time-compression efforts
    best practices, 825
    calendar-day efficient, 823–824
    challenges, 824
    implementation procedures, 886–888
    readiness criteria, 826
    solutions to, 825
    V-concept testing, 826–827
    work papers, 828–830
    workbench concept, 834
timelines
    change estimation, 837
    critical path definition, 836
    documentation, 180–181
    end dates, 838
    in control processes, 832
    input, 837
    no-rework days, 839
    out of control processes, 832
    output, 837
    project name, 838
    report generation, 837
    start dates, 838
    work papers, 896–905
    workday, 839
timeouts, Web-based systems testing, 807
tool managers
    assistant, 119
    mentors, 119
    need for, 117
    positions, prerequisites to creating, 118
    responsibilities of, 117, 119–120
    skill levels, 118–119
    tenure, 120
tools
    automatic, 104
    boundary value analysis, 105
    capture, 105
    cause-effect graphing, 105
    checklist, 105
    code comparison, 105
    compiler-based analysis, 105
    confirmation, 105
    control flow analysis, 105
    correctness proof, 105
    costs, 114–116
    data dictionary, 105, 770
    data flow analysis, 105, 238–239
    database, 105
    design reviews, 105
    design-based functional testing, 105
    desk checking, 106
    disaster testing, 106
    documentation, 124
    error guessing, 106, 721–722
    examination, 105
    executable specs, 106
    fact finding, 106
    flowchart, 106
    inspection, 106
    instrumentation, 106
    integrated test facility, 106
    lack of, 210
    life cycle phase testing, 109–111
    logging, 107
    manual, 104
    mapping, 106
    matching to its use, 109
    matching to skill levels, 111–114
    modeling, 106
    most used, 108
    parallel operation, 106
    parallel simulation, 107
    peer review, 107
    playback, 105
    ratio, 107
    relationship, 107
    replacements, 120
    risk matrix, 107
    scoring, 107
    selection considerations, 108–109, 121–123
    snapshot, 107
    specialized use, 108
    state of the art technology, 108
    stress testing, 104
    symbolic execution, 107, 239
    system log, 107
    techniques versus, 103
    test data, 107
    test script, 107
    tracing, 108
    use case, 108
    utility program, 108
    walkthrough, 108
traceability
    best practices, 849
    requirements measures, 594
tracking systems, project status calculation, 190
traditional system development, test tactics, 75
training
    barriers/obstacle solutions, 868
    baseline awareness, 747
    best practices, 850
    collection methods, 747
    concerns, baseline information, 743
    curriculum, 128–130
    development activities, 753
    failures, operational testing, 524
    forms completion, 747–748
    lack of, 210
    management support needs, 51
    manual support testing, 229
    material, operational testing, 519–522
    tool usage, 116–117
transaction flow testing, 672–673
transaction processing events, 414, 724
transactions
    authorization, 755
    destruction, 756
    origination, 755
    retrieval, 756
    turnaround time, stress testing, 223
    usage, 756
transferability, critical success factors, 696
treasury, cycle control objectives, 675
troubled projects, inadequate estimation, 181–182
turnaround documents, preventive controls, 659

U
unauthorized access, 766
undetected defects, 66
unit scripting, 431
unit testing
    as functional tests, 70
    quality control, 244
    test plan standards, 83
    validation, 221
    Web-based systems testing, 807
unknown conditions
    COTS software testing challenges, 689
    undetected defects, 66
update controls, enterprise-wide requirements, 767
upgrades, tool manager duties, 119–120
us versus them mentality, 210
usability
    requirements measures, 594
    software quality factors, 844
    Web-based systems testing, 800, 806, 808
usage
    enterprise-wide requirements, 768
    training testers in, 116–117
    transaction, 756
use cases
    acceptance testing, 500–503
    tools, 108
    validation testing, 412
user acceptance
    as functional tests, 70
    implementation procedures, 885
    Web-based systems testing, 808
user education, client/server testing readiness, 614
user manuals, document development, 174
users
    functional testing phases, 70
    outside users, revised testing approach, 50
    participation, post-implementation analysis, 578
    profile information, 213
    reaction evaluation, 576
    roles and responsibilities, 6, 498, 590
    skills, tool selection considerations, 111–112
utilities and commands, operations activities, 754
utility program tools, 108

V
validation testing
    acceptance testing, 221
    audit trails, 435
    authorization, 434
    check procedures, 439
    compliance, 434
    concerns, 410
    correctness, 435
    coupling, 434
    design goals, defining, 414
    disaster testing, 436
    ease of operation, 436
    ease of use, 435
    execution, 434–436
    file design, 413–414
    file integrity, 435
    functional testing, 69–70
    guidelines, 439–440
    input, 411
    inspections, 436
    integration testing, 221
    maintainability, 436
    objectives, 410
    output, 439
    overview, 408
    payroll application example, 416–429
    performance, 434
    portability, 436
    preventive controls, 659–661
    recovery testing, 435
    reliability, 434
    results, documenting
        conditions, 436, 438–439
        deviation, 437
        effect, 436, 438
    security, 435
    service levels, 435
    stress testing, 435
    structural testing, 69–70
    system testing, 221
    test data
        creating, 415–416
        entering, 414
        sources, 412
        for stress testing, 430
    test plan development, 221
    test scripts
        developing, 430–432
        execution, 433
        levels of, 430
        maintaining, 434
        results analysis, 433
    transaction-processing programs, 414
    unit testing, 221
    V-concept testing, 158
    work papers
        audit trails, 445
        compliance, 443, 450
        correctness, 451
        coupling, 455
        ease of operations, 456
        ease of use, 452
        file integrity, 444
        functional testing, 442
        maintainability, 453
        performance, 448
        portability, 454
        problem documentation, 457
        quality control, 458
        recovery testing, 446
        security, 449
        stress testing, 447
        test script development, 441
    workbench concept, 410–411
value received, administrative/organizational barriers, 866
variability measures
    agile testing and, 820
    check procedures, 834
    competency measures, 835
    discussed, 831
    external/internal work processes, 835
    quality control, 841
    reduction, 835
    rework factors, 835
    root cause identification, 840
    time improvement ideas, 841
    timelines
        change estimation, 837
        critical path definition, 836
        end dates, 838
        in control processes, 832
        input, 837
        no-rework days, 839
        out of control processes, 832
        output, 837
        project name, 838
        report generation, 837
        start dates, 838
        workday, 839
    workbench concept, 833–834
variance from specifications, 65
V-concept testing
    acceptance testing, 158–159
    analysis, 158
    customization, 160–161
    development project types, 75
    discussed, 72–73
    facts, managing, 162
    importance of, 68
    multiplatform environment testing, 725–726
    objectives, 159–160
    operational testing, 158–159
    organization, 157
    overview, 156
    post-implementation analysis, 159
    process management, 161–162
    project phases, 79
    project scope, 77
    reporting, 158
    results management, 162
    risk identification, 77–79
    software system types, 76–77
    strategic objectives, determining, 74–75
    strategies, converting to testing tactics, 83–85
    test plan development, 157–158
    test plan standards, 79–82
    testing methodology cube, 83, 85
    time-compression efforts, 826–827
    unit test plan standards, 83
    validation testing, 158
    verification testing, 158
    workbench concept with, 162–163
vendors
    COTS implementation risks, 689
    interface activities, 752
    reputation importance, 690–691
    tool manager duties, 119
verification testing
    adequate control assessment, 310
    application risks, 304–308
    base case, 293
    baseline development, 13
    check procedures, 330
    concerns, 294
    control objectives, 303
    debugging, 292
    design deliverables, inspecting, 322
    design phase, 296–297
    design reviews, 320–322
    discussed, 298
    functional testing, 69–70
    guidelines, 331–332
    input, 296–297
    inspections, 292
    for large documents, 248–249
    objectives, 293
    output, 331
    programming phase, 297, 323–324
    programming testing
        acceptance testing, 324
        complexity of, 323
        desk debugging, 325–326
        dynamic testing, 324
        importance of, 323
        peer reviews, 328–330
        static analysis, 324
        test factor analysis, 326–328
    project leader assessment, 317
    requirements phase, 296
    requirements tracing, 292, 314–315
    reviews, 292
    risk matrix, 293, 302
    scoring success factors, 316–318
    static analysis, 293
    structural testing, 69–70
    success factors, 293
    team assessment, 317
    test factor analysis, 310–312, 318–319
    test factors, 299–302
    V-concept testing, 158
    walkthroughs, 292, 312
    work papers
        access defined, 347, 366
        audit trails, 363, 394
        authorization rules, 342, 359–360, 391–392
        business system design review, 377–380
        computer applications risk scoring form, 349–353
        computer processing control procedure, 354
        computer systems design review, 381–385
        contingency planning, 364, 395
        coupling, 404
        data integrity controls, 357–358, 389–390
        design compliance, 367–371
        design criteria, 374–375
        failure impact, 345
        file integrity, 343, 361–362, 393
        functional specifications, 334
        interface design, 373
        maintenance specifications, 336
        needs communicated, 376
        online processing controls, 355–356
        operation procedure development, 405
        operational needs, 340
        output control, 355
        performance criteria, 339
        portability needs, 337
        program compliance, 398–402
        quality control, 348, 387–388, 405
        reconstruction requirements, 344
        requirements compliance with methodology, 333
        security implementation, 397
        service levels defined, 346, 365, 396
        system interface factors, 338
        tolerances, 341
        usability specifications, 335
    workbench concept, 294–295
verification, validation, and testing (VV&T), 589
version changes, operational testing, 509–511
video cards, Web-based systems testing, 805
viruses, Web-based systems testing, 803
vulnerabilities
    central processors, 738
    computer operations, 738
    computer programs, 736–737
    data and report preparation facilities, 737
    data handling areas, 738
    digital storage facilities, 738
    discussed, 735
    impersonation, 737
    input data, 736
    IT operations, 736
    media, 737
    non-IT areas, 738
    online terminal systems, 738
    operating system access and integrity, 737
    output data, 736
    physical access, 736
    programming offices, 738
    test processes, 736
VV&T (verification, validation, and testing), 589

W
walkthroughs
    of customer/user area, 212–213
    final reports, 314
    presentations, 313–314
    questions/recommendations, responding to, 314
    rules, establishing, 312–313
    team selection, 313
    tools, 108
    verification testing, 292, 312
WAN (wide area network), 800
waterfall methodology, 587
WBS (Work Breakdown Structure), 603
weak controls, internal controls testing, 678
weaknesses, minimizing, 857–860
Web sites
    QAI (Quality Assurance Institute), 15, 125
    Software Certifications, 125
Web-based systems testing
    access control, 803
    authorization, 803
    back-end processing, 800
    bandwidth access, 805
    browser compatibility concerns, 800, 805–806
    caching, 805
    calculation correctness, 804
    check procedures, 809
    code verification concerns, 800
    compatibility concerns, 804–805, 808–809
    component testing, 807
    concurrency testing, 804
    correctness concerns, 800, 804
    crashes, 807
    dropped lines, 807
    dynamic page generation, 806
    e-mail functions, 806
    file downloads, 806
    graphics filters, 805
    guidelines, 810
    hardware compatibility, 805
    HTML tools, 809
    input, 801–802
    integration concerns, 800
    integration testing, 807
    load testing, 808
    lost connections, 807
    memory, 805
    monitors, 805
    multimedia support, 805
    navigation correctness, 804
    output, 810
    overview, 799
    performance concerns, 800, 803–804
    performance testing, 808
    print handling, 805
    recoverability concerns, 807
    regression testing, 808
    reliability concerns, 806
    reload pages, 805
    risk assessment, 802–803
    security concerns, 800, 803
    site validation tools, 809
    stress testing, 804, 808
    system testing, 807
    test case generators, 809
    throughput testing, 804
    timeouts, 807
    unit testing, 807
    usability concerns, 800, 806
    usability testing, 808
    user acceptance testing, 808
    video cards, 805
    viruses, 803
    work papers
        quality control checklist, 815
        risk assessment, 812
        testing tool selection, 814
        testing types, 813
    workbench concept, 800–801
weighted criteria score, documentation, 175–177
wide area network (WAN), 800
Work Breakdown Structure (WBS), 603
work flow, COTS software testing, 698–699
work papers
    acceptance testing
        acceptance criteria, 527
        acceptance plan form creation, 544–545
        automated application criteria, 566–567
        change control forms, 546
        checklist, 550–551
        data change forms, 547–548
        deletion instructions, 538–539
        installation phase, 531
        inventory material, 552–553
        production change instructions, 536–537
        program change form completion, 540–541, 549
        program change history, 534–535
        quality control checklist, 560–563
        recovery planning data, 532–533
        system boundary diagram, 528
        system problem form completion, 542–543
        test cases, 530
        training checklist form completion, 558–559
        training failure notification, 568–569
        training module form completion, 556–557
        training plan form completion, 554–555
        training quality control checklist, 564–565
        use cases, 504–507, 529
    agile testing process, 923–927
    barriers
        communication barriers, 920–922
        cultural barriers, 919
        stakeholder analysis, 915–918
    best practices, 906–908
    CBOK (Common Body of Knowledge)
        individual competency evaluation, 149
        new information technology, 148
        project management, 135–138
        security procedure assessment, 146–147
        software controls, 146
        test environment, building, 133–135
        test planning, 138–143
        test status and analysis, 143–144
        test team competency evaluation, 150
        testing principles and concepts, 132–133
        user acceptance testing, 144–145
    client/server systems testing
        client data, 627–628
        footprint chart, 631
        quality control checklist, 632
        readiness results, 630
        security, 626–627
        standards, 628–629
        system installation, 625
    COTS (commercial off-the-shelf) software testing
        completeness tests, 707–708
        functional testing, 710
        quality control checklist, 712–715
        structural testing, 711
    data warehouse testing
        access control, 785
        activity process, 797
        audit trails, 784
        concerns rating, 788, 795–796
        continuity of processing, 792
        data, placing in wrong calendar period, 787
        documentation, 791
        fraud, 789
        inadequate responsibility assignment, 781
        inadequate service levels, 786
        incomplete data concerns, 782
        management support concerns, 794
        performance criteria, 793
        quality control checklist, 798
        reviews, 790
        update concerns, 783
    documentation
        completeness, 202
        estimation, 203–205
        quality control checklist, 207–209
        weighted criteria calculation, 201
    footprint chart, 23
    internal controls testing
        documentation, 679
        file control, 683
        input controls, 679–681
        output control, 682
        program and processing controls, 681–682
    methodologies, software development
        analysis footprint, 609
        self-assessment, 607–608
    multiplatform environment testing
        concerns, 728
        configurations, 728
        quality control checklist, 731–732
        validity, 729–730
    post-implementation analysis, 582
    RAD (rapid application development) testing
        applicability checklist, 644
        conceptual model development, 646–647
        logical data model development, 647
        production system development, 651–652
        production system release, 652
        quality control checklist, 653
        scope and purpose of system definition, 645–646
        specifications, revising, 650–651
        system development, 648–649
        test system release, 652
    reports
        defect reporting, 484–485
        quality control, 486–487
        writing guidelines, 488–489
    risk score analysis, 99
    self-assessment, 909–914
    size risk assessment, 97–98
    software security testing, 763
    software testing environment, 19–22
    structural testing, 87–91
    technical risk assessment, 93–96
    test factor ranks, 58–59, 61
    test plan development
        administrative checkpoint, 273–274
        batch tests, 267
        general information, 271
        inspection report, 276–280
        milestones, 272
        moderator checklist, 275
        objectives, 264
        online system tests, 268
        quality control, 283–287
        software module, 265
        structural attribute, 266
        test matrix, 270
        verification tests, 269
    tester competency assessment, 28–32
    time-compression efforts, 828–830
    timelines, 896–905
    tool use
        documentation, 124
        selection considerations, 121–123
    validation testing
        audit trails, 445
        compliance, 443, 450
        correctness, 451
        coupling, 455
        ease of operations, 456
        ease of use, 452
        file integrity, 444
        functional testing, 442
        maintainability, 453
        performance, 448
        portability, 454
        problem documentation, 457
        quality control, 458
        recovery testing, 446
        security, 449
        stress testing, 447
        test script development, 441
    verification testing
        access defined, 347, 366
        audit trails, 363, 394
        authorization rules, 342, 359–360, 391–392
        business system design review, 377–380
        computer applications risk scoring form, 349–353
        computer processing control procedure, 354
        computer systems design review, 381–385
        contingency planning, 364, 395
        coupling, 404
        data integrity controls, 357–358, 389–390
        design compliance, 367–371
        design criteria, 374–375
        failure impact, 345
        file integrity, 343, 361–362, 393
        functional specifications, 334
        interface design, 373
        maintenance specifications, 336
        needs communicated, 376
        online processing controls, 355–356
        operation procedure development, 405
        operational needs, 340
        output control, 355
        performance criteria, 339
        portability, 337, 372, 403
        program compliance, 398–402
        quality control, 348, 387–388, 405
        reconstruction requirements, 344
        requirements compliance with methodology, 333
        security implementation, 397
        service levels defined, 346, 365, 396
        system interface factors, 338
        tolerances, 341
        usability specifications, 335
    Web-based systems testing
        quality control checklist, 815
        risk assessment, 812
        testing tool selection, 814
        testing types, 813
workbench concept
    acceptance testing, 494
    client/server systems testing, 613
    COTS software testing, 692
    data warehouse testing, 766–767
    input component, 71
    internal controls testing, 667
    multiplatform environment testing, 719–720
    multiple workbenches, 72
    organization, 166
    output components, 71
    post-implementation analysis, 572–573
    procedures to check, 71
    procedures to do, 71
    process advantages, 163
    RAD (rapid application development) testing, 635–636
    results analysis, 460
    software security testing, 734–735
    system building concepts, 72
    test plan development, 211
    testing tools and, 71
    time-compression efforts, 834
    validation testing, 410–411
    variability measures, 833–834
    with V-concept testing, 162–163
    verification testing, 294–295
    Web-based systems testing, 800–801
workday timelines, 839
working documents, 178
world-class testing organization
    baseline development
        assessment teams, 10
        capabilities assessment, 13–14
        cause-effect diagram, 9
        drivers, 8
        environment assessment, 8
        footprint charts, 10
        implementation procedures, 9
        results assessments, 11–12
        verification, 13
    discussed, 3
    improvement planning, 16–18
    PDCA (plan-do-check-act) cycle, 4–5
    software development process, 4
    software testing model definition
        discussed, 5
        organization, 7
        self assessment, 6
wrong specifications, defects, 65, 471