SOUTHERN NAZARENE UNIVERSITY
ORGANIZATIONAL LEADERSHIP
EVALUATION OF THE UNITED STATES AIR FORCE
UNQUALIFIED COMPUTER DISPLAY MAINTENANCE
TECHNICIAN
END OF COURSE EXAMINATION
A PROJECT REPORT
SUBMITTED TO THE FACULTY
in partial fulfillment of the requirements for the
degree of
Bachelor of Science
by
RICARDO A. ARREDONDO
Group 208
Bethany, Oklahoma
2006
EVALUATION OF THE UNITED STATES AIR FORCE
UNQUALIFIED COMPUTER DISPLAY MAINTENANCE
TECHNICIAN
END OF COURSE EXAMINATION
A PROJECT REPORT APPROVED FOR THE
ORGANIZATIONAL LEADERSHIP PROGRAM
By
_______________________
Project Director
This report is not to be regarded as confidential and its use as a
sample in future classes is not restricted.
By
_______________________
Site Contact
TABLE OF CONTENTS
PAGE
ABSTRACT 1
CHAPTER 1. Introduction and Statement
of the Problem 2
Statement of Purpose 2
Organizational Context 2
Setting of the Problem 2
History and Background 6
Scope of the Project 8
Significance of the Project 9
Definition of Terms 10
CHAPTER 2. Review of the Literature 12
Overview of Computer Based Testing 13
Areas of Consideration for Validity in
Computer Based Tests 15
Computer Literacy and Overall Interaction
of Computer Technology 17
A Socioeconomic Approach to Computer
Based Assessment 19
Conclusion 20
CHAPTER 3. Methods and Procedures 22
Hypothesis 22
Data Source 22
Instrumentation 23
Procedure 24
Data Analysis 25
Limitations 25
CHAPTER 4. Summary of Results 26
Restatement of the Hypothesis 26
Descriptive Statistical Information 27
Results of Significance Test 30
Results of Needs Analysis 30
Status Quo 31
Implementation of CBT Familiarization 31
Alternative Suggestions 32
CHAPTER 5. Discussions and Conclusions 33
General Discussion and Conclusion 33
Strengths and Weaknesses of the Study 34
Recommendations 35
Suggestions for Future Research 37
REFERENCES 39
LIST OF FIGURES
FIGURE 1. Department of Defense Organizational Chart
FIGURE 2. USAF Mission and Vision Statement, 966th AACS
Mission Statement
FIGURE 3. 966th AACS Organizational Chart
FIGURE 4. Example Data Format
FIGURE 5. Unqualified Computer Display Maintenance Technician
Test and Background Data
FIGURE 6. Unqualified Computer Display Maintenance Technician
End of Course Examination Comparison
ABSTRACT
The 966th Airborne Air Control Squadron (AACS), under the governance of the United States Air Force (USAF) and the 552nd Operations Group (552nd OG), was the primary Replacement Training Unit (RTU) for all E-3 Airborne Warning and Control System (AWACS) crewmembers. The purpose of the 966th AACS was to provide all E-3 AWACS crewmembers with the skills necessary to complete the initial phase of their Professional Flight Training (PFT), including preparation for the 552nd Operations Group Standards and Evaluation Office End of Course Examination (552nd OGV EOCE).
Due to the trend of failures on the 552nd OGV EOCE, an evaluation was undertaken to identify the major contributor to 552nd OGV EOCE failures among Unqualified Computer Display Maintenance Technicians (UCDMTs). The project involved tracking the passing and failure rate data obtained from the 552nd OGV EOCE. Recommendations were then suggested to increase the passing rate.
Following the data gathering and research phase of this project, an analysis was performed on the validity of Computer Based Tests (CBTs), computer literacy, gender, and the socioeconomic factors of CBTs. The data obtained from the research supported the familiarization factor of CBTs. It was hypothesized that repetitious exposure to Computer Based Testing of similar content would familiarize UCDMTs with the format and thereby increase their mean scores.
Using an independent samples t test at the .05 level of significance, the effect of exposure to CBTs was determined to be not statistically significant. However, the project did increase the mean scores of UCDMTs. This increase was judged by the 966th AACS, 552nd OG, and the USAF to be valuable enough to implement familiarization and exposure to CBTs in all future UCDMT PFT.
Chapter 1
Introduction and Statement of the Problem
Statement of Purpose
The purpose of this project was to identify the major contributor to the failure rate of United States Air Force (USAF) 966th Airborne Air Control Squadron (AACS) Unqualified Computer Display Maintenance Technicians (UCDMTs) on the 552nd Operations Group Standards and Evaluation Office End of Course Examination (552nd OGV EOCE). Data were analyzed and examined from 1 July 2006 to 1 October 2006.
The project involved tracking the passing and failure rate data obtained from the 552nd OGV EOCE. A failure of the 552nd OGV EOCE delayed training by an average of two weeks and cost the USAF and the 966th AACS millions of dollars each year in productivity and remedial training. The result sought in this project was to find the major contributors to failure of the 552nd OGV EOCE. Recommendations were then suggested to increase the passing rate.
Organizational Context
Setting of the problem. President Harry Truman created the United States Air Force on September 18, 1947, when he signed the National Security Act into law. The National Security Act divided the United States Military into three separate and equal branches under the Department of Defense (DOD). Under the Secretary of Defense, the Department of the Air Force, as well as the Departments of the Army and Navy, fell under the Joint Chiefs of Staff (JCS). The Joint Chiefs of Staff resided in Washington, D.C. and advised the Secretary of Defense and the President of the United States in decisions about the military. Ultimately, the President of the United States was the Commander in Chief (CINC) of all the military forces.
------------------------------
Insert Figure 1 Here
------------------------------
The goals and focus of the United States Air Force were reflected in its Organizational
Mission Statement and Vision Statement.
------------------------------
Insert Figure 2 Here
------------------------------
The 966th AACS was under the direct command of the United States Air Force, through its Major Command (MAJCOM), Air Combat Command (ACC), the 552nd Air Control Wing (552nd ACW), and the 552nd Operations Group (552nd OG). The 552nd OG was divided into operational and training units. The 966th AACS was the primary Replacement Training Unit (RTU) for all E-3 Airborne Warning and Control System (AWACS) crews throughout the USAF. The 966th AACS was formed as an RTU in 1996 to provide academic and simulator training for all E-3 mission crewmembers. The 966th AACS provided initial and returning E-3 aircrew members with the skills necessary to complete the Initial Qualification Training (IQT) phase of the ACC Professional Flight Training (PFT).
UNITED STATES AIR FORCE MISSION STATEMENT
The mission of the United States Air Force is to deliver sovereign options for the defense
of the United States of America and its global interests -- to fly and fight in Air, Space,
and Cyberspace.
UNITED STATES AIR FORCE VISION STATEMENT
Global Vigilance, Reach and Power.
966TH AACS MISSION STATEMENT
The mission of the 966th AACS is to conduct combat crew flight training in tactics,
techniques and operations of assigned aircraft and associated equipment. The unit also
maintains the readiness state of personnel and equipment for dispersal and augmentation
of tactical forces as directed by higher authorities. The unit trains flight and mission
crews for the operational squadrons of the 552nd ACW. Instructors develop student study
guides and lesson plans required for use in the training environment. They provide
training during airborne missions as well as in mission simulators.
Figure 2. USAF Mission and Vision Statement, 966th AACS Mission Statement
The UCDMTs of the 966th AACS studied in this project were placed under the supervision of the Director of Operations, Gulf Flight (DOMG). The mission of the DOMG, as well as the 966th AACS, was to teach, train, and document. The ACC syllabus directly mandated instruction to include preparation for the 552nd OGV EOCE.
------------------------------
Insert Figure 3 Here
------------------------------
History and background. Since the inception of the 552nd OGV EOCE in 1996, a steady rate of failures among UCDMTs had been prevalent. The 552nd OGV EOCE had seen many changes in format and content; however, the failure rate remained steady. The implications of a failure of the 552nd OGV EOCE plagued the operational capability of the USAF, ACC, 552nd ACW, 552nd OG, and 966th AACS. Losses in productivity and remedial training placed an unnecessary strain on an exhausted ACC PFT and an already limited military budget.
Each year the 552nd OGV EOCE was revised and tested for accuracy by the 552nd OGV CDMT Subject Matter Experts (SMEs). The revisions were a direct reflection of predetermined course syllabi and plans of instruction created by ACC-level SMEs, as well as 966th AACS instructors. All 552nd OG CDMT evaluators were required to verify the base-level knowledge of the 552nd OGV EOCE through annual test evaluations to determine the necessity for development or alteration of the 552nd OGV EOCE.
Although the annual revisions changed the content of the 552nd OGV EOCE to reflect the course syllabi, the greatest change came with the computerization of educational testing. In 1999, all 552nd OGV EOCEs converted to Computer Based Testing (CBT). The goal was to eliminate human error in scoring, as well as to effectively eliminate compromises of test integrity. The change in format allowed 552nd OGV EOCE proctors to efficiently test and grade all examinees on a larger scale. The downside of computerization was that the failure rate of all aircrew members of the E-3 AWACS, including UCDMTs, increased.
Scope of the project. The 966th AACS was responsible for all E-3 AWACS aircrew training. The UCDMT course was only 1 of 14 position-specific crewmember training programs enacted by the 966th AACS. The training involved in assuring UCDMTs passed the 552nd OGV EOCE and progressed on to their operational squadrons accumulated an average of three to four months of PFT.
The average PFT time it took to graduate a UCDMT was vitally important to the manning of the 552nd ACW, ACC, and the USAF. PFT days were tracked by the 966th AACS, 552nd ACW, and ACC. The average number of PFT days it took to graduate a UCDMT was used to fill the manning orders of the USAF. The manning orders were used through all levels of command, from USAF recruiters to the operational capability of the 552nd ACW, and ultimately the capability of the USAF.
On average, a failure of the 552nd OGV EOCE delayed a UCDMT from graduating by two weeks. The two-week delay required remedial classroom and possibly flight training. This delay equally affected the readiness date of the operational squadron's expected usage of the UCDMT. Readiness delays caused the operational squadrons to be understaffed and directly affected the war-fighting capabilities of the 552nd ACW squadrons and the USAF.
Significance of the Project
The significance of this project was vital to the 966th AACS DOMG as well as to the USAF as a global entity. Delays in training not only caused conflicts in the scheduling of remedial training, but more importantly caused the loss of millions of DOD budget dollars.
Identifying and correcting the major contributors to UCDMT failures of the 552nd OGV EOCE not only eliminated the delays in training and manning felt by the operational squadrons of the 552nd ACW and the USAF, but also identified the effectiveness of the training provided by the instructors assigned to the 966th AACS.
In identifying the major contributors to the increasing failure rate of the 552nd OGV EOCE by UCDMTs, the data gathered provided a source of recommendations for specific training programs, with direction toward identifying issues in the other specific PFT programs of the 966th AACS. These recommendations were also suggested as a basis for a strategic plan.
Definition of Terms
AACS – Airborne Air Control Squadron. AACSs are base-level organizations in the USAF.
ACC – Air Combat Command. ACC is the MAJCOM responsible for air combat forces
within the USAF. ACC operates fighter, bomber, reconnaissance, battle-management,
rescue, theater airlift aircraft, and command, control, communications, intelligence, and
reconnaissance (C2ISR) aircraft.
ACW- Air Control Wing. An ACW is a subunit of the Major Command (MAJCOM)
structure. The 552nd
ACW is responsible for all E-3 Airborne Warning and Control
System (AWACS) operational squadrons.
AWACS – Airborne Warning and Control System. The E-3 Sentry or AWACS is an
airborne surveillance and command and control platform that functions for tactical and
air defense forces throughout the world.
CDMT – Computer Display Maintenance Technician. The CDMT is responsible for the
operational function of all onboard computing done by the E-3 mainframe computer. A
(U), (I), or (E) before the CDMT identifier identifies an Unqualified, Instructor, or Evaluator, respectively, e.g., UCDMT.
DOD – Department of Defense. The U.S. department charged with ensuring national security and overseeing the armed forces.
DOMG – Director of Operations Gulf Flight. A subunit of the 966th AACS, it is
responsible for the training of UCDMTs.
EOCE – End of Course Examination. The final exam given by the 552nd
Standards and
Evaluations Office (OGV) to all unqualified E-3 AWACS crewmembers.
MAJCOM – Major Command. Assigned directly under the Secretary of Defense and the
Secretary of the Air Force, MAJCOMs are responsible for their specific logistical
capabilities. ACC is the MAJCOM for the 966th AACS.
OGV – Standards and Evaluations Office. The office responsible for the enforcement of
all 552 Operations Group, as well as USAF level regulations and instructions.
IQT – Initial Qualification Training. The training phase an E-3 crewmember completes while conducting their initial crew-specific training.
PFT – Professional Flight Training. The training received from the 966th AACS.
SME – Subject Matter Expert. A SME is a source of great knowledge in their crew
specific position.
RTU – Replacement Training Unit. A unit or squadron responsible for providing the training necessary to fulfill a basic mission qualification status on board the E-3 AWACS.
USAF – The United States Air Force. The most dominant and technologically superior
air and space force the globe has ever seen.
Chapter 2
Review of the Literature
The progression of technology and computers had affected the way most people completed their jobs on a daily basis. Computerization was more efficient and cost effective, and required less human interaction to accomplish a given task (Russell & O'Conner, 2002). One of the greatest benefits of computerization was its benefit to educators. Educators at all levels of instruction, including those involved in professional military education, increasingly utilized computerized curriculum, particularly Computer Based Tests (CBTs).
CBTs could be used to assess almost any subject and any student. However, the development of a valid and useful CBT to assess the comprehension of a student was the responsibility of the teacher or CBT developer. Many variables had an impact on the worth of a CBT. The importance of developing psychometrically sound tests had been discussed in the literature for decades, although the application of these concepts had been neglected (Bridge, Musial, Thomas, & Sawilowsky, 2003).
Even though CBTs offered great advantages in many applications beyond education, the challenges to valid CBTs grew exponentially as their usage increased (Parshall, Spray, & Davey, 2002). The question was not if, but when, all important assessment programs would attempt to incorporate the emerging technologies. In the path between technology's promise of significantly more effective assessment and the fulfillment of that promise stood a number of equally significant barriers (Rabinowitz & Brandt, 2001).
Since the focus of this paper was to find the underlying cause of failures of the computer based 552nd Operations Group Standards and Evaluations Office End of Course Examination (552nd OGV EOCE) among United States Air Force (USAF) 966th Airborne Air Control Squadron (AACS) Unqualified Computer Display Maintenance Technicians (UCDMTs), this chapter focused on an overview of computer based testing, the areas of consideration for validity in computer based tests, computer literacy and overall interaction with computer technology, and the socioeconomic approach to computer based assessment.
Overview of Computer Based Testing
Computer Based Testing was defined as any test given electronically via a computer interface. When defining CBTs, one must also define CATs, or Computer Adaptive Tests. CATs were the most widely used version of CBTs (Noyes & Robbins, 2004). Where CBTs were an exact representation of a paper based assessment, usually multiple choice, CATs actually adapted to the user's answers; i.e., a correct answer resulted in a more difficult question and vice versa.
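The adaptive rule just described, harder after a correct answer, easier after an incorrect one, can be sketched in a few lines of Python. This is an illustrative toy only: the difficulty-level bounds, starting level, and response sequence are hypothetical and do not reflect any operational CAT such as the CAT-ASVAB.

```python
def adapt_level(level, correct, min_level=1, max_level=5):
    """Core CAT rule: a correct answer raises the difficulty level,
    an incorrect answer lowers it, within hypothetical bounds."""
    if correct:
        return min(level + 1, max_level)
    return max(level - 1, min_level)

def run_cat(responses, start_level=3):
    """Walk a sequence of correct/incorrect responses and return the
    difficulty level presented at each step plus the final level."""
    levels = []
    level = start_level
    for correct in responses:
        levels.append(level)
        level = adapt_level(level, correct)
    return levels, level

# Two correct answers, one miss, then another correct answer.
levels, final = run_cat([True, True, False, True])
# levels presented: [3, 4, 5, 4]; final level: 5
```

A real CAT would also select specific items at each level and estimate ability statistically rather than stepping one level at a time, but the up-on-correct, down-on-incorrect behavior is the defining contrast with a fixed-form CBT.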
In the research reviewed, the majority of the assessments discussed were CBTs. CBTs provided an instant assessment of the individual being tested. Computerized results were instantly available to the test proctor, providing an immediate view of the examinee's comprehension level and even the areas needing improvement.
Teachers, supervisors, or college entrance examiners could directly assess the level and quality of prior education (Russo, 2002). The opportunities for integrating teaching and assessment were endless. Computer Based Testing could not only immediately assess the skills of the test taker, but also reflect the overall effectiveness of the teacher's test preparation (Rex & Nelson, 2004).
Computer based testing also addressed many drawbacks of then-current testing practices, including scoring errors, lost mail, postage and handling expenses, diminished classroom instruction time, and the high cost of human scorers for written exams (Bociji & Greasley, 1999). The convenience of computer-based testing programs was vitally important. Computer based tests provided near-instant gratification to superintendents, teachers, and parents by eliminating long waits for test results (Russo, 2002).
Coincidentally enough, the first psychometric tests were utilized by the United States Military after World War II (Mills & Potenza, 2002). Since the 1970s, a tremendous amount of research had been done on CATs and CBTs. For example, the book Computerized Adaptive Testing: From Inquiry to Operation (Sands, Waters, & McBride, 1997) chronicled the work of the military psychometricians who created the CAT-ASVAB, or Computerized Adaptive Test-Armed Services Vocational Aptitude Battery. The CAT-ASVAB, which surveyed about one fourth of all high school and post secondary schools nationwide each year and was the largest distributed computerized assessment in the U.S., spawned the technology that many educators used today (Barker, 2002). In its infancy, the CBT was accessed through a mainframe computer connected to numerous dumb terminals. Through the advancement of technology, smaller and faster computers allowed numerous outlets to utilize CBTs delivered on CD-ROM.
Networking technology in the late 1990s and 2000s allowed users from around
the globe to access endless amounts of information via the internet. Specifically in the
summer of 1996, aptitude testing for enlistment in the Armed Services of the United
States converted to a higher technology with the implementation of the CAT-ASVAB in
all 65 Military Entrance Processing Stations (MEPS) (Curran & Jordan, 1996).
CBTs grew from a simple question and answer format to an interactive audio and video medium. CBTs integrated multimedia sources and looked toward the future with adaptive testing and artificial intelligence integration. In some cases, tests were developed for or converted to a computer-based format for no better reason than the trend value (Parshall et al., 2002). The active and ongoing conversion to computer based learning and CBTs introduced many problems to the effectiveness of this new technology.
The use of computer technology for academic assessment grew rapidly from its inception. The questions raised by this new technology were endless; however, the most encompassing question about CBTs was how to maintain validity when creating one.
Areas of Consideration for Validity in Computer Based Tests
There had been rapid advancement in the technology and applied methods of computer based testing. Many strides had been made since the creation of CBTs, but many more systematic approaches had to be taken into account before a computer based test was implemented (Togo, 2002). One such systematic method for creating a valid CBT was the blueprint method.
A test blueprint was a tool used in the process of generating content-valid exams by linking the subject matter delivered during instruction to the items appearing on the test. These procedures, as well as other educational measurement practices, were often overlooked (Bridge et al., 2003). Another factor to analyze when determining the validity or need of a CBT was the "test mode effect." The test mode effect was the result observed when identical paper and computer-based tests did not obtain the same results (Clariana & Wallace, 2002).
Numerous books, articles, journals, and methods were found during the review of the research. The main objective was to find a common formula for creating a valid CBT that could be applied to the UCDMTs' 552nd OGV EOCE at the 966th AACS. One such method for creating a content-valid computer based assessment was based on four main principles: ensure the test actually measures its purpose; develop a content-relevant test based on representative test items; write test items clearly and to a high quality; and have subject matter experts review the test for quality (Bridge et al., 2003).
The majority of the computer based assessments available neglected to reflect the material being assessed. The focus on content was lost, and more emphasis was placed on the aesthetics of the CBT, which pushed aside such topics as the sequencing of the assessment. Past research had shown that topical sequencing of identical questions resulted in higher achievement on identical content assessments (Togo, 2002). However, the difference from randomization of test questions was slight, and the varying data had yet to provide solid evidence. Although topical sequencing had not been addressed in a computer-based format, could this structure be applied? Test blueprinting, the test mode effect, and sequencing were just a few of the many paths of less resistance toward understanding the complexity of computer-based assessment.
Given that developers of CBTs had created a true and valid assessment using the aforementioned techniques to eliminate any ambiguities, how could students or test takers still fail or perform poorly on CBTs? Even a perfectly developed CBT would seem strange and difficult to a student who had never used the technology. The computer literacy of the subject had to be a factor in the CBT equation.
Computer Literacy and Overall Interaction of Computer Technology
A survey of high school students regarding Oregon's online testing system showed that CBTs were determined to be faster and more enjoyable than the traditional paper and pencil assessment (Park, 2003). After completing the CBT, the students also felt that they achieved a higher score on the computerized version compared to its predecessor (Park, 2003). Was the positive attitude determined by familiarity with computerization?
Not all research showed that computer based assessments were less stressful. Using the NASA-TLX (task load index), a tool used in the assessment of cognitive workload, researchers found that a higher workload rating was associated with completing a computer based assessment as opposed to its identical paper based counterpart (Noyes & Robbins, 2004). This finding suggested that the workload of completing curriculum via computer could add to the already stressful workload of testing alone.
The article entitled Paper-Based vs. Computer-Based Assessment, from the British Educational Communications and Technology Agency, stated that "we anticipate that computer familiarity was the most fundamental key factor in the test mode effect."
Familiarity, experience, and preparation helped improve perceptions and opinions of the CBT experience. Another area of concern that researchers found in the search for validity of CBTs was not just the content of the test itself, but the Attitude Toward Computer Assessment Scale, or ATCAS (Smith & Caputi, 2004). The ATCAS was developed to explore examinees' emotional, perceptual, and attitudinal reactions toward computerized testing relative to conventional testing methods (Smith & Caputi, 2004). A similar study done by the University of Iowa showed that an early step in the evaluation of CBTs was to ensure that the exam format measured the examinees' knowledge and not their comfort level or confidence with the technology. Therefore, it was important that the CBT reproduce or accommodate traditional test-taking behavior (Peterson, Gordon, Gordon, Elliott, & Kreiter, 2002).
In the review of the literature, a different approach to the familiarity of CBTs was the issue of gender differences and measured performance. Could gender be a determining factor in the overall competency and familiarity required for a valid computer based assessment? Although a small determining factor, gender differences with regard to perceived self-efficacy expectations and attitudes towards computers represented an important issue in the area of computer education (Busch, 1995).
Overall comfort with computers could aid performance on computer-based assessments; however, considering that the majority of UCDMT students from the 966th AACS were males with an average age of 22 years, gender was hard to isolate as a factor in the failure rate of the 552nd OGV EOCE. Regardless, the research had to be done to eliminate any ambiguity concerning familiarity as a major factor in determining performance on computer based assessments. Research had shown that male students did, in fact, have significantly less anxiety with technology, but showed only a slight advantage in computer literacy and computer based assessments (Busch, 1995). However, due to the dramatic increase in the availability and necessity of computer literacy in current society, the research found was outdated and inconclusive. Given the lack of documented research, it could only be determined that the gender gap was rapidly closing and was not as much of a factor as literacy and attitude.
A Socioeconomic Approach to Computer Based Assessment
The final area of research focused on the socioeconomic approach to computer based assessment. Computer literacy was a growing problem among low income and ethnically diverse college students, a population similar to the pool of UCDMT students of the 966th AACS. Similarly, the USAF, like many universities throughout the United States, assumed that entry-level students possessed the computer skills necessary to integrate computer based training or testing into their learning (Chisholm, Carey, & Hernandez, 1999).
Could the level of income, which may have directly affected familiarity with information technology, be a particular challenge for ethnically diverse students? The review of the literature displayed many parallel problems that the USAF and educators around the world shared. Overall competency with computer technology was a determining factor in performance on computerized assessments. Universities, as well as the USAF, seldom stopped to determine whether the required computer competencies existed equally among students of all ethnicities (Chisholm, Carey, & Hernandez, 1998). Moreover, evidence showed that minority students from the University of Arizona West were exposed to computer technology in their later years, as opposed to the majority of Asian and Caucasian students surveyed, furthering the proof that minority students' exposure to computer technology, and their level of competency at this age, was limited (Chisholm et al., 1998).
The aforementioned article in the British Journal of Educational Technology concluded that another test mode effect factor was socioeconomic status.
Based on our review and these results, we anticipate that computer familiarity was the most fundamental key factor in the test mode effect, especially for unfamiliar content and/or for low attaining examinees (especially an issue for students with reduced computer access, such as women and minorities) (Clariana, 2002, p. 600).
The objective of identifying those with limited computer proficiency was not met by simply identifying minority students, even though Latino and African American students were less likely to own a computer, but by understanding that minority students tended to have limited computer literacy (Stanley, 2003). The socioeconomic aspect of identifying the validity of computer-based assessment was a small, yet valid, identifier in the purpose of this project.
Conclusion
Computer based assessments grew intensely in complexity and frequency of use in all areas of education and training. CBTs were used so frequently that the validity of computer-based assessments had come into question. The entities that converted to the more convenient and accurate format did so without assessing the complexities of test production, the issues of validity in computer based tests, computer literacy and overall interaction with computer technology, and the socioeconomic approach to computer based assessment.
The purpose of this project was to identify the major contributor to the failure rate of United States Air Force (USAF) 966th Airborne Air Control Squadron (AACS) Unqualified Computer Display Maintenance Technicians (UCDMTs) on the 552nd Operations Group Standards and Evaluation Office End of Course Examination (552nd OGV EOCE). Data were analyzed and examined from 1 July 2006 to 1 October 2006.
The project involved tracking the passing and failure rate data obtained from the 552nd OGV EOCE. A failure of the 552nd OGV EOCE delayed training by an average of two weeks and cost the USAF and the 966th AACS millions of dollars each year in productivity and remedial training. The result sought in this project was to find the major contributors to failure of the 552nd OGV EOCE. Recommendations were then suggested to increase the passing rate.
Chapter 3
Methods and Procedures
Hypothesis
The purpose of this research was to determine whether repetitious exposure to computer based tests (CBTs) of similar content would familiarize Unqualified Computer Display Maintenance Technicians (UCDMTs) from the 966th Airborne Air Control Squadron (AACS), 552nd Air Control Wing (ACW), United States Air Force (USAF), with the format and thereby increase their mean scores from 1 July 2006 through 1 October 2006, in comparison to the 552nd Standards and Evaluation Office End of Course Examination (552nd OGV EOCE) scores collected over the previous three months. It was hypothesized that the increase in mean scores would directly decrease the recent trend of failures of the 552nd OGV EOCE. The decreased failure rate would eliminate the need for remedial training, which would improve the productivity of the 966th AACS Professional Flight Training (PFT) program as well as the operational capability of the USAF.
The independent variable of the experiment was the exposure to similar content-based CBTs administered to the UCDMTs. The dependent variable was student achievement, measured as the mean test scores collected after the independent variable was introduced into the existing UCDMT PFT program.
Data Source
The data collected for this study were obtained from the 552nd OGV EOCE test score database. The sample comprised 12 UCDMT students observed over a six-month period: the first three-month period to observe the control UCDMT class, and the second three-month period to observe the UCDMT class with the independent variable applied. The sample was selected on the basis of UCDMT class start date.
Instrumentation
The operational definition of the independent variable was exposure to similar content-laden CBTs prior to commencement of the training program and the 552nd OGV EOCE. The CBTs were similar in format and content to the 552nd OGV EOCE. The CBTs were administered throughout the UCDMTs' three-month course by an Instructor Computer Display Maintenance Technician (ICDMT), following a protocol similar to that of the 552nd OGV EOCE, and were graded only on a complete/non-complete tracking basis. All UCDMTs received tests identical in substance and frequency. The CBTs were administered in a non-timed, open book format in which each UCDMT worked at his or her own pace.
The dependent variable of student achievement was an interval measurement, operationally defined as the 552nd OGV EOCE score obtained from the 552nd OGV EOCE database. The mean score on the 552nd OGV EOCE of the UCDMTs from the preceding three months of evaluation was compared to the mean score of UCDMTs exposed to the independent variable during the following three-month period.
The data were shown in a column-by-row data table format, displaying all test scores, including the mean of the control group Tn, in comparison to those of the group with the independent variable applied, Sn. Figure 4 below illustrates an example format.

Tn = Control Group    Sn = Independent Variable Applied
T1 = X   T2 = X   T3 = X   T4 = X   Tn M = X
S1 = X   S2 = X   S3 = X   S4 = X   Sn M = X
Figure 4.
Procedure
This was a before-and-after, between-subjects experiment. The data obtained for the before-and-after comparison analysis included individual and mean test scores from the UCDMTs' 552nd OGV EOCE. The test scores gathered from the three months preceding the implementation of the independent variable were labeled Tn, the control group. All raw scores were tabulated and the mean score of the 552nd OGV EOCE was determined.
Prior to obtaining the next three-month increment of test scores, an independent variable of familiarization with CBTs was applied to a new test group of UCDMTs; this group was labeled Sn, the test group with the independent variable applied. Throughout a three-month period, the Sn UCDMTs received a controlled number of ICDMT-administered tests of identical content and frequency. The tests were administered in an environment similar to that of the 552nd OGV EOCE. All Sn UCDMT CBTs were tracked on a complete/non-complete basis. No test scores were released to the Sn UCDMTs.
Data Analysis
When the data from this experiment were analyzed, the arithmetic mean was calculated for the dependent variable of academic performance before and after the initiation of the independent variable of exposure to CBTs. The variability measures of range, variance, and standard deviation were also calculated for the dependent variable. The data were then represented with a bar graph displaying the mean scores of UCDMTs before and after exposure to CBTs on the Y axis, with the X axis distinguishing Tn and Sn.
The experiment tested the null hypothesis that no significant difference existed between the test scores of the Tn UCDMTs (before) and the Sn UCDMTs (after) implementation of the independent variable of CBTs. The hypothesis was tested with Student's (William Sealy Gosset's) independent samples t test at the .05 level of significance.
Limitations
The generalization of the findings could have a possible impact on similar training programs implemented throughout the 966th AACS and on other training organizations throughout the USAF.
The uncontrollable variables that could have an effect on the outcome of the experiment include, but are not limited to: the stress levels and general aptitude of the selected UCDMTs (regardless of the pre-screened aptitude assessment necessary for UCDMT job placement); psychological or physiological ailments on the day of the 552nd OGV EOCE or throughout the Tn or Sn UCDMTs' training; and the psychometrics of the 552nd OGV EOCE itself.
Chapter 4
Summary of Results
The following is a summary of the data collected during an evaluation of 552nd Operations Group Standards and Evaluation Office End of Course Examination (552nd OGV EOCE) test scores of Unqualified Computer Display Maintenance Technicians (UCDMTs) from the 966th Airborne Air Control Squadron (AACS), the Air Combat Command (ACC) Replacement Training Unit (RTU) for the E-3 Airborne Warning and Control System (AWACS) of the United States Air Force (USAF). The test instrument information evaluated does not support a statistical need for repetitious exposure to Computer Based Tests (CBTs). A restatement of the hypothesis is included; additionally, alternative methods for resolving existing 552nd OGV EOCE failures among UCDMTs are presented for assessment.
Restatement of Hypothesis
The purpose of this research was to determine whether repeated exposure to CBTs of similar content would familiarize UCDMTs from the 966th AACS, 552nd Air Control Wing (ACW), USAF, with the test format and thereby increase their mean scores from 1 July 2006 through 1 October 2006 in comparison to the 552nd OGV EOCE scores collected from the previous three months. It was hypothesized that the increase in mean scores would directly decrease the recent trend of failures of the 552nd OGV EOCE. The decreased failure rate would eliminate the need for remedial training, which would improve the productivity of the 966th AACS Professional Flight Training (PFT) program as well as the operational capability of the USAF.
The independent variable of the experiment was the exposure to similar content-based CBTs administered to the UCDMTs. The dependent variable was student achievement: the mean test scores collected after the independent variable was introduced into the existing PFT program of UCDMTs.
Descriptive Statistical Information
The hypothesized statement, that an exposure to CBTs of similar content will
familiarize and inherently increase the mean scores of UCDMTs 552nd
OGV EOCE, was
investigated. A sample of 12 UCDMTs from the 966th AACS RTU was selected based
on respective class start dates. This selection of UCDMTs represents 20% of annual
UCDMTs trained for the replenishment of 552 ACW Combat Mission Ready (CMR)
CDMTs. The individual score, with respect to age, gender, and ethnicity of the Tn and
Sn are represented in figure 5.
------------------------------
Insert Figure 5 Here
------------------------------
The comparison of the Sn and Tn individual scores and means on the 552nd OGV EOCE is represented in Figure 6.
------------------------------
Insert Figure 6 Here
------------------------------
The mean test scores with the independent variable of familiarization with CBTs applied showed an overall 7% increase and, more importantly, zero failures of the 552nd OGV EOCE among the Tn UCDMT group. This result eliminates the necessity for additional classroom or flight training from 966th AACS Instructor Computer Display Maintenance Technicians (ICDMTs) and eliminates excess spending of the allotted budget. The data collected indicate an inherent need for exposure to and familiarization with CBTs.
Results of Significance Test
An independent samples t test at the .05 level of significance was applied to test the hypothesis. The calculations from the 552nd OGV EOCE results were as follows: the standard deviation of Sn equaled 8.00 and of Tn equaled 6.15; the mean score of Sn equaled 86.00 of the 100.00 possible points, and of Tn equaled 93.33. The value of t equaled 1.7798 with 10 degrees of freedom. At the .05 level of significance, the difference is not statistically significant. Although the mean score of Sn was lower than that of the Tn UCDMTs, the null hypothesis was accepted. A need for exposure to and familiarization with CBTs was determined not to be statistically significant in the case of UCDMT 552nd OGV EOCE failures.
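Taking the reported summary statistics at face value, and assuming six students per group (the stated sample of 12 split evenly between Sn and Tn, an inference rather than a stated figure), the reported t value can be reconstructed from the pooled-variance formula:

```python
import math

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Independent samples t from summary statistics, using pooled variance."""
    pooled = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    se = math.sqrt(pooled * (1 / n1 + 1 / n2))
    return (m1 - m2) / se, n1 + n2 - 2

# Tn: mean 93.33, SD 6.15; Sn: mean 86.00, SD 8.00; six scores per group.
t_stat, df = pooled_t(93.33, 6.15, 6, 86.00, 8.00, 6)
# t comes out near the reported 1.7798 with df = 10; the two-tailed .05
# critical value for df = 10 is about 2.228, so the result is not significant.
```

The agreement of the recomputed statistic with the reported value supports the assumed equal group sizes.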
Results of Needs Analysis
Although the need for exposure to 552nd OGV EOCE-similar CBTs was not statistically significant, the original hypothesis remained of interest. The implemented Tn training program resulted in zero UCDMT 552nd OGV EOCE failures and required less remedial training than the Sn UCDMTs. This result was significant enough that 966th AACS command leadership determined that CBTs are to be established as required course syllabus procedure, since continued non-implementation of CBTs was determined to be a non-utilization of 966th AACS student course materials.
Status Quo. Failure to implement this change in required syllabus criteria will result in continued additional remedial training and required documentation of the consistent trend of UCDMT 552nd OGV EOCE failures. This trend, and the current number of UCDMTs awaiting training, will further compound the existing 966th AACS, ACC, USAF, and Department of Defense (DOD) budget cutbacks. The continued non-utilization of government resources, such as test development software and computer based student facilities, in addition to DOD funds, only further illustrates the necessity for the change. The change will require no more effort from the 966th AACS ICDMTs than the remedial training currently necessary for UCDMT 552nd OGV EOCE failures.
Implementation of CBT Familiarization. CBT courseware and the integration of such tests have already been created for the before-and-after comparison of this study and are prepared for immediate implementation. The dedicated UCDMT and ICDMT time necessary for the CBT practicum falls within existing Instructor of the Day (IOD) duties; no further time dedication is necessary. The CBT courseware, developed by 966th AACS Subject Matter Experts, consists of 25-question tests of random questions similar in content to the 552nd OGV EOCE, given on a bi-weekly basis throughout the three-month UCDMT course. The 966th AACS ICDMTs will supervise and monitor all six required courseware tests and annotate progression throughout training. Any additional training deemed necessary will be handled on a case-by-case basis.
Alternative Suggestions. The alternative to 966th AACS implementation of ICDMT IOD procedures and supervised CBT courseware would be the implementation of UCDMT guided discussion and unsupervised CBT familiarization. UCDMTs would be encouraged to privately engage the available CBT courseware and facilitate the learning process amongst themselves and their classmates. This alternative to the supervised CBT method would require minimal supervision and dedicated time from the 966th AACS ICDMTs. It would not only supplement the learning process of the established syllabus but would also foster a sense of ownership of PFT amongst the UCDMTs. This ownership would develop a sense of priority and importance in regard to the UCDMT 552nd OGV EOCE.
Chapter 5
Discussion and Conclusions
The purpose of this project was to identify the major contributors to the failure rate of United States Air Force (USAF) 966th Airborne Air Control Squadron (AACS) Unqualified Computer Display Maintenance Technicians (UCDMTs) on the 552nd Operations Group Standards and Evaluation Office End of Course Examination (552nd OGV EOCE). Data were analyzed and examined from 1 July 2006 to 1 October 2006.
The project involved tracking pass and failure rate data obtained from the 552nd OGV EOCE. A failure of the 552nd OGV EOCE delayed training by an average of two weeks and cost the USAF and the 966th AACS millions of dollars each year in lost productivity and remedial training. The goal of this project was to find the major contributors to failure of the 552nd OGV EOCE. Recommendations were then made to increase the passing rate.
General Discussion and Conclusions
The results of the UCDMT 552nd OGV EOCE study indicated that the original hypothesis was incorrect. The statistical analysis and information gathered from the project indicated that the deviation from the Sn sample group's mean was not significant at the .05 level. However, the mean of the Tn UCDMTs compared to the Sn UCDMTs showed a 7% increase overall. This increase, regardless of statistical significance, was of continued interest, and the mean variance between subjects Sn and Tn placed some merit on the application of the overall project. Due to the high demand on the Professional Flight Training (PFT) program of the 966th AACS, any positive result in regard to the elimination or reduction of UCDMT 552nd OGV EOCE test failures, and the subsequent elimination or reduction of the additional training that requires crucial Instructor Computer Display Maintenance Technician (ICDMT) time, was seen as an essential shift from the continuing trend.
This continuing trend was addressed and eliminated in the months preceding the application of CBT exposure, with regard to test subjects Tn. The intervention, although not deemed statistically necessary, posed a viable solution to the recurring trend of UCDMT 552nd OGV EOCE failures, for there were no other solutions, suggestions, or plans to counteract it. Due to the insight provided by the research, 966th AACS command-level attention was directed to the issue, resulting in the implementation and integration of CBTs in the UCDMT course syllabus. Further investigation will follow the success or failure of future UCDMTs exposed to the syllabus with CBT integration. If successful, a recommendation will be made to implement similar courseware into other 966th AACS and USAF PFT programs.
Strengths and Weaknesses of the Study
The recognized strengths of the study were the immediate application of CBT familiarization and exposure to further designated UCDMT groups; the additional perceived and expressed interest in UCDMT progress within the 966th AACS PFT; and the exposure of the notable trend of UCDMT 552nd OGV EOCE failures to 966th AACS command leadership, 552nd OGV courseware creators, and Air Combat Command (ACC) PFT syllabus standards.
The applicable CBT courseware presented such a valid and viable user interface that minimal ICDMT and UCDMT training would be necessary for any further research and implementation of similar courseware in other applicable aspects of 966th AACS PFT. The UCDMTs involved in this research were also observed to respond with more enthusiasm and interest in the course syllabus, as well as in the 552nd OGV EOCE administered by 552nd OGV testing proctors. Class participation and group dynamics surpassed those of all previous UCDMT groups in recent years. In addition, the available courseware, the positive student interactions, and the possible impact they created were observed as an immediately useful tool for UCDMT PFT.
The weaknesses of the study were the small group size and inadequate representation of gender, education level, varying socioeconomic background, and perceived expressed interest among the test results. Because the sample represented only 20% of UCDMTs, pre-screened and filtered for aptitude prior to any 966th AACS PFT, the baseline intelligence quotient was already manipulated if not skewed. This, in addition to the absence of female representation common in many PFT career fields and training elements, and the fact that males under the age of 25 are more likely to be predisposed to computer familiarization, may have biased the results as well. Lastly, and the most common mistake among comparison studies, the perceived entrusted attention may have produced a Hawthorne Effect among the Tn UCDMT group.
Recommendations
In light of the statistical information gathered from the research, the "Implementation of CBT Familiarization" detailed in Chapter 4 could be implemented with minimal ICDMT intervention or additional training. The CBT courseware and the integration of such tests have already been created and are available for immediate implementation. Furthermore, the dedicated UCDMT and ICDMT time necessary for the CBT practicum merely reactivates the existing dormant practicum of Instructor of the Day (IOD) duties. Although not statistically significant enough to support the hypothesis stated in Chapter 3, the results produced a positive impact on mean scores, as well as unintended positive effects on group dynamics, a necessary tool for the further education of UCDMT groups in the 966th AACS.
The alternative implementations of the "status quo" and of "UCDMT guided discussion and unsupervised CBT familiarization" would not be feasible, due to the resulting undocumented UCDMT progression and the immaturity of unsupervised, predominantly male subjects aged 18 to 25. Both alternatives amount to a negation of any perceived necessity to adamantly address the continuing trend of UCDMT 552nd OGV EOCE failures.
As suggested in the research, a broader spectrum solution was also addressed. This involved the validity of the 552nd OGV EOCE and the "test mode effect." To analyze the validity of the 552nd OGV EOCE would require each UCDMT to be administered two identical but separate tests: one in CBT format and the other in paper-and-pencil format. The scores would then be analyzed to determine whether the variance was substantial. This alternative recommendation would require additional personnel hours from the 552nd OGV test proctors and, more importantly, additional voluntary effort from UCDMT test subjects, either of which was unavailable, infeasible, or in direct violation of 552nd OGV regulations.
To address the Attitude Towards Computerized Assessment Scale (ATCAS), which explores examinees' emotional, perceptual, and attitudinal reactions toward computerized testing relative to conventional testing methods, a questionnaire was developed to assess the pre-test state of the examinee. Although this created insight into the attitude of the examinee, it did not address aptitude. This recommendation was not used because of its perceived minimal effect on 552nd OGV EOCE mean scores. Though this method was cost effective and prepared for implementation, the hypothesized implementation of CBT familiarization courseware was also cost effective and prepared for implementation, but more importantly, in support of the research, it addressed the most common factor amongst UCDMTs: familiarization.
In the implementation of CBT familiarization courseware, developed by 966th AACS Subject Matter Experts (SMEs), further UCDMT research would consist of 25-question tests of random questions similar in content to the 552nd OGV EOCE, given on a bi-weekly basis throughout the three-month UCDMT course. The six pre-552nd OGV EOCE tests would be tracked and scored in each UCDMT's PFT folder, with the hypothesized result that continual ICDMT interaction and exposure to a practicum similar to the 552nd OGV EOCE would eliminate, if not dramatically reduce, 552nd OGV EOCE failures amongst UCDMTs.
Suggestions for Future Research
Further research should focus on the correlation between ICDMT-student interaction time and the resulting impact on overall student progression throughout UCDMT PFT, including the 552nd OGV EOCE. All ICDMT and UCDMT interaction should be tracked on an hourly basis, and this accumulation of interaction time should be reviewed in comparison with overall student achievement at the culmination of 966th AACS PFT.
Due to the provincial gender, age, and intelligence quotient of the selected Sn and Tn UCDMT research subjects, future research into the effects of CBT familiarization should also focus on the relevant background factors of future UCDMTs and their resulting performance on the 552nd OGV EOCE. On occasion, though minimally in this study, UCDMTs were selected from a diverse pool of ethnic and socioeconomic backgrounds. Past UCDMT students have learned English as a second language to native tongues including Chinese, Swahili, Tagalog, French, and Spanish. Research into this scenario could benefit future 966th PFT students and their curriculum by assessing the feasibility of instructing UCDMT students with language barriers, whether serving in the United States Air Force or in our allies' respective E-3 AWACS PFT programs around the world.
References
Barker, H. H. (2002). Reducing adolescent career indecision: The ASVAB career exploration program. The Career Development Quarterly, 50, 359-370.
Bocij, P., & Greasley, A. (1999). Can computer-based testing achieve quality and efficiency in assessment? International Journal of Educational Technology, 1, 1.
Bridge, P. D., Musial, R. K., Thomas, R., & Sawilowsky, S. (2003). Measurement practices: Methods for developing content-valid student examinations. Medical Teacher, 25, 414-421.
Busch, T. (1995). Gender differences in self-efficacy and attitudes toward computers. Journal of Educational Computing Research, 12, 147-158.
Chisholm, I. M., Carey, J., & Hernandez, A. (1998). University minority students: Cruising the superhighway or standing at the on-ramp? Proceedings from the Society for Information Technology & Teacher Education International Conference. San Antonio, TX.
Chisholm, I. M., Carey, J., & Hernandez, A. (1999). Access and utilization of computer technology by minority university students. Proceedings from the Society for Information Technology & Teacher Education International Conference. San Antonio, TX.
Clariana, R., & Wallace, P. (2002). Paper-based versus computer-based assessment: Key factors associated with the test mode effect. British Journal of Educational Technology, 33, 593-602.
Curran, L. T., & Jordan, L. A. (1996). Implementation of the computerized adaptive version of the Armed Services Vocational Aptitude Battery. Proceedings from the Annual Meeting of the National Council on Measurement in Education. New York, NY.
Mills, C. N., Potenza, J. J., & Fremer, W. C. (2002). Computer based testing: Building the foundation for future assessments. New Jersey: Lawrence Erlbaum Associates, Inc.
Noyes, J., Garland, K., & Robbins, L. (2004). Paper based versus computer based assessment: Is workload another test mode effect? British Journal of Educational Technology, 35, 111-113.
Park, J. (2003). A test taker's perspective. Education Week, 22, 15.
Parshall, C. G., Spray, J. A., & Davey, T. (2002). Practical considerations in computer based testing: Issues and applications. New York: Springer-Verlag.
Peterson, M. W., Gordon, J., Elliott, S., & Kreiter, C. (2004). Computer based testing: Initial report of extensive use in a medical school curriculum. Teaching and Learning in Medicine, 16, 51-59.
Rabinowitz, S., & Brandt, T. (2001). Computer based assessment: Can it deliver on its promise? WestEd, 70, 1-7.
Rex, L. A., & Nelson, M. C. (2004). Teachers' professional identities in classrooms. Teachers College Record, 106, 1288-1331.
Russell, M., Goldberg, A., & O'Connor, K. (2003). Computer-based testing and validity: A look back into the future. Assessment in Education, 10, 279-293.
Russo, A. (2002). Mixing technology and testing: Computer based assessments lend flexibility, quick turnaround and lower costs, supporters say. American Association of School Administrators.
Sands, W. A., Waters, B. K., & McBride, J. R. (1997). Computerized adaptive testing: From inquiry to operation. Washington, DC: American Psychological Association.
Smith, B., & Caputi, P. (2004). The development of the attitude towards computerized assessment scale. Journal of Educational Computer Research, 31, 407-422.
Stanley, L. D. (2003). Beyond access: Psychosocial barriers to computer literacy. Taylor & Francis Group, 19, 407-416. Retrieved April 1, 2006, from http://taylorandfrancis.metapress.com/(4ysb0245k0a30f45zuktv1ih)/app/home/contribution.asp?referrer=parent&backto=issue,6,7;journal,13,46;linkingpublicationresults,1:100659,1
Togo, D. (2002). Topical sequencing of questions and advance organizers impacting on students' examination performance. Accounting Education, 11, 203-216.