UNIVERSITY OF HAWAI`I AT MĀNOA
EDUCATIONAL TECHNOLOGY MASTERS PROGRAM
321 COUNTDOWN
ANALYSIS RECOMMENDATIONS REPORT
Prepared by Dain A. Shimabuku
for Central Curriculum Office for the State Department of Education
Maui, Hawaii, December 2009
Memorandum
The Central Curriculum Office of the State Department of Education is currently
developing a professional development program for teachers who work in remote regions
of the state. The purpose is to help teachers learn how to use 321 Countdown, a tool for
assessing math in grades K-6. Teachers at these rural schools find it difficult to attend
professional development activities due to their location. At this time the goal is to pilot
the program in a few schools with the overall intention of implementing it statewide. Don
Garthon, an instructional design instructor at a local university, and Susan Harper, a
recent graduate student, are developing professional development activities to assist in
implementing the 321 Countdown program.
Table of Contents
  Needs Assessment
    Gap Analysis
  Task Analysis
  Learner and Context Analysis
    Learner Analysis
    Performance Context
    Learning Context
  Social Factors
    Adoption Analysis
      Identifying Adopters
  Implementation
  References
Needs Assessment
According to Rouda and Kusy (1995), “A Needs Assessment is a systematic
exploration of the way things are and the way they should be. These ‘things’ are usually
associated with organizational and/or individual performance.” A needs assessment is the
first step and must be completed prior to further analysis and implementation. A gap
analysis is performed as part of a needs assessment.
a. Gap Analysis
According to Rouda and Kusy (1995), “The difference (the ‘gap’) between current
and necessary will identify our needs, purposes, and objectives.” A gap analysis identifies
the current situation and the desired situation. The outcome of the analysis reveals
strengths, problems or deficits, opportunities, and impending changes.

Identifying the gap before implementing 321 Countdown will reveal where the
program will help teachers with assessment. It will reveal strengths and areas where
improvement is needed. Performing the gap analysis will also surface future concerns and
provide a general timeline for addressing them.
Task Analysis:
The purpose of a task analysis is to determine what you expect the learner to
learn and how they will apply what is learned. As Jonassen et al. (1999) stated, “Task
analysis for instructional design is a process of analyzing and articulating the kind of
learning that you expect the learners to know how to perform” (p. 3). A task analysis
determines instructional goals, objectives, tasks, learning outcomes, and the order of
tasks, and assesses one’s learning and media environment. Performing a task analysis
may reveal the tools necessary for this project to be successful. A task analysis may also
disclose hidden challenges as well as suggest alternatives for meeting the end users’
needs.
The end users in this situation are the teachers who will use the 321 Countdown
program. According to Hodell (2006), “Four levels of detail exist in a task analysis: job,
task, skill, and subskill.” Identifying the tasks of the teachers who will learn and utilize
the program is valuable when implementing it. Consider the following example:
Job: Math Teacher
Task: Utilize 321 Countdown to assess addition and subtraction through counting by ones.
Skill: Monitoring students in addition and subtraction
Subskill: Monitor one student and assist in addition and subtraction when needed
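As a hypothetical sketch (not part of the original analysis), the four levels of detail in the example above could be captured in a simple nested data structure, which makes it easy to check coverage as the task analysis grows:

```python
# Hypothetical sketch: Hodell's four levels of task-analysis detail
# (job, task, skill, subskill) captured as a nested dictionary.
# All entry text comes from the example above.
task_analysis = {
    "job": "Math Teacher",
    "tasks": [
        {
            "task": "Utilize 321 Countdown to assess addition and "
                    "subtraction through counting by ones",
            "skills": [
                {
                    "skill": "Monitoring students in addition and subtraction",
                    "subskills": [
                        "Monitor one student and assist in addition "
                        "and subtraction when needed",
                    ],
                },
            ],
        },
    ],
}

def count_subskills(analysis):
    """Total the subskills across every task and skill in the analysis."""
    return sum(
        len(skill["subskills"])
        for task in analysis["tasks"]
        for skill in task["skills"]
    )

print(count_subskills(task_analysis))  # 1
```

A designer could extend the `tasks` list as additional teacher duties are identified, keeping the job/task/skill/subskill hierarchy explicit.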
Often overlooked in instructional design is the importance of communication with
end users. Tosca is quoted as saying, “effective communication among all individuals is
an essential skill that is becoming more important as we move to a wider base of workers
and businesses” (Reiser & Dempsey, 2007, p. 115). Communicating with end users about
what is effective and what is ineffective will reveal what needs to be improved.
Understanding how end users learn may assist in a successful statewide implementation
of 321 Countdown.
Learner and Context Analysis:
a. Learner Analysis
Performing an analysis of learners reveals how they learn and how to improve
individual learning environments. Factors such as motivation, entry behaviors, prior
knowledge, learning preference, attitudes toward trainers, and group characteristics may
influence the outcome of the project. Motivated learners will determine the success of
the project: learners who are determined and engaged in the content being provided will
be motivated to learn. Entry behaviors and prior knowledge allow the designer to
determine the level of mastery the learners have acquired. A preference for lecture,
workshops, or web-based instruction defines one’s learning preference. Attitude toward
trainers indicates the types of personalities you will be interacting with, and group
characteristics describe the type of population you are working with.
Collecting information on the rural teachers can take the form of surveys or
questionnaires. According to Hodell (2006), “Open-ended questions lead to open-ended
answers, but for quantifiable data, designers must ask quantifiable questions and supply
specific ranges of answers” (p. 32). When developing questions, provide rating questions
that allow you to measure the answers the teachers give. You may want to ask teachers
this question:
How would you assess your ability to use computer technology?
(a) no problems, (b) minor problems, (c) many problems, (d) nothing but problems.
Asking ratable questions allows you to gauge the strengths and weaknesses of
the teachers’ technology skill levels. You may also want to ask questions about their
learning environment, internet connection, attitude, target population, and learning
preference.
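As an illustrative sketch of how such ratable responses could be quantified, the following Python snippet tallies answers to the technology-skill question above; the response data and function names are invented for illustration:

```python
from collections import Counter

# Answer choices for the hypothetical survey question:
# "How would you assess your ability to use computer technology?"
LABELS = {
    "a": "no problems",
    "b": "minor problems",
    "c": "many problems",
    "d": "nothing but problems",
}

def tally_responses(responses):
    """Count each answer choice and return a {label: count} distribution,
    including zero counts for options no one selected."""
    counts = Counter(responses)
    return {label: counts.get(choice, 0) for choice, label in LABELS.items()}

# Invented example data from ten teachers
responses = ["a", "b", "b", "c", "a", "b", "d", "b", "c", "a"]
print(tally_responses(responses))
# {'no problems': 3, 'minor problems': 4, 'many problems': 2, 'nothing but problems': 1}
```

A distribution like this would let the designer see at a glance how many teachers report difficulty with technology before choosing a training format.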
b. Performance Context:
A performance context analysis examines the end users’ work environment
following the completion of instruction. The work environment involves managerial
support, tools and resources, social aspects, relevancy, and constraints. This allows the
designer to consider the setting in which the end users work, whether the constraints are
social, physical, or related to relevance. It also allows the designer to take into account
the tools and resources that are available or necessary upon completion of the instruction.
The locations of the schools involved in the implementation of the program have
been the grounds for initiating it. Understanding the resources that are available and
needed is essential to sustaining the program. According to Virginia Tech’s Instructional
Technology program (2009), “Describe the context in which the learners will use their
new skills and knowledge after the instruction is completed.” It is important to determine
how teachers will use the 321 Countdown program and implement it in math assessment.
c. Learning Context:
According to Virginia Tech’s Instructional Technology program (2009), “The
context in which learning will occur may affect the accomplishment of your goal.” It is
important to identify the resources that are available at each site.

Each school across the state differs, whether in the number of teachers or in the
technological resources available to them. The pilot schools should represent the
characteristics of the schools that will utilize the program statewide. When the
implementation of 321 Countdown goes statewide, it is important to consider how rural
teachers will be trained. Since location and time are issues, online training should be
evaluated as an option.
Social Factors
It is important to take social factors into account when implementing a new
program that will change a task. According to Havelock (1973, p. 118), “It is impossible
to understand how individuals adopt without also considering the social relationships and
group structures which bind individuals together.” A group of people usually share
common interests, needs, and backgrounds. Identifying groups and their issues gives the
designer insight into effective pathways for implementation.
a. Adoption Analysis
i. Identifying Adopters
According to Edmonds (1999), “When a technological innovation is
introduced into an organizational system, some individuals within the
organization are more open to adaptation than others.” Identifying the individuals
who are willing to accept new technology is useful in the initial implementation
phase.
When choosing the schools that will pilot the 321 Countdown program, it is
important to ensure they are interested in change. According to Edmonds (1999),
“Faculty and staff development is a change process that must be carefully
planned, managed, and evaluated; it should not only improve instructional and
organizational processes, but also create an environment amenable to innovation
and change.” When adopting a new system, people can be categorized according
to their interests and personalities. There are five categories: innovators, early
adopters, early majority, late majority, and laggards.
a. Innovators
Innovators are willing to try new technology or new ideas.
According to Rogers (1995), “Their interest in new ideas leads them out of
a local circle of peer networks and into more cosmopolite social
relationships.” If the adoption proves to be ineffective, the innovators
will not feel dejected.

When implementing the program in the pilot schools, it is
important to identify the innovators. The innovators will take risks and
implement the system. They will also provide feedback on areas of
strength and weakness. If the pilot is successful, innovators could become
spokespersons for schoolwide and statewide implementation.
b. Early Adopters
According to Rogers (1995), “The early adopters have a high degree of
opinion leadership.” They set an example for others and provide confidence for
adopting an innovation. Early adopters are often highly respected and
considered successful by their peers.

Identifying the early adopters in the pilot schools will increase the
probability of schoolwide innovation. Highly respected people are often
leaders in the school and senior teachers. If these teachers approve of the
321 Countdown program, other teachers and leaders are more likely to accept
it.
c. Early Majority
Noted as cautious or vigilant toward change, the early majority need
some time to analyze an innovation. According to Rogers (1995), “The early
majority may deliberate for some time before completely adopting a new
idea.” The early majority are seldom leaders but are willing to accept new
ideas.
Identifying the early majority increases the likelihood of successful
adoption. Teachers who are mentees are often part of the early majority,
since their mentors usually hold leadership positions.
d. Late Majority
The late majority often feel pressured by peers to change, since they
make up one third of the system; the early majority is another third.
According to Rogers (1995), “Innovations are approached with a skeptical
and cautious air, and the late majority do not adopt until most others in
their system have already done so.” The majority of the system must favor
the innovation before the late majority will adopt the change.

Identifying the late majority in the school system suggests that the
innovation has been adopted. The early and late majority together make up
two thirds of the system, which means the chances of the innovation being
adopted have increased.
e. Laggards
Last in the system of adoption are the laggards. According to
Edmonds (1999), “The Laggards’ adoption of innovations, technologies,
and programs lags behind their awareness and knowledge of innovation.”
They often hold no leadership position and are suspicious of change.
By the time laggards accept a change, a new innovation may already have
been introduced and started by the innovators.

Laggards in a school system may be those who are indirectly
affected by the change. These may include support staff, educational
assistants, and parents of the students. Once the 321 Countdown
program is accepted by the laggards, the adoption of the innovation is
complete.
Implementation
Often used in instructional design is the ADDIE model, which stands for
analysis, design, development, implementation, and evaluation. This report recommends
steps and a rationale for completing an analysis for the Central Curriculum Office of the
State Department of Education’s innovation, the 321 Countdown program for assessing
math. Upon completion of the analysis, the data should inform the next two steps before
implementation: design and development.
When designing instruction, the designer must consider all recommendations from
the analysis. In this stage objectives and content are written. Using the ABCD (Audience,
Behavior, Condition, and Degree) format to create objectives should be considered. An
outline is created with specifications for completing the project.
The development stage is where materials are produced and pilot testing begins.
Identifying innovators plays an important role in pilot testing. According to Hodell
(2006), “The pilot testing process allows organizations to implement any necessary
changes in the project before the expenses associated with materials development are
realized” (p. 13). If pilot testing is deemed successful, the next step, implementation,
can occur.
The implementation stage is where the content is delivered to the learner. In this
stage it is important to evaluate whether objectives are being met. Kirkpatrick’s levels
of evaluation can assist in assessing the effectiveness of the implementation process.
It is recommended that a complete analysis be performed prior to moving on to
the design, development, and implementation stages. Performing a proper analysis may
reveal the necessary needs, tasks, and learner abilities, and how adoption occurs when a
change or innovation is introduced.
References

Dabbagh, N. (2009). The instructional design knowledge base. Retrieved December 9, 2009, from Nada Dabbagh’s Homepage, George Mason University, Instructional Technology Program: http://classweb.gmu.edu/ndabbagh/Resources/IDKB/index.htm

Edmonds, G. S. (1999, March). Making change happen: Planning for success. The Technology Source. Available online at http://ts.mivu.org/default.asp?show=article&id=1034

Ertmer, P. A., & Quinn, J. (2007). The ID casebook: Case studies in instructional design. Columbus, OH: Pearson.

Havelock, R. G. (1973). Diagnosis: From pains to problems to objectives. In The change agent’s guide to innovation in education.

Havelock, R. G. (1973). Gaining acceptance. In The change agent’s guide to innovation in education.

Hodell, C. (2006). ISD from the ground up. Alexandria, VA: ASTD Press.

Reiser, R. A., & Dempsey, J. V. (2007). Trends and issues in instructional design and technology. Upper Saddle River, NJ: Pearson Merrill Prentice Hall.

Rogers, E. (1995). Diffusion of innovations. New York, NY: The Free Press.

Rouda, R. H., & Kusy, M. E., Jr. (1995). Needs assessment: The first step. Development of Human Resources, 2. Retrieved from http://alumnus.caltech.edu/~rouda/T2_NA.html

Virginia Tech. (n.d.). Lesson 5: Learner and context analysis. Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson5.htm