
CALIFORNIA STATE UNIVERSITY

MONTEREY BAY

An Overview of Admissions Transcript Processing

CAPSTONE PROPOSAL

Submitted in partial satisfaction of requirements for the degree of

MASTER OF SCIENCE in
Instructional Science and Technology

Sarah Barnhart
September 3, 2019

Capstone Approvals: (At least one advisor and capstone instructor should approve)

_________________________________ ____________
Advisor Name Signature Date

_________________________________ ____________
Capstone Instructor Name Signature Date

Table of Contents

Executive Summary
Background
   Needs Analysis
   Learner Analysis
Solution Description
   Project Goals
   Learning Objectives
   Proposed Solution
   Learning Theories
   Anticipated Challenges
Methods and Procedures
   Design and Development
   Implementation Plan
Resources
Timeline
Evaluation
   Formative
   Summative
References
   Appendix A
   Appendix B

Executive Summary
The Admissions Specialist in the Office of Admissions at California State
University, Monterey Bay is responsible for processing all student documentation
submitted to the office. High school and college transcripts are viewed by the majority of the
Admissions staff for freshman, transfer, and graduate-level evaluation, and the staff members
who read an applicant's transcripts determine that applicant's eligibility for admission to
CSUMB. Before an Admissions staff member can evaluate a transcript, however, it must first be
viewable in the system.
After transcripts are scanned into the document database, called OnBase, the Admissions
Specialist is responsible for making sure transcripts are entered into the student system software,
called Oasis. Oasis tracks all student accounts from both the student perspective and the internal
office perspective. The Admissions Specialists are expected to update each student account as
transcripts are received by the Admissions Office; this process is called posting. Updating an
account requires specific knowledge of a transcript, including district schools, campus codes,
correlating system codes, and multiple navigation paths in Oasis.
Though it is the responsibility of the Admissions Specialist to make sure accounts are
updated, the Office of Admissions relies on Student Assistant employees to help enter
information wherever account access is not limited. With multiple people entering data into the
system, a certain amount of human error is expected. Specialists correct errors weekly using
data integrity queries. Office errors are also reported in a semi-annual Enrollment Reporting
Services Student and Enrollment Reporting Services Applicant report, known as ERSS/ERSA.
System errors missed by the Specialists during the semester are corrected just prior to census,
when all CSUs report their data and statistics to the Chancellor's Office.
The proposed solution to decrease human error and increase productivity is thorough
training on transcript entry in the Oasis system, along with standardized training on current
Admissions transcript-processing business practices. The goal of an effective training solution is
to reduce the interruption time Specialists spend answering daily processing questions, and to
eliminate the one-on-one time Specialists currently spend training new student assistant,
temporary, and internship employees. In addition, the final project will consist of bite-sized
elements of training and instruction that will later be compiled into an internal office training
website. The final deliverables still in progress will be completed over the next two months.

Background
Needs Analysis
Currently, there is no training that explains how the required fields in Oasis affect a
student's account, and there is no supplementary support, such as a job aid, to help answer
questions about these tasks. Preventative measures can also be employed to lower the fall and
spring semester ERSS/ERSA errors; doing so will require both better training and improved
communication between staff members. Understanding the effects of missing information, and
gaining the knowledge to prevent negative or time-consuming outcomes, will be vital in closing
the gap. Information is entered into the system incorrectly, or fields are missed, either because
the employee does not understand the task or because of carelessness from working too fast.
The neglected fields eventually circle back to the office in the form of ERSS/ERSA system
errors, and correcting them is itself a time-consuming task.
Performance Gap
Errors are preventable at the time information is entered into the system;
however, current practice does not prioritize daily data integrity checks because of the amount
of time they would consume. Trainees are currently given minimal one-on-one training without
the use of a job aid, so when student assistants have questions, they must walk to a Specialist's
cubicle and ask, which is disruptive to all parties involved. Once the gap is closed, any
employee entering information into the system will understand the correct information to enter
into every field, which will minimize errors. The student's education page will be filled out
entirely according to the information on the transcript, and the posting and course entry
processes will be clear, concise, and consistent.
Survey
Because many of the Specialist duties act as a baseline for other positions in the
Admissions Office, it is important that the Specialists' work be as accurate as possible. The
entries that begin with the Specialists and student assistants will eventually be seen by whoever
is working on the prospective student's admission, so accuracy ultimately affects others in the
office. I therefore created a survey, which the majority of Admissions staff has completed.
Results from the survey indicate a lack of formal internal trainings (Appendix A). The survey
helped me recognize the fields in Oasis most vital to the Evaluators, Counselors, and
Coordinators who view these applications on a regular basis. Feedback from the survey has
also helped identify particular focal points that will need to be included in the modules
(Appendix B). The majority of respondents suggested that the external education page would be
the most helpful training topic; the external education page is the entry page in Oasis for all
transcript data entry.
Attitudes
The Admissions department is one of the busiest offices on campus. Admissions receives
thousands of documents each semester. There are hard deadlines that require timely processing,
as well as hours of data entry. During stressful times, the common attitude in the office seems to
be "enter now and fix later." The workload often leaves the processing team posting information
first and dealing with errors in bulk later, which needs to change. Identifying errors from the
ERSS/ERSA error report takes additional, unnecessary time because Specialists must go into
each student record to fix the error, rather than entering the information correctly at the time of
the application.
Internal Issues
Technological equipment is an important consideration in the development of this
design. Due to budget constraints, the office uses outdated computers, so the training application
should not be so large that it slows them down.
Learner Analysis
The Admissions Office includes workers with a wide range of professional backgrounds.
Student assistant employees work for the office part-time during fall and spring semesters and
are in the process of obtaining their bachelor's degrees. My audience includes both student
assistants and full-time staff employees, which is why student responses are included in my
survey. For this project in particular, I will direct the training toward new full-time and
temporary Admissions Specialists, as well as student assistant employees. The training is
intended to help new employees in these positions gain an understanding of the basic data entry
processes.
Experience
The Office of Admissions includes full-time staff members who have a bachelor's
degree or higher. The staff, including upper management, range from approximately 26 to 60
years old. Some of my colleagues have worked for the university for over 10 years; however,
very few have been in the Office of Admissions for more than 5 years. I have become close
with the majority of my colleagues and am aware of their backgrounds; therefore, I did not
request additional information in this regard.
Fortunately, the culture in the Admissions office is very open and communicative.
Because everyone is an expert in the work pertaining to their job title, colleagues rely heavily
on one another for internal knowledge, information, and requests. Some colleagues have over
10 years of experience at the university, while others are only a few weeks into their current
positions. The student assistants, on the other hand, are almost always young adults in their
early 20s working toward their bachelor's degrees. We are all adult learners, however, and it
will be important not to undermine anyone's current knowledge. The Admissions office also has
a director, as well as managerial positions that supervise and support the team in its entirety;
their support and approval will be important throughout the implementation process.
Motivation
Unfortunately, the Admissions Office remains understaffed as the university grows
larger each year, which means many of the employees are overworked. Workers do what is
necessary in their positions to get their work done and keep their attention on high-priority
tasks. In this fast-paced office, tasks without an immediate upcoming deadline, including
training, are placed on the back burner. Getting people on board with this project may be
difficult because it will require additional time carved out of their day for testing. However, the
additional time Specialists are asked to spend on details now will save them time down the
road, a fact that will need to be emphasized.

Solution Description
Project Goals
The desired goals are to eliminate system errors, as well as the errors that appear on the
semi-annual ERSS/ERSA reports received from the Chancellor's Office. An additional goal is
less disruption to in-office employees working on students' admission: when an error is caught,
the solution usually requires a Specialist's intervention, which prevents other employees from
moving forward. Another important goal is to increase productivity by eliminating the time
spent pausing a task to ask or answer questions between the student assistants and the
Specialists. Lastly, a training that sets a standard will eliminate potential miscommunication
and maintain consistency during the learning process.
These project goals will be achieved with the following:
• Providing a complete overview of entry fields in Oasis.
• Explaining the complete internal office cause-and-effect cycle of mis-entered information.
• Providing a complete overview of the differing types of transcript data.
• Meeting with the senior subject matter expert for an overview of the design.
• Implementing appropriate interactive adult-centered learning with consistent knowledge
checks.
• Conducting a survey of users to gauge design effectiveness.
Learning Objectives
Terminal Objective: Employees entering transcript data into the system will enter all
system required fields accurately (affective domain).
1. Given a flowchart of internal data circulation, employees will be able to identify how
errors affect other positions in the Admissions office.
2. Given a transcript, employees will be able to identify the term, year, school subject,
course number, course name, units, and grade on most transcript formats.
3. Given access to Oasis, employees will be able to identify the appropriate fields for
data entry and enter all appropriate fields correctly without error.
4. Given the course entry navigation in Oasis, employees will be able to apply the
proper default settings to all required fields.
5. Given a list of district transcripts, employees will be able to distinguish the
differences of information on a district transcript and enter all information in Oasis
with 100% accuracy.
6. Given a list of possible answers, employees will be able to identify the correct grade
code to apply in the system for the task of entering courses with 100% accuracy.
7. Given a scenario, employees will be able to identify the best action to take that will
prevent ERSS/ERSA errors in the system.
8. Given a list of errors, workers will be able to identify the most commonly found error
in the system and avoid making those errors when entering data.
9. Given a simulation in Oasis, employees will be able to enter all required fields
without error.
Proposed Solution
This complete asynchronous online training tool is designed to provide a full explanation
of navigation through the Admissions posting and course entry processes. The required data
entry fields will be emphasized and explained. Once the training is implemented in our office,
new employees hired to help with hard-copy back-office processing will be required to
complete it; this practice will minimize errors and decrease the time spent correcting errors
from the data integrity queries and the ERSS/ERSA error reports.
An Overview of Admissions Document Processing Course
1. Course Introduction
a. Admissions Specialist introductory welcome and course objectives
b. Paper processing overview
c. Posting and course entry relevance to the Admissions Office
d. Knowledge Check
2. Effects of Mis-Entered Data and Error Reports
a. Explain the cause and effect cycle within the office
b. Explain the ERSS/ERSA semi-annual error reporting
c. Explain data integrity queries
d. Knowledge Check
3. Transcripts Overview
a. Term, year, school subject, course number, course name, units, and grade
b. Transcript Legend
c. Multiple transcript formats
d. District transcripts
e. 4 feeder schools
f. Knowledge Check
4. Posting Transcripts
a. Intro to student profile: Admissions Inquiry
b. Intro to Education Page
c. How to post a transcript in all formats
i. High School
ii. College
iii. District
iv. School Codes
d. Knowledge Check/Practice
5. Course Entry
a. Courses and Degrees tab
b. Setting up defaults
c. Blank entry Fields
d. Auto-fill course subjects and units
e. Knowledge Check/Practice
Learning Theories
Andragogy
Andragogy is the learning theory for this design and project. The Andragogy Learning
Theory was developed by Malcolm Knowles in the 1970s. Since the theory is still relevant
today, and I work with adult learners, I will apply its main components: need for knowledge,
motivation/willingness, prior experiences, self-direction, and orientation to learning (Gutierrez,
2018).
The learner will be given clear reasons for why they will want to know the intended
topics of instruction. The learner will also be able to see what they will gain from taking the
course. Reducing personal error may be reason enough for an adult learner to want to improve
their knowledge base for this task, which also falls into the willingness category; even so, the
error process and the need for performance improvements will be explained in detail. I will also
consider the cognitive, social, affective, and conative sources of motivation (Huitt, 2011). Prior
experiences will be respected by not over-explaining materials the participants may already
know; the goal is not to undermine anyone's intelligence. There will also be an option to choose
the starting module, so that trainees are not required to go through the entire lesson if they
already know how to do one of the tasks. The design will allow the user to make their own
decisions in some areas of the course. Lastly, task-oriented learning exercises will help orient
the learner to the job.
ARCS
John Keller's ARCS model of motivational design will guide media usage and
knowledge retention. Keller explains that attention, relevance, confidence, and satisfaction are
the key elements of multimedia instructional design (David, 2014). The design will therefore
incorporate ways to draw the learner's attention and reinforce materials and subject matter. As
with andragogy, relevance to learning must be established early, or there is potential for losing
motivation; the benefits of the training will be explained to the learner at the forefront.
As for confidence, reasonable objectives for obtaining the necessary on-the-job skills
have been established. The modules will build on prior knowledge rather than expecting the
learner to complete a task without any initial context; posting a transcript will act as a scaffold
for the course entry process. Feedback will be provided throughout the practice assessments,
knowledge checks, and interactive activities, while learners retain control over their direction.
This is to ensure learners feel a sense of satisfaction by being able to apply the knowledge
learned on the job. Satisfaction will also be promoted through positive feedback and by
updating the course design based on survey results.
Media Components
The training course will be developed using Adobe Captivate, with the support of
Camtasia for video editing. The Admissions department has granted access to a test
environment in Oasis to prevent violations of FERPA (the Family Educational Rights and
Privacy Act) and to protect current student information. Regular access to OnBase, the CSUMB
document database, will be necessary for collecting screenshots of high school and college
transcripts. A simulation video will be implemented using both media environments: OnBase
and Oasis. Knowledge checks and test questions will be developed within Captivate, as well as
interactive flowcharts and information presentations. Closed captioning will be mandatory for
all videos and any public media borrowed from outside sources. Access to the Admissions
Processing guide will be granted to all new CSUMB employees involved with processing.
Anticipated Challenges
Establishing a consistent practice with this training may pose as a challenge. The office
culture has established a desire for shortcuts in training since the workload is heavy and
employee resources are limited. Implementing this change may require some convincing or
reminders to break the routine of one-on-one trainings from the people involved in processing.
Anticipated challenges are also expected with the design phase. Adobe Captivate is a new
11

program to me and will require personal extensive training for learning how to use all of the
desired capabilities that Captivate offers. Concerns also present themselves in the understanding
that CSUMB is in the process of switching to a new application, which may disrupt or alter some
of the Peoplesoft application features that are currently being used by its students, applicants, and
employees. Although, this is a minor concern, as the new system is already in place and the
interface and features which involved posting and course entry have not changed.

Methods and Procedures


ADDIE (analysis, design, development, implementation, evaluation) is the model
underlying this design. The analysis phase uncovered the performance gap through a thorough
needs assessment. During the design phase, the storyboard will lay out all materials and the
steps to be completed. The product will come together on the host site during the development
phase. The implementation phase will put these training changes into practice: all new and
current student assistants will be asked to take the training. The evaluation phase will consist of
regular reviews, including formative and summative evaluations, to track learner success and
knowledge transfer.
Design and Development
The development plan timeline is rather lengthy due to the fact that the majority of the
materials will need to be created, and existing materials will need to be revised. In addition, there
will be a need for assistance from other parties in order to create the program successfully. The
majority of the course design is online; though, there will also be a ten-minute session at the end
of the training to sit with a Specialist to answer any questions that the learner may have. This
one-on-one time has been factored into the hour-long capstone training. The Specialists work on
a number of tasks, but upon assessment and review, student assistants spend the most time
helping the Specialists with posting transcripts and entering coursework. Therefore, there has
been careful consideration for this training and the internal needs of the office.
Because there are two main topics of training, the design outline is strategically created to
consider the order of events in which data must be entered into the system. Posting and course
entry are individualized tasks, but they do require order and initial understanding of the effects of
entering information incorrectly. Thus, a regular review of information and module testing will
occur throughout development.
Detailed Production List
1. Admissions Specialists meeting for content review
2. Gather materials, or decide on the need for materials, for course content
3. Storyboard draft
4. Storyboard draft assessment with management and all parties involved with
resources
5. Captivate draft review with Specialists
6. Product development
   a. Admissions Specialists will review upon completion of each module
   b. Potential changes will be discussed with managerial positions and all parties
   involved with implementation
   c. Admissions temps and the student intern will test for design accuracy
7. Beta Test/Evaluation
   a. All student assistant workers will test the design
   b. Survey and assessment data will be recorded for summative evaluation
Major Deliverables

Course Outline (Complete): An initial overview of the Admissions transcript processing.

Content Outline (Complete): An assessment of the required topics of the training.

Content Design Document (In Progress): A detailed description of the required course
materials and a descriptive review of the course outline.

Storyboard (In Progress): A descriptive review of the course outline in a format which will
display the content and voiceover script.

Multimedia (In Progress): The design will include the most appropriate media sources suitable
to the lesson that can be implemented in the design. Such media will include video, images,
graphs, simulations, flowcharts, and other interactive learning content integrated and available
with Captivate.

Resource Content (In Progress): The resource content will provide additional supportive
materials that are available to the learner through links to the Admissions processing team
drive in Google. Such materials will include job aids, student academic level definitions, lists
of schools, and website links to school code searches.

Evaluation Content (In Progress): Multiple knowledge checks will be integrated throughout
the design, with other practice opportunities available by means of simulations. Surveys and
pre- and post-questionnaires will also be implemented.

Host Site (Not Started): The host site will be decided on by the Admissions Communications
Analyst and will allow the user to access the training materials from any location on the
CSUMB server.

Final Training Package (Not Started): The final product will allow all new employees to
experience a consistent and informative on-the-job training.

Implementation Plan
The Specialists will play a key role in implementing this training, as the training covers
a task the Specialists are responsible for ensuring is complete. However, other administrators
will be trained as well.
The Communications Analyst will provide the network to host the online training
materials and will need to be involved in the training setup. I will request guidance on timeline
requirements from our department's Operations Manager. It is important that our office
manager is involved in the implementation of the training so that we have her support on
proper training procedures. Our manager will also need to know when training takes place so
that we can document who has the knowledge necessary to complete this task. The training
will take place at the learner's computer station and will not require a reserved room.
Since another student in the MIST program also works in the Admissions department at
CSUMB, we will be partially collaborating. The collaboration is merely for design consistency
in the look and feel of our projects, since both trainings will be offered to the same office.

Resources
The costs for the design itself are minimal. However, the Admissions office paid for
me to attend an Adobe Captivate training, which cost a total of $799. The Director of
Admissions hopes to see the return on investment through the development of this training,
with the expectation that there will be a reduction in time spent on one-on-one training and an
increase in productivity.
Acquired
• Adobe Captivate for primary training and storyboarding
• Oasis test environment for student profile simulations
• OnBase for screenshots of transcripts
• Camtasia for video editing and security
• Onsite conference room for design discussion meetings
• Individual computer stations for access to training modules
• Microsoft Word for draft development documents
• Google Drive for design communication
• Time with Admissions employees for interviews
Technical Skills
The skills necessary to complete this design have been acquired in the courses I have
completed in the MIST program so far. I will also be using the training that I receive from the
Adobe Captivate Specialist Certificate program to develop the design in Adobe Captivate.

Project Costs

Resource                                                     ~ Hours   ~ Cost
Web Hosting                                                  64        Provided by Client
Access to Adobe Creative Cloud                               NA        Provided by CSUMB
Captivate for Storyline and Design                           640       Owned by MIST student
Captivate Training for MIST Student Instructional Designer   16        $799 (funded by Client)

Timeline
In a meeting with the processing team, the following will be discussed to establish the
most appropriate materials and evaluation for the course design:
• Decide which job aids and other materials will be used
• Decide the grading scale and an acceptable error percentile upon completing the
training
• Agree upon the proposed timeline

Deliverables/Timeline

Task                           Duration   Start        Finish       Resource
Design                         42 days    08/01/2019   10/11/2019   Milestone
Course Outline                 5          08/01/2019   08/05/2019   MIST Student
Content Outline                5          08/06/2019   08/11/2019   MIST Student
Content Design Document        4          08/12/2019   08/16/2019   MIST Student
Storyboard                     7          09/17/2019   09/24/2019   MIST Student
Client Approval                4          09/23/2019   09/27/2019   MIST Student
Design Deliverables            11         09/26/2019   10/06/2019   MIST Student, Admissions
                                                                    Specialists, and managerial team
Formative Evaluation           4          10/07/2019   10/11/2019   MIST Student
Development                    10 days    10/12/2019   10/23/2019   Milestone
Website Host Setup             3          10/12/2019   10/15/2019   MIST Student, Admissions
                                                                    Communications Analyst
Multimedia Content             2          10/16/2019   10/18/2019   MIST Student, Admissions
                                                                    Processing Team
Website Draft                  1          10/19/2019   10/20/2019   MIST Student
Website Final                  2          10/21/2019   10/23/2019   MIST Student
Implementation                 5 days     10/24/2019   10/29/2019   Milestone
Edits/Updates                  5          10/24/2019   10/29/2019   MIST Student
Evaluation                     14 days    10/30/2019   11/14/2019   Milestone
Summative Evaluation Level 3   7          10/30/2019   11/06/2019   MIST Student
Summative Evaluation Level 4   7          11/07/2019   11/14/2019   MIST Student
Final Capstone                 23 days    11/15/2019   12/10/2019   Milestone
Capstone Summary               18         11/15/2019   12/03/2019   MIST Student
Capstone Presentation          5          12/05/2019   12/10/2019   MIST Student

• Total time for development: approximately four months.
• With current software and free public programs available for use, department budgeting
and additional expenses will not be necessary.

Evaluation
Formative
Specialists, who are the subject matter experts, will meet before and after the completion
of each training design module to examine course materials. The Communications Analyst will
also test the materials in the online host environment to ensure all links and materials work as
intended. In addition, developmental tests with additional Admissions employees will be
conducted at the end of each module to check for inconsistencies and incomplete materials in
the course. Because the Evaluators are often the staff requesting course entry, they know what
to expect from the course entry module, and whether the learner gains the knowledge necessary
to enter courses without errors. I will adjust this module based on feedback provided by the
Evaluators.
Summative
For my level 1 summative evaluation, and to collect both qualitative and quantitative responses, I will ask users to complete a questionnaire after they finish the training. The questionnaire will contain questions about usability, understanding, information retention, and suggested changes and improvements; it will also contain a section for additional comments. I will also collect data from a pre- and post-test. Partial testing has already been conducted on one module: I created a pre- and post-test for the course entry module and found that significant learning had occurred.
Despite some usability challenges, all users scored higher on the post-test than on the pre-test (Table 1.3). The pre- and post-test scores were compared manually, user by user. Because users took the tests in Google Forms, the results were exported to a Google Sheet. One point was granted for each correct answer on both the pre-test and the post-test. Once the scores were totaled manually, they were placed in an Excel file (Table 1.1) and compared for statistical significance. Only 12 of the 13 questions were graded and included in the learners' scores. The results were meant to determine whether the lesson had any effect on the user's understanding of how to enter coursework correctly in Oasis.
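The manual grading step described above (one point per correct answer, totaled per user) could equally be automated with a short script. The sketch below is illustrative only; the answer key and responses are placeholders, not the actual quiz content:

```python
# Hypothetical sketch of the grading step: one point per matching answer.
# ANSWER_KEY and the sample responses are placeholders, not the real quiz.
ANSWER_KEY = ["B", "A", "D", "C"]

def score(responses):
    """Grant one point for each answer that matches the key."""
    return sum(1 for given, correct in zip(responses, ANSWER_KEY)
               if given == correct)

pre_score = score(["B", "C", "D", "A"])   # two answers match the key
post_score = score(["B", "A", "D", "C"])  # all four answers match
print(pre_score, post_score)  # 2 4
```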
The post-test mean of 7.714 was significantly higher than the pre-test mean of 4.142 (Appendix C). To measure the results, a paired two-sample t-test for means was run for dependent samples with 6 degrees of freedom (Table 1.2). The data provide evidence of learning transfer and support the effectiveness of the course lesson. Since the hypothesis is directional, the one-tail values are the appropriate values for comparison. The t-statistic of 7.43 is much larger than the t-critical value of 1.94, and the p-value of 0.00015 is much smaller than the standard 0.05 alpha level. Therefore, the null hypothesis is rejected and the results are statistically significant.
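The paired t-test can be reproduced from the raw scores in Table 1.1. A minimal sketch using only the Python standard library follows (the p-value would additionally require a t-distribution CDF, e.g. from SciPy):

```python
from math import sqrt
from statistics import mean, stdev

# The seven users' scores from Table 1.1.
pre = [6, 5, 2, 4, 4, 4, 4]
post = [10, 8, 8, 8, 7, 7, 6]

# A paired t-test works with the per-user score differences.
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)  # 7 pairs, so df = n - 1 = 6

# t = mean difference / standard error of the differences
t_stat = mean(diffs) / (stdev(diffs) / sqrt(n))
print(round(t_stat, 2))  # 7.43, matching Table 1.2
```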
In addition, the effect size was calculated to determine whether the results were practically significant. The pre-test scores before training (M = 4.14, SD = 1.21) and the post-test scores after training (M = 7.71, SD = 1.25) differed significantly, t(6) = 7.43, p < .05, d = 2.85. The effect size of 2.85 is far larger than the conventional large-effect threshold of 0.8; thus, the observed difference between the two tests is practically significant, and the training had a meaningful effect on the learner. I plan to repeat this process with the final product, containing all modules, to determine its overall effect on the learner.
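As a check on the effect-size arithmetic: several formulations exist, and the reported 2.85 appears to correspond to dividing the mean difference by the post-test standard deviation (Glass's delta), while a pooled-SD Cohen's d gives a slightly different value. A sketch from the Table 1.1 scores:

```python
from math import sqrt
from statistics import mean, stdev

# The seven users' scores from Table 1.1.
pre = [6, 5, 2, 4, 4, 4, 4]
post = [10, 8, 8, 8, 7, 7, 6]

mean_diff = mean(post) - mean(pre)

# Glass's delta: mean difference scaled by the post-test SD.
glass_delta = mean_diff / stdev(post)

# Cohen's d with a pooled SD of the two score sets, for comparison.
pooled_sd = sqrt((stdev(pre) ** 2 + stdev(post) ** 2) / 2)
cohens_d = mean_diff / pooled_sd

print(round(glass_delta, 2), round(cohens_d, 2))  # 2.85 2.89
```

Either way, the value is well above the 0.8 threshold for a large effect.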
In addition, a level 2 evaluation will be based on the declarative and procedural test results. Specifically, if the learner successfully posts a transcript and enters all courses without errors, I will know that the user has gained an accurate understanding of the information provided throughout the training.

Pre-Test Score   Post-Test Score
6                10
5                8
2                8
4                8
4                7
4                7
4                6
Table 1.1
t-Test: Paired Two Sample for Means

                              Variable 1    Variable 2
Mean                          4.142857143   7.714285714
Variance                      1.476190476   1.571428571
Observations                  7             7
Pearson Correlation           0.46897905
Hypothesized Mean Difference  0
df                            6
t Stat                        -7.426106572
P(T<=t) one-tail              0.000153396
t Critical one-tail           1.943180281
P(T<=t) two-tail              0.000306792
t Critical two-tail           2.446911851
Table 1.2

Table 1.3
References
David, L. (2014, July 23). ARCS model of motivational design theories (Keller). In Learning Theories. Retrieved from https://www.learning-theories.com/kellers-arcs-model-of-motivational-design.html

Gutierrez, K. (2018, April 28). Adult learning theories every instructional designer must know. Retrieved from https://www.shiftelearning.com/blog/adult-learning-theories-instructional-design

Huitt, W. (2011). Motivation to learn: An overview. Educational Psychology Interactive. Valdosta, GA: Valdosta State University. Retrieved from http://www.edpsycinteractive.org/topics/motivation/motivate.html

Rothwell, W. J., Benscoter, G. M., King, M., & King, S. B. (2016). Mastering the instructional design process: A systematic approach (5th ed.). Hoboken, NJ: John Wiley & Sons.
Appendix A.

Appendix B.
Appendix C. Descriptive Statistics


Pre-Test Score                      Post-Test Score
Mean                4.142857143     Mean                7.714285714
Standard Error      0.459221465     Standard Error      0.473803541
Median              4               Median              8
Mode                4               Mode                8
Standard Deviation  1.214985793     Standard Deviation  1.253566341
Sample Variance     1.476190476     Sample Variance     1.571428571
Kurtosis            1.778772112     Kurtosis            1.492561983
Skewness            -0.366392178    Skewness            0.739707742
Range               4               Range               4
Minimum             2               Minimum             6
Maximum             6               Maximum             10
Sum                 29              Sum                 54
Count               7               Count               7
