
Evaluation Plan

Sunny Nash, Erika Lisi, Cable Davis

Department of Education, Concordia University

ETEC 636: Evaluation in Education and Training

Dr. Julie Corrigan

December 7, 2020
Executive Summary
Dr. Julie Corrigan and a team of researchers are conducting a research project to design and
develop instructional interventions that help Quebec teachers and students improve their
ability to evaluate online information. At the outset, the project focused on exploratory
research, including what participants disliked about the modules initially created for the
program. The evaluation plan that follows is organized around three stages: Year 1
(pilot testing of the program), Year 2 (quality of instruction relative to participants’
expectations), and Year 3 (assessing learners’ achievement).

While no budget is specified for the research project, the project is funded by a grant. The
process and outcome evaluations of this program will assess whether its activities led to
successful outcomes following the instructional interventions, and will identify any barriers
present. Evaluation questions were devised for each year; they examine the extent to which
teachers and students evaluate the credibility of online information, measure participant
satisfaction, ask how teachers use the interventions, and gauge the program’s effect on
learning and knowledge acquired. The design of the evaluation focus will lead to findings and
data that will be collected by Dr. Julie Corrigan and the team of researchers.
Stakeholders
Primary
Dr. Julie Corrigan
Secondary students in Québec
Secondary teachers in Québec
Other researchers on the project
Volunteers (partaking in study assessing how to approach online usage)

Secondary
Government of Québec, Political leaders
Other teachers and educators
Quebec school support staff

Tertiary
Other residents of Québec and Canada
Education and Media Researchers
Fonds de Recherche de Québec team
Intended Use and Users
A research project conducted by Dr. Julie Corrigan and a team of researchers aims to improve
digital literacy among Quebec students and teachers in secondary school settings. The desired
outcome of the project is to design and develop instructional interventions that help students and
teachers improve their ability to evaluate online information. The primary stakeholders’ focus is on
determining the criteria needed to assess trustworthy online content, evaluating how the
interventions affect student performance, and examining how teachers evaluate the credibility of
online resources. The intent is to execute the program by addressing how its activities and outputs
can improve current instructional methods.

Stakeholders’ uses focus on achieving desired outcomes of the initiative and the purpose of the
evaluation is to address the unique needs of primary, secondary and tertiary stakeholders.
Intended uses further detail resources, activities, outputs, and outcomes over time and reflect
intended results and effects of the interventions. Moreover, Dr. Corrigan will have access to
evaluation results of the direct and indirect effect of the processes.
Narrative Description
The Internet is increasingly part of today’s culture, especially for children and youth, for whom
schoolwork, online gaming, and social networking are among the most popular activities. However,
while the internet makes information more accessible than ever before, it also makes the quality of
that information more difficult to discern. Improving secondary students’ ability to properly
determine the credibility of online information involves teaching critical thinking skills that allow
them to evaluate online information and develop good judgement.

The basis of the evaluation program is the development of a series of instructional interventions
aimed at helping secondary school students improve and develop the 21st-century competency of
critical thinking. These interventions also examine how teachers and students interact with online
information. Evaluating the credibility of online information contributes to the practice of critically
assessing resources and improving perception.
Mission

To develop instructional interventions with Quebec teachers for Quebec students in
order to help students improve their ability to evaluate online information credibility.

Stage of development of program

This program is in the planning stage as it is finalizing the project scope, defining the
detailed work breakdown, assessing risk, identifying resource requirements, finalizing
the schedule, and preparing for the actual work.
Goals & Objectives of the Project
Through this project, students will:

● Develop the ability to critically evaluate online information
● Understand the multiple dimensions of critical evaluations
● Recognize markers for accuracy, relevancy, and credibility
● Investigate author credentials
● Detect bias and stance
● Calibrate information
● Negotiate multiple perspectives
● Cultivate the habit of referring to a set of questions when evaluating the credibility of
online information
● Employ reasoned judgements about the overall quality of information being presented
Goals & Objectives of the Project
Through this project, teachers will:

● Develop the ability to teach students how to critically evaluate online information
● Teach the multiple dimensions of critical evaluations to students
● Become familiar with the markers for accuracy, relevancy, and credibility so that
they can teach them to students
● Be able to explain the importance of investigating author credentials to students
● Be able to demonstrate how to detect bias and stance to students
● Be able to illustrate how to calibrate information to students
● Show students how to negotiate multiple perspectives
● Prepare a set of questions to give to students to use when evaluating the
credibility of online information
● Teach students how to use reasoned judgements about the overall quality of
information being presented
Visualizing the Evaluation Plan

In the following two slides, we created a simple logic model for the program in
order to better understand and focus on what we are evaluating.

This process is necessary for creating an evaluation plan because it allows for
organizing, explaining, and reflecting on an evaluation (Pell Institute, n.d.).

Note: The following figures were adapted from the Handbook of Practical Program Evaluation (p. 65) by
Newcomer et al. (2015).
Situation
First-year Québec secondary students are ill-equipped to evaluate online information

Resources / Inputs
● Dr. Corrigan
● Funding (Fonds de Recherche de Québec)
● Teachers
● Students
● Research Assistants
● Instructional Designers
● LMS

Activities
● Implement interventions for teachers
● Implement interventions for students
● Conduct record keeping on a regular basis for grant report

Outputs (#)
● Trained Teachers
● Trained Students
● Website/web resources
● Sponsors

Short-Term Outcomes
● Teachers have the knowledge and skills to teach students how to critically evaluate online information
● Students have the knowledge and skills to be able to critically evaluate online information

Intermediate Outcomes
● Teachers use their skills in the classroom
● Students are mindful when viewing online information for school
● Public awareness is created

Long-Term Outcomes
● Students think critically about information found online
● Consumption of false online information is reduced

External Factors
Schedule of the school year, COVID-19 pandemic, privacy concerns, language
barriers, politicization, misinterpretation of news sources, conspiracy theories,
influence of family/friends, participant inconsistency, resistance of teenagers
(delicate age-group), funding, time constraints.
(The same logic model is repeated here with the Evaluation Focus highlighted; as stated in the
Year 2 and Year 3 sections, the evaluation concentrates on the program’s outputs and
short-term outcomes.)
Creating Evaluation Questions
After creating the logic model of the program, we proceeded to create evaluation
questions to focus the evaluation plan.

This process is suggested by the CDC Program Evaluation Framework (CDC,


2019). We first created broad evaluation questions that focused on the program as
a whole and then refined the questions to better reflect the specific foci for each
year of the program.

After this first brainstorming session, we created evaluation design matrices that
informed our choices for proposed evaluation tools.
Evaluation Questions for Teachers
● How many teachers have been trained to teach online literacy and critical
thinking skills?
● Are teachers using their new skills in the classroom?
● What are teachers’ perceptions of online literacy and critical thinking?
● How satisfied are teachers overall with the interventions?
○ Specific interventions?
● Do teachers have any suggestions for improvement?
● How satisfied are the participants with the website/web resources?
Evaluation Questions for Students
● How many students have been taught/trained in online literacy skills?
● What are students’ perceptions of online literacy and critical thinking?
● How satisfied are students overall with the interventions?
○ Specific interventions?
● Do students have any suggestions for improvement?
● How satisfied are the participants with the website/web resources?
Proposed Evaluation Tools: Year 1

● Post-Pilot Teacher Interview
● Student Focus Group
● Observations
● Surveys
Evaluation Focus (Year 1)
Phase: After the pilot tests of the program

Evaluation type: Formative Evaluation

Purpose: Continuous program improvement

The evaluation for year one should compare the goals of the program with the
results from the pilot test. This evaluation should also be exploratory because it
will be searching for areas of improvement and gauging the reaction of the
participants.
Evaluation Design Matrix Year 1

Evaluation Question 1: What are the students’ reactions to the interventions?
● Indicators: Attitudinal and affective reaction of the students
● Information Source: Students who have received the intervention
● Sampling: All students should be accounted for
● Data collection method: Focus groups; surveys
● Quality Assurance: Focus groups led by an experienced researcher; a set procedure and question list; avoiding repetition of questions across focus groups and survey

Evaluation Question 2: What are the teachers’ reactions to the interventions?
● Indicators: Attitudinal and affective reaction of the teachers
● Information Source: Teachers who have taught students using the interventions
● Sampling: All teachers should be accounted for (estimated 6)
● Data collection method: Interviews
● Quality Assurance: Interviews conducted by trained researchers; a set interview protocol in place

Evaluation Question 3: Are teachers using the interventions as anticipated?
● Indicators: Performance objectives created during the planning of the project
● Information Source: Teachers who have taught students using the interventions
● Sampling: All teachers should be accounted for (estimated 6)
● Data collection method: Observation; checklists
● Quality Assurance: Observations conducted by trained researchers; a set observation protocol in place; a checklist will be created based on the program objectives
Tool: Post-Pilot Teacher Interview
An interview protocol exploring the project before its first implementation was already
developed for this project; the interview protocol proposed here is for after the pilot test
in Year 1.

The focus of this interview protocol is the teachers’ affective reaction to the interventions,
as well as their suggestions for improvement. It serves not only as a tool for evaluating the
success of the program but also as a way of involving the teachers as stakeholders in the
program.
Planning the Interview and Quality Assurance
● Teachers who volunteer to pilot the interventions in the classroom need to be
willing and able to give an interview after the pilot class.
● Interviewers need to be trained or experienced in giving interviews
● Use the interview protocol, but it should be viewed as a flexible and “living”
document
● Respect the privacy concerns of the participants
Interview Protocol for Teachers
An interview protocol document can be found here.

The text of the document can also be found in the speaker notes of this slide.
Analyzing the Data from the Interviews
The data from the interviews should be coded, an iterative and cyclical process of
assigning brief phrases (no more than five words) to the relevant data
(Methodology Related Presentations - TCSPP, 2019).

While certain themes and phrases may emerge on their own, it is important to have broad
categories in mind based on an initial reading of the data and the goals of the
evaluation (Harding, 2015).

Some suggested broad categories to begin with are: Affective Reaction to the
Interventions, Program Strengths, and Program Areas of Improvement.
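
To make the tallying step concrete, here is a minimal, illustrative Python sketch (not part of the formal protocol) that counts how often each code appears within the three broad categories. It assumes the coded segments have been exported to a CSV file with hypothetical column names "code" and "category"; any spreadsheet export with equivalent columns would work.

    # Minimal sketch: tallying interview codes into the broad categories above.
    # The CSV layout and file name are assumptions for illustration only.
    import csv
    from collections import Counter

    CATEGORIES = [
        "Affective Reaction to the Interventions",
        "Program Strengths",
        "Program Areas of Improvement",
    ]

    def tally_codes(path):
        """Count how often each short code appears within each broad category."""
        counts = {cat: Counter() for cat in CATEGORIES}
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                cat = row["category"]
                if cat in counts:
                    counts[cat][row["code"].strip().lower()] += 1
        return counts

    if __name__ == "__main__":
        for cat, codes in tally_codes("coded_interviews.csv").items():
            print(cat)
            for code, n in codes.most_common(5):
                print(f"  {n:3d}  {code}")

Reviewing the most frequent codes per category gives a starting point for the next iterative coding pass.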
Tool: Post-Pilot Student Focus Group
● Students should also get their chance to participate in the evaluation of this
program as stakeholders.
● The proposed class size (6 students) for the pilot tests lends itself to focus
groups (Mod•U: Powerful Concepts in Social Science, 2016).
● A focus group could be a natural extension of the class. After the class
finishes, the students can be asked their thoughts on the lesson.
● Students in a focus group may feel more comfortable with their peers, rather
than alone with a researcher
Additional Possible Evaluation Tools: Year 1
Classroom observation would serve as a way to measure the affective reaction of
the participants as well as the engagement between students and
teachers. An outline of a proposed observation protocol can be found in the Year 2
section, here.

Surveys or questionnaires could be done in addition to the interviews and focus
groups to gather more quantitative data on the reactions of participants.

UX testing could also be done on the interventions that are web-based.

Use, Dissemination, and Sharing: Year 1
The results from all the evaluation tools from Year 1 will be used to formatively
evaluate the success of the interventions. The results should first go to the
researchers and developers of the interventions in order for them to make
adjustments based on the results.

These results will shape what the interventions look like in their second year of
testing for their wider roll-out. Researchers can adjust the interventions to create
more positive affective reactions from the participants and integrate the
suggestions for improvement to increase stakeholder participation.
Proposed Evaluation Tools: Year 2

● Observation
Evaluation Focus (Year 2)
This evaluation plan is focused on measuring the outputs and short-term outcomes of the project:

● Phase: New program
● Type: Process evaluation
● Purpose: Continuous program improvement (improve or fine-tune existing program
operations, processes, or strategies)
● Users: Who will employ the evaluation findings? The stakeholders involved in program
improvement such as the instructors and those involved in the program’s development.
● Uses: To measure whether the actual program is faithful to the initial plan, to locate areas of
improvement in the course delivery, and to increase student engagement.
● Methods: Observation and feedback technique using an adapted version of the Teaching
Dimensions Observation Protocol (TDOP).
● Agreement: The evaluation will be conducted by research assistants that have received
training on the protocol being employed.
Tool: Observation

Observation Protocol: Teaching Dimensions Observation Protocol (TDOP)

Goals of observation: To improve the quality of the program by measuring student engagement*
and clarifying expectations for effective teaching and learning**, and helping teachers meet those
expectations through high quality feedback and support.

*For the purpose of this evaluation, the term student engagement refers to individual student learning, students’ involvement in
governance and leadership, and student engagement with respect to identity, including the extent to which benefits vary for different
students. Broadly, it refers to the degree of attention, curiosity, interest, optimism, and passion that students show when they are
learning or being taught, which extends to the level of motivation they have to learn and progress in their education (Balwant, 2018).

**For the purpose of this evaluation, and in response to the program’s content, effective teaching and learning is described as being
student-centered. Student-centered instruction is where students and instructors share the focus, instead of the focus being only
centered on the teacher. Instead of listening to the teacher exclusively, students and teachers interact equally. Group work is
encouraged, and students learn to collaborate and communicate with one another while also having the opportunity to direct their own
learning (Weimer, 2013).
Evaluation Matrix (Year 2)
Evaluation Matrix (Year 2) Cont’d
Structure of TDOP
(Wisconsin Center for Education Research, n.d.)
Observation Indicator Dimensions
(Wisconsin Center for Education Research, n.d.)

The following dimensions were chosen as they best suit the needs for the year 2
evaluation which is to assess program effectiveness and student engagement.

Basic Dimensions:

● Dimension 1: Instructional Practices (teacher-focused vs. student-focused)


● Dimension 2: Teacher-Student Dialogue (teacher-led or student-led)
● Dimension 3: Instructional Technology (used for teaching)

Optional Dimension:

● Dimension 4: Potential Student Cognitive Engagement


The TDOP coding bank template was adapted to fit the needs of this evaluation. In doing so, the three basic dimensions were used, and we added
the additional dimension of ‘student engagement’. Furthermore, the coding indicators were also modified to suit our needs, which resulted in some
being removed and others added (the codes that were added appear with an asterisk in the coding template provided).
Classroom Observation Indicators Dimensions (Wisconsin Center for Education Research, n.d.)
Basic Dimensions:
Dimension 1: Instructional Practices (student-centered vs. teacher-centered)
Sixteen indicators were coded to assess whether instructional practices are mostly student-centered (8 indicators) or
teacher-centered (8 indicators). Ideally, instructional practices should mostly be student-centered (majority of codes
selected should fall within the student-centered category). The benefits of conducting student-centered instruction are that it
improves instruction, improves retention and knowledge, boosts performance, develops problem-solving skills, fosters
communication and collaborative learning skills, increases engagement and facilitates personalized learning (Balwant,
2018).
Dimension 2: Student-Teacher Dialogue (student-led vs. teacher-led)
Six indicators were coded to assess the student-teacher dialogue in classroom learning, evaluating whether the
instructional dialogue is mostly student-led (3 indicators) or teacher-led (3 indicators).
Ideally, classroom learning should be indicative of desirable practices such as students working together, effective lecturing
techniques, or learner-centered instruction (Balwant, 2018). Student-led learning can empower and motivate pupils to drive
better personal learning outcomes (Balwant, 2018) and for the purpose of this intervention, most indicators selected should
fall within this category.
Dimension 3: Instructional Technology (used for teaching)
Seven indicators were coded to assess the degree to which different instructional technologies are incorporated in the
classroom. Incorporating a variety of instructional technologies in the classroom is important in order to maintain the focus
on the students and to improve effectiveness of the material being presented.
Optional Dimension: Dimension 4: Potential Student Engagement
For the purpose of this evaluation, we chose one optional dimension: potential student engagement. Five indicators were
coded to assess whether students appear to be engaged with the course content.
Steps in using the Observation Protocol
(Wisconsin Center for Education Research, n.d.)

1. Rationale for conducting observations and desired outcomes: to provide formative data for peer
observations and/or professional development consultations with instructors (primary goal of informing and
enhancing teaching and learning, as well as to increase student engagement)
2. Determine dimensions in the TDOP that are to be used (we chose the 3 basic dimensions (instructional
practices, student-teacher dialogue, & instructional technology) and 1 additional dimension (student
engagement) as these best meet our needs).
3. Codes required within the selected dimensions (the codes were adapted from the TDOP template, however,
it should be noted that some were removed and others added to better suit the needs of this particular
evaluation).
4. Create observation template (cover sheet and code sheet).
5. Conduct training for raters to become acquainted with the protocol and engage in inter-rater reliability (IRR)
testing (RAs need prior training with the coding system to ensure proper use and optimal results; a minimal IRR sketch follows this list).
6. Conduct observation (observation to be conducted during all modules and for all 6 complete classes
included in study for year 2).
7. Analyze and interpret data (use data obtained from observation to detect areas of improvement).
8. Provide feedback (to instructors and other stakeholders that have an interest in improving the program).
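
As a rough illustration of the IRR testing mentioned in step 5, the Python sketch below computes percent agreement and Cohen’s kappa for two raters who coded the same class session; the interval-by-interval code sequences are invented examples, not real TDOP data.

    # Minimal sketch of an IRR check: percent agreement and Cohen's kappa
    # for two raters who coded the same two-minute intervals.
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two equal-length sequences of categorical codes."""
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        # Chance agreement: product of each code's marginal frequencies.
        expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
        return (observed - expected) / (1 - expected)

    # Invented example: dominant code per interval, one session, two raters.
    a = ["LEC", "SGW", "SGW", "DW", "LEC", "AT"]
    b = ["LEC", "SGW", "DW", "DW", "LEC", "AT"]
    agree = sum(x == y for x, y in zip(a, b)) / len(a)
    print(f"agreement = {agree:.2f}, kappa = {cohens_kappa(a, b):.2f}")  # 0.83, 0.78

Kappa below an agreed threshold would signal that raters need further training before observing the six classes.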
Observation templates (year 2)
The TDOP templates were adapted to fit the needs of this evaluation. In doing so, the three basic dimensions were used, and
we added the additional dimension of ‘student engagement’. Furthermore, the coding indicators were also modified to suit our
needs, which resulted in some being removed and others added (the codes that were added appear with an asterisk in the
coding template provided).

1. Observation Protocol Cover Sheet

2. Observation Code Bank

3. Observation Coding Template

± These templates were adapted from: Hora, M. T., Oleson, A., & Ferrare, J. J. (2013). Teaching Dimensions Observation Protocol (TDOP) User’s
Manual. Madison, WI: Wisconsin Center for Education Research, University of Wisconsin-Madison.
How to use the TDOP and evaluate data
(Wisconsin Center for Education Research, n.d.)

Conducting Observation: Using a hard copy or an online version of the adapted
TDOP template created for this evaluation, observers will circle codes for each
behavior observed during every two-minute interval of a class session. Moreover,
the observer will have the opportunity to add comments that may be of value in
assessing the data collected for this intervention.

Analyzing and Interpreting Data: Finally, the data will need to be analyzed and
interpreted by users. For the purpose of this observation, the format of our
analyses will focus on teaching efficacy in terms of content delivery and student
engagement, and will focus on using the acquired data to make program
improvements and provide instructors with valuable feedback to improve teaching
practices.
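
As one concrete and purely illustrative way to carry out this analysis, the sketch below takes the codes circled in each two-minute interval and computes the share of intervals containing at least one student-centered versus teacher-centered behavior. The code abbreviations and their grouping are placeholders standing in for our adapted TDOP code bank, not the official one.

    # Minimal sketch of the analysis step: given the codes circled in each
    # two-minute interval, compute what share of intervals contained at
    # least one student-centered vs. teacher-centered code.
    # Code sets below are illustrative placeholders, not the real code bank.
    STUDENT_CENTERED = {"SGW", "DW", "CQ", "SP"}   # e.g. small-group work, deskwork
    TEACHER_CENTERED = {"LEC", "LW", "ADM"}        # e.g. lecturing, admin tasks

    def interval_shares(intervals):
        """intervals: list of sets of codes circled in each 2-minute interval."""
        n = len(intervals)
        student = sum(bool(iv & STUDENT_CENTERED) for iv in intervals) / n
        teacher = sum(bool(iv & TEACHER_CENTERED) for iv in intervals) / n
        return student, teacher

    # Invented five-interval session.
    session = [{"LEC"}, {"LEC", "CQ"}, {"SGW"}, {"SGW", "DW"}, {"LEC"}]
    s, t = interval_shares(session)
    print(f"student-centered in {s:.0%} of intervals, teacher-centered in {t:.0%}")

Comparing these shares across sessions would give instructors concrete, interval-level feedback on how student-centered their delivery was.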
Proposed Evaluation Tools: Year 3

Summative Evaluation:
● Self-Assessment
Evaluation Questions for Teachers and Students (Year 3)
Quantitative questions:

● To what extent did the program adhere to and contribute to the intended outcomes?
● To what extent did teachers and students have access to means and resources to improve their
skillsets in evaluating online information?
● To what extent are teachers and students evaluating online information analytically?
● Did grants cover the cost of the program and delivering activities?

Qualitative questions:

● How did features of the program contribute to a positive learning experience?
● What component(s) of the program need to be modified to enhance the structure?
● What parts of the program corresponded with recognizing previous skills teachers and students
already had?
Evaluation Focus (Year 3)
Phase: At end of program
Type: Outcome evaluation
Purpose: Summative evaluation to evaluate learning and practice
Users: Teachers and students
Uses: To measure learners’ achievement and impact of the project
Methods: Behavioral competency self-assessment form

A summative evaluation takes place following the formative assessments in years one and two,
examining the behaviors demonstrated that contribute to individual performance in evaluating
online information related to the research project.

This evaluation plan is focused on measuring the outputs and short-term outcomes of the
project; these are the focus of the evaluation at the request of the principal investigator.
Summative Evaluation Year 3 - Gathering Data
To assess teachers’ and students’ competencies in evaluating online information, a
summative evaluation will be used to measure the short-term outcomes in the logic model used
as the evaluation focus. As an outcome evaluation, a self-assessment will be created
to rate how well teachers and students mastered the goals and objectives of the project while
also giving them the opportunity to reflect on areas needing improvement.

Sample: Teachers and students

Collect and retrieve data: Data will be collected via an online medium (e.g., Google Forms)
Manage data
Combine
Analyze
Visualize
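
To illustrate the collect/manage/combine/analyze steps, a minimal Python sketch follows. It assumes the Google Forms responses are exported to a CSV with a hypothetical "role" column and one column per competency holding a 1-5 self-rating; the file name, column names, and competency names are illustrative only.

    # Minimal sketch: summarize self-assessment ratings per respondent group,
    # assuming a CSV export of the (hypothetical) Google Form described above.
    import csv
    from statistics import mean

    COMPETENCIES = ["investigates_author", "detects_bias", "checks_sources"]

    def summarize(path, role):
        """Mean 1-5 self-rating per competency for one respondent group."""
        with open(path, newline="", encoding="utf-8") as f:
            rows = [r for r in csv.DictReader(f) if r["role"] == role]
        return {c: mean(int(r[c]) for r in rows) for c in COMPETENCIES}

    for role in ("teacher", "student"):
        print(role, summarize("self_assessment_responses.csv", role))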
Evaluation Design Matrix (Year 3)

Evaluation Question: To what extent did the program adhere to and contribute to the intended outcomes?
● Indicators: Teacher and number of students (20-30 in a classroom) who can name credible news sources
● Data Source: Formative assessments from years 1 and 2
● Data collection method: Questionnaire

Evaluation Question: To what extent did teachers have access to means and resources to improve their skill sets in evaluating online information?
● Indicators: Teacher satisfaction with the online modules, as well as access to additional aids to accompany instruction for remedial activities
● Data Source: Interview protocol
● Data collection method: Focus groups/interviews

Evaluation Question: To what extent are teachers and students evaluating online information analytically?
● Indicators: Percentage of teachers and students who examine website credentials and whether information is recent
● Data Source: Observation protocol
● Data collection method: Observation

Evaluation Question: Did grants cover the cost of the program and delivering its activities?
● Indicators: Resources were used appropriately and effectively and met the financial target
● Data Source: Dr. Corrigan and the team of researchers
● Data collection method: Secondary data analysis
Evaluation Design Matrix (Year 3) Cont’d

Evaluation Question: How did features of the program contribute to a positive learning experience?
● Indicators: Increased knowledge and engagement
● Data Source: Formative assessments from years 1 and 2
● Data collection method: Survey

Evaluation Question: What component(s) of the program need to be modified to enhance the structure?
● Indicators: Number and types of activities that reinforce best practices
● Data Source: Formative assessments from years 1 and 2
● Data collection method: Questionnaire

Evaluation Question: What parts of the program corresponded with recognizing previous skills teachers and students already had?
● Indicators: Teachers’ and students’ self-knowledge of online sources and their reputability and legitimacy
● Data Source: Interview protocol
● Data collection method: Focus groups/interviews
Competency Self-Assessment Form
To evaluate the minimum level of achievement specified, a behavioral competency
assessment form can be found here.
Triangulation
According to Corrigan (2020), incorporating more than one data collection method improves
credibility. In this instance, the research project’s baseline is currently focused on what
students can currently do and how teachers perceive the interventions currently in place. To
improve the evaluation of online information credibility, using several data collection methods
with different instruments lets us see the factors and aspects influencing how the program
will help with online literacy. It was important for us to include both formative and summative
evaluation methods to examine changes in different ways; with observations, interviews, focus
groups, and a self-assessment, the results obtained can bring forth different findings.

To evaluate the impact of the program, the findings obtained from these methods allow for
triangulation of assessment: the methods complement each other, checking reliability as well
as uncovering new information (Ammenwerth, Iller, & Mansmann, 2003).
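
One simple, illustrative way to operationalize such a convergence check is sketched below: for participants measured by two different instruments, a correlation between the two scores indicates whether the methods point in the same direction. The participant IDs and scores are invented for illustration, and this is only one of many possible triangulation analyses.

    # Minimal sketch of a convergence check across two instruments
    # (all values invented). Requires Python 3.10+ for statistics.correlation.
    from statistics import correlation

    self_assessment = {"p1": 4.2, "p2": 3.1, "p3": 4.8, "p4": 2.9}  # 1-5 ratings
    observation = {"p1": 0.70, "p2": 0.55, "p3": 0.82, "p4": 0.40}  # engagement share

    # Only compare participants measured by both instruments.
    shared = sorted(self_assessment.keys() & observation.keys())
    r = correlation([self_assessment[p] for p in shared],
                    [observation[p] for p in shared])
    print(f"n={len(shared)}, Pearson r={r:.2f}")  # high r suggests convergent findings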
Use, Interpretation, Dissemination, and Sharing Plan
Data obtained and uncovered by this evaluation will be used to gain insight into how to create
appropriate and effective instructional interventions that result in successful outcomes.
To verify whether activities were implemented as intended, interpretation of the data can identify how
activities should be modified or refined. With formative, summative, qualitative, and quantitative data,
we can then share final results with primary, secondary, and tertiary stakeholders.

For the proposed sharing plan, with results delivered to the different stakeholders at each phase
after each year, the program’s data can be shared as a typed, written report detailing the
experiences and perceptions of the participants.

While not every unintended consequence can be accounted for, taking steps to ensure the quality
of the data from the evaluation tools and involving the stakeholders throughout the entire process
can help mitigate negative consequences. Positive outcomes may involve raising awareness about
the importance of evaluating credibility of web sources. Negative outcomes may include not
producing outputs according to stakeholder requirements.
References
Abrami PC, Bernard R.M., Borokhovski E., Wadem A., Surkes M.A., Tamim R., Zhang D. (2008). Instructional interventions affecting critical thinking skills and
dispositions: A stage 1 meta-analysis. Rev. Educ. Res. 78:1102–1134.

Alkin, M. C., & Vo, A. T. (2017). Evaluation essentials, 2nd Ed.: From A to Z. ProQuest Ebook Central.
https://ebookcentral-proquest-com.lib-ezproxy.concordia.ca/lib/concordia-ebooks/detail.action?docID=5050337

Ammenwerth, E., Iller, C., & Mansmann, U. (2003). Can evaluation studies benefit from triangulation? A case study. International Journal of Medical
Informatics, 70, 237-248. doi:10.1016/S1386-5056(03)00059-5

Balwant, P. T. (2018). The meaning of student engagement and disengagement in the classroom context: lessons from organisational behaviour. Journal of Further
and Higher Education, 42(3), 389–401.

CDC. (2018, August). Evaluation Briefs: Data Collection Methods for Program Evaluation: Interviews.
https://www.cdc.gov/healthyyouth/evaluation/pdf/brief17.pdf

CDC. (2019a, July 15). Framework Step 1 Checklist | Program Evaluation. https://www.cdc.gov/eval/steps/step1/index.htm

CDC. (2019b, July 15). Framework Step 3 Checklist | Program Evaluation. https://www.cdc.gov/eval/steps/step3/index.htm

Corrigan, J. (2020). Step 4 of UFE: Gather Credible Evidence [PowerPoint slides]. Retrieved from
https://moodle.concordia.ca/moodle/pluginfile.php/4550804/mod_resource/content/1/Slides%20for%20Step%204%20-%20Gather%20credible%20evidence%2C%
20Parts%201%20-%202.pdf

Gessler, M. (2009). The correlation of participant satisfaction, learning success and learning transfer: an empirical investigation of correlation assumptions in
Kirkpatrick’s four-level model. International Journal of Management in Education, 3, (3-4): 346- 358. DOI: 10.1504/ijmie.2009.027355

Harding, J. (2015). Identifying Themes and Coding Interview Data: Reflective Practice in Higher Education. SAGE Publications, Ltd.
https://doi.org/10.4135/9781473942189
References
Hora, M. T. (2014). Exploring the Use of the Teaching Dimensions Observation Protocol to Develop Fine‐grained Measures of Interactive Teaching in
Undergraduate Science Classrooms. Retrieved from Semantic Scholar.

Hora, M., & Ferrare, J. (2010). The Teaching Dimensions Observation Protocol (TDOP). Madison, WI: University of Wisconsin-Madison, Wisconsin Center for
Education Research. Retrieved from wceruw.org.

International Society for Technology in Education. (2020). Digital Literacy Assessments. We are ISTE.
https://www.iste.org/standards/seal-of-alignment/digital-literacy-assessment

Newcomer, K. E., Hatry, H. P., & Wholey, J. S. (2015). Handbook of practical program evaluation. ProQuest Ebook Central
https://ebookcentral-proquest-com.lib-ezproxy.concordia.ca (p.348)

Methodology Related Presentations - TCSPP. (2019, November 9). Coding Qualitative Data: A Practical Guide to Completing Qualitative Data Analysis.
https://www.youtube.com/watch?v=4KOpSG7myOg&feature=emb_title

Mod•U: Powerful Concepts in Social Science. (2016, October 19). How Focus Groups Can Help Your Research: Qualitative Research Methods.
https://www.youtube.com/watch?v=ng8SnDIre4o&feature=emb_title

Pell Institute. (n.d.). Using a Logic Model. Retrieved December 7, 2020, from http://toolkit.pellinstitute.org/evaluation-guide/plan-budget/using-a-logic-model/

Petrucco, C. & Ferranti, C. (2017). Developing Critical Thinking in online search. Journal of e-Learning and Knowledge Society, 13.
References
U.S. Dept. of Education, Reform Support Network. (2015, July). Using Observations to Improve Teacher Practice: How States Can Build Meaningful
Observation Systems. Retrieved from ed.gov.

U.S. Dept. of Health and Human Services. (2013, October 9). Card Sorting. Usability.Gov; Department of Health and Human Services.
https://www.usability.gov/how-to-and-tools/methods/card-sorting.html

Weimer, M. (2013). Learner-Centered Teaching: Five Key Changes to Practice. San Francisco, CA: A Wiley Imprint.
