
MOE DATA ANALYTICS TRAINING

REPORT
JANUARY 26, 2021
ACCELERATED QUALITY EDUCATION FOR
LIBERIAN CHILDREN

MOE DATA ANALYTICS TRAINING REPORT


January 26, 2021

USAID/LIBERIA ABE: ACCESS IDIQ CONTRACT


AID-OAA-I-14-00073/AID-669-TO-17-00001
Prepared for:
Andrea Plucknett, Contracting Officer (CO)
Office of Acquisition and Assistance
United States Agency for International Development/Liberia
c/o American Embassy
502 Benson Street
Monrovia, Liberia

Prepared by:
Education Development Center
43 Foundry Avenue
Waltham, MA 02453-8313
USA
ACCELERATED QUALITY
EDUCATION FOR LIBERIAN
CHILDREN (AQE)

REPORT

AQE/MOE
Data Analytics Training

Compiled: January 26, 2021


Table of Contents

Acronyms and Abbreviations ......................................................................................................................... 1

1. Training Background and Context............................................................................................................... 2


1.1 Introduction ............................................................................................................................................. 2
1.2 Training Participants ................................................................................................................................ 3

2. Description of Activity ................................................................................................................................ 4


2.1 Training Design ........................................................................................................................................ 4
2.2 Training Preparation ................................................................................................................................ 4
2.3 Training Methodology ............................................................................................................................. 5
2.4 Monitoring and Evaluation ...................................................................................................................... 5

3. Achievements ............................................................................................................................................ 5
3.1 Planning and organization ....................................................................................................................... 5
3.2 Delivery .................................................................................................................................................... 6
3.3 Pre & Post Tests Results........................................................................................................................... 6

4. Fidelity of Implementation ......................................................................................................................... 7

5. Training Evaluation .................................................................................................................................... 7

6. Feedback from ‘3, 2, 1’ Evaluation .............................................................................................................. 8

7. Lessons Learned ....................................................................................................................................... 10

8. Quotes from Participants ......................................................................................................................... 10

9. Recommendations ................................................................................................................................... 11

10. Photo Gallery ......................................................................................................................................... 12

11. Annexes ................................................................................................................................................. 13


Training Program......................................................................................................................................... 13
Acronyms and Abbreviations

AE        Alternative Education

API       Application Programming Interface

AQE       Accelerated Quality Education for Liberian Children

EDC       Education Development Center, Inc.

EMIS      Education Management Information System

MOE       Ministry of Education

Power BI  Microsoft Power Business Intelligence

TWG       Technical Working Group

URL       Uniform Resource Locator

USAID     United States Agency for International Development



1. Training Background and Context

1.1 Introduction

As the USAID-AQE project comes to an end in July 2021, the MOE will assume direct implementation of the ALP. The MOE, in collaboration with the USAID-AQE Project, has therefore embarked on a transition and sustainability approach that will allow the Ministry to run ALP classes alongside conventional lower basic classes during regular school time in the morning, or in the morning and afternoon. The MOE will also take full responsibility for the continuous monitoring, evaluation and reporting on both conventional and ALP schools. MOE M&E Officers at the county level work along with and support the County Education System to monitor, evaluate and make evidence-based decisions at the county level. At the national level, the Division of Planning, Research and Development calibrates the data processing system and ensures that evidence, based on the data gathered, is available on a timely basis to inform decisions benefiting the school system. EMIS supports the MOE with monitoring data for decision-making. All of these structures are expected to work together to ensure that decisions in the education sector are made based on data that have been processed and analyzed.

About AQE: The USAID-Accelerated Quality Education for Liberian Children Project (AQE) is a 4-year intervention designed to increase access to primary education for approximately 48,000 over-age and out-of-school children (ages 8 – 15 years), allowing them to re-enter formal schooling in a shorter amount of time, thus giving them a boost to pursue further education or training. Under Result 1, the Accelerated Quality Education for Liberian Children project supported the MOE to adopt national ALP policies, standards, strategies and processes, including the learner eligibility policy, the certification policy, a national ALP curriculum, and the ALP school quality assessment standards, tools and process. The activity trained CEOs on the usage of ALP EMIS monitoring data for decision-making and on budgeting, supported DEOs to visit centers and oversee their accreditation and certification, and trained school principals in supervising their ALPs. Finally, the activity strengthened community awareness of ALP policies and opportunities, trained PTAs on the ALP framework, and created regular feedback loops between communities and the local education authorities.

The USAID-AQE M&E Team conducted the AQE/MOE Data Analytics training from January 19 – 21, 2021 at the Cape Hotel in Mamba Point, Monrovia. Attendants included MOE central-level staff from the Division of Planning, Research & Development (M&E), as well as from the AE Division and the EMIS Department. The training also included M&E Officers from across the six counties (Grand Bassa, Margibi, Montserrado, Bong, Lofa and Nimba) where USAID-AQE implements its activity. Cumulatively, 28 persons attended the training. The USAID-AQE Database Manager led the training delivery with support from the M&E Coordinator and the two Transitional M&E Officers. This training, organized in response to the request of the Assistant Minister for Planning, Research and Development at the MOE, is an innovative move in meeting "AQE Intermediate Result 1: ALP Regulatory Framework Institutionalized". The Data Analytics Training builds on previous training in KoBo Toolbox that AQE offered to 60 education officers. This time, emphasis was on data analysis and how to link data collected in KoBo with Power BI dashboards for effective visualization, interpretation and evidence-based decision-making.

The training was strictly hands-on, using a mix of experiential and participatory learning approaches, which enabled participants to enhance their knowledge through practice and simulation. AQE designed the training to help build the capacity of the MOE, i.e., the Division of Planning, Research and Development, and EMIS, as well as that of M&E Officers at the county level. To help achieve this goal, the Data Analytics Training outlined the following objectives and outcomes:



Objectives:
By the end of the training, participants would be able to:
• Develop data collection mechanisms in KoBo Toolbox
• Download and export data from KoBo Toolbox
• Link datasets in KoBo to Power BI
• Analyze and visualize data with Power BI

Outcomes:
• Participants create their individual KoBo accounts and are able to develop sample databases in
KoBo and input some basic data.
• Participants create and activate their own Power BI accounts and logins and learn the various
interfaces in Power BI Desktop and Power BI Service.
• Participants are able to review the KoBo Toolbox browser interface, access the KoBo API
interface and generate a link from KoBo to Power BI.
• Participants understand how to create relationships between related data sources; that is, they
understand the flow and usage of Power BI, accessing and connecting to various data sources
(Excel files and API links) and importing these into Power BI.
• Participants know how to build reports with various types of aggregations and filters and
understand the various types of possible visualizations in Power BI and how to use graphs.
• Participants know how to create powerful reports and dashboards using Power BI and are able to
publish their reports and dashboards on the Internet and view them using laptops, tablets or
smartphones. In short, participants are able to generate tables and graphs and publish results on the
Internet.
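
To illustrate what the KoBo-to-Power BI link looks like under the hood, the short Python sketch below (not part of the official training materials) pulls submissions from a form through the KoBo Toolbox REST API; the same data.json URL can be pasted into Power BI's "Get Data > Web" connector. The server address, form UID and API token shown are placeholders, not values from the training.

# Minimal sketch: fetch submissions from a KoBo Toolbox form via its REST API.
# The asset UID and API token below are placeholders; substitute your own.
import requests

KOBO_SERVER = "https://kf.kobotoolbox.org"   # or a self-hosted KoBo server
ASSET_UID = "aXXXXXXXXXXXXXXXXXXXXXX"        # hypothetical form UID
API_TOKEN = "your-kobo-api-token"            # from the KoBo account settings

# The same JSON endpoint can be used as the URL in Power BI's Web connector.
url = f"{KOBO_SERVER}/api/v2/assets/{ASSET_UID}/data.json"
response = requests.get(url, headers={"Authorization": f"Token {API_TOKEN}"})
response.raise_for_status()

submissions = response.json()["results"]     # list of submission records
print(f"Downloaded {len(submissions)} submissions")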

1.2 Training Participants


The Data Analytics Training brought together fifteen (15) technical staff from the MOE Planning, Research and Development Division, the AE Division, and the EMIS Department at the central level of the ministry. It also brought together six (6) County M&E Officers from selected counties. These are believed to be some of the most critical teams involved in the data collection and processing needed for evidence-based decision-making, especially at this point of transition and backstopping of the ALP beyond AQE. The table below gives a summary of attendance during the training, including USAID-AQE team members.

Table 1: Data Analytics Training Attendance Summary (M = male, F = female)

County/Office   | MOE Planning, Research & Development (M/F) | MOE Alternative Education (M/F) | MOE EMIS (M/F) | MOE County M&E Officers (M/F) | USAID-AQE Tech Team & Senior Management (M/F) | USAID-AQE M&E Team (M/F) | Total Attendance (M/F/All)
Monrovia Office | 9/0 | 3/1 | 1/1 | N/A | 1/2 | 2/0 | 16/4/20
Bong            | N/A | N/A | N/A | 1/0 | N/A | 1/0 | 2/0/2
Grand Bassa     | N/A | N/A | N/A | 1/0 | N/A | 0/0 | 1/0/1
Lofa            | N/A | N/A | N/A | 1/0 | N/A | 0/0 | 1/0/1
Margibi         | N/A | N/A | N/A | 1/0 | N/A | 0/0 | 1/0/1
Montserrado     | N/A | N/A | N/A | 1/0 | N/A | 1/0 | 2/0/2
Nimba           | N/A | N/A | N/A | 1/0 | N/A | 0/0 | 1/0/1
Total           | 9/0 | 3/1 | 1/1 | 6/0 | 1/2 | 4/0 | 24/4/28
2. Description of Activity
2.1 Training Design
The AQE MOE Data Analytics Training was a 3-day, technology-based training designed to enhance the capacity of the MOE through its Division of Planning, Research and Development. The training also targeted the Division of Alternative Education and EMIS. In addition, in order to decentralize the MOE's capacity in data analytics, the training also targeted MOE County M&E Officers from Bong, Lofa and Nimba (inland counties) as well as Grand Bassa, Margibi and Montserrado (coastal counties). The training was arranged into interactive sessions, with content and activities focused on a range of outcomes. Based on the caliber of the participants attending, a mix of experiential and participatory approaches was incorporated into the training design. The training included several practice and simulation activities. A pre-test and post-test were administered to measure the knowledge participants gained from the training. In order to assess the effectiveness of the training, the training design also allowed for participants' feedback and evaluation.

2.2 Training Preparation


AQE M&E developed PowerPoints and datasets for the training with technical input from the Senior M&E Specialist, the AQE ICD/CES Team, and the Deputy Chief of Party for Programs. Prior to the training, a technical dry run involving the AQE M&E Team was held. Printing of training resources, organization of tech materials (tablets, laptops), and other logistics were done in Monrovia. The M&E team, including those from the transitional offices, worked with the MOE (central and local) to inform participants in advance of the training and made reminder calls in the days before the training.
Table 2: Overview of Training Preparation Tasks

Jan 4 – 13, 2021 – Development of Training Materials:
• Training objectives and outcomes mapped by the M&E Team and shared with the EDC SMT and Tech Team for review and feedback.
• Training objectives and outcomes revised and finalized based on feedback and comments from the EDC SMT and Tech Team.
• M&E Team developed session plans in line with the training objectives and outcomes and shared them with the EDC SMT and Tech Team for review and input.
• Training guide finalized and technical dry run held for the EDC M&E and Tech Teams.
• M&E Team reproduced training resources in the needed quantities and packed stationery for rollout of the training in Monrovia.

Jan 6 – 18, 2021 – Organize Training Logistics:
• M&E Team worked with the MOE (central and county level) to develop the participants' list with all information required for the training.
• M&E developed a list of all necessary materials and technological accessories needed to implement the training.
• Training budget drafted and submitted to the Finance team for review and further action as needed.
• M&E team worked with the Logistics team to ensure that all logistics were procured and/or made available (training venue and catering arrangements finalized; tablets, computers and other accessories handed over to M&E for programming; provisions made for training accessories and other logistical support needed).
• M&E Team sent out save-the-date notices to all stakeholders, including relevant information regarding the training.

Jan 14 and 18, 2021 – Dry run sessions for the AQE M&E Team:
• Introduce AQE Transitional M&E Officers and other members of the training team to the training content and delivery methodologies.
• Explore ways the training team could provide technical backstopping for smooth implementation of the training.
• Database Manager rolled out training sessions and allowed other training staff to practice the same.
• M&E Coordinator followed up with participants via phone or email and reminded them about the date, venue, and other details.

Jan 19 – 21, 2021 – Conduct AQE MOE Data Analytics Training:
• Training team conducts training as planned.

2.3 Training Methodology


A blend of experiential and participatory learning methods was employed during implementation of the training. Key among the methods used were demonstration, practice and simulation. Simulations were held at different intervals in the training, allowing participants to demonstrate the knowledge and skills gained. A pre-test and post-test were administered to gauge the knowledge participants gained as a result of the training, while an evaluation was administered to gather participants' feedback about the quality with which the training was delivered.

2.4 Monitoring and Evaluation


For monitoring and documentation purposes, participants signed the attendance sheet at the start and close of each training day. On the first day of training, all attendants received preprinted name badges, which made it easier for everyone to identify each other by name. At the close of the training, participants used an online KoBo evaluation form to provide feedback about the training. The feedback focused on the relevance of the training content and resources, the effectiveness of the methodology and facilitation, the knowledge of the trainers, and the quality of the participation. A further evaluation was done using post-it notes, asking participants to:
a. List three (3) things they had learned from the training,
b. List two (2) things they saw as challenges, and
c. Make one (1) recommendation for improvement.

3. Achievements
3.1 Planning and organization
Drawing on their knowledge of the content and their experience, the key facilitator and the rest of the training team demonstrated hands-on skills and went through the training objectives with participants. The training team modelled the use of KoBo Toolbox, Power BI Desktop and Power BI Service during the training. In other words, the training was hands-on, and participants were drilled through a series of practice and simulation activities.

To go through the training smoothly, each participant was required to come along with a computer (laptop) with an up-to-date web browser (Google Chrome/Firefox), an active internet connection and an Android device (tablet) for testing. Each participant was required to have two email accounts: a Google account and an organizational email account (for example, datatraining@gmail.com or data.training@moe.lr). The Google account was important for working with KoBo Toolbox, while the organizational account was essential for working with Power BI Desktop and Power BI Service.



3.2 Delivery
All training sessions were successfully delivered within three (3) working days, rather than the two-and-a-half (2.5) days initially planned. Participants were actively engaged, mostly individually, through the training methodologies, and given sufficient time to try things out and do simulations. Most participants came to the training with a laptop; for those who did not, EDC provided temporary laptops for the training. EDC also provided temporary tablets for further practice and simulation. Most of the participants did not have an MOE email account; therefore, EDC provided them access to temporary EDC email accounts during the training. Links, data files and other relevant training handouts were distributed via email accordingly.

The AQE M&E team, with some support from tech team members, took the lead in rolling out the training, with additional support from the AQE SMT as needed. A pre-test and post-test were administered to assess improvements in participants' knowledge as a result of the training. The training team held daily debrief meetings to assess the quality of training delivery and made adjustments as necessary.

3.3 Pre & Post Tests Results


Before the training began, the AQE M&E team administered a pre-test to measure participants' knowledge prior to the training. At the end of the training, the team also administered a post-test to evaluate training outcomes (knowledge gained). Participants filled out these questionnaires electronically, using the KoBo app. Average scores on the pre-test indicated that only 21% of the participants were familiar with some of the technology introduced, while up to 79% had never used any of those technologies. The results also show that 94% of the participants had no experience developing an online database using KoBo Toolbox, while 83% said they had never used the Power BI platform. The post-test results were markedly more positive: on average, 76% of the participants believed they were now knowledgeable, while 29% still had some challenges using some of the technology introduced. Post-test scores show a very steep increase, from 6% to 94%, in knowledge of using KoBo Toolbox, and an increase from 28% to 82% in the relevant skills for working with Power BI Desktop and Power BI Service. The figures below give an overview of these assessments:

Figure 1: Pre-Test Results

[Bar chart: Pre-Test Technology Assessment. Yes/No responses (in percent) to five questions: experience developing an online database; developed a database in KoBo Toolbox; worked with data visualization in Power BI; created a dashboard in Microsoft Excel; know how to work with Power Query in Excel; plus the overall average (21% Yes, 79% No).]



Figure 2: Post-Test Results

[Bar chart: Post-Test Technology Assessment. Yes/No responses (in percent) to the same five questions, plus the overall average (76% Yes, 24% No).]

4. Fidelity of Implementation
All training contents contained in the session plans were covered. The training team, with the support of EDC Senior Management and the Tech Team, as well as the TWG, followed the training agenda and the session plan for each topic during the training. Initially, the training team planned to cover the training in 2.5 days, but eventually covered all contents in 3 days instead. Participants completed all practice and simulation activities outlined in the training plan. Participants were able to use their personal laptops, or temporary laptops assigned by EDC, to access databases, links and other documents for the training.

5. Training Evaluation
At the end of the training, the team also conducted an evaluation to gather participants' feedback on different aspects of the training and on how they intend to apply the knowledge and skills acquired. Seventeen (17) participants completed the training evaluation and provided mostly positive ratings of the quality with which the training was delivered, as indicated in Fig. 3 below.

Fig. 3: Summary of Training Evaluation

[Radar chart: Analytics Training Evaluation – average scores for each evaluation statement: the training venue is ideal; the training met my expectations; I will be able to apply the knowledge learned; the training objectives for each session were clear; the training objectives for each session were met; the training content was organized and easy to follow; the materials distributed were relevant and useful; the trainer was knowledgeable; the quality of instruction was good; the trainer modelled knowledge and skills that participants were to acquire; participation and interaction were encouraged; adequate time was provided for questions and discussion.]


Fig. 4: Synopsis of Training Evaluation

[Pie chart: How do you rate the training overall? Excellent – 24%; Good – 47%; Average – 29%.]

6. Feedback from ‘3, 2, 1’ Evaluation


Participants were asked to highlight three (3) things they had learned from the training, list two (2) things they saw as challenges, and make one (1) recommendation for improvement. This layer of evaluation showed that significant gains were made in various aspects of the training, including knowledge of developing sample databases in KoBo and inputting some data. Among other things, some participants believed they were now able to access the KoBo API interface and generate links from KoBo to Power BI. Some participants indicated that though the training was good, having another round of training would enhance the level of knowledge and skills they had acquired. Other participants, especially the M&E Officers from the county education offices, further indicated that if they were to practice and implement the knowledge and skills gained, they would need to be furnished with laptops, tablets and the necessary peripherals; otherwise, they may not be able to implement what they have learned. Table 3 below gives some of the key findings from participants' feedback.

Table 3: Themes from '3, 2, 1' Evaluation

Learning outcome: Participants create their individual KoBo accounts and are able to develop sample databases in KoBo and input some basic data.
• Things learned: All participants indicated they have learned how to create a KoBo account. Most said they are able to develop basic forms, or data collection tools, including setting logics, in KoBo. Most participants also said they are now able to create databases with KoBo Toolbox. A few participants indicated that they understood how to input data from Excel to KoBo in developing forms.
• Challenges encountered: Some participants indicated that cascading in KoBo was a challenge for them. A few persons indicated that creating a KoBo Collect database was a challenge.

Learning outcome: Participants are able to review the KoBo Toolbox browser interface, access the KoBo API interface and generate a link from KoBo to Power BI.
• Things learned: Some participants claimed they have learned how to use the API to link KoBo to Power BI. Others say they have learned how to link a database using a URL.
• Challenges encountered: Some participants mentioned that they struggled to understand how to link KoBo to Power BI. Others also said they did not understand how to link KoBo to PowerPoint.

Learning outcome: Participants understand how to create relationships between related data sources; that is, they understand the flow and usage of Power BI, accessing and connecting to various data sources (Excel files and API links) and importing these into Power BI.
• Things learned: Some participants indicated that they had learned how to create a Power BI account and how to link it to Excel, KoBo and other data sources.
• Challenges encountered: Some participants said working with Power BI and Power Query was a challenge for them. Some more persons said they had challenges linking databases.

Learning outcome: Participants know how to build reports with various types of aggregations and filters and understand the various types of possible visualizations in Power BI and how to use graphs.
• Things learned: A few also indicated that they understood how to create a dashboard in Power BI.
• Challenges encountered: Many participants indicated they still struggled to analyze and visualize data using Power BI. Others also said they needed more training in designing tables in Power BI.

Learning outcome: Participants know how to create powerful reports and dashboards using Power BI and are able to publish their reports and dashboards on the Internet and view them using laptops, tablets or smartphones. In short, participants are able to generate tables and graphs and publish results on the Internet.
• Things learned: A few participants indicated that they understood how to create an online database. Some say they have also learned how to build a Power BI link.
• Challenges encountered: Some said they had challenges working with Power BI.

General challenges (across outcomes):
• Almost all participants indicated that the time allotted for the training was not enough.
• Furthermore, most participants indicated that they needed more time for practice and simulation.
• Limited or poor internet connectivity was highlighted as a major challenge.
• Unavailability of logistics, such as laptops.
• Regular late arrival of participants to the training.

Recommendations for improvement (across outcomes):
• AQE to facilitate a second (follow-up) or advanced phase of the Data Analytics Training.
• Provide additional training in Power BI and on how to link datasets in KoBo to Power BI.
• Trainers need to allot more time during the training for effective practice and simulation activities.
• Include step-by-step handouts to be used by participants during and after the training.
• Assist the MOE with logistics such as laptops, tablets and other peripherals, so officers can do their work.

7. Lessons Learned
• The daily debrief sessions significantly helped the trainers prepare for the next day's activities.
• With all the right accessories and technologies available, it is easier to roll out this type of training without much difficulty for the most part.
• The inclusion of the MOE County M&E Officers in the training meant a great deal to the ministry, as highlighted by the Assistant Minister for Planning, Research & Development and emphasized by the Director of EMIS. Paraphrasing their words, this meant each county could do some data analysis within its local context and make some evidence-based, quick-impact decisions, without the long wait of doing so at the national/central level.

8. Quotes from Participants


• The Director of EMIS, Leah Tomah Zinnah, wished the training had happened a long time ago – according to her, before they spent a lot of time and money on outsourcing the work to a provider. Her remarks referred to:
o the team's understanding of reports presented to the Ministry,
o the decentralization emphasis of involving the County M&E Officers,
o the focus on context, where each county has its own particular challenges which need to be captured through M&E processes,
o the capacity now available for county-level reporting and representation/visualization of data on issues – to inform county policy.
• Additionally, according to remarks from both the Assistant Minister for Planning, Research & Development and the EMIS Director, the training is a real sense of empowerment – a big achievement: owning their database, and interpreting and reporting to the Minister on what needs to be done and corrected.



9. Recommendations
Based on the challenges encountered and the lessons learned from implementation of the Data Analytics Training, the following recommendations are proffered:
• Organize another round of training, allowing enough time for practice and simulation activities. This will include liaising with the MOE to involve M&E Officers in future data collection and analysis activities.
• Disposition of laptops to the counties – given that the County M&E Officers currently have no equipment to go back to, to practice with, or to implement their learning:
o Fast-track the disposition of some laptops to the counties, assigning them through the County Education Offices (sign an MOU designating the laptops not for individuals, but for the county offices).
• Disposition of additional tablets to the MOE at the national and county level to help with data collection processes.



10. Photo Gallery

AQE Database Manager demonstrating steps involved in developing sample KoBo forms

PowerPoint display showing how to review a KoBo form using the display tab

EMIS Director practices steps with a colleague during the Data Analytics Training; other colleagues from the Planning, Research & Development and Alternative Education Divisions, respectively, seen in the background

MOE County M&E Officers simulate what they have learned



11. Annexes
AQE MoE Data Analytics Training
Training Program
Time  Activities  Facilitator/Organization
Day 1 Jan 19
8.00 – 8.30 Breakfast
8.30 – 9.00 Introduction to KoBo Technology AQE M&E Team
• Review of KoBo Toolbox browser interface
• Review and generating a link from the KoBo API interface
9.00 – 10.00 Introduction to Power BI AQE M&E Team
• Introduction to Power BI Service and Power BI Desktop
• Signing up for Power BI and download training file to desktop
• Loading data into Power BI service and Power BI Desktop
• Practical activities
10.00 – 12.00 Creating reports in Power BI Desktop AQE M&E Team
• Creating tables and generating results in Power BI
• Table Style, Formatting and percentage calculation
• Filtering Data in Power BI
• Practical activities
12.00 – 13.00 Lunch
13.00 – 17.00 Graphs and Visualization AQE M&E Team
• Data visualization using: Clustered column graphs, Stacked and
100% graphs, Graphs options, Area graphs, Ribbon graphs, Trend
Analysis Graph;
• Scatterplots and Bubbleplots
• Decomposition Tree
• Practical activities
Day 2 Jan 20
8.00 – 8.45 Breakfast
8:45 – 12:00 Interactive Dashboards AQE M&E Team
• Creating interactive Dashboard
• Publishing reports to Power BI Service
• Pinning Visualization to Dashboards
• Mobile Reports
• Using Themes in Power BI
• Practical activities
12.00 – 13.00 Lunch
13.00 – 14:00 Interactive Dashboards cont. AQE M&E Team
- Practical activities
14:00 – 16:00 Introduction to Relationships in Power BI AQE M&E Team
- Creating and managing relationships in Power BI
- Practical activities
Day 3 Jan 21
8.00 – 8.45 Breakfast
8:45 – 12:00 Practical activities (Group work) AQE M&E Team
• Create a simple database in KoBo
• Input data and generate results from KoBo in Power BI
• Develop tables and graphs in Power BI using the link
• Export data from KoBo Toolbox to Excel and link to Power BI
• Publish reports to Power BI Service
12.00 – 13.00 Lunch
13:00 – 14:00 Departures
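
One of the Day 3 group-work tasks is exporting data from KoBo Toolbox to Excel and linking it to Power BI. As a rough illustration only (not a training handout), the Python sketch below downloads a form's submissions and writes them to an Excel workbook that Power BI Desktop can then import via "Get Data > Excel"; the URL, API token and file name are placeholders.

# Minimal sketch: export KoBo submissions to an Excel file for Power BI Desktop.
# The server, asset UID, API token and output file name are placeholders.
import requests
import pandas as pd

url = "https://kf.kobotoolbox.org/api/v2/assets/aXXXXXXXXXXXXXXXXXXXXXX/data.json"
headers = {"Authorization": "Token your-kobo-api-token"}

records = requests.get(url, headers=headers).json()["results"]

# Flatten the nested JSON submissions into a table and write it to Excel
# (writing .xlsx requires the openpyxl package).
df = pd.json_normalize(records)
df.to_excel("kobo_export.xlsx", index=False)
print(f"Wrote {len(df)} rows to kobo_export.xlsx")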



Data Analytics Training
PRE-TEST QUESTIONNAIRE (Computer/Tablet-Based)
Duration - 5 Minutes

Direction (via email): AQE is conducting a quick survey ahead of the analytics training this morning.
Please click on the link and submit an entry. https://ee.kobotoolbox.org/x/l93eZfqg

1. Do you have any experience in developing an online Database?

2. Have you ever developed a database in Kobo toolbox?

3. Have you worked with data visualization in Power BI?

4. Have you ever created a dashboard in Microsoft Excel?

5. Do you know how to work with PowerQuery in Excel?

Data Analytics Training


POST-TEST QUESTIONNAIRE (Computer/Tablet-Based)
Duration - 5 Minutes

Direction (via email): Please click on the following link, complete the post training assessment and
submit your entry. https://ee.kobotoolbox.org/x/jVJ1sgNh

1. Do you understand how to develop an online Database?

2. Do you understand how to develop a database in Kobo toolbox?

3. Are you able to work with data visualization in Power BI?

4. Are you able to create a dashboard in Microsoft Excel?

5. Do you understand how to work with PowerQuery in Excel?



Data Analytics Training
TRAINING EVALUATION QUESTIONNAIRE
(Computer/Tablet-Based)

Instructions: Thanks for participating in the capacity-building training. Please complete this training
evaluation as objectively as possible in order to help us improve the quality of future training. Please
complete and submit the evaluation using the following link: https://ee.kobotoolbox.org/x/TL00Efli

(Rate each statement: Strongly Agree / Agree / Neutral / Disagree / Strongly Disagree)

1. The training venue is ideal. ☐ ☐ ☐ ☐ ☐
2. The training met my expectations. ☐ ☐ ☐ ☐ ☐
3. I will be able to apply the knowledge learned. ☐ ☐ ☐ ☐ ☐
4. The training objectives for each session were clear. ☐ ☐ ☐ ☐ ☐
5. The training objectives for each session were met. ☐ ☐ ☐ ☐ ☐
6. Training content was organized and easy to follow. ☐ ☐ ☐ ☐ ☐
7. Materials distributed were relevant and useful. ☐ ☐ ☐ ☐ ☐
8. The trainer was knowledgeable. ☐ ☐ ☐ ☐ ☐
9. The quality of instruction was good. ☐ ☐ ☐ ☐ ☐
10. The trainer modelled knowledge and skills that they want you to acquire. ☐ ☐ ☐ ☐ ☐
11. Participation and interaction were encouraged. ☐ ☐ ☐ ☐ ☐
12. Time was adequate for questions and discussion. ☐ ☐ ☐ ☐ ☐
13. Pre- and post-tests covered materials taught in the training. ☐ ☐ ☐ ☐ ☐

14. How do you rate the training overall?

Excellent ☐   Good ☐   Average ☐   Poor ☐   Very poor ☐

15. What new thing did you learn from the training, and how do you intend to use it in teaching?

16. What aspects of the training were most relevant, and what aspects were least relevant?

17. What aspects of the training could be improved?

