
CHAPTER V

SUMMARY OF FINDINGS AND RECOMMENDATIONS

5.1 Introduction

Learning analytics are used to research and build models in several areas that can

influence online learning systems. One area is user modeling, which encompasses what a learner

knows, what a learner’s behavior and motivation are, what the user experience is like, and how

satisfied users are with online learning. At the simplest level, analytics can detect when a student

in an online course is going astray and nudge him or her on to a course correction. At the most

complex, they hold promise of detecting boredom from patterns of key clicks and redirecting the

student’s attention. Because these data are gathered in real time, there is a real possibility of

continuous improvement via multiple feedback loops that operate at different time scales—

immediate to the student for the next problem, daily to the teacher for the next day’s teaching,

monthly to the principal for judging progress, and annually to the district and state administrators

for overall school improvement.

5.2 Statement of the problem

The title of the present study is "Effectiveness of Learning Analytics in School Based Assessment."

This study aimed to design a learning analytics model for school based assessment and to train teachers in the ways and means of learning analytics with the help of technology. Simple and readily available software was selected, and the teachers were trained on it. The learning level of the teachers and their experience were also studied. Besides, the present study emphasizes the process of assessment in terms of quick, reliable measurement and comprehensive interpretation of data.


5.3 Objectives of the Study

The objectives of the present research are

 To design a learning analytics model (LAM) for inclusion in School Based Assessment.

 To implement the learning analytics model (LAM) in School Based Assessment and observe the elements that are keys to quality enhancement.

 To evaluate the learning analytics model (LAM) in School Based Assessment as it affords meta-learning, remedial teaching, decision making and a feedback mechanism.

 To suggest the learning analytics model (LAM) for further adoption.

5.4 Hypotheses of the study

The present study concerns two areas, studied in depth in a specific and concrete way: the use of learning analytics in school based assessment. The hypotheses formulated for this study are

1. There is no significant difference between pre-test mean scores of control group and

experimental group of teachers in their knowledge and skill in implementation of LA in

SBA.

2. There is no significant difference between male and female teachers in pre-test of control

group in their knowledge and skill in implementation of LA in SBA.

3. There is no significant difference between rural and urban teachers in pre-test of control

group in their knowledge and skill in implementation of LA in SBA.


4. There is no significant difference between teachers having above 10 years of experience

and below 10 years in pre-test of control group in their knowledge and skill in

implementation of LA in SBA.

5. There is no significant difference between male and female teachers in pre-test of

experimental group in their knowledge and skill in implementation of LA in SBA.

6. There is no significant difference between rural and urban teachers in pre-test of

experimental group in their knowledge and skill in implementation of LA in SBA.

7. There is no significant difference between teachers having above 10 years of experience

and below 10 years in pre-test of experimental group in their knowledge and skill in

implementation of LA in SBA.

8. There will be a significant difference between post-test mean scores of control group and

experimental group of teachers in their knowledge and skill in implementation of LA in

SBA.

9. There is no significant difference between male and female teachers in post-test of

experimental group in their knowledge and skill in implementation of LA in SBA.

10. There is no significant difference between rural and urban teachers in post-test of

experimental group in their knowledge and skill in implementation of LA in SBA.

11. There is no significant difference between teachers having above 10 years of experience

and below 10 years in post-test of experimental group in their knowledge and skill in

implementation of LA in SBA.

12. There is no significant difference between post-test mean scores of control group and experimental group of teachers in their knowledge and skill in implementation of LA in SBA with respect to the following dimensions


 Knowledge in fundamentals of learning analytics

 Skill of implementation of school based assessment

 Feasibility in implementation of LA in school based assessment

 Learning analytics in Learning support

 Data aspects – comparability, efficiency and transparency

13. A constructive implementation process of LA and SBA will be arrived at through focus group discussion with respect to the following dimensions

 Knowledge in fundamentals of learning analytics

 Skill of implementation of school based assessment

 Feasibility in implementation of LA in school based assessment

 Learning analytics in Learning support

 Data aspects – comparability, efficiency and transparency

5.5 Design of the study

The present research adopted a mixed methods research design. The term "mixed methods" refers to an emergent methodology of research that advances the systematic integration, or "mixing," of quantitative and qualitative data within a single investigation or sustained program of inquiry. The basic premise of this methodology is that such integration permits a more complete and synergistic utilization of data than do separate quantitative and qualitative data collection and analysis.

In the quantitative part, the investigator adopted a parallel group experimental design, i.e. a control group and an experimental group. For the qualitative study, a feedback schedule was developed and focus group interviews were conducted to collect data on the effectiveness of learning analytics in school based assessment. This study focused on developing the teachers' skills in practising learning analytics in school based assessment, through training conducted on school based assessment and learning analytics.

5.6 Sample and sampling

The sample of the present research consisted of 30 teachers in the control group and 30 teachers in the experimental group, comprising 17 female and 13 male teachers in the control group and 16 female and 14 male teachers in the experimental group, drawn from various blocks of Salem district. A purposive sampling technique was adopted for this study.

5.7 Tools used for the study

The researcher used a method in which information is obtained with the help of a questionnaire prepared exclusively for the specific purpose. A questionnaire consists of a number of questions printed in a definite order on a form. Questionnaires and schedules are increasingly used for the collection of varied and diverse data in survey research. In this method, a questionnaire is personally given to the respondent with the request to answer the questions and return it. The present study used a tool for obtaining scores, given by the teachers on various aspects, on the effectiveness of learning analytics in school based assessment.

1. Achievement questionnaire:

In order to find out the impact of the training on learning analytics in school based assessment, a one-day workshop and online review meetings were conducted for the teachers to give them insight into the analysis of students' learning outcomes in school based assessments such as formative assessment and other qualitative assessments. The tool has 15 questions that check knowledge of learning analytics and of school based assessments.
2. Feedback and evaluation schedule –

This schedule consists of 20 items: 15 items on a five-point scale and the remaining 5 open ended. Each scaled question thus carries a minimum of 1 and a maximum of 5 marks. The marks for every question, the total marks for every parameter, and the overall total form the base for all statistical analysis and interpretation. Open-ended questions carry no marks, but they give insight into the teachers' acknowledgement of learning analytics in school based assessment.

3. Focus Group Interview

A focus group interview tool was used for this study.
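The scoring rule for the feedback and evaluation schedule can be sketched as follows. The responses, and the grouping of items under parameters, are hypothetical and for illustration only; the study's actual item-to-parameter mapping is not reproduced here.

```python
# Hypothetical responses: 15 five-point-scale items followed by 5 open-ended answers.
likert = [5, 4, 4, 3, 5, 4, 5, 3, 4, 4, 5, 4, 3, 5, 4]
open_ended = ["free-text answer"] * 5   # open-ended items carry no marks

# Every scaled item carries a minimum of 1 and a maximum of 5 marks.
assert all(1 <= score <= 5 for score in likert)

overall_total = sum(likert)             # possible range: 15 to 75
# Illustrative grouping of items under the study's dimensions.
parameter_totals = {
    "knowledge of learning analytics": sum(likert[0:5]),
    "skill in school based assessment": sum(likert[5:10]),
    "feasibility of LA in SBA": sum(likert[10:15]),
}
```

Per-question marks, per-parameter totals, and the overall total computed this way then serve as the base for the statistical analysis.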

5.8 Major findings of the study

The major findings of the present study are

The calculated values showed no difference between the control and experimental group teachers in their mean pre-test scores. It is concluded that the knowledge and implementation skill of the teachers are the same in the pre-test towards learning analytics and school based assessment.

For the mean difference of the male and female teachers of the control group in the pre-test, no difference was noticed. It is concluded that the knowledge and implementation skill of the male and female teachers are the same in the pre-test towards learning analytics and school based assessment.

For the mean difference of the rural and urban teachers of the control group in the pre-test, no difference was noticed. It is concluded that the knowledge and implementation skill of the rural and urban teachers are the same in the pre-test towards learning analytics and school based assessment.

For the mean difference of the teachers with teaching experience above 10 years and below 10 years of the control group in the pre-test, no difference was noticed. It is concluded that the knowledge and implementation skill of the two experience groups are the same in the pre-test towards learning analytics and school based assessment.

For the mean difference of the male and female teachers of the control group in the post-test, no difference was noticed. It is concluded that the knowledge and implementation skill of the male and female teachers are the same in the post-test towards learning analytics and school based assessment.

For the mean difference of the rural and urban teachers of the control group in the post-test, no difference was noticed. It is concluded that the knowledge and implementation skill of the rural and urban teachers are the same in the post-test towards learning analytics and school based assessment.

For the mean difference of the teachers with teaching experience above 10 years and below 10 years of the control group in the post-test, no difference was noticed. It is concluded that the knowledge and implementation skill of the two experience groups are the same in the post-test towards learning analytics and school based assessment.

There is a difference between the control and experimental group teachers in their mean post-test scores. It is concluded that the teachers of the experimental group have gained knowledge and implementation skill in learning analytics and school based assessment through the workshop.
There is no difference in the post-test scores of the experimental group teachers with respect to gender, locality, and years of experience. It is concluded that the teachers of the experimental group have gained knowledge and implementation skill in learning analytics and school based assessment through the workshop uniformly across these subgroups.
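The "no difference" and "difference" findings above rest on comparisons of group mean scores, typically tested with an independent-samples t-test. A minimal sketch follows, using hypothetical scores rather than the study's actual data, and assuming equal group variances:

```python
import math

def t_test(a, b):
    """Independent-samples t statistic with pooled (equal) variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # Pooled variance from the two groups' sums of squared deviations.
    ss = sum((x - ma) ** 2 for x in a) + sum((x - mb) ** 2 for x in b)
    sp2 = ss / (na + nb - 2)
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2          # t statistic and degrees of freedom

# Hypothetical post-test scores for five teachers per group.
control = [8, 9, 7, 8, 9]
experimental = [12, 13, 11, 12, 13]
t, df = t_test(control, experimental)
# A |t| this large at df = 8 exceeds the usual critical value (about 2.31
# at p = .05, two-tailed), so equal means would be rejected.
```

With 30 teachers per group, as in this study, the degrees of freedom would be 58 and the critical value slightly smaller.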

5.9 Discussion

A constructive implementation process of LA and SBA was arrived at through focus group discussion with respect to the following dimensions

 Knowledge in fundamentals of learning analytics

 Skill of implementation of school based assessment

 Feasibility in implementation of LA in school based assessment

 Learning analytics in Learning support

 Data aspects – comparability, efficiency and transparency

The consultation also brought valuable information in terms of current technology in student assessment and the inclusion of innovative strategies in it. In terms of the implementation of learning analytics tools for assessment and interpretation purposes, essentially positive feedback was received on the concept of LA. At the same time, teachers accepted that sorting errors and the lack of pictorial representation of results may affect the reproduction of the results in future. Overall, the teachers agreed with the tools and materials proposed in the present study for analyzing the learning outcomes of the students at various levels of evaluation.

The main concerns raised relate to conditions in the present school environment. Simply put, learning analytics is the collection of data about a student's academic performance, and the analysis of those data to derive trends and patterns that reveal areas needing improvement. One of the main reasons why educational institutes use learning analytics tools is to find out the potential problem areas of each student and take timely action to address them.
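Spotting potential problem areas early enough to act on them can be sketched as a simple rule-based flag. The student records and thresholds below are hypothetical, for illustration only, and are not drawn from the study or from any named system:

```python
# Hypothetical learner records; the thresholds are illustrative only.
students = [
    {"name": "A", "attendance": 0.95, "late_submissions": 0},
    {"name": "B", "attendance": 0.60, "late_submissions": 4},
    {"name": "C", "attendance": 0.85, "late_submissions": 2},
]

def at_risk(s, min_attendance=0.75, max_late=3):
    """Flag a learner whose attendance or submission pattern suggests risk."""
    return s["attendance"] < min_attendance or s["late_submissions"] > max_late

flagged = [s["name"] for s in students if at_risk(s)]
```

In practice such flags would feed the feedback loops described in section 5.1, prompting timely intervention rather than end-of-term discovery.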

Education institutions pioneering the use of data mining and learning analytics are starting to see a payoff in improved learning and student retention. Working from student data can help educators both track academic progress and understand which instructional practices are effective. The guide also describes how students can examine their own assessment data to

identify their strengths and weaknesses and set learning goals for themselves. Recommendations

from this guide are that K–12 schools should have a clear strategy for developing a data-driven

culture and a concentrated focus on building the infrastructure required to aggregate and

visualize data trends in timely and meaningful ways, a strategy that builds in privacy and ethical

considerations at the beginning (Koedinger, McLaughlin, and Heffernan 2010). However, the

feasibility of implementing a data-driven approach to learning is greater with the more detailed

learning micro-data generated when students learn online, with newly available tools for data

mining and analytics, with more awareness of how these data and tools can be used for product

improvement and in commercial applications, and with growing evidence of their practical

application and utility in K–12 and higher education. There is also substantial evidence of

effectiveness in other areas, such as energy and health care (Manyika et al. 2011).

Educators should develop a culture of using data for making instructional decisions. This

brief builds on the recommendations of the U.S. Department of Education (2010b) report calling

for development of the mind-set that using data more strategically can drive school

improvement. Educators need to experience having student data that tell them something useful

and actionable about teaching and learning. This means that instructors must have near-real-time

access to easy-to-understand visual representations of student learning data at a level of detail


that can inform their instructional decisions. Scores on an achievement test taken six months ago

do not tell a teacher how to help a particular student tomorrow. The present study raises the same argument in terms of the inclusion of LA tools in the assessment process.

Districts and institutions of higher education need to understand that their information

technology department is part of the effort to improve instruction but is not the only

responsible department. Establishing a data-driven culture requires much more than simply

buying a computer system. District staff from the information technology department need to

join with assessment, curriculum, and instruction staff, as well as top decision makers, and work

together to iteratively develop and improve data collection, processing, analysis, and

dissemination. A Department of Education report (Hamilton et al. 2019) suggests that districts

foster a culture of using data by beginning with such questions as: Which instructional materials

or approaches have been most effective in promoting student learning of this area of math

content? Are there differences in course success rates for students coming in to our high schools

from different feeder schools? Are there teachers who are particularly successful in terms of their

students’ learning gains whose practice might serve as a model for others?

Understand all details of a proposed solution. When purchasing learning software or

learning management systems, districts should demand details about the kinds of learning

analytics the system will generate and make sure the system will provide teachers and school

leaders with information they can use to improve teaching and learning: What are the analytics

based on? Have these measures been validated? Who gets to see the analytic data and in what

format, and what do they have to do to gain access? If students, teachers, and district

administrators will use visualizations or other reports from a data mining or an analytics
package, they should evaluate the solution to make sure the data are presented in a

comprehensible way.

Help students and parents understand the source and usefulness of learning data.

As colleges and schools move toward the use of fine-grained data from learning systems and

student data aggregated from multiple sources, they need to help students understand where the

data come from, how the data are used by learning systems, and how they can use the data to

inform their own choices and actions. Feedback is an important variable in changing behavior,

and research on systems like Purdue’s Signals suggests that many students will respond

appropriately in the face of feedback that they understand. Similarly, parents can help their

children make smarter choices if they have access to student data and understand how the data

are generated and what they mean.

5.10 Recommendations

This study highlights the importance of integrating learning analytics tools in school

assessments. It is quite apparent that the proliferation of the internet, as well as the supporting

digital tools that are ubiquitous in today's culture, is leading a paradigm shift in the training process. The study offers the following recommendations:

The present study has shown how learning analytics can be applied in the design of a

teaching module. Through the corresponding questions, we have shown that we can get

targeted feedback during online learning from logged data from the learning

environment. However, this also needs to be tested in educational practice. This will

create a stronger connection between learning analytics and educational practice and lead
to increased insight into students’ learning processes. In this way, we will be able to learn

what data and interventions really have a positive impact on student success.

School-based assessment (SBA) techniques should be put into practice in schools. Although the national policy on education made provision for continuous assessment scores to be part of the traditional end-of-course examination, some of the techniques for assessing students are not being practised in science classes by the teachers. The teachers are of the opinion that SBA techniques are time consuming and difficult to implement, and that they slow down teaching and learning. It was also evident from the findings of this study that qualified teachers with various years of experience are not totally clear about the fundamentals of School-based assessment (SBA) practices.

All the School-based assessment (SBA) techniques should be incorporated into the assessment of all subjects in secondary education and be used extensively or exclusively to provide information about student achievement.

School-based assessment should be aligned with and embedded within the broader

educational philosophy of “assessment for learning”. Assessment for learning is any

assessment in which the main aim is to enhance students’ learning. An assessment

activity can help learning if it provides information that can be used as feedback by

teachers and by students in order to improve the teaching and learning process in which

they are engaged. This could provide leeway for a gradual shift away from assessment of

learning, which is designed primarily to serve the purposes of accountability, ranking, or

certification of competence.

To build a more coherent and stronger assessment for learning culture, all assessments,

including traditional summative assessments, need to be reoriented towards improving


learning and teaching. This means school-based assessment should be integrated

naturally into the normal teaching-learning cycle, and should include a continuous

process of reflection, observation and monitoring, recording and reporting, with feedback

and self and peer assessment being integral components of all teacher-student interaction.

R&D in educational data mining and learning analytics occurs in both academic and

commercial organizations. Research and development are tightly linked, as the field

seeks to understand basic processes of data interpretation, decision making, and learning

and to use those insights to develop better systems.

Conduct research on the usability and impact of alternative ways of presenting fine-

grained learning data to instructors, students, and parents. Data visualizations provide an

important bridge between technology systems and data analytics, and determining how to

design visualizations that practitioners can easily interpret is an active area of research.

Solving this problem will require identifying the kinds of choices or decisions that

teachers, students, and parents want to make with fine-grained learning data, and the time

pressure and cognitive load factors present when different kinds of decisions are made.

Develop decision supports and recommendation engines that minimize the extent to

which instructors need to actively analyze data. The teacher in a truly instrumented

classroom would have much more than access to student scores on state and district tests.

Diagnostic real-time assessment tools and decision support systems would enable the

instructor to work with automated systems to make decisions “on the fly” to improve

instruction for all students.

Continue to perfect the anonymization of data and tools for

data aggregation and disaggregation that protect individual privacy yet ensure

advancements in the use of educational data.


Develop models for how learning analytics and recommendation systems developed in

one context can be adapted and repurposed efficiently for other contexts. Differences in

educational contexts have made it a challenge to transfer developed predictive models

across educational settings, because students, administrative policies, and course programs (e.g., four-year vs. community colleges) differ from one setting to another. Understanding how this process can become more efficient will be key to scaling up the use of learning analytics.

5.11 Conclusion

Learning analytics offers exciting possibilities for education and assessment, particularly

in the areas of personalising learning, empowering learners, improving data literacy and

visualising feedback. However, the excitement is moderated by a number of challenges that need

to be addressed by the educational community. The present study reflects what is considered to be 'good' education, pedagogy, and assessment practices, which are all fundamental issues at the heart of learning analytics development. The field's potential to support effective learning is

being increasingly recognized.

Intelligent recognition of verified data patterns also supports predictive modelling, which

analyses learner-centred data in order to probabilistically predict future outcomes. These

predictions can generate feedback suggesting adaptations, improvements or recommendations for

learners, teachers, or wider institutions. These can include prompting a learner to self-assess their

progress or informing teachers about students at risk of failing due to poor attendance or late

submission of assignments. Learning analytics also supports assessment methods that can

increase agency for the learners, such as the use of ‘learner-facing’ tools that represent data back

to learners and support self-regulation of learning.
