Construction of a questionnaire to know the transparency in the evaluation of the learning of Engineering students
L.C. Velasco-Martínez and J.C. Tójar-Hurtado
TEEM 2017, October 2017, Cádiz, Spain
From this perspective, [7] point out that it is essential that educators make explicit their preconceptions, purposes and interests related to evaluation, in order to achieve a greater degree of transparency, commitment and responsibility in the evaluation process. They also emphasize that involving students in evaluation processes requires that the rules, conditions and evaluation criteria be explicit and clear, so that students can recognize at which point of their learning process they stand [8], [9], [10]. In the same way, authors such as [11] or [12] indicate that, from the beginning of the course, educators must show their students, in a transparent and precise way, the learning objectives, the evaluation criteria and the weight that each activity will have in the final grade, since this has more impact on learning than poorly structured feedback [13]. More concretely, [14] points out that teaching staff must select evaluation methods, tools and strategies that allow concrete, clear, detailed and public criteria to be established, so as to guarantee the objectivity of the process as well as fair and equitable treatment of students. Nevertheless, [15] have found that making evaluation criteria more explicit and transparent is problematic in theory and difficult to achieve in practice. In this regard, some authors [7], [16] insist that it is not enough to inform students about the evaluation criteria; it is necessary to involve them in the construction of these criteria, to promote more meaningful learning and, therefore, to make students more aware of their learning process.

In a study conducted by [17] in the degree of Computer Engineering, it was emphasized that the lack of transparency and the use of continuous evaluation are among the main problems of the Degree. These authors warn that focusing on the final result and not making the evaluation criteria explicit does not favour the continuous monitoring of students. Accordingly, they propose establishing methodologies (e.g. rubrics) that allow continuous evaluation with feedback at the key moments of the learning process, setting up an evaluation protocol based on public and objective criteria.

In the same vein, [5], in a study with students of Primary Education, Pharmacy and different Engineering degrees, also pointed out that students value the presence of continuous evaluation, singling out the use of case studies and problem solving as key to their evaluation. However, the results show that continuous evaluation, although considered necessary and valuable, is relegated to the background by the preponderance of final exams, which are the most used evaluation methodology but, at the same time, the least valued by students for their competence development.

By contrast, other authors [18] point out that the evaluation strategies most used and considered most useful by students of Engineering degrees are reports, projects, development tests and oral exams, whereas portfolios, observations and attitude scales have less presence in evaluation and are attributed less utility by students. These authors state that these results are a matter of experimentation and adaptation: students rate better those evaluation strategies with which they are accustomed to being evaluated, because these are the most used and, consequently, the ones they have mastered over the course of the Degree. From this perspective, one could conclude that students seem to prefer more traditional methodologies because they do not know, and have not been evaluated with, other evaluation methods and strategies. These results are reaffirmed by previous literature: [19] and [20] point out that educators only use those assessment tools with which they feel familiar.

Engineering degrees are a good example of university studies that have been working with competences (at least professional skills) for a long time [21]. Even so, these degrees do not stand out for great transparency in evaluation. Indeed, in a study on motivation in engineering students, [22] indicate that while evaluation can be a driving force for learning, it can also hinder or even impede quality learning, even when the classes taught can be considered excellent. These authors highlight in their results the presence of an assessment model based on a single exam, with limited completion time, that includes questions different from what has been worked on intensively in class. It follows that engineering faculty do not employ patterns of action that are sufficiently transparent and explicit to promote student interest and understanding.

From the above, it can be deduced that conducting a pilot study in the field of engineering can be very useful to obtain a solid instrument with which to evaluate the transparency of evaluation methods in Higher Education. This paper presents the initial phase of the construction of the instrument.

2 OBJECTIVES

The objective of the first stage of this study was to validate a measurement instrument that allows the transparency of the evaluation methods and strategies designed by the educators of Engineering degrees to be evaluated.

2.1 General objective

Construct an instrument capable of evaluating the transparency of the evaluation methods and strategies used by teachers in Higher Education.

2.2 Specific objectives

— Construct an instrument / questionnaire with items related to transparency in the evaluation of the learning of university students of Engineering, based on the dimensions indicated in the related scientific literature.
— Conduct expert validation of the instrument / questionnaire on transparency in the evaluation of learning.
— Conduct a pilot study applying the questionnaire on transparency with students of different Engineering degrees.
— Analyse the reliability and validity of the measure obtained with the application of the questionnaire and determine possible improvements to it.

3 METHODS

In order to respond to the objectives of the study, a survey methodology was used. The study, with a quantitative approach, involved the elaboration and validation of the survey. Validation was first carried out by experts and then completed with a pilot study.

3.1 Procedure, techniques of information collection and analysis

First, a preliminary version of the questionnaire was developed from the review of the scientific literature. The instrument was then subjected to expert review (n1 = 5). Several university professors specialized in Higher Education evaluation acted as experts.

After this expert validation, the survey was tested in a pilot study with a sample of 50 fourth-year students of different Engineering degrees at the University of Malaga (n2 = 50). Sampling was intentional; for this purpose, the University's database of Engineering students was used. The average age of the sample was 23 years (standard deviation 3.32), ranging from 21 to 37 years old. The sample was composed of 84% males and 14% females, of whom 60% were students of Industrial Engineering and 30% of Telecommunications Engineering.

The questionnaire was composed of 66 items that characterize the evaluation process of the different subjects of the Engineering degrees. These items were grouped into 13 dimensions: 1) Teaching guides; 2) Evaluation criteria; 3) Score; 4) Theoretical and/or practical work (reports, projects, prototypes, etc.); 5) Exam; 6) Knowledge assessed; 7) Material resources; 8) Management of the evaluation system; 9) Guidance; 10) Follow-up; 11) Doubts; 12) Evaluation modalities; and 13) Satisfaction with evaluation.

The items were organized on a Likert scale ranging from 0 (totally disagree) to 5 (totally agree).

The scale was built from evidence obtained from the scientific literature and from the evaluation of 5 experts in Education and Engineering, who assessed the adequacy of the items represented in the questionnaire. After analysis and consideration of the experts' contributions, the final version of the questionnaire was elaborated.

The experts analyzed the instrument / questionnaire qualitatively, contributing assessments and improvement proposals that helped to redefine some items. The analysis of the pilot study included a reliability study, based on the Cronbach's Alpha coefficient, a principal components factor analysis (for metric variables) and a categorical principal components analysis (CATPCA), taking into account the ordinal and nominal nature of the data.

All quantitative data treatment was performed with the statistical analysis package SPSS v.22 (2013).

4 RESULTS

The expert validation helped to redefine several items but did not change their number. The experts' assessments suggested defining and specifying some of the items better, to make them more comprehensible to the students, and resizing some of the variables.

The analysis of the pilot study began with the Cronbach's Alpha coefficient. The result (α = 0.923) indicated a high internal consistency.

The principal components factor analysis yielded a model of 19 components with an explained variance of 85.29%. Prior to this factor analysis, a study of the conditions of application (KMO and Bartlett's sphericity test) was carried out. Although these tests gave positive results, favourable to principal component factor analysis, we chose to perform a CATPCA due to the ordinal nature of the data. As an added advantage, categorical principal components analysis does not impose as many restrictions in relation to sample size.

Table 1 presents a summary of the model, with the Cronbach's Alpha of each dimension.

Table 1: CATPCA model with dimensions and Cronbach's Alpha

Dimension | Cronbach's Alpha | Total variance (eigenvalue)
1 | .944 | 24.252
2 | .919 | 20.559
3 | .890 | 13.068
4 | .867 | 9.824
5 | .827 | 6.397
6 | .822 | 6.249
7 | .784 | 4.384
Total | .997a | 84.733

As can be seen in Table 1, all dimensions have a high internal consistency (ranging from .784 to .944). The dimensions of the model are as follows:
— Dimension 1: Transparency, involvement and accountability in evaluation.
— Dimension 2: Rating information level.
— Dimension 3: Characteristics of evaluation modalities (hetero-evaluation, self-evaluation, co-evaluation, etc.).
— Dimension 4: Educator feedback.
— Dimension 5: Educational resources and agents for acquiring learning (teaching guides, material, virtual campus, the educator, and other students).
— Dimension 6: Procedures, strategies and plans to guide the student's integral learning (sharing, negotiating, evaluating, supporting, orienting, reviewing, communicating and grading).
— Dimension 7: Availability of the material resources used by educators.

The following tables present each dimension with its corresponding items and factor loadings:

Table 2: Dimension 1: Transparency, involvement and accountability in evaluation

Num. | Transparency, involvement and accountability in evaluation | λij
13 | I like educators to agree on the evaluation criteria with students | .814
50 | Educators show examples of exercises, tasks, questions, ... like the ones that later count for the grade | .766
8 | Usually the evaluation criteria are presented in understandable language | .755
49 | The faculty brings samples of real tasks that are carried out in the professional field | .739
46 | The faculty reorganizes the subject (or parts of it) if it finds gaps in the students' learning | .725
27 | The correction criteria that the educator applies in the exams are clear to me | .723
9 | I usually know the evaluation criteria well in advance of the tests (e.g. work, examinations, ...) | .713
23 | Generally, the type of exam that is carried out (multiple choice, essay, short questions, etc.) is well reflected | .708
19 | I usually understand the evaluation criteria for the work in the subjects of the Degree | .676
7 | Normally all the evaluation criteria that appear in the teaching guides are fulfilled | .671
44 | The evaluation system is consistent with the objectives, contents and methodology of the subjects of the Degree | .639
22 | When there is the possibility of delivering additional tasks and/or work, I know clearly what contributes to the final grade | .634
24 | The faculty explains the evaluation system again when the exam is approaching | .633
30 | The weight of the exam in the final grade seems correct to me | .625
6 | I usually understand the assessment section when I consult it in the teaching guides of the subjects of the Degree | .618
25 | Before the day of the exam, educators usually inform me in class about the weight of each question on the exam | .609
51 | The faculty presents a series of steps, indications or guidelines to carry out evaluable work, reports or practices | .602
18 | I usually agree with the percentages assigned to the different components of the rating | .600
53 | The faculty gives guidance on the level of depth with which to answer the exam questions | .595
1 | Generally I know the information included in the teaching guides of the subjects of the Degree | .460
3 | I usually consult the contents of the subjects of the Degree in the teaching guides | .477
26 | Before the day of the exam, teachers usually give clues in class about what is more or less important for the exam | .442
55 | The faculty communicates the aspects of the works and other tasks to improve at several moments during their realization | .406
56 | Educators use different channels of information to give evaluation feedback (interviews, meetings, forums, etc.) | .406
20 | The work of the subjects of the Degree is one of the tasks with more weight in the qualification | .403

Dimension 1 brings together items related to transparency, involvement and accountability in evaluation (see Table 2). This first dimension is the clearest, with the highest number of items (24), and represents the most substantial factor of the questionnaire.

Table 3: Dimension 2: Rating information level

Num. | Rating information level | λij
14 | I know the minimum score required to pass the theoretical part and to be able to add other scores | .966
16 | I usually know what each part of the assessment is worth in the sum of the final grade of the subjects of the Degree | .916
6 | I usually understand the assessment section when I consult it in the teaching guides of the subjects of the Degree | .814
15 | I know the minimum qualification required to pass the practical part and to be able to add other qualifications | .757
66 | In summary, how would you rate (from 0 to 5) your level of satisfaction with the evaluation systems carried out in the Degree? | -.488

Table 3 shows the items related to the level of information perceived by the students (knowing, understanding). This dimension, represented by 5 items, has been called Rating information level. The negative loading of item 66 is a remarkable result: it could be taken as an indicator of overall poor satisfaction with the evaluation systems, and it should be studied carefully in future investigations.

Table 4: Dimension 3: Characteristics of the evaluation modalities (hetero-evaluation, self-evaluation, co-evaluation, etc.)

Num. | Characteristics of the evaluation modalities | λij
63 | In general, I think it is appropriate for some students to score other students | .816
52 | Educators usually give instructions for tests that do NOT correspond to the way they correct them | .771
37 | The faculty takes into account students' attitudes towards evaluation in their subjects | .738
64 | In addition to educator assessment, there is a chance for students to score themselves | .728
28 | The exam is one of the tasks with more weight in the subjects | -.572
35 | Educators, in addition to content, usually assess skills specific to their subjects | .556
2 | I usually consult the teaching methodology of the subjects of the Degree in the teaching guides | .490
32 | In the exams, the ability to work against time is also usually valued | .432
65 | In general, I think it is appropriate for students to write notes to themselves | .423
33 | In the exams, rote learning is usually evaluated more than understanding | .404

Table 4 includes the 10 items that represent the characteristics of the different evaluation modalities: hetero-evaluation, self-evaluation, co-evaluation, etc. For example, the item with the highest factor loading (number 63, with a loading of .816) speaks of the active participation of students in the evaluation of their peers (co-evaluation).

Table 5: Dimension 4: Educator feedback

Num. | Educator feedback | λij
5 | Normally, the educators clearly explain the evaluation section of the subjects of the Degree | -1.00
54 | The faculty communicates the aspects of the works and other tasks to improve once they are delivered | -.613
58 | I use the tutorials to clarify doubts about the evaluation system of the subjects of the Degree | -.524
21 | To pass the subjects it is essential to deliver a work (or report, project, prototype, etc.) | -.475
56 | Educators use different channels of information to give evaluation feedback (interviews, meetings, forums, etc.) | -.432

The feedback that educators give their students is represented as a factor in Dimension 4 (see Table 5). The item with the highest factor loading (item number 5, with a loading of 1 in absolute value) is related to a clear explanation of the evaluation by the educators. The negative value of the factor loading is also a matter for further research, as it could be showing evidence of a feedback deficit in the sample studied.

Table 6: Dimension 5: Resources and educational agents for the acquisition of learning (teaching guides, material, virtual campus, the teacher, and other students)

Num. | Resources and educational agents for the acquisition of learning | λij
38 | The faculty checks whether the students have the necessary knowledge before starting a task | .672
60 | I usually use the forums of the Virtual Campus of the University to clarify doubts about the subjects | .556
59 | I turn to my classmates, rather than the faculty, to resolve doubts about the evaluation system of the Degree | -.448
40 | The material used by educators to teach their subject (e.g. slides) is available to students | -.436
4 | I usually consult the evaluation section of the subjects of the Degree in the teaching guides | -.421
62 | In addition to evaluation by educators, there is a possibility for students to write notes to others | .408
39 | Educators provide additional material or resources if they find gaps in student learning | .406
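The reliability and dimensionality analyses reported in Section 4 were run in SPSS v.22. As an illustrative sketch only, the two core computations (Cronbach's Alpha and the variance explained by principal components of the item correlation matrix) can be reproduced in Python with NumPy. This is a simplification: it performs a classical PCA on the Pearson correlation matrix rather than the CATPCA applied to the ordinal data, and the function names are ours, not part of any analysis package.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's Alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def pca_loadings(items: np.ndarray):
    """Principal components of the item correlation matrix.

    Returns eigenvalues (descending), the percentage of variance each
    component explains, and the loading matrix (items x components).
    """
    corr = np.corrcoef(items, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)     # symmetric matrix
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = 100 * eigvals / eigvals.sum()
    # loadings = eigenvectors scaled by sqrt(eigenvalue); clip guards
    # against tiny negative eigenvalues from floating-point error
    loadings = eigvecs * np.sqrt(np.clip(eigvals, 0, None))
    return eigvals, explained, loadings
```

Applied to the 50 x 66 matrix of Likert responses (0 to 5) described in Section 3.1, `cronbach_alpha` would correspond to the overall coefficient reported above, and the columns of `loadings` to the per-item factor loadings listed in Tables 2 to 6, up to the differences between classical PCA and CATPCA.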
have a substantive meaning. Therefore, it is possible to affirm that all the objectives, both general and specific, have been fulfilled.

In the future, the sample will have to be expanded for new pilot studies. Likewise, the questionnaire should be applied to representative samples to effectively prove that the instrument evaluates the transparency of the evaluation of learning. An instrument of these characteristics, correctly validated, can be a very important tool for educational improvement, both in engineering studies and in other Higher Education studies.

ACKNOWLEDGMENTS

Our thanks to the educators and students who have participated in this work.

REFERENCES

[1] Juan I. López-Ruíz. 2011. Un giro copernicano en la enseñanza universitaria: formación por competencias. Revista de Educación, 356 (Dic. 2011), 279-301.
[2] María Soledad Ibarra, Gregorio Rodríguez and Miguel A. Gómez Ruiz. 2012. La evaluación entre iguales: beneficios y estrategias para su práctica en la universidad. Revista de Educación, 359 (Sept.-Dic. 2012), 206-231. DOI: 10.4438/1988-592X-RE-2010-359-092
[3] Víctor L. López-Pastor, Juan C. Manrique-Arribas and Cristina Vallés-Rapp. 2011. La evaluación y la calificación en los nuevos estudios de Grado. Especial incidencia en la formación inicial del profesorado. Revista Electrónica Interuniversitaria de Formación del Profesorado, 14, 4 (Jun. 2011).
[4] A. Pérez-Pueyo et al. 2008. Evaluación formativa y compartida en la docencia universitaria y el Espacio Europeo de Educación Superior: cuestiones clave para su puesta en práctica. Revista de Educación, 347 (Sept.-Dic. 2008), 435-451.
[5] Elena Cano-García and Maite Fernández-Ferrer (Eds.). 2016. Evaluación por competencias: la perspectiva de las primeras promociones de graduados en el EEES. Barcelona: Ediciones Octaedro.
[6] Gregorio Rodríguez-Gómez et al. 2012. La voz del estudiante en la evaluación del aprendizaje: un camino por recorrer en la universidad. RELIEVE, 18, 2 (2012). DOI: 10.7203/relieve.18.2.1985
[7] Víctor Álvarez-Rojo et al. 2011. Análisis de la participación del alumnado en la evaluación de su aprendizaje. Revista Española de Pedagogía, 250 (Sept.-Dic. 2011), 401-425.
[8] Kelly Burton. 2006. Designing criterion-referenced assessment. Journal of Learning Design, 1, 2 (2006), 73-83. DOI: http://dx.doi.org/10.5204/jld.v1i2.19
[9] Julie Wood et al. 2008. Engaging Students in the Academic Advising Process. The Mentor: An Academic Advising Journal, 10 (2008), 23-32.
[10] Jorge Pérez et al. 2017. Development of Procedures to Assess Problem-Solving Competence in Computing Engineering. IEEE Transactions on Education, 60, 1 (Feb. 2017), 22-28.
[11] Aurelio Villa and M. Poblete. 2011. Evaluación de competencias genéricas: Principios, oportunidades y limitaciones. Bordón, 63 (Feb. 2011), 147-170.
[12] Verónica Villarroel and Daniela Bruna. 2014. Reflexiones en torno a las competencias genéricas en educación superior: Un desafío pendiente. Psicoperspectivas, 13, 1 (2014), 24-34.
[13] Concepción Yaniz and Lourdes Villardón. 2012. Modalidades de evaluación de competencias genéricas en la formación universitaria. Didac, 60 (Jul.-Dic. 2012), 15-19.
[14] ENQA. 2015. Standards and Guidelines for Quality Assurance in the European Higher Education Area (ESG). Brussels, Belgium.
[15] Mark De Vos and Dina Z. Belluigi. 2011. Formative Assessment as Mediation. Perspectives in Education, 29, 2 (Jun. 2011), 39-47.
[16] Laura J. Leslie and Paul C. Gorman. 2017. Collaborative design of assessment criteria to improve undergraduate student engagement and performance. European Journal of Engineering Education, 42, 3 (May 2017), 286-301.
[17] Mikel Villamañe et al. 2017. Desarrollo y validación de un conjunto de rúbricas para la evaluación de Trabajos Fin de Grado. ReVisión, 10, 1 (En. 2017).
[18] Joe Miró, Maite Fernández-Ferrer and Natividad Cabrera. 2015. La opinión y percepción de los estudiantes en la evaluación por competencias. ReVisión, 8, 3 (2015), 59-70.
[19] M. de Miguel (Coord.). 2006. Metodologías de enseñanza y aprendizaje para el desarrollo de competencias. Orientaciones para el profesorado universitario ante el Espacio Europeo de Educación Superior. Madrid: Alianza Editorial.
[20] Amparo Fernández-March. 2006. Metodologías activas para la formación de competencias. Educatio Siglo XXI, 24 (2006), 35-56.
[21] María Martínez et al. 2013. Una propuesta de evaluación de competencias genéricas en grados de Ingeniería. Revista de Docencia Universitaria, 11 (Nov. 2013), 113-139. DOI: https://doi.org/10.4995/redu.2013.5550
[22] Fernández-Jiménez et al. 2013. Educación a distancia: la enseñanza y el aprendizaje. CET (Revista Contemporaneidade, Educação e Tecnologia), 1, 3 (Apr. 2013), 49-62.