Construction of a questionnaire to know the transparency in the
evaluation of the learning of Engineering students

Leticia-Concepción Velasco-Martínez, Universidad de Málaga, Campus de Teatinos, Málaga 29071, Spain, leticiav@uma.es
Juan-Carlos Tójar-Hurtado, Universidad de Málaga, Campus de Teatinos, Málaga 29071, Spain, jctojar@uma.es
ABSTRACT
The methodological proposals arising from the European convergence area raise the need to promote higher levels of involvement and participation of students in their own evaluation process. Nevertheless, several authors indicate that there is still insufficient evidence on the levels of participation that students have in their own evaluation process, nor do we see the presence of evaluation practices that show the application of an authentic formative and shared evaluation in relation to the acquisition of competences. Thus, it follows that engineering faculty do not employ patterns of action that are sufficiently transparent and explicit to promote student interest and understanding. The objective of the study was to validate a measurement instrument for evaluating the transparency of the methods and evaluation strategies employed by the educators of the Engineering degrees. The pilot study, applied to 50 Engineering students, was analysed through a categorical principal components analysis (CATPCA). The seven components in which the model has been structured show as many relevant dimensions that reflect the factorial structure of the questionnaire: transparency; level of information about score; evaluation modalities; feedback from teachers; resources and educational agents; procedures, strategies and guidelines; and material resources that educators use.

CCS CONCEPTS
• General and reference~Reliability • General and reference~Measurement • General and reference~Evaluation • General and reference~Validation • General and reference~Estimation • General and reference~Design • Social and professional topics~Model curricula • Social and professional topics~Student assessment

KEYWORDS
Transparency; evaluation methodology; engineering degrees; questionnaire; validation.

ACM Reference format:
Leticia C. Velasco-Martínez and Juan-Carlos Tójar-Hurtado. 2017. Construction of a questionnaire to know the transparency in the evaluation of the learning of Engineering students. In Proceedings of The Fifth International Conference on Technological Ecosystems for Enhancing Multiculturality, Cádiz, Spain, October 2017 (TEEM'17), 7 pages. DOI: 10.1145/3144826.3145370

1 INTRODUCTION
The methodological proposals arising from the European convergence area raise the need to promote higher levels of involvement and participation of students in their own evaluation process. This evaluative approach entails breaking with the traditional unidirectional perspective of the evaluation process, so that the student is the protagonist of his or her own learning and not a mere receiver of knowledge. In this sense, educators are no longer considered the only evaluating agent capable of directing and deciding on the procedures, techniques and evaluation criteria [1].
Nevertheless, several authors [2], [3], [4] indicate that there is still insufficient evidence on the levels of participation that students have in their own evaluation process, nor do we see the presence of evaluation practices that show the application of an authentic formative and shared evaluation in relation to the acquisition of competences [5]. In the same way, the authors of [6] point out that there is no evidence in the teaching schedules of student participation in the promotion of supervision and training of students in evaluation procedures, in the design and elaboration of evaluation instruments, or in the possibility of negotiating and agreeing the evaluation criteria with the students.

From this perspective, the authors of [7] point out that it is essential that educators make explicit their preconceptions, purposes and interests related to the evaluation in order to achieve a better degree of transparency, commitment and responsibility in the evaluation process. They also emphasize that involving students in evaluation processes requires that the rules, conditions and evaluation criteria be explicit and clear, so that students can recognize at which moment of their learning process they are [8], [9], [10]. In the same way, some authors, like [11] or [12], indicate that from the beginning of the course educators must show their students, in a transparent and precise way, the learning objectives, the evaluation criteria and the weight that the activities will have in the final grade, since this will have more impact on learning than poorly structured feedback [13]. More concretely, [14] points out that teaching staff must select methods, tools and evaluation strategies that allow concrete, clear, detailed and public criteria to be established, to guarantee the objectivity of the process as well as fair and equitable treatment of students. Nevertheless, [15] have found that setting out more explicit and transparent evaluation criteria is problematic in theory and difficult to develop in practice. In this regard, some authors [7], [16] insist that it is not enough to inform students about the evaluation criteria; rather, it is necessary to involve them in the construction of these criteria to promote more meaningful learning and, therefore, to make students more aware of their learning process.
In a study conducted by [17] in the degree of Computer Engineering, it was emphasized that the lack of transparency and the use of continuous evaluation are some of the main problems of the Degree. These authors warn that focusing on the final result and not making the evaluation criteria explicit does not favour the continuous monitoring of the students. In this sense, they propose establishing methodologies (e.g. rubrics) that allow continuous evaluation with feedback at the key moments of the learning process, establishing an evaluation protocol based on public and objective criteria.
In this sense, [5], in a study with students of Primary Education, Pharmacy and different Engineering degrees, also pointed out that students value the presence of continuous evaluation, singling out the use of case studies and problem solving as key to their evaluation. However, the results show that continuous evaluation, although considered necessary and valuable, is relegated to the background because of the preponderance of final exams, which are the most used evaluation methodology but, at the same time, the least valued by students for their competence development.
By contrast, other authors [18] point out that the evaluation strategies most used and considered most useful by students of Engineering degrees are reports, projects, developmental tests and oral exams; on the contrary, portfolios, observations and attitude scales have less presence in the evaluation, and less utility is attributed to them by students. These authors state that these results are due to questions of experimentation and adaptation; that is, students rate better those evaluation strategies with which they are accustomed to being evaluated, because these are the most used and, consequently, the ones in which they have acquired more mastery over the course of the Degree. From this perspective, one could conclude that students seem to prefer more traditional methodologies because they do not know, and have not been evaluated with, other evaluation methods and strategies. These results are reaffirmed by previous literature: [19] and [20] point out that educators only use those assessment tools with which they feel familiar.
The degrees in Engineering are a good example of university studies in which competences (at least professional skills) have been worked on for a long time [21]. Likewise, these degrees do not stand out for great transparency in the evaluation. Indeed [22], in a study on motivation in engineering students, indicate that while evaluation can be a driving force for learning, it can also hinder or impede quality learning, even when the classes taught can be considered excellent. These authors highlight in their results the presence of an assessment model based on a single exam, with a limited completion time, that includes questions different from what has been intensely worked on in class. Thus, it follows that engineering faculty do not employ patterns of action that are sufficiently transparent and explicit to promote student interest and understanding.
From the above, it can be deduced that conducting a pilot study in the field of engineering can be very useful to obtain a solid instrument with which to evaluate the transparency of evaluation methods in Higher Education. This paper presents the initial phase of construction of the instrument.

2 OBJECTIVES
The objective of the first stage of this study was to validate a measurement instrument that would allow evaluating the transparency of the evaluation methods and strategies designed by the educators of the Engineering degrees.

2.1 General objective
Construct an instrument capable of evaluating the transparency of the evaluation methods and strategies used by teachers in Higher Education.

2.2 Specific objectives
— Construct an instrument / questionnaire with items related to transparency in the evaluation of the learning of university students of Engineering, from the dimensions indicated in the related scientific literature.
— Conduct expert validation of the instrument / questionnaire on transparency in the evaluation of learning.
— Conduct a pilot study with students of different degrees of Engineering, applying the questionnaire on transparency.
— Analyse the reliability and validity of the measure obtained with the application of the questionnaire and determine possible improvements to it.


3 METHODS
In order to respond to the objectives of the study, a survey methodology was used. The study, with a quantitative approach, involved the elaboration and validation of the survey. Validation was done first by experts and was then completed with a pilot study.

3.1 Procedure, techniques of information collection and analysis
First, from the review of the scientific literature, a first version of the questionnaire was developed. The instrument was subjected to expert review (n1 = 5). Several university professors specialized in Higher Education evaluation acted as experts.
After this process of expert validation, the survey was tested in a pilot study, using a sample of 50 fourth-year students of different Engineering Degrees at the University of Malaga (n2 = 50). Sampling was intentional; for this purpose, the University's database of Engineering students was used. The average age of the sample was 23, with a standard deviation of 3.32 and ages ranging from 21 to 37 years old. The sample was 84% male and 14% female; 60% were students of Industrial Engineering and 30% of Telecommunications Engineering.
The questionnaire was composed of 66 items that allow knowing the characteristics of the evaluation process of the different subjects of the Engineering Degrees. These items were grouped into 13 dimensions: 1) Teaching guides; 2) Evaluation criteria; 3) Score; 4) Theoretical and/or practical work (reports, projects, prototypes, etc.); 5) Exam; 6) Knowledge assessed; 7) Material resources; 8) Management of the evaluation system; 9) Guidance; 10) Follow-up; 11) Doubts; 12) Evaluation modalities; and 13) Satisfaction with evaluation.
The items were organized on a Likert scale ranging from 0 (totally disagree) to 5 (totally agree).
The scale was built from evidence obtained in the scientific literature and from the evaluation of 5 experts in Education and Engineering, who assessed the adequacy of the items represented in the questionnaire. After analysing and weighing the contributions of the experts, the final version of the questionnaire was elaborated.
The experts analysed the instrument / questionnaire qualitatively, contributing valuations and improvement proposals that helped to redefine some items. The analysis of the pilot study included a reliability study, based on the Cronbach's Alpha coefficient, a principal components factorial analysis (for metric variables) and a categorical principal components analysis (CATPCA), taking into account the ordinal and nominal nature of the data.
All quantitative data treatment was performed using the statistical analysis package SPSS v.22 (2013).
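The analyses described above were run in SPSS. As a point of reference only, the following sketch approximates the same checks with open-source Python tools (pandas, factor_analyzer, scikit-learn), which the authors did not use. It is a minimal illustration under stated assumptions: the response matrix is synthetic stand-in data (smaller than the real 50 x 66 matrix, since a correlation matrix with more items than respondents cannot be inverted for the KMO statistic), the cronbach_alpha helper is written here for the occasion, and SPSS's CATPCA has no direct equivalent, so a rank-transformed PCA appears only as a rough ordinal-aware proxy.

    # Sketch of the reliability and dimensionality checks; all data and
    # names here are illustrative stand-ins, not the authors' dataset.
    import numpy as np
    import pandas as pd
    from factor_analyzer.factor_analyzer import (
        calculate_bartlett_sphericity, calculate_kmo)
    from sklearn.decomposition import PCA

    def cronbach_alpha(df):
        """Cronbach's alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
        k = df.shape[1]
        return k / (k - 1) * (1 - df.var(ddof=1).sum() / df.sum(axis=1).var(ddof=1))

    # Stand-in Likert responses coded 0-5. The pilot used 50 students x 66
    # items; the toy example uses 200 x 12 so the KMO computation is feasible.
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 3))
    raw = latent @ rng.normal(size=(3, 12)) + rng.normal(size=(200, 12))
    items = pd.DataFrame(np.clip(np.round(raw + 2.5), 0, 5))

    print("Cronbach's alpha:", round(cronbach_alpha(items), 3))  # paper reports 0.923

    # Conditions of application for the principal components analysis.
    chi2, p = calculate_bartlett_sphericity(items)
    _, kmo = calculate_kmo(items)
    print(f"Bartlett: chi2 = {chi2:.1f}, p = {p:.4f}; KMO = {kmo:.2f}")

    # Linear PCA on the metric scores.
    pca = PCA().fit(items)
    cum = np.cumsum(pca.explained_variance_ratio_)
    print("components needed for 85% variance:", int(np.searchsorted(cum, 0.85)) + 1)

    # CATPCA (optimal scaling for ordinal data) is SPSS-specific; rank-
    # transforming the items before PCA is only a crude proxy for it.
    pca_ranked = PCA().fit(items.rank())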
4 RESULTS
The expert validation helped to redefine several items but did not change their number. The valuations suggested defining and specifying some of the items better, to make them more comprehensible to the students, and resizing some of the variables.
The analysis of the pilot study started with the Cronbach's Alpha coefficient. The result of this coefficient (α = 0.923) indicated a high internal consistency.
The principal components factorial analysis yielded a model of 19 components with an explained variance of 85.29%. Prior to this factorial analysis, a study of the conditions of application (KMO and Bartlett sphericity test) was carried out. Although these tests gave positive results, favourable to the principal components factor analysis, due to the ordinal nature of the data we chose to perform a CATPCA. As an added advantage, the categorical principal components analysis does not have as many restrictions in relation to the sample size.
Table 1 presents a summary of the model, with the Cronbach's Alpha of each dimension.

Table 1: CATPCA model with dimensions and Cronbach's Alpha

Dimension | Cronbach's Alpha | Total variance (eigenvalue)
1         | .944             | 24.252
2         | .919             | 20.559
3         | .890             | 13.068
4         | .867             | 9.824
5         | .827             | 6.397
6         | .822             | 6.249
7         | .784             | 4.384
Total     | .997a            | 84.733

As we can see in Table 1, all dimensions have a high internal consistency (alphas ranging from .784 to .944). The dimensions of the model are as follows:



— Dimension 1: Transparency, involvement and accountability in evaluation.
— Dimension 2: Rating information level.
— Dimension 3: Characteristics of evaluation modalities (hetero-evaluation, self-evaluation, co-evaluation, etc.).
— Dimension 4: Educator feedback.
— Dimension 5: Educational resources and agents for acquiring learning (teaching guides, material, virtual campus, the educator him/herself, and other students).
— Dimension 6: Procedures, strategies and plans to guide the student's integral learning (sharing, negotiating, evaluating, supporting, orienting, reviewing, communicating and qualifying).
— Dimension 7: Availability of material resources used by educators.

The following tables present each dimension with its corresponding items and factor loads.

Table 2: Dimension 1: Transparency, involvement and accountability in evaluation

Num. | Item | Load
13 | I like educators to agree on the evaluation criteria with students | .814
50 | Educators show examples of exercises, tasks, questions, ... like the ones that later count for the grade | .766
8 | Usually the evaluation criteria are presented in understandable language | .755
49 | The faculty brings samples of real tasks that are carried out in the professional field | .739
46 | The faculty reorganizes the subject (or parts of it) if it finds gaps in the students' learning | .725
27 | I am clear about the correction criteria that the educators apply in the exams | .723
9 | I usually know the evaluation criteria well in advance of the tests (e.g. work, examinations, ...) | .713
23 | Generally, the type of exam that is carried out (multiple choice, essay, short questions, etc.) is well reflected | .708
19 | I usually understand the evaluation criteria for the performance of work in the subjects of the Degree | .676
7 | Normally all the evaluation criteria that appear in the teaching guides are fulfilled | .671
44 | The evaluation system is consistent with the objectives, contents and methodology of the subjects of the Degree | .639
22 | When there is the possibility of delivering additional tasks and/or work, I know clearly what contributes to the final grade | .634
24 | The faculty explains the evaluation system again when the exam is approaching | .633
30 | The weight of the exam in the final grade seems correct | .625
6 | I usually understand the assessment section when I consult it in the teaching guides of the subjects of the Degree | .618
25 | Before the day of the exam, educators usually inform me in class about the weight of each question on the exam | .609
51 | The faculty presents a series of steps, indications or guidelines to carry out evaluable work, reports or practices | .602
18 | I usually agree with the percentages assigned to the different components of the rating | .600
53 | The faculty gives guidance on the level of depth with which to answer the exam questions | .595
1 | Generally I know the information included in the teaching guides of the subjects of the Degree | .460
3 | I am accustomed to consulting the syllabus of the subjects of the Degree in the teaching guides | .477
26 | Before the day of the exam, teachers usually give clues in class about what is more or less important for the exam | .442
55 | The faculty communicates the aspects of works and other tasks to improve at several moments during their realization | .406
56 | Educators use different channels of information to give evaluation feedback (interviews, meetings, forum, etc.) | .406
20 | The work of the subjects of the Degree is one of the tasks with the most weight in the qualification | .403

Dimension 1 brings together items related to transparency, involvement and accountability in evaluation (see Table 2). This first dimension is the clearest; it has the highest number of items (24) and represents the most substantial factor of the questionnaire.

Table 3: Dimension 2: Rating information level

Num. | Item | Load
14 | I know the minimum score required to pass the theoretical part and to be able to add other scores | .966
16 | I usually know what each part of the assessment is worth in the sum of the final grade of the subjects of the Degree | .916
6 | I usually understand the assessment section when I consult it in the teaching guides of the subjects of the Degree | .814
15 | I know the minimum qualification required to pass the practical part and to be able to add other qualifications | .757
66 | In summary, how would you rate (from 0 to 5) the level of satisfaction with the evaluation systems carried out in the Degree? | -.488

Table 3 shows the items related to the level of information perceived by the students (knowing, understanding). This dimension, represented by 5 items, has been called Rating Information Level. The negative loading of item 66 is a remarkable fact: it could be taken as an indicator of overall poor satisfaction with the evaluation systems, and it should be studied carefully in future investigations.

Table 4: Dimension 3: Characteristics of the evaluation modalities (hetero-evaluation, self-evaluation, co-evaluation, etc.)

Num. | Item | Load
63 | In general, I think it is appropriate for some students to score other students | .816
52 | Educators usually give instructions for tests that do NOT correspond to the way they correct them | .771
37 | The faculty takes into account students' attitudes towards evaluation in their subjects | .738
64 | In addition to educator assessment, there is a chance for students to score themselves | .728
28 | The exam is one of the tasks with the most weight in the subjects | -.572
35 | Educators, in addition to content, usually assess skills specific to their subjects | .556
2 | I usually consult in the teaching guides the teaching methodology of the subjects of the Degree | .490
32 | In the exams the ability to work against time is also usually valued | .432
65 | In general, I think it is appropriate for students to write notes to themselves | .423
33 | In the exams, rote learning is usually evaluated more than understanding | .404

Table 4 includes the items (10) that represent the characteristics of the different evaluation modalities: hetero-evaluation, self-evaluation, co-evaluation, etc. For example, the item with the highest factorial load (number 63, with a load of .816) speaks of the active participation of students in the evaluation of their peers (co-evaluation).

Table 5: Dimension 4: Educator feedback

Num. | Item | Load
5 | Normally, the educators clearly explain the evaluation section of the subjects of the Degree | -1.00
54 | The faculty communicates the aspects of works and other tasks to improve once they are delivered | -.613
58 | I use the tutorials to clarify doubts about the evaluation system of the subjects of the Degree | -.524
21 | To pass the subjects it is essential to deliver a work (or report, project, prototype, etc.) | -.475
56 | Educators use different channels of information to give evaluation feedback (interviews, meetings, forum, etc.) | -.432

The feedback that educators give their students is represented as a factor in dimension 4 (see Table 5). The item with the highest factor load (item number 5, with a load of 1 in absolute value) is related to a clear explanation of the evaluation by the educators. The negative value of the factorial load is also a datum to continue researching, as it could be showing evidence of a feedback deficit in the sample studied.

Table 6: Dimension 5: Resources and educational agents for the acquisition of learning (teaching guides, material, virtual campus, the teacher him/herself and other students)

Num. | Item | Load
38 | The faculty checks if the students have the necessary knowledge before starting a task | .672
60 | I usually use the forums of the Virtual Campus of the University to clarify doubts about the subjects | .556
59 | I turn to my classmates, rather than the faculty, to resolve doubts about the evaluation system of the Degree | -.448
40 | The material used by educators to teach their subject (e.g. slides) is available to students | -.436
4 | I usually consult in the teaching guides the evaluation section of the subjects of the Degree | -.421
62 | In addition to evaluation by educators, there is a possibility that students will write notes to others | .408
39 | Educators provide additional material or resources if they find gaps in student learning | .406


In Tables 6 to 8 the "most diffuse" dimensions are represented. Factorial loads are generally lower, and a wide variety of topics is represented in each dimension. For example, Table 6 collects items related to resources and educational agents for the acquisition of learning. Among the items considered in this dimension are those referring to teaching guides, the virtual campus, faculty and students. Item number 38, with the highest factorial load (.672), asks whether teachers check if the students have sufficient knowledge before starting a task.

Table 7: Dimension 6: Procedures, strategies and plans to guide the student's comprehensive learning

Num. | Item | Load
45 | Educators use the results of assessments to make adjustments or improvements to their subject (in the teaching method, the resources they used, the evaluation techniques, etc.) | -.685
57 | After the correction of a task it is possible to improve it and deliver a final version | .556
39 | Teachers provide additional material or resources if they find gaps in student learning | .475
17 | It is possible to negotiate the percentages assigned to the different components of the qualification (work, examination, practices, etc.) | .472
47 | Educators have planned classes in which there is no progress agenda; rather, they are dedicated to helping and guiding in learning, mistakes or gaps | .472
34 | Educators usually take into account, for the evaluation, questions related to the professional competences of the degree | .416

Dimension 6 includes topics such as procedures, strategies and plans to guide comprehensive student learning (see Table 7). One of the most representative items of this dimension is whether educators use the results of assessments to make adjustments or improvements to their subjects (teaching methods, resources, assessment techniques, etc.). The factorial load of this item is negative, so it can be interpreted that the students of the sample leaned towards low values in their answers. Although this will have to be studied more closely in the future, this result could be taken as an indicator of the difficulty educators have in making changes as a consequence of their students' assessments.

Table 8: Dimension 7: Availability of material resources used by educators

Num. | Item | Load
42 | The material that educators use to teach their subject is available a few days before the subjects are taught | .601
43 | The material that educators use to teach their subject is available after teaching the subjects | .498
40 | The material used by teachers to teach their subject (e.g. slides) is available to students | .347
41 | All the material used by educators to teach their subject is available at the beginning of the course | .301

Table 8 includes dimension 7, with items related to the availability of the material resources used by educators. For example, item number 42, with the largest factorial load of this dimension (.601), refers to whether materials are available a few days before the subject is taught.
In general terms, the results show different dimensions with well-defined topics and good or acceptable factor loads. The internal consistency within each dimension (measured by Cronbach's alpha coefficient) was already a statistical indicator of this, but the substantive meaning of each dimension is a qualitatively more powerful indicator of the structural validity of the questionnaire.

5 CONCLUSIONS
An instrument has been constructed to evaluate the transparency of the learning assessment systems of engineering students. The instrument, in the form of a Likert-type scale, was constructed from the related scientific literature and education experts.
The two-stage validation process has allowed the instrument to be refined on the basis of expert judgment and a pilot study. The pilot study, applied to 50 Engineering students, was analysed, after a successful reliability analysis, through a principal components factorial analysis and a CATPCA. Both analyses offered satisfactory solutions; the CATPCA is a more parsimonious model (fewer components) with a similar percentage of explained variance.
The seven components in which the model has been structured show as many relevant dimensions that reflect the factorial structure of the questionnaire. In addition to statistical support, the structure is sustained by the literature review (transparency in [7] and [17]; level of information about score in [8], [9] and [10]; evaluation modalities in [13]; feedback from teachers in [13], [14] and [17]; resources and educational agents in [7], [11] and [12]; procedures, strategies and guidelines in [1], [6], [14], [18], [19] and [20]; and material resources that educators use in [11] and [12]) and, what is more important from a qualitative perspective, all dimensions have a substantive meaning.


Therefore, it is possible to affirm that all the objectives, both general and specific, have been fulfilled.
In the future, the sample will have to be expanded for new pilot studies. Likewise, the questionnaire should be applied to representative samples to effectively prove that the instrument evaluates the transparency of the evaluation of learning. An instrument of these characteristics, correctly validated, can be a very important tool for educational improvement, both in engineering studies and in other studies of higher education.

ACKNOWLEDGMENTS
Our thanks to the educators and students who have participated in this work.

REFERENCES
[1] Juan I. López-Ruiz. 2011. Un giro copernicano en la enseñanza universitaria: formación por competencias. Revista de Educación, 356 (Dic. 2011), 279-301.
[2] María Soledad Ibarra, Gregorio Rodríguez and Miguel A. Gómez Ruiz. 2012. La evaluación entre iguales: beneficios y estrategias para su práctica en la universidad. Revista de Educación, 359 (Sept.-Dic. 2012), 206-231. DOI: 10.4438/1988-592X-RE-2010-359-092
[3] Víctor L. López-Pastor, Juan C. Manrique-Arribas and Cristina Vallés-Rapp. 2011. La evaluación y la calificación en los nuevos estudios de Grado. Especial incidencia en la formación inicial del profesorado. Revista Electrónica Interuniversitaria de Formación del Profesorado, 14, 4 (Jun. 2011).
[4] A. Pérez-Pueyo et al. 2008. Evaluación formativa y compartida en la docencia universitaria y el Espacio Europeo de Educación Superior: cuestiones clave para su puesta en práctica. Revista de Educación, 347 (Sept.-Dic. 2008), 435-451.
[5] Elena Cano-García and Maite Fernández-Ferrer (Eds.). 2016. Evaluación por competencias: la perspectiva de las primeras promociones de graduados en el EEES. Ediciones Octaedro, Barcelona.
[6] Gregorio Rodríguez-Gómez et al. 2012. La voz del estudiante en la evaluación del aprendizaje: un camino por recorrer en la universidad. RELIEVE, 18, 2 (2012). DOI: 10.7203/relieve.18.2.1985
[7] Víctor Álvarez-Rojo et al. 2011. Análisis de la participación del alumnado en la evaluación de su aprendizaje. Revista Española de Pedagogía, 250 (Sept.-Dic. 2011), 401-425.
[8] Kelly Burton. 2006. Designing criterion-referenced assessment. Journal of Learning Design, 1, 2 (2006), 73-83. DOI: http://dx.doi.org/10.5204/jld.v1i2.19
[9] Julie Wood et al. 2008. Engaging Students in the Academic Advising Process. The Mentor: An Academic Advising Journal, 10 (2008), 23-32.
[10] Jorge Pérez et al. 2017. Development of Procedures to Assess Problem-Solving Competence in Computing Engineering. IEEE Transactions on Education, 60, 1 (Feb. 2017), 22-28.
[11] Aurelio Villa and M. Poblete. 2011. Evaluación de competencias genéricas: principios, oportunidades y limitaciones. Bordón, 63 (Feb. 2011), 147-170.
[12] Verónica Villarroel and Daniela Bruna. 2014. Reflexiones en torno a las competencias genéricas en educación superior: un desafío pendiente. Psicoperspectivas, 13, 1 (2014), 24-34.
[13] Concepción Yaniz and Lourdes Villardón. 2012. Modalidades de evaluación de competencias genéricas en la formación universitaria. Didac, 60 (Jul.-Dic. 2012), 15-19.
[14] ENQA. 2015. Standards and Guidelines for Quality Assurance in the European Higher Education Area (ESG). Brussels, Belgium.
[15] Mark De Vos and Dina Z. Belluigi. 2011. Formative assessment as mediation. Perspectives in Education, 29, 2 (Jun. 2011), 39-47.
[16] Laura J. Leslie and Paul C. Gorman. 2017. Collaborative design of assessment criteria to improve undergraduate student engagement and performance. European Journal of Engineering Education, 42, 3 (May 2017), 286-301.
[17] Mikel Villamañe et al. 2017. Desarrollo y validación de un conjunto de rúbricas para la evaluación de Trabajos Fin de Grado. ReVisión, 10, 1 (En. 2017).
[18] Joe Miró, Maite Fernández-Ferrer and Natividad Cabrera. 2015. La opinión y percepción de los estudiantes en la evaluación por competencias. ReVisión, 8, 3 (2015), 59-70.
[19] M. de Miguel (Coord.). 2006. Metodologías de enseñanza y aprendizaje para el desarrollo de competencias. Orientaciones para el profesorado universitario ante el Espacio Europeo de Educación Superior. Alianza Editorial, Madrid.
[20] Amparo Fernández-March. 2006. Metodologías activas para la formación de competencias. Educatio Siglo XXI, 24 (2006), 35-56.
[21] María Martínez et al. 2013. Una propuesta de evaluación de competencias genéricas en grados de Ingeniería. Revista de Docencia Universitaria, 11 (Nov. 2013), 113-139. DOI: https://doi.org/10.4995/redu.2013.5550
[22] Fernández-Jiménez et al. 2013. Educación a distancia: la enseñanza y el aprendizaje. CET (Revista Contemporaneidade, Educação e Tecnologia), 1, 3 (Apr. 2013), 49-62.
