Education for Chemical Engineers
Article history: Received 15 March 2019; Received in revised form 20 May 2019; Accepted 5 June 2019; Available online 6 June 2019

Keywords: Teaching efficiency; Pedagogical methodologies; Course assessment

Abstract

Evaluating the effectiveness of teaching and learning core knowledge outcomes and professional skills is a highly challenging task that has not yet been satisfactorily addressed at higher education level. The iTeach European project consortium developed a framework for assessing the effectiveness of various pedagogical methodologies in chemical engineering education, including those aiming to promote important core competencies related to employability, in a range of geographical and educational contexts.

The framework was first implemented in a core chemical engineering area (reaction engineering) to check its usability and robustness, and was subsequently tested on a range of subject areas from various branches of engineering and other disciplines, one of which is analysed in more detail in this contribution. The results of this broader assessment encompassed a much more diverse student body with varying educational experiences and a wider range of different teaching methodologies. The outcomes of this assessment are highlighted and the benefits of such an objective approach for evaluating teaching effectiveness are discussed.

Crown Copyright © 2019 Published by Elsevier B.V. on behalf of Institution of Chemical Engineers. All rights reserved.
https://doi.org/10.1016/j.ece.2019.06.001
22 C.V. Miguel et al. / Education for Chemical Engineers 29 (2019) 21–28
On the other hand, it is important that robust criteria for assessing the effectiveness of the core knowledge and competency delivery are implemented across the higher education sector to ensure the achievement of key aims of education policies. After all, the famous quote of Peter Drucker that 'What gets measured, gets improved' (Drucker, 1992) is generally recognised. Thus, evaluating the effectiveness of teaching and competency development will enable the identification of the most effective ways of promoting these competencies and ultimately improve their delivery. A clear definition by the training centres, e.g. of chemical engineering, concerning the objectives, contents and challenges in achieving these (Favre et al., 2008; Felder et al., 2000b; Westmoreland, 2008, 2014), as well as the strategies of educational choice, including new forms of learning for students, is required. However, to define the objectives and the content, measurable indicators of teaching effectiveness have to be defined and determined in various forms.

A preliminary survey of the pedagogical literature on measuring the effectiveness of pedagogical approaches in education has shown that most reports on assessing the effectiveness of teaching have been related to primary or secondary school teaching evaluation, not higher education (see, for example, Baird et al., 2016). In the last two decades, however, there have been some studies and projects focused on establishing a system for measuring the effectiveness of teaching in higher education courses (e.g. Allan et al., 2009; Bembenutty, 2009) and in vocational education and training (e.g. Burnett and Clarke, 1999), some of them specifically addressing the engineering discipline (e.g. Cabrera et al., 2001; Chalmers, 2007). Attempts to establish metrics for teaching effectiveness have assumed many forms, but frequently they have focused on student responses to questionnaires. Despite some opposition to incorporating such student ratings in university evaluation, they are also widely used by governments to measure the effectiveness of educational provision. One such example is the use of student satisfaction in the Teaching Excellence and Student Outcomes Framework in the UK, which assesses excellence in teaching at universities and colleges to establish how well they ensure excellent outcomes for their students in terms of graduate-level employment or further study (Department for Education, 2016).

The question of how to determine whether or not a specific teaching approach or an instructional programme is effective is not a new challenge (Felder et al., 2000a). The traditional and most widespread assessment approaches used in engineering education (courses, curricula, and research investigations) are discussed by Olds et al. (2005). But, as outlined by the authors, one should always bear in mind that the ultimate purpose of the assessment of education (or any type of evaluation) should be to improve student learning, which begins with setting objectives and reviewing them with each assessment activity (Olds et al., 2005). Felder et al. (2000a) argue that the assessment of teaching should be carried out for a clearly defined purpose: to evaluate teaching effectiveness (summative assessment) or to improve it (formative assessment). According to Felder et al. (2000a), the assessment of teaching should be carried out in the context of published goals, measurable performance criteria, and agreed-upon forms of evidence. The evidence should come from a variety of sources, including learning outcome assessment, student end-of-course ratings, student surveys, focus groups, interviews, retrospective student evaluations of courses and instructors, alumni and peer evaluations, and self-assessments. Nearly all these aspects are addressed in the framework described herein.

This work presents a framework for assessing the effectiveness of various pedagogical methodologies in chemical engineering education, including those aiming to promote important core competencies related to employability, in a range of geographical and educational contexts. This framework was developed after several focus group sessions with various stakeholders, including academics, students, graduates and employers, from different countries across Europe. Subsequently, it was implemented in a core chemical engineering area (reaction engineering) across all iTeach partners to check its usability and robustness, giving also the opportunity to improve or correct it, if required. Then, the framework was extended to other disciplines, in other areas and branches of engineering, where students are exposed to different teaching methodologies.

2. Methods

2.1. Assessment framework development

The methodology involved in the development of the assessment framework consisted of:

• Promoting closer involvement of employer organisations in chemical engineering curriculum design;
• Establishing state-of-the-art methodologies to assess the effectiveness of teaching core chemical engineering knowledge;
• Defining various indicators of the effectiveness of teaching in chemical engineering higher education;
• Investigating methods of effectively acquiring employability competencies;
• Testing the framework at partner institutions focusing on various pedagogic methodologies.

The outcome was the development of a toolkit in Microsoft Excel for exploitation of the framework by any higher education institution interested in the assessment, comparison and communication of its teaching effectiveness in a given course, for instance when applying different pedagogical methodologies. This toolkit is available for download in the Supplementary Information section. The developed assessment framework considers six metrics (cf. Table 1), evaluated on a scale ranging from 1 to 5 (cf. Table 2). The evaluation of a specific methodology or a course by any institution requires the distribution of the questionnaires available in the Supplementary Information to the interested parties involved in the educational process (i.e. the stakeholders) and the collection of students' marks. The gathered answers must be uploaded into the spreadsheet, which converts them into a single classification for each metric and stakeholder group (A – Academics, E – Employers, G – Graduates (i.e. graduated over the last 5 years) and S – Students). Calculation of each metric considers the mean value obtained in all corresponding questions (see questionnaires in the Supplementary Information).

Each metric score has a weighted contribution from the different categories of stakeholders. The stakeholders' weightings used for each metric, and the metrics themselves, were defined during the discussions that took place in focus group sessions with all the stakeholders from Portugal, the UK, Germany, France, Slovakia and Macedonia. For instance, metric 1 assesses the strategic nature of the course/discipline under evaluation (more information about each metric is provided in Table 1). However, it is difficult for the students enrolled in the programme to have a clear and sound opinion about that topic (i.e., they do not yet have a strategic view of the discipline without first finishing the whole programme), and therefore their opinion is not taken into account in this metric, where academics and employers have more relevant opinions (higher weighting) than the graduates (cf. Eq. 1). At the other extreme is metric 4, related to the perception of the relevance of the pedagogical approach being assessed. In this case, only the opinion of students currently enrolled in the course is taken into account (graduates' opinion was not included in this metric because, at the time they studied the course being evaluated, the pedagogical methodology
Table 1
Assessment framework metrics and corresponding calculation formulas.

6. Evaluation of transfer: M6 = (A + 2G + 2E) / 5   (6)
Assesses not only the teaching efficiency of a single teaching unit (course) but also gives a measure of the whole formation. The evaluation of transfer must be performed in a professional context, during an internship if possible, or during the early career years.

* A – Academics, E – Employers, G – Graduates, S – Students.
** AM – average mark, STD – standard deviation, both referring to either the course or the student cohort; y – assessment year.

Table 2
Classification scale for each metric: each metric is scored on a scale from 1 to 5.
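The conversion the Excel toolkit performs can be sketched in Python: questionnaire answers (on the 1–5 scale) are averaged per stakeholder group and then combined with the metric's weightings. Only the metric 6 weighting from Eq. (6) is shown, and all names below are illustrative rather than the toolkit's own:

```python
from statistics import mean


def stakeholder_score(answers):
    """Mean of one stakeholder group's answers to a metric's questions (1-5 scale)."""
    return mean(answers)


def metric_6(academics, graduates, employers):
    """Evaluation of transfer: M6 = (A + 2G + 2E) / 5 (Eq. 6, Table 1)."""
    a = stakeholder_score(academics)
    g = stakeholder_score(graduates)
    e = stakeholder_score(employers)
    return (a + 2 * g + 2 * e) / 5


# Hypothetical questionnaire means for one course:
print(round(metric_6([4, 5, 4], [4, 4, 5], [3, 4, 4]), 2))  # 4.07
```

The other five metrics follow the same pattern with different stakeholder weightings (and metric 5 uses students' marks instead of questionnaire answers).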
employed could have been different). It is worth mentioning that other weights/stakeholder contributions can be considered and changed in the developed toolkit by other higher education institutions testing the framework, if required in the specific context of the institution.

It is worth noting that the equation used to estimate metric 5 (Eq. (5)) is based solely on students' marks, and so it is not obtained from a questionnaire. To assess the acquisition of knowledge by the students when employing different pedagogical methodologies, Eq. (5) considers the average mark (AM) and the standard deviation (STD) in the course and student cohort, during the assessed year (y) and the 3 previous years. Hence, when students are exposed to a new pedagogical methodology, the increase in marks should be as high as possible, but related to the whole cohort's performance, so as to adjust for an overall improvement in the whole cohort rather than just in a given course/module. Similarly, a decrease in the standard deviation is expected with highly effective pedagogical methodologies, indicating a uniform understanding by the cohort of students (or the absence of students who were lost in some parts of the course). Moreover, with such an arrangement, metric 5 is independent of the grading scale. The multiplication parameter of 3 in Eq. (5) refers to the expected average of the classification scale used (cf. Table 2).

2.2. Assessment framework implementation

2.2.1. Assessment framework version 1.0 (without metric 5)
Pilot testing of the assessment framework was performed in the countries of the iTeach consortium members (Portugal, UK, France, Germany, Slovakia and Macedonia). Chemical reaction engineering courses were chosen for testing version 1.0 of the assessment framework because this is a core course of chemical engineering programmes that is delivered by all the institutions participating in this study. In addition, since the course contents are similar across these institutions (and thus the level of depth/knowledge/difficulty is similar), it provided the opportunity to assess the robustness of the assessment framework with each partner using different pedagogical methodologies in course delivery. Table 3 lists the pedagogical approaches that each partner institution used for teaching chemical reaction engineering when the pilot study took place (during the academic year 2015/2016).

2.2.2. Assessment framework version 2.0 (including metric 5)
Based on the outcomes of the first-stage piloting, the final version of the assessment framework was produced after simplifying/clarifying some questions of the online survey (upon comments
Table 3
Institutions piloting the assessment framework version 1.0 and pedagogical approaches used for teaching the chemical reaction engineering course.
Fig. 2. iTeach assessment framework (v. 1.0) measurements for chemical reaction engineering courses in Portugal, UK, France, Slovakia and Macedonia.
Fig. 3. Number of invitations vs. participants per stakeholder group for piloting the assessment framework version 2.0.
Fig. 4. Survey completion rate per stakeholder group when piloting the assessment framework version 2.0.
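The mark-based construction of metric 5 described in Section 2.1 (average mark and standard deviation of the course and of the whole cohort, over the assessed year and the three previous years) can be illustrated in code. Eq. (5) itself is not reproduced in this excerpt, so the exact combination below is an assumption: a hypothetical, scale-independent score centred on 3, sketched only to show how the described ingredients could fit together.

```python
from statistics import mean


def metric_5_sketch(course_am, cohort_am, course_std, cohort_std, y):
    """Hypothetical illustration of metric 5's ingredients; NOT the paper's Eq. (5).

    course_am/cohort_am: dict year -> average mark (AM);
    course_std/cohort_std: dict year -> standard deviation (STD).
    """
    prev = [y - 1, y - 2, y - 3]

    # Mark gain of the course relative to the cohort trend, so an
    # across-the-board improvement does not inflate the score.
    course_gain = course_am[y] / mean(course_am[p] for p in prev)
    cohort_gain = cohort_am[y] / mean(cohort_am[p] for p in prev)

    # A narrowing spread (more uniform understanding) is rewarded;
    # using ratios keeps the score independent of the grading scale.
    course_spread = course_std[y] / mean(course_std[p] for p in prev)
    cohort_spread = cohort_std[y] / mean(cohort_std[p] for p in prev)

    # No change relative to the cohort gives exactly 3 (the expected
    # average of the 1-5 classification scale); clamp to that scale.
    raw = 3 * (course_gain / cohort_gain) * (cohort_spread / course_spread)
    return min(5.0, max(1.0, raw))


# With identical course and cohort histories the score sits at the scale centre:
am = {2013: 12.0, 2014: 12.5, 2015: 12.2, 2016: 13.0}
sd = {2013: 2.0, 2014: 2.1, 2015: 1.9, 2016: 2.0}
print(metric_5_sketch(am, am, sd, sd, 2016))  # 3.0
```

Whatever the precise form of Eq. (5), this is the property the text emphasises: a course whose marks rise (and whose spread narrows) faster than the cohort's scores above 3, and one that lags the cohort scores below 3.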
Table 4
Comparison of the scores given by stakeholders to selected questions for the Disperse Generation course.

• 1.2 – Does it cover all the needs expected from a course of this nature at this level?
• 1.6 – Does this teaching unit (course) contribute to the attractiveness of the programme for the formation of future graduates?
• 2.1 – Is the content of the teaching unit (course) adequate?
• 6.2 – Does the course provide the opportunity to combine theory and practice to analyse the problems encountered in professional life?
• 6.5 – Does the course provide the opportunity to improve written and/or oral communication skills?
• 6.7 – Does the course promote students' management capabilities?
Fig. 5. Parity plots of metric values for the pairs a) Graduates-Students and b) Academics-Employers when piloting the assessment framework version 2.0.

The overall assessment framework results are expressed through the six metrics and are plotted in Fig. 6. Fig. 6 shows that the Disperse Generation course was positively assessed by the stakeholders in 5 out of the 6 metrics of the assessment framework. Metrics 1–4 and 6 are all ranked above 4, actually in the range 4.3–4.5, while metric 5 yielded a value of 3.0. The value of metric 5 means that in the current year the pedagogical methodology neither improved nor worsened the teaching efficiency, which

The Employers' group assessment of the course in terms of responses to questions 1.2, 2.1 and 6.2 was lower than that of other stakeholders (cf. Table 4). This is recognised by the course coordinator, as the lack of time hinders the possibility to address other important topics not covered in the current programme in a highly technological and rapidly progressing field of knowledge. This could possibly be achieved through the introduction of another course, or a course with a higher ECTS value. On the contrary, the Employers were more enthusiastic than other stakeholders in terms of question 1.6, considering that the Disperse Generation course plays an important role in the attractiveness of the Electrical and Computers Engineering programme, as it is a very timely topic with increas-
Appendix A. Supplementary data

Supplementary material related to this article can be found, in the online version, at https://doi.org/10.1016/j.ece.2019.06.001.

References

Allan, J., Clarke, K., Jopling, M., 2009. Effective teaching in higher education: perceptions of first year undergraduate students. Int. J. Teach. Learn. High. Educ. 21, 362–372.
Baird, M., Engberg, J., Hunter, G., Master, B., 2016. Improving Teaching Effectiveness: Access to Effective Teaching: The Intensive Partnerships for Effective Teaching Through 2013–2014. RAND Corporation, Santa Monica, CA.
Bembenutty, H., 2009. Teaching effectiveness, course evaluation, and academic performance: the role of academic delay of gratification. J. Adv. Acad. 20, 326–355.
Burnett, P., Clarke, J., 1999. How should a vocational education and training course be evaluated? J. Vocat. Educ. Train. 51, 607–628.
Cabrera, A.F., Colbeck, C.L., Terenzini, P.T., 2001. Developing performance indicators for assessing classroom teaching practices and student learning. Res. High. Educ. 42, 327–352.
Chalmers, D., 2007. A Review of Australian and International Quality Systems and Indicators of Learning and Teaching. The Carrick Institute for Learning and Teaching in Higher Education, Chippendale, NSW.
Department for Education, 2016. Success As a Knowledge Economy: Teaching Excellence, Social Mobility and Student Choice. Cm9258, London. ISBN 9781474132862, BIS/16/265.
Drucker, P., 1992. Managing for the Future. Butterworth-Heinemann, New York.
Favre, E., Falk, V., Roizard, C., Schaer, E., 2008. Trends in chemical engineering education: process, product and sustainable chemical engineering challenges. Educ. Chem. Eng. 3, e22–e27.
Felder, R.M., Rugarcia, A., Stice, J.E., 2000a. The future of engineering education V - assessing teaching effectiveness and educational scholarship. Chem. Eng. Educ. 34, 198–207.
Felder, R.M., Woods, D.R., Stice, J.E., Rugarcia, A., 2000b. The future of engineering education II - teaching methods that work. Chem. Eng. Educ. 34, 26–39.
Fink, L.D., Ambrose, S., Wheeler, D., 2005. Becoming a professional engineering educator: a new role for a new era. J. Eng. Educ. 94, 185–194.
Glassey, J., Brown, D., 2014. Enhancing Industry Exposure for Student Chemical Engineers. Chemeca 2014, Perth, Australia.
iTeach project, Improving Teaching Effectiveness in Chemical Engineering Education, https://research.ncl.ac.uk/iteacheu/, date of last access: 14 February 2019.
Jollands, M., Jolly, L., Molyneaux, T., 2011. Project based learning as a contributing factor to graduates' work readiness. Eur. J. Eng. Educ. 37, 143–154.
Kockmann, N., Lutze, P., Gorak, A., 2016. Grand challenges and chemical engineering curriculum – developments at TU Dortmund university. Univers. J. Educ. Res. 4, 200–204.
Mills, J.E., Treagust, D.F., 2003. Engineering education – is problem-based or project-based learning the answer? Aust. J. Eng. Educ. 3, 2–16.
Olds, B.M., Moskal, B.M., Miller, R.L., 2005. Assessment in engineering education: evolution, approaches and future collaborations. J. Eng. Educ. 94, 13–25.
Westmoreland, P., 2008. Chemical engineering in the next 25 years. Chem. Eng. Prog., 31–41.
Westmoreland, P.R., 2014. Opportunities and challenges for a golden age of chemical engineering. Front. Chem. Sci. Eng. 8, 1–7.