Improvements To An Electrical Engineering Skill Audit Exam To Improve Student Mastery of Core EE Concepts

You might also like

Download as pdf or txt
Download as pdf or txt
You are on page 1of 4

184 IEEE TRANSACTIONS ON EDUCATION, VOL. 54, NO.

2, MAY 2011

Improvements to an Electrical Engineering Skill


Audit Exam to Improve Student Mastery
of Core EE Concepts
David W. Parent, Member, IEEE

Abstract—The San José State University Electrical Engineering • Digital: Use a Karnough map (K-map) to reduce the
(EE) Department implemented a skill audit exam for graduating number of logics gates required to implement a Boolean
seniors in 1999 with the purpose of assessing the teaching and the logic function.
students’ mastery of core concepts in EE. However, consistent low
• Solid State: Explain what channel length means.
scores for the first years in which the test was administered sug-
gested that students had little incentive in reviewing for passing • Electromagnetics: Using Transmission Line Simulator
the test. To promote the concept that there is a set of basic skills data, determine the wavelength of a signal.
every graduating senior should master, improvements were imple- • Circuits: Determine the gain of an inverting or noninverting
mented that included creating an online exam used for review and OPAMP.
requiring students to earn a 70% or higher score on an in-class • Systems: Given transient current and voltage data from a
exam that is a requirement for graduation. After the improvements system, determine the level of damping.
were made, all students demonstrated at least 70% mastery of the
The student’s exam score counted for 10% of the student’s
core EE concepts as measured by the improved in-class skill audit
exam. The details of these improvements are presented. Anony- final grade in the senior project class. The grading was done by
mous survey results of graduating seniors indicate that students the faculty expert who wrote the items for the area.
feel that preparing for the skill audit exam was a good use of their After collecting data about student mastery of core EE con-
time and that the exam should be made more rigorous for future cepts (as measured by the skill audit exam), it was agreed by
students. the faculty that the mastery demonstrated by the students was
Index Terms—ABET, capstone engineering experience, elec- too low and that improvements needed to be made to improve
trical engineering education, exit exam, learning management student mastery of core EE concepts. The means of the scores
system. for each core topic area of the skill audit exam by semester can
be seen in Table I. The overall means and standard deviations
for the Digital, Solid State, Electromagnetics, Circuits, and Sys-
I. INTRODUCTION tems areas for the 14 semesters shown were (46.5%, 6.7), (50%,
11.9), (34.4%, 11.4), (47%, 13). and (52.6%, 11.6), respectively.
Given that the faculty stated the exam items were important for

A S PART OF ABET 2000 [1], [2], the EE Department at


San José State University, San José, CA, added the re-
quirement for all students enrolled in the department’s capstone
the students to master, these low scores were cause for concern.
II. IMPROVEMENTS TO THE SKILL AUDIT EXAM
senior design class [3] to take a skill audit exam [4]. The orig- Several changes were made to the skill audit exam to ad-
inal purpose of the exam was to measure the department’s se- dress faculty concerns about the students’ mastery levels. These
niors’ mastery level of core EE concepts and use this data to changes had two independent goals. The first was to determine
improve the delivery of these core concepts in the department’s in a non-resource-intensive manner what students, unprepared,
core classes. The faculty stated that an exam under their control remembered from the curriculum. From this data, necessary im-
would give better feedback than using the Fundamental Exam provements to the curriculum could be determined. The second
(FE) [5]–[7]. In addition, the faculty did not want to implement goal was to help students improve their mastery of these core
a qualifying exam as described in [8] and [9]. concepts before they graduated.
The original skill audit exam consisted of 50 items in a free- The improvements1 made to improve student mastery of core
response format. The students took the exam during a 3-h pe- EE concepts consisted of breaking the skill audit exam into
riod under the supervision of a proctor. The students were not two parts. The first part consisted of an online exam with mul-
given any information about what was to be on the exam be- tiple-choice items that the students used to review core EE skills
cause the faculty wanted to measure what the students remem- and that measured student mastery without preparation. The
bered without preparation. The items, for the most part, were second part consisted of a 10-item in-class exam with questions
intended to measure student mastery levels without preparation. similar to, but different than, the online version of the exam.
Typical items for each area included the following. The in-class exam measured student mastery levels after prepa-
ration. An additional component of the intervention was that stu-
Manuscript received September 02, 2009; manuscript revised November 18, dents had to earn a passing score on the 10-item in-class exam
2009. Date of publication March 15, 2010; date of current version May 04, 2011. (in this case 70%) to receive a grade in the senior project class
The author is with the Electrical Engineering Department, San José State Uni- (EE198B). This made the exam into a Go/No Go (pass/fail) exit
versity, San José, CA 95192-0084 USA (e-mail: dparent@email.sjsu.edu).
Digital Object Identifier 10.1109/TE.2010.2042451 1These were made with department faculty input.

0018-9359/$26.00 © 2010 IEEE


PARENT: IMPROVEMENTS TO EE SKILL AUDIT EXAM 185

TABLE I
MEAN SCORES OF THE SKILL AUDIT EXAM FROM SPRING 1999 TO SPRING 2007 BY CONTENT AREA

Missing data.

exam [10]. The in-class exam was reduced from 50 items in the
original exam to 10 items from the original exam so that the
cost of administrating the test would be reduced. With the cost
of offering the test reduced, students were allowed to retake the
in-class exam until they passed.
The online exam was created with an automatic quiz gener-
ator, using the Blackboard software [11], to allow the students
to review the skills they were being asked to master and to mea-
sure student mastery levels without preparation. Students were
able to take the exam until they achieved the mastery level they
were comfortable with. Another advantage to offering the exam
online was that student performance by item could be easily cap-
tured and displayed. This data could then be used to study item Fig. 1. Student mastery levels for core content areas measured by the online
validity and improve the test. exam for three semesters.
The 10-question pass/fail in-class exam measures student
mastery of core EE skills after preparation, and the pass/fail na- analysis of the raw data provided by the automated quiz gener-
ture of the exam ensured that students will study until they pass ator, the increased mastery level shown by the online exam re-
the exam. The underlying assumption that drove the decision sults was likely due to the students taking the online exam more
to make passing the in-class exam a graduation requirement often until a higher level of mastery was achieved. If the con-
was that even though the students may have known the core EE tent areas were to be scored only taking into account the first
skills at one time, the students needed to be motivated to put attempt of students in recent semesters, the means of the con-
in the effort required to master these skills. Even though some tent areas would drop to those encountered in semesters before
programs encourage motivation (buy-in) from the student by the improvements were carried out.
setting the exit exam to be worth 66% of the total grade in a In Fig. 2, item mean (closed boxes), and discrimination
project course [12], the department faculty determined it not to (closed circles) are plotted against the item identifier. The
be appropriate to give an exam so much weight in a hands-on discrimination factor is an estimate of how well a particular
project course. item can predict the performance of a student on the whole
exam. The item mean suggests the level of difficulty of the
III. RESULTS OF THE IMPROVEMENTS item. In Fig. 2, it can be seen that the item “Systems 7” had a
Mean scores for each content area from the online exam for discrimination factor of zero, which means that the item does
the first semester the intervention was implemented (Fall 2007) not predict the overall performance of the student on the exam.
and the last two semesters can be seen in Fig. 1. The sample This item also had one of the highest means. This indicates that
size for each semester was greater than 40. The results of the the item might be too easy and that it should be removed from
Fall 2007 online exam match very closely with the results from the exam.
the previous 14 semesters using the 50 question in-class exam The in-class exam results after the improvements were made
for which the students were not able to prepare. were very positive. All the students achieved at least 70% mas-
The average mastery level (as measured by the means of on- tery in the in-class exam; 76% of the students demonstrated
line test scores) in all the core content areas increased in subse- mastery on the first try, 23% on the second try, and 1% on the
quent years since the online exam was first offered. After careful third try. The trend of a few students taking more than once to
186 IEEE TRANSACTIONS ON EDUCATION, VOL. 54, NO. 2, MAY 2011

[15]. The 70% pass rate was selected with the idea that students
might select the wrong answer on an exam by accidentally cir-
cling the wrong answer out of nervousness. Given that only one
student earned a 70% on the exam without preparation in all the
years the exam was offered, 70% seemed a high goal to achieve.
The problem with the 70% mastery level is that a student can
show a complete lack of mastery (fail) a content area of the exam
and still earn a passing score. Data analysis showed that students
who failed the in-class exam were those who did not take the on-
line exam. Most of the wrong answers were not a “typo,” but a
memorization of the wrong answer, or solving the problem in-
correctly. (In the in-class exam of record, students write on the
exam paper, so what they were thinking can sometimes be de-
duced.) Given this data, the passing score of the exam should
probably be raised.
Fig. 2. Item mean (closed boxes) and discrimination (closed circles) from the
Spring 2009 semester. The items are arranged from the lowest discrimination V. STUDENT PERCEPTIONS OF THE SKILL AUDIT EXAM
factor to highest.
A survey was given to students enrolled in the senior design
project both to determine if they felt that the effort required to
demonstrate mastery in the in-class exam has continued to the
prepare for the skill audit exam was an effective use of their time
most recent semester. The students who did not pass the in-class
and to determine if the exam should be improved by increasing
exam were the students who did not complete the online version
the score required to pass it or by making the questions more
of the exam.
realistic.
Prior to the intervention, there was some student resistance
The students answered the survey voluntarily and anony-
to the idea of an exit exam due to the sensitive nature of high-
mously at the end of their senior project. The survey and student
stakes exams. This resistance was assumed to be from a fear
of failure. The student resistance to the exit exam was reduced responses can be seen in Table II. The survey was kept short and
by allowing the students to take and pass the exam early (after consisted of three questions concerning the worth of preparing
taking the online exam). Interestingly, once they had passed the for the exam and two questions concerning improvements to
exam, many students stated that the passing score should be set the exam.
at 90% or higher. The students were almost evenly split between those who
thought that preparing for the skill audit exam was at least some-
what helpful and those who thought preparing for the exam was
IV. MANAGING A GO/NO GO (PASS/FAIL) EXIT EXAM not helpful (Item 1, Table II). Given that some student projects
are at a high level of systems integration, and most students pass
Given that the skill audit exam was changed into a pass/fail
the exam two months prior to the start of their project, it is not
exit exam, and that exit exams such as the California High
surprising that many students did not feel that the preparation
School Exit Exam [13] can be controversial [14], efforts were
was not worth the time as it related to their project. These results
made to ensure that no one group of students was penalized by
could indicate that more systems questions need to be included.
implementing the pass/fail exit exam. Additional concerns with
A majority of students felt that preparing for the skill audit
respect to high-stakes exit exams are having the exam become
exam was at least somewhat helpful for job interviews (Item 2,
an end in itself (teaching to the test), maintaining a secure pool
Table II). This result indicates that further refinements to the
of items and making sure that high pass rates are not achieved
skill audit exam should be made to prepare students for the
by lowering the required mastery level.
workforce, even though it is too soon to have feedback from
Evidence that the exam has not become an unreasonable
local industry professionals as to a difference in SJSU EE stu-
burden on minority students is that all students, regardless of
dents.
minority status, have passed the exam since the changes were
Interestingly, when the students were asked simply if
made in Fall 2007.
preparing for the skill audit exam was worth it (Item 3,
Exam security is always a concern and is handled in several
Table II), a large majority answered “yes” by a higher margin
ways. First, the motivation to cheat is removed because the on-
than for the first two items measuring the skill audits exam’s
line exam is only used by the student to review; it does not count
worth. This probably means that preparing for the skill audit
for passing the in-class exam. Second, the mean scores for the
exam is good for the students who have not been captured by
semester are also monitored. For instance, if the cheating was
the survey.
occurring on the online exam, one would expect the means of the
The results from items 4 and 5 (Table II) support the idea
student’s first attempt of the exam to increase. Third, items on
that students feel the skill audit exam should be made more dif-
the in-class exam are modified to ensure the students are not re-
ficult to pass. A majority of students felt the questions should
warded for memorizing answers. Memorization of answers has
be made more realistic and the passing score should be at least
been detected because students have been found to select the an-
80%. Given that items 2 and 3 strongly indicated that students
swer that was correct on the online version of the exam but was
feel that preparing for the skill audit exam was worth spending
not the correct answer for the exam of record.
time on,2 and that items 4 and 5 indicated that the exam should
Even though the pass/fail exit exam intervention did increase
student performance, the passing score of 70% may be too low 2with Item 3 weakly supporting the worth of the exam
PARENT: IMPROVEMENTS TO EE SKILL AUDIT EXAM 187

TABLE II
SKILL AUDIT EXAM SURVEY ITEMS AND RESULTS FOR GRADUATING SENIORS

be more rigorous, the exam requirements were changed for the [2] M. Mendelson and R. Noorani, “Improving undergraduate engineering
Fall 2009 semester. The passing rate was changed to 80% given education in the US and abroad using EC 2000,” in Proc. Int. Conf. Eng.
Educ., Taipei, Taiwan, Jan. 15, 2000.
that this was the most cost-effective improvement that could be [3] “EE 198 senior design project resource home page,” May 11, 2009
made. [Online]. Available: www.engr.sjsu.edu/dparent/ee198
[4] “Assessment for curricular improvement,” May 11, 2009 [Online].
VI. CONCLUSION Available: www.engin.umich.edu/teaching/assess_and_improve
[5] “FE exam,” May 11, 2009 [Online]. Available: http://www.ncees.org/
The improvements made to the skill audit exam (which in- Exams/FE_exam.php/
cluded breaking up the old in-class exam into two parts: an on- [6] M. Rao, S. Junaidu, T. Maghrabi, M. Shafique, M. Ahmed, and K.
line exam for review and an in-class 10-item exit exam) has al- Faisal, “Principles of curriculum design and revision: A case study in
implementing computing curricula CC2001,” in Proc. 10th Annu. Conf.
lowed the students to increase their mastery level of core EE Innov. Tech. Comput. Sci. Educ., Portugal, 2005, pp. 256–260.
concepts immediately before they enter the workforce or grad- [7] K. Sanders and R. McCartney, “Program assessment tools in computer
uate school while still allowing the department to gather infor- science: A report from the trenches,” in Proc. 34th SIGCSE Tech. Symp.
Comput. Sci. Educ., Reno, NV, 2003, pp. 31–35.
mation on how well the students remember core EE concepts. [8] J. Bishop, “The effect of curriculum-based external exit exam systems
The online exam allows the gathering of information in a non-re- on student achievement,” J. Econ. Educ., pp. 171–182, 1998.
source-intensive manner. At the time of writing, two years after [9] J. Bishop, “Are national exit examinations important for educational
the improvements were implemented, there is no detectable stu- efficiency?,” Swedish Econ. Policy Rev., vol. 6, pp. 349–398, 1999.
[10] “Soldier’s Manual of Common Tasks—Warrior Skills Level 1,” May 11,
dent resistance to the idea that they should demonstrate mastery 2009 [Online]. Available: http://www.scribd.com/doc/2472749/Army-
though an exit exam. stp21-1-Soldiers-Manual-of-Common-Tasks-Warrior-Skills-Level-1
More improvements to the skill audit exam need to be made. [11] Blackboard May 11, 2009 [Online]. Available: http://www.blackboard.
First, the item bank should be expanded. Second, the item level com/
[12] J. Liu, J. Marsaglia, and D. Olson, “Teaching software engineering to
could be used to develop items with higher discrimination scores make students ready for the real world,” J. Comput. Sci. Coll., vol. 18,
and diverse difficulty levels. Each item’s discrimination factor pp. 43–50, 2002.
and mean can be used to develop exams with high discrimina- [13] K. J. Sullivan, “In new study, high school exit exam gets a failing
grade,” Apr. 21, 2009 [Online]. Available: http://storybank.stanford.
tion factors and diverse difficulty levels. Third, a higher passing edu/stories/in-new-study-high-school-exit-exam-gets-a-failing-grade
score should be set to encourage mastery of EE core concepts. [14] B. Jacob, “Getting tough? The impact of high school graduation
exams,” Educ. Eval. Policy Anal., pp. 99–121, 2001.
ACKNOWLEDGMENT [15] J. Betts, R. Costrell, H. Walberg, M. Phillips, and T. Chin, “Incentives
and equity under standards-based reform,” Brookings Papers Educ.
M. Adams is gratefully acknowledged for his help setting up Policy, vol. 2001, pp. 9–74, 2001.
the Blackboard software, as is T. Hudson for her constructive
editing help. David W. Parent (S’96–M’99) received the Ph.D. degree in electrical engi-
neering from the University of Connecticut, Storrs.
Currently, he is an Associate Professor with the Electrical Engineering De-
REFERENCES partment, San José State University (SJSU), San Jose, CA, where he teaches
[1] “ABET,” May 11, 2009 [Online]. Available: www.abet.org CMOS processing and device physics and conducts bio-interfacing research.

You might also like