
Session 1793

An Example of Course and Program Outcome Assessment


Amir Karimi, Keith Clutter, and Alberto Arroyo
College of Engineering
The University of Texas at San Antonio

Abstract

This paper describes a process for systematic evaluation and updating of the undergraduate
educational objectives and outcomes of the engineering programs at the University of Texas at
San Antonio (UTSA). It describes a set of assessment tools, which includes survey
questionnaires, test results, and interviews. The course objectives are defined and evaluated for
each subject in the curriculum by the faculty on a regular basis to ensure that the program
outcomes are being met. A group of three or four faculty members is assigned to evaluate each
course's outcomes on a continual basis, and the group's recommendations are used to make course
improvements. This paper will discuss the course and program outcome assessments. It will
explain how assessment data are collected, analyzed, and used in the enhancement of our
undergraduate programs.

Introduction

The College of Engineering at UTSA offers three undergraduate degree programs in civil,
electrical, and mechanical engineering (CE, EE, and ME). These programs started in 1982 and
received their first accreditation from the Accreditation Board for Engineering and Technology,
Inc. (ABET) in 1986. In preparation for the ABET accreditation under the Engineering Criteria
2000 (EC-2000) [1], a set of educational objectives and outcome statements for each program was
first developed in 1999 and refined in 2002 [2-4]. The program objectives are in line with the
missions of the department, college of engineering, and the institution. In addition these
objectives are consistent with the requirements for ABET accreditation under the Engineering
Criteria 2000. These objectives have been reviewed and approved by the major constituencies of
each department. A process is in place for systematic evaluation and updating of each
department’s undergraduate educational objectives and outcomes and the engineering faculty are
directly involved in the assessment process. Procedures have been developed to obtain feedback
from all major constituencies, evaluate the inputs, and process the collected data for assessment.
A set of assessment tools has been developed and is being used to evaluate program objectives
and outcomes. The assessment tools include survey questionnaires, test results, and interviews.
For each subject in the curriculum, the course objectives are defined and are being evaluated by
the faculty on a regular basis to ensure that the program outcomes are being met. The following
section describes the assessment process.

Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition
Copyright © 2004, American Society for Engineering Education
Evaluation of Educational Objectives

A process is in place for systematic evaluation and updating of each department's program
objectives. Departmental faculty and the College Accreditation Committee conduct
the evaluation. The Accreditation Committee consists of the Associate Dean for Academic
Affairs, all department chairs, a faculty representative from each department, and the director of
the College of Engineering Undergraduate Advising Office. The Accreditation Committee has
developed procedures for each department to obtain feedback from all major constituencies,
evaluate the input, and process the collected data for assessment. Input from alumni, employers,
and the industry advisory board is obtained through surveys, interviews, personal contacts, and
industrial advisory board meetings. The faculty in each department review the collected
information, evaluate the input, and work with the constituencies to revise and propose changes
to the program objectives. The Accreditation Committee ensures that changes to program
objectives are compatible with the College and University mission and at the same time clearly
reflect the unique features of each department.

Outcomes Required to Achieve Objectives

The faculty members in each department have developed a set of outcome statements for
achieving program objectives. For example, the mechanical engineering department has
identified a total of thirteen (13) outcomes. The relationships between student outcomes and the
objectives A, B, C, and D are shown below.

Objective A: Provide students with opportunities to acquire the ability to apply the
fundamentals of mathematics, sciences and engineering to quantitatively analyze problems.

Outcomes for Objective A: Students in this program will develop the following abilities
through their undergraduate education in this department:
A-1 to use the principles from chemistry, physics, statistics, and mathematics in engineering
applications
A-2 to use computer-based tools for engineering applications
A-3 to identify, formulate, and solve engineering problems

Objective B: Provide students with opportunities to develop innovative design skills, including
the ability to formulate problems, to think creatively, to synthesize information, and to
communicate effectively.

Outcomes for Objective B: Students will develop the following abilities through their
undergraduate education in this department:
B-1 to formulate design problem objectives, constraints, and synthesize problem information
B-2 to develop creative and innovative designs that achieve desired performance criteria within
specified objectives and constraints
B-3 to communicate effectively through written, oral, and graphical presentations

Objective C: Provide students with opportunities to develop the ability to use modern
experimental techniques; collect, analyze, and interpret experimental data; and effectively
communicate the results.

Outcomes for Objective C: Students will develop the following abilities through their
undergraduate education in this department:
C-1 to design and conduct experiments to analyze and interpret experimental data
C-2 to use modern engineering tools, software, and laboratory instrumentation
C-3 to communicate effectively through technical presentations

Objective D: Provide opportunities to prepare students with the diverse skills needed to be
successful engineers.

Outcomes for Objective D: Students will be introduced to the following issues through their
undergraduate education in this department and will gain:
D-1 an ability to work in teams to solve multi-faceted problems
D-2 an ability to understand and contribute to the challenges of a rapidly changing society
D-3 an understanding of ethical and societal responsibilities of professional engineers
D-4 an understanding of the need for lifelong learning and continuing professional education

The department educational objectives also relate to ABET's Criterion 3 a-k, “Program Outcome
and Assessment” [1]. The relationships are summarized in Table 1.

Outcomes Assessment Process

The program outcomes are mainly achieved through the curriculum. Student activities in student
professional organizations augment the curriculum in achieving the stated outcomes. The
Accreditation Committee has identified a set of tools to monitor student progress in achieving the
outcomes. The assessment instruments fall into three general categories: audits, surveys, and
student performance results.

The feedback cycle varies for each of the instruments. While some provide immediate feedback
on student progress in achieving the outcomes and allow corrective actions to be made at the
beginning of each semester, others require long-term analysis over several years.

Audits

Several types of audits are identified for assessing program outcomes.

Curriculum Audit: Provides information on how the curriculum contributes to the program
outcomes. For example, the ME undergraduate program is designed to meet the student
outcomes for objectives A through D. Table 2 maps the mechanical engineering objectives to the
program outcomes. It shows whether each course makes either primary (1) or secondary (2)
contributions to program outcomes. It is clear that all educational objectives are addressed
throughout the curriculum. Other instruments, such as surveys and test results, are used to
determine whether students are achieving program outcomes. Analysis is an ongoing process,
and the curriculum will be adjusted when necessary.
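As a rough illustration of what a curriculum audit checks, the mapping in Table 2 can be treated as a course-to-outcome matrix and scanned for outcomes that lack a primary contributor. The course numbers and markings below are a hypothetical excerpt for illustration, not the department's actual data:

```python
# Hypothetical excerpt of a Table 2-style mapping:
# course -> {outcome: 1 (primary contribution) or 2 (secondary contribution)}
curriculum = {
    "MAT 1214": {"A-1": 1},
    "ME 1403":  {"A-2": 1, "B-3": 2},
    "ME 4813":  {"B-1": 1, "B-2": 1, "B-3": 1, "D-1": 1},
}

outcomes = ["A-1", "A-2", "B-1", "B-2", "B-3", "D-1"]

# An outcome is considered covered if at least one course makes a
# primary (1) contribution to it somewhere in the curriculum.
uncovered = [o for o in outcomes
             if not any(m.get(o) == 1 for m in curriculum.values())]
print(uncovered)  # [] -> every listed outcome has a primary contributor
```

An empty result here corresponds to the paper's conclusion that all educational objectives are addressed throughout the curriculum; a non-empty list would flag outcomes needing curricular adjustment.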
Advising Process and Enforcement of Prerequisites: Advising and enforcement of course
prerequisites ensures that the students are taking required courses in the proper sequence. The
College of Engineering has established a policy requiring all students to see a faculty advisor
before registering for courses each semester [5, 6]. Each faculty member receives a degree audit plan for the
students assigned to him/her. The degree audit plan shows the completed courses with grades
and the list of remaining courses required for the degree. A system to check prerequisites has
been implemented. At the beginning of each semester, the academic advisors check each student
record for the required course prerequisites and those lacking the required prerequisites are
dropped administratively from the course.
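The prerequisite check described above can be sketched as a simple set difference between a course's required prerequisites and a student's completed courses. The course numbers and prerequisite lists below are hypothetical placeholders, not the registrar's actual data:

```python
# Hypothetical prerequisite table; real prerequisite data would come
# from the catalog / student records system.
PREREQS = {
    "ME 3293": {"EGR 2213", "PHY 1904"},  # example prerequisites only
}

def lacking_prereqs(course, completed):
    """Return the set of prerequisites a student still lacks for a course."""
    return PREREQS.get(course, set()) - set(completed)

# A student missing a prerequisite is flagged and, per College policy,
# administratively dropped from the course:
print(lacking_prereqs("ME 3293", ["PHY 1904", "MAT 1214"]))  # {'EGR 2213'}
```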

Course Transfer Audit: Transfer credits from community colleges and other universities are
reviewed and approved by the engineering advisors and department chair for quality and topic
coverage. Typically, Core Curriculum requirements and lower division courses such as calculus,
physics, and introductory engineering courses are automatically accepted. Transfer of upper
division courses requires a careful review by the faculty and department chair.

Table 1. Correlation of Program Objectives with ABET Criterion 3 a-k “Program Outcome and
Assessment” [1]

Objective A, “fundamental analysis skills” (acquire the ability to apply the fundamentals of
mathematics, sciences and engineering to quantitatively analyze problems):
(a) an ability to apply knowledge of mathematics, science, and engineering
(e) an ability to identify, formulate, and solve engineering problems

Objective B, “creative thinking and design skills” (develop innovative design skills, including
the students' ability to formulate problems, to think creatively, to synthesize information, and
to communicate effectively):
(c) an ability to design a system, component, or process to meet desired needs
(g) an ability to communicate effectively
(k) an ability to use the techniques, skills, and modern engineering tools necessary for
engineering practice

Objective C, “experimental and data analysis skills” (develop the ability to use modern
experimental techniques; collect, analyze, and interpret experimental data; and effectively
communicate the results):
(b) an ability to design and conduct experiments, as well as to analyze and interpret data
(g) an ability to communicate effectively
(k) an ability to use the techniques, skills, and modern engineering tools necessary for
engineering practice

Objective D, “diverse career skills” (prepare students with the diverse skills needed to be
successful engineers):
(d) an ability to function on multi-disciplinary teams
(f) an understanding of professional and ethical responsibility
(h) the broad education necessary to understand the impact of engineering solutions in a global
and societal context
(i) a recognition of the need for, and an ability to engage in, life-long learning
(j) a knowledge of contemporary issues
Table 2. Mapping of curriculum to ME program outcomes (first through fourth semesters).
[Placeholder for the original matrix: each required course in semesters one through four
(University Core courses, Calculus I and II, General Chemistry, Technical Physics I and II with
labs, Freshman Composition, Exploring the Engr. Profession, Engr. Graphics & Design, Statics
and Dynamics, Engineering Analysis I and II, Appl. Prob & Statistics-Engr, Materials
Engineering with lab, Kinematics and Dynamics, and Electronics & Data Acq. Lab) is marked as
a primary (1) or secondary (2) contributor to the program outcomes A-1 through D-4.
* Satisfies the University Core Curriculum.]
Table 2 (continued). Mapping of curriculum to ME program outcomes (fifth through eighth
semesters). [Placeholder for the original matrix: each required course in semesters five through
eight (University Core courses, Intro Microeconomics, Solid Mechanics, Fluid Mechanics,
Thermodynamics I and II, Heat Transfer & Rate Processes, Materials Engineering with lab,
Numerical Methods in ME, Dynamic Systems and Control, Mechanism Design, FEA in
Mechanical Design, Thermal/Fluid Lab, Mech. Systems/Controls Lab, ME Project Planning Lab,
ME Design Project, an approved math/science elective, and technical electives) is marked as a
primary (1) or secondary (2) contributor to the program outcomes A-1 through D-4.
* The mapping of technical elective courses is shown in Table 2b.]
Table 2b. Mapping of technical elective courses to the ME program outcomes. [Placeholder for
the original matrix: each technical elective (ME 3263 Materials Processing, ME 3323 Dynamics
of Mech. Systems, ME 3823 Machine Element Design, ME 4113 Engineering Fracture
Mechanics, ME 4183 Propulsion, ME 4243 Intermediate Materials Engr., ME 4323 Thermal
Systems Design, ME 4343 Heating, A/C, and Refrig. Des., ME 4613 Power Systems Design,
ME 4623 Internal Combustion Engines, ME 4663 Fluid System Design, ME 4723 Reliability
and Quality Control, and ME 4963 Topics in Bioengineering) is marked as a primary (1) or
secondary (2) contributor to the program outcomes A-1 through D-4.]

Course Audit: This is a short-cycle assessment tool that provides immediate diagnostic feedback.
Each course is peer reviewed every semester by a set of faculty assigned to the course. The
reviewing faculty are knowledgeable about the course since they are typically on the teaching
rotation for the course. The course audit involves the review of two sets of documents: the
course notebook and the course portfolio. The course notebook contains samples of student work
(graded homework, exams, projects, etc.); three examples of student work are collected (one high
grade, one average grade, and one low grade). The course portfolio is a set of documents that are
permanently maintained for the course. These documents include the results and analysis of the
prerequisite tests, the results and analysis of student course surveys, grade distributions, and all
previous assessments and recommendations by the group of faculty assigned to the course. These
documents are fully described later in this paper. The reviewing faculty group reviews all the
materials in the course portfolio and course notebook and provides feedback on topic coverage
and on whether the course objectives are being met.

Surveys

Survey instruments for assessing the outcomes are as follows:

Student Course Survey: This survey assesses student opinions on their success in reaching course
objectives. In Spring semester 2002 each department began to develop survey instruments to
obtain feedback from students regarding educational objectives and outcomes. In that semester
each department conducted surveys in selected undergraduate courses. Courses surveyed
included engineering foundation courses, an engineering science course, two laboratory courses,
a technical elective, and capstone design course. The student course survey process was
expanded to include all courses offered in Fall 2002. Course surveys are typically conducted
near the end of the semester (weeks 12-14). The student course survey coincides with the
student rating of instruction. Since Fall 2001, UTSA has used the instrument developed by the
Individual Development and Educational Assessment (IDEA) Center [7] to evaluate the quality of
instruction. Questions 1-47 of the instrument are used exclusively for the evaluation of
instruction. This instrument also allows an option of asking 19 additional questions (questions
48-66). Starting in Fall 2002, we began using items 48-66 to assess the learning objectives for
each course. A set of additional questions was developed for each course and was given to
students during the course instructor survey.

The questionnaires included information on prerequisites and a list of course objectives. These
were followed by a set of questions seeking student opinions on their knowledge of prerequisite
topics and their success in meeting the learning objectives of the course. Table 3
shows the results of the course survey for ME 3293-Thermodynamics I. Students had the
following choices in response to questions on the survey: 1= definitely false, 2= more false than
true, 3= in between, 4 = more true than false, and 5= definitely true. Table 3-b gives the number
of responses in each category, the average score, and standard deviation for each question.
Occasionally, participants omitted responses to some of the questions, as shown in Table 3. We
also suspect that almost one third of participants did not realize that there were additional items
on the back of the questionnaire: items 13-19 were printed on the back, and the results show
that almost 30% of participants omitted these items.

An average score above 3 represents a collectively positive attitude of respondents toward a
particular question. The average scores for all items in Table 3 are above 3.0. Items 1-5 are
related to prerequisite topics. Most respondents indicated that the basic calculus courses have
prepared them for ME 3293. The responses related to courses in technical physics and statics
were not as strong. The average score for the statement "Statics and Dynamics has prepared me
for this class (e.g. I have no difficulty applying free body diagrams in solving problems)” was
3.4. Items 6-19 in Table 3 relate to the course learning objectives. With the exception of items
11-13 and 17, the average scores for items related to the course learning objectives were greater
than or equal to 4.0. Items 11-13 are associated with the second law of thermodynamics and the
application of thermodynamic cycles. These topics are covered near the end of the semester.
Since the survey was conducted two weeks before the end of the semester, students had not yet
gained adequate experience with these topics.

Table 3 ME 3293 course survey – responses to specific questions. 1= definitely false, 2= more false
than true, 3= in between, 4= more true than false, 5= definitely true, O = omit

No. Question 1 2 3 4 5 O Avg s.d.


1 Calculus I (differential calculus) has prepared me for this class 0 2 5 4 7 1 3.9 1.1
2 Calculus II (integral calculus) has prepared me for this class 0 1 5 5 7 1 4.0 1.0
3 Technical Physics I (Mechanics and Heat) has prepared me for this class 1 4 1 6 6 1 3.7 1.3
4 Statics and Dynamics has prepared me for this class (e.g. I have no difficulty 1 4 4 5 4 1 3.4 1.2
applying free body diagrams in solving problems)
5 I have no difficulty applying differentiation and integration 0 0 5 7 6 1 4.1 0.8
6 I can define a thermodynamic system 0 0 1 7 10 1 4.5 0.6
7 I can evaluate properties, using thermodynamic tables and charts (general 1 0 2 5 10 1 4.3 1.1
compressibility, etc.)
8 I can evaluate properties, using appropriate equations for ideal gases 0 0 5 7 6 1 4.1 0.8
9 I can write down the mass rate balance equation and simplify it for a closed 0 0 2 8 8 1 4.3 0.7
system, an open system, a steady state process, or a transient process
(uniform state, uniform flow )
10 I can apply the first law of thermodynamics to a closed system, an open 0 0 3 6 9 1 4.3 0.8
system, a steady-state process, a transient process (uniform state, uniform
flow)
11 I can apply the second law of thermodynamics to the following systems and 1 0 4 5 8 1 3.8 1.1
processes to a closed system, an open system, a steady state process, a
transient process (uniform state, uniform flow)
12 I can apply the first law for thermodynamic cycles 1 0 6 6 5 1 3.8 1.1
13 I can apply the second law for thermodynamic cycles 1 0 7 6 4 1 3.7 1.0
14 I can define the thermal efficiency of a power cycle, the coefficient of 1 0 2 5 5 6 4.0 1.2
performance for a cooling cycle, or coefficient of performance for a heating
cycle (heat pump)
15 I can convert units, check dimensions, and solve problems using both US 1 0 0 4 8 6 4.4 1.1
customary units (ft, lbf, lb, Btu, hp) and SI (m, N, kg, J, W)
16 I have a firm understanding of the many applications of thermodynamics 0 1 2 5 5 6 4.1 1.0
encountered by engineers
17 I have utilized computers in solving engineering problems for this course 1 3 2 3 4 6 3.5 1.4
18 This course has helped develop/enhance my problem solving skills 0 0 3 4 6 6 4.2 0.8
19 The recitation hour in this course has helped me in learning the course 0 1 3 4 5 6 4.0 1.0
materials and enhancing problem solving skills

Student course surveys provide instructors immediate feedback on learning objectives at the end
of each semester. For example, Table 3 gives an average score of 3.5 for the statement "I have
utilized computers in solving engineering problems for this course." The average score for the
same item was 2.5 in Spring 2002 indicating that a number of students in ME 3293 did not fully
learn how to use the interactive computer software [8] or did not take advantage of the software
in solving thermodynamics problems. Better instructions for using the software were therefore
provided in Fall 2002, and the average score for this item increased by one point.

Student Exit Survey: A survey was conducted at the end of Spring semester 2002 to assess the
opinions of graduating seniors in the ME programs on their success in achieving program
outcomes and on their attitudes toward the department. Students had a choice of ranking each
item in the questionnaire as: exemplary = 5, positive = 4, neutral = 3, negative = 2, extremely
negative = 1. The results of this survey are summarized in Table 4. The average scores for all
questions are above 3.0, except question 6 (computer hardware and facilities), which has an average
score of 2.9. It appears that the students were not completely satisfied with the access to the
computer lab/lecture, since this space is heavily used for instruction. Additional lab space is
necessary to improve the situation. This survey provides important information for the
assessment of educational objectives and outcomes A-1 through A-3, B-1 through B-3, C-1
through C-3, and D-1 through D-4.

Table 4. Graduating seniors exit survey results. Student choices for rating each item were: 5 =
exemplary, 4 = positive, 3 = neutral, 2 = negative, 1 = extremely negative. Specific
outcome measures are shown in parentheses.
No. Question 5 4 3 2 1 T Avg
(Columns 5 through 1 give the number of respondents selecting each rating; T = total.)
1 Overall education at UTSA 0 9 0 2 0 11 3.6
2 Overall education in the COE 0 9 1 0 1 11 3.6
3 Engineering Faculty 4 6 0 0 1 11 4.1
4 Engineering Staff 2 6 2 0 1 11 3.7
5 Advising:
a COE Advising Center 2 6 2 1 0 11 3.8
b Faculty Advising 3 3 5 0 0 11 3.8
6 Computer hardware and facilities 0 2 6 3 0 11 2.9
7 Computer software tools 2 4 4 1 0 11 3.6
8 Laboratory
a Facility 1 4 2 4 0 11 3.2
b Equipment 1 4 3 2 1 11 3.2
9 Quality of mathematics courses
a Fundamental (Calculus) (A-1) 3 6 2 0 0 11 4.1
b Engineering Analysis (A-1) 4 4 2 1 0 11 4.0
c Probability and Statistics (A-1) 3 2 4 2 0 11 3.5
10 Quality of engineering courses
a Fundamental (A-1) 2 7 1 1 0 11 3.9
b Laboratory-oriented (C-2) 3 5 2 1 0 11 3.9
c Design-oriented (B-2) 4 4 2 0 1 11 3.9
d Technical electives (A-1, A-2, B-1, B-2) 4 5 1 0 0 10 4.3
e Senior capstone (B-1, B-2, B-3, D-1, D-3) 4 5 1 0 1 11 4.0
11 Your ability to apply
a Basic knowledge of mathematics, science, and engineering (A-1, A-2, A-3) 3 6 2 0 0 11 4.1
b Probability and statistics in engineering applications (A-1) 2 4 3 2 0 11 3.5
c Economics in engineering applications (B-2) 2 4 5 0 0 11 3.7
d Computer-based tools for engineering applications (A-2) 4 5 2 0 0 11 4.2
12 Your ability to
a Identify, formulate, and solve engineering problems (A-3) 2 7 0 2 0 11 3.8
b Design laboratory experiments (C-1) 3 6 0 2 0 11 3.9
c Design a system, component, or process (B-1, B-2) 2 5 2 1 1 11 3.5
13 Your ability to communicate technical information
a Written (B-3) 5 3 2 1 0 11 4.1
b Orally (B-3) 6 2 2 1 0 11 4.2
14 Your ability to function in teams (D-1) 4 5 1 1 0 11 4.1
15 Your capacity for lifelong learning (D-4) 6 3 0 2 0 11 4.2
16 Your understanding of professional and ethical responsibilities (D-3) 6 3 1 0 1 11 4.2

The IDEA Student Rating of Instruction Group Summary Report: This report summarizes the
quality of instruction in the department and shows how the course objectives identified by
instructors as essential or important relate to the student outcomes for the program objectives.

The learning objectives identified by faculty as "Essential" and "Important" are summarized
below. The relationships between the selected course objectives and the specific student
outcomes are also identified. The percentage column indicates how often the ME faculty
members have identified a course learning objective as important or essential. Not all items
given in the IDEA report are included here. Only those items related to our program objectives
are included.

Course Learning Objective % of ME Courses ME Outcome(s)


1. Gaining factual knowledge 68 A-1, A-3
2. Gaining the fundamental principles 90 A-1-A-3
3. Learning to apply course material 94 A-1-A-3
4. Developing specific skills 74 A-2, C-2
5. Acquiring skills working in a team 26 D-1
6. Developing creative capacities 26 B-2
7. Developing skills in oral and written communications 26 B-3
8. Learning how to use resources to find a solution 29 D-4

The results in the IDEA report indicate that, with the exception of course objective 5, the
student ratings of UTSA's ME courses in meeting the course objectives always exceed those for
the IDEA system as a whole.

Alumni and Employers Survey: Survey instruments have been developed to obtain input from
such major constituencies as alumni and employers. We mailed survey questionnaires to
approximately 600 alumni who received a BS degree within the last five years. In these
surveys we asked our alumni to identify their immediate supervisors so that we could send them
an employer survey. We have found obtaining responses to these surveys to be a challenging
process. Nearly half of the alumni surveys were returned by the post office as “addressee
unknown”. We have received nearly 100 responses to the alumni surveys. The results of the
alumni survey for the ME program are shown in Fig. 1, which plots the alumni perception of
their educational preparation against what is important in industry.

Student performance measures

The student performance measures used in the assessment include a set of quizzes and exams,
among them the Fundamentals of Engineering (FE) exam.

Prerequisite Quiz: During the first week of each semester a diagnostic prerequisite quiz is given
in each course to determine the proficiency of students in the prerequisite topics needed in the
course. One immediate benefit of the prerequisite quiz is that the course instructor can use the
results to adjust his/her lectures, using the course topics to review those prerequisite topics
with which students had difficulty. For example, in an introductory thermodynamics course,
integration can easily be reviewed in examples involving energy transfer by work resulting
from a system's moving boundary.
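The quiz analysis described above amounts to tallying per-topic class averages and flagging the weak ones. A minimal sketch, with invented topic names, scores, and threshold:

```python
# Illustrative sketch (topic names, scores, and threshold are assumptions):
# tally per-topic results on a prerequisite quiz to flag topics that need
# in-class review.  Each list holds one fraction-correct value per student.
quiz_results = {
    "integration":     [0.4, 0.5, 0.3, 0.6],
    "unit conversion": [0.9, 0.8, 1.0, 0.7],
    "ideal gas law":   [0.8, 0.6, 0.9, 0.7],
}

def weak_topics(results, threshold=0.6):
    """Topics whose class-average score falls below the threshold."""
    return sorted(t for t, scores in results.items()
                  if sum(scores) / len(scores) < threshold)

print(weak_topics(quiz_results))  # → ['integration']
```

A flagged topic would then be worked into examples, as with integration in the thermodynamics course above.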

[Figure 1 is a scatter plot: each point is one survey category, plotted with the alumni rating
of their educational preparation at UTSA (2.0 to 5.0, horizontal axis) against the rated
importance in industry (2.0 to 5.0, vertical axis). The categories surveyed are: Mathematics
(to calculus); Mathematics (beyond calculus); Basic Sciences; Social Sciences; Humanities;
Economics Applications; Engineering Sciences to your Discipline; Motion and Mechanical
Systems; Materials; Thermal and Fluid Systems; Design of Mechanical Systems; Design of
Thermal Systems; Use of Fundamentals to Analyze Problems; Solving Engineering Problems;
Critical and Logical Thinking; Modern Techniques to Analyze Data; Design of Components and
Systems; Function on Multi-disciplinary Teams; Professional and Ethical Responsibilities;
Interpersonal Skills; Written, Graphical, and Oral Communication; Recognizing the Need for
Life-long Learning; Contemporary Engineering Issues; Modern Tools for Good Engineering
Practice; Computer Use in Analysis and Design; Computers in Data Collection & Analysis;
Special Software Application; Participation in Student Organizations.]

Fig. 1 ME Alumni survey results - Educational Preparation at UTSA versus Educational
Importance in Industry
Results of Fundamentals of Engineering (FE) Exam: This national exam is used as a first step in
the professional licensing of engineers and was developed to measure minimum technical
competence. It is a pass/fail exam that is taken by approximately 50,000 people who are either
seniors within one year of graduation or recent college graduates.

In the last few years engineering students have been given the choice of taking either a general
FE exam or a discipline-specific exam. The 8-hour exam includes a morning and an afternoon
period. The morning session includes questions in math, science, and basic engineering topics.
The afternoon session questions cover specific engineering topics.

The FE exam results also provide data on student performance in specific topic areas within the
engineering curriculum. The results show that in many of the topic areas our students perform
as well as those from other state and national institutions. However, our students have not
performed as well in a few of the FE exam topics (Statics, Mechanical Design). We are
making adjustments in our curriculum to improve instruction in these weak areas.

The FE exam passing rates reported by the National Council of Examiners for Engineering and
Surveying (NCEES) are summarized in Fig. 2 for the ME program. Since the number of students
taking the exam can be very low in a given semester, we show the running average of the
passing rate for the period between 1990 and 2002. The passing rates for the ME department
have ranged from 0% (when only one student took the exam) to 100%. Figure 2 shows that the
current cumulative passing rate for ME students is 83.4%, which is above the college rate and
comparable to the state and national rates.
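The running average plotted in Fig. 2 is simply the cumulative passes divided by the cumulative takers up to each exam period, which smooths the large swings caused by small per-term cohorts. A minimal sketch, with invented per-term counts:

```python
# Sketch of the "running average" in Fig. 2 (per-term numbers are invented):
# cumulative passing rate = total passes to date / total takers to date.
terms = [("Spring 1990", 1, 0),   # (term, takers, passes)
         ("Fall 1990",   3, 3),
         ("Spring 1991", 4, 3)]

def cumulative_rates(terms):
    """Return (term, cumulative passing rate in %) for each exam period."""
    takers = passes = 0
    rates = []
    for term, took, passed in terms:
        takers += took
        passes += passed
        rates.append((term, round(100 * passes / takers, 1)))
    return rates

print(cumulative_rates(terms)[-1])  # → ('Spring 1991', 75.0)
```

Note how the 0% term (one student, no pass) barely moves the cumulative figure once later cohorts are folded in.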

At UTSA we encourage all engineering students to take the FE exam. Before the offering of the
discipline-specific exams by NCEES, the engineering faculty at UTSA provided review sessions
outside the regular class periods. However, it became more difficult and impractical to offer
three different discipline-specific review sessions. As a result, the passing rate has dropped
slightly since Fall semester, 1997. We are again exploring ideas to find an efficient way to begin
offering FE exam review sessions for our students. In the last two semesters the student chapter
of ASCE has organized FE exam review sessions, which has resulted in an improvement in the
CE student passing rate.

Course assessment instruments

As discussed earlier, the course portfolio and course notebook for each course are assessed by a
set of faculty assigned to the course to ensure that students are adequately progressing toward
meeting the program outcomes. The course notebook includes samples of student work. The
course portfolio contains a copy of the course syllabus, the history of grade distributions,
results and analysis of prerequisite quizzes, results and analysis of the student course survey,
and the course evaluation forms. Each of these items is described below.
[Figure 2 plots the cumulative (running average) FE exam passing rate for ME students, in
percent (70 to 100 on the vertical axis), for each exam period from Spring 1990 through
Spring 2003 on the horizontal axis.]

Fig. 2. FE exam running average passing rate for the ME students.

Course Syllabus: The course syllabus is important for several reasons. First, it is the initial
contact with the student. If the syllabus clearly defines the objectives of the course and how
they tie to the departmental objectives, it sends a clear message to the student that time has
been spent planning the course. It also conveys that the information being disseminated and
tested through the course is a fundamental building block of the entire engineering education.
At a minimum, the syllabus includes the following information:
• Instructor information (name, office, office hours, etc.)
• Course coordinator
• Catalog course description
• Textbook information
• List of course prerequisites
• List of prerequisites by topic
• Course learning objectives
• Relationship between individual course learning objectives and ME department
objectives and outcomes, as summarized in Table 2
• Topics covered in the course
• Evaluation methods (homework assignments, quizzes, exams, projects, reports, etc.)
• Performance criteria (evaluation methods used to assess student performance for each of
the course learning objectives)
• Course content (Engineering Science, Design, etc.)
It is desirable that the course syllabus be maintained as consistently as possible over multiple
terms. This allows for a useful evaluation of the teaching methods and instruction tools. To
ensure this, the syllabus is prepared by the course coordinator and reviewed by at least two other
faculty members who may teach the course or have expertise in that area.

Grade Distribution: The next item included in the portfolio is the grade distribution. This is the
first data item that conveys to the reviewers how successful the course was in reaching the
defined outcomes. It is important to emphasize that what is being audited is the course, not
specifically the instructor. A change in the grade distribution is one indication of whether
changes made in the instruction of the class were successful.

Course Survey and Analysis: A critical item used in the analysis of the course is the student
course survey. Details about the survey itself were provided in an earlier section. What is key in
terms of the portfolio audit is whether the survey data were analyzed and acted upon. The
emphasis here is that this is not just a data collection and tabulation effort. The continuous
improvement program requires the coordinator to take the extra steps of analyzing the data and
formulating an action to address any item that receives an average score of less than 3 on a
5-point scale. Therefore, the course portfolio contains a summary of the survey questions and
scores as well as a Course Survey Review and Action Form. The form is in memo format and
identifies the following:
• items receiving a low score
• a comment by the instructor from that term as to the most probable reason the low score
was received
• a proposed action that will potentially increase the score in the next term
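The trigger for the Review and Action Form is a simple threshold rule, and it can be sketched in a few lines. The item texts below are illustrative, not actual survey questions from the paper:

```python
# Sketch of the review-and-action rule described above: any survey item
# whose average score falls below 3 on the 5-point scale triggers an entry
# on the Course Survey Review and Action Form.  Item texts are invented.
survey_averages = {
    "Course objectives were clearly stated": 4.3,
    "Prerequisite material was adequate":    2.7,
    "Homework reinforced lecture topics":    3.9,
}

ACTION_THRESHOLD = 3.0  # items strictly below this need an action

def items_needing_action(averages, threshold=ACTION_THRESHOLD):
    """Return the survey items whose average score requires an action."""
    return [item for item, avg in sorted(averages.items()) if avg < threshold]

print(items_needing_action(survey_averages))
# → ['Prerequisite material was adequate']
```

Each flagged item would then get an instructor comment and a proposed action on the memo-format form.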

Prerequisite Quiz and Analysis: Also included in the portfolio is an analysis of whether the
students were adequately prepared for the course through the prerequisites. This is done by
administering a quiz within the first 10 days of class. More details regarding the testing process
are provided below. Within the context of the portfolio audit, here too it is important that the
data not just be collected but analyzed and acted upon. As in the case of the course surveys,
specific actions must be proposed that can be taken during the current term. For instance, if the
quiz reveals a problem with integration skills, a possible action would be demonstrating
integration within the context of solving example problems. The proposed action should be
easily identifiable.

Critical Evaluation Documents: The final component of the portfolio is any document that
clearly shows how evaluation within the class was conducted. Examples are forms used to
evaluate design reports and presentations. Table 5 shows an example of an oral presentation
evaluation form. A similar form is designed for the evaluation of technical reports.

Faculty Course Assessment: The course notebook, the information in the course portfolio, and
the audit of these materials best represent the continuous process improvement program (CPIP)
put in place at UTSA. Each course portfolio is a folder containing the items discussed here. It is
stored at a central location that course coordinators and reviewers can access. All information
is updated at the end of each semester in which the course has been taught. A team of faculty
members assigned to the course reviews the portfolio. The review is conducted using the course
assessment form shown in Table 6. After reviewing the contents of the course notebook
(samples of student work) and the course portfolio (materials such as surveys and prerequisite
quiz results and analysis), each reviewer assigns a score for each item on the form. The
contribution of the course to the student outcomes is assessed on the second page of the form.

Table 5. Presentation evaluation form for a conceptual design course (ME 4811)

Project Title:
Evaluator:

Rating scale: 1 - Not Acceptable; 2 - Marginally Acceptable; 3 - Meets Expectations;
4 - Exceeds Expectations. A score is recorded for each attribute.

Presentation Mechanics
• Organization and Length: (1) totally disjointed, no organization, far too long or too short;
(2) confusing, some items out of order, somewhat too long or too short; (3) good organization,
appropriate length; (4) superior organization and communication, appropriate length
• Effective Design and Use of Visual Aids (appropriate font size, legible graphics, slides not
overly wordy, sufficient time allowed per slide, etc.): (1) numerous visual aid deficiencies,
waste of space (i.e., inclusion of clip art, etc.); (2) some visual aid deficiencies; (3) good
design and use of visual aids, effective inclusion of analysis output; (4) superior design and
use of visual aids

Presentation Content
• Introduction (brief description of the design problem; why is it being done? who are the
client and users? what are the objectives?): (1) problem, client/user, and objectives not
stated; (2) problem, client/user, or objectives poorly stated; (3) problem, client/user, and
objectives clearly stated
• Management of the Project (description of the project tasks, schedule, and spending):
(1) poorly presented; WBS, Gantt chart, and spend plan not presented; (2) marginally
presented; (3) effective presentation of WBS, Gantt chart, and spend plan; (4) superior
presentation of project information and evidence of effective management
• Management of the Product (description of form and function; description of engineering
analysis; clear consideration of alternative solutions; clear definition of leading concept(s)):
(1) poorly presented; weak description of form & function; no clear description of
alternatives; no clear concept(s) defined; (2) marginally presented; (3) clear presentation of
form & function; presentation of multiple alternatives; leading concept(s) described;
(4) superior description
• Engineering Analysis (describe/present analyses used to determine the leading concept):
(1) missing, incorrect, inappropriate, or not presented in an understandable manner;
(2) explained but lacking in details, not supported by figures, graphs, or equations;
(3) well-explained, partially supported by figures, graphs, and equations; (4) superiorly
presented

Response to Questions
• Direct responses that are fairly complete and non-evasive, reflecting understanding:
(1) non-responsive or evasive; (2) not complete or inaccurate; (3) clear and direct, complete;
(4) exceptionally clear, complete

Individual Presentation Mechanics (speaking volume, enunciation, speed, hesitations, eye
contact, mannerisms, appearance, poise, body language): rated on the same 1-4 scale
Table 6. Course evaluation form

Course:
Reviewer’s name:

Rating scale: 0 - N/A; 1 - Not Acceptable; 2 - Marginally Acceptable; 3 - Meets Expectations;
4 - Exceeds Expectations. A score is recorded for each attribute.

Course Notebooks
• Topic coverage: (1) not clear what topics were covered; (2) most topics listed on the syllabus
were covered; (3) all topics listed on the syllabus were covered; (4) all topics were covered
and the student’s knowledge was evaluated
• Student notebooks (homework, quizzes, exams): (1) none collected; (2) very few examples of
graded work, very limited feedback; (3) examples of graded student work present, feedback
provided; (4) examples of graded materials, feedback, and special assignments such as use of
computers, ethics project, etc.
• Written reports for lab or design projects: (1) none included; (2) included, limited feedback
to students; (3) included, shows feedback to students; (4) evidence of use of a multiple-
assignment, feedback process
• Oral presentation: (1) none included; (2) included, limited feedback to students; (3) included,
shows feedback to students; (4) evidence of use of a multiple-assignment, feedback process

Portfolios
• Course syllabus: (1) none present; (2) present; (3) present, but not signed or in ABET format;
(4) present, in ABET format and signed
• Prerequisite quiz: (1) none given; (2) given but no analysis; (3) given, analyzed, and action
formed; (4) given, analyzed, action formed, evidence of action
• Course learning objective survey: (1) none given; (2) given but no analysis; (3) given,
analyzed, and action formed; (4) given, analyzed, action formed, evidence of action
• Grade distribution for the course: (1) none given; (2) given; (3) given and compared to
previous terms; (4) given, analyzed, action formed, evidence of action
• Unique evaluation documents for design class or labs (design report grade sheets, etc.):
(1) none present; (2) present but no evidence of use; (3) present, evidence of use; (4) present,
used, and action formed based on results

Comments / Recommendations:

Table 6. Course evaluation form (cont.)

Each outcome is marked Primary or Secondary for the course and scored on the same scale.
Unless noted otherwise, the rating levels for each outcome are: (1) no evidence this outcome
was addressed or assessed; (2) this outcome was addressed but not assessed; (3) this outcome
was addressed and assessed; (4) this outcome was addressed and assessed over multiple terms.

Contribution to Program Outcomes
• Develop the ability to use the principles from chemistry, physics, statistics, and mathematics
in engineering applications (A-1)
• Develop the ability to use computer-based tools for engineering applications (A-2):
(2) assignments requiring use of the computer given; (3) assignments requiring use of the
computer given and feedback provided; (4) assignments given requiring a more in-depth
understanding of computer tools
• Develop the ability to identify, formulate, and solve engineering problems (A-3)
• Develop the ability to formulate design problem objectives and constraints, and synthesize
problem information (B-1)
• Develop the ability to be creative and innovative in developing designs that achieve desired
performance criteria within specified objectives and constraints (B-2)
• Develop the ability to communicate effectively through written, oral, and graphical
presentations (B-3): (2) evidence of written and oral reports assigned; (3) evidence of written
and oral reports assigned, graded, and feedback given; (4) evidence of a series of written and
oral reports assigned, graded, and feedback given
• Develop the ability to design and conduct experiments to analyze and interpret experimental
data (C-1)
• Develop the ability to use modern engineering tools, software, and laboratory
instrumentation (C-2)
• Develop the ability to communicate effectively through written, oral, and graphical
presentations (C-3): same rating levels as B-3
• Gain an ability to work in teams to solve multi-faceted problems (D-1): (2) teams formed and
projects assigned; (3) teams formed, projects assigned, performance of team and projects
assessed; (4) in-depth study of team dynamics and team problem solving
• Gain an ability to understand and contribute to the challenges of a rapidly changing society
(D-2)
• Gain an understanding of the ethical and societal responsibilities of professional engineers
(D-3)
• Gain an understanding of the need for lifelong learning and continuing professional
education (D-4)

Summary of outcome assessment results and overall action items

The overall assessment results are positive. Several types of student surveys were used to
measure the program outcomes. Respondents were asked to rate their satisfaction with various
aspects of their educational experience at UTSA. They indicated high levels of satisfaction
with the faculty in several areas: access, enthusiasm for teaching, and quality of instruction.
The majority of the ME students surveyed were satisfied with the quality of skills, knowledge,
and understanding they had acquired in the College of Engineering. The undergraduate course
surveys provided useful information for individual courses and a correlation between the topic
coverage in these courses and the program learning objectives.

The faculty assessments of course prerequisites have indicated student deficiencies in
knowledge of topics in Statics. In Fall 2000, the ME and CE programs had decided to combine
Statics and Dynamics into a single course in order to reduce the required semester credit hours
for a B.S. degree. To correct the deficiency, faculty members in both departments have decided
to revise the curriculum for the 2004-06 catalog by again requiring two separate courses in
Statics and Dynamics.

Other assessment instruments, such as the passing rate for the FE exam, showed positive results
for the ME program. However, these results also indicated that our students have not been
performing well in the area of mechanical design in the last few years. In Fall 2000, Machine
Element Design was changed from a required course to a technical elective in the curriculum.
During the revision of the 2004-06 catalog, the ME faculty decided to make Machine Element
Design a required course again.

Student surveys also identified other problems. For example, course scheduling is a difficulty
for those students who work off campus, even though we currently offer all required courses
during the Fall and Spring semesters. The dilemma is that these students want the courses to be
available at times that do not conflict with their work schedules. One remedy is to offer
multiple sections of each course; however, this requires additional resources. To improve the
situation, a two-year course rotation schedule showing the days and times of course offerings
has been developed and made available to students, allowing them to plan ahead for work and
class schedules. The exit survey has also indicated that students have had difficulty getting
access to the engineering computer facilities. Currently, the computer facilities have dual
usage: two of the three computer facilities in the College of Engineering are used for
instruction, and students have access to them only when they are not being used for
instruction. This space problem should be resolved when the new Biosciences and
Bioengineering building is completed in 2005.

Review and assessment of the course materials have provided useful information to correct
weaknesses in the program and make program improvements. For example, when the
prerequisite test is given during the first week of the semester and the results are tabulated and
analyzed, the instructor can make adjustments in his/her lectures by reviewing the prerequisite
topics in which most students have shown weakness. The review is done when the prerequisite
topics are needed in the coverage of new material. For example, if the prerequisite test given in
a thermodynamics course indicates that students have difficulty with the definition of slope
and with integration, students get an opportunity to review integration when topics

involving work (expansion) or transient problems involving the application of conservation of
mass and energy are covered in the course. Students get additional practice with integration
when solving homework problems in these areas. The course assessment process has also
created a dialog among the faculty about adjusting prerequisite coverage in various courses.
This has resulted in a much more integrated curriculum.

References
1. URL: http://www.abet.org/images/Criteria/E1%2003-04%20EAC%20Criteria%2011-15-02.pdf
2. URL: http://engineering.utsa.edu/ce/program_objectives.html
3. URL: http://engineering.utsa.edu/ee/program_objectives.html
4. URL: http://engineering.utsa.edu/me/program_objectives.html
5. Karimi, A., Bench, S., and Hodges, S., “Improving Engineering Student Retention in an Urban University,”
Presented at the 2001 Annual Meeting of the Gulf-Southwest Section of ASEE, March 28-30, 2001, College
Station, Texas.
6. Karimi, A., “Implementing a New Mechanical Engineering Curriculum to Improve Student Retention,” ASEE
2001-1566, Proceedings of the 2001 ASEE Annual Conference, June 24-27, Albuquerque, New Mexico.
7. URL: www.idea.ksu.edu
8. Karimi, Amir, “Using Learning Objective Assessment Tools to Enhance Undergraduate Engineering
Education,” 2003 Annual Meeting of the Gulf-Southwest Section of ASEE, Arlington, Texas, March 19-21,
2003.

Biography
AMIR KARIMI
Amir Karimi is a Professor of Mechanical Engineering and an Associate Dean of Engineering at The University of
Texas at San Antonio (UTSA). He received his Ph.D. degree in Mechanical Engineering from the University of
Kentucky in 1982. His teaching and research interests are in thermal sciences. He has served as the chair of
mechanical engineering twice: first between 1987 and 1992 and again from September 1998 to January 2003.
Dr. Karimi has served on curriculum committees at all university levels and has been a member of the
University Core Curriculum Committee (1993-95 and 1999-2003). He is the ASEE Campus Representative at
UTSA and the current ASEE-GSW Section Campus Representative. He chaired the ASEE-GSW Section during
the 1996-97 academic year.

J. KEITH CLUTTER
Keith Clutter is an Assistant Professor of Mechanical Engineering at The University of Texas at San Antonio
(UTSA). He received his Ph.D. degree in Aerospace Engineering from the University of Florida in 1997. His
teaching and research interests are in thermal sciences and computational modeling. He is a member of ASEE.

ALBERTO ARROYO
Alberto Arroyo is a Professor and the Chair of the Civil and Environmental Engineering Department at The
University of Texas at San Antonio (UTSA). He received his Ph.D. degree in Civil Engineering from New Mexico
State University in 1982. His teaching and research interests are in structural engineering. Arroyo is a member of
ASEE. He was awarded the UTSA Presidential Teaching Award in 1994. In 2000 he received the ASEE-GSW
Section Outstanding Teaching Award.