Volume 6, Issue 2

Spring 2015

Inside IR
INSIDE THIS ISSUE:
MCC's Official Definition of STEM
New Learning Centers Referral Link
Enrollment Models
Student Achievement Measure
Common Data Set
Expectations & Experiences Surveys

MCC's Official Definition of STEM

The acronym STEM refers to Science, Technology, Engineering, and Math. The federal government has stated that "a world-class STEM workforce is essential to virtually every goal we have as a nation – whether it's broadly shared economic prosperity, international competitiveness, a strong national defense, a clean energy future, and longer, healthier lives for all Americans. If we want the future to be made in America, we need to redouble our efforts to strengthen and expand our STEM workforce."

In 2012, we created a precise definition of STEM at MCC in order to measure the enrollment and graduation rates of STEM students. Using definitions from the National Science Foundation, the Department of Homeland Security, and MCC's history of studies and grants that defined academic programs ad hoc, the IR Office drafted a definition of STEM. After a few iterations, the definition was approved by the College. It is now used in measuring Direction 2, Goal 3 of the Strategic Plan.

As shown in the Glossary on the left-hand panel of the IR home page, "MCC categorizes academic programs as being composed of two categories: 'Applied STEM' and 'Transfer STEM.'" Applied STEM majors are those that lead to A.A.S. degrees or certificates in technologies or engineering technologies programs. Transfer STEM majors are those that lead to an A.S. degree. The majors that are categorized as STEM at MCC are shown at:
http://www.monroecc.edu/depts/research/documents/STEMDefinitionUpdate-4August292013.pdf

It is important that all faculty and staff use the official, approved MCC definition of STEM. Please use the definition and majors noted above when considering STEM-related studies, grants, data, etc.

New Learning Centers Referral Link


The Learning Center Referral Form link has moved. It is now located in the Faculty Survey Menu. To access the referral form, go to Banner Self-Service, select "Faculty Services," then select "Faculty Survey Menu."

MCC's Learning Centers give students access to computers, printers, A/V equipment, and tutoring. Faculty members are required to refer students to the Learning Centers at the beginning of each semester, but can also do so closer to final exams. These referrals are critical for funding that MCC receives from the State.

If you have any questions, please contact Amy Wright in the IR Office.

Enrollment Models: Where’s the Crystal Ball?


In a previous issue of Inside IR, we noted that a new enrollment model was being added to IR's enrollment projection tool box: the relationship between Monroe County's December unemployment rate and the projected size of County high school graduating classes. We also mentioned that this was one of five models used for enrollment planning. In the current issue of Inside IR, we describe the other four models.

Figure 2 (Revenue Sources) shows how knowledge of annual FTEs informs us of three-quarters of our revenues.

1. Age Yield Model. In this model, we analyze the trends of students age 17 through 65 and produce a yield model by age group based on MCC's fall census. Age group projections from the U.S. Census Bureau and the Cornell Program on Applied Demographics are used, and a moving average of the age yields is applied. This projects MCC's census headcounts. We then convert the headcounts to annual FTEs.

2. Student Type Model. In this model, we apply trend analyses to MCC's categories of students: Continuing, Returning, First-Time, Transfer, and High School. First-Time students are partitioned into two sub-groups (i.e., recent and non-recent high school graduates); projections of recent high school graduates are then utilized and an average yield applied. This model also projects MCC's fall census headcounts, but uses a moving average of the credit-hours-to-headcount ratio to convert to annual FTEs.

3. Double Exponential Smoothing Time Series Model. In this model, we apply time series analysis to annual FTEs. The model used is double exponential smoothing, and it produces a mean average percentage error of 3%.

4. Simulation via Bootstrapping. In this model, we create a set of values from the above models and run 100,000 simulations. This produces a sampling distribution of average annual FTEs and allows us to produce probability estimates of ranges for the average annual FTEs.

The five models produce five different projections for the next five years. The projections are then reviewed by the Enrollment Committee, which meets twice per year. First, it meets in December for initial estimates and budget testing. Then it meets again in late March or early April to finalize new information from the models or other information/considerations that exist outside them. Figure 1 shows the initial projections IR made in December 2014. Brief, illustrative sketches of the yield, smoothing, and bootstrapping calculations described above follow the Figure 1 caption.

Figure 1. Five-Year Enrollment Models: First Iteration
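The first two models both turn population or headcount trends into annual FTEs. The sketch below is a minimal, hypothetical Python illustration of that idea for a single age group: it assumes the yield is simply fall headcount divided by county population, applies a moving average of recent yields to a projected population, and converts headcount to FTEs with an assumed credit load. All numbers and the 30-credit FTE conversion are placeholders, not MCC figures or the IR Office's exact procedure.

```python
# Illustrative age-yield projection for one age group (all values are made up).
county_population_18_19 = [18200, 18050, 17900, 17750]   # recent county estimates
mcc_fall_headcount_18_19 = [3650, 3580, 3540, 3480]      # MCC fall census headcounts

# Yield = share of the county age group enrolling at MCC in the fall.
yields = [h / p for h, p in zip(mcc_fall_headcount_18_19, county_population_18_19)]

# Apply a 3-year moving average of the yield to a projected population figure.
moving_avg_yield = sum(yields[-3:]) / 3
projected_population_18_19 = 17500                        # e.g., a Census Bureau / Cornell projection
projected_headcount = moving_avg_yield * projected_population_18_19

# Convert headcount to annual FTEs using an assumed average annual credit load.
assumed_annual_credits_per_student = 22                   # placeholder average
projected_annual_fte = projected_headcount * assumed_annual_credits_per_student / 30

print(f"Projected fall headcount (ages 18-19): {projected_headcount:.0f}")
print(f"Projected annual FTEs (ages 18-19):    {projected_annual_fte:.0f}")
```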

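The third model names a standard technique, double exponential smoothing (Holt's linear trend method). Here is a minimal sketch of that technique in Python; the FTE history, the smoothing parameters alpha and beta, and the five-year horizon are illustrative assumptions, not MCC data or the exact specification IR uses.

```python
# Double exponential smoothing (Holt's linear trend method), illustrative only.
def double_exponential_smoothing(series, alpha, beta, horizon):
    """Return one-step-ahead fits for the history and a forecast for `horizon` future years."""
    level, trend = series[0], series[1] - series[0]
    one_step_fits = []
    for value in series[1:]:
        one_step_fits.append(level + trend)              # forecast made before observing `value`
        previous_level = level
        level = alpha * value + (1 - alpha) * (level + trend)
        trend = beta * (level - previous_level) + (1 - beta) * trend
    forecast = [level + (step + 1) * trend for step in range(horizon)]
    return one_step_fits, forecast

annual_ftes = [12400, 12550, 12300, 12050, 11800, 11600]  # placeholder annual FTEs
fits, projection = double_exponential_smoothing(annual_ftes, alpha=0.5, beta=0.3, horizon=5)

# Mean average percentage error of the one-step-ahead fits against the actuals.
mape = sum(abs(f - a) / a for f, a in zip(fits, annual_ftes[1:])) / len(fits)
print("Five-year FTE projection:", [round(x) for x in projection])
print(f"In-sample mean average percentage error: {mape:.1%}")
```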
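The fourth model resamples the model outputs. The sketch below is a simplified, hypothetical version of that bootstrapping step: it assumes the five projections are simply resampled with replacement 100,000 times to build a sampling distribution of the average annual FTE, then reads off a probability range. The values and the resampling scheme are our own illustration, not necessarily the IR Office's exact procedure.

```python
import random

# Illustrative average annual FTE projections from five models (placeholder values).
model_projections = [11250, 11400, 11100, 11350, 11200]

random.seed(2015)
simulated_means = []
for _ in range(100_000):
    # Resample the five model values with replacement and average them.
    resample = random.choices(model_projections, k=len(model_projections))
    simulated_means.append(sum(resample) / len(resample))

simulated_means.sort()
lower = simulated_means[int(0.025 * len(simulated_means))]
upper = simulated_means[int(0.975 * len(simulated_means))]
print(f"95% of simulated average annual FTEs fall between {lower:.0f} and {upper:.0f}")
```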

Student Achievement Measure (“SAM”)


In the fall issue of Inside IR, we described the Voluntary Framework of Accountability ("VFA") project MCC participates in. Recently, MCC took part in another project, the Student Achievement Measure ("SAM"), which is based in part on the VFA.

SAM is a summary of student outcomes, but is different from the Integrated Postsecondary Education Data System ("IPEDS") reporting required by the federal government. This is due to the difference in the methodologies each mandates, shown in Table 1.

As shown in Table 2, the SAM graduation rate for the fall 2007 first-time, full-time MCC cohort was 44%, in contrast to the 23% graduation rate we reported for IPEDS. The difference in rates was due to the fact that the SAM cohort was composed not just of first-time, full-time students who started in fall 2007, but was restricted to those who had earned 12 or more credits within their first two years at MCC. In addition, the SAM rate was based on students' status six years after first enrollment, while the IPEDS rate was based on three. (A brief illustrative calculation of the two rates appears after Table 1.)

Table 2. Graduation Rate of MCC's Fall 2007 Cohort
Using SAM Methodology: 44% within 6 years (N=3,256)
Using IPEDS Methodology: 23% within 3 years (N=3,744)

You can look up all of the institutions that participate in SAM at http://www.studentachievementmeasure.org/participants/. Currently, 10 of the 30 SUNY community colleges participate.

Table 1. SAM and IPEDS Methodology Comparison
Student Cohort: SAM counts fall, first-time & transfer students seeking a degree or certificate; IPEDS counts fall, first-time students seeking a degree or certificate.
Student Qualifications: SAM requires a HS diploma or equivalent and 12 or more credits earned at the college by the end of one's second year; IPEDS requires admission into an academic program at the college.
Analyses re: Credit Load: SAM disaggregates data on full-time & part-time students; IPEDS uses only full-time students' data.
Outcome Period: SAM uses 6 years; IPEDS uses 3 years.
Outcome Measures: SAM counts graduation from, enrollment at, or transfer out of the institution; IPEDS counts graduation from or transfer out of the institution.
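To make the methodological difference concrete, here is a minimal, hypothetical sketch in Python of how the two rates could be computed from a student-level file. The field names and the toy records are assumptions for illustration only, and the SAM side is simplified to the credit-hour restriction and six-year window described above; it omits SAM's transfer students and its non-graduation outcomes.

```python
# Toy student records; fields and values are hypothetical, not MCC data.
students = [
    {"first_time": True, "full_time": True,  "credits_by_year_2": 24, "grad_3yr": False, "grad_6yr": True},
    {"first_time": True, "full_time": True,  "credits_by_year_2": 6,  "grad_3yr": False, "grad_6yr": False},
    {"first_time": True, "full_time": True,  "credits_by_year_2": 30, "grad_3yr": True,  "grad_6yr": True},
    {"first_time": True, "full_time": False, "credits_by_year_2": 15, "grad_3yr": False, "grad_6yr": True},
]

# IPEDS-style rate: first-time, full-time cohort; graduation within 3 years.
ipeds_cohort = [s for s in students if s["first_time"] and s["full_time"]]
ipeds_rate = sum(s["grad_3yr"] for s in ipeds_cohort) / len(ipeds_cohort)

# SAM-style rate: cohort restricted to students with 12+ credits by the end of
# year two; outcome measured within 6 years (simplified, see note above).
sam_cohort = [s for s in students if s["credits_by_year_2"] >= 12]
sam_rate = sum(s["grad_6yr"] for s in sam_cohort) / len(sam_cohort)

print(f"IPEDS-style rate: {ipeds_rate:.0%} (N={len(ipeds_cohort)})")
print(f"SAM-style rate:   {sam_rate:.0%} (N={len(sam_cohort)})")
```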

The Common Data Set


Each year, the IR Office completes a spreadsheet template called the Common Data Set ("CDS"). It was developed by the Common Data Set Initiative, a "collaborative effort among data providers in the higher education community and publishers as represented by the College Board, Peterson's Guide, and U.S. News & World Report."

The reporting of institutional data is standardized through the Common Data Set. Institutions must report data on enrollment, retention, graduation, tuition costs, institutional policies, and related facts such as the number of campuses a college has and whether it provides residence halls.

By completing the CDS template, IR is able to organize MCC's data in a uniform way, then share it with publications such as Peterson's Guide, the College Board, and Wintergreen Orchard House. These publications then disseminate the data to prospective students and other colleges.

From time to time, IR may ask your office for information to complete the CDS. Please take a few minutes to do so in order to help us provide accurate, complete and current information to these publications.

If you would like a copy of the Common Data Set, email Andrew Welsh in the IR Office.

Expectations and Experiences Surveys


Every four years, during the summer, IR administers the Student Expectations Survey to students who have been accepted to MCC for the fall semester. Then, in January, we administer the Student Experiences Survey to the students who completed the first survey.

The purpose of the first survey is to find out what students expect MCC to be like, while the purpose of the second is to find out what they actually experienced. The overarching goal of the project is to identify, then address, differences in expectations and experiences to increase student success and retention.

This year, a total of 269 students completed both the Expectations and Experiences Surveys. The following are some of the key findings.

On the plus side, more respondents experienced the following than had expected to:
• that the student body was diverse
• that their instructors reviewed most reading assignments in class

Unfortunately, fewer respondents experienced feeling stressed about doing well in class than had expected to.

Many students' expectations didn't differ from their actual experiences. For example, more than three-fourths of the respondents both expected and experienced being pleased with their decision to go to college, and more than two-thirds expected and experienced classes that inspired them to think in new ways.

Additional analyses revealed that certain experiences actually predicted respondents' being pleased with their decision to go to college. These included having the following experiences:
• looking forward to going to class
• making new friends
• perceiving that the student body was diverse

There were also differences in survey responses based on personal characteristics. For example, a higher percentage of respondents with previous college experience (i.e., AP or Dual Credit courses in high school and/or attendance at another college before MCC) than those without it experienced feeling that the information they learned in their classes was relevant to their everyday life.

The full survey report will be posted on the IR web pages within the coming weeks.

Student Consumer Information

The Higher Education Act of 1965 requires colleges to provide prospective and current students, parents, guidance counselors, coaches, and the public access to certain information that they are entitled to as consumers. MCC's Student Consumer Information page is located at:
http://www.monroecc.edu/depts/research/consumer.htm

For more information about the Institutional Research (IR) Office, you can visit our pages on the MCC website or contact an IR staff member:

Angel E. Andreu, Director, 292-3031, aandreu@monroecc.edu
Mary Ann Matta DeMario, Assistant Director, 292-3032, mdemario1@monroecc.edu
Amy Wright, Secretary, 292-3035, awright@monroecc.edu
Andrew Welsh, Specialist, 292-3034, awelsh4@monroecc.edu
Elina Belyablya, Specialist, 292-3033, ebelyablya@monroecc.edu

Previous issues of Inside IR are available on our homepage:


http://www.monroecc.edu/depts/research/
