Any questions you might have about the use of ACE should be directed to the ACE Coordinator at 335-0357 or email@example.com.
ACE (Assessing the Classroom Environment) is an instructor/course
evaluation system based on the use of scannable answer sheets.
Although ACE utilizes scannable answer sheets, faculty are encouraged
to consider additional and/or alternative methods for collecting
student feedback. This bulletin explains the ACE system and outlines
the policies for administration and score reporting. References
are provided at the end of this bulletin and EES and Center for
Teaching staff are available to work with departments on developing
a comprehensive evaluation system.
Two formats for instructor/course evaluations are typically discussed
in the literature: formative and summative. These two formats
are distinguished by question type, the point in a course at which
feedback is sought, the format of the results, and the intended
use of the results. The evaluation options offered by the Exam
Service are based upon a distinction between these two formats.
The following is a brief discussion of formative and summative evaluations.
Formative evaluations provide feedback to assess the effectiveness
of specific instructional practices and identify areas for improvement
and development. With an emphasis on improvement, the type of
evaluation and the questions used should be diagnostic and specific
to the course format and instructor. Formative evaluation should
take place throughout a semester, allowing students to provide
feedback that impacts the instruction they are receiving. A common
complaint from students is that they are rarely asked for input
until the end of a course when they will not realize any benefit
from instructional changes.
Because formative evaluations are diagnostic and specific to an
instructor/course, the instructor controls the format of the evaluation
and the method for collecting feedback. Results from a formative
evaluation should not be considered during reviews for personnel
or administrative decisions unless they are voluntarily submitted
to a department chair, review committee, or dean by the instructor
being evaluated. Following these guidelines allows instructors
to ask probing questions which may elicit mixed responses.
There are numerous methods for collecting evaluative information
during a semester without sacrificing significant course time.
Some examples of formative evaluations are given in the next section.
The following is a limited list of activities designed to collect
diagnostic information from students. They are intended to help
instructors develop methods for receiving continual student feedback.
Formative evaluations work best when the results are shared and
students and instructors work jointly to make changes.
Use the quiz, chat room, or email options in ICON.
Form a “quality committee” of students who meet with
you on a regular basis to discuss issues related to the course.
Assign “minute papers” at the end of selected classes.
Ask students to respond briefly in writing to a specific
course related question. Minute papers should require only
a two or three sentence response. This is an excellent technique
to get a general view of how a course is progressing and
what areas may need further evaluation or emphasis. You
might ask students to summarize the main point of a particular
class activity, to comment on whether or not a text is easy
to understand or helpful, to indicate the one thing that
would help them most to understand the material presented
in class, or to cite ideas that remain unclear to them at
the end of a class.
Visit a colleague’s classroom to see how they incorporate
different teaching methods, discussion techniques, or technology.
Invite a colleague or a staff member from the Center for Teaching
to interview your class. This could be done by splitting
the class into small groups to discuss course-related issues.
Each group then presents a summary of their discussion while
the interviewer records responses and asks questions. Contact
the Center for Teaching for help in using the Small Group
Instructional Development (SGID) method to interview classes.
Use a mid-semester scannable evaluation that allows for written
comments. Take time in class to review and discuss the results with students.
Review videotaped portions of your class or of classes taught by colleagues.
Videotaping can be arranged through the Center for Teaching.
The University of Iowa Center for Teaching provides excellent resources
for instructor and/or course evaluation techniques. Center staff
members are available for consultation.
Summative evaluation occurs at the end of instruction and is used
to arrive at broad judgments of teaching effectiveness. Results
of summative evaluations relate to issues of accountability and
are often used to compare an instructor’s performance with
a peer group. A summative evaluation should consist of global
items intended to provide feedback to administrators and/or peers
for use in making administrative decisions (i.e., tenure, promotion,
merit, etc.). Global items generalize across course type and instructor
level. For example, a statement such as, “I would recommend
this course to another student” can be used across many
course formats. In a typical summative evaluation, all instructors
within a department or college use a standard set of items making
it possible to generate normative information. Sample global items
are included in the ACE item catalog.
What Research Tells Us About Course Evaluations
There has been a great deal of debate concerning factors that
affect students’ evaluation of an instructor or a course.
Research studies have produced mixed conclusions. Raoul Arreola,
in “Developing a Comprehensive Faculty Evaluation System”
(1995), offers the following summary:
“…this research points out that many commonly held beliefs
concerning student ratings are, on the whole, myths or misconceptions.
Faculty cannot “buy” good ratings by giving
easy grades. Students do not generally rate faculty lower
in classes taught early in the morning or right after lunch.
Teaching a small class does not automatically guarantee
high student ratings, nor does teaching a large class automatically
guarantee low ratings. However, freshman students do tend
to rate faculty more harshly than do sophomores, sophomores
tend to rate more harshly than juniors, and so on, with
graduate students tending to rate faculty most generously.
Also, students in required courses tend to rate their instructors
more harshly than students in elective courses. It is interesting
to note that most large, required courses tend to be offered
early in the curriculum of a college. Thus, most of the
large required courses may be offered in the freshman and
sophomore years, just the time when students tend to rate
their teachers most harshly. It would be easy to conclude
from personal experience with such courses that the problem
lies with the size of the course when, in fact, the research
indicates it is the level (freshman, sophomore, etc.) and
the fact that the course is required that are the factors
which contribute to generally lower student ratings. The
important conclusion to be drawn at this point is the necessity
to systematically incorporate these findings into the interpretation
of student ratings.”
More extensive discussion of these issues is provided in the resources
listed at the end of this bulletin.
Guidelines for Scanner-Based Evaluations
Evaluations should be collected in the classroom under class
conditions, and a minimum of three-fourths of the students should
be present. Evaluations should not be collected during the final
exam period or be a part of any test or quiz.
Administration directions, provided by either EES or the department/college,
should be read to students exactly as printed. This is particularly
important for summative evaluations where faculty will be
compared to a peer group. All students should hear the same instructions.
Students should be provided a confidential mechanism through the department
or college for reporting incidents where instructors fail
to follow administrative guidelines or a student feels the
evaluation process has not been carried out appropriately.
One set each of instructor and departmental core results will
be printed by EES. If additional copies are requested, the
information will be provided electronically, or standard copy
center fees will be charged for additional hard copies.
The Exam Service does not store results of course evaluations;
each instructor is responsible for maintaining a personal copy.
In limited circumstances, EES will rerun the results from a course
evaluation; however, a fee will be charged based on the amount
of time it takes to recover archived data. Rerun requests
are only considered for the current or preceding semester.
Using the ACE System
ACE is designed to obtain formative and/or summative information from
a single student questionnaire. The following
sections outline the general characteristics of ACE and provide
a step-by-step example of an ACE order.
A. Instructor Selected Items – Formative Evaluation
EES maintains an ACE item pool
with approximately 200 items addressing a variety of instructor/course
evaluation areas. Each item has been assigned a unique serial
number. Up to twenty items can be selected from the pool and printed
on the front of ACE answer sheets. The back of ACE answer sheets
is reserved for items which an instructor wishes to include that
are not in the ACE
item pool. There is space on the back of ACE answer sheets
for an additional twenty gridded items using either the Strongly
Agree to Strongly Disagree response scale or a scale created specifically
for each item. The back of ACE answer sheets can also be used
to ask open-ended questions that require written responses. Instructors
are responsible for providing a formatted original for duplication
on the back side of ACE answer sheets. The items should be typed
on white paper using the format requirements shown in the ACE item catalog.
The results for ACE
pool items include the number and percentage of students marking
each response option, the median, mean, and variability indices.
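The summary statistics described above can be illustrated with a short sketch. This is hypothetical code, not the actual EES processing software; a five-point Strongly Disagree (1) to Strongly Agree (5) response scale is assumed, and the choice of standard deviation as the variability index is an assumption:

```python
from collections import Counter
from statistics import mean, median, pstdev

def summarize_item(responses):
    """Summarize one item's responses (1 = Strongly Disagree ... 5 = Strongly Agree).

    Returns per-option counts and percentages plus the median, mean,
    and a variability index (here, the population standard deviation).
    """
    counts = Counter(responses)
    n = len(responses)
    return {
        "counts": {opt: counts.get(opt, 0) for opt in range(1, 6)},
        "percents": {opt: 100 * counts.get(opt, 0) / n for opt in range(1, 6)},
        "median": median(responses),
        "mean": mean(responses),
        "std_dev": pstdev(responses),
    }

# Example: ten students answering a single gridded item.
stats = summarize_item([5, 4, 4, 5, 3, 4, 2, 5, 4, 4])
print(stats["counts"], stats["median"], stats["mean"])
```

A report for a real class would simply run this over each item's column of scanned responses.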
B. Department/College Core Items – Summative Evaluation
Departments/colleges wishing to use ACE to collect evaluations
for administrative purposes (norms) will be asked to identify
a set of items that will make up a Department Core. The core should
be comprised of global items that generalize across course type
and format. The Department Core will be printed on all instructors’
ACE answer sheets preceded by a statement to students indicating
that results will be reported to a department representative for
possible use in administrative decisions. A Department Core should
be maintained for a reasonable length of time to provide a longitudinal
record of student responses. Departments/colleges may choose to
use more than one norm group in reporting results. For example,
a college might request that courses be grouped based on level
(by course number). A minimum of fifty class medians (current
or combined semesters) is recommended to provide stable normative
information. Medians for select percentile points are printed on instructor
results for each core item. In addition, the department chair
or college dean will receive a summary table showing the percentile
point median values for each core item.
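The way class medians feed a norms table can be sketched as follows. This is a hypothetical illustration under assumed conventions (linear interpolation between percentile cut points), not the actual EES computation:

```python
from statistics import quantiles

def norm_table(class_medians, points=(10, 25, 50, 75, 90)):
    """Given the class medians for one core item across a norm group,
    report the median value at selected percentile points."""
    # quantiles(..., n=100) returns the 1st..99th percentile cut points.
    pct = quantiles(sorted(class_medians), n=100)
    return {p: pct[p - 1] for p in points}

# Fifty or more class medians are recommended for stable norms;
# a short list is used here only to keep the illustration readable.
meds = [3.2, 3.8, 4.0, 4.1, 4.3, 4.4, 4.5, 4.6, 4.7, 4.9]
table = norm_table(meds)
print(table[50])  # the median of the class medians
```

An instructor's class median can then be located between these percentile points to gauge rough relative standing, subject to the interpretation cautions discussed later in this bulletin.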
Student government representatives have developed a core set of
items which instructors may choose to have printed on their ACE
forms. Results from the Student Core will be made available by
UISG to the general student population to be used in the process
of course selection. UISG has identified six items that make up
the Student Core. In addition to the six pre-selected UISG items,
instructors can select up to four items from the ACE
pool to be added to the Student Core. The Student Core will
always be printed as the last block of items on the back of ACE
answer sheets. The six pre-selected UISG items are:
The course requires an appropriate amount of work for the credit earned.
The instructor increased my interest in the course material.
Grades in this course were fair.
The syllabus was an accurate guide to course requirements.
The instructor clearly communicated class material.
Overall, this is an excellent course.
The following is an example of the information that is made available to students.
ACE forms are ordered either by an individual faculty member
for use in their own classes, or by a department or college
representative who is ordering for all faculty within an identified
group. The ordering and mailing procedures are different under
these two circumstances and are outlined in the next two sections.
Faculty – Ordering ACE Answer Sheets
When you request information about ACE, you will be mailed an
ACE packet that includes a blue envelope, an ACE Header Sheet,
a cherry colored address sheet, and a set of directions. When
you receive your packet, complete the following steps to order
ACE answer sheets:
Print your name, course number, department, and campus address
on the cherry address sheet. An EES address appears on the
opposite side of the cherry sheet, which is used to mail
forms and results back and forth between you and EES.
On the front side of the ACE Header Sheet, complete the areas
that are listed below. An explanation of these categories
is given on the instruction
sheet included with your ACE packet.
Instructor Name or Course Title
# of Classes
Course Number and Cross-Listed Course Number (if appropriate)
Optional Codes – leave blank
Form # – If you have a form you have used in the past
and do not want to make any changes, you can grid
the old form number and do not need to fill out
anything on the back side of the ACE Header Sheet.
If this is the first time you are using a set of
items, leave the Form # blank and EES will assign one.
Phone # or E-mail address
On the back side of the ACE Header Sheet, complete the following
areas if you are creating a new form or modifying a form you have used previously.
Print and grid the serial numbers of items from the ACE
item pool in the order that you would like to have
them appear on the ACE answer sheets. If making changes to a previously used form, make sure you erase the original form # on the front of the ACE Header Sheet, since a new number needs to be assigned.
Indicate whether or not you want the Student Core printed
on your answer sheets.
Place the cherry address sheet in the blue envelope with the EES
address showing, along with your completed ACE Header Sheet,
and send it through campus mail to EES.
Your ACE order will be filled and student answer sheets returned
to you along with administration directions.
Header Sheets are used twice during the course evaluation process,
once for ordering the answer sheets that students complete, and
a second time when the student answer sheets are scanned. The
instructor and course information that you grid on the ACE Header
Sheet is printed on your results.
Department or College Representative – Ordering ACE Answer Sheets
Departments and/or colleges that require a core set of items (Department Core)
to appear on all faculty answer sheets for normative purposes
must use “Batch Processing” for ACE. Batch processing
requires that a person be designated within the department to
order, distribute, collect, and return ACE answer sheets to EES
in one batch mailing. The batch mailing includes a blue ACE envelope
for each instructor within a designated comparison group who uses
the same Department Core (form number). Normative information
cannot be calculated unless EES receives all the peer group envelopes
at the same time.
If you are a new department representative or your department
has never used a Department Core before, call the ACE Coordinator
at 335-0357 to clarify procedures.
The following outlines two different formats for batch ordering of
ACE answer sheets:
Department Core – No Custom Items
In this format, all faculty use the exact same form (Department
Core) and are not allowed to add items to their ACE answer
sheets. Complete the following steps to order ACE answer sheets
for your entire department or college:
Request an ACE packet from the EES ACE Coordinator.
On the cherry address form, print your name on the ‘Mail
to (if different than instructor)’ line, the department
name, and your mailing address.
On the ACE Header Sheet, print and grid the following:
The department name in the ‘Instructor Name or Course Title’ field.
The number of copies needed for the entire department.
The number of classes that will be using the forms.
A phone number or email address.
Camera-ready copy for any items that are to be printed on
the back of the ACE answer sheets.
Place the ACE Header Sheet and address sheet (EES address
showing) in the blue envelope and return it to EES.
In some departments there may be more than one ACE
form used for designated groups; for example, one
form might be used for faculty and a different form
for TAs. An ACE Header Sheet must be submitted for
each unique form number.
EES will send the printed answer sheets, enough blue envelopes,
ACE Header Sheets, and administration directions for
the number of classes indicated in the order. Pack
and distribute an ACE envelope to each faculty member.
Once faculty receive their ACE packets they must complete the
Name, Course Number, and Form Number fields on the ACE Header
Sheet, or this should be done by the department representative
prior to distributing the packets. The individual faculty information
is printed on the results for identification purposes. No ACE
answer sheets will be scanned without a complete ACE Header Sheet.
Department Core Plus Faculty-Selected Items
In this format, faculty are required to use a Department Core
but can also include items of their own choosing on their
ACE answer sheets. This format requires that you:
Request an ACE packet for each instructor and course.
Complete the cherry address sheet, including the instructor name,
course number, name of the department representative on
the ‘Mail to (if different than instructor)’
line, department, room and building code. Many departments
use a preprinted label for the ‘Mail to’ name.
Distribute ACE packets to faculty and request that they complete the
name, course number, phone or email, form number, item serial
numbers, and the Department Core ID on the front and back
of the ACE Header Sheet.
Administering and Returning ACE Answer Sheets for Scanning – Faculty
Place the address sheet and ACE Header Sheet in the blue envelope
with the EES address showing through the window. The instructor/department
address information on the cherry address form must be complete.
Designate a student to be responsible for collecting the completed
ACE answer sheets. Instruct the student to return the
forms by inserting them in the blue envelope under the
address sheet and then doing one of the following:
If your forms include a Department Core, have the student return the
envelope to the department’s ACE representative.
If you ordered your own forms and your department does not use a
Department Core, have the student return the packet by either placing it
in campus mail, returning it to a central collection
place in your department, or dropping it off in person at EES.
Pass out one ACE answer sheet to each student and read the administration
instructions exactly as printed. Reading a preprinted script
helps standardize the evaluation process.
Leave the room while the evaluation is being completed. Under
no circumstances should student responses be viewed by
you before they are returned to EES for processing.
Answer sheets are scanned by EES, and a copy of the results and
the answer sheets are mailed to the address provided on the
ACE address sheet after the Registrar's grade submission deadline.
A separate ACE Header Sheet, address sheet and blue envelope
is required for each unique course – do not put answer
sheets with different form numbers in the same ACE envelope.
To help us save on costs, we ask that you not store your results
in the blue envelope but return the envelope to EES in campus
mail – a return address is stamped inside the envelope.
The permanent deadlines for ordering ACE forms are November
10th for the fall semester and April 10th
for the spring semester. Orders will be filled and mailed a
few days prior to the due date indicated on the ACE Header Sheet.
If you submit an order after the deadline, you will be provided
with preprinted standard forms. Standard forms contain general
items and are available with or without
the student core.
The ACE Header Sheet includes an area for instructors to indicate
whether an evaluation is formative (before end of course) or
summative (end of course). Formative evaluations will be processed
on receipt and the results returned within a week to the instructor.
Results for summative evaluations will not be returned until
the end of each semester after grades have been submitted to
the Office of the Registrar.
It is fairly easy to summarize student responses to course evaluations
but very difficult to interpret what those responses mean. The
value of student ratings increases when attention is focused
on response patterns both within a class and over time rather
than on a numeric index of teaching ability. Careful consideration
should be given to response frequencies, particularly if a third
or more of the students in a class are disagreeing with an item.
Normative information is intended only to give instructors a
rough guideline of their relative standing within an appropriate
norm group. It is important that the limitations of norms be
kept in mind when score interpretations are made. With norms,
if every instructor within a department received superior ratings,
fifty percent would still appear to be average or below. Likewise,
if every instructor received poor ratings, the norms table would
show fifty percent being above average. Norms provide guidelines, not absolute measures of teaching effectiveness.
The following pages show an example of an ACE order from start
to finish. Each page has been labeled and a short explanation
is given below.
Sample 1 is a two-sided address label. One side is preprinted with
the Exam Service address and the other side provides space
for an instructor’s name, course number, and address.
This label will be used in conjunction with the special
blue ACE envelopes to send orders, completed forms, and
results back and forth between EES and you.
Sample 2 contains step-by-step instructions on how to complete
the ACE Header Sheet for ordering ACE answer sheets. The
ACE Header Sheet will be returned with your printed ACE
answer sheets and must be submitted again when answer sheets
are returned to EES for scanning.
If you are not responsible for ordering your own answer sheets,
you will still need to complete parts of an ACE Header Sheet
to be submitted with the completed forms for scanning. Sample 3
explains those parts of the ACE Header Sheet that need
to be completed for scanning services only. ACE Header Sheets
should be available from the person in your department responsible
for ordering answer sheets.
Sample 4 is an example of a completed ACE Header Sheet. In this example,
the instructor has requested ACE answer sheets that include
a departmental core, instructor-selected items from the
ACE item pool, custom items, and the student core.
Sample 5 is an example of an original for duplication of custom items
on the back of ACE answer sheets. An original for custom
items must be submitted on 8.5” x 11” white
paper with the following margins: a left margin of ½”;
a top margin of ½”; and a maximum text width
of 5½” (making the right margin 2½”).
The bottom margin will vary depending on whether or not
the student core option has been selected. With student
core items, the bottom margin should be 3½”
and without, the bottom margin should be 1½”.
Sample 6 is an example of the ACE answer sheet that would result
from the ACE Header Sheet and duplicating original shown
in Samples 4 and 5.
Sample 7 provides directions for administering ACE, including a standard
statement to be read to students. Some departments or colleges
may have custom scripts for administering course evaluations.
An explanation sheet describing the results from an ACE evaluation is
available online as a reference.
An example of the results from an ACE evaluation is also included.
This is an example of a norms table that would be sent to
a department chair or college dean. This table gives the
median at select percentile points from the distribution
of course medians for the designated norm group.
Standard Form X001
This is an example of standard form X001, without student core questions.
Standard Form X002
This is an example of standard form X002, with student core questions on the back.
The following is a brief list of resources for information
related to instructor and course evaluation.
Abrami, P.C. (1989). How should we use student ratings
to evaluate teaching? Research in Higher Education, 30(2).
Abrami, P.C., d’Apollonia, S., & Cohen, P.A.
(1990). Validity of student ratings of instruction: What
we know and what we do not know. Journal of Educational
Psychology, 82(2), 219-231.
Bednash, G. (1991). Tenure review: Process and outcomes.
Review of Higher Education, 15(1), 47-63.
Braskamp, L.A., Brandenburg, D.C., & Ory, J.C. (1984).
Evaluating teaching effectiveness: A practical guide.
Beverly Hills, CA: Sage Publications, Inc.
Cashin, W.E., & Downey, R.G. (1992). Using global
student rating items for summative evaluation. Journal
of Educational Psychology, 84(4), 563-572.
Cross, K.P., & Angelo, T.A. (1988). Classroom assessment techniques:
A handbook for faculty. Ann Arbor, MI: National Center
for Research to Improve Postsecondary Teaching and Learning.
Marsh, H.W. (1980). The influence of student, course
and instructor characteristics on evaluations of university
teaching. American Educational Research Journal, 17, 219-237.
Marsh, H.W. (1984). Students’ evaluations of university
teaching: Dimensionality, reliability, validity, potential
biases, and utility. Journal of Educational Psychology.
Miller, R.I. (1987). Evaluating faculty for promotion
and tenure. San Francisco, CA: Jossey-Bass, Inc.
Millman, J. (Ed.) (1981). Handbook of teacher evaluation.
Beverly Hills, CA: Sage Publications.
Overall, J.U., & Marsh, H.W. (1980). Students’
evaluations of instruction: A longitudinal study of their
stability. Journal of Educational Psychology, 72, 321-325.
Rando, W.C., & Firing Lenze, L. (1994). Learning
from students: Early term student feedback in higher education.
National Center on Postsecondary Teaching, Learning, and Assessment.
Root, L. S. (1987). Faculty evaluation: Reliability of
peer assessments of research, teaching, and service. Research
in Higher Education, 26(1), 71-84.
Seldin, P. (1982). Changing practices in faculty evaluation.
San Francisco, CA: Jossey-Bass, Inc.
Theall, M., & Franklin, J. (Eds.) (1990). Student ratings
of instruction: Issues for improving practice. New Directions
for Teaching and Learning. San Francisco, CA: Jossey-Bass.
Weimer, M. (1990). Improving college teaching. San Francisco, CA: Jossey-Bass.