The sixteen recommendations outlined below are the result of more than a year’s work by the 2019/20 Student Evaluations of Teaching Working Group and extensive consultations with the UBC community. Some of the recommendations were established early in the Working Group’s deliberations, but the majority emerged through extended discussion and consultation: a set of initial recommendations was drafted in November 2019 and refined through further Working Group discussion and consultation. Consultations included student groups, open forums, and interim presentations to the Senates on both campuses.
Brief updates on the recommendations are provided below. Please see the other menu items in this section at left for more details on many of these recommendations.
Download the recommendations in the 2020 Report to Senate.
Student Involvement
1. Evaluation of teaching should include student feedback.
Students have a unique and valuable perspective from which to provide feedback on teaching at UBC. Student feedback on teaching is one of several sources of data that should be used for making personnel decisions and for the improvement of teaching.
2. The name of the process by which student feedback is gathered should be changed from ‘Student Evaluation of Teaching’ to ‘Student Experience of Instruction’.
Evaluation of teaching is a complex process, whether for formative or summative purposes. Doing it effectively requires input from multiple perspectives and sources (students, peers, self), integrated across time. As noted in (1) above, students have an important perspective that should be part of that process. However, students should be asked to focus on their experience rather than to ‘evaluate’ teaching writ large.
The new name of the process has been rolled out in official communications about the surveys since Winter Term 1, Fall 2021.
3. Questions asked of students should focus on elements of instruction based on their experience with instructor(s) in specific contexts and relationships.
In line with a recent statement from the American Sociological Association (September 2019), questions for students should focus on their experiences and be framed as an opportunity for students to provide feedback, rather than positioning the request as a formal and global evaluation of the teacher.
Changes to the university module questions on both campuses were designed to focus on students’ experiences in their courses; these changes were implemented starting in Winter Term 1, Fall 2021.
4. Student leadership on both campuses should be actively engaged in raising the profile of student feedback on instruction.
Gathering and considering feedback on teaching and learning from students is a responsibility shared between faculty and students. Student leadership should play an active and visible role in raising awareness of the purposes for, and ways in which, this feedback can improve instruction. Student leadership should also be part of efforts to raise awareness of comments that are not appropriate and/or counter-productive in the context of an anonymous survey.
The SEI Implementation Committee has connected with student leadership on both campuses about this recommendation, and consulted about student-facing communications and information about SEI.
UMI Questions
5. UMI-6 (Overall, the instructor was an effective teacher) should be retained in the core question set, but modified.
The Working Group had extensive discussions about whether to retain or delete this item. Analysis of UBC data indicates that UMI-6 scores can be predicted with a high degree of accuracy from a weighted linear combination of the other UMI questions (except UMI-4). However, in its current form, UMI-6 asks students to directly evaluate the ‘overall effectiveness of the teacher’. As argued above, students are not in a position to make sweeping, all-inclusive judgments about the effectiveness of instruction. On balance, the Working Group recommends retaining UMI-6, but rewording it as ‘Overall, this instructor was effective in helping me learn’. This centres the question on the individual experience of the student.
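As a purely illustrative sketch of the kind of prediction analysis described above, the following Python snippet fits a weighted linear combination of the other UMI items (UMI-1, 2, 3 and 5) to UMI-6 scores by ordinary least squares. The data here are synthetic and the variable names hypothetical; the actual UBC analysis may have used different data, model details and software.

```python
import numpy as np

# Synthetic section-level mean scores for UMI-1, UMI-2, UMI-3 and UMI-5
# (UMI-4 excluded, as in the analysis described above). Each row is one
# hypothetical course section; umi6 holds that section's UMI-6 mean.
rng = np.random.default_rng(seed=0)
n_sections = 500
predictors = rng.uniform(3.0, 5.0, size=(n_sections, 4))        # UMI-1, 2, 3, 5
weights = np.array([0.30, 0.25, 0.30, 0.15])                     # arbitrary, for illustration only
umi6 = predictors @ weights + rng.normal(0.0, 0.1, n_sections)   # synthetic UMI-6

# Fit the weighted linear combination (ordinary least squares with an intercept).
X = np.column_stack([np.ones(n_sections), predictors])
coef, *_ = np.linalg.lstsq(X, umi6, rcond=None)

# Assess how well the combination predicts UMI-6 (R^2 on the same data).
predicted = X @ coef
r_squared = 1 - np.sum((umi6 - predicted) ** 2) / np.sum((umi6 - umi6.mean()) ** 2)
print("fitted weights:", np.round(coef[1:], 3), " R^2:", round(float(r_squared), 3))
```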
After consultations and testing, a new wording for this question was implemented in surveys on both campuses starting in Winter Term 1, Fall 2021. See more information on changes to the UMI questions.
6. Minor changes in wording of other UMI questions are suggested to better reflect the focus on each student’s experience of instruction.
‘The instructor made it clear what students were expected to learn’ to be changed to ‘The instructor made it clear what I was expected to learn’
‘The instructor helped inspire interest in learning the subject matter’ to be changed to ‘The instructor engaged me in the subject matter’
‘The instructor communicated the subject matter effectively’ to be changed to ‘I think that the instructor communicated the subject matter effectively’
‘The instructor showed concern for student learning’ to be changed to ‘I think that the instructor showed concern for student learning’
The latter two questions are phrased so as to balance first-person perceptions with overall cohort experience and classroom climate.
After consultations and testing, new wordings for these questions were created and implemented in surveys starting in Winter Term 1, Fall 2021.
7. UMI-4 (Overall, evaluation of student learning was fair) should be removed from the common set.
UMI-4 is something of an outlier in the current UMI set used in Vancouver campus surveys: it is consistently answered by fewer students than the other questions. It is also problematic because the concept of ‘fairness’ is highly ambiguous; student consultations have indicated that students are often unsure how to interpret it.
Changes to the university module questions on both campuses were implemented starting in Winter Term 1, Fall 2021. This question has been removed.
8. A new UMI item, pertaining to the usefulness of feedback, should be trialled.
While the Working Group recommends removal of the previous UMI-4 item on fairness of assessment (see recommendation 7), there was a strong sense that, given the importance of timely and effective feedback in the learning process, feedback should be reflected in the core UMI questions.
We recommend a question worded as follows: “I have received feedback that supported my learning”. However, this question should be piloted in a limited set of courses in 2020/21 to ensure that we understand how responses might be influenced by variables such as class size. The opportunity to provide feedback, and indeed the nature of that feedback (e.g., written and/or numerical), will look very different in a seminar class of 20 compared to a large introductory lecture of 200. We should collect data from a pilot to better understand how this question is understood and answered before including it in the core UMI set. The results of the pilot could be included in the 2020/21 Report to Senates and a decision taken on how to proceed.
A question on feedback is part of the new set of university module questions that was implemented in Winter Term 1, Fall 2021.
9. There should be a common set of UMI questions asked across both campuses.
There should be a commonly used core set of five or six questions across both campuses. Modular approaches to constructing feedback surveys may be appropriate (university-wide items plus Faculty, Department and course-specific items). However, most students complete several surveys per term, which can cause ‘feedback fatigue’ and reduce participation rates; units should therefore be mindful of the overall length of the feedback surveys students are asked to complete. Units should also explore other ways to gather specific feedback as the course progresses.
Changes to the university module questions on both campuses were implemented starting in Winter Term 1, Fall 2021; both campuses now use the same set of UMI questions.
Data and Reporting
10. Units should be supported to adopt a scholarly and integrative approach to evaluation of teaching.
Because teaching is complex and contextually dependent, departments and units should be supported to adopt an integrative and scholarly approach to evaluation that synthesizes multiple data sources (e.g., students, peers, historical patterns, and self-reflection documentation) for a holistic picture, without over-reliance on any single data source. This approach will necessarily look different in different units but should include both in-kind support from units such as CTLT/CTL and funding for department leaders to accomplish the work proposed. When used for personnel decisions, the unit’s approach, strategy, and norms can then be communicated to all levels of review, along with the file. The VPAs on both campuses should work with the Senior Appointments Committee (SAC) to identify and disseminate anonymous examples of effective ways to integrate, synthesize and reconcile multiple perspectives on teaching effectiveness.
The SEI Implementation Committee and others wrote a discussion paper on an integrative approach to evaluation of teaching in Fall 2021. A dual-campus working group is moving forward on this recommendation in conjunction with work on a new Senate policy on evaluation of teaching (see recommendations 15 and 16, below).
11. Reporting of quantitative data should include an appropriate measure of centrality, distributions, response rates and sample sizes, explained in a way that is accessible to all stakeholders, regardless of quantitative expertise.
The interpolated median should be used as the measure of centrality, with the dispersion index as a measure of spread. Reports should include distributions of responses, response rates and sample sizes, clearly flagging where response rates do not meet minimum requirements for validity and accuracy. Visualizations of comparative (anonymous) data should be developed, along with an on-going program of consultation and dissemination to different groups (faculty, staff and administrators).
The results reports provide the interpolated median, the dispersion index, and the percent favourable (the percentage of respondents answering “agree” or “strongly agree” to each question). Reports also include response rates and information about minimum recommended response rates based on class size.
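To make these measures concrete, here is a minimal sketch of how they might be computed for a single question answered on a five-point agreement scale. The interpolated median and percent favourable follow their standard definitions; the dispersion index is shown here as mean absolute deviation from the median, which is one common measure of spread and may not match the exact definition used in UBC reports.

```python
from collections import Counter

def interpolated_median(responses):
    """Interpolated median for integer ratings on a 1-5 scale.

    Treats each integer category k as spanning [k - 0.5, k + 0.5] and
    interpolates within the category containing the 50th percentile.
    """
    n = len(responses)
    counts = Counter(responses)
    cumulative = 0
    for category in sorted(counts):
        cumulative += counts[category]
        if cumulative >= n / 2:
            below = cumulative - counts[category]
            return (category - 0.5) + (n / 2 - below) / counts[category]

def percent_favourable(responses):
    """Percentage of respondents answering 4 ('agree') or 5 ('strongly agree')."""
    return 100 * sum(1 for r in responses if r >= 4) / len(responses)

def dispersion_index(responses):
    """Mean absolute deviation from the median: one common measure of spread,
    not necessarily the exact formula used in UBC's reports."""
    median = sorted(responses)[len(responses) // 2]
    return sum(abs(r - median) for r in responses) / len(responses)

ratings = [5, 5, 4, 4, 4, 3, 2, 5, 4, 1]   # example responses to one question
print(interpolated_median(ratings))         # 4.0
print(percent_favourable(ratings))          # 70.0
print(dispersion_index(ratings))            # 0.9
```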
12. UBC should prioritize work to extract information from text/open comments submitted as part of the feedback process.
Many faculty members report that free-text student comments are a rich source of data to support reflection on, and enhancement of, their courses and teaching. It is recommended that a pilot investigation be undertaken, with one or more Faculties, to investigate the potential of automated approaches to extract useful information from large volumes of text submissions. The pilot should engage appropriate research expertise in Faculties in these areas and aim initially at formative purposes. There is an opportunity for UBC to take a lead among institutions in providing balance and insight when combining quantitative and qualitative data; failing to do so continues to privilege quantitative over qualitative data about teaching.
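As a deliberately simple illustration of what automated extraction can mean at the most basic level, the sketch below counts recurring terms across a set of hypothetical free-text comments. The platforms under investigation (see the update below) apply far more sophisticated text-analysis methods; this only shows the general idea.

```python
from collections import Counter
import re

# Hypothetical free-text comments from one course's surveys.
comments = [
    "The weekly feedback on assignments really helped me learn.",
    "Lectures were engaging but feedback on the midterm came too late.",
    "More timely feedback on labs would help.",
]

STOPWORDS = {"the", "on", "but", "me", "too", "would", "were", "was", "and", "a", "to"}

# Tokenize, drop stopwords, and count recurring terms across all comments.
words = (w for c in comments for w in re.findall(r"[a-z']+", c.lower()))
term_counts = Counter(w for w in words if w not in STOPWORDS)
print(term_counts.most_common(5))   # e.g. [('feedback', 3), ('helped', 1), ...]
```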
The SEI Implementation Committee is investigating several such platforms and will report on investigation and testing by early Fall 2022.
Dealing with Bias
13. UBC needs additional and regularized analysis of its own data to answer questions related to potential bias, starting with instructor ethnicity, which is frequently highlighted as a potential source of bias in the literature on student evaluation of teaching.
An analysis of UBCV data with respect to instructor and student gender over the last decade reveals no systematic differences in aggregate ratings received by female vs. male instructors. The variables tested (including instructor and student gender) show aggregate differences of approximately +/- 0.1 on a 5-point scale; in other words, very small effects. Course-specific variables (e.g., subject discipline, course level) show larger effects (typically +/- 0.3 on the same scale). An analysis of UBCO data across the 2015-16 and 2018 academic years revealed mixed results, as detailed in Appendix 3 of the Working Group’s report to both Senates.
For both campuses, it is important to note that this is an analysis of aggregate data and, as such, will mask variation at the individual level. The lived experience of individual instructors may be quite different from this aggregate view. However, holistic evaluations of a person’s teaching (see recommendation 15) can be used to contextualize individual instructors’ experiences. We cannot stress enough the importance of a holistic evaluation that allows individual lived experiences to be heard, particularly when an instructor’s lived experience runs counter to the aggregate data.
Given that studies have presented evidence of bias on the basis of instructor ethnicity, it would seem both appropriate and timely that the same analysis be brought to bear in checking the UBC data for bias. This work comes with privacy and ethical implications. We recommend developing a process that would allow instructor ethnicity data to be accessed confidentially for regular investigation of bias. We have not been able to undertake this analysis within the timescale of this Working Group and thus recommend a follow-on activity to investigate it, reporting back to the Senates during the 2020-2021 academic year. The follow-on report would also be in a position to recommend regularized analysis and mitigation strategies to address any systematic biases found, particularly related to gender and/or ethnicity.
Work on processes for collecting and using data on instructor ethnicity for the purpose of bias analyses is underway, and these analyses will commence in 2022, when adequate data to support reliable conclusions is available.
14. The work of collecting, integrating, interpreting and using feedback on teaching should mitigate bias, but should not presume that bias can be removed completely.
As with most other forms of surveys, student feedback on instruction cannot be completely free from bias. Bias can be explicitly discriminatory and can perpetuate stereotypes. But bias can also be implicit, where respondents are not consciously aware of how their attitudes influence their responses. Implicit biases have been shown to occur in many domains, and the general approach at UBC (e.g., on hiring committees) has been one of mitigation through education and awareness-raising.
This recommendation is supported by an analysis of the voluminous literature on the topic of student evaluations of teaching, and interrogation of the UBC dataset at multiple points in the last 10 years. The research literature reports studies on a wide variety of instruments and processes, with considerable variation in the scope of data collected. Individual studies are often reported in the mainstream academic press, sometimes with extrapolation beyond the context and the effects found in the initial study. Studies investigating a variety of instructor effects (e.g. age, gender, ethnicity) vary in whether they show bias, no bias or bias toward (rather than against) female instructors. In the subset of published studies where biases are found, and enough detail is provided to be able to discern the effect size, those effect sizes on aggregate are small.
Broader Issues
15. The Vancouver Senate should review the policy on Student Evaluations of Teaching and consider a broader policy on the evaluation of teaching writ large. The Okanagan Senate should develop a similar policy for the Okanagan campus.
Student feedback, both quantitative and qualitative, should be integrated with other forms of data to assess the effectiveness of a faculty member’s teaching. The current policy (2007) says little about how student feedback should be integrated with other forms of data before judgments are made about the effectiveness of teaching. It is therefore appropriate to revisit the UBC-V Senate Policy on Student Evaluation of Teaching and consider augmenting or replacing it with a policy that sets forth a broader and more scholarly approach to the evaluation of teaching. Similar processes should be applied on both campuses, governed by either a joint Senate policy or aligned policies for each campus.
As of Spring 2022, a dual-campus working group has been formed to move forward on this recommendation and the following one.
16. Senate should commit to support the ongoing work of implementing policies related to the evaluation of teaching.
Career advancement decisions are made on the recommendation of Departmental and Faculty committees and a system-wide Senior Appointments Committee, each of which is tasked with evaluating teaching effectiveness as a component of every case. It is imperative that UBC commit to providing the necessary resources and training, including administrative and technological support, to implement Senate policies on evaluating teaching (see recommendation 15).