Focusing the Evaluation of Teaching on the Effectiveness of Student Learning

Report to the Elon University Community by Robert C. Froh - August 8, 2003

 

Executive Summary

This executive summary provides an overview of the recommendations in the report addressing the two primary areas for emphasis: incorporating student self-assessment of progress in relation to learning goals as part of student evaluations of teaching, and developing the academic department as a critical location for peer review that focuses on the impact of teaching on the effectiveness of student learning. The recommendations stress the need for methodologies at three levels: faculty self-evaluation, peer review, and student evaluations of teaching and learning. The recommendations also place a high value on implementing methods that enable the ongoing investigation of the relationship of teaching and learning processes with learning outcomes.

 

General Recommendations

Use this coming academic year to experiment with new forms of formative and summative evaluations of teaching. There are several current initiatives that should influence evaluations of teaching, particularly the use of peers in summative evaluations of teaching at the Love School of Business, and some reconsideration of the role of the department chair in relation to teaching evaluations. These and other initiatives will continue to evolve in the appropriate contexts, and at appropriate points the Ad Hoc Committee for the Evaluation of Teaching should incorporate relevant components from these developments into revisions of the evaluation of teaching.

 

Incorporating Student Self-Assessment as Part of Student Evaluations of Teaching

            Continue to use the “Student Evaluation of the Learning Process at Elon College” to address ‘characteristics of effective teaching’ but supplement the current form with questions addressing learning goals. Experiment at the department level with questions that incorporate student estimates of their progress in relation to course and program learning goals as an important indicator of effective teaching. Give faculty the opportunity to indicate for each of their courses the relative emphasis they place on each of the learning goals represented on departmental forms. Use the IDEA form and the Hampshire College form as examples.

 

Develop the Academic Department as a Critical Location for Peer Review

Implement ‘collegial’ approaches to teaching evaluations within departments. Articulate student learning goals for each department to serve as a basis for components of student and peer evaluations of teaching that focus on learning outcomes.

 

Develop the use of reflective memos to enable assistance from peers in strengthening teaching. Use course and program learning goals as the basis for these memos. Share these reflective memos with peers within and across departments. Support faculty in summing across these memos to prepare entries for annual reports and promotion and tenure files.

 

Pilot the use of course portfolios by selecting a small group of faculty to receive funding and release time to develop these portfolios. Use examples supplied on the Carnegie Foundation for the Advancement of Teaching web site that show how these portfolios can demonstrate the scholarship of teaching.

 

Incorporate systematic reviews of samples of student work to support formative evaluations of teaching that focus on student learning outcomes. Enable faculty to receive the results of reviews of student work for students in their courses, so that they can reflect on the accomplishment of course and program learning goals.

 

Use an agreed-upon classroom observation form to support both chair and peer review of teaching in a more systematic and consistent fashion. Consider the results of these observations as data points that are separate and distinguishable from other forms of chair and peer review.

 

Since classroom observations appear to be such a strong component of the Elon culture, each department should develop a procedure for how this will be done, addressing the role of the chair and other senior faculty in summative evaluation and the opportunities available for faculty to seek observations within a confidential formative evaluation context.

 

Finally, pilot the use of web-based forms for completing student evaluations of teaching. This should include enabling opportunities for follow-up peer review of student comments on open-ended questions as a formative evaluation tool.

 


Introduction

The Ad Hoc Committee for the Evaluation of Teaching (The Committee) focused this year primarily on improving the administration of the “Student Evaluation of the Learning Process at Elon College” form and broadening the base of information used to evaluate faculty performance. I have provided advice along the way in relation to this focus. In addition to many constructive interactions with members of The Committee, this advice has been guided by observations of faculty concerns framed during the March 7 Numen-Lumen Forum, and by reflections from interviews with faculty and administrators at various points during this past spring semester. Specific comments on the report The Committee presented to the Academic Council on May 3, provided to Gene Gooch at that time, can be found in Appendix A.

 

This current report urges consideration of new approaches to the evaluation of teaching through addressing the impact of teaching on the effectiveness of student learning. This report suggests the need to align the evaluation of teaching with the assessment of student learning by use of some specific and promising methodologies at three levels: faculty self-evaluation, peer review, and student evaluations of teaching and learning. This report addresses two primary areas for consideration: incorporating student self-assessment of progress and effort in relation to learning goals as part of the student evaluation of teaching process, and developing the academic department as a critical location for peer review that focuses on the impact of teaching on the effectiveness of student learning.

 

Testing New Forms of Formative and Summative Evaluation of Teaching

This report attempts to frame recommendations that align with the intent of The Committee that the Elon community experiment over the coming year with new forms of formative and summative evaluations of teaching. This approach has merit for several reasons. The Committee determined that the current “Student Evaluation of the Learning Process at Elon College” form needs to be modified to better serve classes and learning contexts that do not fit the standard mode, such as lab classes, undergraduate research experiences, and internships. The Committee will work this coming year to support the development of appropriate forms for these different modes. The consideration of the use of ‘teaching portfolios’ to strengthen ‘self-evaluation’ has not been resolved, due to concerns regarding the demands that teaching portfolios might place on an already very busy faculty. The role of the department chair in relation to teaching evaluation is evolving to consider new forms of peer review at the department level in support of the vision of the Elon ‘Scholar-Teacher.’ The Love School of Business will implement a teaching committee approach that will enable peer review at promotion and tenure decision points, and the implementation of this approach should inform what other programs at Elon might consider in strengthening peer review for summative evaluation. The proposed center for teaching and learning will evolve and most likely will provide support for the formative evaluation of teaching; this support should enable more clarity regarding the balance and potential integration of formative and summative evaluation efforts. Finally, the ‘Committee to Define Scholarship’ has finished its deliberations, and actions taken on the basis of its recommendations may need to be aligned with evaluation-of-teaching efforts to address faculty evaluation in a more holistic fashion.

 

It appears advisable for The Committee to consider over the coming year how each of these initiatives will have an impact on the evaluation of teaching as part of the Elon ‘teacher-scholar’ vision. I stand ready to assist The Committee generally, and to consider specifically how the recommendations made below in this report can be applied.

 

Incorporating Student Estimates of Progress as a Component of Student Evaluations of Teaching

Innovation in the evaluation of teaching has been constrained by the lack of integration of research on student evaluations of teaching with research on the conditions and outcomes of effective student learning. Specific to Elon, and typical of the forms used on many campuses, the “Student Evaluation of the Learning Process at Elon College” focuses on ‘teaching style’ and ‘characteristics of effective teaching’ as viewed by students, and does not address learning outcomes. Some institutions are beginning to incorporate student estimates of their progress on course and program learning goals as an indicator of effective teaching, and are using these self-assessed estimates of learning to support the evaluation of teaching within summative contexts. These institutions are also beginning to stress the use of student evaluations of ‘characteristics of effective teaching’ in more diagnostic and formative contexts.

 

Astin’s Input-Environment-Outcome (IEO) Model as Support for Improved Teaching Evaluations

In Astin’s (1993) input-environment-outcome (IEO) model, ‘characteristics of effective teaching’ are important ‘environment’ variables that contribute to explaining learning ‘outcomes.’ Astin postulates that student involvement, or what Pace (1990) describes as ‘quality of student effort,’ is another critical environment variable that contributes to explaining student learning. Astin also postulates that, depending on the circumstances, student involvement can serve as an ‘intermediate outcome,’ and he summarizes research demonstrating that learning outcomes are positively associated with student involvement. This conceptualization is important to the evaluation of teaching because it integrates research demonstrating that both faculty effort and student effort are important to understanding learning outcomes, and it suggests that faculty can strengthen student learning outcomes by influencing the ‘intermediate outcome’ of student involvement. Most importantly within the evaluation of teaching context, this research suggests the power of looking at environment variables in relation to outcome variables in strengthening the effectiveness of evaluations of teaching.
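To make the IEO reasoning concrete, it can be expressed as a simple regression sketch. This is offered only as an illustration of the logic, with hypothetical variable names, not as a formula taken from Astin's text:

\[
O_{i} = \beta_{0} + \beta_{1} I_{i} + \beta_{2} E^{\mathrm{teaching}}_{i} + \beta_{3} E^{\mathrm{involvement}}_{i} + \varepsilon_{i}
\]

Here, for student \(i\), \(O\) is a learning outcome (for example, rated progress on a course goal), \(I\) is an input such as prior preparation, \(E^{\mathrm{teaching}}\) is a rating of teaching characteristics, and \(E^{\mathrm{involvement}}\) is a measure of quality of effort. Controlling for the input term keeps an evaluation from crediting or blaming teaching for what students brought with them, and a positive \(\beta_{3}\) corresponds to the involvement effect Astin summarizes.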

 

Using Student Estimates of Learning for Summative Evaluation

In making use of the IEO model, the Elon community should consider encouraging departments to pilot student course evaluation forms that ask students to rate their effort and progress on student learning goals articulated for the various components of the undergraduate curriculum. These components include lower-division introductory and general education courses (including first-year small-class seminars); internships and study abroad experiences; upper-division courses in the major; and capstone projects, with advising along the way that enables students to reflect on their learning at given points and to modify their learning goals. This approach, in combination with the current set of questions on Elon’s teaching evaluation, would foster a good balance between evaluating teaching processes through the current form and assessing learning outcomes through pilot forms developed to address learning within particular components of the undergraduate curriculum.

 

To use this approach most effectively, each faculty member should have the opportunity to indicate for each of their courses the relative emphasis they place on each of the learning goals represented on the pilot evaluation forms. This indicator of relative emphasis should be taken into account in formative and summative evaluation contexts. The examples described below for the IDEA form and for the Hampshire College form show how this can be done.

 

In support of this approach, research done at the University of Illinois in relation to their ICES (Instructor and Course Evaluation System) catalogue of more than 300 questions indicates that more experienced faculty tend to select questions that ask students to rate their progress in learning rather than questions directed toward particular characteristics of effective teaching. This research suggests that faculty who have gradually developed their teaching style wish to focus more on what students are actually learning (Ory and Braskamp 1981).[1]

 

Also in support of this approach, Bill McKeachie (1997), in a review of the literature on student evaluations of teaching, recommends the method developed by the Center for Faculty Evaluation and Development at Kansas State University, called the IDEA (Instructional Development and Effectiveness Assessment) short form.[2] IDEA asks students to rate their progress on instructional goals and asks faculty to rate their relative emphasis on these goals. McKeachie suggests this as a method that “provides information that goes beyond the teacher’s conformity to a naïve stereotype of good teaching but also is educational in broadening students’ conceptions of what the aims of education are.” McKeachie supports this approach particularly in relation to the use of student ratings in the context of summative evaluations of teaching.

 

As part of a redesign of the course evaluation form during my time at Duke University, the revised student course evaluation incorporated the IDEA short form as one component, along with Duke faculty-defined characteristics of effective teaching. Faculty appeared to appreciate being able to indicate their relative emphasis on goals and having this compared to the student ratings of progress.
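To illustrate how the two kinds of ratings can be brought together, the following sketch (in Python) computes an emphasis-weighted progress summary. The goal names, weights, and scales are hypothetical, and this is not the IDEA Center's actual scoring algorithm:

# Hypothetical sketch: weight mean student progress ratings on each goal by
# the instructor's declared emphasis, so 'essential' goals count the most.
# Goal names, weights, and scales are illustrative, not the IDEA algorithm.

# Instructor's declared emphasis per goal: 1 = minor, 2 = important, 3 = essential
emphasis = {"critical thinking": 3, "factual knowledge": 2, "oral expression": 1}

# Mean student self-reported progress per goal on a 1-5 scale
progress = {"critical thinking": 4.1, "factual knowledge": 3.8, "oral expression": 2.6}

def weighted_progress(emphasis, progress):
    """Average student progress across goals, weighted by instructor emphasis."""
    total_weight = sum(emphasis.values())
    return sum(emphasis[goal] * progress[goal] for goal in emphasis) / total_weight

print(f"Emphasis-weighted progress: {weighted_progress(emphasis, progress):.2f}")

Under such a weighting, modest progress on a goal the instructor marked as minor (oral expression above) does not dominate the summary, which is the point of collecting the emphasis ratings in the first place.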

 

Hampshire College recently has taken this approach even further, devoting 18 of the 32 questions on its student evaluation to student ratings of their progress (Appendix B). The college is currently conducting analyses of the usefulness of this approach; these analyses will help to select outcome questions on which students appear to discriminate particularly well across different courses.
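One plausible form for such analyses, offered as an assumption about the method rather than Hampshire's actual procedure, is to flag the items whose course-level mean ratings vary most across courses. A minimal Python sketch with illustrative data:

# Hypothetical sketch: identify outcome items on which students appear to
# discriminate across courses, using the variance of course-level mean ratings.
# The data and the 0.2 threshold are illustrative only.
from statistics import pvariance

# item -> mean progress rating in each of several courses (1-5 scale)
course_means = {
    "critical thinking": [3.9, 4.2, 2.8, 4.5],
    "oral expression":   [3.1, 3.0, 3.2, 3.1],
}

for item, means in course_means.items():
    spread = pvariance(means)  # between-course variance of the item's means
    verdict = "discriminates across courses" if spread > 0.2 else "little discrimination"
    print(f"{item}: variance {spread:.2f} ({verdict})")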

 

Rather than implement the IDEA or the Hampshire College form, Elon should consider developing its own learning-outcomes-focused questions that represent goals specific to Elon and that address a continuum of lower-order to higher-order cognitive skills, using Bloom’s ‘Taxonomy of Educational Objectives’ as a source. Appendix C contains a PowerPoint presentation that lists useful descriptors for Bloom’s taxonomy that could stimulate the development of these types of questions. Another important source for such questions would be the definition of core abilities at Alverno College, where students are asked to self-assess their development in relation to eight core abilities defined for the curriculum, with levels of development articulated for each ability to guide students in their self-assessment (also in Appendix C).

 

As an important aside, research on student self-assessment prompted by the IDEA form and by the National Survey of Student Engagement (NSSE) supports the validity of student self-assessment when the questions are sufficiently specific in nature (Pike 1995).

 

Developing the Academic Department as a Critical Location for Peer Review

Toward the end of this spring semester, as part of deliberations with Academic Council, The Committee began to consider in more detail the role that department chairs play in the evaluation of teaching, particularly with reference to conducting classroom observations. The Committee also began to consider the role that departmental communities play in peer review of teaching, particularly with reference to formative evaluation. Depending on the department, the chair may or may not conduct classroom visitations as part of teaching evaluation. In addition, within a number of departments the faculty have taken initiatives to establish peer review processes.

 

It appears that traditionally the department chair at Elon assumed much of the responsibility for evaluating teaching beyond that offered by the student evaluation form, and that processes for peer review were less developed. However, given the evolving ‘teacher-scholar’ vision at Elon, this may be an appropriate point to consider more robust ‘collegial’ approaches to teaching evaluation that align more closely with how scholarship is recognized and rewarded within disciplinary and professional domains.

 

There is a growing body of literature that focuses on the academic department as the focal point for enabling ongoing improvement in teaching and learning, and for responding to increasingly strong calls for demonstrating quality to various internal and external constituencies. A recent report by the National Academy of Sciences addressed how ‘high quality student learning’ should be the major criterion for measuring teaching effectiveness, and how academic departments “might assume collective responsibility for developing a coherent set of courses, programs, and other educational experiences that can enable all participating students to maximize their opportunities to learn” (Fox, Hackerman et al. 2003).

 

Defining College and Department Level Learning Outcomes

The presentation above, suggesting the utility of student estimates of their learning in student evaluations of teaching, provides a useful backdrop for the need for departments to define learning goals that serve as the basis for formative and summative evaluation of teaching and for the assessment of student learning. Various campuses have formed ad hoc faculty committees to support this articulation of outcomes and to monitor assessment efforts designed to determine the impact of pedagogy and curriculum on achieving these goals. The University of New Hampshire web site describes its ‘faculty associate’ group, which serves in this role.[3] St. Michael’s College in Vermont convened a committee of faculty that took as its initial focus the articulation of student learning outcomes for departments.[4] Elon departments might find these approaches informative in efforts to implement formative and summative evaluations of teaching focused on goals for student learning. Departments at Elon might incorporate efforts to articulate outcomes as part of the review process for their five-year plans.

 

Supporting Faculty Reflections on Teaching

A considerable amount of literature supports the use of teaching portfolios as a mechanism that enables faculty to guide the faculty evaluation process. The literature stresses that portfolios enable faculty to articulate their goals for learning and their evolution as teachers with particular strengths and areas for desired improvement. Portfolios also serve as a way of presenting evidence regarding the impact of teaching on student learning. However, Elon counts itself among a number of institutions that have been reluctant to adopt this approach due to what appear to be imposing time demands for documentation. There are some useful alternatives to teaching portfolios that should be considered.

 

Given the context of learning goals and assessment of student learning addressed above, faculty reflection on teaching can become much more focused and directed to desired impacts on students. Two approaches merit consideration for fostering more reflection and communication regarding teaching within a formative evaluation context: reflective memos on teaching, and course portfolios composed by selected faculty. These approaches emanate in part from an AAHE-Carnegie Foundation for the Advancement of Teaching project addressing peer review called ‘From Idea to Prototype: The Peer Review of Teaching’ (Hutchings et al. 1995).

 

With regard to reflective memos, some conversations I have had recently support the notion of strengthening reflection as part of nurturing a ‘culture of inquiry’ regarding teaching and learning. Michele Marincovich[5] at Stanford explicates the use of reflective memos within the specific context of an NSF-funded New Century Scholars project for new (1-3 years) engineering faculty. Reflective memos enable faculty to reflect on the match between their aims and their design for a course, and the course’s actual learning outcomes. Generally two to four pages, reflective memos begin with a general description of the course (who takes it, why, the course’s major goals and components, the challenges of teaching such a course) and proceed with a more detailed description of major aspects of the course (whether lectures, projects, labs, review sections, or homework assignments). In each case, faculty reflect on these components using the evidence of their own observations and the evidence of students’ work, comments, and evaluations. Memos usually conclude with a section on next steps to improve the course or to develop a strategy for further course development.

 

Craig McEwen and Christine Cote at Bowdoin[6] have recently embarked on a ‘Reflective Teaching’ project to work with faculty members to clarify their goals for student learning as teachers of their disciplines, within the context of a clear statement of the curricular goals of the College. Reflective teaching calls upon faculty to gather information that will tell them how well those goals are being met, and to learn from that evidence to reflect on and modify pedagogy, with help from professional development activities centered on best practices in the field.

 

Donna Qualters at Northeastern[7] described the use of ‘Course Memos’ in engineering, through which faculty seek feedback from department colleagues after the end of the course. The memo asks faculty to attach a copy of their course learning objectives and measurable outcomes as stated in what they call a ‘course charter,’ which represents an agreement among colleagues regarding what learning outcomes will be addressed in particular courses. Through the memo, faculty reflect on their course objectives and comment on whether they would add, subtract, change, or refocus them. Faculty support their comments with assessment data where possible, and address what teaching methods worked particularly well, what modifications they will make the next time they teach the course, and who else would benefit from knowing the results in their classes (department committees, other course instructors, advisors, etc.).

 

In each of these cases, the perception has been that implementing teaching portfolios might be too demanding, but that mechanisms that support sharing faculty reflections on teaching can provide material useful to the evaluation and improvement of teaching. At Elon, faculty could share these reflection documents with colleagues within and across departments for suggestions in formative evaluation contexts, and with department chairs to support decision making in summative evaluation contexts. It is now my understanding that all course syllabi at Elon contain learning objectives. These objectives could serve as a basis for reflective memos.

 

In considering how to make use of reflective memos, Elon academic departments will need to resolve how these memos could support faculty preparing their annual report (Unit I) and their promotion and tenure files. The processes suggested above for reflective memos can be retooled from summarizing particular courses to summarizing a set of courses in a given year, or to summarizing intentionality and progress over a number of years in the case of promotion and tenure reviews. In these cases, the reflections need to be more comprehensive, and the evidence cited needs to be more selective and presented in a way that shows one’s evolution as a teacher. Increasing reflective capacity in interaction with peers at the course level should eventually strengthen the quality of high-level summaries for annual reports and promotion and tenure reviews.

Selected Use of Course Portfolios

It would still be useful to consider the use of teaching portfolios, but in a context where faculty seek special funding or release time to develop course portfolios. These course portfolios would represent how they and their departmental colleagues have addressed particular teaching and learning challenges, such as enabling introductory courses to generate excitement and a sense of value for the field, or developing capstone projects or senior theses in a way that challenges students to strengthen their scholarship. Course portfolios in these cases are faculty-generated documents directed to colleagues in the department and the field wishing to consider how to strengthen the effectiveness of teaching and learning. The Carnegie Foundation for the Advancement of Teaching provides several examples of course portfolios developed by faculty on its web site.[8] In each of these cases, faculty pursue the course as a form of scholarship and present reflections and artifacts to represent this scholarship. Given Elon’s teacher-scholar vision, a focused effort enabling a select group of faculty to develop course portfolios would have considerable merit for both internal and external consumption. In relation to summative evaluation of teaching, selections from these course portfolios would enable faculty who have compiled them to demonstrate their scholarship of teaching.

 

An incentive grant program may be needed to provide selected faculty with technical support, teaching center consultation, and release time for completing course portfolios and for determining their utility for colleagues within their departments and within their disciplinary and professional domains.

 

Departmental Efforts to Review Courses for Impact on Learning Outcomes

Increasingly, institutions are convening faculty committees to conduct systematic reviews of samples of student work to determine what the work suggests regarding progress toward program-level teaching and learning goals. Faculty within writing programs sometimes take the lead in developing this method, which involves sampling student work across sets of related courses (such as general education and service courses, courses in the major, and capstone courses). Faculty review panels develop a scoring rubric representing levels of mastery in relation to learning goals, and then judge the quality of the student work samples in demonstrating achievement of those goals.

 

Examples of this approach can be found on the Worcester Polytechnic Institute (WPI) web site where faculty judge the quality of capstone projects as indicators of achievement of learning outcomes.[9]  Walvoord and Anderson (Walvoord and Anderson 1998) describe this approach in detail.

 

Typically this approach has been used primarily for program evaluation, determining whether students are achieving desired learning outcomes. However, it can simultaneously support faculty in completing formative evaluations of teaching and learning in their specific courses. In this case, faculty would develop a reflective memo after reviewing scored samples of student work from their course. The memo would address how representative these data seem, and what faculty think they say about the quality of teaching and student learning in their courses.
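As a sketch of how scored samples might be returned to instructors, the following aggregates a panel's rubric scores by course and learning goal. The courses, goals, and the 1-4 mastery rubric are hypothetical:

# Hypothetical sketch: aggregate a review panel's rubric scores by course and
# learning goal so each instructor can see results for their own course.
from statistics import mean

# Each tuple: (course, learning goal, panel score on a 1-4 mastery rubric)
scored_samples = [
    ("ENG 110", "critical/analytical writing", 2),
    ("ENG 110", "critical/analytical writing", 3),
    ("ENG 495", "critical/analytical writing", 4),
    ("ENG 495", "use of evidence", 3),
]

summary = {}
for course, goal, score in scored_samples:
    summary.setdefault((course, goal), []).append(score)

for (course, goal), scores in sorted(summary.items()):
    print(f"{course} | {goal}: mean {mean(scores):.1f} (n={len(scores)})")

A summary of this kind, alongside the raw samples, would give a faculty member concrete evidence to cite in the reflective memo described above.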

 

Classroom Observation Processes and Forms

The Committee will continue to deliberate and make recommendations for how chairs and others in the department might use classroom observations. Within this context, the use of an agreed-upon classroom observation form and process would strengthen the reliability, validity, and transparency of this approach. There are several examples presented in the literature, and one form is offered in Appendix D (Centra, Froh, Gray, and Lambert 1987). Since classroom observations appear to be such a strong component of the Elon culture, each department should develop a procedure for how this will be done, addressing the role of the chair and other senior faculty in summative evaluation and the opportunities available for faculty to seek observations within a confidential formative evaluation context. Depending on the specifics of the department and the particular chair, the chair may choose not to conduct observations for summative evaluation. However, if the chair does complete classroom observations, these observations should be rendered within a summary that stands separate and distinct from, but contributes to, the chair’s overall evaluation.

 

Web-based student evaluation forms

A number of institutions are experimenting with web-based forms that enable students to complete their course evaluations on-line. Concerns regarding return rates limit widespread use, but some institutions have found ways to compel students to complete the form, such as requiring students to complete the forms before they can see their grades on-line. The power of this approach in enabling better access to student comments should not be ignored, particularly in supporting department-level formative evaluation by peers. Colleagues could assist each other in reviewing these comments and in determining the most useful points. For example, faculty could complete a reflective memo after reviewing their comments and then seek assistance from a colleague in validating their reflection against the student comments.

 

Summary

There are many other possibilities for looking at the quality of teaching and learning within formative and summative evaluation contexts. With this report, I have focused on methods that enable investigation of the relationship of teaching and learning processes with learning outcomes. The methods suggested (incorporating student estimates of progress, articulating departmental learning outcomes, reflective memos, selected use of course portfolios, analysis of student work samples, the use of classroom observation forms, and web-based teaching evaluation forms) appear to be the most promising at this point in supporting self-evaluation and peer review within academic departments.


References

Aleamoni, L. M. (1987). Techniques for Evaluating and Improving Instruction. San Francisco: Jossey-Bass.

Astin, A. W. (1993). Assessment for Excellence: The Philosophy and Practice of Assessment and Evaluation in Higher Education. Phoenix, Ariz.: Oryx Press.

Austin, A. E. (1992). "Supporting Junior Faculty Through a Teaching Fellows Program." New Directions for Teaching and Learning, no. 50: 14.

Baez, B. (2002). "Confidentiality and Peer Review: The Paradox of Secrecy in Academe." Review of Higher Education 25: 163-183.

Baxter Magolda, M. B. (1999). Creating Contexts for Learning and Self-Authorship: Constructive-Developmental Pedagogy. Nashville, Tenn.: Vanderbilt University Press.

Bloom, B. S. (1961). Taxonomy of Educational Objectives: The Classification of Educational Goals. New York: D. McKay.

Braskamp, L. A., and J. C. Ory (1994). Assessing Faculty Work: Enhancing Individual and Institutional Performance. San Francisco: Jossey-Bass.

Centra, J. A. (1993). Reflective Faculty Evaluation: Enhancing Teaching and Determining Faculty Effectiveness. San Francisco: Jossey-Bass.

Centra, J. A., R. C. Froh, P. J. Gray, and L. M. Lambert (1987). A Guide to Evaluating Teaching for Promotion and Tenure. Copley Publishing Group.

Cross, K. P., and M. H. Steadman (1996). Classroom Research: Implementing the Scholarship of Teaching. San Francisco: Jossey-Bass.

Feldman, K. A. (1997). "Identifying Exemplary Teachers and Teaching: Evidence from Student Ratings." In R. P. Perry and J. C. Smart (eds.), Effective Teaching in Higher Education: Research and Practice. New York: Agathon Press.

Fenwick, T. J. (2001). "Using Student Outcomes to Evaluate Teaching: A Cautious Exploration." New Directions for Teaching and Learning, no. 88: 11.

Fox, M. A., N. Hackerman, et al. (2003). Evaluating and Improving Undergraduate Teaching in Science, Technology, Engineering, and Mathematics. Washington, D.C.: National Academy Press.

Froh, R. C., and M. Hawkes (1996). "Assessing Student Involvement in Learning." In R. J. Menges and M. Weimer (eds.), Teaching on Solid Ground: Using Scholarship to Improve Practice. San Francisco: Jossey-Bass.

Hativa, N., and M. Marincovich, eds. (1995). Disciplinary Differences in Teaching and Learning: Implications for Practice. New Directions for Teaching and Learning. San Francisco: Jossey-Bass.

Hoyt, D. P., and W. H. Pallett (1999). "Appraising Teaching Effectiveness: Beyond Student Ratings." IDEA Paper, no. 36.

Hutchings, P., AAHE Teaching Initiative, et al. (1995). From Idea to Prototype: The Peer Review of Teaching: A Project Workbook. Washington, D.C.: American Association for Higher Education.

Keig, L. (2000). "Formative Peer Review of Teaching: Attitudes of Faculty at Liberal Arts Colleges Toward Colleague Assessment." Journal of Personnel Evaluation in Education 14(1): 20.

Light, R. J. (2001). Making the Most of College: Students Speak Their Minds. Cambridge, Mass.: Harvard University Press.

Marincovich, M. (1998). Ending the Disconnect Between the Student Evaluation of Teaching and the Improvement of Teaching: A Faculty Developer's Plea. Stanford, Calif.: National Center for Postsecondary Improvement.

McKeachie, W. J. (1997). "Student Ratings: The Validity of Use." American Psychologist 52(11): 1218-1225.

Ory, J. C., and L. A. Braskamp (1981). "Faculty Perceptions of the Quality and Usefulness of Three Types of Evaluative Information." Research in Higher Education 15(3): 271-282.

Pace, C. R. (1990). The Undergraduates: A Report of Their Activities and Progress in College in the 1980s. Los Angeles: Center for the Study of Evaluation, University of California, Los Angeles.

Pace, C. R. (1993). "Foreword." In T. W. Banta (ed.), Making a Difference: Outcomes of a Decade of Assessment in Higher Education. San Francisco: Jossey-Bass.

Pike, G. R. (1995). "The Relationship Between Self-Reports of College Experiences and Test Scores." Research in Higher Education 36(1): 1-21.

Shulman, L. S. (1986). "Those Who Understand: Knowledge Growth in Teaching." Educational Researcher 15(2): 4-14.

Shulman, L. S., V. B. Smith, and D. M. Stewart (1987). Three Presentations from the Second National Conference on Assessment in Higher Education.

Sorcinelli, M. D. (2002). "New Conceptions of Scholarship for a New Generation of Faculty Members." New Directions for Teaching and Learning, no. 90: 8.

Walvoord, B. E. F., and V. J. Anderson (1998). Effective Grading: A Tool for Learning and Assessment. San Francisco: Jossey-Bass.


Appendix A: Comments Provided on the Committee Report of May 3

In considering how to strengthen the role of peer review in both formative and summative evaluation, the Committee considered other efforts that will have an impact on how peer review evolves to support the ‘Teacher-Scholar’ vision, such as the Love School of Business’s redefinition of its teaching evaluation system and the efforts of the Committee on Scholarship (name ?). The Evaluation of Teaching Committee Report highlights the Love School effort and calls for department-level initiatives to enable new forms of peer review to supplement the weight placed on Student Evaluations of Teaching (SETs) and the dominant role department chairs play in observing and judging teaching in the classroom.

 

The Committee also addressed the need to strengthen the Unit I annual report. This need might be framed more broadly as strengthening the processes related to ‘files for teaching faculty’ referenced in the faculty handbook. Other points of culmination, such as the mid-point and final year of the probationary period, in addition to Unit I annual reports, could use further articulation in terms of support and best-case examples.

 

The Committee chose not to consider at this point incorporating student self-reports of learning outcomes in the evaluation of teaching process. It may be advisable to encourage some pilot efforts in this arena. The faculty manual points to the SET at Elon as “the principal tool in assessing learning in the classes of teaching faculty…” yet the form at this point addresses pedagogy within and outside the classroom and does not address what students actually learn.

 

What follows are suggestions in relation to the recommendations made in the Committee Report, stressing the themes described above. Further detail can be provided as these suggestions are pursued.

 

Making Evaluation of Teaching More Complete and Consistent

(Recommendation 1)

 

The Committee reports the need for more specific guidelines for faculty Unit I Annual Reports. Providing more support to faculty in building their ‘Files for Teaching Faculty’ (Section E of the Faculty Manual) seems critical to enabling the ‘Teacher-Scholar’ vision. Chairs and deans appear to desire more consistency in the selection, organization, and summarization of these files, both annually and as part of the mid-point and final year of the agreed-upon probationary period. Enabling faculty to reflect on their goals and progress, and creating opportunities for assistance from peer mentors who have attained tenure, would be useful processes worthy of consideration.

 

Although the Committee chose not to look at student learning outcomes in the evaluation of teaching process, it may be advisable to consider some pilot efforts during this period of transition. Bill McKeachie in an article in the American Psychologist (1997, 52, 1218-1225) stresses the utility of this approach and cites the Kansas State IDEA short form as a good example of this emphasis.

 


Making the Evaluation Process More Transparent

(Recommendation 2)

 

It seems imperative to respond to the concern raised by some junior faculty regarding classroom observations completed by department chairs. Enabling multiple perspectives to strengthen the reliability and validity of this method will probably be accomplished by the approach the Love School of Business will implement: a three-member committee of tenured peers at the critical mid-point and end-point review periods of the tenure review process. Another approach that can also strengthen the reliability and validity of classroom observations is the use of an observation form designed to enable more transparency and consistency. Faculty could also use such a form when seeking optional classroom observations by trusted colleagues as a type of formative evaluation. Faculty could then opt to make this information available as part of their ‘teaching file’ for summative reviews when they perceive that the chair’s observations do not adequately represent their teaching.

 

 

Ensuring Anonymity and Improving Teaching

(Recommendations 1c and 3 respectively)

 

In responding to the concern regarding the anonymity of students’ comments, one simple solution would be to suggest that students who have this concern print their responses so that their handwriting cannot be recognized.

 

Beyond this specific concern, creating better access to students’ responses remains a perennial issue with SETs. Faculty overall, but particularly those in the arts and humanities, perceive thoughtful comments and suggestions by students as the most valuable kind of evaluation they can get regarding their teaching. Enabling web-administered SETs with email prompts surpasses other approaches in capturing this type of data, but the potential for low response rates prevents more widespread adoption of this way of gathering SETs. A number of institutions, such as Smith and Yale, have established effective leverage points. Elon has a special advantage in considering this approach, as student responses could easily be sorted by the overall scale score to organize them for easier reading and summarizing.
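A minimal sketch of the sorting idea follows; the field names and comments are hypothetical and imply nothing about how Elon actually stores responses:

# Hypothetical sketch: order open-ended comments by each respondent's overall
# scale score so reviewers see the most and least satisfied students together.
responses = [
    {"overall_score": 4.6, "comment": "The projects tied theory to practice."},
    {"overall_score": 2.1, "comment": "Feedback on drafts came too late to be useful."},
    {"overall_score": 3.4, "comment": "More worked examples in class would help."},
]

for r in sorted(responses, key=lambda r: r["overall_score"], reverse=True):
    print(f"[{r['overall_score']:.1f}] {r['comment']}")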

 

Instituting Formative and Collegial Evaluation

(Recommendation 3a)

 

If they have not done so already, academic departments should consider accessing two notable programs that have been particularly robust in supporting formative and collegial evaluation: the Lilly Teaching Fellows Program and the AAHE/Carnegie Peer Review Program. These programs have generated a range of specific ways to enable dialogue and inquiry among colleagues in support of making improvements to teaching, and in contributing to the ‘scholarship of teaching.’


Appendix B

Hampshire College

Student Course Evaluation

 

 

Part 1              Progress on Course Objectives

 

 

On each of the objectives listed below, rate the progress you have made as a result of taking this course by filling in the appropriate circle on the following scale:

 

None               Little               Some               Much              Very Much

 

 

In rating your progress, consider each objective carefully. Because the objectives are stated in general terms, interpret each of them in the context of this course; for instance, if an objective includes a list of items, consider the objective only in terms of the relevant item(s). Also, because most courses do not attempt to achieve all of these objectives, there are likely to be some objectives on which you have made no progress. Thus, an answer of “None” may simply mean that the objective in question was not one the course sought to achieve.

 

 

1. Gaining knowledge of facts, terms, classifications, works, major figures, etc.

 

2. Gaining an understanding of theories, fundamental concepts, or other important ideas.

 

3. Learning to understand professional/scholarly literature.

 

4. Learning to interpret primary texts or works.

 

5. Developing skill in critical thinking.

 

6. Developing skill in problem solving.

 

7. Developing skill in critical/analytical writing.

 

8. Developing creative capacities.

 

9. Learning techniques and methods for gaining new knowledge in this subject.

 

10. Developing the ability to conceive and carry out independent work.

 

11. Developing the ability to work collaboratively with others.

 

12. Developing skill in expressing ideas orally.

 

13. Developing skill in expression through art, music, media, writing, design, or performance.

 

14. Developing specific skills or competencies, such as artistic techniques, production methods, laboratory methods, quantitative techniques, computer applications, or fieldwork methods.

 

15. Gaining an understanding of the relevance of the subject matter to real-world issues.

 

16. Gaining an understanding of the historical and social context in which the subject has developed.

 

17. Gaining an understanding of different views and perspectives on the subject.

 

18. Discovering the implications of the course material for understanding myself (interests, talents, preconceptions, values, etc.).

 

Part 2 General Questions

 

Indicate your level of agreement with each of the next twelve statements by filling in the appropriate circle on the following scale:

 

Strongly Disagree          Disagree          Neutral          Agree          Strongly Agree

 

Questions 19-21 refer to this course’s instructor or, if the course has two instructors, to: ________________

19. I really wanted to take a course from this instructor.

20. I would like to take another course from this instructor.

21. Overall, I rate this instructor an excellent teacher.

 

Answer questions 22-24 only if this course has two instructors.

22. I really wanted to take a course from this instructor.

23. I would like to take another course from this instructor.

24. Overall, I rate this instructor an excellent teacher.

 

Please answer all of the remaining questions.

25. I really wanted to take this course, regardless of who taught it.

26. As a result of taking this course, I have a new or increased interest in this subject.

27. I put considerable effort into this course.

28. I had an adequate background for this course.

29. Overall, I rate this an excellent course.

30. Overall, I learned a great deal in this course.

 


31. Please indicate how this course fits into your educational program (choose only one):

·        Course-based Division I exam

·        Project-based Division I exam

·        Division II concentration

·        Division III

·        Other (please specify)

 

32. Please indicate any other reason(s) you had for taking this course (choose all that apply).

 

·        Curiosity

·        Recommendation of student or faculty member

·        Acquisition of particular skills

·        Graduate or professional school requirement

·        Prerequisite for another course

·        Other (please specify)

 

33. Please comment on what you did and did not get out of this course, given your expectations for the course.


Hampshire College

Instructor Objectives Report

 

Course Number ____________

(please print clearly)

 

 

Course Objectives

 

On each of the objectives listed below, rate the importance of this objective in your course by filling in the appropriate circle on the following scale:

 

Minor Importance      Moderate Importance           Essential

 

The rating “Minor Importance” should be understood to mean “of no more than minor importance.”

 

1. Gaining knowledge of facts, terms, classifications, works, major figures, etc.

 

2. Gaining an understanding of theories, fundamental concepts, or other important ideas.

 

3. Learning to understand professional/scholarly literature.

 

4. Learning to interpret primary texts or works.

 

5. Developing skill in critical thinking.

 

6. Developing skill in problem solving.

 

7. Developing skill in critical/analytical writing.

 

8. Developing creative capacities.

 

9. Learning techniques and methods for gaining new knowledge in this subject.

 

10. Developing the ability to conceive and carry out independent work.

 

11. Developing the ability to work collaboratively with others.

 

12. Developing skill in expressing ideas orally.

 

13. Developing skill in expression through art, music, media, writing, design, or performance.

 

14. Developing specific skills or competencies, such as artistic techniques, production methods, laboratory methods, quantitative techniques, computer applications, or fieldwork methods.

 

15. Gaining an understanding of the relevance of the subject matter to real-world issues.

 

16. Gaining an understanding of the historical and social context in which the subject has developed.

 

17. Gaining an understanding of different views and perspectives on the subject.

 

18. Discovering the implications of the course material for understanding myself (interests, talents, preconceptions, values, etc.).


Appendix C: Outcomes-Focused Assessment Presentation (Bloom’s Taxonomy Descriptors and Alverno’s Eight Abilities)

Slide 1
Assessing Student Learning in New England
Outcomes Focused Assessment
University of Maine, Augusta (Lewiston-Auburn, Bangor)
Thursday, May 15, 2003
Robert C. Froh, Ph.D.
Associate Director
Commission on Institutions of Higher Education (CIHE)
New England Association of Schools and Colleges (NEASC)
rfroh@neasc.org

Slide 2
Outcomes Based Assessment
Suggestion:
• Use Bloom’s Taxonomy of Educational Objectives to ensure:
– Outcomes defined along a continuum of lower- to higher-order cognitive skills
– Action verbs that enable assessment

Slide 3
Outcomes Based Assessment: Bloom’s Taxonomy of Objectives
Competence: Knowledge
Skills Demonstrated:
• observation and recall of information
• knowledge of dates, events, places
• knowledge of major ideas
• mastery of subject matter
Question Cues: list, define, tell, describe, identify, show, label, collect, examine, tabulate, quote, name, who, when, where, etc.

Slide 4
Outcomes Based Assessment: Bloom’s Taxonomy of Objectives
Competence: Comprehension
Skills Demonstrated:
• understanding information
• grasp meaning
• translate knowledge into new context
• interpret facts, compare, contrast
• order, group, infer causes
• predict consequences
Question Cues: summarize, describe, interpret, contrast, predict, associate, distinguish, estimate, differentiate, discuss, extend

Slide 5
Outcomes Based Assessment: Bloom’s Taxonomy of Objectives
Competence: Application
Skills Demonstrated:
• use information
• use methods, concepts, theories in new situations
• solve problems using required skills or knowledge
Question Cues: apply, demonstrate, calculate, complete, illustrate, show, solve, examine, modify, relate, change, classify, experiment, discover

Slide 6
Outcomes Based Assessment: Bloom’s Taxonomy of Objectives
Competence: Analysis
Skills Demonstrated:
• seeing patterns
• organization of parts
• recognition of hidden meanings
• identification of components
Question Cues: analyze, separate, order, explain, connect, classify, arrange, divide, compare, select, infer

Slide 7
Outcomes Based Assessment: Bloom’s Taxonomy of Objectives
Competence: Synthesis
Skills Demonstrated:
• use old ideas to create new ones
• generalize from given facts
• relate knowledge from several areas
• predict, draw conclusions
Question Cues: combine, integrate, modify, rearrange, substitute, plan, create, design, invent, compose, formulate, prepare, generalize, rewrite

Slide 8
Outcomes Based Assessment: Bloom’s Taxonomy of Objectives
Competence: Evaluation
Skills Demonstrated:
• compare and discriminate between ideas
• assess value of theories, presentations
• make choices based on reasoned argument
• verify value of evidence
• recognize subjectivity
Question Cues: assess, decide, rank, grade, test, measure, recommend, convince, select, judge, explain, discriminate, support, conclude, compare, summarize

Slide 9
Articulating Outcomes: Alverno’s Eight Abilities as an Example
Communication: Make connections that create meaning between yourself and your audience. Learn to speak, read, write and listen effectively, using graphics, electronic media, computers and quantified data.
Analysis: Think clearly and critically. Fuse experience, reason and training into considered judgment.
Problem Solving: Figure out what the problem is and what is causing it. With others or alone, form strategies that work in different situations. Then, get done what needs to be done, evaluating effectiveness.
Valuing: Recognize different value systems while holding strongly to your own ethic. Recognize the moral dimensions of your decisions and accept responsibility for the consequences of your actions.

Slide 10
Articulating Outcomes: Alverno’s Eight Abilities
Social Interaction: Know how to get things done in committees, task forces, team projects and other group efforts. Elicit the views of others and help reach conclusions.
Developing a Global Perspective: Act with an understanding of and respect for the economic, social and biological interdependence of global life.
Effective Citizenship: Be involved and responsible in the community. Act with an informed awareness of contemporary issues and their historical contexts. Develop leadership abilities.
Aesthetic Response: Appreciate the various forms of art and the contexts from which they emerge. Make and defend judgments about the quality of artistic expressions.
• http://www.alverno.edu/prospective_stu/eight_abilities.html

Slide 11
Articulating Outcomes by Level: Alverno’s Eight Abilities
Valuing
• Level 1: Identify own values
• Level 2: Infer and analyze values in artistic and humanistic works
• Level 3: Relate values to scientific and technological developments
• Level 4: Engage in valuing in decision-making in multiple contexts
Aesthetic Response
• Level 1: Articulate a personal response to various works of art
• Level 2: Explain how personal and formal factors shape own responses to works of art
• Level 3: Connect art and own responses to art to broader contexts
• Level 4: Take a position on the merits of specific artistic works and reconsider own judgments about specific works as knowledge and experience grow

Slide 12
Articulating Outcomes by Level: Alverno’s Eight Abilities
Analysis
• Level 1: Show observational skills, articulate and evaluate own problem-solving process
• Level 2: Draw reasonable inferences from observations, define problems or design strategies to solve problems using discipline-related frameworks
• Level 3: Perceive and make relationships, select or design appropriate frameworks and strategies to solve problems
• Level 4: Analyze structure and organization, implement a solution and evaluate the problem-solving process used

Slide 13
Articulating Outcomes by Level: Alverno’s Eight Abilities
Communication
• Level 1: Identify own strengths and weaknesses as communicator
• Level 2: Show analytic approach to effective communicating
• Level 3: Communicate effectively
• Level 4: Communicate effectively, making relationships out of explicit frameworks from at least three major areas of knowledge
(The above are to be done in writing, reading, speaking, listening, using media, quantified data and technology)
Effective Citizenship
• Level 1: Assess own knowledge and skills in thinking about and acting on local issues
• Level 2: Analyze community issues and develop strategies for informed response
• Level 3: Evaluate personal and organizational characteristics, skills and strategies that facilitate accomplishment of mutual goals
• Level 4: Apply her developing citizenship skills in a community setting

Slide 14
Articulating Outcomes by Level: Alverno’s Eight Abilities
Social Interaction
• Level 1: Identify own interaction behaviors utilized in a group problem-solving situation
• Level 2: Analyze behavior of others within two theoretical frameworks
• Level 3: Evaluate behavior of self within two theoretical frameworks
• Level 4: Demonstrate effective social interaction behavior in a variety of situations and circumstances
Developing a Global Perspective
• Level 1: Assess own knowledge and skills to think about and act on global concerns
• Level 2: Analyze global issues from multiple perspectives
• Level 3: Articulate understanding of interconnected local and global issues
• Level 4: Apply frameworks in formulating a response to global concerns and local issues
Problem Solving
• No levels stated in the reference
• http://smccd.net/accounts/skygrowth/Volume%204.htm

 

 

 


 

Appendix D: Classroom Observation Worksheet

Instructor________________________ Course_____________________________

Date _____________________________ Observer ____________________________

 

Directions: Below is a list of instructor behaviors that may occur within a given class or course. Please use it as a guide to making observations, not as a list of required characteristics. When this worksheet is used for making improvements to instruction, it is recommended that the instructor highlight the areas to be focused on before the observation takes place.

 

Respond to each statement using the following scale:

1 = not observed          2 = more emphasis recommended          3 = accomplished very well

 

Circle the number at the right that best represents your response. Use the comment space below each section to provide more feedback or suggestions.

 

Content Organization
 1. Made clear statement of the purpose of the lesson

 2. Defined relationship of this lesson to previous lessons

3. Presented overview of the lesson

4. Presented topics with a logical sequence

5. Paced lesson appropriately

6. Summarized major points of lesson

7. Responded to problems raised during lesson

8. Related today’s lesson to future lessons

Comments:

 

 

Presentation

 

9.      Projected voice so easily heard

 

10.    Used intonation to vary emphasis

 

11.    Explained things with clarity

 

12.    Maintained eye contact with students

 

13.    Listened to student questions and comments

 

14.    Projected nonverbal gestures consistent with intentions

 

15.    Defined unfamiliar terms, concepts, and principles

 

16.    Presented examples to clarify points

 

17.    Related new ideas to familiar concepts

 

18.    Restated important ideas at appropriate times

 

19.    Varied explanations for complex and difficult material

 

20.    Used humor appropriately to strengthen retention and interest

 

21.    Limited use of repetitive phrases and hanging articles

 

Comments:

 

 

Instructor - Student Interactions

 

22.    Encouraged student questions

 

23.    Encouraged student discussion

 

24.    Maintained student attention

 

25.    Asked questions to monitor students’ progress

 

26.    Gave satisfactory answers to student questions

 

27.    Responded to nonverbal cues of confusion, boredom, and curiosity

 

28.    Paced lesson to allow time for note taking

 

29.    Encouraged students to answer difficult questions

 

30.    Asked probing questions when student answer was incomplete

 

31.    Restated questions and answers when necessary

 

32.    Suggested questions of limited interest to be handled outside of class

 

Comments:

 

 

Instructional Materials and Environment

 

33.    Maintained adequate classroom facilities

 

34.    Prepared students for the lesson with appropriate assigned readings

 

35.    Supported lesson with useful classroom discussions and exercises

 

36.    Presented helpful audiovisual materials to support lesson organization and major points

 

37.    Provided relevant written assignments

 

Comments:

 

 

Content Knowledge and Relevance

 

38.    Presented material worth knowing

 

39.    Presented material appropriate to student knowledge and background

 

40.    Cited authorities to support statements

 

41.    Presented material appropriate to stated purpose of course

 

42.    Made distinctions between fact and opinion

 

43.    Presented divergent viewpoints when appropriate

 

44.    Demonstrated command of subject matter

 

 

Comments:

 

 

 

45.    What overall impressions do you think students left this lesson with in terms of content or style?

 

 

46.    What were the instructor’s major strengths as demonstrated in this observation?

 

 

47.    What suggestions do you have for improving upon this instructor’s skills?

 

 

 



[1] Personal correspondence with John Ory at the University of Illinois Urbana-Champaign confirmed that experienced faculty tend to select student outcomes type questions.

[4] St. Michael’s web citation to be supplied.

[5] Personal correspondence; Michele Marincovich is the Associate Vice President & Director of the Center for Teaching and Learning at Stanford University.

[6] Personal correspondence; Craig McEwen is the Dean for Academic Affairs and Christine Cote is the Registrar and Director of Institutional Research at Bowdoin.

[7] Personal correspondence; Donna Qualters is the Director of the Center for Effective Teaching at Northeastern University.