Resources

Resources on student feedback and program assessment:

MID-COURSE CONSULTATIONS & STUDENT FEEDBACK

Mid-Course Feedback

A growing body of research on teaching and learning in higher education suggests that early feedback from students about their learning can be a valuable resource for faculty. Mid-course feedback helps faculty members make small but important adjustments while a course is still in progress. Mid-course feedback activities also give students a structured opportunity to reflect on their own learning and on the kinds of course activities that most support it. We invite faculty colleagues to work with us on ways to collect student feedback that will be most helpful as you think about your teaching in specific courses.

There are many ways to solicit early feedback from students. At the CTL we often use informal approaches such as one-minute papers and “Critical Incident Questionnaires.”[i] We also use short, learning-centered questionnaires as a simple tool for faculty to get feedback from their students about a course while it is in progress. We have developed sample surveys that can be adapted to your interests and to specific courses, and we would be happy to discuss these and other student feedback activities with you.[ii]

Please email or call the CTL if you would like to schedule a confidential consultation with me or with one of our CTL faculty members.

Sally Schwager, Director
CEU Center for Teaching and Learning


[i] The CIQ is a simple classroom evaluation tool developed by Stephen D. Brookfield and used at Oxford and other UK universities. See Stephen D. Brookfield and Stephen Preskill, Discussion as a Way of Teaching: Tools and Techniques for Democratic Classrooms, 2nd ed. (San Francisco: Jossey-Bass, 2005), p. 48.

[ii] For a brief discussion and examples of mid-semester course evaluations designed for faculty members’ own use, see http://www.princeton.edu/mcgraw/library/for-faculty/midcourseevals/index.xml

A Note from the Director of the CTL:

Using Evidence from Student Work for Program Improvement

 Dear Colleagues:

Many of you, like us at the CTL, are continuously working to analyze and improve the quality of your programs, so I thought I would take this opportunity to share some ideas and recent research aimed at improving graduate programs through a focus on student learning. As in all of our work at the CTL, we strive to examine our programs from the perspective of scholarship on student learning and academic “best practice.” To that end, I have recently been reviewing new research and major disciplinary discussions on student learning and graduate program improvement. I hope this brief overview might be of help as you formulate your own strategies. I would also welcome the opportunity to meet with you at any time to discuss specific program goals or other questions of student learning and program improvement.

As you may know, an important body of research in higher education on the design and assessment of academic programs focuses on the use of student work as data to inform program improvement. Unlike traditional standardized assessments, this more scholarly approach is, I believe, more consistent with our concerns as doctoral supervisors and faculty directors of graduate programs. Essentially, student work of the sort we are accustomed to examining for evidence of intellectual development can also be used as direct evidence to help us identify aspects of our programs that are working well and, conversely, areas that may need to be revised.[1]

Taking this approach allows us to start with the program’s established learning goals or objectives and then examine a sample of advanced student projects to identify trends relative to one or more of those goals. Examples of such student projects include:

  • Master’s theses
  • Major projects
  • Comprehensive examinations
  • Doctoral dissertations
  • Capstone course projects
  • Dissertation defenses
  • Portfolio evaluations of student work
  • Disciplinary conference presentations
  • Peer-reviewed submissions for publication

How programs choose to approach the analysis of student work obviously will vary across disciplines and programs, both in accordance with individual program objectives and in relation to discipline-specific protocols. But I would like to suggest several general principles that might be useful:

Guiding Principle: Keep it simple.

One of the most compelling research findings on student learning and program improvement is that meaningful change is more likely to result from targeting a few student learning objectives and collecting relatively small amounts of data in any given year. Some underlying principles of continuous quality improvement of student learning at the program level are:

  • Target one or two high-priority student learning outcomes: “A program does not have to assess every outcome every year to know how well it is doing toward the attainment of student outcomes.”[2] Program faculty often find it useful to start by identifying the two or three highest-priority outcomes (knowledge or skills that students graduating from the program should develop and be able to demonstrate). Programs can then target their inquiry into how students are “measuring up” in relation to these outcomes, one at a time, over several successive years. Keep in mind that the purpose is not to collect massive amounts of data; it is simply to gather enough evidence from student work to identify trends that can inform improvements at the program level.
  • Collect data selectively: You do not have to collect data on every student in every course. Because the purpose of this inquiry is continuous improvement of the program, and not an evaluation of individual students or individual courses, you might, for example, collect information from only two or three upper-level courses where your targeted outcome is “covered.”
  • Examine student projects using a simple rubric or a list of two or three statements that describe evidence of exemplary learning. In looking at a master’s thesis, for example, the program might decide to target two or three features that demonstrate mastery of specific program goals: a) “Applies sound research methods/tools to problems in an area of study and describes the methods/tools effectively;” b) “Communicates research clearly and professionally…” [in written form appropriate to the field]; and c) “Has demonstrated capability for independent research in the area of study, applying substantial expertise in that area and making an original contribution to it.”[3]
  • Conduct an annual program faculty meeting to discuss the evaluative evidence, analyze what it means for the program, and define any next steps.

As a final note, let me again invite you to meet to discuss this process or other aspects of teaching and learning that might interest you. Please email me directly or contact the CTL if you would like to set up an individual or small-group meeting with me or with one of my faculty colleagues at the CTL. We welcome the opportunity to learn more about the specific activities of your programs and to share other aspects of our teaching and research with you.

Sincerely,

Sally Schwager, Director


[1] Direct assessment refers to measures of student learning “that require students to display their actual knowledge and skills (rather than report what they think their knowledge and skills are). Because direct assessment methods tap into students’ actual learning (rather than perceptions of learning), they are often seen as the preferred type of assessment.”  

Shamima Ahmed, “The MPA Capstone Course: Multifaceted Uses and Potentialities in Program Assessment,” Teaching Public Administration (accessed September 14, 2015): 6. 

http://tpa.sagepub.com/content/early/2014/08/14/0144739414542714

For major professional initiatives and discussion of using direct student evidence to improve programs in various disciplines, see Chris M. Golde and George E. Walker, eds., Envisioning the Future of Doctoral Education: Preparing Stewards of the Discipline, Carnegie Essays on the Doctorate (San Francisco: Jossey-Bass, 2006); and the Carnegie Initiative on the Doctorate (CID) Collection (accessed September 14, 2015): http://gallery.carnegiefoundation.org/cid/

[2] Accreditation Board for Engineering and Technology (ABET), “Continuous Quality Improvement of Student Learning,” Module 4, p. 4 (accessed April 14, 2016): http://www.abet.org/network-of-experts/for-current-abet-experts/refresher-training/module-4-quality-improvement-of-student-learning/

[3] For a simple rubric, see the example from the Duke University Graduate Program in Biomedical Engineering (accessed May 5, 2016): http://bme.duke.edu/sites/bme.duke.edu/files/BME-Rubric-MS-ThesisDefense.pdf

See also “Rubric for Evaluating PhD Dissertation and Defense,” Miami University (accessed April 16, 2016): http://www.units.miamioh.edu/celt/assessment/grads/Dissertation_and_Defense.pdf

Using Evidence from Student Work for Program Improvement (PDF)

Sample Evaluation Rubric: Master’s Thesis (PDF)

Selected resources compiled by the CTL:

STUDENT PRESENTATIONS AND PRESENTATIONS IN TEACHING

SELECTED GENERAL RESEARCH-BASED RESOURCES ON PRESENTATIONS IN TEACHING 

Maryellen Weimer, “Student Presentations: Do They Benefit Those Who Listen?” reprinted from The Teaching Professor 26.1 (2012): 5 [advises peer evaluation to engage listeners]:
http://www.facultyfocus.com/articles/teaching-and-learning/student-presentations-do-they-benefit-those-who-listen/

Brown University, Sheridan Center handbook, “Teaching and Persuasive Communication: Class Presentation Skills” (on setting up student presentations, see pp. 37-39): http://brown.edu/about/administration/sheridan-center/sites/brown.edu.about.administration.sheridan-center/files/uploads/Teaching%20and%20Persuasive%20Communication.pdf

Russell J. Craig and Joel H. Amernic, “PowerPoint Presentation Technology and the Dynamics of Teaching,” Innovative Higher Education 31.3 (October 2006): 147-160:
http://cms.springerprofessional.de/journals/JOU=10755/VOL=2006.31/ISU=3/ART=9017/BodyRef/PDF/10755_2006_Article_9017.pdf

Richard E. Mayer, “Research-Based Principles for Designing Multimedia Instruction,” in Victor A. Benassi, Catherine E. Overson, and Christopher M. Hakala, eds., Applying Science of Learning in Education: Infusing Psychological Science into the Curriculum (Society for the Teaching of Psychology, 2014), pp. 59-70: http://teachpsych.org/Resources/Documents/ebooks/asle2014.pdf

Catherine Overson, “Applying Multimedia Principles to Slide Shows for Academic Presentation,” in Victor A. Benassi, Catherine E. Overson, and Christopher M. Hakala, eds., Applying Science of Learning in Education: Infusing Psychological Science into the Curriculum (Society for the Teaching of Psychology, 2014), pp. 252-258: http://teachpsych.org/Resources/Documents/ebooks/asle2014.pdf

Association of American Colleges and Universities (AAC&U), Oral Communication VALUE Rubric (for general inspiration in designing an oral presentation assessment rubric): http://www.aacu.org/value/rubrics/OralCommunication.cfm

EASY-READING PRESENTATION RESOURCES FOR STUDENTS

University of North Carolina Graduate School tips on poster sessions and general presentation preparation:
http://gradschool.unc.edu/academics/resources/postertips.html

“Creatively Speaking: Some Strategies for the Preparation and Delivery of Oral Presentations,” from Speaking of Teaching, Stanford University’s newsletter on teaching:
http://www.stanford.edu/dept/CTL/Newsletter/CTLNewsletterFA08.pdf

Stanford’s Hume Center tip sheets on clear PowerPoint slides, speech anxiety, and oral presentations: https://undergrad.stanford.edu/tutoring-support/hume-center/resources/speaking-resources

Leiter Reports, a philosophy blog by Brian Leiter (Karl N. Llewellyn Professor of Jurisprudence and Director, Center for Law, Philosophy, and Human Values, University of Chicago), tips on teaching a graduate seminar for the first time: http://leiterreports.typepad.com/blog/2012/08/tips-on-teaching-a-graduate-seminar-for-the-first-time.html

Nancy Duarte (author and researcher on PowerPoint and other visual presentation strategies), Harvard Business Review blog post on oral presentation slides: http://blogs.hbr.org/2012/10/do-your-slides-pass-the-glance-test/

Edward Tufte, “PowerPoint Is Evil: Power Corrupts. PowerPoint Corrupts Absolutely,” Wired, September 2003: http://www.wired.com/wired/archive/11.09/ppt2.html

CTL Resource Sheet: Presentations in Teaching (PDF)

The following center websites offer useful resources related to teaching and learning, from sample syllabi and lecture tips to advice on applying for fellowships:

Derek Bok Center for Teaching and Learning, Harvard University

Eberly Center for Teaching Excellence, Carnegie Mellon University

Stanford University Center for Teaching and Learning

Center for Research on Learning and Teaching, University of Michigan

The McGraw Center for Teaching and Learning, Princeton University

The Graduate School of Arts and Sciences Teaching Center, Columbia University