Program Evaluations

The Miami University School Psychology Program strives to provide foundational and exemplary training consistent with our program mission and national standards set forth by NASP (2020). To do so, it is imperative that the program consistently and systematically engage in reflective evaluation to assess and maintain program quality at a high and competitive level. We collect internal and external data to obtain feedback from a variety of sources, including the following:

Internal Sources
  • Program Review: Program review is conducted annually through the University. Miami University is accredited by the Higher Learning Commission (HLC). Miami's most recent comprehensive HLC accreditation review was in 2015, and it resulted in a continuation of our accreditation for ten years, through 2025. School Psychology faculty summarize program-level data annually and the results help steer program changes in the areas of curriculum, program development, and faculty development.
  • School Psychology Program Advisory Committee: This ad hoc committee is composed of practicing school psychologists representing a range of settings and experiences, along with current program faculty. In meetings held once or twice per year, faculty seek input and feedback on a variety of program and curriculum issues. We also summarize feedback from our annual program survey and semester course evaluations and identify areas for program improvement and specific action items. Many of the committee members have supervised our practicum students (second year) or interns (third year), which allows them to offer detailed feedback.
  • Student evaluation of courses: Students complete a standardized end-of-course rating scale evaluation. Individual faculty members review the ratings to help them gauge their own effectiveness and to help guide any relevant revisions needed to course or evaluative content.
  • Intervention case results: Intervention outcomes associated with relevant field experiences and practica are evaluated. This provides the program with data reflecting training effectiveness on improvements for children.
  • Practicum and internship logs and portfolios
  • Practicum and internship evaluations: Data from field supervisor ratings of practicum students are summarized and used as general indicators of preparation relative to specific national training domains and program training goals.
  • Student evaluations of practicum and internship settings
  • Small group research projects: Students work with a faculty research advisor and peers on research projects, which are often presented at state and national conferences. This required research project demonstrates student commitment and training within the scientist-practitioner model and highlights the applied nature of our program focus.
  • Transcripts of graduate work
  • Comprehensive exam (year 1); comprehensive portfolio evaluation (year 3)
External Sources
  • NASP Approval: This confirms that our program meets national training standards set by the National Association of School Psychologists (NASP).
  • CAEP Accreditation and the NCSP exam (PRAXIS) results
  • Ohio Department of Education accreditation
  • Surveys of alumni, supervisors, and practitioners
  • Informal surveys of intern and practicum supervisors