Assessment Brief #105 - Feedback on Assessment


October 2019

Frequent Feedback on Departmental Assessment Reports and Plans

Brief but detailed program-level feedback on assessment reports and plans has been identified by constituencies in Miami's assessment process as a very valuable resource. A content analysis of past feedback was recently carried out to identify the most frequently offered comments. A summary appears below.

Summary of Assessment Feedback

  • Focus on a few learning outcomes and a few assessment strategies rather than trying to do too much.
  • If using feedback on prior student work as an assessment method, develop effective rubrics and provide training to faculty members and others using them so that their judgment is calibrated.
  • Be more specific in explaining how assessment results were reflected upon and what follow-up actions were taken. Instead of reporting only that “results were discussed,” provide more detail. What conclusions were reached? What changes will result? How will you know if those changes are effective?
  • Consider moving to a two-year cycle for reporting for programs with small numbers of students. Assessment data should be collected every year, but reporting on two years of data may be more useful.
  • Include rubrics, survey questions, and other assessment measures in the report.
  • In addition to using direct assessment methods, collect and report upon indirect assessment evidence from students, graduates, and others. Examples of indirect assessment methods include graduating student surveys, results of surveys administered by the Office of Institutional Research and Effectiveness (e.g., National Survey of Student Engagement), focus groups, adding items concerning learning outcomes to course evaluations, tracking student acceptance into graduate and professional programs, feedback from internship/clinical supervisors, and feedback from employers.
  • Consider having faculty members who score student work not be the instructors of the classes in which the work was produced.
  • Collect a representative and sufficiently large sample of student work or other assessment data.
  • While it is important to have one faculty or staff member and/or a small working group manage assessment activity in the department, it is also critical that all faculty and/or staff members in the unit meet at least once each year (perhaps once each semester in departments with several programs) to share assessment findings, reflect upon implications, and develop follow-up plans.
  • Follow the approaches outlined in the department's assessment plan or develop a revised plan.
  • Schedule departmental meetings to discuss assessment results, reflect upon implications, and develop follow-up plans ahead of the annual assessment report submission date (June 30 or December 31) so that these "closing the loop" activities can be reported upon in each year's report.
  • Report on assessment activities separately for each program in the department.
  • For programs with disciplinary accreditation, if necessary, create separate assessment reports for the University that address all requirements for full-cycle assessment.
  • Make sure rubrics are sufficiently descriptive (e.g., what does a "3" or "accomplished" represent?).

Assessment Feedback Rubric

The University Assessment Council has developed the rubric below to provide feedback on assessment plans and reports.

1. Learning Goals or Outcomes

  • Strong: Learning goals are clear, measurable, and actionable, follow the assessment plan, and are consistent across years (unless the plan is changed).
  • Needs Some Improvement: Learning goals are included, but they are not easily measurable and/or do not lead to actionable results.
  • Needs Immediate Attention: Learning goals are not provided or are explained only in a very general way.

2. Assessment Methods

  • Strong: Documentation of the assessment instrument is included with the report.
  • Needs Some Improvement: Assessment methods are mentioned, but examples are not included with the report.
  • Needs Immediate Attention: Methods are neither mentioned nor included with the report.

3. Findings

  • Strong: Findings clearly indicate where students excelled, met standards, and fell short; findings portion includes thoughtful analysis.
  • Needs Some Improvement: Findings show evidence of some analysis of student learning beyond broad and general statements.
  • Needs Immediate Attention: Findings are reported in very general, overall terms or there is no mention of findings.

4. Context of Findings

  • Strong: Appropriate context is provided to assist readers in understanding what has been found.
  • Needs Some Improvement: Findings provide minimal context that would assist a reader in understanding what has been found.
  • Needs Immediate Attention: No context of findings is provided.

5. Dissemination of Results

  • Strong: Results are widely disseminated and well presented. Report includes input from stakeholders.
  • Needs Some Improvement: There is indication that results were shared, but in a limited scope (e.g., the report was given to faculty but not discussed, or no wide faculty input was solicited).
  • Needs Immediate Attention: No indication that results were shared.

6. Action Plan

  • Strong: Report summarizes specific and logical actions planned (including a timeline) and then taken, based on findings, for all of the assessed outcomes.
  • Needs Some Improvement: Report offers specific and logical actions taken for most of the outcomes.
  • Needs Immediate Attention: Use of results is completely future-oriented ("we plan to do this...") without a concrete timeline, or there is no mention of future action.

7. Follow-up from Previous Reports

  • Strong: Includes analysis of the efficacy of the actions taken from previous reports.
  • Needs Some Improvement: Inconclusive or limited information included about efficacy of actions taken from previous reports.
  • Needs Immediate Attention: No information about the efficacy of actions taken from previous reports.