Assessment Brief #93 - Master of Computer Science Assessment Activity


January 2017

This brief features the assessment activity for the Master of Computer Science (MCS) program and its alignment with the assessment requirements of the Higher Learning Commission (HLC). The HLC promotes "full cycle" assessment of degree programs, meaning that programs have clear and meaningful learning outcomes, multiple measures of assessment, data collection and analysis, sharing of findings, and development and tracking of plans for improvement.

Learning Outcomes & Method

Because the mission of the master's degree program focuses on acquiring research skills and specialized knowledge, the outcomes in the assessment plan focus on explaining and applying timely computer science topics, identifying and engaging with the relevant professional literature, and communicating research findings. These outcomes are assessed via three methods: (1) assessment of the final defense using a rubric; (2) student exit surveys; and (3) faculty reflections. The rubric includes thoughtful quality descriptors and several levels of gradation. Three faculty members assess each student's final defense, and their scores are averaged.

Findings & Strategies for Improvement

Direct assessment findings (scoring of student work via the rubric) suggested that students are meeting outcomes at a satisfactory level, scoring an average of 3.39 (out of 4), with 100% of students scoring above 2 on all three outcomes. However, the faculty commented that they would like to see average scores closer to 3.50. Indirect assessment of the students' perceptions suggests that students are satisfied with their level of attainment of the outcomes. The faculty reflections, however, were more mixed in their perceptions of student learning.

Faculty discussed these findings and identified several strategies for improvement: (1) increased coverage of specialized knowledge in the form of a Machine Learning course; and (2) improvement of students' technical interview capacities through the Technical Interview Seminar.

Conclusion

The strengths of the assessment plan for this degree program include: (1) concrete outcomes that connect directly to the program mission; (2) an effective rubric with concrete quality descriptors; (3) scoring of student artifacts (the defense) by trained faculty members, including an evaluator from outside the program; (4) sharing and discussion of findings among all program faculty; and (5) generation of strategies for improvement that are tracked in subsequent reports.

The assessment activity could be enhanced by incorporating questions that focus directly on the student learning outcomes in the two indirect assessment measures (the student exit survey and faculty reflections).