Larry Peterson, Director of Accreditation, Assessment, and Academic Advising
NDSU chose to ask departments to compare the student learning outcomes and activities in our undergraduate capstone experiences in each major with DQP benchmarks for applied learning. Examining what "graduates can do with what they know" (in the words of the DQP) not only fits the land-grant culture of NDSU but also offered multiple other advantages for our campus. We hoped that evaluating their current capstone would give departments an opportunity to reflect on how well it served their students as an experience in which they synthesize and apply the knowledge and skills they gain. We also wanted to capture best local practices in capstones to share across the campus.
NDSU created an electronic survey asking departments to evaluate to what extent their required capstone experience met the elements of applied learning from the DQP. Our survey subdivided the applied learning benchmarks of the DQP into fourteen separate items. For example, departments were asked if their capstone met the DQP's benchmark of "formulates a question on a topic that addresses more than one academic discipline or practical setting." If it did, then they were asked to describe what student activities in the capstone provide evidence for their conclusion.
In December 2011 the Provost sent copies of the DQP to all faculty and a letter to all departments asking them to respond to the previously described electronic survey by February 15, 2012. Our analysis of the responses about the DQP suggested that even though capstones are a general education requirement, many faculty view them as discipline-specific classes. Consequently, they had trouble seeing how the broader learning outcomes of the DQP fit with their classes.
NDSU faculty evaluated the DQP's "applied learning" benchmarks as having six strengths:
- It reflected the benefits of a comprehensive framework for evaluation, particularly the ability to "gauge how students can apply information from the classroom to an experience."
- It was useful in identifying "where gaps are in obtaining feedback on the outcomes and related assessments."
- For some departments, it closely paralleled existing evaluation of learning outcomes.
- The outcomes were broadly aligned with professional accreditation standards (although those standards are significantly more specific).
- Such a framework was useful in aligning "courses, methods, and pedagogical goals."
- Two of the benchmarks were identified as "fundamentally the core outcomes" for the capstone in Engineering.
Correspondingly, faculty faulted the DQP's "applied learning" benchmarks in eight areas, five of which are outlined below:
- Some of the more technical disciplines (such as dietetics, engineering, and business) that must meet evaluation criteria set by professional associations criticized the DQP as too general and as relying on theoretical rather than applied professional knowledge.
- A number of respondents raised concerns about the need for students to address issues in more than one discipline, citing problems of cost, evaluation, and professional accreditation standards.
- The outcomes could be clarified and more concrete examples could be provided.
- The outcomes are too broad to be assessable by a single program or department.
- The outcomes do not address important areas such as oral presentations and teamwork.
As we analyzed the survey data about the extent to which department capstones met the elements of applied learning from the DQP, we realized that, even though two departments had piloted the survey, some questions were not as clear as we intended, and departments described their student work with more variation than we expected. Consequently, in November 2012 we conducted a short follow-up fixed-response survey of capstone course instructors, which revealed that:
- 13% had revised or were revising their syllabi
- 88% require research
- 97% require a final product, project, or performance
- 45% use a rubric to evaluate that product, project, or performance
We expect to complete our analysis of the DQP Capstone project by March 2013, but we continue to use the DQP as a reference point in campus conversations about assessment and shared learning outcomes for all undergraduates.