The DQP in Practice at North Dakota State University

Larry Peterson, Director of Accreditation, Assessment, and Academic Advising

NDSU chose to ask departments to compare the student learning outcomes and activities in the undergraduate capstone experiences in each major with DQP benchmarks for applied learning. Examining what “graduates can do with what they know” (in the words of the DQP) not only fit the land-grant culture of NDSU but also had multiple other advantages for our campus. We hoped that evaluating their current capstone would provide departments with an opportunity to reflect on how well it served their students as an experience in which they synthesize and apply the knowledge and skills they gain. We also wanted to capture best local practices in capstones to share across the campus.

NDSU created an electronic survey asking departments to evaluate the extent to which their required capstone experience met the elements of applied learning from the DQP. Our survey subdivided the applied learning benchmarks of the DQP into fourteen separate items. For example, departments were asked whether their capstone met the DQP’s benchmark of “formulates a question on a topic that addresses more than one academic discipline or practical setting.” If it did, they were then asked to describe what student activities in the capstone provided evidence for their conclusion.

In December 2011 the Provost sent copies of the DQP to all faculty and a letter to all departments asking them to respond to the previously described electronic survey by February 15, 2012. Our analysis of the responses about the DQP suggested that even though capstones are a general education requirement, many faculty view them as discipline-specific classes. Consequently, they had trouble seeing how the broader learning outcomes of the DQP fit with their classes.

NDSU faculty evaluated the DQP’s “applied learning” benchmarks as having six strengths:

  • It reflected the benefits of a comprehensive framework for evaluation, particularly in the ability to “gauge how students can apply information from the classroom to an experience.”
  • It was useful in identifying “where gaps are in obtaining feedback on the outcomes and related assessments.”
  • For some departments, it closely paralleled existing evaluation of learning outcomes.
  • The outcomes were broadly aligned with professional accreditation standards (although those standards are significantly more specific).
  • Such a framework was useful in aligning “courses, methods, and pedagogical goals.”
  • Two of the benchmarks were identified as “fundamentally the core outcomes” for the capstone in Engineering.

Correspondingly, faculty faulted the DQP’s “applied learning” benchmarks in eight areas, five of which are outlined below:

  • Some of the more technical disciplines (such as dietetics, engineering, and business) that must meet evaluation criteria set by professional associations criticized the DQP as too general, relying on theoretical rather than applied professional knowledge.
  • A number of respondents raised concerns about the need for students to address issues in more than one discipline, citing problems of cost, evaluation, and professional accreditation standards.
  • The outcomes could be clarified and more concrete examples could be provided.
  • The outcomes are too broad to be assessable by a single program or department.
  • The outcomes do not address important areas such as oral presentations and teamwork.

As we analyzed the survey data about the extent to which department capstones met the elements of applied learning from the DQP, we realized that, even though we had two departments pilot the survey, some questions were not as clear as we intended, and departments described their student work with more variation than we expected. Consequently, in November 2012 we conducted a short follow-up fixed-response survey of capstone course instructors, which revealed that

  • 13% had revised or were revising their syllabi
  • 88% require research
  • 97% require a final product, project, or performance
  • 45% use a rubric to evaluate that product, project, or performance

We expect to complete our analysis of the DQP Capstone project by March 2013, but we continue to use the DQP as a reference point in campus conversations about assessment and shared learning outcomes for all undergraduates.