Description
This task is used for the assessment of Quantitative Reasoning (which the DQP refers to as Quantitative Fluency). It has been used to assess quantitative reasoning skills of students in our campus-wide general education Essential Studies (ES) program. This assignment is not specific to any course. Rather, it can be used broadly for undergraduate students at all levels of study.
This assignment addresses the following DQP Proficiencies:
- Intellectual Skills:
  - Use of Information Resources
  - Quantitative Fluency
  - Communicative Fluency
Background and Context
A team of faculty at UND developed signature tasks to assess undergraduate students’ proficiency in our general education program (“Essential Studies” or ES). Owing to our previous experience with the Collegiate Learning Assessment Performance Task Academy, we refer to our signature tasks as ES Performance Tasks. This task is for the assessment of Quantitative Reasoning (which the DQP refers to as Quantitative Fluency).
- The DQP describes Quantitative Fluency generally as recognizing the critical importance of visualization, symbolic translation, and algorithms in the context of analysis. This aligns well with the intent of our ES quantitative reasoning outcome, which requires students to “apply empirical data to a special problem or issue; draw conclusions based on quantitative information; and analyze graphical information and use it to solve problems.”
- Our ES quantitative reasoning rubric includes, under “interpreting data,” the ability to “reason with data, read graphs or maps, draw inferences, and recognize sources of errors.” This parallels the DQP outcome describing the ability to “construct valid arguments using the accepted symbolic system of mathematical reasoning.”
- Our rubric specifies that the student demonstrate “problem solving in context: uses appropriate mathematical and numerical tools to solve discipline specific problems.” The DQP also addresses the issue of context, requiring students to use and analyze “public information in papers, projects or multimedia presentations” and to construct “mathematical expressions where appropriate for issues initially described in non-quantitative terms.”
Given the context described above, we see the quantitative reasoning performance task and rubric developed at the University of North Dakota as well aligned with the DQP.
This performance task is completed by students enrolled in our ES capstone courses, which are limited to seniors or second-semester juniors. ES capstones must be completed by all students prior to graduation. The capstones are typically (although not always) taught within individual majors, but the performance task is intended to be cross-disciplinary. The task was developed with the aim of recruiting students to complete it outside of class (students are allowed up to 90 minutes to complete the task, and few capstone classes have a class period of that length). The quantitative reasoning performance task was created for assessment of general education learning outcomes rather than for a grade in a course.
Students from all majors complete the assignment. Although it is done out of class, it needs to elicit students’ best work in order to provide meaningful assessment data. Therefore, it was a priority to ensure that the task would be intrinsically engaging and relevant to our graduating seniors across the university. Our aim was to provide an assessment task that would not favor students from any particular major or perspective and that would allow us to collect meaningful information about student learning as students near completion of their bachelor’s degrees. We know that students found the task generally engaging because the vast majority of them, who knew only that they would be volunteering 90 minutes of their time to participate in an assessment of our general education program, chose to complete the task once they arrived at the testing site and learned what they would be expected to do. In addition, students produced written analyses that scorers generally agreed represented credible and serious work. The assessments themselves are completed during a designated Assessment Week in late February, and faculty scoring sessions occur after the end of the semester.
Although our task was designed to be particularly engaging to our own students (i.e., using data and information specifically relevant to an offer of employment after graduating from college), the concept is highly transferable to other institutions that need meaningful outcomes-level assessment of general education goals. Since virtually all colleges and universities today face this same assessment challenge, this assignment has broad applicability.
Alignment and Scaffolding
This assignment is given to students toward the end of their undergraduate studies. It requires students to evaluate a variety of data types in order to make a decision regarding employment options after graduation. The vast majority of students who take this assignment have completed (or are enrolled in) a course designated as meeting the ES Quantitative Reasoning requirement. Although the scope and context of ES Quantitative Reasoning courses can vary tremendously, they all emphasize skills associated with using numerical data of some sort to draw conclusions. This assignment not only provides a useful quantitative reasoning assessment tool, but also forces participants to think seriously about various factors that must be considered when deciding whether or not to accept a new job.
Reflections
Our first experience with using performance tasks during a designated Assessment Week occurred in February 2014, and we administered the Quantitative Reasoning performance task again during the spring 2015 semester. Based on these first few trials, there is a high degree of enthusiasm for this process on our campus. Those who worked on writing the performance tasks believed they were effective for our general education outcomes assessment purposes, students appeared to take them seriously (proctors were impressed by the seriousness of purpose they observed), and scorers agree that this approach is highly useful and effective. Following these initial administrations, we anticipate refining the rubric used to assess student work.