National Institute for Learning Outcomes Assessment

Occasional Paper 35 - Technology Solutions to Support Assessment


 

 

Harrison, J. M., & Braxton, S. N. (2018, September). Technology solutions to support assessment. (Occasional Paper 35). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).

Paper Abstract

In this paper, we explore how assessment technologies can support college and university assessment processes at multiple levels. Our goal is to help you think through your institutional assessment culture and processes to identify tools that support your institution’s approach to assessment. Multiple software systems can offer institutions rich and nuanced information about students—most schools have learning management systems (LMS) and student information systems (SIS), often supported by analytics programs that integrate the data. Faculty rely on the LMS and other tools like student response systems (i.e., “clickers”), Scantron, and e-portfolios to assess students’ work at the program and course levels. At program and institutional levels, many schools have adopted Assessment Management Systems (AMS) to streamline assessment processes and enrich their evidence about student learning. Yet “meaningful implementation remains elusive”—while 29% of provosts would like tools that can “aggregate assessment results to represent overall institutional performance,” 51% of provosts do not find their AMS fully supportive of assessment efforts (Jankowski, Timmer, Kinzie, & Kuh, 2018, pp. 4, 15, 23). How can institutions select useful assessment technologies and integrate them with existing tools, so faculty and administrators can easily extract and use the data to improve student learning? What elements should we consider when selecting technologies? Do any systems exist that address the requirements of authentic assessment in one solution?

To explore these questions, we discuss how technologies can address assessment challenges. Next, we classify the functional criteria in a taxonomy. We then sketch a process to help you reflect on your assessment technology needs, giving attention to your institution’s assessment culture, data, technology users, and audiences. Finally, we present evaluation criteria for judging the appropriateness of technologies.

 

Biography of the Authors


Jennifer M. Harrison has worked in higher education for almost 30 years and is currently UMBC’s Associate Director for Assessment in the Faculty Development Center. She has expertise in accreditation, institutional effectiveness, student learning assessment, critical pedagogy, curriculum development, educational technology, and online and face-to-face active learning. She currently specializes in interdisciplinary educational development. An experienced speaker, she has created hundreds of workshops, programs, and presentations for a range of higher education audiences, including national, regional, and local conferences. At UMBC, she consults with faculty and staff to strengthen learning assessment practices and offers programs and workshops to support faculty development. She was a key contributor to UMBC’s successful re-accreditation efforts and continues to work with faculty, staff, and leaders to support authentic assessment.

Before joining UMBC, she served the labor movement for 15 years at the National Labor College, crafting interdisciplinary writing, research, and critical thinking curricula; leading faculty development, prior learning assessment, and educational technology processes; cultivating strategic, institutional effectiveness, and learning assessment plans; and successfully contributing to re-accreditation as Associate Professor of Writing and Director of Assessment. After earning tenure, she chaired the admissions committee, brokering a FIPSE grant into a redesigned, student-success-oriented matriculation process designed to integrate with prior learning assessment and improve graduation rates; redesigned the capstone program; crafted key policy documents; and contributed to continuous improvement initiatives by founding and chairing the Assessment Committee.

Dr. Harrison holds an interdisciplinary Ph.D. in Language, Literacy, and Culture from UMBC, a master’s degree in English Language and Literature from University of Maryland, College Park, and a bachelor’s degree in English with a minor in art from Washington College. Her current research focuses on authentic assessment, including inclusive curriculum mapping and design; graduate, co-curricular, and interdisciplinary assessment; assessment technologies; and the benefits of contextualizing learning analytics with direct learning evidence.

 

Dr. Sherri N. Braxton is the Senior Director of Instructional Technology at UMBC, where she is responsible for leading the Division of Information Technology’s (DoIT) strategy for end-user support of instructional technologies, including online, hybrid, and traditional “face-to-face” technologies. With over 20 years of experience in traditional classroom instruction and adult education strategies grounded in instructional design models, she also possesses over 17 years of experience using learning technologies in higher education settings, including the design and facilitation of online and hybrid courses. Dr. Braxton is a dynamic presenter known for her ability to engage audiences and capture their attention, even for highly complex topics. She collaborates with her staff to devise learning opportunities delivered in multiple modes that meet the varied and shifting needs of both UMBC faculty and students. Dr. Braxton is also the DoIT representative on the University System of Maryland (USM) Academic Transformation Advisory Council, a group spearheaded by the William E. Kirwan Center for Academic Innovation. Dr. Braxton has crafted a national presence through her participation in educational technology associations like EDUCAUSE, the Online Learning Consortium (OLC), and the IMS Global Learning Consortium; in addition to presenting at national, regional, and local conferences, she serves as a proposal reviewer, constituent group leader, leadership institute faculty, and both task force leader and working group participant.

Dr. Braxton earned a Doctor of Science in Computer Science with Minors in Educational Leadership and Management Science from the George Washington University. She also holds a Master of Science in Computer Science with a Math Minor from North Carolina State University and a Bachelor of Science degree in Mathematics with a minor in Computer Science from Wake Forest University.


Additional Resources:

National Institute for Learning Outcomes Assessment. (2018). Assessment related technologies. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).

Contact North. (2018, May). Ten guiding principles for the use of technology in learning. Thunder Bay, Ontario: Contact North and Teachonline.ca.

Office of Educational Technology. (2017, January). Reimagining the role of technology in higher education: A supplement to the National Education Technology Plan. Washington, D.C.: U.S. Department of Education Office of Educational Technology (OET).

News Article:

Palmer, I. (n.d.). Produce your own analytics or hire a vendor? New America.