Leveraging Data to Assess and Evaluate Competence Through the Curriculum
Hear from the experts.

If you're in nursing education, your school is likely gathering many forms of data. That data often originates from different sources and ends up organized into silos, where it doesn't always get used. When you're able to weave all of your data together into something meaningful, it can tell a powerful story, one that matters to your institution, your program, your faculty, and your students.
We sat down with some experts from three schools of nursing to learn how they leveraged the power of data to embrace competency-based education in their programs. Below we share some takeaways from our webinar: Assessing and Evaluating Competence Through the Curriculum.
Rosemary Samia, MSN, RN, CNS, CHSE, Director of the Center for Clinical Education & Research at the University of Massachusetts Boston, introduced peer-to-peer assessment and evaluation of skills practice using SimCapture for Skills for her school's health assessment and fundamentals of nursing courses in the first clinical semester. SimCapture for Skills brings faculty closer to students by combining the effectiveness of the peer-to-peer learning methodology with digital assessment and evaluation tools.
Under the supervision of a graduate student or simulation educator, groups of three students rotate through three roles: performing the skill as the learner, assessing and evaluating another student performing the skill as the facilitator, and having the skill performed on them as the patient.
Looking at student test scores after implementing this approach, Rosemary found that the lab faculty who used peer-to-peer on a regular basis had students with much higher first-time pass results than other groups. "We could see that this was a strategy that was helpful," she explained.
Using SimCapture for Skills for practice time resulted in a decrease from 60 retakes to 9 retakes from one semester to the next.
In an effort to increase efficiency, UMass Boston also turned to the peer-to-peer approach to streamline its remediation process.
Using SimCapture for Skills allowed them to remediate 30 students per hour, with fewer faculty.
For Chris Garrison, PhD, RN, CNE, CHSE, Associate Teaching Professor and Director of the Simulation Lab at Pennsylvania State University, collecting data on student performance told a story that led to a rethinking of his school’s approach to skills training in its large prelicensure program.
Like many other nursing programs, Penn State had traditionally used student satisfaction surveys to evaluate simulation-based experiences. But the team began to realize they needed to collect objective performance data to measure learning outcomes. "We wanted something objective and valid to make decisions on," Chris explained. And with the new AACN Essentials requiring competency-based evaluation, they knew they would need this data to document how students were meeting those competencies.
Using the SimCapture simulation learning management system (LMS) with the Creighton Competency Evaluation Instrument to evaluate high-fidelity simulations, they uncovered a number of skills where students demonstrated strong competency. Upon digging deeper, they found students weren’t where they should be in other areas – such as some of the elements of safe medication administration and performing procedures correctly. Seeing this data left Chris and his colleagues asking themselves, "how can we get our students to the competency level that they need to be for practice?"
Chris explained that frontloading skills and not revisiting them is a common problem.
Ultimately, the data led to the decision to take a more deliberate practice approach and revisit the skills throughout the curriculum. They made sure to give students opportunities to redevelop their competency, and began using peer-to-peer methodologies with rubrics to enable students to get feedback as they were practicing. As they continue to collect data on this approach, Chris expects to see major improvements in these areas with the current cohort.
The experts shared some examples of how data helped them elevate professional development:
When Rosemary looked at student scores gathered with the Lasater Clinical Judgment Rubric, she noticed that the scores were uniformly exemplary, even though it was the first simulation the students had ever done, which suggested the issue lay in how faculty were applying the rubric rather than in student performance. "I saw that there was an opportunity for some faculty development here," she said. They addressed the need between semesters, but the data showed no improvement. So they changed the format of the questions faculty answer when giving students feedback: instead of a 1-through-4 rating, they kept the same categories and asked faculty to provide free-text comments to the students. "At the end of that semester, we then saw that the feedback being provided was much more valuable, the instructors were taking the time to read the evaluations and reflections, and then really guide the students through those four phases."
Rosemary shared that UMass Boston moved away from student satisfaction surveys for measuring simulation effectiveness and started utilizing the Debriefing Assessment for Simulation in Healthcare© (DASH) tool. Developed by the Center for Medical Simulation in Boston, the DASH tool takes an objective approach to assessing debriefings. UMass Boston started with the student version of the tool, which allows students to evaluate the facilitators. They later adopted the instructor version, with simulation educators reviewing their own debriefings and rating themselves to identify where to focus their professional development.
Jennifer Roye, MSN, RN, CHSE, CNE, the Assistant Dean for Simulation and Technology and a Clinical Assistant Professor at the University of Texas at Arlington, moved to a standardized, easy-to-use debriefing method called Plus Delta at her school. She’s conducted a number of faculty development workshops to ensure effective and consistent utilization. Recognizing the importance of leaning into objective data to measure effectiveness, Jenny also plans to utilize the DASH tool to evaluate faculty use of the Plus Delta method.
After the webinar, we asked the experts to share some tips for schools that are just getting started with using data.