
Leveraging Data to Assess and Evaluate Competence Through the Curriculum

Hear from the experts.

If you're in nursing education, your school is likely gathering many forms of data. You may find that this data originates from different sources and gets organized into silos – but it doesn't always get used. When you're able to weave all of your data together into something meaningful, though, it can tell a powerful story – one that matters to your institution, your program, your faculty, and your students.
 
We sat down with some experts from three schools of nursing to learn how they leveraged the power of data to embrace competency-based education in their programs. Below we share some takeaways from our webinar: Assessing and Evaluating Competence Through the Curriculum.

1. Data can help you maximize learning outcomes with major efficiency

Rosemary Samia, MSN, RN, CNS, CHSE, Director of the Center for Clinical Education & Research at the University of Massachusetts Boston, introduced peer-to-peer assessment and evaluation of skills practice using SimCapture for Skills for her school's health assessment and fundamentals of nursing courses in the first clinical semester. SimCapture for Skills brings faculty closer to students by combining the effectiveness of the peer-to-peer learning methodology with digital assessment and evaluation tools. 
 
Under the supervision of a graduate student or simulation educator, groups of three students take turns performing the skill as the learner, assessing and evaluating a peer performing the skill as the facilitator, and having the skill performed on them as the patient.

Looking at student test scores after implementing this approach, Rosemary found that the lab faculty who used peer-to-peer on a regular basis had students with much higher first-time pass results than other groups. "We could see that this was a strategy that was helpful," she explained.

Using SimCapture for Skills for practice time reduced retakes from 60 to 9 from one semester to the next.

In an effort to increase efficiencies, UMass Boston also turned to the peer-to-peer approach to streamline their remediation process. 

"Nursing programs are being challenged with all sorts of issues. We’ve got lower faculty, higher turnover, our cohorts are growing in size – and so we needed to be really strategic about how we approached remediation because it wasn’t an option to water that process down."

- Rosemary Samia, MSN, RN, CNS, CHSE
Director, Center for Clinical Education & Research, University of Massachusetts Boston


Using SimCapture for Skills allowed them to remediate 30 students per hour, with fewer faculty.

2. Data can help you uncover the need for a change in approach

For Chris Garrison, PhD, RN, CNE, CHSE, Associate Teaching Professor and Director of the Simulation Lab at Pennsylvania State University, collecting data on student performance told a story that led to a rethinking of his school’s approach to skills training in its large prelicensure program.

As in many other nursing programs, Penn State had traditionally used student satisfaction surveys to evaluate simulation-based experiences. But they began to realize that they needed to start collecting objective performance data to measure learning outcomes. "We wanted something objective and valid to make decisions on," Chris explained. And, with the new AACN Essentials requiring competency-based evaluation, they knew they would need this data to document how students were meeting those competencies.

Using the SimCapture simulation learning management system (LMS) with the Creighton Competency Evaluation Instrument to evaluate high-fidelity simulations, they uncovered a number of skills where students demonstrated strong competency. Upon digging deeper, they found students weren’t where they should be in other areas – such as some of the elements of safe medication administration and performing procedures correctly. Seeing this data left Chris and his colleagues asking themselves, "how can we get our students to the competency level that they need to be for practice?"  
 
Chris explained that frontloading skills and not revisiting them is a common problem.

"I think it’s a faulty assumption that once you check somebody off as being competent in your Fundamentals or Foundations course, that they maintain that competence throughout the curriculum."

- Christopher Garrison, PhD, RN, CNE, CHSE
Associate Teaching Professor and Director of the Simulation Laboratory, Pennsylvania State University

Ultimately, the data led to the decision to take a more deliberate practice approach and revisit the skills throughout the curriculum. They made sure to give students opportunities to redevelop their competency, and began using peer-to-peer methodologies with rubrics to enable students to get feedback as they were practicing. As they continue to collect data on this approach, Chris expects to see major improvements in these areas with the current cohort.

3. Data can help you discover opportunities for professional development

The experts shared some examples of how data helped them elevate professional development:

Providing effective student feedback

Upon looking at student scores gathered using the Lasater Clinical Judgment Rubric, Rosemary observed that the scores were exemplary – despite it being the first simulation the students had ever done. “I saw that there was an opportunity for some faculty development here,” she said. They addressed the need between semesters, but when the data showed no improvement, they changed the format of the questions that faculty would answer to give students feedback. Instead of a 1 through 4 rating, they took the same categories and asked faculty to provide free-text comments back to the students. “At the end of that semester, we then saw that the feedback being provided was much more valuable, the instructors were taking the time to read the evaluations and reflections, and then really guide the students through those four phases.”

Ensuring debriefing competence

Rosemary shared that UMass Boston moved away from student satisfaction surveys for measuring simulation effectiveness and started utilizing the Debriefing Assessment for Simulation in Healthcare© (DASH) tool. Developed by the Center for Medical Simulation in Boston, the DASH tool takes an objective approach to assessing debriefings. UMass Boston started with the student version of the tool, which allows students to evaluate the facilitators. They later adopted the instructor version, with simulation educators reviewing their own debriefings and rating themselves to identify where to focus their professional development.

Jennifer Roye, MSN, RN, CHSE, CNE, the Assistant Dean for Simulation and Technology and a Clinical Assistant Professor at the University of Texas at Arlington, moved to a standardized, easy-to-use debriefing method called Plus Delta at her school. She’s conducted a number of faculty development workshops to ensure effective and consistent utilization. Recognizing the importance of leaning into objective data to measure effectiveness, Jenny also plans to utilize the DASH tool to evaluate faculty use of the Plus Delta method.


Bonus: Where to Start?

After the webinar, we asked the experts to share some tips for schools that are just getting started with using data. 

 

Watch the full webinar on-demand

Assessing and Evaluating Competence Through the Curriculum

Watch now

Speaker Bios


Rosemary Samia, MSN, RN, CNS, CHSE

Rosemary Samia, MSN, RN, CNS, CHSE, is the director of the Center for Clinical Education & Research at the University of Massachusetts Boston. Her clinical background includes 15 years in medical-surgical nursing before transitioning to teaching with simulation in the academic setting in 2014. In 2020, she completed the National League for Nursing’s year-long Leadership Development Program for Simulation Educators. Rosemary facilitates a local networking group for simulation educators in the greater Boston area and assists with the development and organization of regional simulation symposiums. She serves on the advisory board for safeMedicate, a virtual environment for simulated medication administration. In 2020, Rosemary was recognized by Sigma Theta Alpha for Excellence in Nursing Practice and was the recipient of the Maureen Oh Eigartaigh award.


Christopher Garrison, PhD, RN, CNE, CHSE

Christopher Garrison, PhD, RN, CNE, CHSE, is an Associate Teaching Professor in the Ross and Carol Nese College of Nursing at the Pennsylvania State University and the Director of the Simulation Laboratory at the University Park campus. His clinical background includes experience in medical-surgical nursing, cardiology, and home health, as well as practice as an adult/gerontological nurse practitioner. He is a Certified Nurse Educator (CNE) and a Certified Healthcare Simulation Educator (CHSE). Christopher holds an Associate’s degree from Northern Virginia Community College, a BS from the Pennsylvania State University, an MSN from George Mason University, and a PhD in Nursing Education from Nova Southeastern University. He teaches in classroom, simulation, and clinical settings across the curriculum in the BSN program at Penn State. In 2020, Christopher received the Janet A. Williamson Excellence in Teaching Award from the Penn State College of Nursing. He has 13 years of experience in designing and delivering simulation-based learning. His research interests include evaluating the effectiveness of simulation and virtual simulation as an educational strategy. Christopher has presented at national and international conferences on simulation and other nursing education topics.


Jennifer Roye, MSN, RN, CHSE, CNE

Jennifer Roye, MSN, RN, CHSE, CNE, is the Assistant Dean for Simulation and Technology and a Clinical Assistant Professor at the University of Texas at Arlington College of Nursing and Health Innovation. She is lead faculty for the Fundamental Telehealth Skills course in the Health Informatics Certificate Program. Mrs. Roye received her MSN from UTA in 2003 and is currently enrolled in the EdD Instructional Leadership program at The University of Alabama. She practiced as a CPNP in private practice for 10 years and as an RN in the Emergency Department at Cook Children’s Medical Center in Fort Worth, Texas, for 16 years. Her areas of research interest include simulation, telehealth, student engagement, enhancing online education, and moral distress in the undergraduate nursing student population.