May 2, 2022

A Plea for Standardization


Natasha Nicol, PharmD, FASHP

Director of Global Patient Safety Affairs, Cardinal Health, Pawleys Island, South Carolina

Natasha Nicol is Director of Global Patient Safety Affairs for Cardinal Health, Inc. She received her Doctor of Pharmacy degree from the University of Maryland School of Pharmacy. She is a certified trainer in Just Culture, Event Investigation, and Disclosure of Unanticipated Outcomes, and an invited professor for the South Carolina and Presbyterian Colleges of Pharmacy. She is past-President of the South Carolina Society of Health-System Pharmacists. She is a Fellow of the American Society of Health-System Pharmacists and served on the Council on Education and Workforce Development, as well as the House of Delegates. She is Program Chair for the ASHP Medication Safety Collaborative. She was recognized for her work with the ASHP Award for Excellence in Medication-Use Safety and was named Pharmacist and Mentor of the Year by SCSHP. She is a frequent presenter to professional groups, primarily focusing on safety as it relates to culture, use of technology, and development of processes.



Relevant Financial Relationship Disclosure

No one in control of the content of this activity has a relevant financial relationship (RFR) with an ineligible company.

Ineligible company as defined by the Standards for Integrity and Independence in Accredited Continuing Education.


There are too many stories of events – no, let us call them what they are: tragedies – in healthcare that never should have happened.  With every story, I cannot help but think: if only we had implemented a process that would have removed that one most error-prone variable in virtually every scenario.  Yes, I am talking about us: humans.

What do we know about error etiology?  What have we learned?  And why do we still rely on hope and human intervention as a safety strategy?

You have likely seen or heard reference to “low-level” versus “high-level” risk mitigation strategies.  Low-level strategies include things like rules and policies, education, and actions such as telling staff to remember next time and be more careful.  High-level strategies work to remove the human element with things like forcing functions, hard-wired barriers, automation, and standardization.  Unfortunately, I still see the low-level strategies of education and reminding staff to be more careful as the two most commonly chosen approaches in response to error.  We humans really struggle with moving beyond the idea that errors occur because humans mess up, and that those humans need to make sure they never do it again.  We have been raised to believe that making mistakes is unacceptable and that blame must be placed on someone after any event occurs.  We must find who “did it”!  Is that not the first question everyone asks after something happens?

It makes sense that a strategy that relies on human memory, attention, or – dare I say it – the ability to perform flawless math is a failed one.  Yet we not only still rely on low-level strategies, we also remain plagued by a large segment of society that equates human error with criminal intent.  Consider RaDonda Vaught, a nurse who accidentally administered vecuronium instead of midazolam and was subsequently charged with – and convicted of – criminally negligent homicide.  As if to say Ms. Vaught willfully and consciously made the decision to kill someone.  It is remarkable to me how many have since righteously proclaimed: “I would never make a mistake like that” or “How could she have made such an egregious error?”

The focus needs to be on what we can do to create the safest system possible, where doing the right thing is significantly easier than doing the wrong thing.  A good example of a higher-level strategy that many hospitals have adopted is standardized crash cart medication drawers.  Has your organization done this?  Did you think about why you did it?  The intent of standardization is to take some of the human element (that fallible variable) out of the equation.  This is particularly important during stressful tasks, such as caring for a patient who is actively coding.  The last thing we want that practitioner doing is fumbling around with the contents of an unfamiliar cart, looking for drugs that could save a life.

The human brain can only store so much information and only handle so much stress.  Whenever we can identify an opportunity to make it easier to do the right thing and harder to do the wrong thing, we need to do it!

Speaking of doing it… As a side note, I am often asked about my preferred methodology for safety work.  Is it Lean?  Six Sigma?  FMEA?  PDSA?  Am I a black belt?  I believe in efficiently moving from point A to point B and not making it any more complicated than it has to be.  That is why my methodology follows the Nike doctrine of the “swoosh” – I prefer to “Just Do It”.  I am a firm believer in looking at a problem or opportunity, identifying some potential strategies, making “small tests of change”, quickly measuring for effect, and moving on.  I have witnessed too many colleagues analyze data to death, yet never follow it with any meaningful action.

Part of the problem is the tendency to make things too complicated and to fall into the trap of chasing our tails, trying to fix the “one-offs”.  There is no faster way to burn out staff than changing an entire process every time something happens; and burnout is real.  We need to be more sophisticated in our safety strategies and look to make more system-wide, meaningful changes.  We need to stop measuring things just to measure them, or because we have always measured them.  Instead, we need to measure what is meaningful and important to our organization, and what we can actually influence.  Just ask yourself: “what can I actually do with this data?”  If you cannot come up with anything besides “continue to track and trend” or “report it to P&T” – then STOP measuring it!

What do we know that works?

In the complex processes that make up the medication use system, every step offers an opportunity for error.  Effective error reduction is possible if we look at each step and devise ways to reduce the human element through strategies such as forcing functions (e.g., profiled automation), barriers (e.g., bar coding), and standardization.  Limit the choices, the variables, and the reliance on human memory, and you limit the chance for error.  If we took this approach – instead of what I have seen at hundreds of hospitals, where individual events are sent to the nurse manager on the unit where the event “occurred” with instructions to “fix it” – we could begin to move the needle.

We know that only a small percentage of errors that occur are actually reported.  There are many reasons for this, but the most prominent is culture.  We know from the latest AHRQ culture survey results that only 47% of healthcare workers feel it is safe to talk about errors without fear of retribution.1 Without a just culture – where human fallibility is acknowledged and accepted, and behavioral choices, not outcomes, are the focus – we cannot improve safety.

Human factors research has helped identify strategies to mitigate human error through a better understanding of how we interact with our environment.  For instance, the idea behind standardizing drug concentrations recognizes that the more options you have to choose from, the greater the chance for error.  Not only do multiple selections add confusion for providers, the complexity trickles down to pharmacy with product procurement, preparation, and dispensing, and then to nursing.  This becomes especially problematic with IV drips that require nurses to titrate – and do math.  The math becomes significantly less problematic with standardized concentrations.
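To make that arithmetic concrete, here is a minimal sketch, in Python, of the unit conversion a nurse must perform at the bedside to turn an ordered dose into a pump rate.  The drug, dose, weight, and concentration values are hypothetical, chosen only to illustrate how stocking multiple concentrations turns one order into several possible answers:

```python
def infusion_rate_ml_per_hr(dose_mcg_kg_min, weight_kg, conc_mg_per_ml):
    """Convert an ordered weight-based dose (mcg/kg/min) to a pump rate (mL/hr)."""
    mcg_per_min = dose_mcg_kg_min * weight_kg     # total dose per minute
    mg_per_hr = mcg_per_min * 60 / 1000           # mcg/min -> mg/hr
    return mg_per_hr / conc_mg_per_ml             # mg/hr -> mL/hr

# The same order (5 mcg/kg/min for a 70 kg patient), but three stocked
# concentrations (hypothetical mg/mL values) yield three different pump rates:
for conc in (0.8, 1.6, 3.2):
    rate = infusion_rate_ml_per_hr(5, 70, conc)
    print(f"{conc} mg/mL -> {rate:.1f} mL/hr")
```

With a single standardized concentration, the last step of that conversion is always the same, and the bedside calculation (or the smart pump library entry behind it) has one correct answer instead of several.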

I have witnessed and been brought in to investigate too many avoidable errors.  One of the most tragic involved a nurse who accidentally chose the wrong drip concentration from the smart pump library – there were three – and ended up killing her best friend’s mother.  The family had requested she be their mother’s nurse because they trusted her.  This nurse trusted the system and it failed her.

We need to move past egos in healthcare – and past doing things the way we have always done them – and implement the proven risk-reduction strategies available to us.  We know standardization, forcing functions, automation, and barriers work because they limit the need for humans to be perfect.  We know education and policies are a necessity but are not strong strategies to prevent human error.

Let us first learn to acknowledge human fallibility, and then accept it.  Then we can work to implement effective strategies that make it easy to do the right thing and virtually impossible to do the wrong thing.  Let us limit the choices by standardizing to one concentration and introduce forcing functions and barriers, and we will dramatically increase our chances of doing the right thing.  It is a simple decision and not a lot of work to make it happen, but the benefits are enormous.

More Information


Famolaro T, Hare R, Yount ND, et al. Surveys on Patient Safety Culture™ (SOPS®) Hospital Survey 2.0: 2021 User Database Report. Agency for Healthcare Research and Quality; March 2021. AHRQ Publication No. 21-0017.