
Do you remember? Or even yesterday…

Way back in September of last year, we were chatting about the importance of appropriately managing conditions in the patient environment, primarily the surgical environment. For those wishing for a refresher, you can find that post here. (I talked about how I’ve noticed recent citations in surveys regarding the surgical environment, including the maintenance of temperature and humidity, ensuring appropriate air exchange rates, and making sure that your HVAC systems are appropriately maintaining pressure relationships, etc.)

One of the things I didn’t really cover back then was when you have documented out-of-range values. Could be temperature, could be humidity, could be those pesky air exchanges and/or pressure relationships. The fact of the matter is that we live in an imperfect world and, more often than not, our success comes down to how effectively we manage those imperfections. And that can, and does, come down to how well we’ve prepared staff at the point of care/service to be able to respond to conditions in the environment. But, in order to get there, you have to undertake a collaborative approach, involving your infection preventionist and the folks in the surgical environment.
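To make the point concrete, here’s a minimal sketch of how documented readings might be flagged for staff follow-up at the point of care. The range values below are illustrative assumptions for the sketch, not regulatory limits; substitute the criteria your own facility has adopted.

```python
# Illustrative sketch: flag out-of-range readings in the surgical environment.
# The ranges below are example values only (assumptions for this sketch);
# use the criteria your facility has actually adopted.
EXAMPLE_RANGES = {
    "temperature_F": (68.0, 75.0),
    "relative_humidity_pct": (20.0, 60.0),
}

def flag_out_of_range(readings):
    """Return the subset of readings that fall outside the example ranges."""
    exceptions = {}
    for name, value in readings.items():
        low, high = EXAMPLE_RANGES[name]
        if not (low <= value <= high):
            exceptions[name] = value
    return exceptions

# A documented reading of 18% RH would be flagged for follow-up:
print(flag_out_of_range({"temperature_F": 70.0, "relative_humidity_pct": 18.0}))
```

The point of a structure like this isn’t the code, of course; it’s that staff know, before the reading happens, what counts as out-of-range and what the response is supposed to be.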

The management of risk in the environment doesn’t happen because we have (or don’t have) nifty technology at our disposal; it’s because we can work collaboratively in ways that no building automation system or self-regulating HVAC equipment can. This idea has become an increasingly important part of the survey process. We know that more folks are harmed by hospital-acquired infections and other related conditions and I’ve seen it become a fairly significant survey vulnerability. So, let’s start talking about this stuff with the end users and make sure that we’re ahead of the curve on the matters of the care environment.

Panic in Detroit – Panic at the Disco – Panic at the Surgery Center…Fire in the Hole!

I’m presuming (and please don’t attempt to disabuse me of this notion) that you are all dutifully conducting security risk assessments on a regular basis. As you conduct them, I’m sure you find that the risks of some events are greater than others. So, I have to ask: When you’ve completed your security risk assessment, do you identify specific strategies, including the use of technology, for minimizing those risks to the extent possible? If you’re not including that facet in the risk assessment process, you might want to consider doing so.

Recently, I was looking at a survey report in which an ambulatory surgery center was cited during a TJC survey because they had not installed a panic alarm “at the registrar’s desk in order to obtain immediate assistance in an emergent or hostile situation.” Now, as with so many things that have been popping up during surveys, I don’t disagree with the concept of having panic alarms at those customer service/interaction points where unhappy folks (or folks of any ilk) can experience the need to vent their frustrations. That said, I think I’d first be looking at what tools have been provided to staff to actively manage, if not de-escalate, these negative encounters. I would much prefer to avoid having to use a panic alarm by appropriately managing the encounter, much like I would just as soon not “need” to have an emergency eyewash station.

I’m a great believer in the proactive management of risk, but I’m also a great believer in implementing risk management and response strategies that make operational sense. So, the question to the studio audience is: Where have you installed panic alarms and where have you not installed panic alarms, and why? There’s always the risk that some surveyor will disagree with your strategy, but if that strategy was derived through thoughtful analysis of the involved risks, does that not meet the intent of all this?

I like the concept of best practice as much as anyone, but I also recognize that there is a tremendous amount of variability in the safety landscape. Just because something works in one place does not necessarily mean that it will work in all cases—that’s the mystical, magical, and ultimately mythical power of the panacea. One size doesn’t fit all—never has, never will. But if we’re going to be held to that type of an expectation, how does that help anyone? Ok, jumping down from soapbox for now, but rest assured, you’ll see me back up here before too long.

Shock the monkey (part x + y to the 10th power) – here we go again…

OK, so now it appears that we’re going to have to rethink how we schedule preventative maintenance (PM) activities on our critical equipment, particularly if that criticality affects patient health and safety. I believe that we’ve already chatted a bit about the whole clarification of PM frequencies and where CMS stands on the issue (in case you hadn’t noticed, they’re pretty much standing on your head).

In issuing the clarification (and I will freely admit that I missed this at first), the Feds have decided that, in the matter of critical equipment, the frequency will reflect manufacturer recommendations, AND NOTHING ELSE! Let me repeat that: AND NOTHING ELSE!

For example, PM’ing defibrillators on an annual basis (despite what your experience might indicate) is a big freaking no-no! Isn’t that special? Yeah, I thought so, too.

Maybe this isn’t anything to you folks, but I know of at least one hospital that got cited during a recent survey, and where there’s one, there are usually others (these things almost never happen in isolation). So, if you think you may be taking advantage of logic and common sense approaches to the management of the risks associated with the use of medical equipment, think again (hopefully this won’t shift again, but if history tells us anything, it probably will).
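In scheduling terms, the clarification boils down to this: the next PM date comes straight from the manufacturer-recommended interval, not from your own experience-based frequency. A minimal sketch (the 180-day interval here is a hypothetical example, not a citation of any actual manufacturer’s recommendation):

```python
from datetime import date, timedelta

# Sketch: derive the next PM due date strictly from the manufacturer-
# recommended interval. The 180-day interval is a hypothetical example.
def next_pm_due(last_pm: date, manufacturer_interval_days: int) -> date:
    return last_pm + timedelta(days=manufacturer_interval_days)

# If the manufacturer says every 180 days, an annual schedule overshoots:
due = next_pm_due(date(2012, 1, 15), 180)
print(due)  # 2012-07-13
```

If your CMMS lets you override intervals based on equipment history, this is the moment to find out which of those overrides touch critical equipment.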

Shoo beedoobee – splattered, splattered!

In the never-ending discourse on the subject of emergency eyewash stations, I’d like to take a moment to remind folks that it appears that the TJC surveyors have access to the ANSI Emergency Eyewash and Shower Equipment Standards and they have become very diligent in ferreting out (apologies to the ferrets in the audience – I don’t mean to offend) practices that are not consistent with the “recommendations” contained therein. And so, let me say this:

If your organization has chosen to maintain your emergency eyewash stations on a lesser frequency than weekly, then you had best have conducted a risk assessment to demonstrate that you are ensuring the same level of safety that you would if you were maintaining them on a weekly basis. Water temperature, water pressure, “cleanness” of the flushing liquid, whether access to the equipment is obstructed – these all need to be considered in the mix, because, and I can tell you this with a great deal of certainty, if you are doing these inspections less than weekly and have not conducted the risk assessment to demonstrate that the lesser frequency is appropriate, you will be cited during survey and you will have to move to the weekly program. (Sort of a “you can pay me now or you can pay me later” kind of deal.) But rest assured that eyewash stations are definitely in the mix, so make like a Boy Scout (do I really need to finish that thought? I didn’t think so…)
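For those building or auditing an inspection record, here’s a minimal sketch of the pass criteria just mentioned (temperature, pressure, clarity of the flushing liquid, unobstructed access). The tepid-range numbers are assumptions for illustration; check the edition of the ANSI standard your surveyors are working from.

```python
# Sketch of a weekly eyewash inspection check. The tepid-range values are
# assumed example numbers, not quoted from the standard.
TEPID_RANGE_F = (60.0, 100.0)  # assumed "tepid" flushing-fluid range

def inspect_eyewash(temp_F, pressure_ok, fluid_clear, access_clear):
    """Return a list of findings; an empty list means the unit passed."""
    findings = []
    if not (TEPID_RANGE_F[0] <= temp_F <= TEPID_RANGE_F[1]):
        findings.append("fluid temperature outside tepid range")
    if not pressure_ok:
        findings.append("inadequate flow/pressure")
    if not fluid_clear:
        findings.append("flushing fluid not clear")
    if not access_clear:
        findings.append("access obstructed")
    return findings

print(inspect_eyewash(72.0, True, True, False))
```

Whatever form your checklist takes, the point is the same: the documentation should show that each criterion was actually evaluated, not just that somebody pulled the handle.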

By the way, I’ve also caught wind of the invocation of AAMI standards when it comes to the placement of emergency eyewash equipment in the contaminated section of central sterile. (Also, don’t forget to keep those pressure relationships in check. Clean central sterile should never be negative to dirty central sterile). Now I will freely admit that I am not as conversant with the AAMI standards as I am with, say, OSHA standards, and perhaps even the ANSI standards dealing with this stuff. Again, the rhetorical question becomes: How many rocks do we need to turn over before we can safely determine that there isn’t some funky consensus standard lurking in the weeds that is not in strict concurrence with accepted practice? Why can’t these guys just get along…jeez!

She’s a laughing, giggling whirly bird – oh heli!

Interesting development on the survey front in the last couple of weeks. I’m not at all sure what it means, but I thought I would share it with you all, make of it what you will.

During a recent survey in the Sunshine State, a hospital was cited for not having the “recommended (maximum) rotor circumference signage on the pad nor the other recommended signs that are recommended by the FAA” (signs such as “MRI in use,” etc.). Now we could certainly have a good time parsing the whole “recommended signs that are recommended” phraseology, but I keep coming back to that word “recommended.” How far do we have to go to ensure that we have somehow accounted for every recommendation for every possible risk that we might encounter? Yeah, beats me, too, but in the interest of furthering the applicable knowledge base, let’s step to the web for some edification:

First, I draw your attention to the Advisory Circular issued by the FAA back in 2004. I can’t seem to lay my mitts on anything more contemporary than this, but if you find something more recent, please share.

Now, as we scan the first page of this most comprehensive document, we see a little statement that I think makes the TJC survey citation a little more squishy than I would prefer: “This AC is not mandatory and does not constitute a regulation except when Federal funds are specifically dedicated for heliport construction.” To me, “not mandatory” sounds like a really big case of the “we don’t have to’s” – what do you think?

Turning to Chapter 4, Hospital Heliports (this is on page 95 of the document), I will freely admit that there’s a lot of interesting/cool information. (Did you know that the FAA recommends Portland Cement Concrete for ground level facilities—who knew? Do you have Portland Cement Concrete for your ground level facilities? I certainly hope so). Anyway, the chapter describes a lot of important stuff about hazards and markings, including MRI impact, etc.

I’m going to guess that you’ve been having helicopters fly in and out of your airspace on a regular basis, and in all likelihood, some of you are already up to speed on this.  For those of you for whom this might be virgin territory, my advice would be to consult with the folks actually doing the flying and find out what they want to see with your helicopter setup. This would constitute what I euphemistically refer to as a risk assessment. You may have encountered that term once or twice here in Mac’s Safety Space. I can’t imagine that you’d not have heard by now if the pilots using your pad have issues. (I’ve never found them to be shy about safety—nor should they be.) Still, it’s never a bad idea to reach out periodically to make sure that everything is both hunky and dory; it’s really the least we can do.

Update: Link correction for CMS memorandum on LSC

I have been alerted that the link below did not work. I have corrected that link, but I’ll provide it here too:

Click here to directly access the CMS memorandum regarding the changes to the Life Safety Code®.

(Ref: S&C-12-21-LSC)

There’s a light, a certain kind of light – and it’s not an oncoming train!

This one has the potential to be the game-changer we’ve been hoping (waiting) for – the emergence of the 2012 edition of the Life Safety Code® as a CMS-sanctioned regulatory standard.

Once you lay your hands on this plucky little document – the official CMS memorandum – you will see that it appears to represent a fair degree of flexibility when it comes to, among other things, corridor storage and the amount of combustible decorations that are allowed. One thing this likely means is that everyone’s going to be inundating NFPA with requests for their own personal copy of the 2012 Life Safety Code® – this is going to become a go-to resource from here on out.

Now, the first thing you will notice is that there’s a lot of mention of nursing homes, and not so much of hospitals, particularly on Page 1. To that end, let me direct you toward the bottom of page 2 of the document (under the section titled “Effective Date”), which specifically indicates that the memorandum and all its components are “in effect for all applicable healthcare facilities such as Hospitals and Nursing Homes.”

The other caveat, at least for the moment, is that it appears that the changes are only “accessible” through the CMS waiver request process, which will, in turn, result in a process in which “each waiver request will have to be evaluated separately in the interest of fire safety and to ensure that the facility has followed all LSC requirements and the equipment has been installed properly by the facility.” I’m not entirely certain whether this would drive anything more than a review of the waiver request, but I’m not entirely certain how they’d be able to ensure compliance with LSC requirements, etc., without eyeballing a facility. That said, there’s a whole heck of a lot of hospitals that would be pursuing this, so maybe there’s a process in place, maybe based on past TJC/DNV/HFAP and/or CMS survey results.

So, what it looks like we have here is some room for stuff in the corridors, including fixed furniture; and the presence of combustible decorations on “walls, doors and ceilings.”

That’s enough yapping from me for the moment; I encourage you to check out the document and let us know what you think. I think it’s very interesting.

And your bird can sing…

One of the topics that resurfaces every once in a while concerns those most critical documents – your life safety drawings – and what should be contained therein. If you are still uncertain about what those suckers oughta look like, I would direct your attentions to the February 2012 edition of The Joint Commission’s EC News in the “Asked and Answered” section. The laundry list of items to be included on your life safety drawings is not particularly surprising to those among us who have been advocating for a certain contingent of information. So, if you were going to air out your “dirty” life safety drawings, some items for consideration might include:

  • a legend that clearly identifies fire safety features of your building
  • identification of those areas of the building that are fully sprinkled (if your building is partially sprinkled – no need for such detail if you’re fully sprinkled)
  • the location of all of your hazardous storage areas (if you’re not sure what that entails, check out EP #2 under LS.02.01.30 and/or NFPA 101-2000: 18/)
  • the locations of all your rated barriers (yes, all of them – don’t leave any out)
  • locations of all your smoke barriers
  • the boundaries of any areas that have been designated as suites – and don’t forget to include the square footage of the suites – both sleeping (maximum 5,000 square feet) and non-sleeping (maximum 10,000 square feet)
  • locations of your smoke compartments
  • the locations of any chutes and/or shafts (as opposed to chutes and ladders – that’s kids’ stuff)
  • any approved waivers or equivalencies.
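Since the suite square footages are one of the few items on that list with hard numbers attached, here’s a quick sketch of checking designated suites against the maximums noted above (5,000 square feet for sleeping suites, 10,000 for non-sleeping). The function and dictionary names are mine, purely for illustration.

```python
# Sketch: check designated suite sizes against the maximums noted in the
# list above (5,000 sq ft sleeping, 10,000 sq ft non-sleeping).
SUITE_MAX_SQFT = {"sleeping": 5000, "non-sleeping": 10000}

def suite_within_limit(suite_type: str, square_feet: float) -> bool:
    """True if the designated suite is at or under its maximum area."""
    return square_feet <= SUITE_MAX_SQFT[suite_type]

print(suite_within_limit("sleeping", 4800))      # True
print(suite_within_limit("non-sleeping", 12000)) # False
```

If the square footage isn’t already on the drawing, a surveyor can’t tell whether the suite is compliant – which is exactly why the number belongs there.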

A quick word about waivers and equivalencies: It’s always nice to share those ahead of time with your various “Authorities Having Jurisdiction.” A proactive approach to communications, as with most proactive approaches, will yield much goodwill. This whole thing works best as a collaborative process: no surprises on either side, and you’ve got yourself a pretty good survey experience.

AHAP Conference opportunities

The Association for Healthcare Accreditation Professionals (AHAP) is hosting its 6th Annual Conference May 10, 2012 – May 11, 2012 in Orlando, FL. It offers so many amazing opportunities to save money, get expert advice, and show off your hospital a bit. I should also note that if you’re one of the first 50 paid registrants you’ll receive a free full-day ticket to any Walt Disney World® Theme Park*! Download the online brochure to learn more.

So what is it and why am I talking about it? The 6th Annual AHAP Conference brings together survey professionals from across the country to discuss solutions and best practices to achieve continual survey readiness and compliance with ever-changing standards and regulations.

What are the opportunities?

  • Accreditation Specialist Boot Camp
  • Presentation of the first annual Accreditation Professional of the Year award
  • Unique roundtable discussion with representatives from HFAP, DNV, the American Heart Association, and The Joint Commission
  • Exciting new poster event featuring research and best practices from your peers. Find out how to submit a poster and save 50% on your registration.
  • Learn about:
    • Regulatory changes in 2012 and top RFIs: Staying ahead of The Joint Commission and CMS
    • What accreditation professionals need to know about Life Safety Code®
    • To certify or not to certify? Seeking The Joint Commission disease-specific certifications
    • Making the switch from The Joint Commission to DNV: One hospital’s experience with both surveys
    • Understanding tracer methodology and the survey process
    • A practical approach to policy management
    • Suicide Risk: Solutions to rapid assessment, Environment of Care, and documentation issues
    • Understanding hospital recognition programs for optimal cardiovascular and stroke care

Learn more.

*Offer ends March 8th.

Things can only get better…

As we sprint rapidly toward March and beyond (it seems like it was just January!), I suspect that folks are wrapping up their annual evaluations of the objectives, scope, performance, and effectiveness of their environment of care programs. One of the “open” questions I like to ask during EC interview sessions is, in looking back over the last 12 months or so, how did you manage to improve the management of risk in the physical environment? Or, in the vernacular: What got better over the last year?

This question usually engenders a fair amount of discussion, depending on the group (try to avoid having too many wallflowers in your real EC interview – it’s all about group participation). But the question that I don’t always have the time to ask is: Based on the initiatives just described, how do you know that they actually represented improvement? Do you look at specific metrics/benchmarks/performance measures to make that determination? Is it based on impression? If you were tasked with really having to prove the value of the EC program’s efforts, how would you do that?

The other side of that coin: What about initiatives that didn’t work? Realistically, nobody succeeds with everything that might be implemented over the course of time. Sometimes organizational culture is not ready for certain types of changes; sometimes other factors come into play. So, do you have any home runs or ugly strikeouts to share? Minimally, it will serve as a means of identifying (yet again) that the similarities in our various practices generally outweigh the disparities; we are, after all, in this together. Looking forward to hearing how folks are doing! So, in a nutshell:

What got better – how do you know?

What didn’t get better – do you know why?