
24 + 24 + 24 + 24 = ?

In visiting with hospitals across the country since the unveiling of the new emergency management standards, there’s been an increasing murmur relative to the presence of a certain temporal indicator that you can find under EC.4.12, EP #6.

96 hours. Four days. 5,760 minutes.

That time span brings with it some questions:

  • Is this a long time to be without the support of the local community?
  • Does it vary within the six critical areas of EC.4.13 through EC.4.18 (communications, resource and asset management, safety and security, management of staff, management of utilities, and management of clinical and support activities)?
  • Does it mean I need to have four days’ worth of stuff in my warehouse?
  • What if I don’t have a warehouse?
  • How prepared is prepared?

These are all excellent questions for which your organization is going to have to identify answers. For good or ill, there are no correct responses for these questions, and the practical applications are going to vary from organization to organization.

What’s important to remember is that this particular EP is not telling you that you have to do one thing or another (like having 96 hours worth of stuff in your warehouse). What is required is that you have a sense of what would happen if you were cut off from support for those 96 hours. Some organizations might be able to do 96 hours on their own with very little difficulty, while others might struggle to get to 48 or even 24 hours (probably not many in that group, but it is possible). The ultimate questions are: How far can you go? And what do you do when you’ve gone as far as you can go?

One of the clear lessons learned in the aftermath of Hurricane Katrina is that holding on past the point of reason is, well, not a reasonable strategy. But prior to recent tragedies, it’s almost as if the “defend-in-place” strategy of life safety management was carried across to the annals of emergency response. Right or wrong, getting out appeared to be entertained very infrequently in our response plans.

Now we know that in order to even approximate the safe management of a catastrophic event, we must consider the inconsiderable, think the unthinkable, try to gain some measure of control over situations that are, for all intents and purposes, uncontrollable.

What would we do if faced with an event of such magnitude? How far can we go? How do we tell when we’ve crossed that threshold? All questions to answer, and soon.

Use your full compliance data when clarifying RFIs

In my last post I talked about the advantages of using The Joint Commission’s clarification process when you receive an RFI.

Just to give you an example of how this could manifest itself, say you have a couple of off-site clinics that are visited during a hospital survey, and surveyors find 10 fire extinguishers that missed a monthly inspection. (By the way, survey verdicts are frequently based on sample size: for all intents and purposes, if surveyors find three or more noncompliant conditions within an EP or standard, they have to score it as a “0,” or noncompliant.)

It appears the surveyors have the hospital in a bad situation. As they might say in the U.K., “It’s a fair cop!” But, if you look at that EP (EC.5.40, EP #12), why, it’s a “C” score. Can we do something here? You betcha!

We know how many extinguishers we have in our inventory: 75 at the main hospital and another 10 across the off-sites, for a total of 85 devices. Ah, but we still don’t make the mark: 10 noncompliant devices out of 85 is a compliance rate of 88.2%, which may be enough to downgrade to a supplemental citation, but not enough for outright removal of an RFI.

But is that really the case? I’m thinking perhaps not.

As you know, compliance is generally measured as a function of 12-month periods. If you take your fire extinguisher program to the 12-month parameter, you end up with a total number of monthly inspections as a function of the number of extinguishers (85) and the number of inspections per year per extinguisher (12), which yields 1,020 activities.

With that number in mind, if you only missed 10 inspections, then you still have a compliance rate of better than 99%. No RFI for you, my friend. Your compliance data is most useful in the clarification process.
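The arithmetic above can be sketched in a few lines (a minimal illustration using the example’s numbers; nothing here is an official Joint Commission formula):

```python
# Two ways to compute the compliance rate, using the example's numbers.
extinguishers = 85           # 75 at the main hospital + 10 at the off-sites
missed = 10                  # monthly inspections missed

# Device-level view: 10 noncompliant devices out of 85
device_rate = (extinguishers - missed) / extinguishers
print(f"device-level: {device_rate:.1%}")      # 88.2%

# Activity-level view: 12 monthly inspections per device over 12 months
activities = extinguishers * 12                # 1,020 inspection activities
activity_rate = (activities - missed) / activities
print(f"activity-level: {activity_rate:.1%}")  # 99.0%
```

Same 10 misses, very different percentages; the activity-level framing is what carries the clarification.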

One further note about auditing: You need to base the timeframe for review on the date of your survey. The only sure way to vacate an RFI is to demonstrate that you were in compliance during the survey.

Now you may say my extinguisher anecdote is an extreme example, but I can assure you that this solution absolutely works in real life (this falls under the “been there, done that” category).

But what if my hospital isn’t in any real accreditation jeopardy? What’s the point of chasing these rainbows? Well, turning once more to Joint Commission official Darlene Christianson in her remarks at the Executive Briefings conference, we learn that, beginning in 2008, the number of RFIs you receive during your triennial survey can influence how soon (or not) you can expect your next survey.

Remember, the survey window will be opening to any time from 18-39 months after your most recent survey-a mighty big window, yes? I’d rather see the 39-month-sized window.

And perhaps most importantly, do you really want to submit an “evidence of standards compliance” response and be responsible for fixing a process that wasn’t broken in the first place? Don’t you have enough to do fixing things that ought to be fixed? I thought so…

Butter isn’t the only thing you can clarify…

Okay, Elvis has left the building (picture a Joint Commission surveyor with sideburns, sunglasses, and a big ol’ belt buckle; bell-bottoms optional) and you’re the proud owner of a handful of EC RFIs.

You tried, tried, tried to negotiate a favorable result during the survey, but a couple of things would not go away. Adding insult to injury, your organization is looking at an adverse decision of conditional accreditation (CON) or even preliminary denial of accreditation (PDA) and the rest of the team is giving you the hairy eyeball because you were supposed to have everything under control-right?!?

Well, the departure of the survey team does not signal the end of things. And, importantly, if your organization is facing an adverse accreditation decision, you need to start digging yourself out of that CON or PDA hole.

The best tool for the job? The clarification process.

But don’t spend too much time moping around because the clock is ticking.

Organizations faced with an adverse decision have a mere 10 business days from your final report to submit clarifications for findings during survey. You need to carefully analyze each finding to identify any RFIs for which clarifying evidence will reverse one or more citations. But which one (or ones) do you choose?

At a recent Joint Commission Executive Briefings conference, Darlene Christianson, executive director of accreditation and certification services at the Joint, urged the audience at a minimum to perform a post-survey audit on all RFIs involving C elements of performance and submit the results as a clarification if the audit demonstrates 90% compliance or better.

What’s that? A score of 90% is all I need? Yes, my friends, you heard it right. And, to sweeten the clarification pot even more, an audit compliance percentage of 80%-89% can, in certain cases, result in a reduction from an RFI to a supplemental recommendation.
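As a rough illustration, those thresholds could be triaged like this (a hypothetical helper only; the cut points are the ones cited above, but the actual decision rests with The Joint Commission, not a formula):

```python
def clarification_outcome(compliant: int, total: int) -> str:
    """Rough triage of a post-survey audit result against the
    thresholds cited above. Illustrative only -- the real call
    belongs to The Joint Commission, not to a function."""
    rate = compliant / total
    if rate >= 0.90:
        return "submit clarification: RFI may be removed"
    if rate >= 0.80:
        return "submit clarification: possible supplemental recommendation"
    return "clarification unlikely to succeed on this data"

# The fire extinguisher example (1,010 of 1,020 inspections done)
print(clarification_outcome(1010, 1020))
```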

So I ask-what are you waiting for? More on this in my next posting . . .

Steve Mac.

EPA issues a ruling on epinephrine salts

Hey everyone, it’s Scott Wallask checking in with an interesting note from the EPA. It appears epinephrine salts are officially off the list of P-listed wastes that the agency regulates.

Click here to read the decision.

Sounds like that’s a welcome announcement.

The data that drives your BMP during a survey

There are a couple of important points about The Joint Commission’s building maintenance program (BMP) that somehow seem to get lost in the shuffle:

  • The critical role of data in this process
  • The practical application of the BMP during a survey

The decisions you make regarding inspection frequencies and related activities should be validated by the data you’ve collected. Remember, though, that Joint Commission surveyors render their findings on the data collected during survey, and that sample is significantly smaller than the one validating your program (your entire inventory of a device versus the number viewed during survey).


As a result, you must be prepared to demonstrate the compliance of your program as a function of the post-survey clarification process. A rated door that doesn’t latch here, an exit sign that is not illuminated there, a couple of penetrations somewhere else-you can get into RFI territory very quickly.


My consultative advice for starting this process is the following:

  1. Pick whatever BMP elements you’re going to manage in this fashion (the current list of items you may include can be found on Page 3-15 of the Statement of Conditions)
  2. Identify an inventory of the devices in each category (that’s really the only way to be able to demonstrate that you have a 95% compliance rate for that device)
  3. Determine what frequency you can attain given current resources, though I would counsel a starting frequency of no less than quarterly

Again, there are elements that are not going to require as much attention, but you need to make that decision based on the failure data collected during BMP activities. That can be another challenge: getting the folks doing the inspections to tell you when they found something that wasn’t working properly. Frequently they will just do the repair work and move on without documenting, but the key data is knowing how often the device is not working correctly.
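To make the inventory-and-frequency idea concrete, here is a minimal sketch of per-category compliance tracking against the 95% target; the record format and category names are invented for illustration:

```python
from collections import defaultdict

# Hypothetical inspection log: (BMP category, passed?) per device checked.
inspections = [
    ("fire doors", True), ("fire doors", False), ("fire doors", True),
    ("exit signs", True), ("exit signs", True),
]

passed = defaultdict(int)
total = defaultdict(int)
for category, ok in inspections:
    total[category] += 1
    passed[category] += ok  # True counts as 1, False as 0

# Flag any category falling below the SOC's 95% threshold.
for category in total:
    rate = passed[category] / total[category]
    flag = "OK" if rate >= 0.95 else "raise inspection frequency"
    print(f"{category}: {rate:.0%} -> {flag}")
```

The point is simply that you can’t compute that percentage, or defend it in a clarification, without both an inventory count and the failure records your inspectors are tempted not to write down.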


For some additional information, almost akin to a glimpse behind the velvet curtain, you can access the technical paper about the BMP that George Mills (senior engineer at The Joint Commission) and some other folks penned for the American Society for Healthcare Engineering some time ago.


While the paper is somewhat outdated in terms of specific compliance concerns (for example, it references the 1997 Life Safety Code), it is still well worth a read.

Fit to be (tied and) tested

I’m sure many of you are watching, with various degrees of trepidation, the pending federal budget that, among other things, will once again let loose the hounds of the Occupational Safety and Health Administration in pursuit of fresh fines. I’m talking about Congress letting OSHA enforce annual tuberculosis fit-testing for respirators.

We could probably spend a good long time (and mayhap one day we will) discussing the efficacy of applying the respiratory protection standard (29 CFR 1910.134) to managing occupational exposures to TB, or indeed whether the failure to develop a TB-specific standard for healthcare workers was a significant shortcoming. That said, it appears that enforcement of annual TB fit-testing is going to become a way of life for hospitals.

Hopefully-and you definitely want to do a little assessment here to make sure-you have your new hire process under control from a fit-testing perspective (though I do know of more than a few organizations that are a little soft in this area). Clearly starting at the front end of the process is the way to establish a solid foundation for your program.

Ideally, you will be able to use the practical experience from the new hire process to identify an appropriate level of resources for expanding the respiratory protection program to include annual TB fit-testing and all its component pieces (medical evaluations, pulmonary function tests, and the like).

I’m guessing that there aren’t many of you out there with sufficient existing resources to carry this off (if you do-good for you!). It is more than likely that in the near future, you will have to submit some sort of business plan to your organization’s leaders in order to obtain those additional resources, including a fairly well-detailed accounting of the process (this is likely going to be a shared responsibility within the organization, but, make no mistake, this is the organization’s responsibility).

My best advice would be to get a group together, flowchart the process, determine a per-unit expense, and get that request to your organization’s leaders before the compliance canines beset your house.

Things that go BMP in the night

I’m seeing an interesting phenomenon relating to the life safety surveys, the building maintenance program (BMP) as outlined in the SOC, and how the two (sort of) co-exist during surveys.

I know a lot of folks are really working towards a point where they can take advantage, so to speak, of the BMP. That said, I’m not so sure that the BMP is something to be taken advantage of, at least in the classic sense-though an advantage can clearly be gained by adopting this most practical of strategies for managing certain specific elements of your life safety equipment and building features.

The issue with the BMP is that, in and of itself, there is not a great deal of guidance on how one is to set it up. Ideally, the goal of the program is to ensure that the listed items you’ve chosen to include are 95% compliant at any given point in time.

Ultimately, the frequencies with which you’d be checking will be dictated by the performance data you collect during your inspection activities. That can mean there are certain elements that will need to be inspected at greater frequencies than others.

As an example, a client of mine utilizes rolling fire doors to isolate the elevator lobbies. However, given their proximity to the elevators and the very nature of a lot of the traffic using the area (food carts, linen carts, storeroom carts, etc.), these doors receive a more than equitable share of abuse.

Consequently, these doors experience a much greater rate of failure to close and latch than other rated doors in the organization. To manage such a condition using a BMP, it is likely that a greater inspection frequency would need to be employed than, say, fire doors leading into stairways.

As another example, there might be fire doors adjacent to areas like the kitchen, the storeroom, or environmental services that get banged around more and would probably need to be inspected more frequently.

In conversation with George Mills, The Joint Commission’s senior engineer, he described it thusly: You may have X number of fire doors in your facility and 90% of those doors may work correctly every time, but that other 10% of your door inventory is where you need to be more attentive.

You might need to inspect the 10% on a quarterly, monthly, weekly, or even daily frequency depending on what the data tells you. And you might be able to do the remaining 90% on a semiannual or even annual basis (I don’t think a frequency of less often than once a year could ever be considered diligent).
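Mills’ point can be sketched as a simple data-driven rule; the specific cut points below are invented for illustration, not drawn from any standard:

```python
def suggested_frequency(failures: int, inspections: int) -> str:
    """Map an observed failure rate to an inspection frequency.
    The cut points are hypothetical examples; only the annual
    floor reflects the discussion above."""
    rate = failures / inspections
    if rate >= 0.10:
        return "monthly or more often"
    if rate >= 0.05:
        return "quarterly"
    if rate > 0.0:
        return "semiannual"
    return "annual (the floor -- never less often than yearly)"
```

A set of battered elevator-lobby rolling doors with a high failure rate would land in the monthly bucket, while stairway fire doors that almost never fail could drop toward semiannual or annual.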

OSHA isn’t checking for annual TB fit-testing yet

Hi everyone —

It’s Scott Wallask up at HCPro. Just an FYI, an OSHA spokesperson confirmed for me today that the agency has not started inspecting for annual fit-testing for tuberculosis (TB), despite what you might have read elsewhere.

OSHA, like all of us, is awaiting final approval of the funding budget for fiscal year 2008. When that happens, it is almost certain that the annual TB fit-testing enforcement will be in effect.

Since 2004, Congress has prohibited OSHA from using budget funds to enforce annual fit-testing provisions for TB, which falls under the respiratory protection standard. But politics and that fellow who snuck back into the country with TB in May shifted the landscape.

Scott W.

A shift in the thinking behind closing hospitals during a disaster

Within The Joint Commission’s revised emergency management standards, an important consideration (and this is clearly derived from the Gulf Coast experience) is to know when your organization can no longer safely sustain patient care and thus must take steps to cease operations, either partially or completely.

This may involve relocation of your operations, the migration of your patients to another facility, or even a mix of the two. Every circumstance has a tipping point and the new defining preparedness characteristic for hospitals is a level of self-awareness that can recognize and act upon that point.

In the past, I think that there was a tacit understanding on the part of everyone involved (hospitals, regulators, communities, etc.) that hospitals would not close, or more to the point, could not close. We need look no further than the legal imbroglios regarding the disposition of patients in the aftermath of Katrina to see that, as an industry, a critical part of our continuity plans is to know when continuation is not possible and, I daresay, could be considered dangerous.

With luck, we will never have to face such circumstances again, but I don’t think the odds are in our favor.

How the revised emergency management standards tie into federal rules

If anything, the Joint Commission’s updated emergency management standards represent a much clearer picture of what might be considered best practices for the structure of your emergency operations plan (which used to be called your disaster or emergency response plan in the standards). The revisions take effect January 1.

Clearly, in this (still) post-9/11 world, the hierarchy of regulatory oversight continues to have the requirements of the federal government at its apex. If your organization has any hopes of funding additional improvements to your preparedness activities, adoption of a response structure that is compliant with the National Incident Management System (NIMS) must be your primary goal. Fortunately, the following six critical areas identified in the new EC.4.13 through EC.4.18 are readily “folded” into NIMS-compliant structures:

  • Communications
  • Resources and assets
  • Safety and security
  • Staff responsibilities
  • Utilities management
  • Patient clinical and support activities

That said, there’s really very little in the way of surprises in the new standards. When the Joint Commission updated the elements of performance under EC.4.20 (the standard requiring disaster drills) last year, several of the above-bulleted critical areas were identified succinctly (communications, resource mobilization, and patient care activities). The remaining newbies primarily resulted from post-Katrina reviews of hospital response in New Orleans and the rest of the Gulf Coast.

The expectation of The Joint Commission is that if your organization is able to get and keep its act together relative to those six areas, then you should be able to manage events of every stripe and magnitude.