All Entries in the "Life Safety Code" Category

AAMI: What you gonna do?

By necessity, this is going to be somewhat of a hodgepodge of stuff, as there are no new postings to The Joint Commission’s PEP site. Looks like we’ll have to wait for July to see what they have to say about the intricacies of LS.02.01.30, though I suspect that we can anticipate some coverage of hazardous locations. (Can’t they come up with a better descriptor? While I absolutely understand that the requirements are driven by the point at which the level of stored combustibles in a space is considered “hazardous,” I find that it can be rather challenging to communicate to the clinical folks why they shouldn’t randomly convert patient rooms to storage rooms. The refrain is typically “We don’t have anything hazardous in the room, we just have paper supplies, etc.,” to which I respond, “Well, you’ve got an amount of paper supplies, etc., that is considered hazardous by the Authority Having Jurisdiction,” to which I tend to receive a blank look and an “oh.”) Told you this was going to be a hodgepodge…

Getting back to our PEPpy discussion; I think we’ll find out a bit about smoke barriers, corridor walls, corridor doors, etc. But then again, with the adoption of the 2012 Life Safety Code®, the scheduled PEP modules may be deferred and perhaps they’ll post something relating to whatever changes will be occurring to the Life Safety chapter of the accreditation manual. I think that’s enough blathering about something that didn’t happen, so let’s talk about something that did happen.

At this point, you (or your organization—why I would think that you would have been involved in this is beyond me…) should be well-ensconced in the practical applications of the National Patient Safety Goal regarding the management of risk associated with clinical alarm systems as a function of the Joint Commission requirements thereof:

  • Leadership establishment of alarm system safety as an organizational priority
  • Identification of the most important alarm signals to manage
  • Establishment of policies and procedures for managing those most important alarm signals
  • Education of staff and licensed independent practitioners about the purpose and proper operation of alarms systems for which they are responsible

Hopefully you’ve got that all going in the right direction. But if you’re falling a little short or if, indeed, you’re not sure about progress, the good folks at the Association for the Advancement of Medical Instrumentation (AAMI) have a webpage devoted to information regarding clinical alarms. Those resources include a downloadable Clinical Alarm Management Compendium (maybe CLAMCOMP would be a good acronym) to assist you in these endeavors. If you haven’t yet checked out the AAMI materials, I would very much encourage you to do so (or encourage you to encourage whoever is managing this process in your organization), because reference to this webpage is included in the rationale statement for this particular safety goal, and the stuff that ends up in the accreditation manual has a tendency to take on an elevated criticality in the hands of some surveyors. We could certainly talk about what is and what isn’t “surveyable” as a function of its appearance in any of the manuals, but if you haven’t referenced the AAMI stuff, you are in a position of having to defend why you haven’t used such an august reference in your pursuit of alarm safety. To add a bit of heft to this advice, you might also consider the May 18, 2016 edition of Joint Commission Online, in which there is a link to an article published by the AAMI Foundation entitled “Framework for Alarm Management Maturity.”

This may be much ado about nothing, or more likely, folks are already “down” with this. But on the off chance that somebody out there actually uses this space for information and hasn’t run across this, I think it’s worth a mention. You certainly want to make sure that your alarm management process is maturing in proper fashion.

Everybody here comes from somewhere else…

First off, just wanted to wrap up on the missives coming forth from our compadres at The Joint Commission and ASHE relative to the adoption of the 2012 Life Safety Code® (LSC) by CMS. The word on the street would seem to be rather more positive than not, which is generally a good thing. Check out the statements from TJC and ASHE; also, it is useful to note that the ASHE page includes links to additional materials, including a comparison of the 2000 and 2012 editions of the LSC, so worth checking out.

At this point, it’s tough to say how much fodder there will be for future fireside chats. It does appear that the adoption of the 2012 edition, while making things somewhat simpler in terms of the practical designation of sleeping and non-sleeping suites (Don’t you wish they had “bumped” up the allowable square footage of the non-sleeping suites? Wouldn’t that have been nice?), combustible decorations, and some of the other areas covered by the previously issued CMS categorical waivers (If you need a refresher, these should do you pretty well: ASHE waiver chart and Joint Commission), isn’t necessarily going to result in a significant change in the numbers and types of findings being generated during Joint Commission surveys. From my careful observation of all the data I can lay hands on, the stuff that they’re finding is still going to be the stuff that they are likely to continue to find, as these are the “deficiencies” most likely to occur (going back to the “no perfect buildings” concept—a lovely goal, but pretty much as unattainable as Neverland). I’m not entirely certain what will have to occur to actually dislodge EC/LS concerns from their predominance on the Top 10 list; it’s the stuff I can pretty much always find (and folks usually know when I’m coming, so I’ve pretty much lost the element of surprise on the consulting trail). Now, it may be that the new matrix scoring methodology will reduce the amount of trouble you can get into as the result of existing deficiencies—that’s the piece of this whole thing that interests me the most—but I see no reason to think that those vulnerabilities will somehow eradicate themselves. I suppose there is an analogy to the annual review of our hazard vulnerability analysis (HVA): the vulnerabilities will always exist; what changes (or should change) is our preparedness to appropriately manage those vulnerabilities.
Makes me wonder if it would be worth doing an EC/LS HVA kind of thing—perhaps some sage individual has already tackled that—sing out if you have. At any rate, I’ll be keeping a close eye on developments and will share anything I encounter, so please stay tuned.

Hopping over to the bully pulpit for a moment, I just want to rant a bit on what I think should be on the endangered species list—that most uncommon of beasties—the kind and decent person. I know that everyone is nice to folks they know (more or less), but there seems to be a run on a certain indifference to politeness, etc., that, to be honest, makes me see a little red from time to time. But then I think to myself that it is probably just as rude to overreact to someone else’s rudeness, so take some deep cleansing breaths and let it go. Now I would love to hear from folks that they haven’t noticed this shift and that their encounters with folks are graced with tolerance, kindness, etc.; it would do my heart good. Maybe it’s just me…but somehow I’m thinking maybe not.

Please enjoy your week responsibly and we’ll see what mischief we can get into next week.

You better start swimming or you’ll sink like a stone…

In their pursuit of continuing relevance in an ever-changing regulatory landscape, The Joint Commission announced what appears to be a fairly significant change in the survey reporting process. At first blush, it appears that this change is going to make the post-survey process a little simpler, recognizing that simplification of process sometimes ends up not being quite so simple. But as always, I will choose to remain optimistic until proven otherwise.

So the changes in the process as outlined in this week’s missive shake out into three categories: scoring methodology, post-survey follow-up activities, and submission time frames for Evidence of Standards Compliance (ESC). And I have to say that the changes are not only interesting, but appear to represent something of a shift in the framework for administering surveys. Relative to the scoring methodology, it appears that the intent is to do away with the “A” and “C” categories, as well as the designation of whether the performance element is a direct or indirect impact finding. The new process will revolve around a determination of whether a deficient practice or condition is likely to cause harm and, more or less, how frequently the deficient practice or condition is observed. As with so many things in the regulatory realm, this new methodology reduces to a kicky new acronym: SAFER (Survey Analysis For Evaluating Risk) and comes complete with a matrix upon which each deficiency will be placed. You can see the matrix in all its glory through the link above, but it’s basically a 3 x 3 grid with an x-axis of scope (frequency with which the deficiency was observed) and a y-axis of likelihood to result in harm. This new format should make for an interesting looking survey report, to say the least.
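Purely as an illustration of the grid described above (TJC publishes the matrix itself, not an algorithm, so the axis labels and the function here are hypothetical), placing a finding on a 3 x 3 scope/likelihood grid might be sketched like this:

```python
# Hypothetical sketch of a SAFER-style 3 x 3 placement grid.
# The axis values are assumptions for illustration only; TJC defines
# the actual scope and likelihood-of-harm criteria.
SCOPE = ["limited", "pattern", "widespread"]      # x-axis: how often observed
LIKELIHOOD = ["low", "moderate", "high"]          # y-axis: likelihood of harm

def place_finding(scope: str, likelihood: str) -> tuple:
    """Return (column, row) coordinates for a deficiency on the matrix."""
    return (SCOPE.index(scope), LIKELIHOOD.index(likelihood))

# A rarely observed, low-harm deficiency lands in the lower-left cell:
assert place_finding("limited", "low") == (0, 0)
# A widespread, high-harm deficiency lands in the upper-right cell:
assert place_finding("widespread", "high") == (2, 2)
```

The point of the grid, as I read it, is simply that the same deficiency gets weighted very differently depending on which of the nine cells it lands in.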

Relative to the post-survey follow-up activities, it appears that the section of the survey report (Opportunities for Improvement) that enumerates those single instances of non-compliance for “C” Elements of Performance will “no longer exist” (which makes sense if they are doing away with the “C” Element of Performance concept). While it is not explicitly noted, I’m going to go out on a limb here and guess that this means that the deficiencies formerly known as Opportunities for Improvement will be “reported” as Requirements for Improvement (or whatever RFIs become in the SAFER model), so we may be looking at having to respond to any and all deficiencies that are identified during the course of the survey. To take that thought a wee bit further, I’m thinking that this might also alter the process for clarifying findings post-survey. I don’t imagine for a moment that this is the last missive that TJC will issue on this topic, so I guess we’ll have to wait and see how things unfold.

As far as the ESC submission timeframes, with the departure of the direct and indirect designations for findings comes a “one size fits all” due date of 60 days (I’m glad it wasn’t a “45 days fits all” timeframe), so that makes things a little less complicated. But there is a notation that information regarding the sustainment of corrective actions will be required depending on where the findings fall on the matrix, which presumably means that deficiencies clustered in the lower left corner of the matrix (low probability of harm, infrequent occurrence) will drive a simple correction while findings in the upper right corner of the matrix will require a little more forethought and planning in regard to corrective actions.

The rollout timeframe outlined in the article indicates that psychiatric hospitals that use TJC for deemed status accreditation will start seeing the new format beginning June 6 (one month from this writing) and everyone else will start seeing the matrix in their accreditation survey reports starting in January 2017. I’m really curious to see how this is all going to pan out in relation to the survey of the physical environment. Based on past practices, I suspect that (for the most part) the deficiencies identified in the EC/EM/LS part of the survey process would mostly reside in that lower left corner, but I suppose this may result in a focus on fewer specific elements (say, penetrations) and a more concerted approach to finding those types of deficiencies. But with the adoption of the 2012 Life Safety Code®, I guess this gives us something new to wait for…

News flash: Vacuum cleaner sucks up budgie! Is you is or is you ain’t my baby?

As we continue our crawl (albeit an accelerated one) towards CMS adoption of the 2012 edition of NFPA 101 Life Safety Code® (LSC), we come face to face with what may very well be the final step (or in this case, leap) in the compliance walkway. While there is some language contained in the final rule (and in the press release) that I feel is a little contradictory (but after all, it is the feds), the summary section of the final rule does indeed indicate that “(f)urther, this final rule will adopt the 2012 edition of the Life Safety Code (LSC) and eliminate references in our regulations to all earlier editions of the Life Safety Code. It will also adopt the 2012 edition of the Health Care Facilities Code, with some exceptions.” I suspect that there will be multiple machinations in the wake of this, but it does appear that (cue the white smoke) we have a new pope, er, Life Safety Code®! You can find all 130+ pages here.

Interestingly enough, the information release focuses on some of the previously issued categorical waivers seemingly aimed at increasing the “homeyness” (as opposed to homeliness) of healthcare facilities (primarily long-term care facilities) to aid in promoting a more healing environment. It also highlights a couple of elements that would seem to lean towards a continuation of the piecemeal approach used to get us to this point, so (and again, it’s the feds), it’s not quite framed as the earth-shattering announcement that it appears to be:

  • Healthcare facilities located in buildings that are taller than 75 feet are required to install automatic sprinkler systems within 12 years after the rule’s effective date. So, the clock is ticking for you folks in unsprinklered tall buildings.
  • Healthcare facilities are required to have a fire watch or building evacuation if their sprinkler system is out of service for more than 10 hours. So, a little more flexibility on the ILSM side of things, though that building evacuation element seems a little funky (not necessarily in a bad way).
  • For ambulatory surgery centers (ASC), all doors to hazardous areas must be self-closing or must close automatically. To be honest, I always considered the requirements of NFPA 101-2000:8.4.1.1 to be applicable regardless of occupancy classification, but hey, I guess it’s all in the eye of the beholder.
  • Also, for ASCs, you can have alcohol-based hand rub dispensers in the corridors. Woo hoo!

I guess it will be interesting to see what happens in the wake of this final rule. I guess this means we’ll have to find something else upon which to fret…

As a related aside, if you folks don’t currently subscribe to CMS News, you can sign up for e-mail updates by going to the CMS homepage and scrolling down to the bottom of the page. I will tell you that there’s a lot of stuff that is issued, pretty much on a daily basis, much of it not particularly germane to the safety community, but every once in a while…

It’s a new dawn, it’s a new day, it’s a new life for you. What do you plan on doing now?

What is it they say about the best-laid plans? A chortle-free, portal-free zone!

Well, I don’t know that I’m disappointed, per se, but I was expecting The Joint Commission to add something new to its physical environment portal, but that appears not to be the case. I guess this calls for an extended drum roll…

But that’s not to say that our friends in Chicago have not been busy—anything but. In fact, there’s been quite a preponderance of stuff these past few days, starting with the 2015 Top 5 most-cited standards. Anyone who bet the under on findings in the physical environment came up a bit short, but surely that can’t be very much of a surprise. We’ve covered the particulars pretty much ad nauseam, but if there’s anybody out there in the studio audience that has any specific questions regarding our top 5, I would be happy to cover them again.

So we have the following:


EC.02.06.01—Maintaining a safe environment

IC.02.02.01—Reducing infection risk associated with equipment, devices, and supplies

EC.02.05.01—Managing utility system risks

LS.02.01.20—Maintaining egress integrity

LS.02.01.30—Building features provided and maintained to protect from fire and smoke hazards


I suppose a wee bit of shifting in terms of the order of things, but I can’t say that there are any “shockahs” (after all, I am from Bawston) in the mix. Again, if someone has something specific they’d like me to discuss, I would be more than happy to do exactly that. Check out the online stuff; alternatively, you can also refer to the April edition of Perspectives.

But wait, there’s more…

We also have some new/updated resources for Life Safety Code® compliance, including guidance on how the facility tour is going to be administered, a comprehensive list of documents that would be included in the survey process, information regarding PFI change and equivalency requests, and a bunch of other stuff. You can find all this information online. Something tells me that, at some point, you may be able to link to all this stuff from the Portal (if that is not already the case, that’s what I would do).

And, to finish off a big week of new information, there is a new posting to help the Emergency Management cause. Namely, some resources having to do with the management of active shooter incidents, etc., featuring the joint resource for healthcare providers issued by the Departments of Homeland Security and Health and Human Services to assist with situational awareness and preparedness in the aftermath of the terrorist attacks in Brussels. The focus/intent being to use recent events as an opportunity to reinforce the importance of vigilance and security in our organizations. It is certainly an area for some concern (and, as always, an area of opportunity) and I think that it is very likely that this will continue to be a big piece of the survey puzzle when it comes to emergency management. The risks associated with acts of violence appear to be relatively unabated in society at large and it comes back to the healthcare safety and security professionals to ensure that our organizations are appropriately managing those risks to the extent possible and working towards an emergency response capability that keeps folks safe.

That’s the wrap-up for this week; not sure if any fireside chats are looming close on the horizon, but rest assured, we will keep you apprised of any and all portal-related activity.

Welcome Spring!

So many FSAs, so little time…and all we get is MBW

Flexible Spending Account, Federal Student Aid, Food Services of America, Focused Standards Assessment.

So, I am forced to pick one. While I’m sure the lot of them is most estimable in many ways, I suppose the choice is clear: the freaking Focused Standards Assessment (kind of makes it an FFSA, or a double-F S A…what the…).

Just to refresh things a bit, the FSA is a requirement of the accreditation process in which a healthcare organization (I’m thinking that if you weren’t in healthcare, you probably would be choosing one of the other FSAs) reviews its compliance with a selected batch of Joint Commission accreditation requirements. The selections include elements from the National Patient Safety Goals, some direct and indirect impact standards and performance elements, high-risk areas, as well as the RFIs from your last survey—and I know you’ve continued to “work” those Measures of Success. Ostensibly, this is very much an “open book” test, if you will—a test you get to grade for yourself and one for which there is no requirement to share the results with the teacher (in this case, The Joint Commission—I really don’t understand why folks submit their results to TJC, but some do—I guess some things are just beyond my ken…).

The overarching intent is to establish a process that enhances an organization’s continuous survey readiness activities (of course, as I see various and sundry survey results, I can’t help but think that the effectiveness of this process would be tough to quantify). I guess it’s somewhat less invasive than the DNV annual consultative visits, though you could certainly bring in consultants to fulfill the role of surveyor for this process if some fresh eyes are what your organization needs to keep things moving on the accreditation front.

I will freely admit to getting hung up a bit on the efficacy of this as a process; much like the required management plans (an exercise in compliance), this process doesn’t necessarily bring a lot of value to the table. Unless you actually conduct a thorough evaluation of the organization’s compliance with the 45 Environment of Care performance elements, 13 Emergency Management performance elements, 23 Life Safety performance elements (15 for healthcare occupancies, eight for ambulatory healthcare occupancies)—and who really has the time for all that—then does the process have any value beyond MBW (more busy work)? I throw the question out to you folks—the process is required by TJC, so I don’t want anyone to get in trouble for sharing—but if anyone has made good use of this process, I would be very interested in hearing all about it.

This is my last piece on the FSA process for the moment, unless folks are clamoring for something in particular. I had intended to list the EPs individually, but I think my best advice is for you to check them out for yourself. That said, I have a quick and dirty checklist of the required elements (minus the EP numbers, but those are kind of etched into my brain at this point). If you want a copy, just email me at smacarthur@greeley.com.

Brother, can you spare any change…

In the interest of time and space (it’s about time, it’s about space, it’s about two men in the strangest place…), I’m going to chunk the EM and LS risk areas that are now specifically included in the Focused Standards Assessment (FSA) process (previously, the risk areas were only in the EC chapter). Next week, I want to take one more chunk of your time to discuss the FSA process (particularly as a function of what EPs the folks in Chicago have identified as being of critical importance/status). But for the moment, here are the add-ons for 2016:

Emergency Management


  • participation of organizational leadership, including medical staff, in emergency planning activities (you need to have a clear documentation trail)
  • your HVA (interesting that they’ve decided to include this one—they must have found enough folks that have let the HVA process languish)
  • your documented EM inventory (I think it’s important to have a very clear definition of what this means for your organization)
  • participation of leadership, including medical staff, in development of the emergency operations plan (again, documentation trail is important)
  • the written EOP itself (not sure about this addition—on the face of it, it doesn’t necessarily make a lot of sense from a practical standpoint)
  • the annual review of the HVA (my advice is to package an analysis of the HVA with the review of the EOP and inventory)
  • annual review of the objectives and scope of the EOP
  • annual review of the inventory
  • reviewing activations of the EOP to ensure you have enough activations of the right type (important to define an influx exercise, as well as a scenario for an event without community support)
  • identification of deficiencies and opportunities during those activations—this means don’t try to “sell” a surveyor an exercise in which nothing went awry—if the exercise is correctly administered, there will always, always, always be deficiencies and/or opportunities. If you don’t come up with any improvements, then you have, for all intents and purposes, wasted your time… (Perhaps a little harsh, but I think you hear what I’m saying)

Life Safety


  • Maintenance of documentation of any inspections and approvals made by state or local fire control agencies (I think you could make a case for having this information attached to the presentation of waivers, particularly if you have specific approvals from state or local AHJs that could be represented as waivers)
  • Door locking arrangements (be on the lookout for thumb latches and deadbolts on egress doors—there is much frowning when these arrangements are encountered during survey)
  • Protection of hazardous areas (I think this extends beyond making sure that the hazardous areas you’ve identified are properly maintained into the realm of patient spaces that are converted to combustible storage. I think at this point, we’ve all seen some evidence of this. Be on the lookout!)
  • Appropriate protection of your fire alarm control panel (for want of a smoke detector…)
  • Appropriate availability of K-type fire extinguishers (this includes appropriate signage—that’s been a fairly frequent flyer in surveys of late)
  • Appropriate fire separations between healthcare and ambulatory healthcare occupancies (a simple thing to keep an eye on—or is it? You tell me…)
  • Protection of hazardous areas in ambulatory healthcare occupancies (same as above)
  • Protection of fire alarm control panels in ambulatory occupancies (same as above)


I would imagine that a fair amount of thought goes into deciding what to include in the FSA (and, in the aggregate, the number of EPs they want assessed in this process has gotten decidedly chunkier—I guess sometimes more is more), so next week we’ll chat a bit about what it all means.

Fear is not sustainable

A Welshman of some repute once noted that “fear is a man’s best friend,” and while that may have been the case in a Darwinian sense, I don’t know that the safety community can rely as much on it as a means of sustainable improvement. I’ve worked in healthcare for a long time and I have definitely encountered organizational leaders who traded in the threat of reprisal, etc., if imperfections were encountered in the workplace (and trust me when I say that “back in the day” something as simple as a match behind a door—left by a prickly VP to see how long it stayed there—could result in all sorts of holy heck). It typically resulted in various recriminations, fingerpointing, etc., none of which ended up meaning much in the way of sustained improvement. What happened was (to quote another popular bard—one from this side of the pond) that folks tended to “end up like a dog that’s been beat too much,” so when the wicked witch goes away, the fear goes too, and with it the driving force to stay one step ahead of the sheriff (mixing a ton of metaphors here—hopefully I haven’t tipped the obfuscation scales).

At any rate, this all ties back to the manner in which the accreditation surveys are being performed, which is based on a couple of “truisms”:


  1. There is no such thing as a perfect building/environment/process, etc.
  2. Buildings are never more perfect than the moment before you put people in them.
  3. You know that.
  4. The regulators know that.
  5. The regulators can no longer visit your facility and return a verdict of no findings, because there are always things to find.
  6. See #1.

Again, looking at the survey process, the clinical surveyors may look at, I don’t know, maybe a couple of dozen patients at the most during a survey. But when it comes to the physical environment, there are hundreds of thousands of square feet (and if you want to talk cubic feet, the numbers get quite large, quite quickly) to be surveyed—and not just by the Life Safety (LS) surveyor. Every member of the survey team is looking at the physical environment (with varying degrees of competency—that’s an editorial aside), so scrutiny of the physical environment has basically evolved (mutated?) since 2007 from a couple hours of poking around by an administrative surveyor to upwards of 30 hours (based on a three-day survey; the LS surveyor accounts for 16 hours, and the other team members doing tracers account for at least another 16 hours or so) of looking around your building. So the question really becomes how long and how hard will they have to look to find something that doesn’t “smell” right to them. And I think we all know the answer to that…

It all comes back (at least in my mind’s eye) to how effectively we can manage the imperfections that we know are out there. People bump stuff, people break stuff, people do all kinds of things that result in “wear and tear,” and while I do recognize that the infamous “non-intact surface” makes it more difficult to clean and/or maintain, is there a hospital anywhere that has absolutely pristine horizontal and vertical surfaces, etc.? I tend to think not, but the follow-up question is: to what extent do these imperfections contribute to a physical environment that does not safely support patient care? This is certainly a question for which we need to have some sense of where we stand—I’m guessing there’s nobody out there with a 0% rate for healthcare-acquired infections, so to what degree can we say that all these little dings and scrapes do not put patients at risk to an extent that we cannot manage? My gut says that the environment (or at least the environmental conditions that I’m seeing cited during surveys) is not the culprit, but I don’t know. As you all know by now (if you’ve been keeping tabs on me for any length of time), I am a big proponent of the risk assessment process, but has it come to the point where we have to conduct a risk assessment for, say, a damaged head wall in a patient room? Yes, I know we want to try and fix these types of conditions, but there are certain things that you can’t do while a patient is in the room and I really don’t think that it enhances patient care to be moving patients hither and yon to get in and fix surfaces, etc. But if we don’t do that, we run the risk of getting socked during a survey.

The appropriate management of the physical environment is a critical component of the safe delivery of healthcare and the key dynamic in that effort is a robust process for reporting imperfections as soon as possible (the “if you see something, say something” mantra—maybe we could push on “if you do something, say something”) so resources can be allocated for corrective actions. And somehow, I don’t think fear is going to get us to that point. We have to establish a truly collaborative, non-knee-jerk punitive relationship with the folks at the point of care, point of service. We have to find out when and where there are imperfections to be perfected as soon as humanly possible, otherwise, the prevalence of EC/LS survey findings will continue in perpetuity (or something really close to that). And while there may be some employment security pour moi in that perpetual scrutiny, I would much rather have a survey process that focuses on how well we manage the environment and not so much on the slings and arrows of day-to-day wear and tear. What say you?

May I? Not bloody likely! The secret world of ‘NO EXIT’ signs

There’s been something of a “run” on a particular set of findings, and since this particular finding “lives” in LS.02.01.20 (the hospital maintains the integrity of egress), one of the most frequently cited standards so far in 2015 (okay, actually egress findings have been among the most frequently cited pretty much since they’ve been keeping track of such things), it seems like it might not be a bad idea to spend a little time discussing why this might be the case. And of course, I am speaking of that most esoteric of citations, the “NO EXIT” deficiency.

For my money (not that I have a lot to work with), a lot of the “confusion” in this particular realm is due to The Joint Commission adopting some standards language that, while perhaps a little bit more flexible (and I will go no further than “perhaps” on this one, because I really don’t think the TJC language helps clarify anything), creates something of a box when it comes to egress (small pun intended). The language used by NFPA (Life Safety Code® 2000 edition 7.10.8.1) reads “any door, passage, or stairway that is neither an exit nor a way of exit access and that is arranged so that it is likely [my italics] to be mistaken for an exit shall be identified by a sign that reads as follows: NO EXIT.” To be honest, I kind of like the “likely” here—more on that in a moment.

Now our friends in Chicago take a somewhat different position on this: “Signs reading ‘NO EXIT’ are posted on any door, passage, or stairway that is neither an exit nor an access to an exit but may (my italics, yet again) be mistaken for an exit. (For full text and any exceptions, refer to NFPA 101 – 2000: 7.10.8.1.)” If you ask me, there’s a fair distance between something that “may” be mistaken for something else, like an exit, and something that is likely to be mistaken for something else, like that very same exit. The way this appears to be manifesting itself is with those pesky exterior doors that lead out into courtyard/patio areas that are not, strictly speaking, part of an egress route. Under especially compelling scrutiny are what I will generally describe as “storefront doors”—pretty much a full pane of glass that allows you to see the outside world—and I will tell you (from personal experience) that these are really tough findings to clarify post-survey. Very tough, indeed.

So it would behoove you to take a gander around your exterior doors to see if any of those doors are neither an exit nor an access to an exit and MAY be mistaken for an exit. For some of you this may be a LIKELY condition, so you may want to invest in some NO EXIT signs. And please make sure they say just that; on this, the LSC is very specific in terms of the wording, as well as the stroke of the letters: “Such sign shall have the word NO in letters 2 inch (5 cm) high with a stroke width of 3/8 inch (1 cm) and the word EXIT in letters 1 inch (2.5 cm) high, with the word EXIT below the word NO.” This way you won’t be as likely to be cited for this condition as you may have been before…

I’m getting too old for this shift…

Because of the nature of the survey process as currently administered by our good friends in Chicago, I periodically have the opportunity to work with clients after they have been surveyed, sometimes developing cogent and not-too-ambitious corrective action plans, and sometimes working with them to try and clarify findings that were based on the surveyor(s) identifying the one or two imperfections in what was otherwise a pretty solid process. Generally speaking, these are “C” elements of performance, based on the concept that to demonstrate substantial compliance with the standard/EP in question, you would provide data to support a historical compliance rate of 90% or better. The classic example of a finding that one would always try to clarify is if the surveyor turns up a fire extinguisher (or two) with some missing monthly inspections (depending on how you inspect your fire extinguishers, there is always the possibility for something to get overlooked, etc.—again, an imperfection in the process). Classically, since each fire extinguisher has 12 monthly inspections per year, you could “miss” one of those 12 inspections and still have a compliance rate greater than 90%—in this instance, about 91.67%, which I’m as positive as I can be is a numerical value in excess of the desired 90% level. So, unless you had a completely broken process for doing the monthly fire extinguisher inspections, even if they found a couple of missing months during survey, the overall picture would be workable through clarification. To take the example just a bit further, say you had 100 fire extinguishers in your inventory, which represents 1,200 potential data points over the course of a year, and the finding was that a fire extinguisher in each of two mechanical spaces was missing the two most recent inspections because the person who usually inspected them was out on leave.
That would be four findings of non-compliance identified during survey, but if you compare those four instances of non-compliance to the 1,196 findings of compliance, you would have a compliance rate of about 99.7%, another numerical value that exceeds the 90% mark.
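The arithmetic behind this clarification strategy is simple enough to sketch in a few lines of code (the function and variable names here are mine for illustration, not anything from the accreditation manual):

```python
def compliance_rate(compliant: int, total: int) -> float:
    """Historical compliance rate, as a percentage of total data points."""
    return 100.0 * compliant / total

# One missed month out of 12 for a single extinguisher:
single = compliance_rate(11, 12)
print(round(single, 2))  # 91.67 -- still above the 90% bar

# 100 extinguishers x 12 monthly inspections = 1,200 data points;
# four missed inspections found during survey:
fleet = compliance_rate(1200 - 4, 1200)
print(round(fleet, 2))   # 99.67 -- well above 90%
```

The point of counting the whole year's data points, rather than just the extinguishers cited, is that a handful of survey findings gets weighed against the full historical record of the process.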

I think that’s pretty straightforward as a going concern, or at least it was until recently. (By the way, I have successfully used the above-noted strategy for clarifying fire extinguisher findings any number of times in the past. Really, this would be as close to a sure thing as anything I can think of.) In working with a client on a fire extinguisher clarification, the correspondence coming back from Chicago indicated that the clarification was not accepted because “100% of life safety devices are required to be inspected at the defined monthly frequency.” Without mentioning any names, I can say that this adjudication was issued by someone in the Engineering office with whom I am not familiar, so it may be that what we have here is an isolated strict, strict, strict interpretation (my choice is to remain hopeful until proven otherwise), but if this type of interpretation is to be applied to “C” Elements of Performance, then what indeed is the rationale for having the “C” Elements of Performance? I hear a lot of talk about regulators trying to “work with” hospitals, but if the benchmark for maintaining everything under EC.02.03.05 is going to be perfection (and, presumably, that requirement could be extended to elements in clinical engineering and utility systems management, both of which nominally fall under the aegis of NFPA 99, compliance with which is required by the Life Safety Code®), how are these most frequently cited standards going to subside in frequency? I understand that everyone involved (regulator and regulated) has a responsibility to ensure that patients, staff, and visitors are provided as safe and comfortable an environment as possible, but if every swing of the bat has to be a home run, every pass downfield a touchdown (I could go on, but I will desist), the odds are very squarely stacked against the folks in the field who have to make this happen. And I, personally, do not think that that is a very good thing at all.

I suspect that I’ll have more to say about this through the coming weeks. Again, I will remain hopeful, but if this is the future, we’re in for a very bumpy ride!