
Reefing a sail at the edge of the world…

What to do, what to do, what to do…

A couple of CMS-related items for your consideration this week, both of which appear to be rather user-friendly toward accredited organizations. (Why do I have this nagging feeling that this is going to result in some sort of ugly backlash for hospitals?)

Back in May, we discussed the plans CMS had for requiring accreditation organizations (AOs) to make survey results public, and it appears that, upon what I can only imagine was intense review and consideration, the CMS-ers have elected to pull back from that strategy. The decision, according to news sources, is based on the sum and substance of a portion of Section 1865 of the Social Security Act, which states:

(b) The Secretary may not disclose any accreditation survey (other than a survey with respect to a home health agency) made and released to the Secretary by the American Osteopathic Association or any other national accreditation body, of an entity accredited by such body, except that the Secretary may disclose such a survey and information related to such a survey to the extent such survey and information relate to an enforcement action taken by the Secretary.

So, that pretty much brings that whole thing to a screeching halt. Every once in a while, law and statute work in favor of the little folk, so we Lilliputians salute whoever tracked that one down—woohoo!

In other CMS news, the Feds issued a clarification relative to the annual inspection of smoke barrier doors (turns out the LSC does not specifically require this for smoke doors in healthcare occupancies) and delayed the drop-dead date for initial compliance with the requirements relating to the annual inspection of fire doors; January 1, 2018 is the new date. If you haven't gotten around to completing the fire door inspection, I would heartily recommend you do so as soon as you can—more on that in a moment. So, good news on two fed fronts—it's almost like Christmas in August! But I do have a couple of caveats…

I am aware of 2017 surveys since July in which findings were issued because the inspection process had not been completed, and, based on past experience, it is unlikely that those findings will be "removable" based on the extended initial compliance date. (CMS strongly indicates that once a survey finding is issued in a report, the finding should stay, even if there was compliance at the time of survey.) So hopefully this will not cause too much heartburn for folks.

The other piece of this is performance element #2 under the first standard in the Life Safety chapter. (This performance element is not based on anything specifically required by the LSC or the Conditions of Participation—yet another instance of our Chicagoan friends increasing the degree of difficulty for ensuring compliance without having a whole mess of statutory support, but I digress.) The requirement therein is for organizations to perform a building assessment to determine compliance with the Life Safety chapter—and this is very, very important—in time frames defined by the hospital. I will freely admit that this one didn't really jump out at me until recently, and my best advice is to get going with defining the time frame for doing those building assessments. It kind of "smells" like a combination of a Building Maintenance Program (BMP) and a Focused Standards Assessment (FSA), so this might not be that big a deal, though I would encourage you to make very sure that you clearly document the completion of this process, even if you are using the FSA process as the framework for doing so. In fact, that might be one way to go about it: the building assessment to determine compliance with the Life Safety chapter is completed as a function of the annual FSA process. I can't imagine that TJC would "buy" anything less than a triennial frequency, but the performance element does not specify, so maybe, just maybe…

Ring out, solstice bells!

And so we turn again to our perusal of the bounty that is the December issue of Perspectives and that most splendid of pursuits, the Clarifications and Expectations column. With the pending changes to the Life Safety (LS) chapter, it appears that we are in for a sequential review of said chapter, starting at the beginning (the process/program for managing LS compliance within your organization) and (at least for now) moving to a deep dive into the ILSM process in January—so stay tuned!

So let’s talk a little bit about the requirements relative to how the physical environment is designed and managed in such a manner as to comply with the Life Safety Code® (LSC). Previously, there were but four performance elements here: assigning someone to manage the process (assessing compliance, completing the eSOC, managing the resolution of deficiencies); maintaining a current eSOC; meeting the completion time frames for PFIs (did you ever think we would get to a point where we could miss those three letters?); and, for deemed status hospitals, maintaining documentation of AHJ inspections. For good or ill (time, as always, will be the final judge), the number of performance elements has grown to six with a slight modification to some of the elements due to the shift away from the eSOC as one of the key LS compliance documents and the evolution (mutation?) of our friend the Plan for Improvement into the Survey-Related PFI. With greater numbers of performance elements, I guess there will be a subsequent increase in confusion, etc. regarding interpretations (yours, mine, theirs) as to what it all means, which leaves us with requirements to:

 

  • Designate resources for assessing life safety compliance (evidence could be letters of assignment, position descriptions, documentation in meeting minutes); the survey process will include an evaluation of the effectiveness of the chosen method(s) for assessing LS compliance

 

  • Perform a formal LS compliance assessment of your facility—based on time frames determined by your organization (big freaking hint: "best practice" would be at least annually); you can modify/adjust time frames based on the stability of your physical environment (if there's not a lot going on, you might be able to reduce frequencies, though I haven't been to too many places that didn't have some activities that would impact LS compliance—can you say "network cabling"? Sure you can!). Also, there is mention of certain performance elements sprinkled throughout the LS chapter that will be used for any findings that are not specifically covered by the established performance elements. Clearly, there is a desire to leave no stone unturned and no deficiency unrecorded. Yippee!

 

  • Maintain current and accurate life safety drawings; we've covered this in the past (going back to 2012), but there are still some folks getting tagged for having incomplete, inaccurate, or otherwise "less-than" life safety drawings. Strictly speaking, the LS drawings are the cornerstone of your entire LS compliance effort; if they need updating and you have a survey any time in the next 12-18 months, you'd better start the leveraging process for getting them reviewed/revised. They don't tell you how to do it, but if your drawings aren't in AutoCAD at this point, you'd better have a wizard for whatever program you are using. All they need to do is find one inconsistency and they can cite it…ugh! Check out the list in Perspectives and make sure that you can account for all of it.

 

  • Have a process for resolving deficiencies identified during the survey; we know we have 60 days to fix stuff found during the survey (and hopefully they don't find anything that will take longer than that to resolve—I have this feeling that that process is going to be exceptionally unwieldy—and probably unyielding to boot). The performance element covers the process for requesting a time-limited waiver—that's got to happen within 30 days of the end of the survey. Also, the process for requesting equivalencies lives here (if folks need a refresher on equivalencies, let me know and I will put that on the list for 2017 topics). Finally, this is also where the official invocation of the ILSM process as a function of the post-survey process is articulated (I think we covered that pretty thoroughly last week, but if you have questions—go for it!).

 

  • Maintain documentation of any inspections and approvals (read: equivalencies) made by state or local AHJs; you've got to have this stuff organized and in a place where you can lay your hands on it. Make sure you know how often your AHJs visit, and make sure that you have some evidence of their "presence." I think it also makes sense to keep any inspections from your property insurers handy—they are almost as powerful an AHJ as any in the process, and you don't want to run afoul of them—they can have a significant financial impact if something goes sideways with your building.

 

  • The last one is a little curious to me; I understand why they're saying it from a global perspective, but it really makes me wonder what prompted specific mention. You can read the details of the language in Perspectives, but my interpretation of this is "don't try any funny stuff when you're renovating interior spaces and leave 4-foot corridor widths, etc., when you have clearly done more to the space than 'updated finishes.'" I think this is the call to arms relative to having a good working knowledge of Chapter 43 of the 2012 Life Safety Code®. You need to know what constitutes: repair; renovation; modification; reconstruction; change of use or occupancy classification; and addition (as opposed to subtraction). Each of these activities can reach a degree/scope that "tips" the scales relative to the requirements for new versus existing construction, and if you haven't made that determination (sounds very much like another risk assessment, don't it?), then you leave it in the hands of a surveyor to apply the most draconian logic imaginable (I think draconian logic might be oxymoronic—and you can put the accent on either syllable), which will not bode well for survey success.

 

That’s the word from unity for this week; next week, we’ll check up on some Emergency Management doings in the wake of recent flooding, including some updates to the Joint Commission’s Emergency Management Portal (EMP?). Hope your solstice salutations are merry and bright until next time!

Gathering gobs of Grinchiness

As the ol’ Physical Environment Portal remains barren of new goodies (maybe we will awake the morning of December 25 and find crisply wrapped interpretations under the tree—oh, what joy for every girl and boy!), we will turn yet again to the annals of Perspectives to mayhap glean some clarity from that august source of information. I suspect that as the December issue is chock-a-block full of life safety and emergency management goodness, we’ll be chatting about the contents for a couple of sessions. First, the big news (or what I think/suspect is the news that is likely to have the most far-reaching implications for survey year 2017): a survey process change relative to the evaluation of Interim Life Safety Measures. Actually, I should note that, as the changes were effective November 17, 2016, those of you experiencing surveys ‘twixt then and the end of the year will also be subject to this slight alteration.

So, effective 11/17/16 (the 46th anniversary of the recording of Elton John's landmark live album 11/17/70—coincidence? Probably…), the evaluation of your ILSM process (inclusive of the policy, any risk assessments, etc.) will be expanded to include discussion of how, and to what extent, ILSMs will be implemented when there are LS deficiencies identified during your survey that (presumably) cannot be immediately corrected, based on your ILSM policy. Sounds pretty straightforward, but it does make me wonder how the LS surveyor is going to have enough time to review your documentation, thoroughly survey your facility, and then sit down to review any LS findings and discuss how your ILSM policy/process comes into play. I have to tell you, when I first read this, my thought immediately went to "one more day of LS surveying to endure for any reasonably sized hospital," and, taking into consideration all the other changes going on, I hope I am incorrect, but it does make me wonder, wonder, wonder. Also, the ILSM(s) to be implemented until the deficiency is resolved will be noted in the final survey report, so it probably behooves you to have a process in place to be able to FIFI (find it, fix it) every LS deficiency as it is encountered—and since everything counts with the abolition of the "C" performance elements, you know what you probably need to do.

At any rate, with the announcement that we can expect full coverage of the ILSM standard, there was also a note that an additional performance element has been added to provide for any additional ILSMs you might want to use that are not specifically addressed in the other performance elements for this standard. I'm not exactly sure how this would play out from a practical standpoint; maybe you could specifically include in your policy a provision for checking exit routes in construction areas only when the space is occupied, etc. The only instance I can think of in which somebody was cited for having an ILSM in their policy that did not precisely reflect the performance elements in the standard was back when the EP regarding the prohibition of smoking was discontinued from the standard; there were a few persnickety surveyors who cited folks for not having removed that from their policy (persnickety is as persnickety does), but that's all I can think of.

Next week, we’ll chat a bit about some of the pending changes to the Life Safety chapter wrought by the adoption of the 2012 Life Safety Code®. In a word, riveting!

History shows again and again how standards (and EPs) point out the folly of men…

It’s beginning to look like the proofreaders in Chicago must be enduring some late nights watching the Cubs! I don’t know about you folks, but I rely rather heavily on the regular missives from The Joint Commission, collectively known as Joint Commission E-Alerts. The E-Alerts deliver regular packages of yummy goodness to my email box (okay, that may be a little hyperbolic) and yesterday’s missive was no exception. Well, actually, there was an exception—more on that in a moment.

While it did not get top billing in the Alert (which seems kind of odd given what's been going on this year), the pre-publication changes to the Life Safety chapter of the accreditation manual have been revealed, including comparison tables between what we had in January 2016 and what we can expect in January 2017. Interestingly enough, the comparison tables include the Environment of Care (EC) chapter stuff as well (though the EC chapter did not merit a mention in the E-Alert), so there's lots of information to consider (which we will be doing over the course of the next little while) and some subtle alterations to the standards/EP language. For example (and this is the first "change" that I noted in reviewing the 112 pages of standards/EPs), the note for EC.02.02.01, EP 9 (the management of risks associated with hazardous gases and vapors) expands the "reach" to specifically include waste anesthetic gas disposal and laboratory rooftop exhaust (yes, I know…very sexy stuff!). It does appear that at least some of the changes are more clarification than anything truly new (it's tough to figure out the split between what is truly "new" and what is merely a clarification of existing stuff—check out the note under EC.02.03.05, EP 1 regarding supervisory signal devices, because it provides a better sense of what could be included in the mix). Another interesting change, under EC.02.03.05 (and this applies to all the testing EPs), is that where previously the requirement was for the completion dates of the testing to be documented, now the requirement actually states that the results of the testing are to be documented in addition to the completion dates. Again, a subtle change in the language and certainly nothing that they haven't been surveying to. Oh, and one addition to the canon is the annual inspection and testing of door assemblies "by individuals who can demonstrate knowledge and understanding of the operating components of the door being tested. Testing begins with a pre-test visual inspection; testing includes both sides of the opening." At any rate, I will keep plowing through the comparison table. (Remember, in the old days it would have been called a crosswalk. Has the 21st century moved so far ahead that folks don't know what a crosswalk is anymore?)

The top billing in yesterday's All Hallows Eve E-Alert (making it an Eve-Alert, I suppose) went to the latest installment in that peppiest of undertakings, the Physical Environment Portal. Where the proofreaders' comment comes into play is that the Alert mentions the posting of the information relative to LS.02.01.30 (which happened back in August), but when you click on the link, it takes you to the update page, where the new material is identified as covering LS.02.01.35, so there is updated material, though you couldn't really tell from the Alert. So, we have general compliance information for the physical environment folks, some kicky advice and information for organizational leadership, and (Bonus! Bonus! Bonus!) information regarding the clinical impact of appropriately maintaining fire suppression systems (there is mention of sprinkler systems, but also portable fire extinguishers). I'd be interested to see if anyone finds the clinical impact information to be of particular use/effectiveness. I don't know that compliance out in the field (or, more appropriately, noncompliance) is based on how knowledgeable folks are about what to do and what not to do, though perhaps it is the importance of the fire suppression systems and the reasons for having such systems (can you imagine having to evacuate every time the fire alarm activates? That would be very stinky.) that is getting lost in the translation. I have no reason to think that the number of findings is going to be decreasing in this area (if you're particularly interested, the comparison table section on LS.02.01.35 begins on p. 80 of that document—any changes that I can see do appear likely to make compliance easier), so I guess we'll have to keep an eye on the final pages of survey year 2016 and the opening of the 2017 survey season. Be still my beating heart!

So many FSAs, so little time…and all we get is MBW

Flexible Spending Account, Federal Student Aid, Food Services of America, Focused Standards Assessment.

So, I am forced to pick one. While I’m sure the lot of them is most estimable in many ways, I suppose the choice is clear: the freaking Focused Standards Assessment (kind of makes it an FFSA, or a double-F S A…what the…).

Just to refresh things a bit, the FSA is a requirement of the accreditation process in which a healthcare organization (I'm thinking that if you weren't in healthcare, you probably would be choosing one of the other FSAs) reviews its compliance with a selected batch of Joint Commission accreditation requirements. The selections include elements from the National Patient Safety Goals, some direct and indirect impact standards and performance elements, high-risk areas, as well as the RFIs from your last survey—and I know you've continued to "work" those Measures of Success. Ostensibly, this is very much an "open book" test, if you will—a test you get to grade for yourself and one for which there is no requirement to share the results with the teacher (in this case, The Joint Commission—I really don't understand why folks submit their results to TJC, but some do—I guess some things are just beyond my ken…).

The overarching intent is to establish a process that enhances an organization’s continuous survey readiness activities (of course, as I see various and sundry survey results, I can’t help but think that the effectiveness of this process would be tough to quantify). I guess it’s somewhat less invasive than the DNV annual consultative visits, though you could certainly bring in consultants to fulfill the role of surveyor for this process if some fresh eyes are what your organization needs to keep things moving on the accreditation front.

I will freely admit to getting hung up a bit on the efficacy of this as a process; much like the required management plans (an exercise in compliance), this process doesn't necessarily bring a lot of value to the table. Unless you actually conduct a thorough evaluation of the organization's compliance with the 45 Environment of Care performance elements, 13 Emergency Management performance elements, and 23 Life Safety performance elements (15 for healthcare occupancies, eight for ambulatory healthcare occupancies)—and who really has the time for all that?—does the process have any value beyond MBW (more busy work)? I throw the question out to you folks—the process is required by TJC, so I don't want anyone to get in trouble for sharing—but if anyone has made good use of this process, I would be very interested in hearing all about it.

This is my last piece on the FSA process for the moment, unless folks are clamoring for something in particular. I had intended to list the EPs individually, but I think my best advice is for you to check them out for yourself. That said, I have a quick and dirty checklist of the required elements (minus the EP numbers, but those are kind of etched into my brain at this point). If you want a copy, just email me at smacarthur@greeley.com.

Brother, can you spare any change…

In the interest of time and space (it's about time, it's about space, it's about two men in the strangest place…), I'm going to chunk the EM and LS risk areas that are now specifically included in the Focused Standards Assessment (FSA) process (previously, the risk areas were only in the EC chapter). Next week, I want to take one more chunk of your time to discuss the FSA process (particularly as a function of what EPs the folks in Chicago have identified as being of critical importance/status). But for the moment, here are the add-ons for 2016:

Emergency Management

 

  • participation of organizational leadership, including medical staff, in emergency planning activities (you need to have a clear documentation trail)
  • your HVA (interesting that they've decided to include this one—they must have found enough folks who have let the HVA process languish)
  • your documented EM inventory (I think it’s important to have a very clear definition of what this means for your organization)
  • participation of leadership, including medical staff, in development of the emergency operations plan (again, documentation trail is important)
  • the written EOP itself (not sure about this addition—on the face of it, it doesn’t necessarily make a lot of sense from a practical standpoint)
  • the annual review of the HVA (my advice is to package an analysis of the HVA with the review of the EOP and inventory)
  • annual review of the objectives and scope of the EOP
  • annual review of the inventory
  • reviewing activations of the EOP to ensure you have enough activations of the right type (important to define an influx exercise, as well as a scenario for an event without community support)
  • identification of deficiencies and opportunities during those activations—this means don't try to "sell" a surveyor an exercise in which nothing went awry; if the exercise is correctly administered, there will always, always, always be deficiencies and/or opportunities. If you don't come up with any improvements, then you have, for all intents and purposes, wasted your time… (Perhaps a little harsh, but I think you hear what I'm saying.)

Life Safety

 

  • Maintenance of documentation of any inspections and approvals made by state or local fire control agencies (I think you could make a case for having this information attached to the presentation of waivers, particularly if you have specific approvals from state or local AHJs that could be represented as waivers)
  • Door locking arrangements (be on the lookout for thumb latches and deadbolts on egress doors—there is much frowning when these arrangements are encountered during survey)
  • Protection of hazardous areas (I think this extends beyond making sure that the hazardous areas you've identified are properly maintained into the realm of patient spaces that are converted to combustible storage. I think at this point, we've all seen some evidence of this. Be on the lookout!)
  • Appropriate protection of your fire alarm control panel (for want of a smoke detector…)
  • Appropriate availability of K-type fire extinguishers (this includes appropriate signage—that’s been a fairly frequent flyer in surveys of late)
  • Appropriate fire separations between healthcare and ambulatory healthcare occupancies (a simple thing to keep an eye on—or is it? You tell me…)
  • Protection of hazardous areas in ambulatory healthcare occupancies (same as above)
  • Protection of fire alarm control panels in ambulatory occupancies (same as above)

 

I would imagine that a fair amount of thought goes into deciding what to include in the FSA (and, in the aggregate, the number of EPs they want assessed in this process has gotten decidedly chunkier—I guess sometimes more is more), so next week we’ll chat a bit about what it all means.

Fear is not sustainable

A Welshman of some repute once noted that "fear is a man's best friend," and while that may have been the case in a Darwinian sense, I don't know that the safety community can rely as much on it as a means of sustainable improvement. I've worked in healthcare for a long time and I have definitely encountered organizational leaders who traded in the threat of reprisal, etc., if imperfections were encountered in the workplace (and trust me when I say that "back in the day" something as simple as a match behind a door—left by a prickly VP to see how long it stayed there—could result in all sorts of holy heck). That approach typically resulted in various recriminations, fingerpointing, etc., none of which ended up meaning much in the way of sustained improvement. What happened was (to quote another popular bard—one from this side of the pond), folks tended to "end up like a dog that's been beat too much," so when the wicked witch goes away, the fear goes too, and with it the driving force to stay one step ahead of the sheriff (mixing a ton of metaphors here—hopefully I haven't tipped the obfuscation scales).

At any rate, this all ties back to the manner in which the accreditation surveys are being performed, which is based on a couple of “truisms”:

 

  1. There is no such thing as a perfect building/environment/process, etc.
  2. Buildings are never more perfect than the moment before you put people in them.
  3. You know that.
  4. The regulators know that.
  5. The regulators can no longer visit your facility and return a verdict of no findings, because there are always things to find.
  6. See #1.

Again, looking at the survey process, the clinical surveyors may look at, I don't know, maybe a couple of dozen patients at the most during a survey. But when it comes to the physical environment, there are hundreds of thousands of square feet (and if you want to talk cubic feet, the numbers get quite large, quite quickly) that are surveyed—and not just by the Life Safety (LS) surveyor. Every member of the survey team is looking at the physical environment (with varying degrees of competency—that's an editorial aside), so scrutiny of the physical environment has basically evolved (mutated?) since 2007 from a couple hours of poking around by an administrative surveyor to upwards of 30 hours of looking around your building (based on a three-day survey: the LS surveyor accounts for 16 hours, and the other team members doing tracers account for at least another 16 hours or so). So the question really becomes how long and how hard will they have to look to find something that doesn't "smell" right to them. And I think we all know the answer to that…

It all comes back (at least in my mind's eye) to how effectively we can manage the imperfections that we know are out there. People bump stuff, people break stuff, people do all kinds of things that result in "wear and tear," and while I do recognize that the infamous "non-intact surface" is more difficult to clean and/or maintain, is there a hospital anywhere that has absolutely pristine horizontal and vertical surfaces, etc.? I tend to think not, but the follow-up question is: to what extent do these imperfections contribute to a physical environment that does not safely support patient care? This is certainly a question for which we need to have some sense of where we stand—I'm guessing there's nobody out there with a 0% rate for healthcare-acquired infections, so to what degree can we say that all these little dings and scrapes do not put patients at risk beyond what we can manage? My gut says that the environment (or at least the environmental conditions that I'm seeing cited during surveys) is not the culprit, but I don't know. As you all know by now (if you've been keeping tabs on me for any length of time), I am a big proponent of the risk assessment process, but has it come to the point where we have to conduct a risk assessment for, say, a damaged head wall in a patient room? Yes, I know we want to try and fix these types of conditions, but there are certain things that you can't do while a patient is in the room, and I really don't think that it enhances patient care to be moving patients hither and yon to get in and fix surfaces, etc. But if we don't do that, we run the risk of getting socked during a survey.

The appropriate management of the physical environment is a critical component of the safe delivery of healthcare, and the key dynamic in that effort is a robust process for reporting imperfections as soon as possible (the "if you see something, say something" mantra—maybe we could push on "if you do something, say something") so resources can be allocated for corrective actions. And somehow, I don't think fear is going to get us to that point. We have to establish a truly collaborative relationship, not a knee-jerk punitive one, with the folks at the point of care, point of service. We have to find out when and where there are imperfections to be perfected as soon as humanly possible; otherwise, the prevalence of EC/LS survey findings will continue in perpetuity (or something really close to that). And while there may be some employment security pour moi in that perpetual scrutiny, I would much rather have a survey process that focuses on how well we manage the environment and not so much on the slings and arrows of day-to-day wear and tear. What say you?

May I? Not bloody likely! The secret world of ‘NO EXIT’ signs

There's been something of a "run" on a particular set of findings, and since this particular finding "lives" in LS.02.01.20 (the hospital maintains the integrity of egress), one of the most frequently cited standards so far in 2015 (okay, actually egress findings have been among the most frequently cited standards pretty much since they've been keeping track of such things), it seems like it might not be a bad idea to spend a little time discussing why this might be the case. And of course, I am speaking to that most esoteric of citations, the "NO EXIT" deficiency.

For my money (not that I have a lot to work with), a lot of the "confusion" in this particular realm is due to The Joint Commission adopting some standards language that, while perhaps a little bit more flexible (and I will go no further than "perhaps" on this one, because I really don't think the TJC language helps clarify anything), creates something of a box when it comes to egress (small pun intended). The language used by NFPA (Life Safety Code® 2000 edition 7.10.8.1) reads "any door, passage, or stairway that is neither an exit nor a way of exit access and that is arranged so that it is likely [my italics] to be mistaken for an exit shall be identified by a sign that reads as follows: NO EXIT." To be honest, I kind of like the "likely" here—more on that in a moment.

Now our friends in Chicago take a somewhat different position on this: Signs reading 'NO EXIT' are posted on any door, passage, or stairway that is neither an exit nor an access to an exit but may (my italics, yet again) be mistaken for an exit. (For full text and any exceptions, refer to NFPA 101 – 2000: 7.10.8.1.) If you ask me, there's a fair distance between something that "may" be mistaken for something else, like an exit, and something that is "likely" to be mistaken for something else, like that very same exit. The way this appears to be manifesting itself is with those pesky exterior doors that lead out into courtyard/patio areas that are not, strictly speaking, part of an egress route. Drawing especially compelling scrutiny are what I will generally describe as "storefront doors"—pretty much a full pane of glass that allows you to see the outside world, and I will tell you (from personal experience) that these are really tough findings to clarify post-survey. Very tough, indeed.

So it would behoove you to take a gander around your exterior doors to see if any of those doors are neither an exit nor an access to an exit and MAY be mistaken for an exit. For some of you this may be a LIKELY condition, so you may want to invest in some NO EXIT signs. And please make sure they say just that; on this, the LSC is very specific in terms of the wording, as well as the stroke of the letters: "Such sign shall have the word NO in letters 2 inch (5 cm) high with a stroke width of 3/8 inch (1 cm) and the word EXIT in letters 1 inch (2.5 cm) high, with the word EXIT below the word NO." This way you won't be as likely to be cited for this condition as you may have been before…

One score and no years ago: Guess who’s 20?

John Palmer, who edits Briefings on Hospital Safety, among other nifty periodicals, asked me to weigh in on the 20th anniversary of the EC chapter, with a particular emphasis on how (or where) things are now in comparison to the (oh so very dark) pre-EC days. And he did this in full recognition of my tendency to respond at length (imagine that!). At any rate, I decided that these thoughts would be good to share with all y’all (I can’t absolutely swear to all the dates; I think I’m pretty close on all of them, but if there are temporal errors, I take full responsibility…)

Prior to the "creation" of the Environment of Care as a chapter (you can trace the term Environment of Care back to 1989), The Joint Commission had a chapter in the accreditation manual known as Plant Technology and Safety Management (PTSM). The PTSM standards, while significantly more minimalistic than the present-day requirements, did cover the safety waterfront, as it were, but with the advent of Joint Commission's Shared Visions/New Pathways marketing (you may assume that I am using "marketing" as the descriptor with a little bit of tongue in cheek) of their accreditation services, what I would describe as a modernization of the various standards chapters began, including the "birth" of the EC chapter in 1995. With that, things became a little more stratified, particularly with the "reveal" of the seven EC functions (safety, security, hazmat, emergency, life/fire, medical equipment, utility systems). This raised the profile of the physical environment a bit, but a true concerted focus on the EC really didn't occur until 2007, when the Life Safety surveyor program was introduced, primarily in response to data gleaned by CMS from validation surveys. The Joint Commission survey process, prior to 2007, really didn't have a reliable means of capturing life safety and related deficiencies. Since then, the survey focus on the physical environment has continued to grow, to the point that it now very much eclipses the clinical component of the survey process, at least in terms of the number and types of findings.

Are things "better"? I suppose one could make the case that things have improved incrementally over time, but it's tough to say how much direct influence the EC chapter (and the subsequent "peeling off" of the Life Safety and Emergency Management chapters) has had on things. Clearly, the healthcare environment is significantly different than 20 years ago, both in terms of the inherent risks and the resources available to manage those risks. (Increased technology is pretty much a good thing, but reductions in spending, probably not so much. I'm sure you can come up with a pretty good list of pros and cons without too much difficulty.) You could also make the case (purely based on the number of findings…and I think TJC has a dog in that fight) that if hospitals are safer, it's because of the level of scrutiny. I tend to think that the "true" answer resides in the development of the healthcare safety professional as a vocational endeavor, with the added thought that unsafe places tend not to stay in business for very long these days. So perhaps somewhere in the middle…

Better? Worse? Different…definitely! I will say that I firmly believe that the amount of survey jeopardy being generated at the moment leans towards the hyperbolic; there are certainly organizations that need to get their acts together a little more fully than they do at present. But not every organization that ends up in the manure is completely deserving of that status. I recognize that TJC has to be super-diligent in demonstrating their value to the accreditation process. But being accredited can’t become the be-all, end-all of the process. The responsibility of each healthcare organization (and, by extension, each caregiver) is to take care of patients in the safest possible manner and being attentive to the survey process can’t come at the expense of that responsibility. Sometimes, I fear, it does just that. I could probably say something pithy about job security, but…

Meet the new Survey Activity Guide, same as the old Survey Activity Guide (with apologies to the ‘Orrible ‘Oo)

Sorry I’m a little tardy on this one. I’ve been juggling a bunch of blog ideas and this one faded to the back of the pack a bit.

Back in January (and it seems so very long ago, perhaps due to the lovely weather we’re experiencing in the Northeast), The Joint Commission released the 2015 Survey Activity Guide, which details the ebb and flow of the survey process. Fortunately, they always identify a means of determining what is new (presumably as a function of 2014 Guides and so on), so I always look for any changes to the EC/EM/LS troika to see if anything funky has come to the fore.

Strangely enough, there are three documents that are indicated as being “new” that I’m pretty sure have been in the mix for at least a little while: your written fire response plan, your Interim Life Safety Measures Policy, and your fire drill evaluations. To fact-check myself, I went back to the 2013 SAG (I was a little lazy and skipped the 2014) and sure enough, all three were identified as “new” to the mix back then (I don’t seem to have a copy of the 2012), so no big surprises on the document front.

Likewise, the EC/EM interview sessions appear to be consistent with survey practice for the last couple of survey cycles. To be honest, I’m not entirely convinced that there’s a lot of exposure for organizations during these sessions, so long as the group is “chatty.” I think a good measure of how well you’re doing is in inverse proportion to the number of questions the surveyor has to ask to keep things moving. Strictly speaking, these sessions are designed to gain information on how organizations manage risk/respond to emergencies and how planning and preparedness activities function as a means of improving the various component programmatic elements. You should be able to discuss how the program has gotten “bettah” (I like to inject a little of the Bostonian vernacular from time to time), with an eye towards the use of data to demonstrate/support the notion that things have improved. I’ve not heard of anyone getting in trouble during these sessions, but I suppose there is always the potential for some misfortune. I think as long as there is recognition that compliance is a journey and not a destination, folks will be comfortable describing that journey (including setbacks) with winning survey results.