
What it is ain’t exactly clear: Hazardous materials management and the SAFER matrix

I was recently asked to ponder the (relative—all things are relative) preponderance of findings under the Hazardous Materials and Wastes Management standard (EC.02.02.01 for those of you keeping track). For me, the most interesting part of the question was the information that (as was apparently revealed at the Joint Commission Executive Briefings sessions last fall) findings under EC.02.02.01 frequently found their way to the part of the SAFER matrix indicating a greater likelihood of causing harm (the metric being low, moderate, and high likelihood of harm) than some of the other RFIs being generated (EC.02.06.01, particularly as a function of survey issues with ligature risks, also generates those upper harm-level survey results). Once upon a time, eyewash station questions were among the most frequently asked (and responded to in this space), so it’s almost like replaying a classic.

Generally speaking, the findings that they’ve earmarked as being more likely to cause harm are the ones relating to eyewash stations (the most common being the surveyors over-interpreting where one “has” to have an eyewash station; the remainder pretty much fall under the maintenance of eyewashes—either there’s a missing inspection, access to the eyewash station is obstructed during the survey, or there is clearly something wrong with the eyewash—usually the protective caps are missing or the water flow is rather anemic in its trajectory). All of those scenarios have the “potential” for being serious; if someone needs an eyewash and the thing doesn’t work properly or it’s been contaminated, etc., someone could definitely be harmed. But (and it is an extraordinarily big “but”) that harm only comes into play when there is an exposure risk involving a caustic or corrosive chemical, which loops us back to the over-interpretation. OSHA only requires emergency eyewash equipment when there is a risk of occupational exposure to a corrosive chemical (the ANSI standard goes a bit further by indicating eyewash equipment should be available for caustic chemicals as well as corrosives). A lot of the findings I’ve seen have been generated by the clinical surveyors, who are frequently in the company of hospital staff who aren’t really clear on what the requirements are (you could make the case that they should be, if only from a Hazard Communication standard standpoint, but we’ll set that aside for the moment), so when the clinical surveyor says “you need an eyewash station here” and writes it up, the safety folks frequently don’t find out until the closeout (and sometimes don’t find out until the survey report is received). The “problem” that can come to the fore is that the clinical folks don’t perceive the eyewash finding as “theirs” because it’s not a clinical finding, so they really don’t get too stressed about it.
So, the surveyor may ask to see the SDS for a product in use and if the SDS indicates that the first aid for eye exposure is a 15- or 20-minute flush with water, then they equate that with an eyewash station, which in a number of instances, is not (again, strictly speaking from a regulatory standpoint) “required.” Sometimes you can make a case for a post-survey clarification, but successful clarifications are becoming increasingly rare, so you need to have a process in place to make your case/defense during the survey.

The other “batch” of findings for this standard tends to relate to the labeling of secondary containers (usually the containers that are used to transport soiled instruments); again, in terms of actual risk, these conditions are not particularly “scary,” but you can’t completely negate the potential, so (again) the harm level can be up-sold (so to speak).

In terms of survey prep, you have to have a complete working knowledge of what corrosive chemicals are in use in the organization and where those chemicals are being used (I would be inclined to include caustic chemicals as well); the subset of that is to evaluate those products to see if there are safer (i.e., not corrosive or caustic) alternatives to be used. The classic finding revolves around the use of chemical sprays to “soak” instruments awaiting disinfection and sterilization—if you don’t soak them, then the bioburden dries and it’s a pain to be sure it’s all removed, etc.; generally, some sort of enzymatic spray product is used—but not all of them are corrosive, so not all of them require an eyewash station. Then once you know where you have corrosives/caustics, you need to make sure you have properly accessible eyewash equipment (generally within 10 seconds of unimpeded travel time from the area of exposure risk to the eyewash) and then you need to make sure that staff understand what products they have and why an eyewash is not required (strictly speaking, there really aren’t that many places in a hospital for which an eyewash station would be required) if that is the case—or at least make sure that they will reach out to the safety folks if a question should come up during survey. Every once in a while there’s a truly legit finding (usually because some product found its way someplace where it didn’t belong), but more often than not, it’s not necessary.
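To make the “10 seconds of unimpeded travel” guideline a little more concrete, here’s a minimal sketch of a placement check. The walking-speed figure is an illustrative assumption (many organizations use roughly 55 feet in 10 seconds as a rule of thumb for the ANSI Z358.1 accessibility provision); it is not a number prescribed by OSHA or TJC.

```python
# Sketch: does an eyewash location satisfy the ~10-second accessibility
# guideline? The walking speed below is an assumption (~55 ft / 10 s),
# not a regulatory value -- the standard speaks in seconds of travel.

WALKING_SPEED_FT_PER_S = 5.5  # assumed brisk walking pace

def within_reach(distance_ft: float, obstructed: bool = False) -> bool:
    """True if the eyewash is reachable in ~10 seconds of unimpeded travel."""
    if obstructed:
        # Obstructed access fails regardless of distance -- exactly the
        # sort of condition surveyors cite during walk-throughs.
        return False
    return distance_ft / WALKING_SPEED_FT_PER_S <= 10.0

print(within_reach(40))                   # a 40-ft path passes
print(within_reach(80))                   # an 80-ft path fails
print(within_reach(30, obstructed=True))  # blocked access fails
```

The point of the `obstructed` flag is that distance alone isn’t the test—an eyewash behind a parked linen cart is, for survey purposes, not accessible at all.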

You also have to be absolutely relentless when it comes to the labeling of secondary containers; if there’s something of a biohazard nature and you put it in a container, then that container must be properly identified as a biohazard; if you put a chemical in a spray bottle, bucket, or other container, then there needs to be a label (there are exceptions, but for the purposes of this discussion, it is best managed as an absolute). Anything that is not in its original container has to be labeled, regardless of what the container is, the reason for doing it, etc. The hazard nature of the contents must be clear to anyone and everyone that might encounter the container.

At the end of the day (as cliché an expression as that might be), it is the responsibility of each organization to know what’s going on and to make sure that the folks at the point of care/point of service have a clear understanding of what risks they are likely to encounter and how the organization provides for their safety in encountering those risks. We are not in the habit of putting people in harm’s way, but if folks don’t understand the risks and (perhaps most importantly) understand the protective measures in place, the risk of a survey finding is really the least of your worries.

An invitation to the regulatory dance—and the band keeps playing faster…

About a year ago, we chatted a bit about the likely changes to the regulatory landscape under a new administration, most of which (at least those related to the changing of the guard) never really materialized to any great extent. But one thing held true—and continues as we embark upon the good ship 2018—the focus on management of the physical environment is very much at the forefront of preparatory activities.

We also chatted a bit about The Joint Commission’s previous exhortations to healthcare leaders to focus more attention on the management of the physical environment (I was going to provide a link to TJC’s leadership blog regarding our little world, but it appears that the page is not so easily found, though I’m sure it has nothing to do with revisionist history…). But it does appear that there’s no reason to think that the number (and probably types) of survey findings in the environment are going to be anything but steady, though hopefully not a steady increase. Remember, we still have two years in the survey cycle before everyone will have undergone their first survey since the loss of the rate-based performance elements.

Which brings us squarely to 2018 and our continuing storm of regulatory challenges; I had made a list of stuff that I believed would play some role of significance in 2017 and (strangely enough) appear to be poised to do the same in the coming year (or two…or three?!?):

 

  1. Physical environment standards remain among the most frequently cited during TJC surveys (nine of the 10 most frequently cited standards for the period January through June 2017 were physical environment standards). Please check out the September 2017 issue of Joint Commission Perspectives for the details! Just so you know (and I do believe that I’ve mentioned this in the past), I “count” IC.02.02.01 as a physical environment standard. Yes, I know it’s under the Infection Control chapter, but disinfection, the management of equipment and supplies? That all happens in the environment!
  2. CMS, in its report card to Congress, identified the physical environment as the largest “gap” of oversight during all accreditation organization surveys.
  3. Also in its report card to Congress, CMS singled out TJC as lagging behind its competition when it comes to improving identification of deficiencies relative to the Conditions of Participation. I firmly believe that the report card to Congress was the proverbial “spark” that fanned the flames of regulatory focus in the environment. I don’t know when we can expect an updated edition of the report card (I suspect that it may be a while), but knowing that CMS is “concerned” can only mean continued focus…
  4. CMS adoption of the 2012 Life Safety Code® (effective survey date of November 1, 2016) definitely did create some level of confusion and uncertainty that always accompanies “change.” And 2017 demonstrated very clearly that it’s not just “us” that have to learn the practical application of the new stuff—the surveyors have to catch up as well! I am definitely starting to see the impact of the adoption of the 2012 Health Facilities Code (NFPA 99)—if you don’t have a copy in your library, it might just be time.
  5. TJC is in the process of revising its Environment of Care and Life Safety chapters to more closely reflect CMS requirements. January 2018 continues the rollout of the standards/performance elements updates—and they’re still not done. As we’ve discussed over the last few weeks, there’s still a lot of shifting requirements (some we always knew were in place, others merely rumored).
  6. Recent TJC survey reports indicate an increasing focus (and resulting vulnerabilities) on outpatient locations, particularly those engaging in high-level disinfection and/or surgical procedures. The physical environment in all areas in which patients receive care, treatment, and services is generating up to 60% of the total physical environment findings in recent surveys. That was just as true in 2017 as in 2016—each care location in the organization has to be prepared for multi-day scrutiny.
  7. CMS published its final rule on Emergency Preparedness (including Interpretive Guidelines, effective November 2016, with full implementation of requirements due November 2017). While organizations in compliance with current TJC Emergency Management standards will be in substantial compliance with the new rule, there will be some potential vulnerabilities relative to some of the specific components of the rule. The key sticking points at the moment appear to relate to the Continuity of Operations Plan (COOP) and the processes for delegating authority and leadership succession planning during extended events.
  8. Introduction of TJC’s SAFER matrix, which did indeed result in every deficiency identified during the survey process being included in the final survey report. Formerly, there was a section called Opportunities For Improvement for the single findings that didn’t “roll up” into a Requirement For Improvement. With the SAFER matrix, everything they find goes into the report. And there did seem to be a preponderance of findings “clustered” (make of that descriptor what you will) in the high risk sections of the matrix.
  9. As a final “nail” in the survey process coffin, effective January 2017, TJC no longer provides for the clarification of findings once the survey has been completed. While this didn’t result in quite the devastation to the process that it might have first appeared (mostly because I think it forced the issue of pushing back during the survey), it also appears that clarification only during survey was not the hard line in the sand it appeared to be when this first “dropped.” That said, there very definitely seems to be a reluctance on the part of the folks at the Standards Interpretation Group (SIG) to “reverse the call on the field” once the survey team has left the building; just as there is a reluctance to vacate physical environment findings once the LS surveyor has hit the bricks. If you feel that a finding is not valid, there is no time like the present when it comes to the pushback.
  10. One unexpected “change” during 2017: The focus on ligature risks in the various environments in which behavioral health patients receive care, treatment, and/or services. We’ve discussed the particulars fairly extensively in this space and while I didn’t see it “coming,” it has certainly leaped to the top of the concern pile. The recent guidance from the regulators has (perhaps) helped to some degree, but this one feels a lot like the focus on the procedural environment over the past couple of years. I don’t think they’re done with this by any stretch…

 

In my mind, still working from the perspective of CMS calling out the physical environment as an area of concern, the stuff noted above indicates the likely result that the next 12-24 survey months will show a continued focus on the physical environment by the entire survey team (not just the Life Safety surveyor) and a likely continued plateau or increase in findings relating to the physical environment. I still believe that eventually the regulatory focus will drift back more toward patient care-related issues, but right now the focus on the physical environment is generating a ton of findings. And since that appears to be their primary function (generating findings), there’s always lots to find in the environment.

As I like to tell folks (probably ad nauseum, truth be told), there are no perfect buildings/environments, so there’s always stuff to be found—mostly fairly small items on the risk scale, but they are all citable. The fact of the matter is that there will be findings in the physical environment during your next survey, so the focus will shift to include ensuring that the corrective action plans for those findings are not only appropriate, but also can demonstrate consideration of sustained compliance over time. Preparing for the survey of the physical environment must reflect an ongoing process for managing “imperfections”—not just every 36 (or so) months, but every day.

Don’t ask, don’t tell, don’t tell, don’t get in trouble…

Hope everyone is having a good week and that the rather stormy weather impacting so many parts of the country has not created too much of a challenge for you and your organizations.

This week is another (sort of) catch-all of topics, starting first with a little bit of CYA advice.

Lately there have been several instances (of which I am aware—can’t say for sure if this is an iceberg, but it “feels” like it might) of some very adverse accreditation/deemed status decisions based on insufficient documentation that organizational leadership had been effectively informed of conditions in the physical environment that required additional resources, etc. It’s not that organizational leadership was unaware of the conditions, but more that there was no trail of documented discussion (committee minutes, surveillance rounds, etc.) by which the organization could demonstrate to the surveyors that they had everything under control. In fact, the impression given because of the lack of a documented trail was exactly the opposite.

While nobody is really keen on telling their boss about problems of significance, especially problems for which the means of resolving them are elusive or beyond one’s resources (don’t want to look like you can’t do your job effectively), it is of critical importance to be able to escalate these types of issues to (or near) the top of the organization. Typically, this is about having to fund something (at least in my experience); maybe it’s a roof replacement; maybe it’s replacing some HVAC equipment—I’m sure most folks have a list of things for which it is a struggle to get traction. Let’s face it, unless it’s a new building, facilities infrastructure improvements, safety stuff, and the like are not particularly sexy, so when the capital improvement budgets come and go, it’s a tough sell. But sell it you must and you must keep pushing it—eventually those improvements (or lack thereof) are going to impact patient care and that’s when things can go south in a hurry. We always want to be respectful and not panicky, etc., but, please believe me, when the three- and four-letter regulatory folks knock on the door, you want to be in a position to describe how issues are brought to the attention of leadership. It may not be too pleasant in the moment (okay, in all likelihood, it won’t be pleasant at all), but it can save a whole lot of grief later on.

Next up (and this is something in the way of a commercial), The Joint Commission is hosting a webinar on Tuesday, February 7 to provide information on the new SAFER matrix, which is going to be an important feature of your survey report. We first covered it back in May, but now that they’ve been using it for the past few months (in behavioral health hospitals), it’s possible (I’m hoping likely, but I don’t want to get too amped up) that they will be sharing some useful information from the field. At any rate, particularly for those of you anticipating surveys in the next six to 12 months, I would try to make time for this one. I truly believe that every good intention is put into these survey changes, but I think we can all agree that those good intentions figure very prominently on a certain road…

Finally, this week, I would encourage you to look really, really, really closely at your interim life safety measures (ILSM) policy. TJC conducted a consultant conference last week and it is my understanding that the one significant shift in the survey of the physical environment is that there is going to be a lot of focus on the practical application of ILSMs as a function of Life Safety Code® deficiencies that cannot be immediately corrected. You have to make sure that your policy reflects an ongoing, robust process for that part of the equation. I think the conclusion has been drawn that folks generally have it together when it comes to ILSMs and construction, but are rather less skilled when it comes to those pesky LS deficiencies. We know they tend to focus on areas where they feel there are vulnerabilities (how else might one explain the proliferation of EC/LS/EM findings in recent years?). This is a big one folks, so don’t hesitate to dial in with questions.

 

In season, out of season: What’s the difference?

…when you don’t know the reason…

Some Joint Commission goodness for your regulatory pleasure!

For those of you in the audience that make use of the online version of the Accreditation Manual, I would implore you to make sure that when you are reviewing standards and performance elements that you are using the most current versions of the requirements. I think we can anticipate that things are going to be coming fast and furious over the next few months as the engineering folks at TJC start to turn the great ship around so it is in accordance with the requirements of the 2012 edition of just about everything, as well as reflecting the CMS Conditions of Participation. To highlight that change, one example is the requirement for the testing of the fire alarm equipment for notifying off-site fire responders (decorum prevents me from identifying the specific standard and performance element, but I can think of at least 02.03.05.5 things that might serve as placeholders, but I digress); the January 1, 2017 version of the standards indicates that this is to occur at a quarterly frequency (which is what we’ve been living with for quite some time), but the January 9, 2017 version indicates that this is to occur on an annual basis, based on the 2010 edition of NFPA 72. In looking at the 2010 edition of NFPA 72, it would appear that annual testing is the target, but I think this speaks to the amount of shifting that’s going to be occurring and the potential (I don’t know that I would go so far as to call it a likelihood, but it’s getting there) for some miscommunications along the way. At any rate, if you use the online tool (I do—it is very useful), make sure that you use the most current version. Of course, it might be helpful to move the older versions to some sort of archived format, but that’s probably not going to happen any time soon.

Speaking of updates, last week also revealed additional standards changes that will be taking effect July 1, 2017 (get the detailed skinny here). Among the anticipated changes are the official invocation of NFPA 99 as guidance for the management of risk; some tweaking of the language regarding Alternative Equipment Management (AEM) program elements, including the abolition (?!?) of the 90% target for PM completion and replacing it with the very much stricter 100% completion rate (make sure you clearly define those completion parameters!); expansion of the ILSM policy requirements to include the management of Life Safety Code® deficiencies that are not immediately corrected during survey (you really have to look at the survey process as a FIFI—Find It, Fix It!—exercise); the (more or less) official adoption of Tentative Interim Agreements (TIA) 1, 2, and 4 (more on those over the next couple of weeks) as a function of managing fire barriers, smoke barriers, and egress for healthcare occupancies; and, the next (and perhaps final) nail in the coffin of being able to sedate patients in business occupancies (also to be covered as we move into the spring accreditation season). I trust that some of this will be illuminated in the upcoming issues of Perspectives, but I think we can safely say that the winds of change will not be subsiding any time soon.

Also on the TJC front, as we move into the 2017 survey year, for those of you that will likely be facing survey, I encourage you to tune in to a webinar being presented on the SAFER (Survey Analysis For Evaluating Risk) matrix, which (aside from being transformative—a rather tall order and somewhat scary to consider) will be the cornerstone of your survey reports. We’ve covered some of the salient points here in the past (this is quickly becoming almost very nearly as popular a topic for me as eyewashes and general ranting), but I really cannot encourage you enough to give this topic a great deal of attention over the coming months. As with all new things TJC, there will be a shakedown cruise, with much variability of result (or so is my suspicion based on past experiences)—it is unlikely that this much change at one time is going to enhance consistency, or at least it’s hard to imagine how it would or could (should is another matter entirely). At any rate, the next webinar is scheduled for Tuesday, March 7, 2017; details here.

Please remember to keep those cards and letters coming. It’s always nice to hear from folks. (It almost makes me think that there’s somebody out there at the other end of all those electrons…) Have a safe and productive week as we await the arrival of Spring!

 

I’ve got a feeling…

Just a quick drop of the microphone to let you know that our friends in Chicago are presenting a webinar on the SAFER methodology that The Joint Commission will use during hospital surveys starting in January. As we’ve discussed previously, with the removal of standard types (As and Cs and whatever else you can conjure up) and the introduction of the “Survey Analysis for Evaluating Risk (SAFER) matrix to prioritize resources and focus corrective action plans in areas that are in most need of compliance activities and interventions,” it appears that once again we are heading into some white water rapids (certainly Class 4, with intermittent bursts of Class 5/6—better wear your life vest). That said, it appears that the webinar (scheduled for November 15) is for a limited audience number, but I do think that it might be useful to listen in to hear what pearls may (or may not) be uttered. You can register here and it also appears that the session will be recorded and made available on the TJC website (as near as I can tell, the webinar is free, so check your local listings).

Ciao for now. Back next week with more fun than you can shake a stick at…

You better start swimming or you’ll sink like a stone…

In their pursuit of continuing relevance in an ever-changing regulatory landscape, The Joint Commission announced what appears to be a fairly significant change in the survey reporting process. At first blush, it appears that this change is going to make the post-survey process a little simpler, recognizing that simplification of process sometimes ends up not being quite so simple. But as always, I will choose to remain optimistic until proven otherwise.

So the changes in the process as outlined in this week’s missive shake out into three categories: scoring methodology, post-survey follow-up activities, and submission time frames for Evidence of Standards Compliance (ESC). And I have to say that the changes are not only interesting, but appear to represent something of a shift in the framework for administering surveys. Relative to the scoring methodology, it appears that the intent is to do away with the “A” and “C” categories, as well as the designation of whether the performance element is a direct or indirect impact finding. The new process will revolve around a determination of whether a deficient practice or condition is likely to cause harm and, more or less, how frequently the deficient practice or condition is observed. As with so many things in the regulatory realm, this new methodology reduces to a kicky new acronym: SAFER (Survey Analysis For Evaluating Risk) and comes complete with a matrix upon which each deficiency will be placed. You can see the matrix in all its glory through the link above, but it’s basically a 3 x 3 grid with an x-axis of scope (frequency with which the deficiency was observed) and a y-axis of likelihood to result in harm. This new format should make for an interesting looking survey report, to say the least.
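For the structurally inclined, the grid described above can be sketched in a few lines of code. Everything here is illustrative—the axis category names, the `Finding` structure, and the sustainment heuristic are my assumptions for the sketch, not TJC’s actual data model or scoring rules:

```python
# Sketch of the SAFER matrix as a 3 x 3 grid: scope (how widespread the
# deficiency is) on the x-axis, likelihood of harm on the y-axis.
# Category labels and the sustainment rule are illustrative assumptions.

from dataclasses import dataclass

SCOPE = ("limited", "pattern", "widespread")   # x-axis, left to right
LIKELIHOOD = ("low", "moderate", "high")       # y-axis, bottom to top

@dataclass
class Finding:
    standard: str      # e.g., "EC.02.02.01"
    scope: str         # one of SCOPE
    likelihood: str    # one of LIKELIHOOD

def matrix_cell(finding: Finding) -> tuple:
    """Return (column, row) coordinates on the 3 x 3 grid."""
    return (SCOPE.index(finding.scope), LIKELIHOOD.index(finding.likelihood))

def needs_sustainment_plan(finding: Finding) -> bool:
    """Rough heuristic: findings toward the upper right of the matrix
    call for evidence of sustained compliance, not just a one-time fix.
    The cut-off here is an assumption; the actual thresholds aren't
    spelled out in the announcement."""
    col, row = matrix_cell(finding)
    return col + row >= 2

eyewash = Finding("EC.02.02.01", "pattern", "high")
print(matrix_cell(eyewash))             # (1, 2): middle column, top row
print(needs_sustainment_plan(eyewash))  # True
```

The useful takeaway is that two independent axes mean two independent questions for every corrective action plan: how widespread was it, and how badly could it have hurt someone?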

Relative to the post-survey follow-up activities, it appears that the section of the survey report (Opportunities for Improvement) that enumerates those single instances of non-compliance for “C” Elements of Performance will “no longer exist” (which makes sense if they are doing away with the “C” Element of Performance concept). While it is not explicitly noted, I’m going to go out on a limb here and guess that this means that the deficiencies formerly known as Opportunities for Improvement will be “reported” as Requirements for Improvement (or whatever RFIs become in the SAFER model), so we may be looking at having to respond to any and all deficiencies that are identified during the course of the survey. To take that thought a wee bit further, I’m thinking that this might also alter the process for clarifying findings post-survey. I don’t imagine for a moment that this is the last missive that TJC will issue on this topic, so I guess we’ll have to wait and see how things unfold.

As far as the ESC submission timeframes go, with the departure of the direct and indirect designations for findings comes a “one size fits all” due date of 60 days (I’m glad it wasn’t a “45 days fits all” timeframe), so that makes things a little less complicated. But there is a notation that information regarding the sustainment of corrective actions will be required depending on where the findings fall on the matrix, which presumably means that deficiencies clustered in the lower left corner of the matrix (low probability of harm, infrequent occurrence) will drive a simple correction while findings in the upper right corner of the matrix will require a little more forethought and planning in regard to corrective actions.

The rollout timeframe outlined in the article indicates that psychiatric hospitals that use TJC for deemed status accreditation will start seeing the new format beginning June 6 (one month from this writing) and everyone else will start seeing the matrix in their accreditation survey reports starting in January 2017. I’m really curious to see how this is all going to pan out in relation to the survey of the physical environment. Based on past practices, I suspect that (for the most part) the deficiencies identified in the EC/EM/LS part of the survey process would mostly reside in that lower left corner, but I suppose this may result in focus on fewer specific elements (say, penetrations) and a more concerted approach to finding those types of deficiencies. But with the adoption of the 2012 Life Safety Code®, I guess this gives us something new to wait for…