Hope everyone is having a good week and that the rather stormy weather impacting so many parts of the country has not created too much of a challenge for you and your organizations.
This week is another (sort of) catch-all of topics, starting first with a little bit of CYA advice.
Lately there have been several instances (of which I am aware—can't say for sure if this is an iceberg, but it "feels" like it might be) of some very adverse accreditation/deemed status decisions based on insufficient documentation that organizational leadership had been effectively informed of conditions in the physical environment that required additional resources, etc. It's not that organizational leadership was unaware of the conditions, but more that there was no trail of documented discussion (committee minutes, surveillance rounds, etc.) by which the organization could demonstrate to the surveyors that they had everything under control. In fact, the impression given because of the lack of a documented trail was exactly the opposite.
While nobody is really keen on telling their boss about problems of significance, especially problems for which the means of resolving them are elusive or beyond one's resources (nobody wants to look like they can't do their job effectively), it is of critical importance to be able to escalate these types of issues to (or near) the top of the organization. Typically, this is about having to fund something (at least in my experience); maybe it's a roof replacement; maybe it's replacing some HVAC equipment—I'm sure most folks have a list of things for which it is a struggle to get traction. Let's face it, unless it's a new building, facilities infrastructure improvements, safety projects, and the like are not particularly sexy, so when the capital improvement budgets come and go, it's a tough sell. But sell it you must, and you must keep pushing it—eventually those improvements (or lack thereof) are going to impact patient care, and that's when things can go south in a hurry. We always want to be respectful and not panicky, but, please believe me, when the three- and four-letter regulatory folks knock on the door, you want to be in a position to describe how issues are brought to the attention of leadership. It may not be too pleasant in the moment (okay, in all likelihood, it won't be pleasant at all), but it can save a whole lot of grief later on.
Next up (and this is something in the way of a commercial), The Joint Commission is hosting a webinar on Tuesday, February 7 to provide information on the new SAFER matrix, which is going to be an important feature of your survey report. We first covered it back in May, but now that they’ve been using it for the past few months (in behavioral health hospitals), it’s possible (I’m hoping likely, but I don’t want to get too amped up) that they will be sharing some useful information from the field. At any rate, particularly for those of you anticipating surveys in the next six to 12 months, I would try to make time for this one. I truly believe that every good intention is put into these survey changes, but I think we can all agree that those good intentions figure very prominently on a certain road…
Finally, this week, I would encourage you to look really, really, really closely at your interim life safety measures (ILSM) policy. TJC conducted a consultant conference last week, and it is my understanding that the one significant shift in the survey of the physical environment is that there is going to be a lot of focus on the practical application of ILSMs as a function of Life Safety Code® deficiencies that cannot be immediately corrected. You have to make sure that your policy reflects an ongoing, robust process for that part of the equation. I think the conclusion has been drawn that folks generally have it together when it comes to ILSMs and construction, but are rather less skilled when it comes to those pesky LS deficiencies. We know they tend to focus on areas where they feel there are vulnerabilities (how else might one explain the proliferation of EC/LS/EM findings in recent years?). This is a big one, folks, so don't hesitate to dial in with questions.
…when you don’t know the reason…
Some Joint Commission goodness for your regulatory pleasure!
For those of you in the audience that make use of the online version of the Accreditation Manual, I would implore you to make sure that when you are reviewing standards and performance elements that you are using the most current versions of the requirements. I think we can anticipate that things are going to be coming fast and furious over the next few months as the engineering folks at TJC start to turn the great ship around so it is in accordance with the requirements of the 2012 edition of just about everything, as well as reflecting the CMS Conditions of Participation. To highlight that change, one example is the requirement for the testing of the fire alarm equipment for notifying off-site fire responders (decorum prevents me from identifying the specific standard and performance element, but I can think of at least 02.03.05.5 things that might serve as placeholders, but I digress); the January 1, 2017 version of the standards indicates that this is to occur at a quarterly frequency (which is what we’ve been living with for quite some time), but the January 9, 2017 version indicates that this is to occur on an annual basis, based on the 2010 edition of NFPA 72. In looking at the 2010 edition of NFPA 72, it would appear that annual testing is the target, but I think this speaks to the amount of shifting that’s going to be occurring and the potential (I don’t know that I would go so far as to call it a likelihood, but it’s getting there) for some miscommunications along the way. At any rate, if you use the online tool (I do—it is very useful), make sure that you use the most current version. Of course, it might be helpful to move the older versions to some sort of archived format, but that’s probably not going to happen any time soon.
Speaking of updates, last week also revealed additional standards changes that will be taking effect July 1, 2017 (get the detailed skinny here). Among the anticipated changes are the official invocation of NFPA 99 as guidance for the management of risk; some tweaking of the language regarding Alternative Equipment Management (AEM) program elements, including the abolition (?!?) of the 90% target for PM completion and replacing it with the very much stricter 100% completion rate (make sure you clearly define those completion parameters!); expansion of the ILSM policy requirements to include the management of Life Safety Code® deficiencies that are not immediately corrected during survey (you really have to look at the survey process as a FIFI—Find It, Fix It!—exercise); the (more or less) official adoption of Tentative Interim Agreements (TIA) 1, 2, and 4 (more on those over the next couple of weeks) as a function of managing fire barriers, smoke barriers, and egress for healthcare occupancies; and, the next (and perhaps final) nail in the coffin of being able to sedate patients in business occupancies (also to be covered as we move into the spring accreditation season). I trust that some of this will be illuminated in the upcoming issues of Perspectives, but I think we can safely say that the winds of change will not be subsiding any time soon.
Also on the TJC front, as we move into the 2017 survey year, for those of you that will likely be facing survey, I encourage you to tune in to a webinar being presented on the SAFER (Survey Analysis For Evaluating Risk) matrix, which (aside from being transformative—a rather tall order and somewhat scary to consider) will be the cornerstone of your survey reports. We've covered some of the salient points here in the past (this is quickly becoming almost very nearly as popular a topic for me as eyewashes and general ranting), but I really cannot encourage you enough to give this topic a great deal of attention over the coming months. As with all new things TJC, there will be a shakedown cruise, with much variability of result (or so I suspect, based on past experiences)—it is unlikely that this much change at one time is going to enhance consistency; it's hard to imagine how it would or could (should is another matter entirely). At any rate, the next webinar is scheduled for Tuesday, March 7, 2017; details here.
Please remember to keep those cards and letters coming. It’s always nice to hear from folks. (It almost makes me think that there’s somebody out there at the other end of all those electrons…) Have a safe and productive week as we await the arrival of Spring!
Just a quick drop of the microphone to let you know that our friends in Chicago are presenting a webinar on the SAFER methodology that The Joint Commission will use during hospital surveys starting in January. As we've discussed previously, with the removal of standard types (As and Cs and whatever else you can conjure up) and the introduction of the "Survey Analysis for Evaluating Risk (SAFER) matrix to prioritize resources and focus corrective action plans in areas that are in most need of compliance activities and interventions," it appears that once again we are heading into some white water rapids (certainly Class 4, with intermittent bursts of Class 5/6—better wear your life vest). That said, it appears that the webinar (scheduled for November 15) is limited to a certain number of attendees, but I do think that it might be useful to listen in to hear what pearls may (or may not) be uttered. You can register here, and it also appears that the session will be recorded and made available on the TJC website (as near as I can tell, the webinar is free, so check your local listings).
Ciao for now. Back next week with more fun than you can shake a stick at…
In their pursuit of continuing relevance in an ever-changing regulatory landscape, The Joint Commission announced what appears to be a fairly significant change in the survey reporting process. At first blush, it appears that this change is going to make the post-survey process a little simpler, recognizing that simplification of process sometimes ends up not being quite so simple. But as always, I will choose to remain optimistic until proven otherwise.
So the changes in the process as outlined in this week’s missive shake out into three categories: scoring methodology, post-survey follow-up activities, and submission time frames for Evidence of Standards Compliance (ESC). And I have to say that the changes are not only interesting, but appear to represent something of a shift in the framework for administering surveys. Relative to the scoring methodology, it appears that the intent is to do away with the “A” and “C” categories, as well as the designation of whether the performance element is a direct or indirect impact finding. The new process will revolve around a determination of whether a deficient practice or condition is likely to cause harm and, more or less, how frequently the deficient practice or condition is observed. As with so many things in the regulatory realm, this new methodology reduces to a kicky new acronym: SAFER (Survey Analysis For Evaluating Risk) and comes complete with a matrix upon which each deficiency will be placed. You can see the matrix in all its glory through the link above, but it’s basically a 3 x 3 grid with an x-axis of scope (frequency with which the deficiency was observed) and a y-axis of likelihood to result in harm. This new format should make for an interesting looking survey report, to say the least.
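For those of you who like to think of these things in more concrete terms, here is a minimal sketch of the placement logic described above. The axis categories (low/moderate/high likelihood; limited/pattern/widespread scope) come from the matrix itself, but the function name and coordinate scheme are purely my own illustration, not anything TJC publishes:

```python
# Hypothetical sketch: placing a survey finding on the 3 x 3 SAFER grid.
# The axis categories reflect the matrix described above; everything else
# (names, coordinates) is illustrative only.

LIKELIHOOD = ["low", "moderate", "high"]      # y-axis: likelihood to result in harm
SCOPE = ["limited", "pattern", "widespread"]  # x-axis: scope (how often observed)

def safer_cell(likelihood: str, scope: str) -> tuple:
    """Return (row, column) coordinates for a finding on the grid."""
    return (LIKELIHOOD.index(likelihood), SCOPE.index(scope))

# For example, a one-off, low-risk finding would land at (0, 0), while a
# widespread, high-risk deficiency would land at (2, 2).
```

Again, this is just a way of visualizing where a given deficiency ends up; the actual placement is the surveyor's call.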
Relative to the post-survey follow-up activities, it appears that the section of the survey report (Opportunities for Improvement) that enumerates those single instances of non-compliance for “C” Elements of Performance will “no longer exist” (which makes sense if they are doing away with the “C” Element of Performance concept). While it is not explicitly noted, I’m going to go out on a limb here and guess that this means that the deficiencies formerly known as Opportunities for Improvement will be “reported” as Requirements for Improvement (or whatever RFIs become in the SAFER model), so we may be looking at having to respond to any and all deficiencies that are identified during the course of the survey. To take that thought a wee bit further, I’m thinking that this might also alter the process for clarifying findings post-survey. I don’t imagine for a moment that this is the last missive that TJC will issue on this topic, so I guess we’ll have to wait and see how things unfold.
As far as the ESC submission timeframes go, with the departure of the direct and indirect designations for findings comes a "one size fits all" due date of 60 days (I'm glad it wasn't a "45 days fits all" timeframe), so that makes things a little less complicated. But there is a notation that information regarding the sustainment of corrective actions will be required depending on where the findings fall on the matrix, which presumably means that deficiencies clustered in the lower left corner of the matrix (low probability of harm, infrequent occurrence) will drive a simple correction, while findings in the upper right corner of the matrix will require a little more forethought and planning in regard to corrective actions.
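To make that prioritization a little more tangible, one could imagine the follow-up expectation as a simple function of grid position. To be clear, the thresholds and wording below are my own guesswork about how the rigor might scale, not TJC policy:

```python
# Illustrative only: a rough rule of thumb for follow-up rigor based on
# where a finding lands on the 3 x 3 grid. Thresholds and wording are my
# assumptions, not TJC policy.

def followup_expectation(likelihood_idx: int, scope_idx: int) -> str:
    """Indices run 0 (low likelihood / limited scope) to 2 (high / widespread)."""
    if likelihood_idx + scope_idx <= 1:
        # lower left neighborhood: low risk, infrequently observed
        return "simple correction, ESC due in 60 days"
    if likelihood_idx == 2 and scope_idx == 2:
        # upper right corner: high risk, widespread
        return "correction plus evidence of sustained compliance"
    return "correction with additional detail, ESC due in 60 days"
```

Whatever the actual mechanics turn out to be, the general direction seems clear: the closer a finding sits to the upper right, the more planning (and documentation of sustainment) you should expect to provide.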
The rollout timeframe outlined in the article indicates that psychiatric hospitals that use TJC for deemed status accreditation will start seeing the new format beginning June 6 (one month from this writing) and everyone else will start seeing the matrix in their accreditation survey reports starting in January 2017. I'm really curious to see how this is all going to pan out in relation to the survey of the physical environment. Based on past practices, I suspect that (for the most part) the deficiencies identified in the EC/EM/LS part of the survey process would mostly reside in that lower left corner of the matrix, but I suppose this may result in focus on fewer specific elements (say, penetrations) and a more concerted approach to finding those types of deficiencies. But with the adoption of the 2012 Life Safety Code®, I guess this gives us something new to wait for…