
The song changes and yet remains the same…

There was a time when The Joint Commission actually seemed to be encouraging folks to fully engage with the clarification process in all its bountiful goodness. And I certainly hope that folks have been using that process to ensure that they don’t (or didn’t) have to “fix” processes, etc., that might not have been absolutely perfect in execution, but were not, by any stretch of the imagination, broken. But now, it appears that the bounty is going to be somewhat less bountiful, as TJC has announced changes to the process, effective January 1, 2017. Please forgive my conspiracy theorist take on this, but the new order in the accreditation world does seem to lend itself to survey reports with an increasing number of findings, rather than a reduction—and I am shocked! Okay, perhaps “shocked” is a tad hyperbolic. BTW, in a new Advocacy Alert to members, it appears that ASHE has come to the same conclusion, so it’s not just me…hoorah!

And so, the changes:


  • Any required documents that are not available at the time of survey will no longer be eligible for the clarification process (basically, the vendor ate my homework). It is important for everyone to have a very clear understanding of what TJC means by “required documents”—there is a list on your organization’s Joint Commission extranet site. My advice, if you have not already done so, is to immediately coordinate the download of that list with your organization’s survey coordinator (or whoever holds the keys to accessing that information—it may even be you!) and start formulating a process for making sure that those documents are maintained in as current a fashion as possible. And make sure your vendors are very, very clear on how much time they have to provide you with the documentation, as well as letting you know ASAP whether you have any deficiencies/discrepancies to manage—that 60-day correction window can close awfully quickly!
  • While I never really liked to employ this strategy, there were times when you could use clerical errors in the survey document to have things removed from the survey report. Areas that were misidentified on the report (non-existent to your facility; not apropos to the cited finding, for example, identification of a rated door or wall where there is none, etc.) or perhaps the location of the finding was so vague as to be impossible to identify—these have all been used successfully, but (apparently) no more. Now whether this means that there will be more in-depth discussions with the survey team as they prepare the report is unknown at this time, but even if one slips by (and I can tell you, the survey reports in general are much more exact—and exacting—in their description of the deficiencies and their locations), it won’t be enough to remove it from the report (though it could make your ESC submittal a bit more challenging if you can’t tell what it is or where it is).
  • The other piece of this is, with the removal of “C” Elements of Performance, you can no longer go the audit route to demonstrate that you were in substantial compliance at the time of survey. So now, effectively, everything is being measured against “perfection” (son of a…); miss one month’s check on a single fire extinguisher and—boom—finding! One rated door that doesn’t latch? Boom—finding! One sprinkler head with dust or a missing escutcheon? Boom—finding! And, as we touched on last week, it’s not just your primary location (aka, “the hospital”) that’s in play—you have got to be able to account for all those pesky little care sites, even the ones for which you are not specifically providing services. Say, for example, the landlord at one of your off-sites is responsible for doing the fire extinguisher checks; if something is missed (and hey, what’s the likelihood of that happening…), then you are vulnerable for a finding. So, unless you are prepared to be absolutely, positively perfect, you’d best be making sure that your organization’s leadership understands that the new survey reality is not likely to be very pretty.

I would like nothing better than to tell you that with the leadership change in Washington there will be a loosening of the regulatory death grip that is today’s reality, but somehow I don’t think that’s gonna happen…

A funny thing happened on the way somewhere…

One of the benefits of doing a lot of traveling and having a fairly well-developed case of OCD is having plenty of time to ponder ponderables and imponderables (I think it may be more fun than it sounds, but there are days when I’m not at all sure). At any rate, one of the things I’ve been tossing around in my head is the whole deal with the testing of defibrillators, particularly in areas that are not open 24/7. I suspect that, in most, if not all, instances, defibrillators are considered to be in the high-risk/life support category (I don’t think I’ve run into anyone who is managing them otherwise), so the question becomes this: in light of the Joint Commission performance elements indicating that high-risk/life support equipment is to be maintained in accordance with the Original Equipment Manufacturer (OEM) recommendations, if you have defibrillators in your inventory that require daily user testing (or some similarly constructed variation) and have those defibrillators in areas that are not open 24/7, how are you ensuring compliance relative to the OEM recommendations? I know of at least one hospital (which shall remain nameless) that does the user test on a once-monthly basis, and I am curious as to how that might play out during a survey. Minimally, I think it would be a good idea to review the defibrillators you have in the inventory and see if there are any funky requirements/recommendations beyond the traditional preventive maintenance cycles. And please include the blogosphere when you figure out what you have; my fear is that this could become a means of generating more unpleasant findings during surveys if we don’t get out ahead of this thing. High risk and life support combine to make a very scary survey scenario in my mind’s eye (can you say immediate jeopardy?) and in the hands of a by-the-book, pain in the keister surveyor (not that there are any of those)…

A change will do you good…but what about no change? Exact change?

I’m sure you’ve all had a chance to look over the April 2014 issue of Perspectives, in which EC and LS findings combined to take seven of the top 10 most frequently cited standards during 2013, with issues relating to the integrity of egress taking the top spot.

At this point, I don’t think there are any surprises lurking within those most frequently occurring survey vulnerabilities (if someone out there in the audience has encountered a survey finding that was surprising, I would be most interested in hearing about it). The individual positions in the Top 10 may shift around a bit, but I think that it’s pretty clear that, at the very least, the focus of the TJC survey process has remained fairly constant these past couple of years.

Generally speaking, my sense about the TJC survey cycle is that specific focus items tend to occur in groups of three (based on the triennial survey cycle, with the assumption being that during each three-year period, every hospital would be surveyed—and yes, I do know what happens when you assume…) and I think that 2013 may well represent the end of the first go-round of the intensive life safety survey process (I really believe that 2009-2010 were sort of beta-testing years). So the question I have for you good citizens of the safety world: Has anyone been surveyed yet this year? With follow-up questions of:

  • Did you feel you were better prepared to manage the survey process this time?
  • Was the survey process different this time?
  • More of the same?
  • More difficult?
  • Less difficult?

I’m hoping to get a good sense of whether the tidal wave of EC/LS findings has indeed crested, so anyone interested in sharing would have my gratitude. Please feel free to respond to the group at large by leaving a comment here or if you prefer a little more stealthy approach, please e-mail me at smacarthur@greeley.com or stevemacsafetyspace@gmail.com.

Well, hello there, Mr. Vendor Man: Can we see your papers?

Reaching into the old e-mailbag, a question was raised regarding who “owns” the process for credentialing and what survey vulnerabilities might be lurking in the process. Now I can start by saying that there are no specific requirements as to a credentialing process for vendors; the overarching expectation is that vendors are like any other risk—something to be managed appropriately. Certainly, if you have equipment vendors scrubbing out and assisting in the OR, then that has a more far-reaching implication than a vendor who is responsible for managing copy machines. I suppose if you had to stretch things a bit, whoever is responsible in the organization for managing contracts would certainly be in a leadership position for stuff like this, but that responsibility can be more or less genericized as a function of “services will be provided in accordance with all applicable standards and regulations, including CMS, Joint Commission, state, etc.” This would include consideration of such things as competence of the vendors (as an example, I will invoke Clinical Engineering relative to oversight of the contract services provided by external vendors—how do you make sure that contract equipment services personnel are adequately competent, etc.?). I don’t know that you could ever really trace it back to one or two folks in terms of ownership of the process—an organization of any complexity, etc. is going to have many, many contracts for various and sundry services, so there would almost have to be some division of responsibility (I say almost because I suppose you could maybe find the person with the worst case of OCD in the organization and hand the responsibility to them—you’ll sleep at night—but he or she probably never will again) that ultimately ties back to senior leadership.

All that said, survey preparation comes down to knowing that the organization is effectively managing contract service vendors (and I’m using that term in its most expansive definition—everybody who provides services and is not directly employed by the hospital would have to be considered in the mix). You could certainly distill this group down to those which would be considered most critical (if that sounds like a risk assessment, you would be correct) and then identify a strategy for monitoring and periodic evaluation of performance. It’s all about having an effective process; generally speaking, TJC “leans” on this only when they’ve identified a clear and present failure mode; so if the vendors are adequately competent and behave themselves while under your roof, you should be okay—but you have to have some sense of whether that is indeed the case.

BTW: I had no intention of sexism in the headline; I was going to do the split Mr./Ms. Vendor Person, but, I don’t know, Mr. Vendor Man seems a little more rock and roll…

Steak tips, turkey tips, and compliance tips—such a deal!

I’m not entirely certain what to make of this, but I always try to share anything I come across that might prove useful to you folks in the field. Back in July (yes, I know we are now edging towards the wintry portion of our year—we’ve had a lot of stuff to discuss), in one of the regular editions of Joint Commission Online, there was a list of compliance tips for the most frequently cited Life Safety standards.

Nothing wrong with that, as a going concern, but where I kind of got bogged down when I looked at the tips is that they weren’t necessarily in reference to stuff I’ve seen most in the field. All code compliance tidbits, to be sure, but again, not necessarily the type of stuff with which I’ve seen folks struggle.

As an example, the first tip deals with the required clear width of doors to sleeping rooms, egress doors, and doors to diagnostic treatment areas (existing construction must have at least 32 inches of clear width; new construction must have at least 41½ inches of clear width). Now I can think of a few instances in which I’ve encountered doors that were a mite on the narrow side, but is door width really what’s driving this particular standard (LS.02.01.20—egress requirements) to be among the most cited in all of survey land? I don’t know, which is why I’m not sure what to make of this. At any rate, make of it what you will—just make sure you grill those tips to your liking.