All Entries Tagged With: "FSA"

So many FSAs, so little time…and all we get is MBW

Flexible Spending Account, Federal Student Aid, Food Services of America, Focused Standards Assessment.

So, I am forced to pick one. While I’m sure the lot of them is most estimable in many ways, I suppose the choice is clear: the freaking Focused Standards Assessment (kind of makes it an FFSA, or a double-F S A…what the…).

Just to refresh things a bit, the FSA is a requirement of the accreditation process in which a healthcare organization (I’m thinking that if you weren’t in healthcare, you probably would be choosing one of the other FSAs) reviews its compliance with a selected batch of Joint Commission accreditation requirements. The selections include elements from the National Patient Safety Goals, some direct and indirect impact standards and performance elements, high-risk areas, as well as the RFIs from your last survey—and I know you’ve continued to “work” those Measures of Success. Ostensibly, this is very much an “open book” test, if you will—a test you get to grade for yourself and one for which there is no requirement to share the results with the teacher (in this case, The Joint Commission—I really don’t understand why folks submit their results to TJC, but some do—I guess some things are just beyond my ken…).

The overarching intent is to establish a process that enhances an organization’s continuous survey readiness activities (of course, as I see various and sundry survey results, I can’t help but think that the effectiveness of this process would be tough to quantify). I guess it’s somewhat less invasive than the DNV annual consultative visits, though you could certainly bring in consultants to fulfill the role of surveyor for this process if some fresh eyes are what your organization needs to keep things moving on the accreditation front.

I will freely admit to getting hung up a bit on the efficacy of this as a process; much like the required management plans (an exercise in compliance), this process doesn’t necessarily bring a lot of value to the table. Unless you actually conduct a thorough evaluation of the organization’s compliance with the 45 Environment of Care performance elements, 13 Emergency Management performance elements, and 23 Life Safety performance elements (15 for healthcare occupancies, eight for ambulatory healthcare occupancies)—and who really has the time for all that?—does the process have any value beyond MBW (more busy work)? I throw the question out to you folks—the process is required by TJC, so I don’t want anyone to get in trouble for sharing—but if anyone has made good use of this process, I would be very interested in hearing all about it.

This is my last piece on the FSA process for the moment, unless folks are clamoring for something in particular. I had intended to list the EPs individually, but I think my best advice is for you to check them out for yourself. That said, I have a quick and dirty checklist of the required elements (minus the EP numbers, but those are kind of etched into my brain at this point). If you want a copy, just email me at smacarthur@greeley.com.

Brother, can you spare any change…

In the interest of time and space (it’s about time, it’s about space, it’s about two men in the strangest place…), I’m going to chunk through the EM and LS risk areas that are now specifically included in the Focused Standards Assessment (FSA) process (previously, the risk areas were only in the EC chapter). Next week, I want to take one more chunk of your time to discuss the FSA process (particularly as a function of what EPs the folks in Chicago have identified as being of critical importance/status). But for the moment, here are the add-ons for 2016:

Emergency Management


  • participation of organizational leadership, including medical staff, in emergency planning activities (you need to have a clear documentation trail)
  • your HVA (interesting that they’ve decided to include this one—they must have found enough folks who have let the HVA process languish)
  • your documented EM inventory (I think it’s important to have a very clear definition of what this means for your organization)
  • participation of leadership, including medical staff, in development of the emergency operations plan (again, documentation trail is important)
  • the written EOP itself (not sure about this addition—on the face of it, it doesn’t necessarily make a lot of sense from a practical standpoint)
  • the annual review of the HVA (my advice is to package an analysis of the HVA with the review of the EOP and inventory)
  • annual review of the objectives and scope of the EOP
  • annual review of the inventory
  • reviewing activations of the EOP to ensure you have enough activations of the right type (important to define an influx exercise, as well as a scenario for an event without community support)
  • identification of deficiencies and opportunities during those activations—this means don’t try to “sell” a surveyor an exercise in which nothing went awry—if the exercise is correctly administered, there will always, always, always be deficiencies and/or opportunities. If you don’t come up with any improvements, then you have, for all intents and purposes, wasted your time… (Perhaps a little harsh, but I think you hear what I’m saying)

Life Safety


  • Maintenance of documentation of any inspections and approvals made by state or local fire control agencies (I think you could make a case for having this information attached to the presentation of waivers, particularly if you have specific approvals from state or local AHJs that could be represented as waivers)
  • Door locking arrangements (be on the lookout for thumb latches and deadbolts on egress doors—there is much frowning when these arrangements are encountered during survey)
  • Protection of hazardous areas (I think this extends beyond making sure that the hazardous areas you’ve identified are properly maintained into the realm of patient spaces that are converted to combustible storage. I think at this point, we’ve all seen some evidence of this. Be on the lookout!)
  • Appropriate protection of your fire alarm control panel (for want of a smoke detector…)
  • Appropriate availability of K-type fire extinguishers (this includes appropriate signage—that’s been a fairly frequent flyer in surveys of late)
  • Appropriate fire separations between healthcare and ambulatory healthcare occupancies (a simple thing to keep an eye on—or is it? You tell me…)
  • Protection of hazardous areas in ambulatory healthcare occupancies (same as above)
  • Protection of fire alarm control panels in ambulatory occupancies (same as above)


I would imagine that a fair amount of thought goes into deciding what to include in the FSA (and, in the aggregate, the number of EPs they want assessed in this process has gotten decidedly chunkier—I guess sometimes more is more), so next week we’ll chat a bit about what it all means.

Plus ça change, plus c’est la même chose: Vive Le Joint Commission!

I apologize for not having gotten to this sooner, but sometimes the wind comes out of nowhere and you find yourself heading in a rather unexpected direction (I’ve never spent so much time in Texas!).

With the advent of each new year, our three-lettered friends in Chicago unveil the changes to the accreditation standards for the upcoming cycle. Most of the changes in the EC/LS/EM world (with a couple of fairly notable exceptions—more on those in a moment) have to do with a shift in focus for the Focused Standards Assessment (FSA) process as a function of the various specific risk areas (I will freely admit that this is a wee bit convoluted, but it should not necessarily come as a surprise). At any rate, as part of the accreditation process, each organization is supposed to evaluate its compliance based on specific areas of concern/risk identified by The Joint Commission. Thus, for 2016, some of the risks to be evaluated have gone away (at least for the moment) and others have been added to the mix:

Please remember: These are not going away entirely; they just don’t have to be included in your organization’s FSA process!

So, we bid adieu to specific analysis of the safety, hazardous materials, medical equipment, and utility systems management plans (leaving security and fire safety in the mix) and we say bonjour to the identification of safety and security risks (as you may have noted, I’m not indicating the specific standard and EP numbers—our friends get a little protective of their content, but if you really need to check out the numbers, please see your organization’s accreditation manual).

We say goodbye to implementing our hazardous material and waste spill/exposure procedures, the monitoring of gases and vapors, and proper routine storage and prompt disposal of trash; and say hello to the hazardous materials and waste inventory, the actual written hazmat and waste spill/exposure procedures, minimization of hazmat risks, ensuring that you have proper permits, licenses, etc., for hazardous materials, and labeling of hazardous materials.

We say howdy to a focused look at fire drills, including the critiques.

We greet a focus on the testing documentation relating to duct detectors, electromechanical releasing devices, heat detectors, manual fire alarm boxes and smoke detectors, as well as the documentation relating to fire dampers.

We say auf wiedersehen to focusing on the selection, etc., of medical equipment, the written inventory of medical equipment, SMDA reporting, inspection, testing and maintenance of non-high risk medical equipment, and performance testing of all sterilizers.

We say guten tag to the written utility components inventory, written frequencies for maintenance, inspection and testing of utility system components, written procedures for utility system disruptions, and minimization of pathogenic biological agents in aerosolizing water systems.

We say konnichiwa to a focus on the provision of safe and suitable interior spaces for patients and the maintenance of ventilation, temperature, and humidity in all those other pesky areas (e.g., soiled utility rooms, clean utility rooms, etc.).

We say hola to a focus on whether or not staff (including LIPs) are familiar with their roles and responsibilities relative to the environment of care (I predict that this one is going to start showing up on the top 10 list soon unless there is a dramatic shift in survey focus).

And we say “Hey-diddly-ho, good neighbor” to the use of hazard surveillance rounds to identify environmental deficiencies, hazards, and unsafe practices, as well as ensuring that you have a good mix of participants in your EC Committee activities, particularly the analysis of data—clinical, administrative, and support services all have to be represented.

Now, there are three standards changes that went into effect on January 1, 2016: one a shift to a different spot in the standards, one a fairly clarifying clarification, and one that I’m not quite sure what to make of, though I somehow fear the worst…

The requirement for the results of staff dosimetry monitoring (CT, PET, nuclear medicine) to be reviewed at least quarterly shifts from Safety to Hazardous Materials. The EP number remains the same (and I can give you 17 reasons for that…), but it’s only a shift in where it would be scored (another important reason for making sure that you have a solid relationship between your EOC Committee and your Radiation Safety Committee—I’m a great believer in having compliance information in a location where surveyors are more likely to encounter it: EOC Committee minutes).

The requirement for managing the risks associated with smoking activities was clarified to indicate that the risks have to be managed regardless of the type of smoking materials (e-cigarettes and personal vaporizers are officially in the mix); I’m presuming that this is helpful to folks who have perhaps faced some resistance in this area.

And finally (we’ll cover the EM and LS changes next time—nothing particularly scary, but a little too voluminous for this rather dauntingly wordy blog post), the requirement regarding the inspection, testing, and maintenance of non-high-risk utility system equipment components has gone from a “C” Element of Performance (EP) to an “A” EP (they did remove the Measure of Success requirement for a deficient finding in this area—I suspect that was as much for their own sanity as anything else. Plus, it never really made a great deal of sense to figure out how to monitor something over four months’ time that frequently occurs every six or 12 months). My sense is that they are making the change to increase the “cite-ability” of managing utility systems equipment; now they only need to find one instance of noncompliance for a finding. I don’t know that I’ve seen a ton of findings in this area, but I can’t honestly say that I’ve been doing a close count of the OFI section of the reports, so it may be that they’re seeing a trend with the non-high-risk utility equipment that makes them think we’re not doing as good a job of managing it as we should, but that is wholly and completely conjecture on my part. I will, of course, be keeping a close eye on this one; I have a sneaking suspicion that the focus on utility systems equipment is going to continue into the immediate future and this might just become another pressure point.
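If it helps to picture the practical effect of that “C”-to-“A” shift, here’s a minimal sketch of the scoring logic as I understand it. The “A” behavior (a single instance of noncompliance equals a finding) is what’s described above; the threshold I’ve used for “C” EPs is purely an assumption for the sake of illustration, so please check the current accreditation manual for the actual scoring rules.

```python
# Illustrative sketch only; the "C" EP threshold below is an assumption,
# not the official Joint Commission scoring rule.

def ep_scored_as_finding(ep_type: str, noncompliant_observations: int) -> bool:
    """Return True if the EP would be scored as a finding."""
    if ep_type == "A":
        # As described above: one instance of noncompliance is enough.
        return noncompliant_observations >= 1
    if ep_type == "C":
        # Assumed threshold, for illustration only.
        return noncompliant_observations >= 2
    raise ValueError(f"Unknown EP type: {ep_type}")

# The practical effect of the 2016 change for non-high-risk utility ITM:
print(ep_scored_as_finding("C", 1))  # False: a single miss might not have scored before
print(ep_scored_as_finding("A", 1))  # True: a single miss is now a finding
```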

As a closing thought relative to the FSA and risk area discussion, I think we can reasonably intuit (particularly since the FSA process represents a process for self-reporting) that the expectation is for folks to be looking very carefully at the requirements contained within the above-noted areas and for compliance plans relative to those requirements to be well in hand come survey. For some reason, this shift “smells” like an approach that’s going to be that much more focused on organizational leadership when there are gaps (and ask anyone who’s had a bumpy survey these past couple of years—leadership gets dragged into the fray on a regular basis). The fact of the matter is that they will find something deficient in your facility—if they don’t, they didn’t look hard enough. It’s about having processes in place to recognize and manage those deficiencies appropriately (and yes, I recognize that I am running the risk of repeating myself). There is a big-time crazy focus on this stuff—and we need to be continuously improving how we go about doing it (whatever “it” might be).

Back next week to cover the EM/LS stuff. Arrivederci for now!

If only it were a tankless job…

And yet another story from the survey wars, this time regarding the number of oxygen cylinders that are allowed in a smoke compartment. As was the case regarding the eyewash station risk assessment discussion, this one comes from a Focused Standards Assessment (FSA) survey that I did not personally attend, so if you feel the grain of salt is once again needed, I will wait for you to fetch said salt before I start. Ready? Okay.

Anyway, in this particular survey, the FSA surveyor informed the organization that it could only have 12 oxygen cylinders in a smoke compartment, in this case, the ED. But wait, you say, what’s wrong with that? Read on, read on! Further discussion ensued in which the surveyor indicated that the 12 oxygen cylinders included the cylinders that were on, for example, the stretchers in the individual bays in the ED (this particular ED is designated as a suite of rooms). Now this kind of (okay, very much so) flies in the face of the whole “in use” versus “storage” concept, under which you can have “storage” of no more than 12 cylinders in a smoke compartment, but you can also have a number of cylinders that are considered “in use.” You will find a most excellent example of how this works (and please try not to focus on the irony of this information source) in the December 2012 issue of Perspectives; in the right-hand column of p. 10, George Mills describes a situation that uncannily resembles the condition that the FSA surveyor indicated was not compliant—and says that it’s okay, because the cylinders on the stretchers would be considered “in use.” If that don’t beat all…
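For those who like to see the arithmetic laid out, here’s a minimal sketch of the tally as I understand the concept: cylinders that are genuinely “in use” (e.g., on a stretcher with a patient) don’t count toward the 12-cylinder storage allowance for the smoke compartment. The names and the example numbers here are mine, purely for illustration, and this is not an official compliance tool.

```python
# Illustrative sketch of the "in use" vs. "storage" tally described above;
# example numbers are made up.
from dataclasses import dataclass

@dataclass
class Cylinder:
    location: str
    in_use: bool  # True if actively in use (e.g., on a stretcher with a patient)

def storage_count(cylinders):
    """Only cylinders in storage count toward the smoke-compartment allowance."""
    return sum(1 for c in cylinders if not c.in_use)

def within_storage_allowance(cylinders, limit=12):
    return storage_count(cylinders) <= limit

# Hypothetical ED smoke compartment: 10 cylinders in the storage rack,
# 5 more on stretchers in the bays (the ones the surveyor wanted to count).
ed = [Cylinder("storage rack", False) for _ in range(10)] + \
     [Cylinder("stretcher", True) for _ in range(5)]
print(storage_count(ed), within_storage_allowance(ed))  # 10 True
```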

I guess this ultimately goes back to the importance of “knowing” where you stand in terms of compliance. If you “know” that the oxygen cylinders are considered in use and thus within allowances, then you can respectfully (perhaps even silently) disagree with the surveyor and go back to more important things. And I suppose if you wanted to be fresh, you could suggest the surveyor sign up for a subscription to Perspectives. Unfortunately, they don’t have those little cards that fall out and can be mailed in as a gag…

What’s the frequency, Kenneth?

In our continuing coverage of stories from the survey beat, I have an interesting one to share with you regarding my most favorite of subjects: risk assessments. During a recent FSA survey (what’s that, you ask? Why, that’s the nifty replacement for the “old” PPR process—yet another kicky acronym, in this case standing for Focused Standards Assessment), a hospital was informed by the surveyor that it was required to conduct an annual risk assessment regarding emergency eyewash stations. Now I will admit that I got this information secondhand, so you may invoke the traditional grain of salt. But it does raise an interesting question in regard to the risk assessment process: Is it a one-and-done or is there an obligation to revisit things from time to time?

Now, purely from a contrarian standpoint, I would argue against a “scheduled” risk assessment on some specific recurring basis, unless, of course, there is a concern that the risk in question is not being managed as reliably, as an operational matter, as safety would demand. If we take the eyewash equipment as an example, as it deals primarily with response to a chemical exposure, I would consider this topic as being a function of the Hazard Communication standard, which is, by definition, a performance standard. So as long as we are appropriately managing the involved risks, we should be okay. And I know that we are monitoring the management of those risks as a function of safety rounds, the review of occupational injury reports, etc. If you look at a lot of the requirements relating to monitoring, a theme emerges—that we need to adjust to changes in the process if we are to properly manage the risks. If someone introduces a new chemical product into the workplace, then yes, we need to assess how that change is going to impact occupational safety. But again, if we are monitoring the EC program effectively, this is a process that “lives” in the program and really doesn’t benefit from a specific recurrence schedule. We do the risk assessment to identify strategies to manage risks and then we monitor to ensure that the risks are appropriately managed. And if they aren’t being appropriately managed…then it’s time to get out the risk assessment again.