Author Archive for Steve MacArthur

Steve MacArthur

Steve MacArthur is a safety consultant with The Greeley Company in Danvers, Mass. He brings more than 30 years of healthcare management and consulting experience to his work with hospitals, physician offices, and ambulatory care facilities across the country. He is the author of HCPro's Hospital Safety Director's Handbook and is contributing editor for Briefings on Hospital Safety. Contact Steve at stevemacsafetyspace@gmail.com.

Prioritize this…

During a recent survey, an interesting question was posed to the folks in Facilities, a question more than interesting enough to bring to your attention. The folks were asked to produce a policy that describes how they prioritize corrective maintenance work orders and they, in turn, asked me if I had such a thing. In my infinitely pithy response protocol, I indicated that I was not in the habit of collecting materials that are not required by regulatory standard. Now, I’m still not sure what the context of the question might have been (I will be visiting with these folks in the not too distant future and I plan on asking about the contextual applications of such a request), but it did give me cause to ponder the broader implications of the question.

I feel quite confident that developing a simple ranking scheme would be something that you could implement without having to go the whole policy route (I am personally no big fan of policies—they tend to be more complicated than they need to be and it’s frequently tougher to follow a policy 100% of the time, which is pretty much where the expectation bar is set during survey). I think something along the lines of:

Priority 1 – Immediate Threat to Health/Safety

Priority 2 – Direct Impact on Patient Care

Priority 3 – Indirect Impact on Patient Care

Priority 4 – No Patient Care Impact

Priority 5 – Routine Repairs

would work pretty well under most, if perhaps not all, circumstances. The circumstance I can “see” that might not quite lend itself to a specific hierarchy is when you have to run things on a “first come, first served” basis. Now I recognize that since our workforces are incredibly nimble (unlike regulatory agencies and the like), we can re-prioritize things based on their impact on important processes, so the question I keep coming back to is how can a policy ever truly reflect the complexities of such a process without somehow ending up with an “out of compliance with your policy” situation? This process works (or I guess in some instances, doesn’t) because of the competence of the staff involved with the process. I don’t see where a policy gets you that, but what do I know?
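(For the spreadsheet and CMMS-inclined among you, the logic of such a scheme is about as simple as it gets. Here's a rough Python sketch of how the ranking, plus a "first come, first served" tie-breaker, might look; the work order names, fields, and everything else in it are invented purely for illustration and aren't lifted from any particular system.)

```python
from dataclasses import dataclass, field
from datetime import datetime

# The categories mirror the list in the post; the rest is hypothetical.
PRIORITIES = {
    1: "Immediate Threat to Health/Safety",
    2: "Direct Impact on Patient Care",
    3: "Indirect Impact on Patient Care",
    4: "No Patient Care Impact",
    5: "Routine Repairs",
}

@dataclass
class WorkOrder:
    description: str
    priority: int                                   # 1 (most urgent) through 5
    received: datetime = field(default_factory=datetime.now)

def next_up(queue: list[WorkOrder]) -> WorkOrder:
    """Highest priority first; ties fall back to first come, first served."""
    return min(queue, key=lambda wo: (wo.priority, wo.received))

# Hypothetical examples only
queue = [
    WorkOrder("Flickering corridor light", 4),
    WorkOrder("Medical gas alarm in the ICU", 1),
    WorkOrder("OR door not latching", 2),
]
wo = next_up(queue)
print(f"Next up: {wo.description} ({PRIORITIES[wo.priority]})")
```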

If only it were a tankless job…

And yet another story from the survey wars, this time regarding the number of oxygen cylinders that are allowed in a smoke compartment. As was the case regarding the eyewash station risk assessment discussion, this one comes from a Focused Standards Assessment (FSA) survey that I did not personally attend, so if you feel the grain of salt is once again needed, I will wait for you to fetch said salt before I start. Ready? Okay.

Anyway, in this particular survey, the FSA surveyor informed the organization that it could only have 12 oxygen cylinders in a smoke compartment, in this case, the ED. But wait, you say, what’s wrong with that? Read on, read on! Further discussion ensued in which the surveyor indicated that the 12 oxygen cylinders included the cylinders that were on, for example, the stretchers in the individual bays in the ED (this particular ED is designated as a suite of rooms). Now this kind of (okay, very much so) flies in the face of the whole “in use” versus “storage” concept, where you can have “storage” of no more than 12 cylinders in a smoke compartment, but you can also have a number of cylinders that are considered “in use.” You will find a most excellent example of how this works (and please try not to focus on the irony of this information source) in the December 2012 issue of Perspectives; in the right-hand column of p. 10, George Mills describes a situation that uncannily resembles the condition that the FSA surveyor indicated was not compliant. And he says that it’s okay, because the cylinders on the stretchers would be considered “in use.” If that don’t beat all…
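(If you want to sanity-check your own cylinder counts, the arithmetic is simple enough to sketch out. Here's a purely hypothetical bit of Python that applies the "in use" versus "storage" distinction described above; the locations and counts are made up, and your own tally obviously depends on how your organization defines "in use.")

```python
from dataclasses import dataclass

STORAGE_LIMIT_PER_SMOKE_COMPARTMENT = 12  # applies to stored cylinders only

@dataclass
class Cylinder:
    location: str
    in_use: bool  # e.g., on a stretcher with a patient vs. sitting in a rack

def stored_count(cylinders: list[Cylinder]) -> int:
    """Only cylinders that are not considered 'in use' count toward storage."""
    return sum(1 for c in cylinders if not c.in_use)

# Hypothetical ED smoke compartment: two in-use cylinders, eleven in a rack
ed_cylinders = (
    [Cylinder("ED bay 3 stretcher", in_use=True),
     Cylinder("ED bay 7 stretcher", in_use=True)]
    + [Cylinder("ED storage rack", in_use=False) for _ in range(11)]
)

count = stored_count(ed_cylinders)
if count > STORAGE_LIMIT_PER_SMOKE_COMPARTMENT:
    print(f"{count} stored cylinders: over the limit, time to thin the rack")
else:
    print(f"{count} stored cylinders: within the limit, carry on")
```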

I guess this ultimately goes back to the importance of “knowing” where you stand in terms of compliance. If you “know” that the oxygen cylinders are considered in use, and thus within allowances, then you can respectfully (perhaps even silently) disagree with the surveyor and go back to more important things. And I suppose if you wanted to be fresh, you could suggest the surveyor sign up for a subscription to Perspectives. Unfortunately, they don’t have those little cards that fall out and can be mailed in as a gag…

What’s the frequency, Kenneth?

In our continuing coverage of stories from the survey beat, I have an interesting one to share with you regarding my most favorite of subjects: risk assessments. During a recent FSA survey (what’s that, you ask? Why, that’s the nifty replacement for the “old” PPR process—yet another kicky acronym, in this case standing for Focused Standards Assessment), a hospital was informed by the surveyor that it was required to conduct an annual risk assessment regarding emergency eyewash stations. Now I will admit that I got this information secondhand, so you may invoke the traditional grain of salt. But it does raise an interesting question in regard to the risk assessment process: Is it a one-and-done or is there an obligation to revisit things from time to time?

Now, purely from a contrarian standpoint, I would argue against a “scheduled” risk assessment on some specific recurring basis, unless, of course, there is a concern that the risk in question is not being managed reliably enough, as an operational matter, to serve the purpose of safety. If we take the eyewash equipment as an example, since it deals primarily with response to a chemical exposure, I would consider this topic a function of the Hazard Communication standard, which is, by definition, a performance standard. So as long as we are appropriately managing the involved risks, we should be okay. And I know that we are monitoring the management of those risks as a function of safety rounds, the review of occupational injury reports, etc. If you look at a lot of the requirements relating to monitoring, a theme emerges: we need to adjust to changes in the process if we are to properly manage the risks. If someone introduces a new chemical product into the workplace, then yes, we need to assess how that change is going to impact occupational safety. But again, if we are monitoring the EC program effectively, this is a process that “lives” in the program and really doesn’t benefit from a specific recurrence schedule. We do the risk assessment to identify strategies to manage risks and then we monitor to ensure that the risks are appropriately managed. And if they aren’t being appropriately managed…then it’s time to get out the risk assessment again.

Abduction drills as emergency response exercises

One of the survey stories I hear from time to time deals with the efficacy (or the perceived efficacy in the eyes of Joint Commission surveyors) of using an infant abduction exercise as an emergency management exercise, with the “opinion” usually being that you “can’t” use one. My sense of that has always been that, if you think about it, there are few more disruptive events in any healthcare organization than an abduction event. So while an abduction exercise is not expressly mentioned in the standards, neither is it specifically excluded from the mix.

What I believe (and this belief was borne out during a recent survey) is that as long as you plan, execute, monitor, and evaluate an abduction exercise to the same degree as you would any other emergency response activity, then there is no real reason why you couldn’t “count” an abduction exercise towards your annual allotment (and yes, I do recognize that an abduction exercise is not an influx exercise, but it could be part of an escalating scenario or serve as a means of practicing with the local community).

Standing up your incident command structure in response to the abduction exercise—yes, you would definitely want to do that. You also want to make sure that the six critical function areas (communications, resources and assets, safety and security, utility systems, staff roles and responsibilities, and patient clinical and support activities) are all considered in the critique of the organization’s response. For any improvement opportunities that cannot be immediately implemented, make sure you identify interim measures to bridge those gaps until the improvements can be put in place. Opportunities and strengths should be communicated to the EC Committee and (ultimately) senior leadership. Basically, it’s the process you should already be using for your “regular” exercises and “real” emergency response plan implementations. If you keep these requirements in mind, then you can feel confident that you have met the required elements. Bring on the survey!
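(If it helps to picture what "the same degree as any other exercise" looks like on paper, here's a bare-bones sketch of a critique record that touches all six critical function areas. The structure, field names, and findings are mine alone, invented for illustration; they're not drawn from the standards or from any particular organization's documentation.)

```python
CRITICAL_AREAS = [
    "communications",
    "resources and assets",
    "safety and security",
    "staff roles and responsibilities",
    "utility systems",
    "patient clinical and support activities",
]

def build_critique(exercise_name: str, findings: dict[str, str]) -> dict:
    """Ensure every critical function area shows up in the critique, even if
    only to note that there were no findings for that area."""
    return {
        "exercise": exercise_name,
        "areas": {area: findings.get(area, "no findings noted")
                  for area in CRITICAL_AREAS},
    }

# Hypothetical drill and findings
critique = build_critique(
    "Infant abduction drill (escalating scenario)",
    {
        "communications": ("Overhead announcement delayed; interim measure: "
                           "scripted announcement kept at the switchboard"),
        "safety and security": "Two stairwell doors not monitored during lockdown",
    },
)
for area, note in critique["areas"].items():
    print(f"{area}: {note}")
```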

Hey, how about that new app(liance focus during TJC surveys)?

I don’t know that it represents a significant focus change or if it’s just one of those blips that one might encounter when you hear about survey results, but there is a little groundswell relative to the management of appliances (basically everything that is not clinical equipment, which it appears could extend to utility systems equipment, but there’s no clear sense of that just yet).

I think we can agree that the healthcare environment is chock-a-block full of all manner of devices and appliances, from toasters and microwave ovens to refrigerators; from desk lamps to radios and who knows what else. So in that great expanse of possibilities, there have been at least two recent surveys in which the process for managing these types of appliances/devices has come under some scrutiny, resulting in some RFIs for folks.

Now, there are no specific standards or EPs that speak to the management of these appliances/devices, but it appears that opportunities in this realm are being funneled to our old friend EC.02.01.01, generally as a “there was no policy or risk assessment in place to indicate how the risks associated with…” (quotes are mine as I am paraphrasing the general concept). Not that long ago we talked about how far one might need to go when it comes to the ever-present specter of the risk assessment process, and I guess the short answer is: Here’s another instance to flex the ol’ risk assessment muscles.

And so I ask of you: How are you guys managing these pesky appliances? Incoming functional safety inspection (you turn it on and presto, it works) with periodic visual inspections during surveillance rounds? Regularly scheduled preventive maintenance (PM) activities? Re-inspection when something gets busted and is repaired? Inquiring minds (as they are wont to do) await your input!
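(And just to prime the pump a bit: one way to flex those risk assessment muscles is to score each appliance and let the score drive the inspection strategy. The sketch below is purely hypothetical; the 1-to-3 scales, the cutoffs, and the resulting strategies are invented for illustration and are not pulled from any standard or EP.)

```python
def appliance_strategy(likelihood: int, severity: int) -> str:
    """likelihood and severity each scored 1 (low) to 3 (high); the product
    drives how much attention the appliance gets. All cutoffs are notional."""
    score = likelihood * severity
    if score >= 6:
        return "scheduled PM plus checks during surveillance rounds"
    if score >= 3:
        return "incoming functional inspection plus periodic visual checks"
    return "incoming functional inspection; re-inspect only after repair"

# Hypothetical inventory with made-up scores
inventory = {
    "toaster in the staff lounge": (2, 2),
    "medication refrigerator": (2, 3),
    "desk lamp at the nurses' station": (1, 1),
}
for item, (likelihood, severity) in inventory.items():
    print(f"{item}: {appliance_strategy(likelihood, severity)}")
```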

Thela Hun Ginjeet (and a great big dose of humidity)

Not to belabor or otherwise abuse the deceased equine, I wanted to share with you a potential solution for those of you who might be struggling with high humidity levels in your surgical procedure areas. Let me first say that I’m not an engineer, so I can’t necessarily speak to the science/mechanics of this strategy, but my friends in a nationwide hospital system have employed this with some success. As they say on TV (and radio, and just about anywhere there’s a legal disclaimer), actual results may vary. Consult your (insert professional here) if conditions persist…

And so we have this: Set the discharge temperature of the air handler(s) feeding your ORs (or any other spots where you are having challenges with humidity) to a lower setpoint, low enough that the reheat coils come on. The colder supply air coming off the air handler should wring more moisture out at the coil, and the reheat brings the temperature back up, which should help knock down the humidity in the space.
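(For the curious, here's a back-of-the-envelope illustration of why the strategy can work; it is emphatically not engineering guidance. The sketch assumes the air leaves the cooling coil saturated at the discharge setpoint, ignores space loads, infiltration, and everything else a real engineer would account for, and uses the Magnus approximation for saturation vapor pressure just to keep the math short. The temperatures are my own assumptions, not anyone's design values.)

```python
import math

def p_sat_hpa(t_c: float) -> float:
    """Saturation vapor pressure in hPa (Magnus approximation); t_c in degrees C."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def space_rh_after_reheat(discharge_c: float, space_c: float) -> float:
    """Approximate space RH (%) if air leaves the cooling coil saturated at
    discharge_c and is then sensibly reheated to space_c. The moisture content
    stays put during reheat, so RH falls as the temperature comes back up."""
    return 100.0 * p_sat_hpa(discharge_c) / p_sat_hpa(space_c)

SPACE_TEMP_C = 20.0  # about 68 F, assumed here as a fairly typical OR temperature

for discharge_c in (12.8, 10.0):  # roughly 55 F vs. 50 F discharge setpoints
    rh = space_rh_after_reheat(discharge_c, SPACE_TEMP_C)
    print(f"discharge {discharge_c:4.1f} C -> space RH around {rh:.0f}%")
# Prints roughly 63% for the warmer setpoint and roughly 53% for the colder one.
```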

Just to give you some geographical context, the folks who appear to be having some luck with this strategy for managing humidity are in those quintessentially arid locations such as Florida, the Carolinas, and Mississippi (when these folks get a new sweater for Christmas, it’s not necessarily something they’d wear). So in the interest of sharing (which generally equates with caring), I figured I’d throw this out there for consideration.

Any folks out there in radioland who’ve tried this and had successes (or not), let me know in the comments. I don’t think this one’s going away any time soon as a survey hot topic, so anything we can do to help each other makes a lot of sense to me, but that might just be me…

Alien invasion: Take me to your (Emergency Management) leader!

It’s been a fairly busy year when it comes to updates of standards and such (short of the anticipated adoption of the 2012 Life Safety Code®…as Tom Petty once noted, the waiting is the hardest part, but I digress) and this week we’ll take a look at the new requirements relative to leadership and oversight of the Emergency Management (EM) function. I’m still not entirely certain what we’re gaining by this, unless as a means of ensuring that organizational leadership is inclined to provide sufficient resources to the task of being appropriately prepared for emergencies, but I’m sure it will all be made clear in the fullness of time.

So, we start with LD.04.01.05, which (in EP 5) mandates that hospital leaders identify an individual (and it does say “individual,” not the usual “individual(s)”—sounds like only one person’s going to be on the hook for this) to be accountable for matters of EM that are not within the responsibilities of the incident commander role. This includes such processes as staff implementation of the four phases of EM (mitigation, preparedness, response, and recovery); staff implementation of EM across the six critical areas (communications, resources and assets, safety and security, staff roles and responsibilities, utilities, and patient clinical and support activities); collaboration across clinical and operational areas relative to EM; and collaboration with the community relative to EM stuff. I think that’s pretty straightforward and, to be honest, I can’t say that I’ve run into any organizations that have not taken things to this level.

Next up we have LD.04.04.01, EP 25, which ties hospital senior leadership in as the drivers of EM improvements across the organization, including prioritization of improvement opportunities, as well as a specific review of EM planning reviews (a review of the review, if you will) and a review of the emergency response plan (exercises and real events) evaluations. So this speaks to a very specific communications process from the “boots on the ground” EM resources up to senior leadership. This one is very doable and even “done-able” if you’ve been including consideration of EM program evaluations as a function of your annual evaluation of the Environment of Care management program. Lots of folks are doing this, so this one’s not so much of a stretch.

Finally, we have EM.03.01.03, EPs 13 and 15, which basically establish the requirement to have a specific process for the evaluation of EM exercises and actual response activities. You’re doing this, I am quite certain, but what you might not be succinctly documenting is the multidisciplinary aspect of the evaluation process (don’t forget to include those licensed independent practitioners—we want them at the table). It goes on to address the process for reporting the results of the exercise/event evaluations to the EOC committee. Again, I’m pretty confident that this is in place for many (probably most, maybe even all) folks.

That’s the scoop on this. The changes are effective January 1, 2014, and I don’t think this is going to present much of a problem for folks, though please feel free to disagree (if you are so inclined). Certainly what’s being required fits into the framework of processes and activities that are already in place, so it’s less fraught with peril than other changes that could have been made. (I’m still waiting for the influx exercise requirement to be changed to an evacuation exercise requirement. I think we do influx pretty well; evacuation, that’s a whole other kettle of fish.)

Well, while I don’t think that you’d have to include alien invasion on your HVA, if such a thing were to occur, at least we’ll know who to take them to when they ask…

What time is it? It’s JCST (Joint Commission Standard Time)!

In the June 2013 edition of The Joint Commission’s Perspectives, George Mills covers the thorny topic of the Environment of Care management plans. Within his dissertation, he makes note that he doesn’t recommend inclusion of the Joint Commission standards and Elements of Performance (EP) in the management plans (see p. 6 of the article “Environment of Care Management Plans” for the skinny). The reasons include the caution not to “merely” restate the EPs and standards (I’ve seen management plans that consist of nothing but a reiteration of the standards and performance elements, verbatim, with no supporting description of the organization’s strategies for complying with each of the required elements—not a good thing at all), as well as to avoid the “tedious” task of making sure that minor changes to the standards (which happen periodically, but I don’t know that I’d get to the point where I’d call it tedious to review the standards from year to year) don’t trip you up during a survey. He finishes with the statement that surveyors know the standards and EPs, so they don’t need to be repeated in the management plans.

Now I don’t necessarily disagree with any of those statements, but I don’t know that there isn’t a benefit to indicating the specific performance elements as a function of the management plans, if only to ensure that it is very clear to everyone (internal reviewer, regulatory surveyor, etc.) how your organization manages compliance with each of those elements and standards. My personal experience (and that of folks with whom I have worked on their management plans) has been that the easier it is for the surveyor to tie a standard or an EP to a specific portion of your management plan, the greater the likelihood that they will “tick” that element off and move on to other things. To be honest, when I’m looking at management plans, I tend to focus as much on what has changed recently as anything—it provides evidence that the folks charged with managing the EC program are making sure that they’re staying on top of changes to the standards.

As a further enticement to you folks who’ve not yet added Perspectives to your monthly reading list, p. 8 of the June issue also includes a rubric for evaluating the “quality” (my interpretation) of your management plans. It’s an interesting exercise that you might even consider covering as a group exercise with your EC committee. One of the most important aspects of this whole magillah is for your committee to have a comprehensive sense of how risk is managed in the physical environment: from the identification of opportunities, through the strategies developed to make good on those opportunities, to the monitoring and evaluation of performance relative to those opportunities. While there will always be content experts in the mix, it is of critical importance to a high-performing committee that the committee as a whole be able to speak to what’s going on. If you can get to that point, you have really got something powerful upon which to sustain your program.

Take this leaking boat (with apologies to The Swell Season)

As I was cruising through the updated State Operations Manual from CMS, I happened upon something that I can honestly say I’d not really “seen” before. And so, I direct you to the following paragraph from the guidance provided to surveyors conducting CMS surveys:

“Determine whether the hospital maintains the ABHR [alcohol-based hand rub] dispensers in accordance with the manufacturer’s guidelines, or, if there are no manufacturer’s guidelines, that the hospital has adopted policies and procedures to ensure that the dispensers neither leak nor the contents spill.”

“(T)he dispensers neither leak nor the contents spill.” Now I don’t know about you, but from my experiences, that (as they say) is a pretty tall order and I can’t say that I’ve run into a whole lot of folks who’ve had unbridled success in this regard. I’ve seen more paint, wood, just about any surface you could name, pretty much ruined by alcohol-based hand sanitizer, so I’m thinking that the whole no-leak-or-spill thing has a ways to go.

What are your experiences with this? Anyone have any success stories they’d like to share with the class? The floor is open and we eagerly await your contribution.

This just in: Absolutely nothing

While we are on the subject of the CMS, you may be interested to know that an update of the State Operations Manual (which is basically the foundation resource for the conduct of CMS surveys) was unveiled on June 7. You can find the transmittal here.

The good news is that there are no changes to the content relative to the survey of the physical environment, including the Life Safety Code® (LSC) requirements. The bad news is that there are no changes to the content relative to the survey of the physical environment, including the LSC requirements. So, no green light on the 2012 edition of the LSC—and the peasants don’t rejoice.

I can’t think of anything that’s more keenly anticipated than the 2012 LSC, at least in healthcare safety circles (and hopefully circle doesn’t become a pejorative term—that would be most unfortunate). Like children on the eve of a birthday, we wait, and wait, and wait…